Server :: Get The Files Onto HTTP - Pulled By The CDN?

Apr 25, 2011

I'm working on an application that requires a large amount of storage space, and I want to handle storage in-house (much cheaper than, say, S3), so we will have multiple servers (initially 4) with large amounts of storage (6TB each). The storage will need to be very flexible and configurable; each piece of data should be replicated on at least 2 servers, and it must be easily readable/writable either through an API or as a UNIX device/file/folder like a normal drive, I don't mind which. We must also be able to easily offload content to our HTTP CDN (Edgecast). The storage system doesn't need built-in HTTP support, but if it doesn't have it I'm going to have to write something to get the files onto HTTP so they can be pulled by the CDN.
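Whatever back end is chosen, the usual way to let a pull CDN like Edgecast fetch objects is simply to expose the storage directory over plain HTTP on an origin host. A rough sketch (the hostname and paths are placeholders, not from the thread) for Apache 2.2 as shipped with RHEL 6:

Code:
<VirtualHost *:80>
    ServerName origin.example.com          # placeholder origin name the CDN would pull from
    DocumentRoot /srv/storage/public       # placeholder path to the replicated objects
    <Directory /srv/storage/public>
        Options -Indexes                   # serve files, but don't list the directory
        Order allow,deny
        Allow from all
    </Directory>
</VirtualHost>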

I've looked at a lot of solutions, including:
Eucalyptus Walrus
OpenStack Object Storage
MogileFS
MongoDB GridFS (I'm not sure why, it just sounded cool =) )
and some others which I can't remember

All the servers will be running RHEL 6; they have 4x1.5TB drives which will be RAID1'd into a single partition. All the servers have 1GB/s connections between them and 100MB/s connections to the internet with unlimited bandwidth. They have 2x2.66GHz processors. I understand there isn't a single, perfect answer, but it would be nice to get some pointers.




Ubuntu :: Download Files From HTTP Server Remotely

Dec 13, 2010

I have a question about using Ubuntu to download files from an HTTP server remotely and didn't know where to put it, so hopefully it falls under general support. Anyway, I am about to move into a place with an incredibly slow internet connection and a tiny data allowance, and my brother has said that, if possible, I can use his internet connection to download any large files to a box I can just leave at his place; then I can simply come over every few weeks and copy said files to a hard drive and all will be well. The problem is that I am not sure how to do this.

Today I went out and bought a few parts and built a cheap computer with an HDD big enough to hold whatever I need; however, when I got home I realised I had no idea how I was going to handle the software aspect of this. Is there any way that I can access that computer remotely over the internet and schedule fairly large downloads from an HTTP server? Also, after talking to a friend I was told that I need to install the server version of Ubuntu for this to work; is this correct? And, if it's relevant, the computer I have for this uses an "Intel Desktop Board D510M0 + Intel Atom Processor D510", which is 64-bit architecture.
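If the box runs plain Ubuntu (desktop or server, either works) with openssh-server installed, one workable sketch is to SSH in from home and kick off downloads that survive logout. The hostname and URL below are placeholders:

Code:
# from home, log in to the box at your brother's place
# (his router must forward a port, e.g. 22, to the box)
ssh user@brothers-place.example.org

# on the box, start a large download that keeps running after you disconnect
nohup wget -c http://example.com/big-file.iso > download.log 2>&1 &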


Server :: Start Download Files Through Http Protocol With Apache 2?

Jul 2, 2010

I'd like downloads to start when a user clicks certain links. How can I make files download over HTTP with Apache 2?
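If the goal is to force the browser to download certain files instead of displaying them, one common approach (assuming mod_headers is enabled; the extensions are just examples) is to send a Content-Disposition header from the vhost or an .htaccess file:

Code:
<FilesMatch "\.(zip|iso|pdf)$">
    Header set Content-Disposition attachment
</FilesMatch>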


Ubuntu Servers :: Nagios On 10.04 Server Using Apt-get - HTTP WARNING: HTTP/1.1 404 Not Found

Aug 4, 2010

I installed Nagios on my Ubuntu 10.04 server using apt-get, and when I accessed the web console everything was OK. I made some changes to Apache (creating some new virtual sites) and since then Nagios gives me a warning for HTTP with the message: HTTP WARNING: HTTP/1.1 404 Not Found. The sites that I created are working perfectly. I noticed that the attempts are 4/4. Does this need to be reset, or does Nagios automatically reset it once it detects the issue is resolved?
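Nagios clears the attempt counter on its own once the check returns OK, so nothing needs resetting manually; the thing to fix is the check itself, which after the vhost changes is probably requesting / on a host that now answers 404. A quick way to see what the plugin is doing is to run it by hand (the plugin path and hostname below are assumptions):

Code:
/usr/lib/nagios/plugins/check_http -I 127.0.0.1 -H www.example.com -u /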


Server :: Send Files From A Unix Using Http / Curl To A Webserver Running Apache

Jun 9, 2010

I'm trying to send files from a Unix server using HTTP/curl to a Linux webserver running Apache. I get the following error message when I PUT, and the file does not send:

<title>405 Method Not Allowed</title>
</head><body>
<h1>Method Not Allowed</h1>
<p>The requested method PUT is not allowed for the URL
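A stock Apache setup refuses PUT, hence the 405. One sketch of a fix, if PUT is really wanted, is to enable WebDAV for an upload area (this assumes mod_dav/mod_dav_fs are loaded; the paths are placeholders, and the target directory must be writable by the Apache user):

Code:
DavLockDB /var/lock/apache2/DavLock
<Location /uploads>
    Dav On
    Order allow,deny
    Allow from all
</Location>

# then from the Unix server:
curl -T report.txt http://webserver.example.com/uploads/report.txt

In practice you would add authentication inside that Location block, or simply POST the files to a script instead of allowing anonymous PUT.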


Hardware :: USB Harddrive Won't Mount After Cord Was Pulled Out

Jul 16, 2010

I was copying some files over to my Seagate FreeAgent external drive and someone (->me<-) tripped over the USB cord while walking through the area. Now the drive won't mount.
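A cautious first step (the device name below is an assumption; check dmesg for the real one) is to see whether the kernel still recognises the drive and then run a read-only filesystem check:

Code:
dmesg | tail                 # how does the kernel see the drive when it's plugged in?
sudo fdisk -l                # confirm the partition, e.g. /dev/sdb1
sudo fsck -n /dev/sdb1       # read-only check (ext*); for NTFS use ntfsfix from ntfs-3g/ntfsprogs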


Ubuntu Security :: Patch For Sudo That Allows Sudoers Information To Be Pulled From MySQL?

Apr 12, 2011

This may be a stupid (?) question, but does anyone know of a patch for sudo that allows the sudoers information to be pulled from MySQL?
I run multiple servers with multiple people working on them and would like a one-stop update of permissions.
Yes, I could use rsync or the like, but I'm just wondering if this has been done, or could be done.

(Sorry if this is the wrong forum, I'm kinda new around here, posting wise and this seemed to fit. Feel free to move it if it's not)


Server :: Http Server In Red Hat 6 - Extend Web Server Through Virtual Hosting?

Aug 17, 2011

How can I allow certain users to create content in a directory of the HTTP server? For example: I have configured a web server whose default document root is /var/www/html. Now I want to extend my web server through virtual hosting, and I have enabled virtual hosting, but I want the user sumit to be able to create content in /var/www/html/secret, which is the document root for my virtual site.
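A sketch of the usual permission-level answer, using the username and path from the post (the ACL variant needs the acl package and a filesystem mounted with ACL support):

Code:
# simplest: make sumit the owner of that document root
chown -R sumit:sumit /var/www/html/secret

# or keep the current owner and grant sumit write access with an ACL
setfacl -R -m u:sumit:rwx /var/www/html/secret
setfacl -R -d -m u:sumit:rwx /var/www/html/secret   # default ACL so new files inherit it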


CentOS 5 Server :: Gateway Server To Redirect Traffic For Http/smtp/pop3

Apr 6, 2010

I have been beating my head for the last few weeks on this problem, (although I have been taking the wrong approach, it seems).

I need a gateway to direct web traffic to three separate servers/domains. I have been trying to do this with both a DNS server and (separately) an Apache server to forward requests. The DNS server was a no-go, and I can only get Apache to redirect HTTP and FTP.

After Googling this a lot, I believe that what I need is a gateway server to redirect my traffic to the 3 different servers. I have been reading about using NAT and iptables for this and was wondering if anyone had any advice or suggestions. The other thought I had was to use something like pfSense to create the gateway, but I am still reading the documentation and I am unsure if this approach will work.
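For the HTTP part this can be done with Apache as a name-based reverse proxy on the gateway (it needs mod_proxy and mod_proxy_http; the domain names and back-end IPs below are placeholders). SMTP and POP3 can't be split by hostname this way, so they would still need iptables DNAT or separate public IPs:

Code:
NameVirtualHost *:80

<VirtualHost *:80>
    ServerName www.domain1.example
    ProxyPass        / http://192.168.1.11/
    ProxyPassReverse / http://192.168.1.11/
</VirtualHost>

<VirtualHost *:80>
    ServerName www.domain2.example
    ProxyPass        / http://192.168.1.12/
    ProxyPassReverse / http://192.168.1.12/
</VirtualHost>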


Server :: Server Offers PHP File For Download On HTTP But Is Fine On HTTPS

Mar 8, 2011

I have a debian box running Apache2 and PHP5.2.6 lenny.

When a request is made via HTTPS, PHP displays the content fine. If the request is made over HTTP, the file is offered for download rather than being displayed.

I know it's probably something trivial, but I've never seen this issue.

The plot thickens: I can display PHP over HTTP in some directories, but not in others (which offer the file for download).
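One thing worth comparing is whether the PHP handler is actually in effect for the plain-HTTP vhost (and the affected directories); if only the SSL vhost or some directories inherit it, other requests fall back to plain MIME handling and the file gets offered as a download. A sketch of the lines that need to apply, assuming mod_php5:

Code:
AddType application/x-httpd-php .php
AddType application/x-httpd-php-source .phps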


Server :: Squid Accepts Only HTTP Requests But Speaks FTP On The Server Side?

Apr 26, 2011

Here is my query:

The Squid documentation says that Squid accepts only HTTP requests but speaks FTP on the server side when FTP objects are requested.

We call Squid an HTTP and FTP caching proxy server. Does it also cache FTP content? Is it possible to configure FTP clients to use the Squid cache? When we make an FTP request to an FTP site via Squid, will it be bypassed?


Fedora Networking :: Download All The Files In An Http:// Folder?

Sep 18, 2009

Yahoo! is shutting down GeoCities and I need to download all the files in my web folder there. Is there a program that will download all the files automatically?
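wget can mirror a public web folder recursively; a rough sketch (the URL is a placeholder for your GeoCities address):

Code:
wget -r -np -l inf -k -p http://www.geocities.com/yourname/
# -r recursive, -np don't ascend to the parent directory, -l inf unlimited depth,
# -k rewrite links for local browsing, -p also grab images/CSS the pages need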


Debian :: Nginx - View Files In Folder Via HTTP

Feb 13, 2011

How would I make the files in /home/user01/file available on the web as [URL]? Is it possible to have anyone who accesses that link do so as user01?
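A minimal nginx sketch for this (server_name is a placeholder). Note that the files are simply read by the nginx worker user, so visitors are not logged in as user01; they just see whatever that user makes readable to nginx:

Code:
server {
    listen 80;
    server_name example.com;
    location /file/ {
        alias /home/user01/file/;
        autoindex on;        # show a directory listing over HTTP
    }
}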


Programming :: Pull Http Links Out Of Text Files?

Aug 15, 2010

I am trying to figure out a way to pull http links out of text files and then output the results in a log. The text files are in folders like this inside a source directory.

/source
./folder1
...folder1.txt
./folder2
...folder1.txt

[Code]....
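A hedged one-liner that would cover the layout above (adjust the /source path and extension to match; GNU grep assumed):

Code:
grep -rhoE --include='*.txt' 'https?://[^[:space:]"<>]+' /source > links.log
# -r recurse, -h omit filenames, -o print only the matched URL, -E extended regex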


Networking :: Craft A Valid Http/1.1 Request For Getting Http Headers (not The Html File Itself)

Sep 27, 2010

Using netcat, nc(1), craft a valid http/1.1 request for getting http headers (not the html file itself!) for the main index page of www dot aalto dot fi. What request method did you use? Which headers did you need to send to the server? What was the status code for the request? Which headers did the server return? Explain the purpose of each header.

nc -v www dot aalto dot fi 8080
HEAD / HTTP/1.1
host: www dot aalto dot fi
And it returns:
200 OK
Content-Length: 858
Content-Type: text/html
Last-Modified: Thu, 02 Sep 2010 12:46:01 GMT
[Code]....

I really don't know what it means. Question 2: Using netcat, nc(1), start a bogus web server listening on the loopback interface port 8080. Verify with netstat(1) that the server really is listening where it should be. Direct your browser to the bogus server and capture the User-Agent: header. "Direct your browser to the bogus server and capture the User-Agent: header" - I don't understand this question.
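For what it's worth, a sketch of both parts (the site name is written out in place of the obfuscation; standard HTTP is port 80, and the exact nc listen syntax depends on which netcat variant is installed):

Code:
# Question 1 - a valid HEAD request: note the HTTP/1.1 token, the Host: header,
# and the blank line that terminates the request
printf 'HEAD / HTTP/1.1\r\nHost: www.aalto.fi\r\nConnection: close\r\n\r\n' | nc www.aalto.fi 80

# Question 2 - bogus server on the loopback interface, port 8080
nc -l -p 8080 -s 127.0.0.1          # traditional netcat; OpenBSD nc wants: nc -l 127.0.0.1 8080
sudo netstat -tlnp | grep 8080      # verify it is listening where it should be
# then point a browser at http://127.0.0.1:8080/ and read the User-Agent: line nc prints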


Server :: Make Auto Replace Http Url In A Web Server?

Jul 2, 2010

I am setting up a web server on my Linux (CentOS) machine using the Apache web server. It's working well and shows my websites. But when I type my URL into a browser (i.e. I only type "myweb.com"), it just stays as [URL]. Usually, as with other websites (e.g. google.com), the name is automatically replaced with [URL]. But my URL isn't being replaced like that. How do I configure this? I don't know which service I need to look at (i.e. named (BIND), or the Apache web server itself).
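If the goal is simply to have the bare name redirect to the canonical www form, that is normally done in Apache rather than DNS (the www target below is an assumption, since the actual URLs are elided above); DNS just needs A records for both names:

Code:
<VirtualHost *:80>
    ServerName myweb.com
    Redirect permanent / http://www.myweb.com/
</VirtualHost>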


OpenSUSE Network :: Dot Desktop Files Have To Contain Access Rights For Smb/http?

May 27, 2011

Problems with launching data files on the NAS and saving to them are a KDE problem. The .desktop files have to contain access rights for smb/http etc., and even when given these it still will not work. I have mainly concentrated on getting the VLC video player to work, as it is capable of playing from just about any source and comes with codecs etc. Amazing package really.

Pure KDE apps such as KWrite at least work fine. I tried setting up Samba but to no avail.

As dropping a file onto VLC didn't do anything, I created a VLC desktop icon and dragged the NAS file onto that. It plays, and a KDE error message pops up from the Plasma shell - can't find file!

I enabled KDE automount. Its content when it starts is disturbing: it shows my system disks as detachable and not attached! No need to worry though. I selected mount on log-in and attachment where the server was shown. VLC still wouldn't work.

Next I enabled NFS file transfers on the NAS. This has allowed me to use "Open with" directly on an AVI file on the NAS. I can also click to launch them. The remaining problem is opening files on the NAS from within VLC. Up pops the KDE message "you can only select local files". The file manager here seems to be an instance of Dolphin. This suggests that there is going to be a problem saving files to the NAS as well. Looks to be the case. VLC can convert formats and all sorts of things. If I select a file locally and try to convert it and save it to the NAS, up pops "you can only select local files" as soon as I select OK, having set the path and file name.

The strange thing is that working transfers seem to be using CIFS, even though it took enabling NFS to get it partly working via KDE's automount. Dolphin only allows a CIFS setup, which has a distinct advantage in that a direct IP address can be entered. The automount has introduced a very, very long delay before KDE is up and running after a log-in. Samba is even worse in this respect, and both seem to lack a method of direct IP input, which means they have to find the server.

One other aspect: as far as NFS is concerned, from a very recent post elsewhere, Nautilus works. Pass on CIFS. And of course it's all instantaneous and OK on Windows, even on Vista. Enabling the TV protocol on the NAS has confused Vista, as it only wants to connect like that and needs drivers. Might also be down to having NFS enabled though. MS might not like that.

I have filed all of this on Bugzilla if anyone would like to vote - bug number 695648. It seems to me that the CIFS route should be the default for ease of use with many users on home networks. I'm also sure that the problem is basically KDE preventing apps from accessing the NAS.


General :: Make Some Files On Machine Accessible Via HTTP Using Apache?

Mar 6, 2011

I did a wget on the source and built the Apache binaries correctly. Now what do I need to do to make some documents accessible over HTTP (start some services?)? Also, do I need to group all the files I want to make accessible in some directory and make that directory and its contents accessible, or can I just make the individual documents available? I will be providing these links to my colleagues and do not want them to be down, so I need to make sure that the Apache services come up automatically after a reboot. Does Apache have built-in support for this?
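A sketch for a default source build (the /usr/local/apache2 prefix is the configure default; adjust it to whatever --prefix was actually used). Anything under the DocumentRoot, or any directory mapped with an Alias, is reachable, so there is no need to move every file into one place as long as Apache can read them:

Code:
# start the freshly built server
sudo /usr/local/apache2/bin/apachectl start

# documents placed under the DocumentRoot (htdocs by default) are served as-is
cp report.pdf /usr/local/apache2/htdocs/

# start it again on every reboot - the simplest generic option is an rc.local entry
echo '/usr/local/apache2/bin/apachectl start' | sudo tee -a /etc/rc.local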


Ubuntu Installation :: Network Install From ISO - Cannot Transfer Files With Apache HTTP

Feb 24, 2010

I have tftpd-hpa and dhcp3-server up and running. I just want to install the server edition via the network, from the host machine (my laptop, running Ubuntu 9.10), using an ISO file (Ubuntu 8.04 32-bit server edition). I managed to boot the client machine with the PXE netboot technique, but instead of downloading all the files from the internet, I need to do this directly from the ISO. To transfer the ISO contents from host to client, I also installed Apache. I unpacked the ISO file into /var/lib/tftpboot/server/. I created a link to the Apache root: /var/www

Code:
ubuntu@ubuntu:/var/www$ ls
returns => index.html server
server folder is the place where I unpacked the ISO.

My dhcp3-server has this setup and it works well with netboot, but I don't know how to add Apache to the formula to transfer the ISO contents from host to client. The firewall is disabled. This is my edited /etc/dhcp3/dhcpd.conf file.

Code:
host pxeinstall {
hardware ethernet 00:06:29:DE:E3:CD;
fixed-address 192.168.2.4;   # client IP
next-server 192.168.2.2;     # host IP
filename "/server/install/netboot/pxelinux.0";   # relative to tftpboot
}
subnet 192.168.2.0 netmask 255.255.255.0 {
range 192.168.2.2 192.168.2.5;
option routers 192.168.2.1;
}

When I PXE-boot the client, the process comes to a halt when the TFTP server is trying to access the pxelinux.0 file. I got this error:
PXE-T00: Permission denied
PXE-E36: Error received from TFTP server
I have no experience with Apache... so I think there is a problem with my IP addresses. Do I need to use 127.0.1.1 instead of 192.168.2.1 (my router's IP)?
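The PXE-T00 error comes from the TFTP server, not Apache, so the IP addresses are probably fine; it usually means tftpd-hpa cannot read the requested file relative to its root. A sketch of what to check (the paths are the ones from the post):

Code:
ls -l /var/lib/tftpboot/server/install/netboot/pxelinux.0
sudo chmod -R a+rX /var/lib/tftpboot     # TFTP has no credentials, so files must be world-readable
# also confirm that the daemon's root directory setting in /etc/default/tftpd-hpa
# points at /var/lib/tftpboot, then restart tftpd-hpa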


Fedora Servers :: Cannot Start Apache - No Read / Write Access To HTTP Files

Jan 14, 2009

I am trying to set up my web server and make a website run under suexec, but somehow I cannot start Apache: it fails immediately and SELinux gives me errors, and I don't really know what to do with them. It gives me a command to type, but I'm not sure if that will make my server less secure. The SELinux error is as follows:

Code:
Summary:
SELinux prevented httpd reading and writing access to http files.

Detailed Description:
SELinux prevented httpd reading and writing access to http files. Ordinarily httpd is allowed full access to all files labeled with http file context. This machine has a tightened security policy with httpd_unified turned off; this requires explicit labeling of all files. If a file is a cgi script it needs to be labeled with httpd_TYPE_script_exec_t in order to be executed. If it is read-only content, it needs to be labeled httpd_TYPE_content_t; if it is writable content, it needs to be labeled httpd_TYPE_script_rw_t or httpd_TYPE_script_ra_t. You can use the chcon command to change these contexts. Please refer to the man page "man httpd_selinux" or FAQ [URL]. "TYPE" refers to one of "sys", "user" or "staff" or potentially other script types.

Allowing Access:
Changing the "httpd_unified" boolean to true will allow this access: "setsebool
-P httpd_unified=1"

Fix Command:
setsebool -P httpd_unified=1

I will write down how I set up my server so maybe you can see a mistake I made. First I changed my Apache httpd.conf; I added the following to it:
Code:
NameVirtualHost 192.168.1.2:80
<VirtualHost 192.168.1.2:80>
ServerName localhost
DocumentRoot /var/www/html
DirectoryIndex index.html index.htm index.shtml index.php
</VirtualHost>

<VirtualHost 192.168.1.2:80>
SuexecUserGroup ulyaoth ulyaoth
ServerAdmin webmaster@ulyaoth.org
ServerName test.ulyaoth.org
DocumentRoot /var/www/ulyaoth/www/html
ErrorLog /var/www/ulyaoth/logs/error_log
CustomLog /var/www/ulyaoth/logs/access_log common
DirectoryIndex index.html index.htm index.shtml index.php
ScriptAlias /cgi-bin/ /var/www/ulyaoth/www/cgi-bin/
<Directory /var/www/ulyaoth/www/cgi-bin/>
AllowOverride none
Order allow,deny
Allow from all
Options +ExecCGI
AddHandler cgi-script .cgi .pl
</Directory>
</VirtualHost>

Then I created the username "ulyaoth" with the group "ulyaoth", as specified for suexec; then I created all the directories specified in my httpd.conf and ran "chown ulyaoth:ulyaoth (dirname)" on them to set the right user and group.
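Instead of loosening the policy with httpd_unified, the labeling route the SELinux message mentions would look roughly like this for the paths above (httpd_sys_content_t, httpd_sys_script_exec_t and httpd_log_t are the stock types for content, CGI scripts and Apache logs respectively):

Code:
chcon -R -t httpd_sys_content_t     /var/www/ulyaoth/www/html
chcon -R -t httpd_sys_script_exec_t /var/www/ulyaoth/www/cgi-bin
chcon -R -t httpd_log_t             /var/www/ulyaoth/logs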


Networking :: Pulled Off - When Click "Print" And Result Comes Out Of A Printer Connected To Some Other Computer On A LAN

Mar 17, 2010

How is this pulled off: when you click "Print" on a computer and the result comes out of a printer connected to some other computer on a LAN, what goes on? I would like to know this at the deep *hacker* level, and also how to do it in Linux alone, i.e. without Samba or whatever - that is, without Windoze.
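At the protocol level a LAN print job is usually nothing more exotic than a connection to the print server's IPP port (631, what CUPS speaks) or the old LPD port (515); Samba/SMB printing is only one of several transports. With CUPS on both machines, a hedged sketch (the printer and host names are made up):

Code:
# on the client, point a local queue at the remote CUPS printer
lpadmin -p remoteprinter -E -v ipp://printserver.local:631/printers/hp_laserjet
lp -d remoteprinter document.pdf     # send a job to it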


Ubuntu :: Make A FTP And HTTP Server?

Mar 24, 2010

I need to make an HTTP and FTP server in Ubuntu. How do I install and configure them?
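A minimal sketch for a stock Ubuntu box: Apache serves /var/www by default, and vsftpd is configured in /etc/vsftpd.conf.

Code:
sudo apt-get install apache2 vsftpd
sudo service apache2 restart
sudo service vsftpd restart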


Server :: Access_log Does Not Log All Http Accesses?

May 16, 2010

I have some photos posted at [URL]. Into each caption I've added a link to my server to let friends download them in larger sizes. tail -f access_log only displays some of those accesses, and I don't understand why. If I reload a large image page, an entry is recorded and displayed by access_log. What can be happening?


Server :: HTTP Auth From Outside + Allow From Local?

Jul 29, 2010

It's been a while, but I have a few scripts that need to hit a website that's local to that network, but it is also a public site. Currently there is an .htaccess in that folder with this lockdown:

AuthType Basic
AuthName "Restricated"
Require valid-user

Now, can I break that somehow and say (here is my English translation):

[Code]..
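The usual Apache 2.2 recipe for "password from outside, no password from the LAN" is Satisfy Any combined with an Allow for the local range (the AuthUserFile path and the 192.168.1.0/24 network below are assumptions):

Code:
AuthType Basic
AuthName "Restricted"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
Order deny,allow
Deny from all
Allow from 192.168.1.0/24
Satisfy Any      # either a LAN address or a valid user is enough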


Server :: How Many Http Request Are Sent And Received

Aug 20, 2010

Are there any Linux commands to show how many HTTP requests have been sent and received?
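There is no single counter, but a few hedged ways to get at it (the log path, interface and port are assumptions):

Code:
wc -l /var/log/apache2/access.log                     # requests Apache has logged
netstat -an | grep ':80 ' | grep -c ESTABLISHED       # HTTP connections open right now
sudo tcpdump -i eth0 -n 'tcp port 80'                 # watch requests on the wire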


Server :: HTTP To HTTPS On The Same Port ?

Mar 22, 2011

I want to enable SSL on port 2222:

Now this works fine. But I also want the HTTP URL to work and redirect it to HTTPS.

When I visit http://IP:2222 I get :

Quote:

Bad Request

Your browser sent a request that this server could not understand.

Reason: You're speaking plain HTTP to an SSL-enabled server port.

Instead use the HTTPS scheme to access this URL, please.

Hint: [url]

How should I make a request to [url] redirect to [url]?
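Stock Apache cannot answer both plain HTTP and SSL on the same port, so the clean compromise is to keep 2222 SSL-only and have a plain-HTTP listener (port 80 in this sketch, an assumption) redirect to it; the hostname is a placeholder:

Code:
<VirtualHost *:80>
    ServerName www.example.com
    Redirect permanent / https://www.example.com:2222/
</VirtualHost>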


Server :: Https And Http For One Domain Name ?

Mar 9, 2011

I'm using a box running CentOS 5.5 powered by Apache2. On this machine I host several domains and subdomains, managed by Apache's virtual hosts.

Due to a security issue, one subdomain needs to be accessible using either HTTP or HTTPS.

My question is: is it possible to set up a subdomain so that it can be reached over both HTTP and HTTPS? If it's possible, how do I make that happen?
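Yes - the usual pattern is two virtual hosts (one on :80, one on :443) sharing the same DocumentRoot. A sketch with placeholder names and CentOS-style certificate paths:

Code:
NameVirtualHost *:80
NameVirtualHost *:443

<VirtualHost *:80>
    ServerName secure.example.com
    DocumentRoot /var/www/secure
</VirtualHost>

<VirtualHost *:443>
    ServerName secure.example.com
    DocumentRoot /var/www/secure
    SSLEngine on
    SSLCertificateFile    /etc/pki/tls/certs/secure.example.com.crt
    SSLCertificateKeyFile /etc/pki/tls/private/secure.example.com.key
</VirtualHost>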


Server :: Relaying HTTP Requests Sent From Outside The LAN?

Apr 27, 2010

I can SSH to my server, which is on a LAN and accesses the 'Net through a Linksys modem/router. I want to be able to configure the router using its web interface, but the server only has a command-line interface and I can only run text browsers like Lynx, which means that, although I can log onto the router, the JavaScript routines prevent me from configuring it. I can't access the router's web interface from the 'Net because the router is set up to pass any requests on port 80 to the server. Is there any way I can communicate with the router by sending HTTP requests from a browser outside the LAN, having these relayed to the router by the server, and then having the server relay the responses back to my browser?
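Since SSH access to the server already works, an SSH local port forward is usually enough - the router's pages then render (JavaScript included) in the local graphical browser. The router's LAN address below is an assumption:

Code:
# run on the machine outside the LAN
ssh -L 8080:192.168.1.1:80 user@your-server.example.org
# then browse to http://localhost:8080/ - requests are relayed through the server to the router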


CentOS 5 Server :: Using Localdomain's YUM With Http?

Feb 10, 2010

I want to use the HTTP protocol for my localdomain's yum. This is the [base] section of the current local.repo, which uses FTP.

[base]
name=Base repository for localdomain
enabled=1
baseurl=ftp://192.168.100.1/pub/os/i386
gpgcheck=1
gpgkey=file:///etc/pki/rpm-gpg/RPM-GPG-KEY-CentOS-5
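One sketch: leave the tree where the FTP server already publishes it, expose the same directory through Apache on 192.168.100.1, then switch the baseurl. The /var/ftp/pub path is an assumption (the usual vsftpd anonymous root):

Code:
# on 192.168.100.1, in httpd.conf:
Alias /pub /var/ftp/pub
<Directory /var/ftp/pub>
    Options Indexes
    Order allow,deny
    Allow from all
</Directory>

# on the clients, in local.repo:
baseurl=http://192.168.100.1/pub/os/i386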


Ubuntu :: Corrupt Usb-stick - Close Gtk Button And Pulled The Stick Out Of Pc

Oct 1, 2010

I was writing a .img file to my USB stick with ImageWriter, but it didn't seem to do anything, so I clicked the GTK close button and pulled the stick out of my PC. Now my PC gives me an error when I try to open the stick. Is there any way to fix this? I can use Win XP Pro, Win XP Media Center, Win 7 Starter, Ubuntu 9.10 and Ubuntu 10.04.
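The half-written image very likely destroyed the filesystem, so the practical fix is to repartition and reformat the stick (this erases it). The device names below are assumptions - check dmesg for the real one:

Code:
dmesg | tail                             # identify the stick, e.g. /dev/sdb
sudo fdisk -l /dev/sdb                   # see what is left of the partition table
sudo mkfs.vfat -n USBSTICK /dev/sdb1     # recreate a FAT filesystem on the first partition
# if the partition table itself is gone, recreate a single partition with fdisk first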







