Ubuntu :: Make An FTP And HTTP Server?
Mar 24, 2010
I need to set up an HTTP and FTP server on Ubuntu. How do I install and configure them?
I am setting up a web server on my Linux (CentOS) machine using the Apache web server. It works well and serves my websites. But when I type just my URL into a browser (i.e. only "myweb.com"), it only becomes [URL]. As far as I know, other websites (e.g. google.com) automatically replace the name with [URL]. My URL is not being replaced like that. How do I configure this? I don't know which service I need to look at: named (BIND) or the Apache web server itself?
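The usual answer has two pieces: a DNS record for the bare name (in named/BIND) and a small redirect virtual host in Apache that sends the bare name on to the canonical www name. A minimal sketch only; the names myweb.com / www.myweb.com and the DocumentRoot path are placeholders:
Code:
<VirtualHost *:80>
    ServerName myweb.com
    # visitors who type the bare name are sent on to the www host
    Redirect permanent / http://www.myweb.com/
</VirtualHost>

<VirtualHost *:80>
    ServerName www.myweb.com
    DocumentRoot /var/www/myweb
</VirtualHost>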
How can I set my server to listen on a different port for HTTP access? I would like to use port 8080 (to circumvent ISP blocks). Can I do the same thing for SFTP connections?
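If the web server is Apache, the port is set with the Listen directive; SFTP is carried over SSH, so its port is changed in sshd_config rather than in any web or FTP configuration. A hedged sketch; the file paths assume a Debian/Ubuntu layout:
Code:
# /etc/apache2/ports.conf
Listen 8080
NameVirtualHost *:8080

# /etc/ssh/sshd_config  (SFTP listens wherever sshd does)
Port 2222

# restart both services afterwards, e.g.:
#   /etc/init.d/apache2 restart && /etc/init.d/ssh restart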
I installed Nagios on my Ubuntu 10.04 server using apt-get, and when I accessed the web console everything was OK. I then made some changes to Apache (creating some new virtual sites), and since then Nagios gives me a warning for HTTP with the message: HTTP WARNING: HTTP/1.1 404 Not Found. The sites that I created are working perfectly. I noticed that the attempts are 4/4. Does this need to be reset, or does Nagios reset it automatically once it detects the issue is resolved?
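Nagios clears the warning by itself once the check succeeds again, so the 4/4 attempts counter needs no manual reset. A common cause of the 404 after adding name-based virtual hosts is that the stock check_http probe now lands on a vhost that no longer serves the old URI. One hedged fix is to tell the plugin which Host header and URI to test; the command name, vhost and plugin path below are assumptions:
Code:
# e.g. in /etc/nagios3/commands.cfg
define command{
        command_name    check_http_vhost
        command_line    /usr/lib/nagios/plugins/check_http -I '$HOSTADDRESS$' -H www.example.com -u /
        }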
I've set up an FTP server and it works fine. Now the idea is to make it viewable over HTTP. I've pointed the DNS at the specific IP, I've installed apache2 (using Debian), and I've configured apache2.conf (as much as I know how), but still nothing happens... The idea is for the files to be viewable/downloadable through [URL], but for the directories in [URL] not to be seen (restricted)... The FTP directory is /ftp/username. I tried this in apache2.conf:
Code:
NameVirtualHost xxxx:80
<VirtualHost xxxx:80>
    ServerName ftp.xxxx.org
    # DocumentRoot (rather than ServerPath) makes the FTP tree the web root
    DocumentRoot /ftp/username
    <Directory /ftp/username>
        Options +Indexes
        Order allow,deny
        Allow from all
    </Directory>
</VirtualHost>
I did a wget on the source and built the Apache binaries correctly. Now what do I need to do to make some documents accessible over HTTP (start some services?)? Also, do I need to group all the files I want to make accessible into one directory and make that directory and its contents accessible, or can I just make the individual documents available? I will be providing these links to my colleagues and do not want them to be down, so I need to make sure the Apache services come up automatically after a reboot. Does Apache have some built-in support for this?
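With a source build the pieces are apachectl for start/stop and DocumentRoot (plus Alias, if you want extra directories) for deciding what gets served; there is no built-in boot integration, so apachectl is hooked into the init system by hand. A rough sketch, assuming the default install prefix /usr/local/apache2:
Code:
# start / stop the freshly built server
/usr/local/apache2/bin/apachectl start
/usr/local/apache2/bin/apachectl stop

# anything under the DocumentRoot (htdocs by default) is reachable over HTTP
mkdir -p /usr/local/apache2/htdocs/docs
cp report.pdf /usr/local/apache2/htdocs/docs/

# crude reboot survival: launch it from rc.local
# (make sure the line lands before any final "exit 0")
echo '/usr/local/apache2/bin/apachectl start' >> /etc/rc.local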
View 2 Replies View Relatedhow do i able to allow some users that are able to create content in directory of http server. For example: i have configured a web server which have default document root /var/www/html, now i want to extend my web server through virtual hosting , i have enable virtual hosting, but i want that user sumit is able to create content in /var/www/html/secret. which is the document root for my virtual site?
I have been beating my head against this problem for the last few weeks (although I have been taking the wrong approach, it seems).
I need a gateway to direct web traffic to three separate servers/domains. I have been trying to do this with both a DNS server and (separately) an Apache server to forward requests. The DNS server was a no-go, and I can only get Apache to redirect HTTP and FTP.
After Googling this a lot, I believe that what I need is a gateway server to redirect my traffic to the three different servers. I have been reading about using NAT and iptables for this and was wondering if anyone had any advice/suggestions. The other thought I had was to use something like pfSense to create the gateway, but I am still reading the documentation and I am unsure whether this approach will work.
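One thing worth noting: plain NAT/iptables can only split traffic by IP address or port, not by domain, because the Host: header lives inside the HTTP payload. A common alternative is to keep all public DNS names pointing at the gateway and let Apache there act as a name-based reverse proxy. A sketch only; it assumes mod_proxy and mod_proxy_http are enabled, and the names and internal addresses are placeholders:
Code:
NameVirtualHost *:80

<VirtualHost *:80>
    ServerName site1.example.com
    ProxyPreserveHost On
    ProxyPass        / http://192.168.0.11/
    ProxyPassReverse / http://192.168.0.11/
</VirtualHost>

<VirtualHost *:80>
    ServerName site2.example.com
    ProxyPreserveHost On
    ProxyPass        / http://192.168.0.12/
    ProxyPassReverse / http://192.168.0.12/
</VirtualHost>

# a third block for site3.example.com -> 192.168.0.13 follows the same pattern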
I have a Debian box running Apache2 and PHP 5.2.6 (Lenny).
When a request is made via HTTPS, PHP renders the content fine. If the request is made over HTTP, the file is offered for download rather than being displayed.
I know it's probably something trivial, but I've never seen this issue before.
The plot thickens: I can display PHP over HTTP in some directories but not in others (which offer the file for download).
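A frequent cause of exactly this split behaviour is that the PHP handler is only wired up (or only allowed) in the HTTPS virtual host or in certain directories, so elsewhere .php files are served as plain downloads. A hedged check for a Lenny-era mod_php setup; the directive placement is an assumption:
Code:
# make sure mod_php is enabled at all
a2enmod php5

# inside the *:80 virtual host (or the affected <Directory> blocks),
# confirm .php is mapped to the handler and nothing switches the engine off:
<Directory /var/www/>
    AddType application/x-httpd-php .php
    php_admin_flag engine on
</Directory>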
Here is my query:
The Squid documentation says that Squid accepts only HTTP requests but speaks FTP on the server side when FTP objects are requested.
We call Squid an HTTP and FTP caching proxy server. Does it also cache FTP content? Is it possible to configure FTP clients to use the Squid cache? When we make an FTP request to an FTP site via Squid, will it be bypassed?
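For what it's worth, the stock squid.conf already treats ftp:// objects as cacheable when they are requested through the proxy by an HTTP client (for example a browser asking Squid for ftp://host/file); native FTP clients, on the other hand, cannot talk to Squid at all. The default refresh rule for FTP objects looks like this:
Code:
# from the default squid.conf: ftp:// objects may be cached for up to a week
refresh_pattern ^ftp:           1440    20%     10080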
Using netcat, nc(1), craft a valid HTTP/1.1 request for getting the HTTP headers (not the HTML file itself!) for the main index page of www.aalto.fi. What request method did you use? Which headers did you need to send to the server? What was the status code for the request? Which headers did the server return? Explain the purpose of each header.
nc -v www.aalto.fi 8080
HEAD / HTTP/1.1
Host: www.aalto.fi
And it returns:
200 OK
Content-Length: 858
Content-Type: text/html
Last-Modified: Thu, 02 Sep 2010 12:46:01 GMT
[Code]....
I really don't know what this means. Question 2: Using netcat, nc(1), start a bogus web server listening on the loopback interface, port 8080. Verify with netstat that the server really is listening where it should be. Direct your browser to the bogus server and capture the User-Agent: header. It is the part "Direct your browser to the bogus server and capture the User-Agent: header" that I don't understand.
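For question 2 the idea is simply to let netcat play the server: bind it to 127.0.0.1:8080, confirm the listener with netstat, then point a browser at it and read the request it prints, which includes the User-Agent: header. A sketch; the exact flags differ between the traditional and OpenBSD netcat variants:
Code:
# traditional netcat syntax (OpenBSD nc would be: nc -l 127.0.0.1 8080)
nc -l -p 8080 -s 127.0.0.1

# in a second terminal, verify the listener
netstat -tln | grep 8080

# then browse to http://127.0.0.1:8080/ ; the browser's request headers,
# including User-Agent:, appear on nc's standard output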
I'd like to report an issue I've had with the Ubuntu server ISO. I downloaded ubuntu-9.10-server-i386.iso over HTTP from Ubuntu's website and burned it to a CD. It doesn't work well: I got an error from udevadm (sys/devices/pci0000 etc.). I thought it was a hardware problem, but it seems that the ISO itself is corrupted. I checked the MD5 checksum and it does not match. Then I downloaded the same ISO a second time (over HTTP) and had the same problem.
So it seems to me that the ubuntu-9.10-server-i386.iso available over HTTP is not the same as the torrent one. Maybe I'm wrong. Anyway, if I'm right, I hope this information will be useful to the administrators.
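For reference, the verification is just md5sum against the MD5SUMS file published next to the ISO; the mirror URL below is only an example:
Code:
md5sum ubuntu-9.10-server-i386.iso
# compare the printed hash with the MD5SUMS file for the release, e.g.
#   http://releases.ubuntu.com/9.10/MD5SUMS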
I cannot get VMware Server to work properly running on Ubuntu Server 9.04.
When trying to access the web interface, I have to highlight the URL and keep hitting Enter several times to reach the login page, and after logging in it is really slow and nothing works; I cannot create virtual machines.
I added VMware Server to an existing LAMP server.
I have a registered domain that resolves (via DynDNS) xxx.mydomain.com to my external-facing router; easy, no issues there. Behind the router I have several machines (some VMs) running web servers, mail, etc. What I want to be able to do is redirect the external traffic, based on xxx, to the relevant internal machine and serve the content back to the external world.
I have tried using an HTTP rewrite of xxx.mydomain.com to the relevant machine and it works fine from within my network; externally, however, the redirect fails because the public DNS servers have no record of the internal DNS setup in my network (obviously).
So is there anything I can do to get xxx. recognized externally? I'm only just starting to get my head around how DNS, HTTP, TCP, etc. all hang together. Am I barking up the wrong tree with rewrite? Should I be looking at proxies?
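A proxy is the usual answer here: the router forwards port 80 to one internal box, and that box reverse-proxies each xxx name onward, so external clients never need to resolve the internal addresses. A minimal hedged virtual host in the same spirit as the gateway sketch earlier; the name and internal IP are placeholders:
Code:
<VirtualHost *:80>
    ServerName xxx.mydomain.com
    ProxyPreserveHost On
    ProxyPass        / http://10.0.0.21/
    ProxyPassReverse / http://10.0.0.21/
</VirtualHost>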
I have a question about using Ubuntu to download files from an HTTP server remotely, and I didn't know where to put it, so hopefully it falls under general support. Anyway, I am about to move into a place with an incredibly slow internet connection and a tiny data allowance. My brother has said that, if possible, I can use his internet connection to download any large files to a box I can leave at his place; then I can come over every few weeks, copy those files to a hard drive, and all will be well. The problem is that I am not sure how to do this.
Today I went out, bought a few parts and built a cheap computer with a HDD big enough to hold whatever I need; however, when I got home I realised I had no idea how I was going to handle the software side of this. Is there any way I can access that computer remotely over the internet and schedule fairly large downloads from an HTTP server? Also, after talking to a friend I was told that I need to install the server version of Ubuntu for this to work; is this correct? If it's relevant, the computer uses an "Intel Desktop Board D510M0 + Intel Atom Processor D510", which is 64-bit.
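A hedged way to do this with any Ubuntu flavour (desktop or server, it only needs an SSH daemon): install openssh-server, reach the box through a port forward on your brother's router, and run long downloads inside screen so they keep going after you disconnect. The hostname and URL below are placeholders:
Code:
# on the download box, once
sudo apt-get install openssh-server screen

# later, from anywhere
ssh user@brothers-house.dyndns.org
screen                                   # start a detachable session
wget -c http://example.com/big-file.iso  # -c resumes interrupted downloads
# Ctrl-a d detaches; "screen -r" re-attaches on the next login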
I've been looking for the last few days for a good how-to on setting up a home server using Ubuntu 10.10. I have found several, looked them over, installed the server onto a system and started to set it up. Now, though, I cannot access anything via HTTP, not even the PHP info page. I want to start from scratch, but I'm struggling to find the best tutorial for doing so. The plan is to set it up to hold MP3s and allow them to be downloaded or played on other computers on the home network. I will also be looking at setting up other things, such as a database that can be accessed. I would like to be able to use both HTTP and FTP on this if possible.
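Before wiping and starting over, it may be worth confirming the web stack is actually installed and answering locally. A sketch for Ubuntu 10.10; the package list is the usual LAMP assumption:
Code:
sudo apt-get install apache2 php5 libapache2-mod-php5 mysql-server
sudo service apache2 restart

# quick local test: should print the default "It works!" page or your phpinfo()
wget -O- http://localhost/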
I have some photos posted at [URL]. In each caption I've added a link to my server to let friends download them in larger sizes. tail -f access_log only displays some of those accesses, and I don't understand why. If I reload the large-image page, an entry is recorded and displayed by access_log. What can be happening?
I'm working on an application that requires a large amount of storage space, and I want to handle storage in-house (much cheaper than, say, S3), so we will have multiple servers (initially 4) with large amounts of storage (6TB each). The storage needs to be very flexible and configurable; each piece of data should be replicated on at least 2 servers, and it must be easily readable/writable through either an API or a UNIX device/file/folder like a normal drive, I don't mind which. We must also be able to easily offload content to our HTTP CDN (Edgecast); it doesn't need built-in HTTP support, but if it doesn't have it, I'm going to have to write something to get the files onto HTTP so they can be pulled by the CDN.
I've looked at a lot of solutions including
Eucalyptus Walrus
OpenStack Object Storage
MogileFS
MongoDB GridFS (I'm not sure why, it just sounded cool =) )
and some others which I can't remember
All the servers will be running RHEL 6; they have 4x1.5TB drives which will be RAID1'd into a single partition. All the servers have 1GB/s connections between them and 100MB/s connections to the internet with unlimited bandwidth. They have 2x2.66GHz processors. I understand there isn't a single, perfect answer, but it would be nice to get some pointers.
It's been a while, but I have a few scripts that need to hit a website that is local to that network but is also a public site. Currently there is an .htaccess in that folder with this lockdown:
AuthType Basic
AuthName "Restricated"
Require valid-user
Now, can I relax that somehow and say (here is my English translation):
[Code]..
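With Apache 2.2-era syntax the usual trick is Satisfy Any: requests from whitelisted local addresses get in without a password, while everyone else still has to authenticate. A hedged sketch, assuming the scripts come from 192.168.1.0/24 and an htpasswd file already exists:
Code:
AuthType Basic
AuthName "Restricted"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user

Order deny,allow
Deny from all
Allow from 192.168.1.0/24
Satisfy Any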
Are there any Linux commands to see how many HTTP requests have been sent and received?
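There is no single built-in counter, but two common approximations are counting completed requests in the web server's access log and watching live port-80 connections; the log path below is an assumption (Debian/Ubuntu Apache layout):
Code:
# requests served so far (one log line per request)
wc -l /var/log/apache2/access.log

# only the GET requests, say
grep -c '"GET ' /var/log/apache2/access.log

# TCP connections currently involving port 80
netstat -tan | grep ':80 ' | wc -l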
I want to enable SSL on port 2222.
This works fine. But I also want the plain HTTP URL to work and redirect to HTTPS.
When I visit http://IP:2222 I get:
Quote:
Bad Request
Your browser sent a request that this server could not understand.
Reason: You're speaking plain HTTP to an SSL-enabled server port.
Instead use the HTTPS scheme to access this URL, please.
Hint: [url]
How should I make a request to [url] redirect to [url]?
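A single Apache port can speak either plain HTTP or SSL, not both, which is what the error message is saying. The usual workaround is to answer plain HTTP on a second port (port 80 in this sketch, an assumption) and redirect it to the SSL port; the hostname is a placeholder:
Code:
<VirtualHost *:80>
    ServerName panel.example.com
    Redirect permanent / https://panel.example.com:2222/
</VirtualHost>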
I'm using a box running CentOS 5.5 with Apache2. On this machine I host several domains and subdomains, managed by Apache's virtual hosts.
Due to a security issue, one subdomain needs to be accessible using either HTTP or HTTPS.
My question is: is it possible to set up a subdomain so that it can be reached using both HTTP and HTTPS? If it is possible, how do I make it happen?
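Yes, this is possible: the usual arrangement is two virtual hosts for the same subdomain, one on *:80 and one on *:443, sharing the same DocumentRoot. A sketch with placeholder names and certificate paths, assuming mod_ssl is installed:
Code:
<VirtualHost *:80>
    ServerName sub.example.com
    DocumentRoot /var/www/sub
</VirtualHost>

<VirtualHost *:443>
    ServerName sub.example.com
    DocumentRoot /var/www/sub
    SSLEngine on
    SSLCertificateFile    /etc/pki/tls/certs/sub.example.com.crt
    SSLCertificateKeyFile /etc/pki/tls/private/sub.example.com.key
</VirtualHost>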
I can SSH to my server, which is on a LAN and reaches the 'Net through a Linksys modem/router. I want to be able to configure the router using its web interface, but the server only has a command-line interface and I can only run text browsers like Lynx, which means that although I can log onto the router, the JavaScript routines prevent me from actually configuring it. I can't access the router's web interface from the 'Net because the router is set up to pass any requests on port 80 to the server. Is there any way I can communicate with the router by sending HTTP requests from my browser outside the LAN, having these relayed to the router by the server, and then having the server relay the responses back to my browser?
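SSH local port forwarding does exactly this: the server relays your browser's requests to the router and the responses back, JavaScript and all. A sketch, assuming the router answers on 192.168.1.1 (adjust for your LAN) and that you can already SSH to the server from outside:
Code:
# run on the external machine
ssh -L 8080:192.168.1.1:80 user@your-server.example.com

# then point the local browser at:
#   http://localhost:8080/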
I want to use the HTTP protocol for my local domain's yum. This is the base tag of the current local.repo, which is using FTP:
[base]
name=Base repository for localdomain
enabled=1
baseurl=ftp://192.168.100.1/pub/os/i386
gpgcheck=1
gpgkey=file:///etc/pki/rpm-gpg/RPM-GPG-KEY-CentOS-5
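Assuming the same /pub/os/i386 tree is also exported by a web server on 192.168.100.1 (that part is an assumption), only the baseurl scheme needs to change:
Code:
[base]
name=Base repository for localdomain
enabled=1
baseurl=http://192.168.100.1/pub/os/i386
gpgcheck=1
gpgkey=file:///etc/pki/rpm-gpg/RPM-GPG-KEY-CentOS-5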
I'm thinking about ways to limit access to my web server. It runs Nginx and PHP over FastCGI. The server contains a large amount of information. The data is freely available and no authentication is required, but other companies might like to mirror it and use it on their own servers.
Requests could be limited at different levels: IP, TCP, HTTP (by nginx) or by the PHP application. I found some solutions (like Nginx's limit_req_zone directive), but they do not solve the second part of the problem: there is no way to define a whitelist of clients who are allowed to use the data.
I thought about an intelligent firewall that would limit requests on a per-IP basis, but I have yet to find such a device. Another idea was to hack together some scripts that would parse the log file every minute and modify iptables to ban suspicious IPs; that would take days to build, and I doubt such a system would survive, say, 1000 requests per second.
Perhaps some HTTP proxy, like Squid, could do this?
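Nginx itself can combine a whitelist with limit_req: map whitelisted addresses to an empty key, which exempts them from accounting, and throttle everyone else. A sketch for the http{} block; the addresses and rates are placeholders:
Code:
geo $limited {
    default        1;
    10.0.0.0/8     0;   # internal clients, exempt (placeholder)
    203.0.113.5    0;   # partner mirror, exempt (placeholder)
}

map $limited $limit_key {
    1 $binary_remote_addr;
    0 "";                # an empty key is not rate-limited
}

limit_req_zone $limit_key zone=perip:10m rate=10r/s;

server {
    location / {
        limit_req zone=perip burst=20 nodelay;
        # ... fastcgi_pass to PHP as before ...
    }
}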
OK, so I've been all over Google trying to find an answer as to why my MPD server won't properly output over its built-in httpd. When I try to connect to it from any source, it gives me a "404: Entity Not Found" error. There hasn't been any useful information out there to solve this problem. Help please! Attached is my mpd.conf file. All my permissions and firewalls are set up properly, AFAIK.
I am running Apache httpd-2.2.3-43.el5.centos.3. When I restart httpd, it reports the following error: "Invalid command 'JkSet', perhaps misspelled or defined by a module not included in the server configuration".
Do I need to install anything like Tomcat, or include some configuration setting in Apache? Kernel version: 2.6.18-194.32.1.el5.
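The error means Apache hit a Jk* directive from mod_jk (the Apache-to-Tomcat connector) without that module being loaded. The hedged options are to install and load the connector, or to comment out the Jk* lines if Tomcat is not actually in use; the package name and module path below vary by repository:
Code:
# install the connector (the package may be named mod_jk or tomcat-connectors,
# depending on which repository provides it)
yum install mod_jk

# and make sure httpd.conf loads it before any Jk* directive:
LoadModule jk_module modules/mod_jk.so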
My question is rather simple, but I couldn't find any answer yet. I have a Debian box connected to the internet through an ad-hoc wireless connection with a Win7 box. Could I run an HTTP server on the Linux box and access it from the "outside" somehow, given that my Linux box has a "private network" type IP, i.e. 192.168.137.12?
I have a DVB-T (USB) card in a Debian Lenny EEE nettop PC, and it works fine. I'd like to know whether it would be possible to make the DVB-T stream available through an HTTP server. I mean I'd like to be able to select a channel from a web page and view it in an embedded player on that page. I am not looking for a client-side streaming solution; I'd like everything to be done on the server side. Does anyone know if this is possible and how to achieve it?
I need openSUSE 11.2 to use my proxy server here in the office, which is specified as hostname/IP and port 8080 only, not as an http:// URL. The problem is that in YaST2 I don't have the option of entering the proxy that way; it wants an http:// URL. I've been using openSUSE on and off since version 9 (a great flavour, by the way, my favourite): as easy as you need it to be and just as complicated as you want it to be, a perfect mix.