I have apache2 installed along with php5 and mysql 5.0. I have tried installing them individually as well as installing the full LAMP stack through the terminal. My problem is that PHP files are not parsed; the browser just downloads them instead. I did change my Apache document root to nick/projects (nick is my username). If I navigate to localhost/index.html it works fine, but the PHP files I have put there are not parsed. I also have Rails working fine in that folder under apache2.
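If no PHP handler is loaded, Apache has nothing to parse .php files with and just serves them raw, so a first check is whether the module is installed and enabled. A minimal sketch of the usual fix (package name assumes the php5-era repositories):

Code:
sudo apt-get install libapache2-mod-php5
sudo a2enmod php5
sudo /etc/init.d/apache2 restart

If the module is already enabled, the next suspect is the <Directory> block you added for nick/projects overriding the handler.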
I've got an Apache-SVN server up and running fine, but I'm struggling with an irritating problem. I need the Apache server to display .vbs, .cs, .vb, .sh, .pl, .c, .h, .cpp, etc. files in the browser. Whenever our users click on a script they get a download dialog instead of the script being displayed in the browser as text/plain. I have added:
Code:
AddType text/plain .vbs
to /etc/apache2/mods-enabled/mime.conf, but it seems to be getting ignored. How can I tell Apache to treat script files as plain text?
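One possibility worth checking: if these files are served out of the repository by mod_dav_svn, the Content-Type comes from each file's svn:mime-type property and AddType is bypassed, which would explain the directive being ignored. A hedged sketch of both routes (the /svn location and file name are assumptions):

Code:
# plain files served by Apache itself: put the types in the vhost or a <Location>
<Location /svn>
    AddType text/plain .vbs .cs .vb .sh .pl .c .h .cpp
</Location>

# files served by mod_dav_svn: set the property on the file instead
svn propset svn:mime-type text/plain deploy.vbs
svn commit -m "serve as plain text" deploy.vbs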
I'm having a very strange problem with my Ubuntu apache2 server running WordPress. I want to download media files (from within a Flash MP3 player on the site, or by link [url]), but the file transfer just stops after a while, at least sometimes, at random positions. After that I have to clear the browser's cache and try again.
It is really annoying, since it is my band's website and we want to share our songs with our friends. I have checked from several clients; it seems to happen everywhere (Linux, Mac, and Windows clients).
I would like to set up tcpdump to rotate the log file every hour and retain files for the last 14 days, but I don't think any combination of -C and -W allows that (at least I haven't been able to figure it out), so instead I am trying to rotate the files every X MB and retain the last 20 files. This seems fairly simple with '-C X -W 20', but I am having some trouble customizing the names of the log files. I tried '-w capture-$(date +%Y-%M-%d-%H:%M-)', thinking each file would start with the current date and time, but all files use the date and time when the capture was started (the shell expands $(date) once, at launch), so the only difference is the number at the end (added by -W). Is there a way to customize the file names so each carries the date and time when that file was started? In fact, if I can do that, I don't need the numbers that -W appends at the end, but I don't know how to get rid of them.
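If your tcpdump build supports -G (version 4.0 and later), it rotates on time rather than size and expands strftime(3) escapes in the -w filename, which gives each file its own start timestamp; retention can then be handled by cron. A sketch, with the interface and directory as assumptions:

Code:
# new file every 3600 seconds, each named for its own start time
tcpdump -i eth0 -G 3600 -w '/var/log/caps/capture-%Y-%m-%d_%H%M.pcap'

# cron entry to drop captures older than 14 days
0 * * * * find /var/log/caps -name 'capture-*.pcap' -mtime +14 -delete

Combining -G with -W makes tcpdump exit after that many files rather than loop, so the cron cleanup is the safer way to cap retention.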
How should I go about rotating files that end with a date stamp? This is the configuration I have to rotate my Apache access files, but it is not working:
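Since the stanza itself is missing above, here is a hedged baseline that produces date-stamped rotations, with the log path assumed:

Code:
/var/log/apache2/access.log {
    daily
    rotate 14
    dateext
    compress
    delaycompress
    missingok
    notifempty
    postrotate
        /etc/init.d/apache2 reload > /dev/null 2>&1 || true
    endscript
}

The dateext option is what appends a date stamp to each rotation instead of a numeric suffix.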
I am trying to configure logrotate on our APP/DB servers. As per my backup policy, logs are compressed daily and moved to a central storage device.
My Tomcat generates several application logs, some with a date extension and some with just a .log extension, e.g. app.log, app.log.2010-10-23-14, catalina.out, catalina.2010-10-25.log, etc.
My current Tomcat logrotate configuration, /etc/logrotate.d/tomcat, is:
Code:
/usr/local/tomcat/logs/*log {
[code]....
But it is rotating only the logs with a .log extension, i.e. app.log.2010-10-23-14 (with a date extension) is not rotated. If I put "*" instead of "*log", it rotates all files, including already-rotated ones. How can I rotate the files that have a date extension? Also, I don't want to keep rotated logs for more than 3 days.
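One approach, since the date-stamped files have effectively been rotated by Tomcat already: list only the live logs in the logrotate stanza and prune the date-stamped ones with cron. A sketch (the paths follow the post; the find pattern is an assumption):

Code:
# /etc/logrotate.d/tomcat -- rotate only the live logs
/usr/local/tomcat/logs/app.log /usr/local/tomcat/logs/catalina.out {
    daily
    rotate 3
    compress
    missingok
    notifempty
    copytruncate
}

# cron entry to delete date-stamped logs older than 3 days
0 2 * * * find /usr/local/tomcat/logs -name '*.20[0-9][0-9]-*' -mtime +3 -delete

copytruncate avoids restarting Tomcat on each rotation, at the cost of possibly losing a few lines during the copy.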
I am writing a script to keep only the last three versions of my tcpdump files. Due to the version of tcpdump, I must use -C and cannot use -G. Using -C generates a new file after X MB have been written and appends a .x to each new one. The problem is that these files fill up the disk too quickly. The main part of the script will kill tcpdump when a certain condition is met, but in the meantime I need to purge and keep only, say, the last three iterations of the dump file. So for example, given dump.pcap.1, dump.pcap.2, dump.pcap.3, dump.pcap.4, and dump.pcap.5, I'd like the script to look at the datestamps and delete dump.pcap.1 and dump.pcap.2, since the other three are the newest. How can I compare the files matching dump.pcap.*, check their dates, and keep only the three youngest?
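For the purge itself, sorting by modification time and skipping the first three does the comparison for you. A sketch, assuming the capture names contain no whitespace:

Code:
# newest first; keep the first three, remove the rest
ls -1t dump.pcap.* 2>/dev/null | tail -n +4 | xargs -r rm -f --

ls -t orders by modification time, so this keys off the datestamps exactly as described rather than the .x suffix.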
I think my apache2 is owned by and running as root. I don't know if I installed it like a noob a while ago, but I would like to secure it now, especially since I just figured out how to set up virtual hosts, and I may eventually want to let people host sites on my server; I obviously don't want to have to give them root access. How can I confirm that apache2 is running as root, and how do I take it away from the root user?
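For the check itself: on Ubuntu the parent apache2 process legitimately runs as root (it needs that to bind port 80), while the worker processes should run as www-data, which is set via /etc/apache2/envvars. A quick probe:

Code:
ps -C apache2 -o user,pid,cmd
grep RUN_USER /etc/apache2/envvars
# expected: export APACHE_RUN_USER=www-data

If the workers themselves show root, it is the User/Group settings that need fixing, not the package install.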
I am using Ubuntu Server 10.04 64-bit. I get an apache2 start-up error after setting up my certificates and configuring apache2 for HTTPS. At the point during start-up where I need to type in my certificate passphrase, the machine locks up because the apache2 process is hung. I reboot if needed and switch to console tty2. When I type in the command:
Code:
I get the following error.
Code:
To correct the problem I do this.
Code:
From the output of the command above I look for the apache2 process and make note of its process ID. Then, I kill that process ID. For example, the command below has an apache2 process ID of 1131.
Code:
Next, I run this command:
Code:
The apache2 server starts up and asks for my certificate password, accepts it when I type it in, and runs perfectly fine afterward.
Fortunately this instance of Ubuntu Server is running inside a VMWare virtual machine. I can just "pause" the virtual machine if I need to rather than going through this crude and tedious start up process too frequently.
Crude and tedious are feelings I'm having too frequently lately with Ubuntu Server.
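For reference, the usual way out of this start-up loop is to take the passphrase off the key so apache2 can start unattended; a sketch, assuming the key file is called server.key and run from its directory:

Code:
sudo openssl rsa -in server.key -out server.key.insecure
sudo mv server.key server.key.secure
sudo mv server.key.insecure server.key
sudo chmod 600 server.key

Keep the .secure copy somewhere safe; the trade-off is that anyone who can read the new key file can use it.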
So I've got a box running 10.04 LTS Server, and on it is running the latest build of Apache2. It's a home box, a server set up for the sole purpose of experimenting and having fun. So far, the fun is mostly breaking it over SSH and then fixing it when I get home and can log into it via recovery. What I really want is to get this box set up with Apache2 the way it seems like it was designed: Apache2 serving web pages from its default location (/var/www), while also being able to log in and upload/download web page files to that directory over SCP or SFTP. I keep hitting snags. Here's what I've done so far:
1. The server is set up in a DMZ at home and my router updates a Dynamic DNS record; so far I can SSH into it no problem.
2. Apache is working. I get my "it works!" page when I enter either the IP or the dyndns domain name.
3. SFTP is sort of working. I can log on using WinSCP and see the files and download them, but I can't upload to the default directory with my normal login.
Here are my issues: 1. I want to set up Apache *correctly*. To me, that means leaving it pointing to the default directory, but still being able to upload to that directory. I have not enabled (and probably will not enable) the root account. I've set the permissions to 755. I've tried chown'ing the directory, but then it seems I can't view the webpages. As a workaround, I created a www folder in my home directory and pointed Apache2 there in the 'default' file in /etc/apache2/sites-available. The changes read as follows:
That gives me a workaround for the default directory SFTP issue, but I would rather learn what I need to do to have it set up and working with default values.
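For the record, the usual default-layout answer is group permissions rather than moving DocumentRoot: put your login in Apache's group and make /var/www group-writable. A sketch, with youruser standing in for your actual account:

Code:
sudo adduser youruser www-data
sudo chown -R root:www-data /var/www
sudo chmod -R g+w /var/www
sudo find /var/www -type d -exec chmod g+s {} \;
# setgid on the directories keeps future uploads group-owned by www-data
# log out and back in so SFTP picks up the new group membership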
2. You may notice I added Includes under the Options. My goal is to get server-side includes working, but they aren't. I have some existing webpages my work has set up; I am using these as a template to adapt a Flash movie I made to a specific resolution, and to learn how to optimize my Flash for a webpage. Our webhost uses virtual hosting; I am not yet doing so. I'm not sure what they've done to enable server-side includes, but the files they are using are all HTML files, no .shtml. The include files themselves have either .htm or .html extensions, and all of the pages have .html extensions.
My reading says I need mod_include installed in Apache2. Where can I check that it is installed? Where do I need to add Includes under Options, and is it in the right place? And finally, where do I need to add XBitHack on to enable it? This is the method Apache suggests, but the documentation offers no clue as to where to put it. Most of the documentation out there refers to apache.conf, but that's the Apache 1.3 way of doing things. I really just want this set up according to the defaults as much as possible. I want a good working knowledge of Apache and of how to set it up and configure it, but dang it if it isn't a frustrating process.
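For what it's worth, on Debian/Ubuntu mod_include ships with the apache2 package and only needs enabling, and the Options and XBitHack lines belong in the <Directory> block of your sites-available file rather than any 1.3-style apache.conf. A sketch:

Code:
sudo a2enmod include
sudo /etc/init.d/apache2 reload

# in /etc/apache2/sites-available/default (or your vhost):
<Directory /var/www/>
    Options Indexes FollowSymLinks Includes
    XBitHack on
</Directory>

With XBitHack on, any .html file marked executable (chmod +x page.html) is parsed for includes, which matches your work's all-.html setup.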
I'm very baffled by this. I was setting up my Mumble server, since my old one had a hard disk failure. I was messing around with web registration but decided I didn't want it, since this server is really only for a few friends.
So I removed all the mumble-django and apache2 packages required for web registration, but for some reason apache2 is still on my system and listening for connections.
Terminal output below to clarify further.
Code:
Linux voiceserv 2.6.35-27-generic #48-Ubuntu SMP Tue Feb 22 20:25:29 UTC 2011 i686 GNU/Linux
Ubuntu 10.10
Welcome to Ubuntu!
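A guess at the cause: removing mumble-django took out the package that depended on apache2, but apache2 itself was only deconfigured, not purged (on 10.10 the server proper lives in apache2.2-common/apache2.2-bin). A sketch:

Code:
sudo netstat -tlnp | grep :80        # confirm it is apache2 listening
sudo apt-get purge apache2 apache2.2-common apache2.2-bin
sudo apt-get autoremove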
I have a problem with apache server. The thing is - I can connect to my server on localhost, but when I try to connect from local network, it doesn't reply.
I can ping my web server from lan. I can ssh my web server from lan.
I haven't modified iptables since install, except for these commands:
Code:
iptables -A INPUT -p tcp -s 0/0 --sport 1024:65535 -d $MY_SERVER_IP --dport 80 -m state --state NEW,ESTABLISHED -j ACCEPT
iptables -A OUTPUT -p tcp -s $MY_SERVER_IP --sport 80 -d 0/0 --dport 1024:65535 -m state --state ESTABLISHED -j ACCEPT
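Two checks worth running before blaming those rules: whether Apache is bound to all interfaces or just loopback, and whether the rule is even matching (the -v counters show that). For example:

Code:
sudo netstat -tlnp | grep apache     # 0.0.0.0:80 = all interfaces, 127.0.0.1:80 = loopback only
grep -R '^Listen' /etc/apache2/ports.conf
sudo iptables -L INPUT -n -v         # -v adds per-rule packet counters

A Listen 127.0.0.1:80 line in ports.conf would produce exactly this symptom; changing it to Listen 80 binds all interfaces.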
How do I resolve this apache2 problem I keep getting? I have installed a Debian LAMP server with ISPConfig 3, and every time I reboot, Apache fails to start. I have done some research but nothing seems to work, which leads me to reinstall everything from scratch again; I think I have reinstalled at least 10 times.
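Before another reinstall, it is worth capturing the actual failure; these two commands usually name the broken directive or the port conflict:

Code:
sudo apache2ctl configtest
sudo tail -n 50 /var/log/apache2/error.log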
I'm very new to Ubuntu (Linux) in general and have read various forums and threads trying to get a .pl file to run on a new Ubuntu 9.10 install. My cgi-bin is in /var/www/site/cgi-bin. The server will run two websites, site1 and site2. Site2 is HTML only; site1 has some CGI and .pl files. Every time I go to cgi-bin/other_directory the browser tries to download the file, even when going directly to index.pl. Please find attached my sites-available files for both default and site1 for your consideration. I also ran a2ensite site1 and a2ensite site2, then reloaded the apache2 server.
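The download-instead-of-execute symptom usually means no CGI handler is mapped for that directory. A sketch of the pieces that need to be present on 9.10's Apache 2.2 (paths taken from the post):

Code:
sudo a2enmod cgi

# in the site1 file under sites-available:
ScriptAlias /cgi-bin/ /var/www/site/cgi-bin/
<Directory /var/www/site/cgi-bin>
    Options +ExecCGI
    AddHandler cgi-script .cgi .pl
    Order allow,deny
    Allow from all
</Directory>

The scripts themselves also need to be executable (chmod +x index.pl) and start with a valid shebang line.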
I'm a newb when it comes to Linux operating systems, so I'm attempting to get better through experience. I work for a web development company and we use Ubuntu (the programmers at least). Anyway, I'm trying to install LAMP services and get them working. I have all the L.A.M.P. services installed, but Apache2 is giving me a problem. I have an .htaccess file in a directory under my document root, but Apache2 is not interpreting it. I have AllowOverride All on, but I can't figure it out. I even made a bogus .htaccess file to try to make Apache give me an error: nothing.
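One usual suspect, for reference: AllowOverride takes effect only in the <Directory> block that actually covers the directory holding the .htaccess, and the silent bogus-file test is consistent with Apache never reading the file at all, i.e. the wrong scope rather than a syntax problem. A sketch against the default site file:

Code:
# /etc/apache2/sites-available/default
<Directory /var/www/>
    Options Indexes FollowSymLinks
    AllowOverride All
</Directory>

sudo /etc/init.d/apache2 reload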
I'm configuring an Ubuntu server running x64 9.10. I installed the LAMP stack during the OS installation and now I need to build an apache2 module. It appears as though apxs is missing. I've installed apache2-threaded-dev but am still missing apxs2.
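Note that Debian/Ubuntu ship the tool as apxs2 rather than apxs; a quick check, plus a symlink if a build script insists on the plain name:

Code:
dpkg -L apache2-threaded-dev | grep apxs
sudo ln -s /usr/bin/apxs2 /usr/local/bin/apxs   # only if a script hardcodes 'apxs'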
I have had a LAMP setup on my computer for a while without any trouble but I have suddenly become unable to access it through either localhost or my IP address. I have tried removing and reinstalling the packages but it still doesn't work and the /var/log/apache2/error.log does not give me any errors.
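With nothing in error.log, the next step is confirming the daemon is actually up and bound to the port; a few quick probes:

Code:
sudo service apache2 status
sudo netstat -tlnp | grep :80
curl -v http://localhost/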
I'm using Ubuntu 9.10 with Firefox 3.5. I am trying to open a local php file in Firefox. I already installed Apache2. I already installed the PHP5 modules for Apache2. I already tried restarting Apache2 and I cleared Firefox's cookies and cache. Apache2 is currently running and PHP is in the "mods-enabled" folder. Yet every single time I try to open a local PHP file, Firefox asks me what program I want to use to open it. I have no idea how to proceed at this point.
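One common cause worth ruling out: opening the file via File > Open gives a file:// URL that never touches Apache, so PHP never runs regardless of the server configuration. The file has to sit under the document root and be requested over HTTP; for example, assuming the default /var/www root:

Code:
echo '<?php phpinfo(); ?>' | sudo tee /var/www/test.php
# then browse to http://localhost/test.php, not file:///var/www/test.php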
I recently put a new install of Ubuntu 10.04 Server on an old Dell I had sitting around the house, with the intention of using it as an all-purpose server (website, TeamSpeak, FTP, storage, etc.). I am running apache2 for the web server and want to use vsftpd to edit the website contents over FTP. The issue I am running into is that vsftpd doesn't have permission to write to /var/www/, which is the default website directory. The flip-side issue is that apache2 doesn't have permission to access the /home/user/ directory, which vsftpd can write to.
Because I am only interested in running a single website (no virtual machines/no extra IPs), fixing either the apache2 permissions or the vsftpd permissions would solve the problem. Changing DocumentRoot to /home/user/ makes apache2 point to the new directory, but I get a 403 error. I have no idea how to change application permissions. I need either the command to give vsftpd permission to write to /var/www/ (vsftpd.conf is configured to allow write in the home directory) OR the command to give apache2 permission to access /home/user/. I assume it is a chmod command, but I have never used this and don't know what the permission codes (777, etc.) do in terms of giving the necessary and sufficient app permissions. I don't want to give an app too much access and risk creating a security flaw. Here are my vsftpd.conf, default (website config file), and apache2.conf. apache2.conf:
Code:
#
# Based upon the NCSA server configuration files originally by Rob McCool.
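One conservative approach, rather than reaching for 777: make the FTP login the owner of /var/www and leave Apache with read access through group/other. A sketch, with youruser standing in for the real account:

Code:
sudo chown -R youruser:www-data /var/www
sudo chmod -R 755 /var/www
# 755: owner rwx (the FTP user can write), group/other r-x (apache can read)

Each permission digit covers owner/group/other in turn: 7 = read+write+execute, 5 = read+execute, so 777 would let every local account write to the site, which is exactly the flaw you are worried about.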
I have an apache2 web server running on Ubuntu Server (2.6.27-7-server) hosting just a basic MediaWiki site for our group internally. It's a basic info wiki, and it also hosts a bunch of large .iso files for the group to download if they need them. The latter part is where my latest issue has come up, and I'll try to keep it brief.
We moved all the .iso files to an FTP server that we just got up and running, since the FTP site has gobs more space available on its NAS share than my wiki server, whose free disk space is rapidly shrinking with all the .iso files on it. Now, the FTP server is outward facing, unlike the wiki server, so it obviously has to have user/password authentication, unlike the wiki, which is internal only.
I want to be able to link from the wiki server directly to the files on the FTP server without the user being prompted for a password, while still keeping a decent transfer speed.
So far I've tried 2 methods that have "worked".
First, I mounted the FTP server onto a local folder on the wiki using CurlFtpFS, by adding a line to /etc/fstab.
This actually works perfectly: I can link to the files and never get prompted for credentials. However, the transfer rate is abysmal, under 100 KB/s over a gigabit connection, rather than the 10 MB/s I can get using an FTP client directly.
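For reference, the fstab line in question is usually of this shape (host and credentials are placeholders):

Code:
curlftpfs#ftpuser:password@ftp.example.com /mnt/ftpiso fuse allow_other,uid=33,gid=33 0 0

uid/gid 33 is www-data on Debian/Ubuntu, so Apache can traverse the mount; the slow transfers are commonly blamed on the FUSE-over-FTP round trips.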
The other method I tried was enabling mod_proxy_ftp and simply redirecting to the FTP server, using the wiki server as a proxy.
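A sketch of that proxy setup, with the credentials embedded server-side so users are never prompted (module names are real; host, path, and account are placeholders):

Code:
sudo a2enmod proxy proxy_ftp

# in the wiki's vhost:
ProxyPass /isos/ ftp://ftpuser:password@ftp.example.com/isos/

Worth noting that the password then sits in the config file in the clear, so the file should be readable by root only.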
Apologies if this doesn't fit, or if the answer is right in front of my face, but I'm a bit confused. I'm trying to configure a few VirtualHosts in apache2. I currently have a default VirtualHost and 2 other sites. No matter which domain I visit, I still get the default page. My confusion is this: I've been reading, and some places seem to say my virtual hosts should be in /etc/apache2/sites-enabled/*config file*, while others say they should be in /etc/apache2/httpd.conf. What's the answer? I've tried both ways with no success. If I put NameVirtualHost in httpd.conf, I get an error saying there are no virtual hosts.
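For what it's worth, the Ubuntu convention is one file per vhost in sites-available, enabled with a2ensite (which creates the sites-enabled symlink); httpd.conf is a legacy stub, and NameVirtualHost normally lives in ports.conf. A minimal sketch with placeholder names:

Code:
# /etc/apache2/ports.conf (once)
NameVirtualHost *:80
Listen 80

# /etc/apache2/sites-available/example
<VirtualHost *:80>
    ServerName www.example.com
    DocumentRoot /var/www/example
</VirtualHost>

sudo a2ensite example
sudo apache2ctl -S       # lists the vhosts Apache actually parsed
sudo /etc/init.d/apache2 reload

When every domain lands on the default page, apache2ctl -S usually shows either that the intended vhosts never loaded or that their ServerName values don't match what you're visiting.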
Basically I've made a right **** up of my apache2 configuration and I just want to uninstall apache2 and reinstall so I can start again.
I've done research on this but the guides have always been for people installing using "apt-get" or "rpm" packages. I've tried these methods but nothing has worked.
The way I installed LAMP was by selecting it from the options when installing ubuntu-server from disk.
Is there any way I can do this through the terminal and not through Synaptic? The only reason being that I'm SSH'ing across a local network.
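The LAMP task just pulls in ordinary packages, so apt over SSH handles it; purge removes the configuration files that a plain remove would leave behind:

Code:
sudo apt-get purge apache2 apache2.2-common
sudo apt-get autoremove
sudo apt-get install apache2    # fresh start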