Ubuntu Security :: Isn't PHP Running Under Apache 2 And Limited By The www-data Filesystem Access?
Jun 30, 2010
I'm about to have a web server at home for the first time. I've always missed having full control and not having to contact my hosting company when I need specific changes made - and some changes they won't make for you at all. I've chosen the non-GUI Ubuntu Server with LAMP, and nothing more is installed except a couple of command-line tools from the repository. I've locked down the LAMP stack as well as I can by following some guides on the net and using common sense: Apache 2 has no access to the filesystem outside the www folder, ServerTokens is set to Prod, MySQL has skip-networking enabled and I've commented out the bind-to-localhost line, and PHP has a truckload of functions disabled in php.ini, along with some other security-enhancing php.ini edits from those guides.
The only thing the server will serve is a well-known PHP forum and some HTML docs - nothing advanced or complicated, and I'm definitely not programming PHP myself or letting anyone do it for me. But I do want to sleep well at night knowing that my server is always on, sitting on the edge of my home network! Can I do that? I've heard that you don't need to worry so much about your Linux server box getting hacked as about anyone getting root access to it. But is it really that simple? Ubuntu ships without an enabled root account and you must have the sudo password, right? What are the odds of anyone getting full access to my system?
An issue: I've heard that Apache must never run as root. When I do a ps -ef, I see several www-data processes running apache, but there's one root process running apache too. Is this normal, and is it safe?
An issue: I've heard that PHP can fail pretty easily. But isn't PHP running under Apache 2 and limited by the www-data filesystem access?
An issue: MySQL is running as a mysql user, and I guess that's an unprivileged user, right?
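On the Apache question: that single root process is expected on a stock setup. The Apache parent must start as root to bind port 80, then forks children that drop privileges and run as www-data. A quick way to see the split (a sketch; the process name may be apache2 or httpd depending on the package):
Code:
# list the user and parent/child relationship of all Apache processes
ps -eo user,pid,ppid,comm | grep -E 'apache2|httpd'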
I have SSH running on a computer I use as a server at home and log in to it for my own purposes, but I need to share access to this server with someone else, and I'd like to do it in a way so that when they sign in, all they see is the contents of one folder and nothing outside of it. So I'd like them to have full access to this folder and be able to do anything they want with it, but not be able to browse outside of it at all via something like WinSCP (they're using Windows). I'm thinking I need to create a new account for them to sign in with, but beyond that I'm not sure what I need to do. The only other special thing is that the folder I'd like them to be presented with is actually on an external hard drive. We're going to be doing a lot of online music collaboration and I need to give him lots of free space to drop files, and the internal hard drive doesn't have much to spare right now.
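One hedged approach, assuming a reasonably recent OpenSSH (4.9 or later): create the account, then chroot it into the external drive using the built-in SFTP server, which WinSCP speaks natively. The username and paths below are placeholders:
Code:
# /etc/ssh/sshd_config
# Note: the chroot directory must be owned by root and not group/world-writable.
Match User musicbuddy
    ChrootDirectory /media/external
    ForceCommand internal-sftp
    AllowTcpForwarding no
Since the chroot root itself must stay root-owned, give them a writable subdirectory inside it:
Code:
mkdir /media/external/shared
chown musicbuddy /media/external/shared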
In CGI scripts, there are certain files that are getting "permission denied" when it seems they should be accessible by the apache user. I am running the default package install of Apache under Fedora. Here is an example - the following is /var/www/cgi-bin/test.pl:
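On Fedora the usual culprit for this symptom is SELinux rather than classic file permissions: the default policy confines httpd, so a file with readable mode bits can still be denied if its security context is wrong. A hedged first check:
Code:
ls -Z /var/www/cgi-bin/test.pl             # show the SELinux context
restorecon -v /var/www/cgi-bin/test.pl     # reset it to the policy default
grep httpd /var/log/audit/audit.log | tail # look for AVC denial records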
Recently, I started protecting all user-accessible filesystems on my Sidux desktop machine with LUKS. Before that, I would regularly erase traces of deleted data, and I wonder if this is still necessary.
It would be most valuable to me to be pointed towards a good introductory article on the underlying mechanics of LUKS and cryptsetup, as there are a few more minor questions to be answered. Unfortunately, I lack the necessary mathematical and cryptographic background to understand scientific papers.
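For orientation, the moving parts are fairly simple even without the cryptographic background: LUKS stores a header on the partition with key slots, cryptsetup unlocks it into a device-mapper mapping, and everything written through that mapping lands on disk encrypted. A minimal sketch of the workflow (luksFormat destroys existing data; /dev/sdb1 is a placeholder):
Code:
cryptsetup luksFormat /dev/sdb1        # write the LUKS header, set a passphrase
cryptsetup luksOpen /dev/sdb1 secure   # decrypted view appears at /dev/mapper/secure
mkfs.ext3 /dev/mapper/secure
mount /dev/mapper/secure /mnt
umount /mnt && cryptsetup luksClose secure
Since deleted data inside the mapping is still ciphertext on the raw disk, erasing traces is generally only a concern for data that was written before the volume was encrypted.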
Does anyone know of any software that can monitor the Apache logs for certain phrases or keywords and then send an alert when they are found? For example, I know a hack attempt has been made when I see log entries like this:
Code:
/admin/
/admin/phpadmin/
/phpadmin/
But by the time I see it, the attempt has long since failed or succeeded. What I need is a way for my server to alert me WHILE someone is entering these phrases. I realize there may be a "hit" to performance but my server is not that busy anyway (except for hackers).
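Dedicated tools exist for exactly this - swatch and fail2ban both trigger actions on log patterns - but a minimal sketch with standard tools, hedged since the log path and alert address are assumptions, looks like:
Code:
tail -F /var/log/apache2/access.log \
  | grep --line-buffered -E '/admin/|/phpadmin/' \
  | while read line; do
      echo "$line" | mail -s "Suspicious request" admin@example.com
    done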
I'm trying to modify an existing user so that any files they create can be at least read (although writing and execution would be nice) by any other user. The reason is because I need the daemon running my Apache server to be able to access files created by a daemon running under this user, files which will be created and accessed in real-time.
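This is what the umask controls: it masks permission bits off newly created files. A sketch, assuming the daemon account is called appuser - with umask 022, new files come out 644 (world-readable); for group write, add www-data to appuser's group and use 002 instead:
Code:
echo 'umask 022' >> /home/appuser/.profile   # new files world-readable
# optional: group-writable instead, via Apache's user joining the group
usermod -aG appuser www-data
# and use: umask 002
Note the umask must be set in whatever environment actually launches the daemon (an init script, say), not just the login shell.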
Ok, so I have a few web apps that need to run shell commands. Here's a great example of one:
Code:
This is a PHP script getting my system volume. Herein lies the problem... www-data doesn't have permission to do this!
I changed my apache config to use MY account as the web user, and it does in fact work the way I want it to.
Obviously, I don't want to leave Apache running as me, and want it to keep using www-data... here's my question: how can I give www-data permission to execute certain programs?
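One common pattern is a narrow sudoers entry: let www-data run one specific binary passwordless, and nothing else. The command below is an assumption (amixer is one typical way to read ALSA volume); edit sudoers with visudo rather than directly:
Code:
# add via visudo (or a file under /etc/sudoers.d/)
www-data ALL=(ALL) NOPASSWD: /usr/bin/amixer
The PHP side then becomes shell_exec('sudo /usr/bin/amixer get Master'); - using the full path so it matches the sudoers rule.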
I happened to be looking at my Apache 2.2.8 log on an Ubuntu LTS 8.04.4 system, and noticed a few lines like this:
Code:
61.160.212.242 - - [06/Mar/2010:07:04:41 -0800] "GET http://218.30.115.246/ HTTP/1.1" 200 295 "-" "-"
61.160.212.242 - - [06/Mar/2010:07:05:29 -0800] "GET http://218.30.115.246/ HTTP/1.1" 200 295 "-" "-"
xxx.xxx.xxx.xxx - - [06/Mar/2010:07:56:15 -0800] "GET http://218.30.115.246/ HTTP/1.1" 400 290 "-" "-"
(The third line is me telnetting to the server and trying to issue the same request. Note that I got a 400 error response, while the guy coming from 61.160.212.242 got 200s. Also, if you just open the http://218.30.115.246/ URL, you get back "hello" - nothing else, just five characters.) I'm presently putting together a bootable CD with chkrootkit to run on the machine. (I found a thread that mentioned in passing that this was related to PHP, which I have running on that Apache server, but my Google-fu isn't strong enough to track down the original thread.) (After checking with chkrootkit: nothing unusual found.)
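Requests with a full URL in the request line are typically open-proxy probes: the sender is checking whether your Apache will fetch the remote URL on their behalf. A 200 on its own doesn't prove it did - with forwarding off, Apache simply serves its own content for such a request - but it is worth confirming that mod_proxy forwarding is disabled:
Code:
apache2ctl -M | grep -i proxy          # is mod_proxy even loaded?
grep -ri 'ProxyRequests' /etc/apache2/ # must be "Off" (or absent entirely)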
About two weeks post-install, I find that my new Debian Etch machine has limited support for different filesystems. In the past, I've always seen support for reiserfs, xfs, ntfs, fat32, and of course ext2/3. I was trying to mount and read an old Windows NTFS HDD and there were problems, so I ran: cat /proc/filesystems (see below). Is there any way to get back support for other filesystems on my kernel, or do I need another kernel, or do I need to compile a new kernel? Maybe just install the system over again? (If I do, how do I select for filesystem support?)
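Most likely nothing is missing: /proc/filesystems only lists filesystems whose drivers are currently loaded, and the stock Debian kernel ships the rest as modules - no recompile or reinstall needed. A sketch (/dev/sdb1 is a placeholder for the NTFS partition):
Code:
modprobe ntfs                 # load the kernel's (read-only) NTFS driver
grep ntfs /proc/filesystems   # it should appear now
# for reliable read/write NTFS, the userspace ntfs-3g driver
# (from the repos or backports) is the usual choice:
apt-get install ntfs-3g
mount -t ntfs-3g /dev/sdb1 /mnt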
I am very new to Linux, and I have a question regarding the filesystem check (fsck). The power recently went out, and when I tried to restart Linux the following error appeared:
*/dev/sda1 contains a file system with errors, check forced
It then goes on to say:
*An error occurred during the file system check. Dropping you to a shell; the system will reboot when you leave the shell. Give root password for maintenance (or type Control-D to continue)
I wasn't sure what to do, but I checked some other online forums and they suggested running fsck manually - so I typed in the root password and used the command "fsck -A -V ; echo == $? ==". It then gave the following message:
*WARNING!!! Running e2fsck on a mounted filesystem may cause SEVERE filesystem damage.
*Would you like to continue (y/n)
Again, I wasn't sure what to do, so I just answered no. I then manually turned off the computer and was prompted at startup to press Alt-3. I was brought to another screen, and it informed me one of the drives was degraded and suggested rebuilding the array. I tried doing this, but it still brings me back to the original error, "/dev/sda1 contains a file system with errors, check forced," and the process continues.
Also, I didn't back up any of the data in our home directory before trying to rebuild the array (which was probably a big mistake). After being prompted to type the root password, I was able to run ls and look at all the directories... the home directory where our data was stored was empty, and I am afraid I may have lost some information. Is there a possibility that data was lost while I was rebuilding using the old drives?
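Answering no was the right instinct: the warning appears because /dev/sda1 is mounted, and fsck on a mounted filesystem can make things worse. The usual safe sequence - a sketch, not specific advice for a degraded RAID - is to check the filesystem while it is not mounted read-write:
Code:
# from the maintenance shell:
mount -o remount,ro /    # remount the root filesystem read-only first
fsck -f -y /dev/sda1     # -y answers yes to each repair prompt
reboot
On a degraded array it is safer still to boot a live CD and image the disk before any repair attempt.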
I am a Linux newbie of 4 weeks. I set up my first web server and it works flawlessly - when accessed from the external network or from other computers on my LAN. However, I cannot access it from the computer the server runs on. I have found numerous people with similar problems, but the flavour I am experiencing is somewhat different, and no solutions I have found apply to it. I have two network interfaces on my server, eth0 (public static IP, connected to the internet) and eth1, connected to the LAN 192.168.1.0/24 range. The server is 192.168.1.1. It is connected directly to the internet and acts as SNAT for the other computers on the LAN.
I added "192.168.1.1 www.server.com" to the /etc/hosts on the server and also on the other machines on the LAN. All the other machines can open website without any problem.
However, the server itself only opens the website if the address is localhost. The internal IP, i.e. 192.168.1.1, gets a timeout, and so does www.server.com. I do not understand why the record in /etc/hosts doesn't point it in the right direction. It seems that when I open the address 192.168.1.1 it still gets routed to the external network. I have seen DNAT used to deal with the problem, but it didn't work in my case (maybe I didn't do it correctly). I have spent the whole evening/night trying to sort it out; it's 4 AM now, going to bed frustrated and angry (at myself, hahaha). Still like Linux very much, won't be going back to Windows. Please help.
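One thing worth checking: connections from the server to any of its own addresses (including 192.168.1.1) arrive on the loopback interface, not eth1, so firewall rules that only accept web traffic on eth1 will silently drop them. A hedged sketch:
Code:
iptables -I INPUT -i lo -j ACCEPT   # allow all loopback traffic, inserted at the top
iptables -L INPUT -v -n             # the packet counters show which rule eats the packets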
Before the other day, I'd copied a live CD to ramdisk and run it from there before, but that disk was INX (INX Is Not X), a live CD based on Ubuntu that runs entirely in text mode, no GUI. INX is a terrific product: colorful, educational, light, agile, fun to use, and often damned useful, but when an OS only uses text, you may not notice how much running from RAM speeds it up. Previously, I'd assumed that the best reason to run a live CD from RAM was to free up the CD-ROM drive. When I started running a full KDE 3.5.10 desktop from RAM, it didn't take me long to notice the awesome boost in speed and performance.
The computer has the fastest access to the data that's in RAM. (The "A" in "RAM" stands for "access", right?) So the machine is faster. As RAM gets larger, I'm sure more and more live CDs are going to offer the ramdisk option. Right now both INX and SLAX share the characteristic of being exceptionally small CDs, which makes them well suited to this kind of use. The "minimal" version of SLAX, the basic CD without any modules added, is less than 200 MB, which fits very nicely in my 1024 MB of RAM. I now use the minimal SLAX CD to initiate the system, and I keep a collection of modules on my hard drive to copy to RAM and activate at will.
Here are a couple of screenshots: [URL] I'm using Wine here to run my one and only favorite Windows program, a text-to-speech program called ReadPlease. Note that I am also running KTorrent, which is uploading from and downloading to my external MyBook hard drive. [URL] Here's a shot of Yakuake, which is sort of like Konsole with superpowers. I just upgraded my hard drive KDE system to 4.2.3, and they still haven't fixed Yakuake yet. I know it's been reported, so I'm sure it'll be taken care of.
I'm trying to access data from a TFTP server running on my Fedora 15 box. When I try to read a file from the TFTP path, I get a timeout. I even tried to get the data on localhost itself - same timeout there. I have tried every permission mode.
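Since even localhost times out, the server itself is probably not answering rather than the network. A hedged checklist for the stock xinetd-based tftp-server on Fedora (testfile is a placeholder):
Code:
chkconfig tftp on                    # enable the tftp entry under xinetd
service xinetd restart
iptables -I INPUT -p udp --dport 69 -j ACCEPT   # TFTP runs over UDP port 69
ls -ld /var/lib/tftpboot             # default served directory; files need world-read
tftp localhost -c get testfile       # retest locally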
I'm running a program called Synergy+ to let my keyboard and mouse control multiple computers. One of Synergy+'s features is that clipboard (copy-paste) data is able to be shared, as in copy on one machine, paste onto another. I would like this functionality removed but Synergy+ has no way to disable it. I'm looking for any ideas to block clipboard data from being transferred. Is there a way to block a program from accessing the machine's clipboard data?
I have one requirement: I want to call a Java program from a PHP function using the shell_exec command, and I am using the chroot jail concept. If I use this command I get an empty file, because the Java environment is outside the chroot jail. So how do I access files that are outside the chroot jail?
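By design, a chrooted process cannot see anything outside the jail, so the Java runtime has to be brought inside it - either copied or bind-mounted. A sketch, assuming the jail lives at /var/chroot and the JRE under /usr/lib/jvm:
Code:
mkdir -p /var/chroot/usr/lib/jvm
mount --bind /usr/lib/jvm /var/chroot/usr/lib/jvm  # expose the JRE inside the jail
ldd /usr/lib/jvm/*/bin/java   # lists shared libraries that must also exist in the jail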
I am working in an office where only one internet connection is available. I have configured 5 other client machines to use the internet through a Squid proxy server. Now I want to restrict the total data usage/transfer (upload + download) to, say, 1 GB during a calendar month. How can I achieve this?
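Squid has no built-in monthly volume quota; the usual workarounds are delay_pools (which cap bandwidth, not volume) or an external script that totals bytes from access.log and flips an ACL when the limit is hit. A rough sketch of the accounting half (in Squid's default native log format, field 3 is the client IP and field 5 the reply size in bytes):
Code:
awk '{ bytes[$3] += $5 } END { for (ip in bytes) printf "%s %.1f MB\n", ip, bytes[ip]/1048576 }' /var/log/squid/access.log
Run it monthly from cron, and deny via an ACL list file once the total crosses 1 GB.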
I've had a look at some similar threads, but as I'm very new to Linux, they're already a bit technical for me. Sorry, this calls for someone with patience. I gather from other threads that disconnecting an external drive without unmounting is a no-no, and this seems to be the likely cause. Now the disk is read-only and I'm unable to change any settings through the usual control panel on Ubuntu. I'm just not familiar with the terminal instructions. I tried to cut and paste a few command lines from other threads, but I got warnings that proceeding could damage data, like this one: WARNING! Running e2fsck on a mounted filesystem may cause SEVERE filesystem damage.
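That warning only applies while the filesystem is mounted; the safe order is unmount first, then check. A sketch, assuming the drive shows up as /dev/sdb1 with an ext3 filesystem (check the real device name with sudo fdisk -l):
Code:
sudo umount /dev/sdb1                  # unmount before checking
sudo fsck -y /dev/sdb1                 # repair, answering yes to the prompts
sudo mount /dev/sdb1 /media/external   # remount; it should come back read-write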
I want to create a limited user, such that the user only has access to USB drives, CD drives and the internet. I also want to prevent the user from deleting files from the system. How do I do it?
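A normal, non-admin account already cannot delete system files - those are writable only by root - so most of this comes down to creating the user with no extra privileges and only the groups needed for removable media. A sketch; the group names vary a little by distro, and limiteduser is a placeholder:
Code:
sudo adduser limiteduser
sudo usermod -aG plugdev,cdrom limiteduser   # USB and CD access on Debian/Ubuntu
Internet access needs no special group on a typical desktop; just don't add the user to admin/sudo.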
I have a VPS with 512 MB of RAM that I'm using as a mail/web server. It keeps running out of memory. I know amavis/clamav are memory hogs, but I checked my ps aux and found hundreds of instances of "apache-init-server" running. I killed them all, and they keep spawning back. What could be causing this? I've never seen this on a web server before. OS: CentOS 5.5
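"apache-init-server" is not a name stock Apache uses (CentOS runs it as httpd), so before killing them again it is worth verifying what the binary actually is and what keeps respawning it - a renamed process like this can be a sign of a compromise. A hedged look, with 1234 as a placeholder PID:
Code:
ps auxf | less           # the forest view shows each process's parent
ls -l /proc/1234/exe     # symlink to the real executable behind the name
lsof -p 1234             # open files and network sockets
crontab -l; ls /etc/cron.*   # common respawn mechanisms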
I want to have an account (a "beta" user) on which I can use the internet and other programs without administrative rights, without the right to install programs, and with a kind of sandbox for everything connected to the internet. That means everything associated with the web browser's processes, and files that I save to the hard disk, should be separated from the rest of the system, so that whatever gets caught on this account stays locked inside it - for example any (if at all) possible malicious scripts from the internet, or whatever may be dangerous now or invented in the future. Sometimes, for example, I save a web page to disk with all its content.
And in case someone cracked into this account, I want to make it so that they could not do any tricks to read or change passwords, or make any other changes to the system. Best of all would be if the password for that user served only to log in, without granting any other powers, and I would give that user automatic login. For now I've created a beta user without administrative rights. I understand that limiting a user's rights is associated with limiting rights to their home directory. There are also groups, and a user may be included or excluded; I excluded this user from the admin group, but I don't know what else I can limit or how. When I give chmod 0644 to this user's /home he cannot run Firefox; when I give 0740 he can run applications, so I assume the x attribute must be preserved.
This is a user without sudo rights, so when I type sudo apt-get update, a message correctly shows up saying this user is not in the sudoers file. But that's still not what I wanted: when the user runs Gufw and tries to disable the firewall, a dialog asks for the password of the alpha user - the primary user belonging to the sudoers group, the first/main user created during system installation. I wish there were only a message that the beta user has no power to change anything - in other words, I'd like to completely remove the possibility of it asking for the admin password at all.
In addition, I wish beta couldn't change the permissions on its own home directory, or look at what is above it - because so far beta can change the file permissions on its /home even without a sudo password. How can I do this? Do I need to create a kind of chroot jail for this user? I would like any changes to the beta account to be possible only after logging off from beta and logging in as alpha, and for beta to run only programs that were installed by alpha. Beta should be able to read and write, but alpha should also be able to read, write, remove and alter files on the beta account. Basically, the alpha account should be superior to the beta account. Can I do that?
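Full isolation of the kind described really is chroot/container territory, but the home-directory complaint has a neat fix: only a file's owner (or root) may chmod it, so if root owns /home/beta and beta only has group access, beta can use the directory but can no longer change its permissions. A sketch:
Code:
sudo deluser beta admin           # make sure beta has no sudo rights
sudo chown root:beta /home/beta   # beta is no longer the owner...
sudo chmod 770 /home/beta         # ...full access via the group, but no chmod
Alpha, as a sudoer, keeps full control over beta's files through root. Files beta creates inside are still owned by beta, so applications that tighten permissions on their own files (ssh keys, for instance) keep working.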
I'm about to create a CSR and was reading this page in the Ubuntu docs: [URL] A couple of things:
* There's no date on the article. The documentation needs DATES, because this information gets out of date! Check the MySQL docs, for instance - they are organized by version.
* The instructions for generating a cert only specify 2048 bits. I believe that's kind of out of date? The VeriSign site has big red warnings saying you need 2048 bits if you want your cert to last past 2013 - and that article is 4 years old!
* The instructions are confusing when discussing the passphrase. We enter a passphrase only to remove it immediately. We need some clarity here. Why do this?
How do I understand the current best practices for generating an HTTPS cert for Apache and/or mail access?
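On the passphrase question: the key is generated encrypted, but Apache can't type a passphrase at boot, so an unencrypted copy is made for the server to use while the encrypted original is kept as a backup. A sketch of the usual flow, using 4096 bits for headroom past the 2048-bit minimum:
Code:
openssl genrsa -des3 -out server.key 4096         # encrypted private key
openssl req -new -key server.key -out server.csr  # the CSR to send to the CA
openssl rsa -in server.key -out server.key.insecure  # passphrase-free copy
mv server.key server.key.secure
mv server.key.insecure server.key                 # Apache reads this one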
I'd like to set up an Ubuntu LAMP server and provide limited access to it for our in-house web developers/designers. I'm not quite sure how to go about the permissions side of things. Which user/group should "own" the /var/www directory? Is it www-data?
How do I create user accounts (for our developers) that have access to the /var/www directory - do I create accounts then add them to the www-data group? Or should I make a special 'webdev' group and give it access somehow?
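A dedicated webdev group is the more common pattern - www-data generally only needs to read the files, not own them. A sketch, with alice as a placeholder developer account:
Code:
sudo groupadd webdev
sudo usermod -aG webdev alice
sudo chown -R root:webdev /var/www
sudo find /var/www -type d -exec chmod 2775 {} \;  # setgid keeps new files in webdev
sudo find /var/www -type f -exec chmod 664 {} \;   # group-writable, world-readable
World-readable is enough for Apache here; for tighter permissions, drop the world bits and add www-data to the webdev group instead.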
I have run into a problem with my desktop using roughly 50% of RAM (without buffers or cache) while running a limited set of applications (fbterm, tmux, weechat, ncmpc, rtorrent) on the command line. This usage only increases by roughly 5-10% when starting X (adding xcompmgr, the awesome WM, zim, parcellite, two instances of conky - one replacing root-tail's functionality - plus Firefox and other apps that may or may not be running from time to time). (h)top reports processes using only roughly 0.1-0.2% each across roughly 100 processes (a current look at top shows 120 processes, only 32 of which register any usage over 0.0%). The RAM usage in the console (which is about 150 MB right after boot) is totally unreasonable, and I need some direction in finding out what is using all of this RAM.
System: Distro: Arch Linux; RAM: 2 GB; CPU: AMD Athlon 64 X2 4800+; HDD: 3x WD Black 750 GB (RAID 5 on partition 2 (swap) and partition 3 (root), RAID 1 on partition 1 (boot), LVM over the root partition); GPU: Nvidia 8400 GS
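Summing top's per-process numbers is misleading: RSS double-counts pages shared between processes, while kernel allocations (slab caches, and bookkeeping for that RAID/LVM stack) never appear in the process list at all. A hedged way to get a real breakdown:
Code:
free -m                        # the "-/+ buffers/cache" row is the honest number
cat /proc/meminfo              # Slab/SReclaimable show kernel-side usage
ps aux --sort=-rss | head -15  # biggest resident processes first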
Most of us know the basic security practices on Windows:
* Use a limited account
* Set a password
* Disable unused services
* Uninstall bloatware
* Antivirus / antimalware
* etc.
I haven't run Linux as my main desktop computer before, so I don't know how to properly secure it. I have heard Linux is supposed to be more secure than Windows, but I know that the default settings of anything are rarely secure. What are some things I should do as a new Linux user to secure my desktop system from attack?
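The same basic checklist maps over fairly directly; a few hedged first steps on an Ubuntu-family desktop:
Code:
sudo ufw enable                               # host firewall, deny incoming by default
sudo apt-get update && sudo apt-get upgrade   # timely patching matters most
netstat -tlnp                                 # review listening services; disable the unneeded ones
Beyond that: do everyday work in a non-root account, install software only from the distro repositories, and treat the sudo password with the same care as an admin password on Windows.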
I have configured my Squid server to allow only limited access to websites, but some websites were still accessible via HTTPS, so I removed transparent mode from Squid. Now what changes do I have to make in iptables?
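With transparent interception gone, clients must point at the proxy explicitly (browser settings or WPAD), and the iptables job changes from redirecting port 80 to making sure nobody can bypass the proxy. A sketch, assuming Squid listens on 3128 and the LAN sits behind eth1:
Code:
# drop the old intercept rule, if one like this is present:
iptables -t nat -D PREROUTING -i eth1 -p tcp --dport 80 -j REDIRECT --to-port 3128
iptables -I INPUT -i eth1 -p tcp --dport 3128 -j ACCEPT   # clients may reach Squid
iptables -A FORWARD -i eth1 -p tcp -m multiport --dports 80,443 -j DROP  # no going around it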