Ubuntu Servers :: No Servers Work Unless Logged In?
Feb 5, 2011
I have the system set up and working, but I want to put it in a corner and forget about it. The problem is, nothing starts running unless I'm logged in, and if I log out, everything stops again. It's running a LAMP stack and has VNC and SSH servers as well. I want all of that to start without having to log in, so that I can reboot remotely without worries and then connect with either VNC or SSH. Everything seems to have an entry in /etc/init.d. Is the way it's acting normal behavior? It's a fresh install, and I installed everything I needed from the repos.
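On Ubuntu 10.x, services installed from the repos normally register themselves to start at boot, not at login; a hedged way to check (apache2 shown as an example service):

```shell
# An S?? link in the default runlevel (2 on Ubuntu) means the service
# starts at boot, with nobody logged in:
ls /etc/rc2.d | grep -i apache
# If the link is missing, recreate the package defaults:
sudo update-rc.d apache2 defaults
```

If the links are present and services still only appear after login, something in a user session (not init) is starting them.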
How do I bind a script to an F key (F12) that will run as root even when nobody is logged in? I have a headless server on client premises where it'd be easier for them to press F12 to run this rarely needed script than to give them SSH instructions. I know this must be doable, but I can't get my Google-fu on for this question. The only way I can think of doing it is to touch a file whenever that key is pressed and have a script idly check for that file every few seconds in a loop.
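The touch-a-file idea from the post can be sketched as a small root-run watcher. The trigger path and the action are assumptions; binding F12 to `touch` the trigger would be done separately (for example with a key-event tool such as actkbd):

```shell
# Hypothetical trigger-file watcher; in production this would run as root
# from an init script, with the real maintenance script as the action.
TRIGGER=/tmp/f12-trigger

check_trigger() {
    # Run the action exactly once per key press, then consume the trigger.
    if [ -e "$TRIGGER" ]; then
        rm -f "$TRIGGER"
        echo "action would run here"   # replace with the real script
        return 0
    fi
    return 1
}

# The real daemon loop would be: while true; do check_trigger; sleep 5; done
# One-shot demonstration:
touch "$TRIGGER"
check_trigger
```

The polling loop is crude but robust; the trigger file is removed before the action runs so a slow script can't fire twice off one press.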
I'm a total Linux noob, but I've needed a working development server for a while, so I've put together an old Celeron box running Ubuntu Server 10.10. The box runs fine, as do the Apache and MySQL servers, even if they did take a little while to fine-tune! The problem I have is that vsftpd doesn't respond unless I'm logged on locally or via PuTTY. As long as a local user is logged in, it's fine. If I try to connect when no one is logged in, the connection times out waiting for the server message, and thereafter I have to log in and stop/restart vsftpd to make it work again. I'm not sure whether vsftpd is set to run on boot or on login, and I have no idea how to check. vsftpd is set to allow only local users, of which there is only one, so I can't check whether it would work with any user logged in.
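Checking whether vsftpd is wired to start at boot rather than per-login can be done from the command line (stock Ubuntu paths):

```shell
# An S??vsftpd link in the default runlevel means it starts at boot:
ls /etc/rc2.d | grep vsftpd
# Current status, and a restart that doesn't need a local login:
sudo service vsftpd status
sudo service vsftpd restart
```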
After a battle with Ubuntu, Django, Apache and WSGI, I could reach the website I set up from another computer via IP address (10.37.129.6). I then restarted the server and, after booting, tried to access the website from outside: permission to / denied, with the usual 403 error. Trying to fix that, I logged in to the server and suddenly the website was available again. I typed logout on the server: no access, 403 again. Logged back in: the website can be accessed. I somehow suspect this is some strange permission problem, but I don't have a clue where to start searching. The error logs just say that a request for / has been denied.
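Since access flips with login/logout, it is worth checking whether some directory on the way to the project only becomes readable while the session is up (an encrypted home directory behaves exactly like this). `namei` walks the whole permission chain; /tmp is used here as a stand-in for the real project path:

```shell
# Print owner and mode of every component of the path; run this, logged
# out of the desktop (e.g. over SSH), against the directory Apache serves:
namei -l /tmp
```

Any component that the Apache user cannot traverse will show up immediately in the listing.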
The apache2.conf and vhost file I linked are from the machine on the LAN where the site is actually hosted. When someone from the internet accesses the site, I expect to see their IP logged in access.log; instead I see the IP of the machine that acts as the reverse proxy for all such requests. What mistake did I make?
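This is expected with a reverse proxy: the backend sees the proxy's TCP connection, and the original client IP only survives in the X-Forwarded-For header the proxy adds. A hedged fix on the backend is to log that header instead of the peer address (or use a module such as mod_rpaf); log path and format name are illustrative:

```apache
# Log the client IP from X-Forwarded-For instead of the peer address (%h):
LogFormat "%{X-Forwarded-For}i %l %u %t \"%r\" %>s %b" proxylog
CustomLog /var/log/apache2/access.log proxylog
```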
I want to back up some data on my Fedora box to an external hard disk (USB). I mounted the external HD on my box, wrote a bash script to do the copy, and scheduled a cron job to execute the script. When I am logged in, the script executes as planned; when I am logged out, the copy does not work. I also tested this with a CIFS mount (via fstab) and that does not work either. I set the script to generate some output at the end, and that is OK, so the script does run while I am logged out. I suppose the mounted locations are not reachable while logged out; is that correct? Is there a workaround so I can reach the mounted locations while logged out?
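Desktop automounts typically belong to the login session and disappear with it, which would explain the behavior. A common workaround is to have the cron script mount and unmount the target itself. A minimal sketch, with the real mount line commented out and temporary directories standing in for the device (device name and paths are assumptions):

```shell
# Copy a source tree into <target>/data, mounting the target first in the
# real script (mounting needs root, so schedule this via root's crontab).
backup() {
    src=$1; mnt=$2
    # The real script would ensure the USB disk is mounted here:
    # mountpoint -q "$mnt" || mount /dev/sdb1 "$mnt" || return 1
    mkdir -p "$mnt/data" && cp -a "$src/." "$mnt/data/"
}

# Demonstration against temporary directories:
SRC=$(mktemp -d); DST=$(mktemp -d)
echo hello > "$SRC/file.txt"
backup "$SRC" "$DST"
cat "$DST/data/file.txt"   # prints "hello"
```

In the real script you would also `umount` at the end, so the disk is only attached while the backup runs.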
I'm trying to install phpMyAdmin on my Ubuntu 10.10 server. I type the following command (I don't use "sudo" because I'm logged in as root; I know it's not safe):
Quote: apt-get install phpmyadmin and go through the installation. I allow the installation to configure the database, and I choose the correct server (Apache2). When it asks for passwords, I use the same password that I use for the rest of the server (i.e. it is the account password for root, sudo, and my account). Once the installation is complete, I try to access it from a computer on the same network. I type "http://***.***.*.*/phpmyadmin" and I get a message saying the directory isn't there. I go into Webmin to confirm that the directory isn't there, and it isn't.
My questions are: (1) Why isn't the phpmyadmin directory in my Apache server root? (2) Is it installed with apt-get, and if so, where? (3) How do I know which server I selected for it to install to? (4) What do I need to do to get it to install correctly?
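On Debian/Ubuntu, apt-get installs phpMyAdmin to /usr/share/phpmyadmin, not into the document root; Apache is supposed to pick it up via /etc/phpmyadmin/apache.conf. If that hookup didn't happen, linking the config in by hand usually works (stock 10.10 paths; verify them on your box):

```shell
sudo ln -s /etc/phpmyadmin/apache.conf /etc/apache2/conf.d/phpmyadmin.conf
sudo service apache2 reload
# then browse to http://<server-ip>/phpmyadmin
```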
I was running 10.04 LTS and had decided to stick to the LTS versions, as I'm now running my machine as a server and don't want to be updating regularly. Every time I logged in via SSH I got a message telling me there were packages to update, including a security update. So I did a search to find out how to perform an update on Ubuntu Server from the command line. What I found was to do this: sudo apt-get update, then sudo apt-get dist-upgrade. After doing that I rebooted, but now my machine gives me this message:
init: ureadahead-other main process (794) terminated with status 4
Your disk drives are being checked for errors, this may take some time
Press C to cancel all checks currently in progress
I'm not pressing C yet and am leaving it alone to finish, but I noticed when the machine booted that one of the boot options mentioned Ubuntu 10.10, so I'm worried that I've accidentally upgraded from 10.04 LTS to 10.10.
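For the upgrade worry: `apt-get dist-upgrade` only upgrades packages within the installed release; moving from 10.04 to 10.10 requires a separate `do-release-upgrade`. The installed release can be confirmed directly:

```shell
# Print the installed distribution release (falls back to the os-release
# file on systems without the lsb_release tool):
lsb_release -ds 2>/dev/null || cat /etc/os-release
```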
but I have a few Ubuntu servers that are headless, and rather than walking to the server room with a display I thought I'd push for a good challenge. So I would like to know how to remotely log in to these servers' desktop environments, not necessarily with more than one user at a time; I just need to log in to the machines via VNC or RDC, preferably VNC. I do my admin work with Apple Remote Desktop, but I have RDC as well.
I set these systems up for my IT department, and I need them to be super easy to access for the rest of my team (I am the only one, aside from our Director, privy to using the command line), so to make it possible for normal humans I'm installing the desktop environment on each of them now. But from what I have found in my previous Linux adventures, it is not possible to VNC to a system that is not already logged in. In other words, if the machine is booted but sitting at the login window, I have never been able to connect via VNC; I am only able to do this once the machine is physically logged in to its desktop environment. Once that has happened I can connect with VNC. But something tells me this is possible: it's something I do on a regular basis with my OS X servers, and with RDC to manage my Active Directory server.
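Connecting at the login screen is possible on Linux too. One common approach (offered as a sketch, not the only way) is to run x11vnc as root against the display manager's X session, started from an init script so it survives logouts:

```shell
# Attach a VNC server to the X display showing the GDM login screen;
# "-auth guess" asks x11vnc to locate the display manager's auth file.
sudo x11vnc -display :0 -auth guess -forever -loop -bg
```

With `-forever -loop` the server keeps re-attaching across logins and logouts, so the same VNC target works at the greeter and inside a session.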
I'm not too sure this is the place I should post this, but I couldn't find anywhere for it to fit. So here is the deal: I am running 10.10 with the standard LAMP setup and a Drupal 7 site with no problems. What I need to do is run a multi-site subdirectory (domain.com/newsite). So far the best I can do is get a directory index of the second site (domain.com/newsite), but I can't run the installer. (I did the symbolic link to the doc root of the Drupal install.) If I type in the URL domain.com/newsite/install.php I get "Not Found". I placed an index.html test page and it renders fine. I think I'm close, but just not there yet.
Any ideas from anyone? I can get subdomains to work fine on a test machine, but sadly I need the domain/site form to work due to lack of DNS support at the moment. I tried to get some info from the Drupal forum, but everything always points to subdomain setups. I also posted over there, but it seems like those forums have very little support and are a bit on the slow side.
I am using Ubuntu 10.04 Server with Apache 2. I have set up a virtual server for the web server. I got the MySQL database and PHP working; I just cannot get my CGI scripts to run. I have two different systems set up and I cannot get CGI to work on either.
This is what I have (per-directory options):
Directory /
Directory /var/www/
Directory /usr/lib/cgi-bin
Directory /usr/share/doc/
[Code]...
This gives me a "Not Found". I have the CGI script chmodded to 755 and I have also tried 777. If I put a copy in /usr/lib/cgi-bin/ I get an "Internal Server Error". The script works fine on the Host Gator server, and the top of the script has "#!/usr/bin/perl", which I know is correct too.
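For CGI under /var/www (rather than /usr/lib/cgi-bin), the directory needs ExecCGI and a handler; a hedged sketch in Apache 2.2 syntax, matching Ubuntu 10.04. The separate "Internal Server Error" from /usr/lib/cgi-bin is often DOS line endings or a missing Perl module in a script copied from another host, so /var/log/apache2/error.log is worth checking too.

```apache
<Directory /var/www/>
    Options +ExecCGI
    AddHandler cgi-script .cgi .pl
    AllowOverride None
    Order allow,deny
    Allow from all
</Directory>
```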
It works fine, except that every time I do something that requires the key to authenticate, I have to type the passphrase into a prompt that looks like this:
Code:
$ git pull
Enter passphrase for key '/home/<user>/.ssh/id_rsa':
remote: Counting objects: 16, done.
remote: Compressing objects: 100% (9/9), done.
remote: Total 9 (delta 7), reused 0 (delta 0)
...
This is very annoying. How can I have it so I don't have to enter my passphrase each time?
On my local machine I followed the same instructions and don't have to enter the password every time...
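The usual answer is an agent: load the key once per session and git reuses it from then on. On a desktop this happens automatically (which is likely why the local machine never asks); in a bare server session you start it yourself:

```shell
# Start an agent for this shell and add the key (asks the passphrase once):
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_rsa
```

To have the agent persist across logins, the `keychain` package wraps exactly this pattern.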
I can't get to my phpMyAdmin on XAMPP. I tried starting it from the terminal, but I get this weird message:
XAMPP: Another web server daemon is already running.
XAMPP: Another MySQL daemon is already running.
XAMPP: Starting ProFTPD...
XAMPP for Linux started.
But I can't get into phpMyAdmin. What can I do, Linux pros?
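The message itself points at the cause: Ubuntu's own Apache and MySQL are already bound to the ports, so XAMPP skips starting its copies and its bundled phpMyAdmin never comes up. A hedged fix is to stop the distro services first:

```shell
sudo service apache2 stop
sudo service mysql stop
sudo /opt/lampp/lampp restart
# then try http://localhost/phpmyadmin
```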
Today, after upgrading the kernel to version 2.6.32-22 (linux-image-2.6.32-22-server), all my KVM guests stopped working. I could not reach the guests over the (bridged) network or connect to them with virt-manager (well, I could, but all I could see was a black screen). I was able to reproduce the issue on my second server.
What is wrong with the Ubuntu installer? I am trying to install 10.04 Server from either a USB CD-ROM or a USB memory stick, and nothing works, no matter what I try (and I've tried every trick I've been able to find so far). I regularly install other OSes using these methods without any problems or workarounds required. Why is the Ubuntu installer so screwed up?
When I try to install from the USB CD-ROM, the installer boots, asks me all those keyboard questions, and dies at "Detect and mount CD-ROM". It asks me if I want to load CD-ROM drivers from removable media. I've even tried various things, such as using the command prompt to manually copy the ISO to /root/ and mount it to install from; no dice.
So next I tried the Universal USB Installer tool as outlined at http://www.ubuntu.com/desktop/get-ubuntu/download. The USB stick is created and it boots the installer as expected, but this method stops at the same point as the USB CD-ROM. Other than ripping open the server and hacking in an IDE/SATA CD-ROM drive (there's no room for one), what does one have to do to make this work?
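One workaround that sometimes gets past "Detect and mount CD-ROM" (hedged: the device name varies per machine) is to drop to the installer's shell and mount the USB stick where the installer expects the CD before retrying the step:

```shell
# From the installer's second console (Alt+F2):
ls /dev/sd*                      # find the USB stick, e.g. /dev/sdb1
mount -t vfat /dev/sdb1 /cdrom
# switch back (Alt+F1) and retry "Detect and mount CD-ROM"
```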
I'm trying to get WOL to work on my web server, built on an ASUS P4A800D-X motherboard (which supports WOL). I have it enabled in my BIOS and I've tried several different tools with no results, including my DD-WRT based router, which has this functionality built into the firmware. All the guides I've read say it has to be enabled in the driver settings in the Windows Control Panel before it will work, but I'm running Ubuntu Server 10.04.
Are there any special steps that must be taken in the OS to enable my web server to power on from the LAN? I've never used this feature before and would like to just shove my web server in a corner somewhere and have 100% remote control from my main PC.
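On Linux the equivalent of that Windows driver checkbox is ethtool: enable magic-packet wake on the NIC (the interface name here is an assumption):

```shell
sudo ethtool -s eth0 wol g        # g = wake on magic packet
sudo ethtool eth0 | grep Wake-on  # should now report "Wake-on: g"
```

The setting is not persistent across reboots, so it needs re-running at boot, e.g. from /etc/rc.local or a post-up line in /etc/network/interfaces.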
I am building a new server on Rackspace Cloud. When they create an instance, it is a raw install. I have several other instances running Ubuntu 10.04 with vsftpd running just fine. But I have an app that doesn't like PHP 5.3, so I created a new server instance with Ubuntu 9.10 and PHP 5.2. Here is my problem: I install vsftpd EXACTLY the same way I've installed it dozens of times on Ubuntu 10.04, but when I try to access it, I get an ERROR 500 "vsftpd: cannot locate user specified in 'ftp_username':ftp". I'm SURE I'm using a valid username and password.
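That error literally means there is no system account named ftp on the box, which the 9.10 image apparently doesn't create. Either create one or point vsftpd at an existing account (the home directory and shell shown are conventional choices, not requirements):

```shell
sudo useradd -d /srv/ftp -s /usr/sbin/nologin ftp
sudo service vsftpd restart
# or instead, in /etc/vsftpd.conf:  ftp_username=someexistinguser
```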
I was with Host Gator before; they provided cPanel and I believe they used CentOS. Now that I have moved to a different, unmanaged host, I installed a copy of Ubuntu 10.04 LTS 32-bit. The website is www.terwax.com. I have to put files under /var/www, so I put all the files there, including index.html (a file from Joomla), but if you visit the website yourself, it shows up as a blank page with some weird stuff.
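One likely cause, offered as a guess: Joomla ships a deliberately blank index.html alongside its real entry point, index.php, and if Apache serves index.html first you get exactly an empty page. Making sure index.php wins is a one-line check:

```apache
# In the vhost, apache2.conf, or mods-enabled/dir.conf:
DirectoryIndex index.php index.html
```

If the page is still blank with index.php served, /var/log/apache2/error.log will usually show the suppressed PHP error.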
I don't know if I'm posting in the right box; I'm new to both Ubuntu and this forum, so please bear with me. The thing is, I spent almost two days trying to find a web server that meets my likes. I tried almost every well-known web server (Apache, Lighttpd, Nginx, and Cherokee), one by one. For each I was successful in getting PHP up and running, but I was never able to configure the virtual hosts.
I'm 99.99% sure that I followed the online how-tos correctly, multiple times. Especially with Cherokee, I did exactly what the screencast shows (which is just two steps, as Cherokee has a GUI for virtual server setup). Strangely enough, with every web server the virtual hosts never worked for me; I always received a "Server not found" error. I tried with Firefox and Google Chrome. Currently I'm on an Ubuntu 9.10 (32-bit) box; I reinstalled this one over the 64-bit version, in vain.
I don't know if it's a problem with my DNS, but I have another machine running Windows 7 with the same DHCP settings (meaning the same DNS and IP range). Virtual hosts work fine there on an XAMPP installation.
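"Server not found" is a name-resolution failure before the web server is ever contacted, which fits the DNS suspicion: the same error from four different web servers points at the client, not the servers. For testing, each virtual host name can be mapped to the server's LAN IP in the client's hosts file (the IP and name here are made up):

```shell
# On the machine running the browser (on Windows the file is
# C:\Windows\System32\drivers\etc\hosts):
echo "192.168.1.50  mysite.local" | sudo tee -a /etc/hosts
```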
I'm a newb when it comes to Linux operating systems, so I'm attempting to get better through experience. I work for a web development company and we use Ubuntu as our operating system (the programmers, at least). Anyway, I'm trying to install LAMP services and get them working. I have all the L.A.M.P. services installed, but Apache2 is giving me a problem. I have an .htaccess file in a directory under my document root, but Apache2 is not interpreting it. I have AllowOverride All on, but I can't figure it out. I even made a bogus .htaccess file, attempting to make Apache give me an error: nothing.
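The fact that even a bogus .htaccess produces no error means Apache never reads the file at all, which is what happens when AllowOverride is still None for that particular path. The directive has to sit in a <Directory> block that actually covers the document root of the active vhost, e.g.:

```apache
<Directory /var/www/>
    AllowOverride All
</Directory>
# then: sudo service apache2 reload
```

A more specific <Directory> block with AllowOverride None elsewhere in the config would silently win for its subtree, so it is worth grepping all enabled config files for AllowOverride.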
Then I connect into work through a secure VPN connection that used to be reasonably fast. Sometime in the last year or so it has slowed down dramatically, as if they added a one-minute delay between pages. Everything works fine otherwise; it's just slower. At work they use all Microsoft software, server software, etc., and if I connect with my friend's Windows 7 laptop it works quite well and the pages load quickly, like my Kubuntu laptop used to. What I'm wondering is whether this is some kind of compatibility issue between Internet Explorer and Firefox, or just a sour-grapes issue on Microsoft's behalf where they slowed things down on purpose?
I can't connect to my SSH server (on Ubuntu 10.04) from any external address. The error in PuTTY is "connection timed out". From my internal network, everything works fine, even if I enter my external IP address and port in PuTTY as the destination.
This is what I have done so far:
installed OpenSSH with the Ubuntu Server 10.04 CD
set up port forwarding (11041) in my router (seems to work OK; the port is open according to "Shields Up")
configured SSH to use port 11041 in /etc/ssh/sshd_config
changed the TCP and UDP ssh port in /etc/services to 11041
enabled UFW and did sudo ufw allow ssh
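One thing to double-check in the steps above: `ufw allow ssh` normally opens 22/tcp by profile. Since /etc/services was edited, it may or may not have tracked the new port, so allowing the custom port explicitly removes the doubt, and it is worth confirming sshd really listens there (port number taken from the post):

```shell
sudo ufw allow 11041/tcp
sudo ufw status
sudo netstat -tlnp | grep 11041   # sshd should be listed here
```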
I can't figure out how to get Calamaris to work. I'm using squid3, so I understand that there could be a problem with that; however, I created links for the logs, so that even though it looks for "squid" it should work for squid3...
I even tried going into the /etc/cron.daily/calamaris file and changing squid to squid3, but when I run the cron task it gives no output and there's no file in /var/www/calamaris. What's going on? I need a program to analyze bandwidth.
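To see why the cron wrapper produces nothing, calamaris can be run by hand against the squid3 log so any error reaches the terminal (stock paths; the exact options should be checked against your calamaris version's man page):

```shell
sudo sh -c 'calamaris -a -F html < /var/log/squid3/access.log > /tmp/calamaris.html'
ls -l /tmp/calamaris.html
```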
I do not know a lot about Ubuntu servers, but I made one. I wanted to set up a web server using Apache and DynDNS. So far, I have created an account on dyndns.com and installed Apache. I also installed another program (I don't remember the name) that had something to do with DynDNS. Right now, I can access the page on my server by using ELinks (a text-based web browser) and going to the address (myaddress.dyndns.org). If I try to access it from any other computer, I get a page that says 'Unable to Connect'.
I have created the passwords file with htpasswd and definitely have the right password for bob. However, when I try to log in, the box just comes up over and over again and never authenticates. What am I doing wrong? I'm a newbie, so please bear with me if I've missed something really stupid.
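A password box that repeats with the right credentials is usually a config problem rather than a password problem, most commonly an AuthUserFile that Apache cannot find or read. A hedged sketch of a working block (paths and realm name invented):

```apache
<Directory /var/www/protected>
    AuthType Basic
    AuthName "Restricted"
    # Must be an absolute path, readable by the Apache user:
    AuthUserFile /etc/apache2/.htpasswd
    Require user bob
</Directory>
```

/var/log/apache2/error.log will say explicitly if the user file could not be opened.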
I've been trying for the last week to get gitolite and gitweb running on my Ubuntu 10.04.1 server (with Apache and OpenSSH selected at install). gitolite is working fine; I installed it from git and created a user git which has all the repos lying in his home directory.
I've been trying various guides to set up the gitweb interface, but they never seemed very clean, nor did they support configuring access rights for gitweb in gitolite.conf. I would also like to enable the git daemon and configure its rights in the gitolite config (I haven't looked into that yet, because I need gitweb working first). Essentially, what I want to do is program with some people, with different rights, on a project (gitolite), publish its status on the web when it's ready (gitweb), and also give a quick download possibility (git-daemon).
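gitolite's own convention covers exactly this: granting read access to the special users gitweb and daemon in gitolite.conf marks a repo as visible to gitweb and exportable by git-daemon. A sketch with invented repo and user names:

```
# conf/gitolite.conf in the gitolite-admin repo
repo myproject
    RW+     =   alice bob
    R       =   gitweb daemon
```

Pushing this change to the gitolite-admin repo updates the project list gitweb sees and drops the git-daemon-export-ok marker for the daemon.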