I've set up vsftpd for FTP on my server and edited vsftpd.conf to allow my user to gain access to their home directory, but it doesn't seem to let me in. I'm getting the error
Quote:
530 This FTP server is anonymous only.
The setting in question (anonymous_enable) is currently set to YES.
I've set it to NO and I get
Quote:
500 OOPS: vsftpd: both local and anonymous access disabled!
I don't quite understand what's going on. I followed this tutorial: [URL]
In the end I want to be able to upload files to the www directory for my web site.
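A minimal sketch of the vsftpd.conf lines that control this, assuming a stock install (these option names are standard vsftpd; restart the daemon after editing):

Code:
# allow real (local) system accounts to log in
local_enable=YES
# allow those users to upload
write_enable=YES
# turn anonymous logins off entirely
anonymous_enable=NO
# keep each user jailed inside their home directory
chroot_local_user=YES

With local_enable=YES the "anonymous only" message goes away, and as long as at least one of the two *_enable options is YES, the "500 OOPS: both local and anonymous access disabled" error cannot occur.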
I currently have 4 Linux servers installed in a test lab that I have built for my job. I am in the process of trying to get FTP to work (vsftpd is installed). I don't need an FTP GUI or anything; I can use the terminal (and I don't have an internet connection, so I probably couldn't get one anyway). I bring up the terminal, type ftp, and am presented with a few problems:
1. If I try to FTP to one of the other Linux servers on the network, I get a "No route to host" error.
2. If I try to FTP to the server I am sitting on, then I am able to connect successfully, obviously. But when I do an "ls", I don't see any available files.
I am assuming this is because I have not yet set up a folder for it (e.g. Windows uses an "ftproot" folder). I am running Ubuntu GNOME 9.04 Jaunty Jackalope for a GUI, and I am running Ubuntu Server underneath (yes, I need a GUI for what I am using the server for).
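Two quick checks, one per symptom; a sketch with a made-up hostname (ftp2), and /srv/ftp as the usual anonymous root on Ubuntu (verify the actual path via the ftp user's home directory or the anon_root setting in vsftpd.conf):

Code:
# 1. "No route to host" is a network/firewall problem, not a vsftpd one
ping ftp2                # is the other server reachable at all?
sudo iptables -L -n      # is a firewall rule blocking port 21?

# 2. an empty "ls" usually just means the FTP root has no files yet
ls -l /srv/ftp           # drop files here for anonymous FTP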
I'm having difficulty uploading files to a CentOS server with vsftpd. I have the exact same configuration on a Fedora 10 box, and there I have no problems...
I've been tasked with setting up a RHEL FTP server to mirror one we currently have. From what I've read, I need to install and configure vsftpd and then configure iptables. From what I've been able to come up with, I need to follow the steps in this article to install and set up vsftpd. Do you think this is a good, complete article to follow? Also, how do I copy the iptables config from that server to my new one? I think that iptables on our current server only allows certain IPs or blocks certain IPs (not sure which), so I need to have it do the same on my new server.
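For the iptables question specifically, the ruleset can be dumped on one host and loaded on the other; a sketch assuming SSH access between the two servers (newserver is a placeholder name):

Code:
# on the current server: dump the live ruleset to a file
iptables-save > /tmp/iptables.rules
scp /tmp/iptables.rules root@newserver:/tmp/

# on the new server: load it, then persist it the RHEL way
iptables-restore < /tmp/iptables.rules
service iptables save    # writes /etc/sysconfig/iptables on RHEL/CentOS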
I want to enable communication between an IBM TSM server and an ESX server. For that reason I want to install vsftpd (the /etc/init.d/vsftpd service) on my system, but I am unable to find a link that can provide me with this software.
What is the best method for backing up a VPS server? (A guest instance.) I'm assuming you can't copy the image file while the VM is active, and if you stop the VM you have downtime.
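One common approach that avoids both problems is a copy-on-write snapshot, which lets you copy a consistent image while the guest keeps running; a sketch assuming the guest disk is an LVM logical volume (the vg0/vps-disk names are invented):

Code:
# create a temporary snapshot of the running guest's disk
lvcreate --snapshot --name vps-snap --size 5G /dev/vg0/vps-disk
# copy the frozen snapshot, not the live volume
dd if=/dev/vg0/vps-snap of=/backup/vps-disk.img bs=4M
# drop the snapshot once the copy is done
lvremove -f /dev/vg0/vps-snap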
At my work, we have an ISP that provides us with 2 connections with different IP addresses, but at the moment they don't switch over automatically if one fails, and they only work for outbound traffic. I tried to automate this with a floating static route on a Cisco 1711 router, but then switching to the second link only happens when there's no longer a signal on the cable plugged directly into the router's interface, and the failure most often happens somewhere in the middle. That also does not make us reachable from outside.
Can anyone suggest a better way? Maybe an outside DNS server could have a second IP address recorded for our domain name? I found suggestions that a load balancer could solve this, but those appliances are way too expensive. I also thought about using BGP, but my router's RAM (128 MB) is too small for the global routing table that BGP requires, and I would also need an ISP (or better, 2 ISPs) that provides BGP service. Before trying to convince others that we need to invest more in this, I'd like to know whether there are easier ways.
I have a Samba server on a computer. I would like to back up the Samba files on a different computer that is a client in the Samba server network. I can easily drag and drop the Samba files onto the client. I would like to automate this process, and accomplish it using an update copy rather than a full copy. How can I accomplish this as a bash script? I had no luck using the 'cp' command.
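rsync does exactly this kind of update copy; a minimal sketch assuming the share is already mounted on the client at /mnt/samba (both paths are placeholders):

Code:
#!/bin/bash
# -a preserves permissions, times and symlinks;
# -u (--update) skips files that are already newer at the destination,
# so only new or changed files get copied
rsync -au /mnt/samba/ /home/me/samba-backup/

Drop the script into cron to automate it. The closest plain-cp equivalent is cp -ru, but rsync handles the incremental case much more gracefully.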
I've been running my little server at home for a few months now, and I've noticed that Webmin has detected that I'm in need of over 100 updates. I'm a bit scared to run the updates because everything is working just fine right now, but a part of me still wants to run some of them. Now, I know backing up the server would be a good idea before this happens. This is a media server, with videos, music and pictures taking up the majority of the hard drive space. I just have one partition on the server, outside of the swap partition. I would like to back up everything on the server except the videos, music, and pictures, because even if an update messes with the server, I could always retrieve those, and the external hard drive I'm going to back up to won't be big enough to hold everything anyway.
If I were to back up everything except those directories that hold my videos, music, and pictures, and something were to go wrong, would restoring all of that put me back to the state my system was in when I backed up? I've never done a backup and restore in a Linux environment before. I just want to make sure that doing that will be enough, because the last thing I want to do is hose my server after taking several weeks to get it to the working state that it's at.
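tar's --exclude flags cover the "everything except the media" case, with the caveat that pseudo-filesystems must be skipped too; a sketch assuming the media lives under /home/media and the external drive is mounted at /mnt/backup (both paths are placeholders):

Code:
# full-system archive, skipping media and things that shouldn't be archived
tar -czpf /mnt/backup/system-backup.tar.gz \
    --exclude=/home/media/videos \
    --exclude=/home/media/music \
    --exclude=/home/media/pictures \
    --exclude=/proc --exclude=/sys --exclude=/dev \
    --exclude=/mnt --exclude=/tmp \
    /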
How would I back up a specific package? Really all I want is the configuration files.
The package I'm talking about is mumble-server. Could I, say, rsync all the files that were installed, and then if I wanted to drop it back in, just copy those back over?
If I had to wipe the installation and reinstall, could I install that package again and then drop my backup copy back over it?
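That reinstall-then-restore approach generally works for configuration files; a sketch assuming a Debian/Ubuntu system, where mumble-server keeps its config in /etc/mumble-server.ini (the /backup path is a placeholder):

Code:
# list every file the package installed, to see what's worth saving
dpkg -L mumble-server

# in practice the config is what matters; archive it
cp -a /etc/mumble-server.ini /backup/

# after a wipe: reinstall, then drop the saved config back
apt-get install mumble-server
service mumble-server stop
cp -a /backup/mumble-server.ini /etc/
service mumble-server start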
I have had a server running for a very long time using Ubuntu Server 7.10, and I think it's past time that I upgraded. I'll be installing fresh, and I've already backed up /var/www (as well as a home directory with a few files). I've only used this as a web / SFTP / file server. Might there be any other directories that would be good to back up? I set it up so long ago and have made a few changes along the way.
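Beyond /var/www, the usual candidates are the hand-edited configs and service state; a catch-all sketch (which of these actually matters depends on what was changed over the years):

Code:
# /etc holds nearly every hand-edited config (apache, ssh, fstab, ...)
# /var/lib covers service state (e.g. mysql databases, if any)
# per-user crontabs live under /var/spool/cron
tar -czf pre-upgrade-backup.tar.gz \
    /etc /var/www /var/lib /var/spool/cron /home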
I have an Ubuntu server running Lucid. I'd like to be able to back up the hard drive in the server to an external hard drive. I try to plug in a drive via a USB port and it doesn't appear to mount automatically, as it does on the desktop version. Questions: 1) What/where should I be looking for to see if the drive is mounted? (I've looked in /dev and /media; no dice.) 2) What's the mount command I should use to manually mount the external hard drive? 3) What backup commands or programs, other than rsync, are recommended? (Nothing against rsync.)
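For questions 1 and 2: the server edition detects the drive but does not auto-mount it. A sketch assuming the disk appears as /dev/sdb; check the dmesg output for the real name:

Code:
# 1. see whether the kernel saw the drive, and under which device name
dmesg | tail             # look for lines like "sdb: sdb1"
sudo fdisk -l            # lists all detected disks and partitions
mount                    # shows what is currently mounted

# 2. mount the first partition manually
sudo mkdir -p /mnt/backup
sudo mount /dev/sdb1 /mnt/backup

For question 3, rsnapshot (rsync-based, but snapshot-style) and plain tar are common alternatives.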
I installed the vsftpd server on one of my servers using the "yum install vsftpd" command. An NFS server is running on another server and is mounted as "/data" on this FTP server. root on the FTP server also has root authority on the NFS server. All the files and sub-folders under "/data" on the FTP server have 755 or 766 mode. I even modified the vsftpd settings to allow root login.
When I log in as root to the FTP server with the FileZilla client, I can see the full file list in root's home directory and move into the /data directory. I can download any file on the local HDD, but I cannot download any file in the /data directory.
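One thing worth ruling out is the NFS export itself: with the default root_squash option, root on the FTP server is mapped to an unprivileged user on the NFS files regardless of the mode bits. A quick check (somefile is a placeholder):

Code:
# on the NFS server: is /data exported with root_squash?
cat /etc/exports

# on the FTP server, as root: can the file be read outside FTP at all?
cat /data/somefile > /dev/null    # if this fails too, it's NFS, not vsftpd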
I have just generated a new SSL key on my FTP server with the following command
Code:
I then put my new key onto my file server and attempted to connect to the FTP server, and it failed (this did work before with the default key). I use curlftpfs to mount the FTP directory locally as /ftpbackup; below is the command and the output.
Code:
Error connecting to ftp: server certificate not activated yet.

As you can see, it gives an error about the certificate not being activated. I have looked this up and can't find a way to activate it.
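"Not activated yet" normally means the certificate's notBefore date is still in the future from the client's point of view, i.e. a clock mismatch rather than anything to "activate"; two checks worth running (the certificate path is a guess):

Code:
# compare the certificate's validity window with the client's clock
openssl x509 -in /etc/vsftpd/vsftpd.pem -noout -dates
date                      # run this on the client doing the mount

# if the clocks disagree, sync the client
sudo ntpdate pool.ntp.org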
Below is the contents of vsftpd.conf on the ftp server
I like music a lot! Every CD I buy, I like to rip to my HD as WAVs (I'd do FLAC, but I have a Windows system too, and can't for the life of me find a FLAC codec for Media Player); but I'd also like to create images in case I lose / scratch / otherwise damage my CDs. I usually use dd, but the filesystem has to be unmounted. Problem is, I don't know where audio CDs are mounted, so I can't unmount them.
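Audio CDs contain no filesystem, so they are never mounted (which is why there is nothing to unmount) and dd won't read them reliably; a sketch using cdparanoia for the tracks and cdrdao for a full image, assuming the drive is /dev/cdrom:

Code:
# rip every audio track to WAV files (track01.cdda.wav, ...)
cdparanoia -B -d /dev/cdrom

# make a bit-for-bit image of the whole disc (audio needs raw reads)
cdrdao read-cd --device /dev/cdrom --datafile mycd.bin mycd.toc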
I have a Seagate USB drive that I'd like to use as a backup drive for my home system, which has two drives. One drive contains /home and /root; the other contains /var. I've read about a lot of different backup software, but I'm not really sure which would be best for this. I want to be able to use this backup to restore the system in case something happens. What would be the best software to use? I'd like something that basically clones the system, I assume, since I'd like it to copy not only the directory structure but also symlinks.
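For a restorable clone that preserves symlinks, permissions and hard links, plain rsync in archive mode is often enough; a sketch assuming the USB drive is mounted at /mnt/seagate:

Code:
# -a  archive mode: recurse, keep symlinks, perms, owners, timestamps
# -H  preserve hard links;  -x  stay on one filesystem per source
rsync -aHx / /mnt/seagate/root-drive/
rsync -aHx /var/ /mnt/seagate/var-drive/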
I'm looking to set up a home server for the purpose of backing up and storing the files from our multiple (Windows) computers. What kind of server should I set up? Samba? LAMP?
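For backing up Windows machines, Samba is the natural fit (LAMP is a web stack, not file sharing); a minimal smb.conf share as a sketch, with placeholder path and share name:

Code:
[global]
   workgroup = WORKGROUP
   security = user

[backups]
   comment = Backup share for the Windows machines
   path = /srv/backups
   read only = no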
I just installed Ubuntu Server and wish to run an Apache web server on it. I have that set up, with each user having their individual folder (e.g. the Apache root is /var/www/, LazerPhreak's folder is /var/www/LazerPhreak/, so their website would be www.mysite.com/LazerPhreak/). I wish to set up vsftpd to let each user access their individual folder and upload website files via FTP. How should I go about this?
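vsftpd can map each local user into a per-user directory with its user_sub_token option; a sketch of the relevant vsftpd.conf lines (the option names are standard vsftpd, while the /var/www layout comes from the setup described above):

Code:
# real users log in with their system passwords
local_enable=YES
write_enable=YES
# jail every user inside their own tree
chroot_local_user=YES
# expand $USER per login: LazerPhreak lands in /var/www/LazerPhreak
user_sub_token=$USER
local_root=/var/www/$USER

Each user's directory must exist and be writable by that user; note that depending on the vsftpd version, a chrooted root directory itself may not be allowed to be writable.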
I have a vsftpd server configured, and I cannot upload using the anonymous account. I've trawled the net and have exhausted my search for answers. Here is my vsftpd.conf file.
Code:
# Example config file /etc/vsftpd/vsftpd.conf
#
# The default compiled in settings are fairly paranoid. This sample file
# loosens things up a bit, to make the ftp daemon more usable.
# Please see vsftpd.conf.5 for all compiled in defaults.
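The posted file is just the sample header, but for anonymous uploads specifically, vsftpd needs all of the following (standard option names; the directory path is an example):

Code:
# both must be YES for anonymous uploads to work at all
write_enable=YES
anon_upload_enable=YES
# optionally allow anonymous users to create directories as well
anon_mkdir_write_enable=YES

The upload directory must also be writable by the ftp user on the filesystem (e.g. chown ftp /var/ftp/incoming), and it should be a subdirectory: vsftpd refuses to run with a writable anonymous root.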
We are in the process of backing up our hard drives to Blu-rays. I am creating tar.gz files and burning them to Blu-ray. Is it possible to use a simple (preferably Python-based) solution for creating images from those tar.gz files of a predetermined size (to fit on the Blu-ray), and simply burning those images to disc? Do you have any other approach for creating physical backups of your hard drives?
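You asked for Python, but the splitting step is a one-liner with GNU split instead, so a shell sketch may be all you need (23G is an assumption, leaving headroom on a 25 GB single-layer disc):

Code:
# cut the archive into disc-sized pieces: backup.tar.gz.aa, .ab, ...
split -b 23G backup.tar.gz backup.tar.gz.

# burn one piece per disc; to restore, concatenate and extract
cat backup.tar.gz.* > backup.tar.gz
tar -xzf backup.tar.gz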