General :: Most Important Directories To Backup On A Server?
May 21, 2010
I'm running an Ubuntu 9.10 Linux server. I'm trying to find a way to back up the machine while it is running, and from what I can see, that rules out the disk clone utilities. All of the disk cloning tools I have seen for Linux require that you reboot into a special live CD. So my question is this: what is the best solution for backing up the system while it is running? Also, I don't really care about the OS config too much; I just want to be able to keep my stored files and the programs I have installed on it.
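A minimal sketch of one common approach, assuming the important data lives under /home and the external drive is mounted at /mnt/backup (both paths are assumptions, not from the original post): rsync copies the files and a dpkg selections dump records the installed programs, and both can run on a live system.
Code:
#!/bin/bash
# Hypothetical live-backup sketch: adjust the paths to your own setup.
DEST=/mnt/backup            # assumed mount point of the backup drive

# Copy user data; --archive preserves permissions/timestamps, --delete mirrors deletions.
rsync --archive --delete /home/ "$DEST/home/"

# Record which packages are installed so they can be reinstalled later with:
#   sudo dpkg --set-selections < packages.list && sudo apt-get dselect-upgrade
dpkg --get-selections > "$DEST/packages.list"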
I am in need of Linux help. I am at college and I need this backup/restore script to pass the final part of an assessment. I require a backup script that will not only back up but also restore files to the relevant directories. E.g. users are instructed to store all word processor files in a directory named wp. So I need to create a backup directory and 3 directories within it, plus some files within those 3 directories, and then back them up or restore them. I know I should/have to do this myself, and I have been trying to find and understand info for the last few days, but I came up with zero.
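A rough sketch of the kind of script the assessment seems to ask for. The directory names besides wp (docs, misc) and the archive location are assumptions, not part of the assignment.
Code:
#!/bin/bash
# Hypothetical backup/restore sketch for the assessment.
BASE=$HOME/assessment
DIRS="wp docs misc"                       # the 3 directories to back up
ARCHIVE=$HOME/backup/assessment.tar.gz

case "$1" in
  setup)
    for d in $DIRS; do
      mkdir -p "$BASE/$d"
      touch "$BASE/$d/sample1.txt" "$BASE/$d/sample2.txt"   # demo files
    done ;;
  backup)
    mkdir -p "$(dirname "$ARCHIVE")"
    tar -czf "$ARCHIVE" -C "$BASE" $DIRS ;;
  restore)
    tar -xzf "$ARCHIVE" -C "$BASE" ;;
  *)
    echo "Usage: $0 {setup|backup|restore}" ;;
esac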
Right now I use dd to back up the MBR and the first 512 bytes of every partition.
Code: dd if=/dev/sd? of=sd?.bin bs=512 count=1
That would take care of the partition table and GRUB in the MBR, but what about the superblock and the stuff that XP stores in its boot sector?
Code: dd if=/dev/sd?1 of=sd?1.bin bs=512 count=1
Is it safe to assume that if GRUB was on sd?1 it is backed up? Does it back up the superblock, so that if some bug or power failure makes the partition unmountable because of a bad superblock, it can be fixed by copying back the first 512 bytes? I am of course not changing the partition size or the type of filesystem, and not messing around with GRUB.
What directories/files should be backed up? What are you using for this job? Basic backup: /home and /etc, maybe /root and /boot, and often you want to back up some parts of /var such as /var/log. I can use cp and scp for the simplest backups; tar and cpio for tape, etc.; dump and restore for whole-filesystem backups; rsync for incremental backups.
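A minimal sketch of the "basic backup" described above using tar; the destination path /backup and the dated filename are assumptions.
Code:
# Archive the commonly backed-up directories into a dated tarball.
# /backup is an assumed destination; /var/log is included per the list above.
tar -czpf /backup/system-$(date +%F).tar.gz /home /etc /root /boot /var/log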
I would like to back up important files (totaling about 400GB) on my ext4 RAID 5 array to an ext4 external hard drive over USB (the external drive is mounted to /mnt). In the future I'd like to automate the process using rsync and cron, so for now I'm using rsync to transfer the files. My problem is that I'm using the rsync command like this: # rsync -Pr "/dir1" "/dir2" "/dir3" "/dir4" /mnt
rsync shows me the checks and transfers for a while and then throws up an I/O error (wish I had a screenshot to show, but I don't). When I ls /mnt I get a similar I/O error. I then check /dev for the drive and find that it no longer shows up. Originally the partition was /dev/sdc1. I tried unplugging the USB at this point, plugging it back in and mounting the drive back to /mnt; however, it has now been assigned to (you guessed it) /dev/sdd1. I get the drive mounted and try the original rsync command again, hoping the first error was a fluke or some kind of one-time drive fart. This time it makes it quite a bit further and then throws up the exact same problem. Am I doing something terribly wrong here? As I said, I'm very new to bash, so I hope I'm not making some absolutely moronic newbie mistake.
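Once the drive problem is sorted out, automating the transfer with cron is straightforward. A minimal sketch, assuming the same source directories and mount point as above and that the drive is already mounted when the job runs; the file name under /etc/cron.d and the log path are assumptions.
Code:
# /etc/cron.d/backup (hypothetical file) - run the rsync nightly at 02:30.
# -a preserves ownership/permissions; --delete mirrors removals to the destination.
30 2 * * * root rsync -a --delete /dir1 /dir2 /dir3 /dir4 /mnt >> /var/log/backup-rsync.log 2>&1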
I then installed a new version of Ubuntu 10.04 from disk and copied the files in /home from the CD to the hard drive. I am able to open, view, etc. all the files in most directories except those in /home/documents. There are text files created by gedit, OOWP and several PDF files. I cannot open or view these files, depending on the type: gedit and PDF files get an error message, "Don't recognize file type" (it is clearly marked PDF). The OO files look like rows of 'high bits', and a dialogue box opens giving me the options to change Character Set, Font, Language, Paragraph break.
I would like to zip only selected directories (and their child directories as well). I have many directories in the current folder, like app, content, db, library etc., but I would like to zip only app and content and their child folders. I am trying the following.
zip -r ../backups/code/20110625 -i app/* -i content/* . *
But I am getting the following error: zip error: Invalid command arguments (nothing to select from)
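For reference, a form of the command that should select only those trees (a sketch using the archive and directory names from the post): the input list "." has to come before the -i options, and the patterns need quoting so the shell doesn't expand them before zip sees them.
Code:
# Recurse the current directory but include only the app and content trees.
zip -r ../backups/code/20110625 . -i "app/*" "content/*"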
I'm hoping somebody can find something here that I haven't. I'm trying to use rsync to back up home directories to a NAS. First, I NFS-mounted the NAS and ran an rsync, and everything worked out fine. The transfer completed after a few hours and everything was transferred (lots of stuff!). I then decided that I don't want to leave the NAS mounted all the time, and I didn't want to automate mounting and unmounting of the NAS, as I didn't think I could produce a script that would work reliably enough. So I decided to start an rsync daemon on the NAS and run the sync via that. I run the following command (results are included; the ^C is me killing it after it hangs).
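For reference, a minimal sketch of the daemon-mode setup being described; the module name "homes", the hostname "nas" and the paths are assumptions, not taken from the post.
Code:
# On the NAS: /etc/rsyncd.conf (hypothetical module definition)
#   [homes]
#       path = /volume1/backups/homes
#       read only = false
#
# Start the daemon on the NAS:  rsync --daemon
#
# On the client, the double colon selects the daemon protocol instead of rsync-over-ssh:
rsync -a --delete /home/ nas::homes/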
I want to make a web server with multiple users allowed to log in through SFTP to a specific folder, www. Multiple users are added, let's say user1 and user2, all of them belonging to the www-data group. The www directory has owner www-data and group www-data.
I have used chmod -R 775 on the www folder, but after I create a folder test through my SFTP server (using FileZilla), the group of the created directory has only r and x permissions, and I am not able to log in as the second user, user2, and create a directory within www/test due to the lack of w permission for the group.
I also tried using chmod 2775 on the www directory, but without luck. Can somebody explain to me how I can make it so that a newly created directory inherits the group and permissions of the root www directory?
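A sketch of the usual fix, for reference: the setgid bit (the 2 in 2775) makes new entries inherit the group, but the group write bit on new directories comes from each user's umask, so the SFTP sessions also need a umask of 002. Where that umask is set depends on the SFTP setup; the sshd_config fragment below assumes OpenSSH's internal-sftp (version 5.4 or later), and the /var/www/www path is an assumption.
Code:
# Make the tree group-owned by www-data; give directories the setgid bit so
# new entries inherit the group.
chgrp -R www-data /var/www/www
find /var/www/www -type d -exec chmod 2775 {} \;
find /var/www/www -type f -exec chmod 0664 {} \;

# In /etc/ssh/sshd_config, force a group-writable umask for SFTP sessions:
#   Subsystem sftp internal-sftp -u 0002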
Does anyone know of any decent enterprise level backup solutions for Linux? I need to backup a few servers and a bunch of desktops onto one backup server. Using rsync/tar.gz won't cut it. I need like bi-monthly full HDD backups, and things such as that, with a nice GUI interface to add/remove systems from the backup list. I need basically something similar to CommVault or Veritas. Veritas I've used before but it has its issues, such as leaving 30GB cache files. CommVault, I have no idea how much it is, and if it supports backing up to a hard drive rather than tape.
I am writing a script. My requirement is this: all the file types are stored in one directory, and from that we need to separate them into different directories based on the file type.
For example, in a directory (anish) there are 5 items of different types: 1 directory, 2 .txt files and 2 .sh files.
My requirement is that the 1 directory is moved to a new directory (dir) which is given in the script, the 2 .txt files are moved to another new directory (test) which is given in the script, and the 2 .sh files are moved to another new directory (bash) which is given in the script. Finally, the directory anish should be empty. Using a bash script, how is this possible?
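A minimal sketch of such a script, assuming the source directory and the three target directories named in the post (anish, dir, test, bash) all live under the current directory.
Code:
#!/bin/bash
# Sort the contents of ./anish into ./dir, ./test and ./bash by type.
SRC=./anish
mkdir -p dir test bash

for item in "$SRC"/*; do
    if [ -d "$item" ]; then
        mv "$item" dir/                 # sub-directories go to dir/
    elif [[ "$item" == *.txt ]]; then
        mv "$item" test/                # .txt files go to test/
    elif [[ "$item" == *.sh ]]; then
        mv "$item" bash/                # .sh files go to bash/
    fi
done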
Nothing informative from me as of now; I have only been out of "Windows" for a little less than a day. I suppose the only alternative conversation topic is a simple request for more experienced users to point out some major musts and topics to focus on, as I'm new to open source ANYTHING, really.
To think I was so "under the gun" about spending loads of cash to keep up with the IT group of choice; now I have gained some much-needed relaxation with my cup of tea. A pinpointed request, I guess: "when you first started, and knowing what you can do with it now, would you be obliged to say 'HEY, THIS IS THIS, BUT WATCH OUT FOR THIS'?"
My system crashed yesterday. Before I mess it up any further, I would like to back up some important scripts stored on it. With so many live CDs available, I am confused about which one to choose. It should not be too big (in terms of MB); any small live CD will do.
I'm sure I've done this before and never run into trouble, so it's not like I wasn't thinking; just something went wrong this time and I'm screwed. I'll post an fdisk -l to show ya what I mean.
Code:
[root@localhost ~]# fdisk -l /dev/sda
Warning: invalid flag 0x0000 of partition table 5 will be corrected by w(rite)
Disk /dev/sda: 1000.2 GB, 1000204886016 bytes
255 heads, 63 sectors/track, 121601 cylinders
Units = cylinders of 16065 * 512 = 8225280 bytes
Disk identifier: 0x624aa2e0
Well, I tried to install PCLinuxOS 2010 to the same partitions as Mint 7 and basically just overwrite it (this I have done before), but this time PCLinuxOS says there is a bad block and it can't copy files to / (sda7). I didn't think much of it at the time and thought I would try to install it (PCLinuxOS) on a separate 100 GB USB drive, and it was a successful install.
After booting PCLinuxOS I try to access my 1TB sda hard drive and I can only access sda1. When I look at /dev/sda in GParted it shows the whole disk as unallocated. I have done nothing at this point in time other than what I just said, because I am very afraid of losing all my pictures and everything else on that disk. And for those that will say "always back up": if ya think I'm not kicking myself right now, you're dead wrong.
I have read many articles by eminent Linux users who laugh it off when they are asked "is command line knowledge necessary?". They go on to say that Linux distros have evolved so much that the GUI is sufficient! I use my Windows desktop for 1) checking the news, 2) checking my e-mail, 3) writing a blog, 4) listening to music, and 5) since I am a consultant physician specializing in diabetology, keeping up with the trends by visiting a few professional websites! So my needs are few! Which distribution would you suggest to a completely Linux-ignorant person, and that's me?
I am using MySQL as the database system for my application on a Linux system. Every week I update the system and take backups (mysqldump) of the databases changed (2 databases). I then .tar.gz them and ftp the resulting file to a remote server, after which I remove the original backups and tar.gz files from the Linux server. Being a complete novice when it comes to Unix systems, I would like to know if it is possible to write a script which would do all this automatically, i.e. perform the following steps.
1) Back up database A to A.sql (mysqldump)
2) Back up database B to B.sql (mysqldump)
3) tar -cvzf dest.tar.gz A.sql B.sql
4) ftp dest.tar.gz to email@example.com
5) Delete A.sql, B.sql and dest.tar.gz from the local server
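A rough sketch of such a script. The database names, FTP host and paths are placeholders rather than values from the post, and it assumes passwordless access is already configured via ~/.my.cnf (MySQL) and ~/.netrc (FTP) so no credentials appear on the command line.
Code:
#!/bin/bash
# Weekly MySQL backup sketch: dump, compress, upload via FTP, clean up.
set -e
STAMP=$(date +%F)
WORK=/tmp/dbbackup
mkdir -p "$WORK" && cd "$WORK"

mysqldump databaseA > A.sql          # credentials assumed to come from ~/.my.cnf
mysqldump databaseB > B.sql

tar -czf "dest-$STAMP.tar.gz" A.sql B.sql

# Upload; login details for ftp.example.com assumed to be in ~/.netrc
ftp ftp.example.com <<EOF
binary
put dest-$STAMP.tar.gz
bye
EOF

rm -f A.sql B.sql "dest-$STAMP.tar.gz"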
I have 2 servers. One is an operational CentOS + cPanel server and the other is a blank CentOS server. I want to use cPanel's backup suite to back up our customers' accounts, but I want to do it so the blank server is mounted on the cPanel server, rather than doing an FTP backup, if that makes sense. I believe from previous experience that NFS is the way to go, but I'm not sure.
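For reference, a minimal sketch of that NFS arrangement; the export path, mount point and the cPanel server's address are assumptions.
Code:
# On the blank backup server: export a directory in /etc/exports, allowing the
# cPanel server (assumed address 192.0.2.10) to write to it:
#   /srv/cpanel-backups 192.0.2.10(rw,sync,no_root_squash)
# then reload the export table:
exportfs -ra

# On the cPanel server: mount the export where the backup suite will write:
mount -t nfs backupserver:/srv/cpanel-backups /backup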
I am looking for a piece of hardware that would be able to run Linux and act as a web, Subversion and file server. Ideally it shouldn't have any fans, because it will be in the middle of my living room, and the computing power needs are minimal. I would be grateful for recommendations.
This is my first post and I am a Linux newbie. I took on the challenge of setting up an Ubuntu server with proxy and firewall, with VPN access as well. Sounds good? While messing with things, I installed the ebox server, now called Zentyal, and performed backups. However, when performing a full restore, following the instructions and all, it does not restore correctly. I ran many attempts with various configurations and still always got some errors, such as ebox-ebackup failing to start, etc. Most notable was the LDAP error, which I figured out how to fix by restarting it once logged in and then rebooting.
I saw the Clonezilla options for a full backup, and it sounds great, but it does require me to bring the server down to perform the full backup. What I wanted to know was: what do you recommend for performing a full backup (possibly to an NTFS partition or a USB key with FAT32), with automation, or even if I had to trigger it manually, just without needing to bring the server down, so that when I go into production I don't have any issues backing up at any time?
I am trying to create a backup script that will back up a single folder for a class I am in. I was wondering if I could get some help. If possible, I would also like to know how to write a script that can encrypt that same file. I will be putting the backup in my /home/usr/Backup directory. I am not trying to back up my whole system, just a single folder. I am using Fedora 11.
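A minimal sketch of that kind of script, assuming the folder to protect is ~/classwork (an assumed name) and using gpg's symmetric mode so no key pair is needed.
Code:
#!/bin/bash
# Back up a single folder into /home/usr/Backup and encrypt the archive.
SRC=classwork                             # assumed folder under $HOME to back up
DEST=/home/usr/Backup
STAMP=$(date +%F)

mkdir -p "$DEST"
tar -czf "$DEST/$SRC-$STAMP.tar.gz" -C "$HOME" "$SRC"

# Symmetric encryption: gpg prompts for a passphrase and writes a .gpg file,
# then the plain tarball is removed.
gpg --symmetric "$DEST/$SRC-$STAMP.tar.gz" && rm "$DEST/$SRC-$STAMP.tar.gz"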
I am administering a live web server and I want to keep a backup of the access log file without disturbing the server's performance. Can anyone guide me on how to do this? The size of the log file runs into GB, so I will need to take a daily backup.
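A common way to handle this is logrotate, which rotates and compresses logs on a daily schedule without stopping the server. A minimal sketch; the Apache-style log path, the file name under /etc/logrotate.d and the reload command are assumptions.
Code:
# /etc/logrotate.d/webserver-access (hypothetical file)
/var/log/httpd/access_log {
    daily
    rotate 30            # keep 30 days of rotated logs
    compress
    delaycompress        # leave the most recent rotation uncompressed
    missingok
    notifempty
    sharedscripts
    postrotate
        /sbin/service httpd reload > /dev/null 2>&1 || true
    endscript
}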
I have installed a Linux server in my office to serve 16 machines. Its main use will be as an internal mail server, but it will also be running websites.
I have installed Ubuntu 9.10 server x64 and have got apache running.
I am looking for the simplest, most robust solution for SMTP, POP3 and IMAP. I have only ever used qmail before; I found it a pain to configure and it's getting old, so I thought I should probably try something new. I don't have much experience with running POP3 or IMAP on Linux, so I would love a suggestion on that.
I have a university account on a Linux machine with 10TB of free space accessible via SFTP. I would like to back up my Windows 7 x64 laptop to the university machine. Currently I am using rsync+Cygwin, but the backup is pretty slow (no shadow copy) and I hate the console window appearing on my screen every day when I log in.
So I am looking for something like Windows Backup but with support for SFTP. A combination of tools will work too.