I have a file server running in an office that's mostly used for file sharing; a scanner also saves PDF files to it. I'm running the latest LTS Ubuntu Server edition, and I really only have SSH and Samba installed. My question: I've done so much work on this server in terms of permissions and configuration, and I'd like to make a clone of it on another computer, but I'm not sure how I would do this.
I'm not sure if Clonezilla or something like it can perform this task? I basically have a very old computer, and now I have another very old computer that I want to turn into a spare just in case something happens to the original. Any recommendations on how I would accomplish this?
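One minimal way to do this (a sketch, not a definitive recipe: it assumes both machines can boot from live media, the spare's disk is at least as large as the original's, and both disks appear as /dev/sda -- verify with sudo fdisk -l; the IP address is a placeholder) is to stream the whole disk over SSH:
Code:
# Run on the source machine, booted from a live CD so the filesystem is
# idle; the spare is also booted from live media and running sshd.
sudo dd if=/dev/sda bs=4M | ssh user@192.168.1.50 "sudo dd of=/dev/sda bs=4M"
Clonezilla automates essentially the same copy and skips unused blocks, so it is usually faster on mostly-empty disks.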
I have Ubuntu 10.10 installed on a 1TB HD. I would like to back up the full HD onto a 2TB USB HD I have. I want an easy way to clone/back up the full installation, so that if my HD fails I can boot the USB HD to get files or do a full reinstall. I have tried dd, but I'm not certain it worked correctly. I also tried Clonezilla, and that did not work.
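For reference, a bare disk-to-disk dd clone looks like the sketch below; /dev/sda (internal) and /dev/sdb (USB) are assumptions, so confirm the device names with sudo fdisk -l first, and run it from a live CD so the source filesystem isn't changing mid-copy:
Code:
# Clone the internal 1TB disk onto the 2TB USB disk; conv=noerror,sync
# pads unreadable sectors with zeros instead of aborting
sudo dd if=/dev/sda of=/dev/sdb bs=4M conv=noerror,sync
sync    # flush write buffers before unplugging the USB drive
Because the copy includes the MBR and partition table, the USB disk should be bootable afterwards, with the extra 1TB left as unpartitioned space.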
I am new to Ubuntu. I have a new Dell Mini 10 that I have configured and running. I want to clone or back up the system in its current state. I have been looking at the options, and Clonezilla seems to be recommended often. Are there other choices that are easier or better? My real goal is to make a bootable DVD that will restore my system to the point at which it was cloned.
I'm looking for software that can back up all the files in my /home directory, including hidden files.
I liked Lucky Backup, but it puts everything in a tar file, meaning that the backup fails if the file gets too large (4 GB, I think). I would prefer to avoid tar/archives anyway, as I often only need to recover one file from a backup (an archive holding my 50 GB of data would take ages to open).
Does anyone know of a program, or a way to get rsync or the like to copy all the files in a directory, including hidden files, into another directory (so I end up with effectively a carbon copy of the original)? Disk space is not an issue, so I don't need to compress anything. I'm not bothered whether it's a fancy GUI-based program or an rsync command, just so long as it can save my precious files from... myself.
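For what it's worth, plain rsync already copies hidden files when you point it at the directory itself rather than at a shell glob; a minimal sketch with placeholder paths:
Code:
# Mirror /home/user into /backup/home-mirror, dotfiles included.
# The trailing slash on the source means "the contents of", and
# --delete keeps the destination an exact carbon copy.
rsync -av --delete /home/user/ /backup/home-mirror/
The usual pitfall is `rsync /home/user/* /backup/`, where the shell's * silently skips dotfiles; naming the directory avoids that.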
I have a perfectly working CentOS server. I want to clone the complete OS (over the network) so that I can use it for the same functionality on several other machines.
I want to clone my Linux partition to create a backup. I want to use the dd command, but I have some questions. My Linux partition is 30GB, and Linux only uses 10GB of it; if I use dd to create an image, must I have 30GB of free space? And can I use dd from within X, or must I first exit Linux and use a live CD? In Fedora I used dd to create a backup while Linux was running, but after restoring, some commands like su did not work!
I have also used tools like partimage to make a backup, but it shows me an error about block 0!
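On the free-space question: a raw dd image is always the full 30GB regardless of usage, but piping it through gzip usually shrinks it well below that, since unused space compresses well. A sketch (the partition and output paths are placeholders):
Code:
# Image the 30GB partition, compressing on the fly; boot from a live CD
# first so the filesystem isn't mounted read-write while being imaged
sudo dd if=/dev/sda2 bs=4M | gzip > /mnt/usb/linux-part.img.gz
# restore later with:
# gunzip -c /mnt/usb/linux-part.img.gz | sudo dd of=/dev/sda2 bs=4M
Imaging a partition while the system is running on it is the likely reason the earlier Fedora restore misbehaved: the copy was inconsistent.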
The place I work has a web/DNS server on openSUSE 10 up and running, hosting a few websites (our company one and a few vhost ones). The box was set up before I got here. Now we want to create a new 11.2 server that is basically a backup/clone of the first server in case it goes down.
I have a scheduled backup that runs on our server at work, and since 7/12/09 it has been producing 592 KB files instead of 10 MB files. In mysql-admin (the GUI tool) I have a stored connection for the user 'backup'; the user has SELECT and LOCK rights on the databases being backed up. I have a backup profile called 'backup_regular', and in the third tab along it is scheduled to back up at 2 in the morning every weekday. If I look at one of the small backup files generated, I see the following:
Code:
-- MySQL Administrator dump 1.4
--
-- ------------------------------------------------------
-- Server version

[code]....
It seems that MySQL can open and write to the file fine; it just can't dump the actual data.
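While the GUI scheduler is being debugged, a plain mysqldump from cron is a workable stand-in, and any error that truncates the dump lands in the cron mail or log; a sketch with placeholder credentials and paths:
Code:
# /etc/cron.d/db-backup -- dump all databases at 02:00 on weekdays
0 2 * * 1-5 root mysqldump -u backup -pSECRET --all-databases > /var/backups/db-$(date +\%F).sql 2>> /var/log/db-backup.err
A dump that stops right after the header usually means the first real query failed, so the tail of the small files or the stderr log should name the actual error.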
Does anyone know of any decent enterprise-level backup solutions for Linux? I need to back up a few servers and a bunch of desktops onto one backup server. Using rsync/tar.gz won't cut it; I need things like bi-monthly full HDD backups, with a nice GUI interface to add/remove systems from the backup list. Basically I need something similar to CommVault or Veritas. I've used Veritas before, but it has its issues, such as leaving 30GB cache files. CommVault, I have no idea how much it costs, or whether it supports backing up to a hard drive rather than tape.
I am getting a new server and would like to avoid the typical install process of OS and applications. How can I clone my current server onto a new one that has different hardware?
I have two identical servers, one has RHEL 5 and Zimbra installed and the other is currently not really doing anything. Both have hardware RAID (Adaptec) set to RAID10, identical hard drives, etc. The RHEL/Zimbra machine is set up with LVM2. Is it possible for me to hook them up on the secondary NICs and boot the second machine with Knoppix or something else, and easily tell it to duplicate the first machine onto the second, down to the last bit, or do I need to make all the partitions beforehand and dd each one separately?
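Since the hardware and array layout are identical, a bit-for-bit copy of the whole disk device avoids recreating partitions and LVM by hand. Over the secondary NICs, netcat is the usual trick; a sketch, assuming both machines are booted from live media (e.g. Knoppix), the array appears as /dev/sda on each, and 10.0.0.2 is the target's placeholder address (some netcat variants want `nc -l 9000` without -p):
Code:
# on the target machine: listen and write the incoming stream to disk
nc -l -p 9000 | sudo dd of=/dev/sda bs=4M
# on the source machine: send the whole disk
sudo dd if=/dev/sda bs=4M | nc 10.0.0.2 9000
Afterwards, change the hostname and IP on the copy before putting both boxes on the same network.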
I have CentOS 5.3, and I want to clone or create an image of my working server onto another hard disk, so that if my server goes down I can just put in that other hard disk, which holds the image or clone of the crashed server, and my server will be back up in a small amount of time. Is this possible or not? If yes, then how?
I currently have a group of 3 servers connected to a local network. One is a web server, one is a MySQL server, and the other is used for a specific function on my site (calculation of soccer matches!).
Anyway, I have been working on the site a lot lately, but it is tedious connecting my USB hard drive to each computer and copying the files. This means I am not backing up as often as I should...
I have a laptop connected to this same network that I use for development, so I can SSH into the computers. Is there any software for Ubuntu that can take backups of files I choose on multiple computers? I know I could rsync, but is there something with more of a GUI?
Then, every 2 days, I can just move the most recent backup from my laptop to the USB drive. That way I will have the backup stored in 2 places if things go kaboom somewhere.
I have been hassling with this for several days now. I have 64-bit Ubuntu Server 10.04 running on an Acer Aspire EasyStore H340. I have Windows 7 running on a 64-bit desktop PC and on a laptop. I mainly wanted to use the Ubuntu server as a file server, so I installed Samba and created three shares. These do show up in Windows Explorer, and I can read and write to them. My Windows applications seem to be able to see the shares and open and save files.
My next step was to try to set up a backup of the Windows 7 PC to the Ubuntu server. Windows' integrated backup sees the shares and the sub-directories within them, and the initial part of the backup seems to run OK, but when it tries to save an image of the "C:" drive it works for a long time and then ends with an error (cannot complete backup).
So I looked for some free backup programs to try, but these do not allow me to select the shares as a destination (invalid destination). The dialogue sees the drives I have mapped the shares to in Windows, but does not show any sub-directories, and selecting the mapped drive letter is not accepted as a destination. If I try to browse down through "Network" in the destination dialogue, it selects "Network" but does not expand it or accept it as a destination.
So I partitioned and formatted my 2nd 1TB SATA drive as ext3 on the server, and mounted it as "storage". I set this up as a share in Samba and gave everyone read-write access, but still had no luck selecting it as a backup destination. After some Googling, I downloaded and installed Ext2Fsd-0.48 (a Windows 'driver' for ext2/ext3). It installed correctly, but when I open the program, neither "Network", the shares, nor the mapped drives show up anywhere.
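For comparison, a minimal Samba share of the kind that usually works as a backup destination looks like the sketch below (the path and user are assumptions); Windows 7's backup also tends to want the share given as a UNC path (\\server\storage) with explicit credentials, rather than a mapped drive letter:
Code:
# /etc/samba/smb.conf -- hypothetical share definition
[storage]
   path = /media/storage
   browseable = yes
   read only = no
   guest ok = no
   valid users = backupuser
   create mask = 0660
   directory mask = 0770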
Is it possible with dd for the output file to be stored at some remote location? I do not have free space on the LVM partition whose backup I want to take via dd.
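Yes: if you omit of=, dd writes to stdout, so the image can be piped straight to another host over SSH with no local scratch space; a sketch where the LV path, user, host, and destination file are all placeholders:
Code:
# stream the logical volume over SSH, compressing in transit; nothing
# is written locally, so no free space is needed on this machine
sudo dd if=/dev/vg0/mylv bs=4M | gzip | ssh user@backuphost "cat > /backups/mylv.img.gz"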
I am running Ubuntu Server 8.04. Unfortunately, a couple of days ago I thought I should upgrade, as desktop upgrades usually go without a hitch and are very easy. I forgot that my server is live with a few websites and a RADIUS server set up just the way I need (it took a painfully long time to figure out). Needless to say, the upgrade caused many config file changes and many things stopped working. I panicked, since this is a live server, so I went straight to the backups to recover my system. I booted from a live CD and copied the entire old system over top of the new one.
Everything that needed to work works; however, I now get this message in my mail about every 10-20 minutes:
Code:
Subject: Cron <root@IMwebserver> [ -x /usr/lib/php5/maxlifetime ] && [ -d /var/lib/php5 ] && find /var/lib/php5/ -type f -cmin +$(/usr/lib/php5/maxlifetime) -print0 | xargs -r -0 rm
Content-Type: text/plain; charset=ANSI_X3.4-1968
X-Cron-Env: <SHELL=/bin/sh>
X-Cron-Env: <HOME=/root>
X-Cron-Env: <PATH=/usr/bin:/bin>
X-Cron-Env: <LOGNAME=root>

xargs: xargs.c:443: main: Assertion `bc_ctl.arg_max <= (131072-204' failed.
Aborted
Googling it, I found that it's a problem with findutils. I tried to reinstall findutils with no luck. My backup script looks like this:
Code:
@daily /usr/bin/rdiff-backup --exclude /dev --exclude /tmp --exclude /var/run/cups/cups.sock --exclude /var/log --exclude /mnt --exclude /media --exclude /proc --exclude /sys --exclude /var/cache/apt / /media/removable/BACKUP/rdiff/
How can I fix my system so that the above e-mail no longer occurs?
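That assertion usually means the xargs binary no longer matches the libraries of the running release, which is plausible after copying a pre-upgrade backup over an upgraded system. A few hedged checks before anything drastic:
Code:
# which xargs is actually running, and which package owns it?
xargs --version
dpkg -S "$(which xargs)"
# pull a clean copy of findutils from the current release
sudo apt-get install --reinstall findutils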
I'm looking for a way to automatically back up a few machines to my server. Does anyone know a good guide for setting this up? I want it to pull the files from the machines at a certain time every week.
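In case a guide doesn't turn up, the minimal version is just rsync over SSH driven by cron on the server, pulling from each client; a sketch with placeholder host names and paths, assuming key-based (passwordless) SSH logins are already set up:
Code:
# /etc/cron.d/pull-backups -- every Sunday at 03:00, pull /home from
# each machine into a per-host directory on this server
0 3 * * 0 backup rsync -av --delete client1:/home/ /srv/backups/client1/
0 3 * * 0 backup rsync -av --delete client2:/home/ /srv/backups/client2/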
I want to run a Postfix server as a backup MX, but does anybody know how I can collect the first server's mail with this one? This is a multi-POP setup, but how can I do it with Dovecot or any other POP collector?
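For the backup-MX half, Postfix mainly needs to be told which domains to queue and relay; a minimal main.cf sketch for the secondary, with example.com as a placeholder:
Code:
# /etc/postfix/main.cf on the backup MX
relay_domains = example.com
# hold queued mail up to a week while the primary is down
maximal_queue_lifetime = 7d
With a lower-priority MX record pointing at this box, mail queues here while the primary is unreachable and is delivered automatically once it returns; if you instead want to pull mail down over POP, fetchmail is the usual collector.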
I have a Linux host acting as an iSCSI server for a Windows box. I want to keep an off-site backup, so I figure rsync will keep the iSCSI server synced with an off-site Linux host. I understand that rsync does block-level incremental transfers to conserve bandwidth; OK, awesome. The trick is that I also want an archival copy kept. Say I want to go back to a revision of a file from 10 days ago; I need to be able to do that.
I was planning on using Backup Exec, since we currently have a licensed copy: throw the archives from Backup Exec onto the iSCSI server as well, and have it keep a rotating 30-day backup, or something like that. The issue I see here is that this will be creating and deleting files as it does its daily backup rotation. I'm guessing rsync will see these as new files and likely retransmit everything on a daily basis. The question then becomes: is this assumption correct, or will rsync still know to do a block-level incremental transfer even when file names and such are changing?
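On the assumption: rsync's delta transfer matches files by path and name, so a rotation scheme that writes new file names each day defeats it, and those files are re-sent whole. The --fuzzy option softens this by letting rsync pick a similarly named file in the same destination directory as the delta basis; a sketch with placeholder paths:
Code:
# -y/--fuzzy: if backup-tue.bkf is new on the receiver, rsync may use
# backup-mon.bkf as the basis and transfer only the differing blocks
rsync -av --fuzzy --partial /backups/ user@offsite:/backups/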
I have two shares in total, and there are also two external hard drives. The server is used by two different organisations that are not supposed to have access to each other's data (at least not as normal users). The script I need should run in the background on the server, and when a drive is plugged in it should check which organisation the drive belongs to and, depending on the owner, back up the respective share. When the drive belongs to neither, it should just do nothing. Unfortunately, I have no clue about scripting, which makes writing a script like that, at least for me, impossible. So I wanted to know if somebody could name some good websites for learning to write such a script, or give tips.
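As a starting point, one hedged sketch is a udev rule keyed on each drive's filesystem UUID; the UUIDs and script names below are invented placeholders, and the real values come from sudo blkid:
Code:
# /etc/udev/rules.d/99-org-backup.rules
ACTION=="add", ENV{ID_FS_UUID}=="1111-AAAA", RUN+="/usr/local/bin/backup-org1.sh"
ACTION=="add", ENV{ID_FS_UUID}=="2222-BBBB", RUN+="/usr/local/bin/backup-org2.sh"
Each script would mount the drive, rsync the matching share onto it, and unmount; a drive with any other UUID matches neither rule, so nothing happens. Note that udev kills long-running RUN processes, so the script should hand the actual copy off to the background (e.g. via `at now`).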
I support a small business which has an Ubuntu server running as a file server. The server is running Ubuntu 10.04. There is one hard drive, which is mounted as /media/hdd. Each night this is backed up to an external USB hard drive mounted as /media/backup. The backup is carried out using the command:
Code:
rsync -av /media/hdd/ /media/backup/
Is there a way to encrypt this backup so that if the USB hard drive is plugged into another machine it cannot be read?
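One way (a sketch, assuming the USB drive is /dev/sdc1 -- check with sudo fdisk -l -- and that wiping its current contents is acceptable) is to put LUKS encryption underneath the backup filesystem:
Code:
# one-time setup (WARNING: destroys the drive's existing data)
sudo cryptsetup luksFormat /dev/sdc1
sudo cryptsetup luksOpen /dev/sdc1 backup
sudo mkfs.ext3 /dev/mapper/backup
sudo cryptsetup luksClose backup
# each nightly run: unlock, mount, back up, unmount, lock
sudo cryptsetup luksOpen /dev/sdc1 backup
sudo mount /dev/mapper/backup /media/backup
rsync -av /media/hdd/ /media/backup/
sudo umount /media/backup
sudo cryptsetup luksClose backup
Without the passphrase (or a key file), the drive is just random-looking data on any other machine.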
I want to make a daily backup of my websites from an Ubuntu server over FTP to another server I own. The backup schedule and process work; the problem is restoring the backup. WinRAR says "The file is corrupt", and 7-Zip crashes.
The backup archive looks OK (the same size as the original folder), and you can also extract it by having WinRAR ignore the error. But the extracted folder only contains one or two subfolders and one file (usually an image), and that's all.
If I try to restore from Webmin, it doesn't report any error and it looks like the restore worked, but the restored files are nowhere to be found.
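A right-sized but corrupt archive after an FTP hop is the classic symptom of an ASCII-mode transfer mangling binary data, so that is worth ruling out first; the host, user, and file names below are placeholders:
Code:
# compare checksums on both ends; a mismatch means the transfer,
# not the backup itself, is at fault
md5sum site-backup.tar.gz        # on the source server
md5sum site-backup.tar.gz        # on the destination server
# in a scripted ftp session, force binary mode before uploading
ftp -n backup.example.com <<'EOF'
user backupuser PASSWORD
binary
put site-backup.tar.gz
quit
EOF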
I have a personal Ubuntu server that provides Apache, GlassFish, firewall, routing, email, CVS, MySQL, etc. This server has been running for a while with two hard drives configured as a RAID 1 array. The array has two partitions, one for swap and one for the data. I currently back up the data with a removable hard drive: I use dd to create an image of one drive and of the MBRs (partition tables) of each drive. In a disaster situation I can use this data to recreate one drive and then re-mirror it to the second, or just boot the backup. I like this solution because I can easily recover from bare metal, and the backup is transparent; I can browse it if needed, since it's an uncompressed image of the drive. The one drawback is that I need to reboot the system with a Linux CD to do the backup.
My hard drive space is almost at capacity, so what I want to do is add a third drive to the array and migrate it to RAID 5. However, this will cause my current backup method to no longer work. How can I back up this RAID 5 array? I need to back up the entire system, not just the data; I have made many tweaks to the system over the years it has been running that I can't lose if a restore is needed.

I have seen a large thread here where people have been using tar. My concern with tar is: how do you use a tar archive to restore a system to a new array? I'm assuming you would need to set up the array and then just restore the archive? Also, I don't have much faith in using tar on a running system; doesn't this open you up to corrupted backups?

My second idea is using rsync. While I consider myself experienced in Linux from 10 years of personal and professional use, I have not had much experience with this utility. Would rsync provide a more reliable way to back up a running system that would enable a bare-metal restore later? I once read something about people using rsync with hard links to create a backup that could store many incremental backups. My main concern with both rsync and tar is not being able to restore the OS to the state it was in at the time of the backup.
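On the rsync idea: the hard-link scheme mentioned is usually done with --link-dest, where each snapshot is a full, browsable tree but unchanged files are hard links into the previous snapshot, so each run only costs the changed data; a sketch with placeholder dates and paths:
Code:
# yesterday's snapshot is the basis; unchanged files become hard links
rsync -aHx --delete \
    --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/mnt \
    --link-dest=/backup/2011-01-01 \
    / /backup/2011-01-02/
Restoring to a new array would then be: create and format the RAID 5 array, copy a snapshot onto it, and chroot in to reinstall GRUB and update /etc/fstab and mdadm.conf. Like tar, rsync on a live system is not atomic, so anything transactional (databases, mail spools) should be dumped separately first.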
I'm going to make a nightly backup copy from one server to another using rsync. If I have a sufficiently large file, say 4+ GB or so, I'm not interested in copying the whole file if only a small change has been made. Can rsync detect small changes at the block level and back up only those when needed?
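Short answer: yes, that is exactly rsync's delta-transfer algorithm; only changed blocks cross the network. By default, though, the receiver still rebuilds the whole file into a temporary copy; --inplace skips that rebuild, which matters for multi-GB files (paths and host are placeholders):
Code:
# only the changed blocks of big.img are sent; --inplace updates the
# destination file directly instead of writing a full temporary copy
rsync -av --inplace /srv/big.img backuphost:/srv/
The trade-off is that --inplace leaves the destination file inconsistent if a transfer is interrupted mid-way.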
I currently have an Ubuntu 10.04 server with 10 2TB hard drives (hot-swappable). I discovered that a software RAID over 16TB is not supported, so I split the drives into 2 sections and have 2 software RAID arrays storing my movies, audio, pictures, and other software. The total current usage is around 7TB. Since backing the files up to DVDs or even Blu-ray is laughable, I am going to back the system up to 2TB hard drives, probably 4 of them. The problem is that I can only hook one backup drive at a time into the system, using a hot-swap tray. Now, I know I can do this manually by copying the files to the drive until it is full, switching the drive out, and repeating, but I am hoping for an automated solution: start the backup, plug the first drive in, the system fills up the drive, swap and repeat. It would also be nice if the system remembered what had already been backed up, so that when I add files to the system I only need to attach the last drive and not start the process over.
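One hedged option is GNU tar's multi-volume mode combined with its incremental mode: -M pauses and asks for the next volume when the size limit is reached (swap the hot-swap drive, then continue), and --listed-incremental records in a snapshot file what has already been archived, so later runs only add new files. A sketch; the paths are placeholders, and older tar versions want --tape-length as a plain number in units of 1024 bytes rather than with a G suffix:
Code:
# fills the archive up to ~1.9TB, then prompts for a volume change;
# the .snar file remembers what has already been backed up
sudo tar -cpM --tape-length=1900G \
    --listed-incremental=/root/media-backup.snar \
    -f /mnt/swap/backup.tar /srv/movies /srv/audio /srv/pictures
At the volume prompt you would unmount, swap in the next drive, remount, and give tar the next volume's file name (the `n` command at the prompt).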