There's so much software for backups that I don't know where to start, so I figured it'd be easier to ask other Ubuntu users. What would be best based on these particulars?
1. An entire "tree" hierarchy of dirs and files to be backed up every so often on external HDDs (manually as I see fit).
2. As one HDD fills up, I can grab a new one to use as my backup medium, and the software will simply re-build the file system (tree) as necessary, only copying new/updated files and creating the directories to store them.
3. Backup software tracks which already-backed-up files were copied where, in some sort of spreadsheet/db/log, so that I can search it and easily see which external HDD I need to grab.
This script simply deletes files older than a certain age (in this case 7 days) from a certain location; I use it to purge old backups nightly, and it works as expected:
# delete backups older than 7 days
find /mnt/backup/* -mtime +7 -exec rm -Rf {} \;
The problem is, every morning I get an email with an error message something like this:
find: `/mnt/backup/subfolder': No such file or directory
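A minimal sketch of one way to avoid the error, assuming it appears because find tries to descend into a directory that the -exec rm has already deleted; restricting find to the top-level entries sidesteps that:
Code:
# only examine the entries directly under /mnt/backup, so find never
# descends into a directory it has just removed
find /mnt/backup -mindepth 1 -maxdepth 1 -mtime +7 -exec rm -Rf {} \;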
I have installed an application manager (a monitoring application) on my Linux server. Now I need a backup schedule for my application. The application itself has an executable file to back up its database, but when I put this file in my crontab to schedule the backup program it won't run:
50 09 * * * root /opt/ME/AppManager9/bin/BackupMysqlDB.sh
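A minimal debugging sketch, assuming the entry lives in /etc/crontab (the extra "root" field suggests that) and that the script is failing silently; the log path is just an example:
Code:
# capture anything the script prints so the failure becomes visible
50 09 * * * root /opt/ME/AppManager9/bin/BackupMysqlDB.sh >> /var/log/appmanager-backup.log 2>&1
# the script must also be executable:
#   chmod +x /opt/ME/AppManager9/bin/BackupMysqlDB.sh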
I want to back up an entire Linux system on a 3TB external Western Digital USB3 drive.
I do not want to reformat it from what it is, apparently NTFS.
Is there a utility that can act like a file manager (like mc) and let me create an ever-expanding (up to 320GB) tar file that retains all the original file permissions? I have had nothing but disappointment with Linux backup utilities on a FAT32 external drive, and I am concerned that if I just try to tar the entire drive at once, with around 3 million files, I might run out of memory.
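A minimal sketch of one approach, assuming GNU tar; tar streams files out as it reads them rather than holding them all in memory, and --listed-incremental lets later runs store only new or changed files (the mount point and paths below are placeholders):
Code:
# first run: full archive; ownership and permissions are stored in the archive by default
tar -cvf /media/wdusb/backup.tar --listed-incremental=/media/wdusb/backup.snar /home /etc
# later runs: a new archive containing only files changed since the snapshot file was written
tar -cvf /media/wdusb/backup-incr1.tar --listed-incremental=/media/wdusb/backup.snar /home /etc
# restore with -p to put the stored permissions back:
#   tar -xpvf /media/wdusb/backup.tar -C /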
Basically what I want to do is copy my whole file system to a different hard drive, then reconfigure my partitions and copy it back, then reconfigure GRUB.
The reason I want to do this is that when I set up the dual boot I gave it only 70GB of space and now I want to add 300 more, and since the 300GB of space is a primary partition and this one is a secondary partition, I can't extend or combine them.
So what I want to do is: sudo cp -rP / /home/me/sshfs-folder
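A minimal sketch of the same copy done with rsync instead of cp, assuming the sshfs folder from the question is the destination (ownership may not survive on an sshfs mount, but the same excludes apply to any destination):
Code:
# -a preserves permissions, ownership, timestamps and symlinks; -H keeps hard links
sudo rsync -aHv \
    --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/run \
    --exclude=/tmp --exclude=/mnt --exclude=/media --exclude=/lost+found \
    / /home/me/sshfs-folder/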
I'm using Ubuntu for a few weeks now and I created a backup script that copies some folders into a .tgz file. Now I want to put the folders back where they came from and overwrite the originals, e.g. the /home folder in the .tgz file should overwrite the /home folder on my hard drive. I already tried to do this with: tar xvpfz filename.tgz. But after that, the folders ended up in the same folder as the backup file.
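A minimal sketch, assuming the archive stores its paths relative to / (e.g. home/...); -C tells tar where to extract, so the folders land back in their original locations instead of the current directory:
Code:
# extract at the root of the filesystem, keeping permissions (-p)
sudo tar xvpzf filename.tgz -C /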
I have been looking around online and I see that there are several solutions for doing a nightly automated backup on Linux. I was wondering what people here actually use for this, and why they prefer one particular backup method over another.
What I am looking to do is: every night (at, say, 3am) I want my system to back up my 200GB Documents folder to my external hard drive. Does Ubuntu have a tool built in by default to do this, or do I need to add something from the repos/online?
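A minimal sketch of the cron-plus-rsync route, assuming the external drive is mounted at /media/external at backup time (both paths are placeholders); it would go in the user's crontab via crontab -e:
Code:
# run at 03:00 every night; after the first run only changed files are copied
0 3 * * * rsync -a --delete /home/yourname/Documents/ /media/external/Documents-backup/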
I just switched back to Ubuntu after running Windows for about six months (new laptop, programs needed in Windows; either way I'm back). What I had set up in Windows was that specific files would automatically back up to my Samba file server when the network was detected. I'm looking to do the same in Ubuntu now. Basically I'm thinking of writing a script to back up the files; the only thing I'm stuck on is how to tell the script to run when I connect to the network at home. Is there software already designed for this?
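A minimal sketch of one way to trigger it, using a NetworkManager dispatcher script, which NetworkManager runs whenever an interface goes up or down; the ping target, mount point and paths are assumptions to adapt:
Code:
#!/bin/sh
# /etc/NetworkManager/dispatcher.d/50-home-backup  (owned by root, executable)
# NetworkManager calls this with: $1 = interface, $2 = action ("up", "down", ...)
if [ "$2" = "up" ]; then
    # crude test that this is the home network: can we reach the Samba server?
    if ping -c 1 -W 2 192.168.1.10 >/dev/null 2>&1; then
        rsync -a /home/yourname/Documents/ /mnt/samba-backup/Documents/
    fi
fi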
I was doing a big backup this morning (in Ubuntu 10.10), which was taking some time, when there was an urgent task I needed to see to, so I rashly stopped the backup with CTRL-C. Later I found that this left the recipient directory with an input/output problem, so I could not delete the truncated backup file (or any other file in the directory). I tried all sorts of chmod procedures etc., including the usually very successful rescue CD from a USB pen, but nothing seemed to cure the input/output problem.
In the end, in desperation, I copied all the other files into a new directory and then deleted the original directory from Windows 7. (I use a multiple-boot system.) Windows obviously does not observe the same permissions as Linux, and it obeyed without demur! So all is now well.
Through the Black Friday shuffle of getting new hardware, I now have a 500GB external drive, a 1TB external drive, and an old computer I want to set up as a home server. My family has a lot of photos that are currently stored on many different computers and are not backed up. I want 500GB of space for photos, and for those photos to be backed up. That would leave the other half of the 1TB drive for assorted things like personal backups and general file storage. I know well enough how to set up Ubuntu Server Edition on the computer, but the options for how I can set up the storage are stumping me.
To recap, I have 1.5TB of storage total, split 1TB/500GB. I want 500GB to be used as central storage for the 10+ computers in my house (mostly running Windows), and that 500GB would be automatically backed up. The 500GB that's left would be used for non-critical files and wouldn't be backed up.
What is the best way of backing up the files? (script once a day that copies files? Some backup program?)
Would the 500GB drive be best for backing up to (with the 1TB drive being where people put the pictures), or the other way around? Does it really matter?
Any tips on the cleanest way to have this work with Windows, Linux, and Mac? How well do photo programs (Picasa, Shotwell, iPhoto) like a setup like this? Is it possible to have different programs on different machines all reference the same file system without their automatic sorting (to folders, usually by date) messing each other up?
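For the cross-platform side, a minimal sketch of a Samba share on the Ubuntu server that Windows, Linux and Mac clients can all mount; the path and group name are assumptions:
Code:
# /etc/samba/smb.conf (excerpt)
[photos]
    path = /srv/storage/photos
    browseable = yes
    read only = no
    valid users = @family
A nightly rsync (or a tool such as rsnapshot) from that directory to the other drive would then cover the "automatically backed up" half.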
Updated from Ubuntu 10.10 to 11.04. It boots only to GRUB. The 11.04 Live CD results in very distorted video, so the option of reinstalling GRUB 2 is not available. Since the 11.04 Live CD has such bad video, I want to go back to 10.10. The notebook has one drive and no Windows install. A Live CD boot from 10.04 works with clean video, but when I try to copy the filesystem /home/username it won't let me because I don't have permission. Properties says I am not the owner, so I can't change the permissions. Can someone tell me how I can get the permission required to copy files in the filesystem when booting from a Live CD? The sudo command in the Live CD's terminal appears to have no effect on the hard drive filesystem files I'm trying to back up.
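A minimal sketch, assuming the installed system is on /dev/sda1 and an external drive is mounted at /media/backup (adjust both); running the copy itself under sudo is what lets you read files the Live CD user doesn't own:
Code:
sudo mkdir -p /mnt/system
sudo mount /dev/sda1 /mnt/system
# copy as root so the permissions on the installed system don't block the read
sudo rsync -a /mnt/system/home/username/ /media/backup/username/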
System76 laptop, 10.04, 320GB HDD, VMware with Win7 in one VM; I want to use Clonezilla, as I am already using it to make a bare-metal backup image of another, older, smaller dual-boot Ubuntu/XP machine. This System76 laptop is a work machine that I control; the Win7 VM only does a couple of things, but they're necessary for work and I don't want to lose the configuration. The reason for the bare-metal backup is so that, if I have to, I can restore and get back to work - something I've had to do on some previous occasions back when I used Windows. Data is no problem - I back that up separately on an hourly basis.
My question is what FS to use on the backup drive; for instance, for the dual-boot XP/Linux work machine I'm currently backing up, I'm using a 30GB external HDD formatted in FAT32. That's OK because 30GB is below the limit for FAT32. But for the newer laptop I'll need a much bigger backup partition. I chose FAT32 for the old one because I know everything on the computer being backed up, Windows & Linux both, is compatible with it. But what FS should I use to back up the new laptop, considering that I'll be backing up the Win7 VM as well as the main Linux part of the machine? I plan to use a backup partition of about 160GB. Could I format it NTFS and have it work with Ubuntu 10.04? Or, conversely, if I format it EXT, will it back up the Win7 VM OK?
I would like to make a backup of my Fedora 12 system so that, in case I have any problem with it, I can restore all my programs and OS settings. I used to do this with Norton Ghost in Windows, but in Linux I'm not sure how. Yesterday I made a backup and in the end it was 34GB in size; I want to back up only what is actually used. How do I do this?
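A minimal sketch of a file-level backup (which stores only the files that exist, unlike a whole-disk image); the destination and exclude list are assumptions to adjust:
Code:
# archive only real files; pseudo-filesystems and the backup target itself are skipped
sudo tar -czf /media/external/fedora-backup.tar.gz \
    --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/tmp \
    --exclude=/mnt --exclude=/media /
# restore later with:  sudo tar -xpzf /media/external/fedora-backup.tar.gz -C /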
I have this little problem. I want to back up a big file that is constantly growing. Is there any way to make one backup and then take incremental backups of it afterwards?
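A minimal sketch with rsync, assuming the file only ever grows at the end (like a log); --append-verify sends just the newly added tail on each run. If the file also changes in the middle, something like rdiff-backup, which stores binary deltas, would fit better:
Code:
# first run copies the whole file; later runs transfer only the data appended since last time
rsync -av --append-verify /var/data/bigfile /media/backup/bigfile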
I am new to Linux. I am using tar to back up my emails to a server. I would like to automate this process to routinely back up my emails periodically; however, I keep running into a problem. I start in the directory where I would like to create the tar file (directory size = 240MB) and enter the following command:
tar cf bup.mail.llc.tar "/Users/d/Library/Mail/INBOX.mbox/Messages"
(resulting file size = 234MB)
When I would like to back up my emails into the previously created tar file, I use the following command:
tar uf bup.mail.llc.tar "/Users/d/Library/Mail/INBOX.mbox/Messages"
(file size = 462MB)
The backup command works, except that the size of the original tar file grows to around twice the size. When I extract the updated tar file (462MB), the unarchived directory is 240MB, the same size as the original directory.
Why does the size of the tar file keep growing each time I perform 'tar uf'? I don't understand this.
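A small sketch to see what is happening, assuming GNU tar: -u appends a fresh copy of every file whose timestamp has changed and never removes the older copies, so the archive grows while extraction keeps only the last copy of each file (which is why the unpacked size stays at 240MB):
Code:
# list just the member names; any name printed here is stored more than once in the archive
tar -tf bup.mail.llc.tar | sort | uniq -d | head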
I've been using 11.04 with Unity and quite like it. I fired up Blender 2.49b the other day (I hadn't used it for a long time) and its behaviour was very erratic. Because of this I decided to reinstall 10.10 for the time being, until October or maybe even the 12.04 LTS. I backed everything up and reinstalled 10.10. I then tried to restore my Evolution data from the Natty backup file, which simply didn't work. The message was something about it not being a valid file. I'm assuming this is a backwards-compatibility issue.
Is there any way of getting Blender to work or restoring Evolution? With regard to Blender (in Natty): I've not tried proprietary drivers for my GPU yet, as the open-source defaults have always been fine, so that's an option. It means another reinstall (of Natty), but that really isn't such a big deal at this point. With regard to Evolution (in Maverick): I found a PPA but I'm unsure how to proceed once I've added it. Would I do an apt-get update and an apt-get install evolution?
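A minimal sketch of the PPA route, assuming the PPA you found provides a newer Evolution for Maverick; the PPA name below is only a placeholder for whichever one you added:
Code:
sudo add-apt-repository ppa:someuser/evolution   # placeholder - use the PPA you found
sudo apt-get update
sudo apt-get install evolution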
I have a couple of Lenny LAMP servers, and a backup server. (virtual testing environment)
1. What is the best way to perform a backup (system state as well as individual files)? Although the system state can also be captured through the hypervisor.
2. Between Windows computers, I access shared directories simply via \\hostname\sharedmap or \\host_IP\sharedmap. Between Windows and Linux I use Samba. But there must be a simple way to copy a couple of files between Linux hosts? (See the sketch after this list.)
3. I've searched a lot, and only found people with the same question without a good answer: is there a Linux equivalent of robocopy?
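A minimal sketch for points 2 and 3, assuming SSH is running on the backup server (host name and paths are placeholders); rsync is also the closest thing to robocopy, since it mirrors directory trees and copies only the differences:
Code:
# copy individual files between Linux hosts over SSH
scp /etc/apache2/sites-available/default backupserver:/srv/backups/lamp1/

# robocopy-style mirroring of a whole tree (--delete removes files that vanished from the source)
rsync -av --delete /var/www/ backupserver:/srv/backups/lamp1/www/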
OS: Fedora 12. I am a newbie in Linux. What I want to do is make a backup of my file system, because I am learning how to configure servers. If I do something wrong, I want to be able to restore the default settings for my files instead of installing a new OS.
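A minimal sketch for just the configuration side, assuming most of the server settings you will touch live under /etc; dated tarballs make it easy to put a file back after a bad change (the backup location and example file are assumptions):
Code:
# snapshot /etc with a date stamp before experimenting
sudo tar -czf /root/etc-backup-$(date +%F).tar.gz /etc
# restore a single file from the snapshot later, e.g.:
#   sudo tar -xzf /root/etc-backup-2010-05-01.tar.gz -C / etc/httpd/conf/httpd.conf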
I did a backup of the SSD on my eeePC using the following command from Linux Mint on a USB key: dd if=/dev/sda1 of=/media/disk/eeepc_save/SYSTEM/system.bck (/media/disk is an external USB disk).
I deleted the ext2 partition using GParted on the live USB key and created it again. I rebooted Linux Mint and restored the filesystem using the opposite command: dd if=/media/disk/eeepc_save/SYSTEM/system.bck of=/dev/sda1
I mounted /dev/sda1 and when I "ls" the root directory, I get several "NFS stale file handle" messages concerning directories (/dev and others). I tried "e2fsck -y" and had a bundle of corrections that resulted in the deletion of those directories. I don't use NFS. I did the same for the user filesystem and had no problem (it's an ext3 partition). The two filesystems are the ones that came with the original Xandros installed on my eeePC and were mounted with unionfs.
Is there a way/command to back up all data from a Red Hat Linux 4 server [including user profiles, data, group info, encrypted passwords], either to a Red Hat Linux 5.4 machine or as an image file or other manageable resource?
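A minimal sketch of the file-level route over SSH, assuming the accounts and passwords live in the usual files; the destination host and path are placeholders:
Code:
# home directories plus the account/group/password databases
rsync -av /home /etc/passwd /etc/group /etc/shadow /etc/gshadow \
    root@rhel54-host:/srv/rhel4-backup/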
I have recently upgraded to Bugzilla 3 and I wanted to restore my Bugzilla database from my backup, but when I attempt to tar -xvvzf file.tgz I get the error:
gzip: stdin: not in gzip format
tar: Child returned status 1
tar: Error exit delayed from previous errors
My script that creates the backup is:
#!/bin/sh
datestr=`date +%m-%d-%Y`
bakdirpart="bugzilla.backup.$datestr"
bakdir="$HOME/$bakdirpart"
mkdir "$bakdir"
(cd /etc; tar cvzf $bakdir/mysql.conf.tgz mysql)
(cd /etc; tar cvzf $bakdir/apache2.conf.tgz apache2)
(cd /usr/share; tar cvzf $bakdir/bzreport.share.tgz bzreport)
(cd /usr/share; tar cvzf $bakdir/bugzilla.share.tgz bugzilla)
(cd /var/lib; tar cvzf $bakdir/mysql.hotdb.tgz mysql)
(cd /var; tar cvzf $bakdir/www.tgz www)
(cd "$HOME"; tar cvf "${bakdir}.tar" "$bakdirpart")
Rsnapshot is a piece of software written in Perl for making backups of local and remote filesystems. The well-proven rsync is behind this utility. rsnapshot does not need root user intervention to restore a normal user's data, it does not take much space on your backup server, and it can easily be automated (scheduled) to make life easier - a set-it-up-once-and-forget-it configuration. Basically, it takes snapshots of a filesystem (or a part of one) at regular intervals such as hourly, daily, weekly and monthly.
This can be configured easily through a simple text-based configuration file, and the task above can be set up in a few easy steps in a few minutes. The two major tasks are configuring rsnapshot and OpenSSH automatic login. To make the backups run automatically, we need to automate the remote login in a secure way, which can be done with the OpenSSH tools. This scenario depicts backing up a desktop's data (assuming its IP address is 192.168.0.100) to a backup server. My desktop runs Ubuntu 10.04 and the backup server runs Debian Squeeze. [URL]
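A minimal sketch of the two pieces on the backup server, assuming it pulls from the desktop as root; the retention counts and paths are placeholders, and note that rsnapshot.conf separates fields with tabs, not spaces:
Code:
# passwordless SSH login from the backup server to the desktop
ssh-keygen -t rsa
ssh-copy-id root@192.168.0.100

# /etc/rsnapshot.conf (excerpt) - fields must be separated by tabs
snapshot_root   /backup/snapshots/
interval        daily   7
interval        weekly  4
backup  root@192.168.0.100:/home/      desktop/
backup  root@192.168.0.100:/etc/       desktop/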
I am currently using a script to back up my Ubuntu 10.04.1 system. The MySQL databases are backed up separately from the system/data.
My problem is with the MySQL incremental/binary log backups.
The problem is that the binary log file(s) are always named xxxx-bin.1.
Up to about a month ago the binary logs were named xxxx-bin.000001, xxxx-bin.000002, etc.
I did make some changes at about the time that this change in file naming occurred, but I cannot identify what setting, if any, I may have changed that is causing all of the binary log files to have the same name.
My back up script uses both mysqldump and mysqladmin flush-logs to create the binary logs.
All of the settings for mysqldump and mysqladmin are contained in the my.cnf file.
The relevant my.cnf file contents are as follows:
Code:
The statements in the backup script that do the backup are:
mysqladmin flush-logs
or
mysqldump | gzip > $DB_BACKUP_DIR/$ARCHIVE_FILE #Note: delete-master-logs in my.cnf
I'm setting up a backup & media server, which will be running Debian. I will set up a small HD or SSD/CF card for the OS, and an MD RAID for the data drives. The total size of the RAID will be either 3 or 4TB, depending. Now I need to figure out what filesystem to use on top of this RAID. My criteria are as follows:
1. Support for large files. I can't imagine anything larger than about 1.2TB, but the 4GB limit of, say, FAT32 just isn't enough.
2. Robust. I don't want it falling apart on me; nothing too unstable.
3. (and this is most important) Good undelete support. I got burned recently when a software glitch managed to rm -rf my EXT4 drive. All the file data is still there, but all the metadata is gone. I *DO NOT* want that happening with this. I want to be able to do an "rm -rf /", immediately unmount it, and then recover *all* of the deleted data. Obviously, when data is overwritten it's overwritten, but I don't want to lose all my metadata if an "rm -rf" happens. FAT32 is the model I'm looking at: you can usually recover deleted files if anything happens to the drive.
So, what are my options? EXT2 looks like a possibility. EXT4 is *clearly out*, unless there's some nice utility/mode that keeps a backup of all deleted metadata, etc.