Ubuntu :: Back In Time To Backup Home Directory To A Second Hdd That Is Mounted At /media/backup?
May 18, 2010
I am using Back In Time to back up my home directory to a second hdd that is mounted at /media/backup. The trouble is, I can do this using Back In Time (Root), but not using Back In Time without the root option. This is definitely a permissions issue - it can't write to the folder - but when I checked by right-clicking on the backup directory and looking at the Permissions tab, it said I was the owner.
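What sometimes helps is checking ownership from a terminal rather than the file manager, since an external drive mounted by root can still block a normal user even when the permissions tab looks fine. A minimal sketch, assuming the mount point /media/backup from the question and a placeholder user name youruser:
Code:
# show who really owns the mount point and with what permissions
ls -ld /media/backup
# check the mount options; an ntfs/fat drive mounted by root may ignore Unix permissions entirely
mount | grep /media/backup
# if root owns the directory on an ext3/ext4 drive, hand it back to the ordinary user
sudo chown -R youruser:youruser /media/backup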
I've tried to Google but not much luck. What I would like to do is have a number of folders on my desktop, and their contents, replicated/duplicated into another folder on the same PC in real time. So for example, if I were to change an OpenOffice document in a specific folder on my Desktop, it would be replicated/duplicated in real time. If I had three folders on my Desktop - A, B and C - they would also appear/be backed up (in real time) in a folder called /home/backup. Can this be done?
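One common way to approximate this is to watch the folders with inotify and mirror any change with rsync. A rough sketch, assuming the inotify-tools package is installed and reusing the folder names A, B, C and /home/backup from the question:
Code:
#!/bin/bash
# mirror the Desktop folders into /home/backup whenever something changes
SRC="$HOME/Desktop"
DST="/home/backup"
mkdir -p "$DST"
while inotifywait -r -e modify,create,delete,move "$SRC/A" "$SRC/B" "$SRC/C"; do
    rsync -a --delete "$SRC/A" "$SRC/B" "$SRC/C" "$DST/"
done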
How many of you guys use Back In Time as your backup utility? I tried using it, and it doesn't copy all of the folder contents to the backup drive in one pass. For example, it will copy 26 out of 80-ish gigs of data. To complete the backup, I have to hit the "Take a snapshot" button to do another pass and add more data to the snapshots. I have to do this a couple of times to get all the data. Does anyone else have this issue?
[UPDATE] It appears to copy all of the files at once, so long as you only select one backup location at a time. I was backing up an entire multimedia drive, my home directory, and my USB drive. When I had it set to only do the multimedia drive, it copied all of the files, whereas it wouldn't if I had set it up to back up all 3 locations at the same time. I guess the lesson here is to back up one location, then add another, take another snapshot, and repeat.
I need to back up my /home directory because I want to switch from Fedora to openSUSE, but I didn't put /home on a separate partition, so I need to back it up first. Problem is, I can't figure out how. I've tried tar and gzip through every Google hit I can possibly find, but not one has worked.
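For reference, one straightforward approach is a compressed tar archive written somewhere outside /home, such as an external drive. A minimal sketch, assuming a placeholder mount point /media/external for that drive:
Code:
# create the archive as root so every file is readable; p preserves permissions
sudo tar cvpzf /media/external/home-backup.tar.gz /home
# later, on the new install, restore it from the root of the filesystem
sudo tar xvpzf /media/external/home-backup.tar.gz -C /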
I am using Back In Time to back up files from home and from another mounted directory on my system (ntfs). The backups are occurring automatically and appear to be complete, but I cannot delete old backup snapshots in the backintime GUI. Also, with sudo nautilus, or as root in a terminal with rmdir, I cannot delete the snapshots. My drive is filling up, and rather than uninstalling Back In Time I would like to simply delete the unneeded snapshots. How can I delete these files? Is there an rsync file that I should configure to delete these? My expectation of backintime was that it would back up at the requested frequency and not create complete duplicate copies of the files, but use symbolic links to unchanged files. How can I verify if this is the case? Does the cron file control this?
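Back In Time normally stores unchanged files as hard links between snapshots (not symlinks), which you can verify by comparing inode numbers, and the snapshot directories are usually written read-only, which is why rmdir fails. A hedged sketch, with all of the paths below as placeholders for the real snapshot location:
Code:
# the same inode number for a file in two snapshots means it is a hard link, not a second copy
ls -li /path/to/snapshots/*/backup/home/youruser/some-file
# snapshots are typically stored without write permission; add it, then remove the snapshot
sudo chmod -R u+w /path/to/snapshots/SNAPSHOT_DIR
sudo rm -rf /path/to/snapshots/SNAPSHOT_DIR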
I would like to make a backup of my /home directory onto a NAS device, and have whatever software is used for the purpose update it (new and changed files) every night, or perhaps every time there is a period of inactivity. Any suggestions for a GUI package that will do this?
I do not want a complete backup each time, just the new or changed files. Also prefer software that backs up to a mirror of the original (i.e., uncompressed folders and files)
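Most of the GUI options (Back In Time, Grsync, luckyBackup) are front ends for rsync, which already does exactly this kind of incremental, uncompressed mirror. A minimal sketch of the underlying command, assuming the NAS share is mounted at a placeholder path /media/nas:
Code:
# copy only new and changed files, keeping an uncompressed mirror of /home
rsync -a --delete /home/ /media/nas/home-mirror/
# to run it every night at 02:00, add a line like this with crontab -e:
# 0 2 * * * rsync -a --delete /home/ /media/nas/home-mirror/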
I would like to have dump back up just my home directory, but I am having problems: the command I am using wants to back up everything and takes hours upon hours - it has been running for about 10 hours and only 21% is done. This is the command:
Code: dump -0u -f dp_hd /media/CENTON USB/ /
How can I get this to back up only my home directory?
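The trailing / tells dump to walk the entire root filesystem, which is why it takes so long; pointing it at /home limits the dump to the home directory, and the path containing a space needs quoting. A sketch, reusing the output file name dp_hd from the question:
Code:
# level-0 dump of /home only, written to the USB drive; quote the path because of the space
# note: -u (updating /etc/dumpdates) only applies when dumping a whole filesystem, so it is dropped here
sudo dump -0 -f "/media/CENTON USB/dp_hd" /home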
So I've finally given up on saving my Kubuntu install that won't boot. I've searched, and looked, but couldn't find a thing. My dilemma now is to make sure that I:
a) get all of the user data safely packed up onto my external USB drive. I believe it's all in the home directory. I'm not sure about getting hidden files though...
b) get the new install to go smoothly, and not mess up GRUB or the parallel XP install on the same hard drive.
c) get the user data back on the computer and recreate the user structure. Permissions were messed up already, so setting those up again is not an issue.
So, I've been poking around, and this is how I think things should go:
a) tar cvpjf backup.tar.bz2 /home to get my home directory backed up. Not exactly sure how to get from here to my external hd, but I'm sure I can figure it out (see the sketch after the questions below).
b) just run a live cd of kubuntu, delete the old partitions, and reinstall over them?
c) untar the archive back into my /home directory.
That's all I've been able to find so far. How do I set up the users? Will they show up as soon as I untar? Will the reinstall play nice with my Windows install? Will I get all the hidden files too? Is there anything I'm missing?
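To make step (a) concrete: tar on a directory picks up hidden dotfiles automatically, and the archive can be written straight onto the mounted USB drive. A sketch, assuming a placeholder mount point /media/usbdrive and user name yourname; recreating each account before restoring is what makes the home directories line up again:
Code:
# back up /home, hidden files included, directly onto the external drive
sudo tar cvpjf /media/usbdrive/home-backup.tar.bz2 /home
# after the reinstall, recreate each user first so the account exists, then restore from the filesystem root
sudo adduser yourname
sudo tar xvpjf /media/usbdrive/home-backup.tar.bz2 -C /
# if the new UID doesn't match the old one, fix ownership afterwards
sudo chown -R yourname:yourname /home/yourname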
I am in need of Linux help. I am at college and I need this backup/restore script to pass the final part of an assessment. I require a backup script that will not only back up but also restore files to the relevant directories. E.g. users are instructed to store all word-processor files in a directory named wp. So I need to create a backup directory, and 3 directories within that, and some files within the 3 directories, and then back them up or restore them. I know I should/have to do this myself, but I've been trying to find and understand info for the last few days and came up with zero.
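Not a finished answer to the assessment, but a minimal sketch of the usual shape of such a script; the directory names wp, db and letters and the locations are placeholders to adapt:
Code:
#!/bin/bash
# usage: ./backup.sh backup|restore
SRC="$HOME/work"        # contains wp, db, letters
DEST="$HOME/backupdir"
case "$1" in
    backup)
        mkdir -p "$DEST"
        tar czf "$DEST/work-$(date +%Y%m%d).tar.gz" -C "$SRC" wp db letters
        ;;
    restore)
        # unpack the newest archive back into the original directories
        latest=$(ls -t "$DEST"/work-*.tar.gz | head -n 1)
        tar xzf "$latest" -C "$SRC"
        ;;
    *)
        echo "usage: $0 backup|restore" >&2
        exit 1
        ;;
esac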
Recently I made a backup of my home directory in 10.10 before reinstalling 10.10 again. This time I chose to manually define the partitions (50GB root, 25GB swap, 325GB home). Now I wish to migrate the old home into the newly installed home, which is on a separate partition. I have found the following documentation URL... Still, as a beginner I am not quite sure about the necessary steps to perform. As the new home is located on a separate partition, is it possible to simply delete all directories there and copy all directories from the old home to the new home with rsync?
Do I have to install all the software that corresponds to the old home first and then migrate home, or first migrate home and then install the software such as Thunderbird, TeX Live 2010 etc.? I guess that migration should take place at a later stage; otherwise my old profile files from Firefox and Thunderbird will be overwritten by new ones?
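Installing the applications first and copying the old home over afterwards is the usual order, precisely so the freshly created default profiles get replaced by the old ones rather than the other way around. A sketch of the copy itself, assuming the old backup is reachable at a placeholder path /media/oldhome:
Code:
# -a preserves permissions, ownership and timestamps; trailing slashes copy the contents, not the directory itself
sudo rsync -av /media/oldhome/youruser/ /home/youruser/
# make sure everything ends up owned by the new account afterwards
sudo chown -R youruser:youruser /home/youruser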
I'm running a cron job every night to dump a MySQL database to an external hard drive. It works; however, when I check on it the following morning the external drive is no longer mounted and the XFS log file is corrupted. If I run
Code: xfs_repair -L /dev/sdf1
it works, but then I get this issue:
Code: XFS: Filesystem sdf1 has duplicate UUID - can't mount
I can reset the UUID, but it's tedious to have to do this every day.
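Two common workarounds for the duplicate-UUID error after an xfs_repair -L, offered as a hedged sketch (the device name /dev/sdf1 is from the question, the mount point /mnt/external is a placeholder): mount once with nouuid, or write a fresh UUID so the clash stops recurring.
Code:
# one-off mount that ignores the UUID clash
sudo mount -o nouuid /dev/sdf1 /mnt/external
# or generate a new random UUID for the filesystem (run while it is unmounted)
sudo xfs_admin -U generate /dev/sdf1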
I am using Ubuntu 10.04 x86_64. I log in to the machine using NFS. Because of a problem with mounting my home directory, I had to copy all the contents of my home directory (including all configuration files) from a recent snapshot onto itself. That is, I did something like,
Code: cp -r /home/user/user /home/user
All of my recent data and program configurations were in /home/user/user. So after the copy operation, I logged out and logged back in again to see that all my configuration and data was restored to what I wanted. But the problem is that now on my desktop I see hundreds of mounted volumes. These are coming from an hourly/weekly snapshot program. The tech support guys for my lab have suggested copying all relevant data to a backup and then deleting the home directory altogether. But I don't want to configure all programs all over again. I think I should be able to get rid of the problem by editing/deleting one or more desktop configuration files. I just don't know which ones. I tried looking around the gconf-editor but was overwhelmed at the amount of information on there.
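In GNOME 2, desktop icons for mounted volumes are drawn by Nautilus and controlled through gconf, so before deleting anything it may be worth simply hiding volume icons on the desktop. A hedged sketch - the key below is from the GNOME 2 era Nautilus and may differ on other versions:
Code:
# hide all mounted-volume icons on the desktop
gconftool-2 --set /apps/nautilus/desktop/volumes_visible --type bool false
# turn them back on later if wanted
gconftool-2 --set /apps/nautilus/desktop/volumes_visible --type bool true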
I'm writing a script to rsync some directories to external hdd for backup.
My external hdd gets automatically mounted to /media/backup1
My script then backs up predefined directories to /media/backup1.
I have added this script to cron to run once every day.
The problem is that in the case where the drive is not plugged in and the script runs, it backs up to my local hard drive, and since it is more than 70% full, it fills it up by duplicating that 70% onto itself.
I have taken the script further, to test whether /media/backup1 is mounted. If it is, the backup will run. If it is not, it will bail out.
I'm using the mountpoint program to test for mounts.
My script so far:
Code:
#!/bin/bash
# run the backup only when the external drive is really mounted
if mountpoint -q /media/backup1; then
    echo "filesystem mounted"
    # The backup function. Commented out for testing.
else
    exit 1    # bail out: /media/backup1 is not mounted
fi
I have a scheduled backup to run on our server at work, and since 7/12/09 it has been making 592K files instead of 10MB files. In mysql-admin (the GUI tool) I have a stored connection for the user 'backup'; the user has select and lock rights on the databases being backed up. I have a backup profile called 'backup_regular', and in the third tab along it is scheduled to back up at 2 in the morning every weekday. If I look at one of the small backup files generated I see the following:
Code:
-- MySQL Administrator dump 1.4
--
-- ------------------------------------------------------
-- Server version
It seems that MySQL can open and write to the file fine; it just can't dump the actual table data into it.
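A dump file that contains only the header usually points to the backup account no longer having the privileges the dump needs. A hedged check, assuming the account name 'backup' from the question and connecting as an administrative user:
Code:
# see what the backup account is currently allowed to do
mysql -u root -p -e "SHOW GRANTS FOR 'backup'@'localhost';"
# re-grant the rights a dump needs, if they are missing
mysql -u root -p -e "GRANT SELECT, LOCK TABLES ON *.* TO 'backup'@'localhost'; FLUSH PRIVILEGES;"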
Does anyone know of good backup software for Ubuntu 10.04 that will let me select which folders to back up, rather than doing a complete backup? My install and settings etc. can be replaced, but my photos and memories cannot!
After I spent some time discovering the Big Bang of the Universe and the Meaning of Life:
I somehow managed to create a script that makes a backup of files on the server, TARs it, then FTPs the archive to an FTP server at another location, and then emails the result.
It also measures the time needed to complete, deletes archives older than XX days (set in find -mtime +20), and makes an incremental backup every weekday and a FULL one on Sundays (which suits me because there is no heavy load then).
The files for TAR to include and exclude are listed in txt files, one name per line:
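For anyone after a similar setup, here is a hedged sketch of the core of such a script; the list file names, FTP host, credentials and email address are all placeholders, and the upload uses ncftpput (from the ncftp package) as one possible command-line FTP client:
Code:
#!/bin/bash
# full backup on Sunday, incremental (files changed in the last day) on weekdays
start=$(date +%s)
stamp=$(date +%Y%m%d)
archive="/tmp/backup-$stamp.tar.gz"

if [ "$(date +%u)" -eq 7 ]; then
    tar czf "$archive" --files-from=include.txt --exclude-from=exclude.txt
else
    tar czf "$archive" --files-from=include.txt --exclude-from=exclude.txt --newer-mtime="1 day ago"
fi

# upload the archive; host, user and password are placeholders
ncftpput -u backupuser -p secret ftp.example.com /backups "$archive"

# delete local archives older than 20 days
find /tmp -name 'backup-*.tar.gz' -mtime +20 -delete

echo "Backup finished in $(( $(date +%s) - start )) seconds" | mail -s "Backup report" admin@example.com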
I am trying to create a backup script that will back up a single folder for a class I am in. I was wondering if I could get some help. If possible, I would also like to know how to write a script that can encrypt that same file. I will be putting the backup in my /home/usr/Backup directory. I am not trying to back up my whole system, just a single folder. I am using Fedora 11.
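A hedged sketch of one way to do both steps with tar and GnuPG; the folder name classwork and the paths are placeholders, and gpg -c does simple passphrase (symmetric) encryption, prompting for the passphrase when run:
Code:
#!/bin/bash
# archive a single folder, then encrypt the archive with a passphrase
DEST="$HOME/Backup"
NAME="classwork-$(date +%Y%m%d).tar.gz"    # "classwork" is a placeholder folder name
mkdir -p "$DEST"
tar czf "$DEST/$NAME" -C "$HOME/Documents" classwork
gpg -c "$DEST/$NAME"    # writes $NAME.gpg next to the archive
# decrypt later with: gpg -o restored.tar.gz -d "$DEST/$NAME.gpg"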
I'm setting up a Backup & Media server, which will be running Debian. I will set up a small HD or SSD/CF card for the OS, and an MD RAID for the data drives. The total size of the RAID will be either 3 or 4TB, depending. Now, I need to figure out what filesystem to use on top of this RAID. My criteria are as follows:
1. Support for large files. I can't imagine anything larger than about 1.2TB, but the 4GB file-size limit of, say, FAT32 just isn't enough.
2. Robust. I don't want it falling apart on me; nothing too unstable.
3. (and this is most important): Good Undelete support. I got burned recently when a software glitch managed to rm -rf my EXT4 drive. All the file data is still there, but all the metadata is gone. I *DO NOT* want that happening with this. I want to be able to do a "RM / -RF", immediately unmount it, and then recover *all* of the deleted data. Obviously, when data is overwritten it's overwritten, but I don't want to lose all my metadata if a "RM -RF" happens. FAT-32 is the model I'm looking at: You can usually recover deleted files if anything happens to the drive.
So, what are my options? EXT2 looks like a possibility. EXT4 is *clearly out*, unless there's some nice utility/mode that keeps a backup of all deleted metadata etc.