General :: Write A Simple Backup Script To Backup A Single Folder?
Sep 15, 2009
I am trying to create a backup script that will back up a single folder for a class I am in, and I was wondering if I could get some help. If possible, I would also like to know how to write a script that can encrypt that same file. I will be putting the backup in my /home/usr/Backup directory. I am not trying to back up my whole system, just a single folder. I am using Fedora 11.
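A minimal sketch of such a script, assuming the folder to back up is /home/usr/documents (a placeholder) and that gpg is installed for the encryption step:
Code:
#!/bin/bash
# Archive a single folder, then encrypt the archive with GPG.
SRC=/home/usr/documents            # folder to back up (placeholder)
DEST=/home/usr/Backup
STAMP=$(date +%Y-%m-%d)

mkdir -p "$DEST"
tar -czf "$DEST/backup-$STAMP.tar.gz" "$SRC"

# Symmetric encryption prompts for a passphrase and writes backup-...tar.gz.gpg
gpg --symmetric --cipher-algo AES256 "$DEST/backup-$STAMP.tar.gz"
rm "$DEST/backup-$STAMP.tar.gz"    # keep only the encrypted copy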
After I spent some time discovering The BIG BANG of the Universe and The Meaning of Life:
I somehow managed to create a script that backs up files on the server, TARs them, FTPs the archive to another FTP server, and then emails the result.
It also measures the time needed to complete, deletes archives older than XX days (set with find -mtime +20), and makes an incremental backup every weekday and a FULL one on Sundays (which suits me because there is no heavy load then).
The files for tar to include and exclude are listed in txt files, one name per line.
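For anyone after something similar, here is a hedged sketch of the core logic described above; every path, hostname, and address is a placeholder, and curl/mail stand in for whatever FTP and mail clients the original script actually uses:
Code:
#!/bin/bash
# Full dump on Sundays, incremental (files changed in the last day) otherwise.
DEST=/var/backups
STAMP=$(date +%Y%m%d)
START=$(date +%s)

if [ "$(date +%u)" -eq 7 ]; then
    TYPE=full
    tar -czf "$DEST/backup-$TYPE-$STAMP.tar.gz" \
        --files-from=/etc/backup/include.txt \
        --exclude-from=/etc/backup/exclude.txt
else
    TYPE=incr
    tar -czf "$DEST/backup-$TYPE-$STAMP.tar.gz" \
        --files-from=/etc/backup/include.txt \
        --exclude-from=/etc/backup/exclude.txt \
        --newer-mtime="1 day ago"
fi

# Prune archives older than 20 days, upload, and mail the result.
find "$DEST" -name 'backup-*.tar.gz' -mtime +20 -delete
curl -T "$DEST/backup-$TYPE-$STAMP.tar.gz" ftp://user:pass@ftp.example.com/
echo "Backup $TYPE finished in $(( $(date +%s) - START ))s" \
    | mail -s "Backup report" admin@example.com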
I'm a beginner at backing up my Ubuntu system, but I've set Simple Backup to do a backup once a week. I deleted the oldest of these files, but now it's sitting in my Trash and I can't empty it. I get a permission denied error for the folders within the backup folder in the Trash, yet I can't restore the folder either: Ubuntu says it 'failed to determine the original path' for the folder. I've just discovered this in Xubuntu Jaunty, but I'm confident the same will happen in any other WM I choose (I have several installed; I like variety).
It's not a huge file, but it's hanging out there and I'd like to get it either deleted or restored. Possibly I oughtn't to have deleted it in the first place (it usually lives in /var/backup, which I can't access except as root). The files, which I probably deleted /as/ root, show up in my user trash rather than root's trash. I found the trash in ~/.local/share/Trash/files, but I'm not sure if just deleting them as root would be a good idea.
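Since the stuck files are owned by root (hence the permission-denied errors as a normal user), removing them from the user trash as root is generally safe; a cautious sketch, with the folder name as a placeholder:
Code:
# Inspect first, then remove the stuck entry and its matching trash record:
ls -l ~/.local/share/Trash/files
sudo rm -rf ~/.local/share/Trash/files/BACKUP_FOLDER_NAME
sudo rm -f ~/.local/share/Trash/info/BACKUP_FOLDER_NAME.trashinfo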
I've tried to Google, but without much luck. What I would like to do is have a number of folders on my Desktop, and their contents, replicated/duplicated into another folder on the same PC in real time. So, for example, if I were to change an OpenOffice document in a specific folder on my Desktop, it would be replicated/duplicated in real time. If I had three folders on my Desktop, A, B and C, they would also appear/be backed up (in real time) in a folder called /home/backup. Can this be done?
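It can, using the kernel's inotify facility; a minimal sketch with inotifywait from the inotify-tools package (folder names A, B and C are from the question, everything else is a placeholder):
Code:
#!/bin/bash
# Mirror Desktop folders A, B and C into /home/backup whenever anything changes.
SRC="$HOME/Desktop"
DEST=/home/backup

inotifywait -m -r -e modify,create,delete,move "$SRC/A" "$SRC/B" "$SRC/C" |
while read -r path event file; do
    rsync -a --delete "$SRC/A" "$SRC/B" "$SRC/C" "$DEST/"
done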
I am currently trying to figure out the best way to back up several PCs (about 5 computers, each with Windows 7) for my family.
As I want the same solution for every computer, I set up my old computer (Windows 7) and added some hard drives; there should now be enough space to back up everyone's data over the network. (Let's call this computer the "Server".)
But now I am wondering what's the best way to do this. What I do not want:
I do not want to start the Server manually each time a computer tries to back up (I thought about using Wake-on-LAN, but I do not know if this is a good idea). I do not want the Server to run permanently. And I do not want to make the backups manually; they should run automatically about every week.
So which software on the computers, or on the "Server", would you recommend?
Or would you even recommend using Linux on the Server? If so, which software would you use then?
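The Wake-on-LAN part, at least, is straightforward if the clients or server run Linux; a hedged sketch of a weekly job that wakes the Server, waits for it to boot, then pushes a backup (the MAC address and hostname are placeholders):
Code:
#!/bin/bash
# Wake the backup server, give it time to boot, then sync over the network.
wakeonlan AA:BB:CC:DD:EE:FF        # from the wakeonlan package
sleep 120
rsync -a /home/ server:/backups/$(hostname)/
ssh server 'sudo poweroff'         # optional: shut it down again afterwards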
I have a second HD in my computer for backups. All I want is to select a couple of my documents folders and have them copied to it daily. I would like it to add new files and update newly modified ones. I don't want it encrypted or archived; just a basic copy.
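rsync does exactly this; a minimal sketch assuming the second drive is mounted at /mnt/backup and the folder names are placeholders:
Code:
# -a preserves times/permissions, -u only overwrites when the source is newer.
rsync -au ~/Documents ~/Letters /mnt/backup/
# Put the same line in a daily cron job (crontab -e) to automate it:
# 30 1 * * * rsync -au /home/user/Documents /home/user/Letters /mnt/backup/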
I use Simple Backup to back up all my machines across my home network. I have just upgraded my test system to 11.04 with a clean install and want to restore all my stuff from my backup; however, looking in the Software Centre, Simple Backup no longer exists.
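It may still be installable from the command line; the package has historically been called sbackup, so this is worth a try, assuming it remains in the universe repository for that release:
Code:
sudo apt-get update
sudo apt-get install sbackup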
I am very sorry if this has been asked before... I'm sure it has, but I have searched all over the net looking for an answer and I still can't find it.
I have a really simple cron job script like this:
When I run this manually it works fine, but when I run it from my root user in Plesk as a cron task, it always creates a file that is just 45 bytes. Why doesn't it work? I am running it as the root user, so surely I must have permission to access the file?
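A 45-byte output file is usually an error message rather than a backup: cron runs with a minimal environment (notably a short PATH), so commands that work in an interactive shell can fail under cron. A hedged diagnostic version, with placeholder paths:
Code:
#!/bin/bash
# Use absolute paths for every command, and capture stderr so the real
# error lands somewhere readable instead of in the output file.
/bin/tar -czf /var/backups/site-$(date +%F).tar.gz /var/www/vhosts/example.com \
    2>> /var/log/backup-cron.err
Reading those 45 bytes with cat will often reveal the cause directly.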
I'm trying to set up a simple backup script with cron.
In "crontab -e" (and sudo crontab-e - I tried both) I enter "0 22 * * * /home/USERNAME/.backup.sh", with the hope that it will run the script at 10pm each day. The srcipt work fine if I run in a terminal. why it won't work? It's bound to be something obvious....
Ubuntu 10.10, Simple Backup. My backup stopped working on a certain date. I went back into Simple Backup's configuration, and it says there is no configuration. What might cause trouble like this? Should I look for a more reliable backup tool?
I am somewhat new to Linux and I am looking for a way to back up my HD with all my Linux files. I have a Toshiba laptop running Windows 7. The HD has been partitioned so that the computer can also run Red Hat Scientific Linux. Using GRUB I can dual boot to either Windows 7 or Linux on start up. I want a simple way of backing up the entire contents of my HD (both partitions - everything), so that in the event of my laptop being damaged I can reconstruct my setup and data as before, with all my files and settings in both Windows 7 and Linux intact. Is there a simple program that will enable me to copy everything to an external HD for backup? Can anyone recommend a package that will do this?
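The low-level route works for this; a hedged sketch assuming the internal disk is /dev/sda and the external drive is mounted at /mnt/external, run from a live CD so neither OS is in use:
Code:
# Raw image of the whole disk: both partitions, the partition table and GRUB.
sudo dd if=/dev/sda of=/mnt/external/laptop-full.img bs=4M
# Restoring is the same command with if= and of= swapped.
The image is as large as the disk itself, so the external drive needs at least that much free space; Clonezilla does the same job with compression and a menu interface.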
Here's what I want to do: Copy the whole Ubuntu 10.04 partition/installation from my old laptop to the new one.
What I tried: I used Simple Backup to back up my Ubuntu installation to a USB hard drive, which yields a 10.4 GB folder containing 7 files. I then installed 10.04 on the new laptop, used Synaptic to install Simple Backup, plugged in the USB drive, started Simple Backup Restore, selected the backup directory in Simple Backup Restore, and got the error:
Error: no backups found in the target directory.
I also tried copying the backup to the local drive, with the same result.
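One thing to check first: the restore tool may expect the directory that contains the dated snapshot folders rather than a snapshot folder itself. If that is already right, a hedged workaround is manual extraction, since Simple Backup snapshots are ordinary archives plus metadata (the archive is often named files.tgz, but check what is actually in yours, and the paths here are placeholders):
Code:
# Inspect the snapshot folder to confirm the archive name:
ls /media/usbdrive/SNAPSHOT_FOLDER
# Then extract it somewhere safe, preserving permissions:
cd /media/usbdrive/SNAPSHOT_FOLDER
sudo tar -xzpf files.tgz -C /tmp/restored-files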
I am using RHEL 5 running as a Samba PDC. Most of the users save their data in a common folder on the server. Now I want to back up this data to some other location for redundancy; it could be an external USB HDD or another folder on the same server. How do I create a backup script and automate it with cron?
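A minimal sketch, assuming the share lives at /srv/samba/common and a USB drive is mounted at /mnt/usb (both placeholders):
Code:
#!/bin/bash
# /usr/local/bin/samba-backup.sh - mirror the common share to the USB drive.
rsync -a --delete /srv/samba/common/ /mnt/usb/common-backup/

# Schedule it with: crontab -e  (as root), e.g. nightly at 1 am:
# 0 1 * * * /usr/local/bin/samba-backup.sh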
Can anyone help with this SVN backup script? When I run it, the incremental backup takes place every day, producing repository.000000-000001, repo.000000-000002, etc., each starting again from revision zero, which consumes more disk space. I want a script that picks up from the last dumped revision and does a differential backup (not a repetition), which will consume less space.
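svnadmin supports exactly this via revision ranges; a hedged sketch that remembers the last dumped revision in a state file (paths are placeholders):
Code:
#!/bin/bash
# Dump only the revisions added since the last run.
REPO=/var/svn/repo
STATE=/var/backups/svn/last_rev
YOUNGEST=$(svnlook youngest "$REPO")
LAST=$(cat "$STATE" 2>/dev/null || echo -1)

if [ "$YOUNGEST" -gt "$LAST" ]; then
    FROM=$((LAST + 1))
    svnadmin dump "$REPO" --incremental -r "$FROM":"$YOUNGEST" \
        > /var/backups/svn/repo.$FROM-$YOUNGEST.dump
    echo "$YOUNGEST" > "$STATE"
fi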
I'm using Back In Time to back up my home directory to a second HDD that is mounted at /media/backup. The trouble is, I can do this using Back In Time (root), but not using Back In Time without the root option. This is definitely a permissions issue: it can't write to the folder. But when I checked by right-clicking on the backup directory and looking at the Permissions tab, it said I was the owner.
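The file manager can show misleading ownership on mount points, so checking from a terminal removes the guesswork; a sketch (the backintime subfolder name is an assumption; use whatever snapshot path Back In Time shows):
Code:
# See who actually owns the mount point and the snapshot directories:
ls -ld /media/backup /media/backup/backintime
# If root owns them, hand the tree back to your user:
sudo chown -R $USER:$USER /media/backup/backintime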
I have a scheduled backup that runs on our server at work, and since 7/12/09 it has been making 592 KB files instead of 10 MB files. In mysql-admin (the GUI tool) I have a stored connection for the user 'backup'; that user has SELECT and LOCK TABLES rights on the databases being backed up. I have a backup profile called 'backup_regular', and in the third tab along it is scheduled to back up at 2 am every weekday. If I look at one of the small backup files generated, I see the following:
Code:
-- MySQL Administrator dump 1.4
--
-- ------------------------------------------------------
-- Server version
It seems that MySQL can open and write to the file fine; it just can't dump the data.
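One way to surface the actual error is to run an equivalent dump by hand as the same user, since errors the GUI swallows print on stderr; a sketch with a placeholder database name:
Code:
# A header-only output file usually means a privilege or connection error:
mysqldump -u backup -p --databases mydb > /tmp/test-dump.sql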
Does anyone know of good backup software for Ubuntu 10.04 that will let me select which folders to back up, rather than doing a complete backup? My install, settings etc. can be replaced, but my photos and memories cannot!
Does anyone know of any decent enterprise-level backup solutions for Linux? I need to back up a few servers and a bunch of desktops onto one backup server. Using rsync/tar.gz won't cut it. I need bi-monthly full HDD backups and the like, with a nice GUI interface to add/remove systems from the backup list. Basically I need something similar to CommVault or Veritas. Veritas I've used before, but it has its issues, such as leaving 30 GB cache files. CommVault, I have no idea how much it costs, or whether it supports backing up to a hard drive rather than tape.
This should be a quick one. I'm trying to back up a single directory and its subdirectories on my Lucid server to a FreeNAS box across my network. This is what I'm using to do that:
Code: rsync -r -a -v -z * --delete freenas:dSIBackups
It almost works perfectly, except for one problem: when a file is deleted at the source, this command doesn't seem to delete it on the receiving end. I assumed that --delete would do that, but apparently not. Can anyone think of a reason this would happen?
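The * glob is the likely cause: the shell expands it to whatever currently exists, so each top-level item becomes a separate transfer, rsync never sees the destination as a mirror of one source directory, and anything deleted at the top level (plus all dotfiles) is simply never mentioned. Syncing the directory itself fixes it; a sketch with a placeholder source path (-r is already implied by -a):
Code:
# Trailing slash means "the contents of"; now --delete can compare the
# whole tree on both ends and remove files that vanished at the source.
rsync -avz --delete /path/to/source/ freenas:dSIBackups/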
I have used the backup-manager tool very often, but recently I ran into some problems and went to its official site, backup-manager.org, to search for answers. But the site hasn't opened for more than a week now!
Right now the DNS records don't have an A record:
Code:
$ host backup-manager.org
backup-manager.org mail is handled by 10 private.sukria.net.
backup-manager.org mail is handled by 15 private.nxr.fr.
backup-manager.org mail is handled by 5 mx.sukria.net.
backup-manager.org mail is handled by 10 jupiter.unix-scripts.info.
Has this project moved, been renamed, or died? Maybe it changed its domain address? Or is this only a temporary problem with the hosting or the domain?
Incremental backup of a folder. The problem with e.g. find & tar is that I want to back up not only files with a modification time after x:y, but also older files that have been copied into this folder after the last backup.
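A copied file may keep its old mtime (with cp -p), but it always gets a fresh ctime, so filtering on ctime instead of mtime catches those late arrivals; a hedged sketch with placeholder paths:
Code:
#!/bin/bash
# The stamp file marks the last backup; -cnewer compares ctime, which changes
# even when a file is copied in with its old mtime preserved.
SRC=/path/to/folder
STAMP=/var/backups/.last-run
[ -f "$STAMP" ] || touch -d '@0' "$STAMP"   # first run: back up everything

find "$SRC" -type f -cnewer "$STAMP" -print0 |
    tar --null --files-from=- -czf /var/backups/incr-$(date +%F).tar.gz
touch "$STAMP"
GNU tar's --listed-incremental mode solves the same problem another way, by recording what was archived last time.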
I'm trying to create a very simple backup script to back up the contents of one folder on my system to an NTFS-formatted external HDD. I want to keep the ownerships and permissions of the files I'm backing up intact, so I'm putting them into a tar archive. Compression is not necessary, as I have plenty of space for the backup.
I created the initial backup with the following command:
Code: tar -cpf $bupath/backup.tar $sourcepath
This seemed to work quite well, with the resulting file being about 170 GB; it took about 5 hours. For subsequent backups, the files are probably only going to change by about 10% at most, so it seems inefficient to create a whole new backup from scratch. I would like to be able to just update my existing archive with any new/altered files.
I have tried using the update mode (-u) with tar, like so:
Code: tar -upf $bupath/backup.tar $sourcepath
So far this has been running for about 10 hours and the archive has grown to approx 220 GB! What's going wrong here? I was expecting the update to take 30 minutes at most, with no significant change in the archive size. Am I perhaps misinterpreting the purpose of update mode in tar, or is there something wrong with my command? Is there a better/easier way to accomplish this?
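Update mode is probably the misreading: tar -u appends new copies of changed files to the end of the archive and never removes the old ones, so the archive can only grow, and tar has to read through the whole 170 GB first, hence the hours. GNU tar's snapshot-based incremental mode fits this use case better; a sketch reusing the same variables:
Code:
# Full backup once; the .snar snapshot file records what was archived.
tar --listed-incremental=$bupath/backup.snar -cpf $bupath/full.tar $sourcepath

# Later runs write small, separate archives containing only new/changed files.
tar --listed-incremental=$bupath/backup.snar \
    -cpf $bupath/incr-$(date +%F).tar $sourcepath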
I have a really quick question. I would like to set up some type of scheduled event to back up my entire /home folder to a USB drive. I know about all of the various programs such as Simple Backup, etc., and have used them before; as far as I know, these programs can't do what I'm trying to do. Does anyone know what I could use to back up my files at a specific time to a specific USB device? Preferably, I would like to just have a simple
Code: sudo cp -a /home /media/Cruzer
run every night at, say, 2 am (cp needs -a, or at least -r, to copy a directory).
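Root's crontab can do exactly that; a minimal sketch assuming the drive is always mounted at /media/Cruzer:
Code:
# Edit root's crontab with: sudo crontab -e
# -u makes repeat runs copy only new or newer files:
0 2 * * * cp -au /home /media/Cruzer
# rsync is a drop-in alternative that also handles deletions:
# 0 2 * * * rsync -a --delete /home/ /media/Cruzer/home/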
Here is a brief hardware and software rundown of my production environment: AMD Phenom X4 3.4 GHz (overclocked to 4 GHz), 8 GB of memory, 1 TB 7200 rpm hard drive, running Ubuntu Server 10.10. My web production environment was pieced together 3 weeks ago. Here is my dilemma: I started out with fewer than 40 users and am now hitting 4,000 unique users per day. I am thinking I need faster writes to disk and backup of data, so I am considering putting together a RAID5 array.
In preparation for this, I have bought a new motherboard, an AMD Phenom X4 3.6, and 2 more 2 TB 7200 rpm drives (currently, I have a 2 TB 7200 rpm drive that is not used much). I've been digging around this forum for posts on RAID setup, but I am still not sure how to seamlessly move the roughly 10 GB of data from my currently running production environment once I have RAID5 installed on the new machine via the Ubuntu Server live CD.
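For the data move itself, a network copy from the running system is a low-risk option; a hedged sketch assuming SSH access between the two boxes, with placeholder paths and hostnames:
Code:
# First pass while the old server is still live:
rsync -aH --numeric-ids /srv/www/ newserver:/srv/www/
# Stop the web and database services, then run the same command again for a
# short final pass that catches anything that changed in between.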
I have been looking around online and I see that there are several solutions for doing a nightly automated backup on Linux. I was wondering what people here actually use for this, and why they prefer one particular backup method over another.
What I am looking to do is have my system back up my 200 GB Documents folder to my external hard drive every night (at, say, 3 am). Does Ubuntu have a tool built in by default to do this, or do I need to add something from the repos/online?
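cron is built in and rsync ships with Ubuntu, so no extra packages are needed; a one-line sketch assuming the external drive mounts at /media/external (the username is a placeholder):
Code:
# Added with crontab -e; after the first run only changed files are copied:
0 3 * * * rsync -a --delete /home/user/Documents/ /media/external/Documents/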
Before I reinstalled Ubuntu (this time allocating the entire disk to it, as I never really used Windows any more), I backed up the entire contents of my /home folder using Deja Dup. Now that I am done reinstalling Ubuntu, I am trying to restore the backup. However, when it actually begins restoring, it says "Restore failed: failed with an unknown error".
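Deja Dup is a front end to duplicity, and running duplicity directly often produces a readable error message instead of the generic one; a sketch with placeholder paths:
Code:
# Point at wherever Deja Dup stored the backup (a local folder shown here):
duplicity restore file:///media/backupdrive/deja-dup /tmp/restored-home
# Raise verbosity when it still fails:
# duplicity -v8 restore file:///media/backupdrive/deja-dup /tmp/restored-home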