Incremental backup of a folder. The problem with e.g. find & tar is that I want to back up not only files with a modification time after x:y, but also older files that have been copied into this folder after the last backup.
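One hedged way around that is to compare against ctime rather than mtime, since copying a file into the folder updates its ctime even when the original mtime is preserved. A minimal sketch, assuming a timestamp file that gets touched after every run (all paths are placeholders):
Code:
# Archive regular files whose ctime is newer than the last run's stamp file;
# create the stamp once (touch /var/backups/last-run.stamp) before the first run.
find /data/folder -type f -cnewer /var/backups/last-run.stamp -print0 |
    tar --null --files-from=- -czpf "/var/backups/inc-$(date +%Y%m%d).tar.gz"
touch /var/backups/last-run.stamp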
After I spent some time discovering The BIG BANG of the Universe and The Meaning of Life:
I somehow managed to create a script that backs up files on the server, TARs them, FTPs the archive to another FTP server, and then emails the result.
It also measures the time needed to complete, deletes archives older than XX days (set with find -mtime +20), and makes an incremental backup every weekday and a FULL backup on Sundays (which suits me because there is no heavy load then).
The files for TAR to include and exclude are listed in text files, one name per line:
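A hedged sketch of how such include/exclude list files are typically passed to tar, together with the pruning step mentioned above (file names and paths are placeholders, not the poster's actual lists):
Code:
# include.txt / exclude.txt: one path per line
tar -czpf "/backups/full-$(date +%Y%m%d).tar.gz" -T /etc/backup/include.txt -X /etc/backup/exclude.txt
# prune archives older than 20 days, matching the find -mtime +20 mentioned above
find /backups -name '*.tar.gz' -mtime +20 -delete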
When rsync has finished the update, or in the meantime, I need to move the updated files to a different location, e.g. one named with date +%Y%m%d or similar. The reason is that, because of development, I need the modified files, and all of them, not just the latest one, so I have to store them daily. But I don't want to store the whole directory, just the few files that were updated. Does that make sense?
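One hedged way to get a dated folder holding only the files that changed is rsync's --compare-dest option, checked against a full mirror that is refreshed afterwards (all paths here are placeholders):
Code:
TODAY=$(date +%Y%m%d)
# Copy into the dated folder only files that differ from the reference mirror.
rsync -a --compare-dest=/srv/mirror/ /src/project/ "/srv/changes/$TODAY/"
# Then refresh the mirror itself, so tomorrow's comparison is against today's state.
rsync -a --delete /src/project/ /srv/mirror/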
I am trying to find a backup program to incrementally back up some files to an external disk every week, for example. I would prefer not to have to write a script, as I am not really used to that.
I'm trying to take a backup of my data on RHEL, but I am not able to back everything up. Could anybody show me how to take incremental and full backups? What is the process?
A complete backup using tar consumes more time, so is there any way to take incremental backups using tar? I also want to take incremental dumps of my databases. Any suggestions and links would be very helpful; I keep googling for this but can't find anything exact.
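A hedged sketch of one common approach: GNU tar's snapshot-file incrementals for the files, plus mysqldump and the binary logs for the databases (all paths and credentials are placeholders):
Code:
# Weekly full: start a fresh snapshot file, then archive everything.
rm -f /backups/files.snar
tar -czpf "/backups/files-full-$(date +%Y%m%d).tar.gz" --listed-incremental=/backups/files.snar /data
# Daily runs against the same snapshot pick up only new or changed files.
tar -czpf "/backups/files-inc-$(date +%Y%m%d).tar.gz" --listed-incremental=/backups/files.snar /data

# Databases: weekly full dump, with the binary logs as the incremental part
# (binary logging must be enabled in my.cnf; credentials are placeholders).
mysqldump -u root -p --all-databases --single-transaction --flush-logs | gzip > "/backups/db-full-$(date +%Y%m%d).sql.gz"
mysqladmin -u root -p flush-logs   # then copy the closed binlog files each day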
I am trying to write a bash script so that the backup is done by cron on the web hosting server. I want all backup runs to have an incremental number in front of them. I came up with the idea of storing the incremental number of a backup in a txt file on the server, so that when the next run starts it can check the number in the file. However, I am having terrible issues. My script:
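A hedged sketch of the counter-in-a-file idea (not the poster's actual script; paths and names are placeholders):
Code:
#!/bin/bash
# Read the last run number, increment it, write it back, and prefix the archive.
COUNTER_FILE="$HOME/backups/counter.txt"
[ -f "$COUNTER_FILE" ] || echo 0 > "$COUNTER_FILE"
N=$(( $(cat "$COUNTER_FILE") + 1 ))
echo "$N" > "$COUNTER_FILE"
tar -czpf "$HOME/backups/${N}-site-$(date +%Y%m%d).tar.gz" "$HOME/public_html"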
I want to back up data and upload it to online hosting services. Since I'm uploading stuff online, I only want to upload encrypted data (so that even the hosting service admins cannot look at it). Thus, I first want to encrypt locally the data that I want to back up. Since I will be making changes to the data locally, I want some sort of incremental imaging system where the incremental changes are stored in separate files, so that I only have to upload the incremental encrypted changes.
Duplicity is an option, but it uses GPG, which makes it a bit complicated; I was wondering if there is a simpler alternative, as I am only doing the encryption and backup locally.
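One commonly suggested alternative, sketched here under the assumption that encfs is acceptable, is encfs's reverse mode, which exposes an encrypted view of the plain directory that any sync tool can then copy incrementally (paths are placeholders):
Code:
# plain-data holds the real files; encrypted-view is an on-the-fly encrypted
# view of it (the first run asks for a passphrase).
mkdir -p ~/plain-data ~/encrypted-view
encfs --reverse ~/plain-data ~/encrypted-view
# Only the encrypted files that changed get copied to the upload staging area.
rsync -a ~/encrypted-view/ ~/upload-staging/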
I am currently using a script to back up my Ubuntu 10.04.1 system. The MySQL databases are backed up separately from the system / data.
My problem is with the MySQL incremental / binary log backups.
The problem is that the binary log file(s) are always named xxxx-bin.1.
Up to about a month ago the binary logs were named xxxx-bin.000001, xxxx-bin.000002, etc.
I did make some changes at about the time this change in file naming occurred, but I cannot identify what setting, if any, I may have changed that causes all of the binary log files to always have the same name.
My backup script uses both mysqldump and mysqladmin flush-logs to create the binary logs.
All of the settings for mysqldump and mysqladmin are contained in the my.cnf file.
The relevant my.cnf contents are as follows:
Code:
The statements in the backup script that do the backup are:
mysqladmin flush-logs
or
mysqldump | gzip > $DB_BACKUP_DIR/$ARCHIVE_FILE #Note: delete-master-logs in my.cnf
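For reference, a hedged sketch of the [mysqld] options that normally control binary-log naming and rotation (the basename is a placeholder, not the poster's actual my.cnf); with a plain basename like this, MySQL numbers the logs xxxx-bin.000001, xxxx-bin.000002, and so on:
Code:
[mysqld]
# binary logging with a numbered sequence of log files
log-bin = /var/log/mysql/xxxx-bin
expire_logs_days = 20
max_binlog_size  = 100M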
I want to back up data and upload it to online hosting services. I first want to encrypt locally the data that I want to back up. Since I will be making changes to the data locally, I want some sort of incremental imaging system where the incremental changes are stored in separate files, so that I only have to upload the incremental encrypted changes. Duplicity is an option, but it uses GPG, which makes it a bit complicated; I was wondering if there is a simpler alternative, as I am only doing the encryption and backup locally.
EDIT: I have only ONE computer, on which the data resides and on which the backup image is made. That is, I have a directory foo on my computer, a backup of which will be made to back-foo on the same computer. I want back-foo to be in an encrypted form. Then back-foo will be uploaded (without further encryption) to Microsoft Live storage or SpiderOak storage etc. Since back-foo is encrypted, my upload is secure. And since I'm uploading, I want incremental backup support; that is, the backup utility should create new files containing the incremental changes, so that I can upload only the new files which contain the changes.
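For what it's worth, a hedged sketch of how duplicity can be used with only a symmetric passphrase, so no GPG key management is involved (paths are placeholders):
Code:
# The first run makes a full backup; later runs create incremental volumes
# (*.difftar.gpg) in back-foo that can be uploaded on their own.
export PASSPHRASE='choose-a-strong-passphrase'
duplicity /home/user/foo file:///home/user/back-foo
unset PASSPHRASE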
I've tried to Google but not had much luck. What I would like to do is have a number of folders on my desktop, and their contents, replicated/duplicated into another folder on the same PC in real time. So, for example, if I were to change an OpenOffice document in a specific folder on my Desktop, it would be replicated/duplicated in real time. If I had three folders on my Desktop, A, B and C, they would also appear/be backed up (in real time) in a folder called /home/backup. Can this be done?
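One hedged way to approximate real-time replication is an inotify watch that triggers rsync on every change. This sketch assumes the inotify-tools package is installed, and the paths are placeholders:
Code:
#!/bin/bash
# Watch the source tree and mirror it into the backup folder after each change.
SRC="$HOME/Desktop"
DEST="/home/backup"
while inotifywait -r -e modify,create,delete,move "$SRC"; do
    rsync -a --delete "$SRC"/ "$DEST"/
done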
I am trying to create a backup script that will back up a single folder for a class I am in. I was wondering if I could get some help. If possible, I would also like to know how to write a script that can encrypt that same file. I will be putting the backup in my /home/usr/Backup directory. I am not trying to back up my whole system, just a single folder. I am using Fedora 11.
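A hedged sketch of a minimal script for this; the source folder name is an assumption, and the destination is the directory named in the post:
Code:
#!/bin/bash
# Archive one folder into a dated tarball, then encrypt it with a passphrase.
SRC="$HOME/Documents/classwork"      # placeholder source folder
DEST="/home/usr/Backup"
STAMP=$(date +%Y%m%d)
mkdir -p "$DEST"
tar -czpf "$DEST/classwork-$STAMP.tar.gz" "$SRC"
# gpg prompts for a passphrase and writes classwork-$STAMP.tar.gz.gpg
gpg --symmetric --cipher-algo AES256 "$DEST/classwork-$STAMP.tar.gz"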
I have used the dump command to dump the application files. For a full backup, level 0 is working fine. For an incremental backup I used level 1 or 2 and it gives the following error:
DUMP: Only level 0 dumps are allowed on a subdirectory
DUMP: The ENTIRE dump is aborted.
The code I used:
Code:
#!/bin/bash
#Full Day Backup Script
#application folders backup
#test is the username
now=$(date +"%d-%m-%Y")
[Code]...
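The error means that incremental dump levels only apply to whole filesystems (recorded in /etc/dumpdates), not to subdirectories. A hedged sketch, assuming the application files live on their own partition (the device name is a placeholder):
Code:
# Level 0 (full) dump of the whole filesystem; -u records it in /etc/dumpdates.
dump -0u -f /backup/app-full.dump /dev/sda5
# A later level 1 captures only what changed since the last level 0.
dump -1u -f /backup/app-inc1.dump /dev/sda5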
I have a really quick question. I would like to set up some type of scheduled event to back up my entire /home folder to a USB drive. I know about all of the various programs such as Simple Backup, etc., and have used them before. As far as I know, these programs can't do what I'm trying to do. Does anyone know what I could use to back up my files at a specific time to a specific USB device? Preferably, I would just like to have a simple
Code:
sudo cp /home /media/Cruzer
run every night at, say, 2 am.
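A hedged sketch of how that could be scheduled; rsync is used here instead of cp so that ownership and permissions are kept and only changed files are copied each night (the mount point is the one from the post, and it must already be mounted):
Code:
# sudo crontab -e, then add this line (runs every night at 02:00):
0 2 * * * rsync -a --delete /home/ /media/Cruzer/home-backup/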
I'm trying to create a very simple backup script to back up the contents of one folder on my system to an NTFS-formatted external hdd. I want to keep the ownerships and permissions of the files I'm backing up intact, so I'm putting them into a tar archive. Compression is not necessary, as I have plenty of space for the backup.
I created the initial backup with the following command:
Code:
tar -cpf $bupath/backup.tar $sourcepath
This seemed to work quite well; the resulting file was about 170 GB and it took about 5 hours. For subsequent backups, the files are probably only going to change by about 10% at most, so it seems inefficient to create a whole new backup from scratch. I would like to be able to just update my existing archive with any new/altered files.
I have tried using update mode (-u) with tar, like so:
Code:
tar -upf $bupath/backup.tar $sourcepath
So far this has been running for about 10 hours and the archive has grown to approx 220 GB! What's going wrong here? I was expecting the update to take 30 minutes at most and there to be no significant change in the archive size. Am I perhaps misinterpreting the purpose of update mode in tar, or is there something wrong with my command? Is there a better/easier way to accomplish this?
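tar's update mode appends newer copies of changed files to the end of the archive instead of replacing the old ones, so the file only ever grows, and it has to scan the existing archive first, which explains the long run time. A hedged alternative sketch using GNU tar's snapshot-file incrementals, reusing the variables from the post:
Code:
# Initial full archive; the .snar snapshot records what has been backed up.
tar -cpf "$bupath/backup-full.tar" --listed-incremental="$bupath/backup.snar" "$sourcepath"
# Subsequent runs against the same .snar produce small archives containing
# only new or changed files.
tar -cpf "$bupath/backup-$(date +%Y%m%d).tar" --listed-incremental="$bupath/backup.snar" "$sourcepath"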
I have been looking around online and I see that there are several solutions for doing a nightly automated backup on Linux. I was wondering what people here actually use for this, and why they chose one particular backup method over another.
What I am looking to do is have my system back up my 200 GB Documents folder to my external hard drive every night (at, say, 3 am). Does Ubuntu have a tool built in by default to do this, or do I need to add something from the repos/online?
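cron plus rsync (both in the standard repositories and usually already installed) can cover this. A hedged sketch using hard-linked dated snapshots, so each night's folder looks complete but only changed files take new space (paths are placeholders):
Code:
#!/bin/bash
# Nightly snapshot: unchanged files are hard-linked to the previous snapshot.
SRC="$HOME/Documents"
DEST="/media/external/backups"
TODAY=$(date +%Y-%m-%d)
rsync -a --delete --link-dest="$DEST/latest" "$SRC"/ "$DEST/$TODAY"/
ln -sfn "$DEST/$TODAY" "$DEST/latest"
Saved as, say, /home/username/bin/nightly-backup.sh and made executable, a crontab entry of 0 3 * * * /home/username/bin/nightly-backup.sh would run it every night at 3 am.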
Before I reinstalled Ubuntu (this time allocating the entire disk to it, as I never really used Windows any more), I backed up the entire contents of my /home folder using Deja Dup. Now that I am done reinstalling Ubuntu, I am trying to restore the backup. However, when it actually begins restoring the backup, it says "Restore failed: failed with an unknown error".
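Deja Dup stores its backups with duplicity, so one hedged fallback, if the GUI restore keeps failing, is to point duplicity directly at the backup location (both paths below are placeholders):
Code:
# duplicity will prompt for the backup's passphrase if one was set.
duplicity restore file:///media/backup-drive/deja-dup-folder /home/username/restored-home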
I have read that I can back up the entire system, including the home folder, either with commands or with programs such as Clonezilla, but that doesn't work for me, so I'm trying to back it up with commands now, but I can't find a good tutorial that explains which commands to use.
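One commonly cited command-line approach is a full-system tar archive run as root. This is only a hedged sketch (the destination path is a placeholder); the excludes keep pseudo-filesystems and the archive itself out of the backup:
Code:
sudo tar -cvpzf /backup/fullsystem.tar.gz \
    --exclude=/backup/fullsystem.tar.gz \
    --exclude=/proc --exclude=/sys --exclude=/dev \
    --exclude=/tmp --exclude=/run --exclude=/mnt \
    --exclude=/media --exclude=/lost+found /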
I am trying to use rsync and ssh to move a backup folder from some computers to a server. I found a command that is supposed to do this, but I am having issues getting it to work.
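For reference, a minimal hedged sketch of the usual form of that command (host names, users and paths are placeholders); the trailing slash on the source copies the folder's contents rather than the folder itself:
Code:
rsync -avz -e ssh /home/username/backup/ backupuser@server.example.com:/srv/backups/host1/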
I'm a beginner at backing up my Ubuntu system, but I've set Simple Backup to do a backup once a week. I deleted the oldest of these files, but now it's sitting in my Trash and I can't empty it. I get a permission denied error for the folders within the backup folder in the trash, yet I can't restore the folder either; Ubuntu says it 'failed to determine the original path' for the folder. I've just discovered this in Xubuntu Jaunty, but I'm confident the same will happen in any other WM I choose (I have several installed; I like variety).
It's not a huge file, but it's hanging out there and I'd like to either delete it or restore it. Possibly I oughtn't to have deleted it in the first place (it usually lives in /var/backup, which I can't access except as root). The files, which I probably deleted as root, show up in my user trash rather than root's trash. I found the trash in ~/.local/share/Trash/files, but I'm not sure whether just deleting them as root would be a good idea.
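If the stuck entries really are root-owned files sitting in the user trash, a hedged cleanup is to remove both each file and its matching .trashinfo record as root (the folder name below is a placeholder):
Code:
sudo rm -rf ~/.local/share/Trash/files/backup-folder-name
sudo rm -f  ~/.local/share/Trash/info/backup-folder-name.trashinfo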
How to remove a specific folder from your backup?
Code:
$ rdiff-backup --remove-older-than now /backup/backup_laptop/home/derick/Downloads
Fatal Error: Increments for directory /backup/backup_laptop/home/derick/Downloads cannot be removed separately.
Instead run on entire directory /backup/backup_laptop.
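As the error says, rdiff-backup cannot drop the increments of a single subdirectory. A hedged workaround sketch: exclude the directory from future runs (the source path here assumes the backup covers the whole filesystem, since the repository mirrors /home/...), then let its old increments age out with a repository-wide prune:
Code:
# Stop backing up Downloads from now on.
rdiff-backup --exclude /home/derick/Downloads / /backup/backup_laptop
# Later, prune old increments across the whole repository.
rdiff-backup --remove-older-than 4W /backup/backup_laptop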
I have a backup folder which I need to prune, and I've been trying to do a find-and-destroy action on files in this folder which have not been modified for more than 30 days. I figure that if no users complain about files that have been missing for more than 30 days, it's safe to delete them from this folder.
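A hedged sketch of that find-and-destroy pass (the path is a placeholder); listing first is a cheap safety check before deleting anything:
Code:
# Dry run: list files not modified for more than 30 days.
find /path/to/backup-folder -type f -mtime +30 -print
# Then delete them.
find /path/to/backup-folder -type f -mtime +30 -delete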
I am using Red Hat Linux EL5. When I booted, I had a problem repairing the filesystem, so I ran the following commands: 1. fsck -l 2. fsck. But I could not solve the problem. My problem now is that I have an important folder, "backup", in the root folder, and I just want to copy it to a pen drive. How do I do that?
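A hedged sketch, assuming the folder is /backup at the filesystem root and the pen drive shows up as /dev/sdb1 (check the actual device with fdisk -l or dmesg after plugging it in):
Code:
mkdir -p /mnt/pendrive
mount /dev/sdb1 /mnt/pendrive
cp -a /backup /mnt/pendrive/     # -a keeps ownership, permissions and times
umount /mnt/pendrive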
I installed a second HD and formatted it to ext4. I gave it the "/backup" label. I am trying to figure out how to mount it so that I can run cron to back up my home folder onto it once a week. This is what the fstab looks like now:
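A hedged sketch of the kind of fstab entry and cron schedule that would do this (not the poster's actual fstab; it assumes the filesystem label is "backup" and the mount point is /backup, which blkid can confirm):
Code:
# /etc/fstab entry
LABEL=backup   /backup   ext4   defaults   0   2

# root's crontab (crontab -e): weekly backup, Sundays at 03:00
0 3 * * 0 rsync -a --delete /home/username/ /backup/home-username/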
I am using RHEL 5 running as a Samba PDC. Most of the users save their data in a common folder on the server. Now I want to back up this data to some other location to have redundancy; it could be an external USB HDD or another folder on the same server. How do I create a backup script and automate it using cron?
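A hedged sketch; the share path and destination are placeholders:
Code:
#!/bin/bash
# /usr/local/bin/backup-common.sh - mirror the shared folder to a second location
rsync -a --delete /srv/samba/common/ /backup/common/
A root crontab entry (crontab -e) such as 0 1 * * * /usr/local/bin/backup-common.sh would then run it every night at 01:00.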
I needed to reinstall Etch recently but forgot to back up the /var/lib/dpkg folder. I know there's an archive server that provides old packages, but it does not have the packages from www.debian-multimedia.org. I have tried its mirror sites, but it seems they have removed the old packages from Etch and earlier releases.
Can I just copy/back up the Postfix mail queues in /var/spool/postfix and put that folder back in place after I am done migrating all users and mail to the new mail server?
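In principle that can work, as long as the queue files keep their ownership and permissions and Postfix is not running on either server while the queue is copied. A hedged sketch (the service commands assume a SysV-style init):
Code:
# On the old server:
service postfix stop
tar -czpf postfix-queue.tar.gz -C /var/spool/postfix .

# On the new server:
service postfix stop
tar -xzpf postfix-queue.tar.gz -C /var/spool/postfix
postfix check        # reports ownership/permission problems in the queue tree
service postfix start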