Fedora :: Incremental Automatic Backup
Jul 18, 2011
I am trying to find a backup program to incrementally back up some files to an external disk every week, for example. I would prefer not to have to write a script, as I am not really used to scripting.
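Graphical front-ends such as Back In Time wrap the classic rsync-snapshot idea, so no scripting is needed if it is available in the Fedora repos. For reference, a minimal sketch of what such tools do underneath, assuming the disk is mounted at /mnt/backup (hypothetical path):
Code:
#!/bin/bash
# Weekly snapshot: unchanged files are hard-linked to last week's copy,
# so each snapshot looks complete but only changed files take up space.
SRC=/home/user            # what to back up (assumed)
DEST=/mnt/backup          # external disk mount point (assumed)
WEEK=$(date +%Y-%m-%d)
rsync -a --delete --link-dest="$DEST/latest" "$SRC/" "$DEST/$WEEK/"
ln -sfn "$DEST/$WEEK" "$DEST/latest"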
View 4 Replies
Jan 15, 2010
After I spent some time discovering The BIG BANG of the Universe and The Meaning of Life:
I managed somehow to create a script that makes a backup of files on the server, TARs it, then FTPs the archive to another FTP server, and then emails the result.
It also measures the time needed to complete, deletes archives older than XX days (set in find -mtime +20), and makes an incremental backup every weekday and a FULL one on Sundays (which suits me because there is no heavy load then).
The files for tar to include and exclude are listed in txt files, one name per line:
file: including.txt:
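The list contents did not survive, but the mechanism described maps onto tar's -T (include-from) and -X (exclude-from) options; a hedged sketch of the pattern, with paths assumed and the 20-day retention taken from the post:
Code:
#!/bin/bash
# Archive every path listed in including.txt, skipping excluding.txt entries.
tar -czpf /backup/backup-$(date +%F).tar.gz \
    -T /root/including.txt -X /root/excluding.txt
# Delete archives older than 20 days, as described above.
find /backup -name '*.tar.gz' -mtime +20 -delete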
View 7 Replies
View Related
Mar 16, 2011
I'm trying to take backups of my data on RHEL, but I am not able to back up everything. Could anybody show me how to take incremental and full backups? What is the process?
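One classic answer on RHEL is the dump utility, whose numbered levels give exactly this full/incremental split; a sketch, assuming an ext3 filesystem on /dev/sda1 (hypothetical device):
Code:
# Level 0 = full backup of the filesystem (e.g. Sundays); -u records
# the run in /etc/dumpdates so later levels know the reference point.
dump -0uf /backup/full.dump /dev/sda1
# Level 1 on weekdays = everything changed since the last level 0.
dump -1uf /backup/incr-$(date +%a).dump /dev/sda1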
View 3 Replies
View Related
Oct 20, 2010
When rsync has finished the update, or in the meantime, I need to move the updated files to a different location, named with something like date +%Y%m%d. The reason is that, because of development, I need the modified files, all of them, not just the last one, so I have to store them daily. But I don't want to store the whole dir, just the few files which were updated. Does it make sense?
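rsync can do this in a single pass with --backup-dir: any destination file about to be replaced or deleted is moved into a dated directory instead, so each day's directory holds exactly the few files that changed that day. A sketch, with paths as assumptions:
Code:
# Superseded copies land in /archive/YYYYMMDD; the live tree stays current.
rsync -av --backup --backup-dir=/archive/$(date +%Y%m%d) \
    /src/project/ /dest/project/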
View 5 Replies
View Related
Mar 8, 2011
Incremental backup of a folder. The problem with e.g. find&tar is that I want to back up not only files with a modification time after x:y, but also older files that were copied into this folder after the last backup.
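One hedged workaround: a file copied into the folder keeps its old mtime but gets a fresh ctime (inode change time), so selecting on ctime against a timestamp file catches both modified and newly copied files. A sketch with assumed paths:
Code:
# /var/backups/last-run records when the previous backup finished.
find /data -cnewer /var/backups/last-run -print0 |
    tar -czpf /backup/incr-$(date +%F).tar.gz --no-recursion --null -T -
touch /var/backups/last-run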
View 6 Replies
View Related
Sep 18, 2010
A complete backup using tar consumes a lot of time, so is there any way to take incremental backups using tar? I also want to take incremental backup dumps of my databases. Any suggestions and links will be very helpful. I keep googling for this, but could not find anything exact.
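GNU tar's --listed-incremental (-g) mode keeps a snapshot file and archives only what changed since the previous run; a sketch with assumed paths, plus a plain mysqldump for the database side (true incremental MySQL backups usually rely on the binary log instead):
Code:
# First run against an empty snapshot file is a full backup;
# subsequent runs with the same snapshot archive only the changes.
tar -g /var/backups/data.snar -czpf /backup/data-$(date +%F).tar.gz /data
# Simple full database dump (credentials assumed in ~/.my.cnf):
mysqldump --all-databases | gzip > /backup/db-$(date +%F).sql.gz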
View 14 Replies
View Related
May 7, 2009
I had a full backup of MySQL. Now I have added some tables, and I have new binary logs. How can I use these logs to make an incremental backup?
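The binary logs are themselves the incremental backup: they record everything executed since the full dump. A hedged sketch (log names and paths are illustrative):
Code:
# Close the current binary log so it is safe to copy:
mysqladmin flush-logs
# Copying the closed logs away IS the incremental backup:
cp /var/lib/mysql/mysql-bin.000001 /backup/
# To restore: load the full dump, then replay the logs on top of it:
mysql < /backup/full-backup.sql
mysqlbinlog /backup/mysql-bin.000001 | mysql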
View 3 Replies
View Related
Sep 17, 2010
I want to back up data and upload it to online hosting services. Since I'm uploading the data online, I only want to upload encrypted data (so that even the hosting service admins cannot look at it). Thus, I first want to encrypt locally the data that I want to back up. Since I will be making changes to the data locally, I want some sort of incremental imaging system where the incremental changes are stored in separate files, so that I only have to upload the incremental encrypted changes.
Duplicity is an option, but it uses GPG, which makes it a bit complicated; I was wondering if there is any simpler alternative, as I am only doing the encryption and backup locally.
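It may be worth noting that duplicity's GPG usage can shrink to a single passphrase: with no --encrypt-key given it falls back to symmetric encryption, so there is no key management at all. A sketch against a local target, paths assumed:
Code:
# First run is a full backup; later runs write encrypted incremental
# volumes only. No GPG keys needed, just the passphrase.
export PASSPHRASE='your-secret-here'
duplicity /home/user/data file:///home/user/back-data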
View 1 Replies
View Related
Apr 3, 2010
I am trying to write a bash script in order to have backups done by cron on the webhosting server. I want all backup runs to have an incremental number in front of them. I came up with the idea of storing the incremental number of the backup in a txt file on the server, so that the next run can check the number in the file. However, I am having terrible issues. My script:
[code]....
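The script body was lost, but here is a minimal sketch of the counter-in-a-file pattern described; the file location and backup command are assumptions:
Code:
#!/bin/bash
COUNTER_FILE=/home/user/backup-counter.txt
# Read the previous number, defaulting to 0 on the first run.
n=$(cat "$COUNTER_FILE" 2>/dev/null || echo 0)
n=$((n + 1))
echo "$n" > "$COUNTER_FILE"
tar -czf "/backup/${n}-site-$(date +%F).tar.gz" /var/www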
View 7 Replies
View Related
Mar 15, 2011
I am currently using a script to back up my Ubuntu 10.04.1 system. The MySQL databases are backed up separately from the system / data.
My problem is with the mySQL incremental / binary log backups.
The problem is that the binary log file(s) are always named xxxx-bin.1.
Up to about a month ago the binary logs were named xxxx-bin.000001, xxxx-bin.000002, etc.
I did make some changes at about the time this change in file naming occurred, but I cannot identify what setting, if any, I may have changed that has caused all of the binary log files to always have the same name.
My backup script uses both mysqldump and mysqladmin flush-logs to create the binary logs.
All of the settings for mysqldump and mysqladmin are contained in the my.cnf file.
The relevant my.cnf contents are as follows:
Code:
The statements in the backup script that do the backup are:
mysqladmin flush-logs
or
mysqldump | gzip > $DB_BACKUP_DIR/$ARCHIVE_FILE #Note: delete-master-logs in my.cnf
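For reference, a commonly seen fully spelled-out form of that pair; these are standard mysqldump/mysqladmin options, not a guess at the elided my.cnf contents:
Code:
# Rotate the binary log so the incremental boundary is clean:
mysqladmin flush-logs
# Full dump; --flush-logs rotates again at dump time, and --master-data=2
# records (as a comment) the binary log position the dump matches:
mysqldump --all-databases --single-transaction --flush-logs \
    --master-data=2 | gzip > "$DB_BACKUP_DIR/$ARCHIVE_FILE"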
View 3 Replies
View Related
Aug 10, 2010
I want to back up data and upload it to online hosting services. I first want to encrypt locally the data that I want to back up. Since I will be making changes to the data locally, I want some sort of incremental imaging system where the incremental changes are stored in separate files, so that I only have to upload the incremental encrypted changes. Duplicity is an option, but it uses GPG, which makes it a bit complicated; I was wondering if there is any simpler alternative, as I am only doing the encryption and backup locally.
EDIT: I have only ONE computer, on which the data resides and on which the backup image is made. That is, I have a directory foo on my computer, the backup of which will be made to back-foo on the same computer. I want back-foo to be in an encrypted form. Then back-foo will be uploaded (with no further encryption) to Microsoft Live storage or to SpiderOak storage, etc. Since back-foo is encrypted, my upload is secure. And since I'm uploading, I want incremental backup support; that is, the backup utility should create new files which contain the incremental changes, so that I can upload only the new files which contain the changes.
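Besides duplicity, one hedged alternative that matches this exact foo/back-foo layout is EncFS in reverse mode: it exposes an encrypted view of the plaintext directory, which any ordinary sync tool can then upload file by file (paths are illustrative):
Code:
# back-foo becomes a live encrypted view of foo; no duplicate copy is kept.
encfs --reverse /home/user/foo /home/user/back-foo
# A plain file-level upload of back-foo now transfers only the
# encrypted files that changed since the last run.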
View 2 Replies
View Related
Feb 8, 2016
As the topic says, I want to create a weekly backup of some folders to another partition (or an external USB drive), compressed or not (the folders are around 20/30 GB), with root-only permissions on the created file (or folder). This system, where I have installed Debian Jessie, is always on, being used like a NAS.
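A minimal sketch, assuming the target partition is mounted at /mnt/backup (hypothetical): a script dropped into /etc/cron.weekly/ runs weekly as root on Debian, and umask keeps the archive root-only.
Code:
#!/bin/sh
# /etc/cron.weekly/folder-backup
umask 077   # the archive below is created readable by root only (mode 600)
tar -czf /mnt/backup/data-$(date +%F).tar.gz /srv/share /home/media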
View 1 Replies
View Related
Jun 20, 2011
I am not familiar with Linux. I recently installed a TACACS+ server under CentOS 5.5 and it is working well, but my boss asked me to make a weekly backup of the tacacs.log file, to check who was connected at any time. The problem is that I don't know how to make this backup or how to make it automatic. He also asked me to change the default TACACS+ port from 49 to another port. Does someone know how to do this?
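A hedged sketch for the backup half; the log path varies by build, so treat it as an assumption. The listening port is normally changed where the daemon is started, e.g. via tac_plus's -p flag, with the matching change made on the network devices.
Code:
# Weekly root crontab entry (crontab -e): copy the log aside, dated.
# Note the escaped %, required inside crontab lines.
0 2 * * 0  cp /var/log/tac_plus.acct /backup/tacacs-$(date +\%F).log
# Port change, in the daemon's startup line (not in crontab):
#   tac_plus -C /etc/tac_plus.conf -p 4949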
View 1 Replies
View Related
Sep 6, 2010
I have used the dump command to back up the application files. For a full backup, level 0 works fine. For an incremental backup I used level 1 or 2, and I get the error:
DUMP: Only level 0 dumps are allowed on a subdirectory
DUMP: The ENTIRE dump is aborted.
The code I used
===============================
#!/bin/bash
#Full Day Backup Script
#application folders backup
#test is the username
now=$(date +"%d-%m-%Y")
[Code]...
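That error is dump's documented restriction: incremental levels work only on a whole filesystem, while a subdirectory can be dumped at level 0 only. A hedged illustration, with the device name assumed:
Code:
# Allowed: level 0 of a subdirectory
dump -0uf /backup/app-full.dump /home/test/app
# Rejected with "Only level 0 dumps are allowed on a subdirectory":
dump -1uf /backup/app-incr.dump /home/test/app
# Allowed: incremental levels against the filesystem itself
dump -1uf /backup/home-incr.dump /dev/sda3   # filesystem holding /home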
View 2 Replies
View Related
Jun 9, 2010
I created a profile for YaST Backup and successfully executed it manually. However, when I set up Automatic Backup with the same profile, it doesn't work. It starts at the scheduled time and does the "search" part, but it hangs before it creates the archive. The two processes (backup_cron & y2base) remain running and never terminate.
Here are the last few lines in the YaST log before it hangs:
Code:
2010-06-09 20:08:05 <1> londy(1957) [YCP] backup/ui.ycp:2037 subprocess started: true (PID: 1958)
2010-06-09 20:08:05 <1> londy(1957) [YCP] backup/ui.ycp:1827 Reading installed packages
2010-06-09 20:08:18 <1> londy(1957) [YCP] backup/ui.ycp:1735 Number of installed packages: 1281
2010-06-09 20:09:17 <1> londy(1957) [YCP] backup/ui.ycp:1842 Searching in packages started
[Code]....
View 2 Replies
View Related
Apr 14, 2010
Linux/Ubuntu noob here, so please be gentle. I own an Ubuntu server (7.10, "gutsy") which was previously used for my small business. All setup and maintenance of this server was done by an admin who has since moved on, and I can't get in touch with him. As part of the setup, this admin somehow configured the server so that whenever I plug in an external HDD (USB), it automatically runs a backup script which copies a whole bunch of stuff to the drive. I want to cancel/delete this script, as it is no longer necessary. Can anyone give me any pointers as to how I could track down where this script is and how to remove it?
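Plug-triggered actions like this are usually wired up through a udev rule; a hedged place to start looking:
Code:
# Custom udev rules live here; a RUN+= entry names the script it launches:
grep -rn 'RUN' /etc/udev/rules.d/
# On a 7.10-era system, also worth checking hotplug leftovers and cron:
ls /etc/hotplug.d/ 2>/dev/null
crontab -l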
View 2 Replies
View Related
Sep 10, 2010
I wonder if anyone knows of a cross platform sync/backup utility that can do the following:
Incremental backups
Revisioned backups (the possibility to not just restore the latest, but to go back to a specific point in time)
Two-way sync, e.g. two computers sync to the same server and the server also pushes changes to the clients (well, I suppose it might be possible to solve that by restoring the latest backup)
Cross platform: Linux, Windows & Mac (the server side may be Linux only)
Basically I want a revisioned version of Unison or a local version of Dropbox with support for custom folders. Depending on how you see it.
I'm not afraid to script things together if no pre-built solution exists but there are tools that when combined might do this. However it must be possible to automate, this comes from the fact that I want to be able to schedule it so it performs automatically. (If there is a way to push changes to the computers, like Dropbox, that's a plus but not a requirement.) This also means that a command line utility might be preferable over a GUI-only one.
View 1 Replies
View Related
Feb 2, 2010
I have been looking around online and I see that there are several solutions for doing a nightly automated backup on Linux. I was wondering what people here actually use, and why they prefer one particular backup method over another.
What I am looking to do: every night (at say 3am) I want my system to back up my 200 GB Documents folder to my external hard drive. Does Ubuntu have a tool built in by default to do this, or do I need to add something from the repos/online?
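Ubuntu ships with cron by default, and rsync (also preinstalled) covers the copy itself; a minimal sketch with paths assumed. GUI front-ends such as Déjà Dup or Back In Time are in the repos if a graphical tool is preferred.
Code:
# crontab -e, then add the line below; it runs every night at 03:00:
0 3 * * *  rsync -a --delete /home/user/Documents/ /media/external/Documents/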
View 1 Replies
View Related
Aug 6, 2010
Currently, I would like to set up an automatic backup system where the data on my computer is copied to an external hard disk connected to the router (Siemens SE551). This device theoretically allows setting up a file server, but unfortunately I can't access the share from Linux. I have not yet tried to access this share from Windows.
The rough plan is to use rsync and ssh, issued by a cron job on both my Ubuntu and my WinXP notebooks. But before considering this, I have to get my system connected with this drive.
BTW: I am using Ubuntu 10.04.
According to the SE551 manual [URL], NAS is not provided.
I followed this manual and set security OFF. According to the router's web interface, the device was recognized correctly.
I installed samba:
Code:
gedit /etc/samba/smb.conf
#
# Sample configuration file for the Samba suite for Debian GNU/Linux.
#
#
# This is the main Samba configuration file. You should read the
[Code]....
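One hedged observation: smb.conf configures the local Samba server, while talking to the router's share only needs the client tools. It may be worth testing whether the share is reachable at all; the share name and router address below are assumptions:
Code:
# List the shares the router exposes (anonymous, since security is OFF):
smbclient -L //192.168.1.1 -N
# Try mounting one as a guest:
sudo mkdir -p /mnt/routerdisk
sudo mount -t cifs //192.168.1.1/usbdisk /mnt/routerdisk -o guest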
View 5 Replies
View Related
Sep 6, 2010
I set up incremental backups with crontab, and I just discovered that tar is not actually incrementing the tar files. I first created the tar files; then in crontab I have:
Code:
cd /; tar -cpf --incremental --exclude-from=/root/ExcludeFromTar.txt mnt/PATRIOT/bkp/home.tar home
I only just discovered that this creates a file whose filename is "/--incremental". I also tried using tar's -G switch instead of --incremental:
[Code]....
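The likely culprit in the crontab line above: -f takes the very next argument as the archive name, so tar literally creates a file called "--incremental" (hence "/--incremental" after the cd /). For snapshot-based incrementals, GNU tar also wants --listed-incremental=FILE (-g FILE), which names the snapshot file, rather than the bare --incremental/-G. A hedged corrected form:
Code:
cd / && tar --listed-incremental=/root/home.snar \
    --exclude-from=/root/ExcludeFromTar.txt \
    -cpf mnt/PATRIOT/bkp/home.tar home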
View 2 Replies
View Related
Aug 10, 2011
I want to make a backup of my email and my Favorites from Mozilla.
But which folders do I have to back up?
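Assuming Mozilla-family defaults, the profiles live in hidden directories under the home directory: ~/.mozilla holds the Firefox profile (bookmarks included) and ~/.thunderbird holds the mail. A sketch:
Code:
# One archive covering bookmarks, settings and locally stored mail:
tar -czf mozilla-backup-$(date +%F).tar.gz ~/.mozilla ~/.thunderbird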
View 3 Replies
View Related
Feb 26, 2010
Does less have an incremental search?
I'm on Xubuntu.
View 2 Replies
View Related
Jan 21, 2010
I am using rsync to back up dirs on my Ubuntu server onto a NAS (which is mounted onto the filesystem), but the problem is that it is constantly doing full backups rather than incrementals, and I am not really sure why. After doing a bit of experimenting with the script, I noticed that if I just backed up a home dir (/home/user) the incremental backups work fine. If however I back up a dir like /home/domain/user, it always does full backups. I have tried various different scripts but still get the same end result. The latest script is a variation on a script found on the Samba rsync examples webpage; see below...
#!/bin/bash
# rsyncbu.sh -- backup to nas using rsync
# This script backups files listed in BDIR to the BSERVER. The verbose output along with the date is listed in the LOG_FILE specified
# verbose output
[code]....
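Since the script body was lost, one hedged thing to check: rsync decides "unchanged" by size plus modification time, so if the flags or the NAS filesystem do not preserve times (-a implies -t), every run degenerates into a full copy. A quick test, paths assumed:
Code:
# -n = dry run, -i = itemize why each file would be transferred:
rsync -ani /home/domain/user/ /mnt/nas/backup/user/
# Output lines like '>f..t......' mean only the timestamp differs,
# a hint that times are not being kept on the NAS side.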
View 4 Replies
View Related
Jan 29, 2010
Can I use rsync for incremental backups of the running Linux server?
View 5 Replies
View Related
May 6, 2010
I'm getting video from a camera connected to the computer and saving it to a constantly growing file.
The thing is that I'm trying to make a non-stop copy of this file over the network (i.e. using scp, rsync or something like that).
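For a file that only ever grows at the end, rsync's --append re-sends just the new tail on each pass; a hedged sketch of a simple polling loop (host and paths assumed):
Code:
#!/bin/bash
# Repeatedly push only the bytes added since the previous pass.
while true; do
    rsync -a --append /var/capture/video.raw user@host:/srv/capture/
    sleep 30
done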
View 4 Replies
View Related
Dec 2, 2010
With the --backup and --backup-dir= options on rsync, I can tell it another tree where to put files that are deleted or replaced. I'm hoping it fills out the tree with a replica of the original directory paths (at least for the files put there) or else it's a show stopper. What I'm wanting to find out applies when I'm restoring files. Assuming each time I run rsync (once a day) I make a new directory tree (named by the date) for the backup directory. For each file name/path in the tree, I would start with whatever is in the main tree (the rsync target) and work through the incremental trees going backwards until I reach the date of interest to restore to. If along the way I encounter a file in an incremental, I would replace the previous file at that path with this next one. So by the time I get back to a given date, I should have the version of the file which was present at that date. Do this for each file in the tree and it should be a full restore.
But ... and this is the hard part, it seems. What about files that did not exist at the intended restore date, but do exist (were created) on a date after the intended restore date. What I'd want for a correct restore would be for such files to be absent in the restored tree (just as they were absent in the source tree on that date). How can such a restore be done to correctly exclude these files? Wouldn't rsync have to store some kind of sentinel that indicates that on dates prior, the file did not exist. I suspect someone might suggest I just make a complete hard linked replica tree for each date, and this way absent files will clearly be absent. I can assure you this is completely impractical because I have actually done this before. I ended up with backup filesystems that have so many directories and nodes that it could take over a day, maybe even days, to just do something like "du -s" on it. I'm intending to keep daily changes for at least a couple years, if not more. So that means the 40 million plus files would be multiplied by over 700, making programs like "du -s" have to check over 28 BILLION file names (and that's assuming the number of files does not grow over the next two years).
View 2 Replies
View Related
Jul 28, 2011
Please help me with an incremental copy command similar to the one in Windows. The command for copying in Windows is xcopy *.* destination /s /c /d /q /y, run from the source directory. Is there a similar command in Linux? I am new to Linux.
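xcopy with /s /d /y roughly means "recurse, copy only newer files, don't ask"; the close Linux equivalents are cp -au or rsync -a. A sketch:
Code:
# Recursive, attribute-preserving, copies only files newer than the
# destination's (like xcopy /s /d /y):
cp -au /path/to/source/. /path/to/destination/
# The rsync equivalent, skipping files already up to date:
rsync -av /path/to/source/ /path/to/destination/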
View 4 Replies
View Related
Nov 29, 2010
I've had several HDD crashes on my personal server over the years, and it's just gotten to be a real pain in the rear. It crashed again this morning. Currently, I make monthly tarball backups of the entire filesystem using my script:
Code:
#!/bin/sh
# Removes the tarball from the previous execution.
rm -rf /backup/data/*.tar.gz
[code].....
View 13 Replies
View Related
Mar 12, 2010
I'm converting PDF pages to images in the following way: convert file.pdf out.jpg. This gives me files in the current dir named out-0.jpg (first page), out-1.jpg (second page), out-2.jpg (third page), and so on. As you can see, the counter begins at 0. Can it be made to begin at 1, so I would get out-1.jpg (first page), out-2.jpg (second page), out-3.jpg (third page)?
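ImageMagick's -scene option sets the number the output sequence starts from; a hedged one-liner:
Code:
# Start output numbering at 1, giving out-1.jpg, out-2.jpg, ...
convert file.pdf -scene 1 out.jpg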
View 1 Replies
View Related
Feb 14, 2010
I received the following output from an rsync (3.0.0) command that was executed:
sending incremental file list
sent 77214 bytes received 484 bytes 155396.00 bytes/sec
total size is 254531170 speedup is 3275.90
What does "sending incremental file list" mean?
View 9 Replies
View Related