Server :: Create Daily Incremental Backups Easily?

Nov 29, 2010

I've had several HDD crashes on my personal server over the years and it's just gotten to be a real pain in the rear. Crashed again this morning. Currently, I make monthly tarball backups of the entire filesystem using my script:

Code:

#!/bin/sh
# Removes the tarball from the previous execution.
rm -rf /backup/data/*.tar.gz

[code].....
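For reference, one common way to get daily incrementals with plain GNU tar is a snapshot file (--listed-incremental): tar records what it archived last time and only picks up new or changed files on later runs. A minimal sketch, with illustrative paths:

Code:
#!/bin/sh
# Daily incremental backup driven by a GNU tar snapshot file.
# Delete $SNAPSHOT to force the next run to be a full (level-0) backup.
BACKUP_DIR=/backup/data
SNAPSHOT=$BACKUP_DIR/backup.snar
DATE=$(date +%Y-%m-%d)
tar --listed-incremental="$SNAPSHOT" \
    --exclude=/backup --exclude=/proc --exclude=/sys --exclude=/dev \
    -czf "$BACKUP_DIR/backup-$DATE.tar.gz" /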




Server :: Using Rsync For Incremental Backups?

Jan 29, 2010

Can I use rsync for incremental backups of a running Linux server?
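In short, yes: rsync only transfers files that changed since the last run, so against an existing target every run is effectively incremental. For discrete dated snapshots, a minimal sketch using --link-dest, where unchanged files become hard links to the previous snapshot and each day only costs the changed data (paths are examples):

Code:
#!/bin/sh
# Snapshot-style incremental: unchanged files are hard-linked to
# yesterday's snapshot instead of being copied again.
TODAY=$(date +%Y-%m-%d)
YESTERDAY=$(date -d yesterday +%Y-%m-%d)
rsync -a --delete --link-dest=/backup/$YESTERDAY /home/ /backup/$TODAY/

On the first run the --link-dest directory won't exist yet; rsync just warns and does a full copy that time.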


General :: Rsync Incremental Backups Rather Than Full Backups?

Nov 12, 2009

How do you get rsync to do incremental backups rather than full backups? At the moment I have a script that will create a backup folder (if it doesn't already exist), then copy the source files into the backup directory with the command

rsync $VERBOSE --exclude=$TARGET/ $EXCLUDE --exclude '/Ls-wtgl1c8/**' -rt --delete $source/ $TARGET/$source/ >> $LOG_FILE

TARGET is where the files will be backed up to, SOURCE is the dir(s) to be backed up, EXCLUDE is the list of files not to back up, and LOG_FILE is where the output will be saved. At the moment it only does full backups, but I would like to do incrementals only. How would this be achieved? Am I missing a required rsync option?
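For what it's worth, rsync with -rt should already skip unchanged files on the second and later runs; the transfer itself is incremental. If the goal is to keep what each run changed, one hedged variation on the command above (reusing your variables) is --backup with a dated --backup-dir, which moves replaced and deleted files into an increment directory instead of discarding them:

Code:
# Replaced/deleted files are moved into a dated increments directory,
# kept outside $TARGET so --delete never touches it.
DATE=$(date +%Y-%m-%d)
rsync $VERBOSE -rt --delete --backup \
      --backup-dir=/backup/increments/$DATE \
      $EXCLUDE "$source/" "$TARGET/$source/" >> $LOG_FILE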


General :: Rsync Not Doing Incremental Backups?

Jan 21, 2010

I am using rsync to back up dirs on my Ubuntu server onto a NAS (which is mounted onto the filesystem), but the problem is that it is constantly doing full backups rather than incrementals, and I am not really sure why. After doing a bit of experimenting with the script, I noticed that if I just back up a home dir (/home/user) the incremental backups work fine. If however I back up a dir like /home/domain/user, it always does full backups. I have tried various different scripts but still get the same end result. The latest script is a variation on a script found on the samba rsync examples webpage, see below...

#!/bin/bash
# rsyncbu.sh -- backup to nas using rsync
# This script backs up the files listed in BDIR to the BSERVER. The verbose output, along with the date, is written to the LOG_FILE specified
# verbose output

[code]....
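One possibility, offered only as a guess: if the NAS is mounted with a filesystem that stores timestamps coarsely (FAT, or some CIFS mounts, round to 2-second resolution), rsync's time comparison never matches and everything gets recopied. Two options worth trying, as a sketch:

Code:
# Allow slack in the mtime comparison so rounded timestamps still match:
rsync -av --modify-window=2 /home/domain/user/ /mnt/nas/backup/user/

# Or ignore times entirely and compare checksums (slower but reliable):
rsync -av --checksum /home/domain/user/ /mnt/nas/backup/user/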


Software :: Rsync Incremental Backups To Be Restored?

Dec 2, 2010

With the --backup and --backup-dir= options on rsync, I can tell it another tree where to put files that are deleted or replaced. I'm hoping it fills out the tree with a replica of the original directory paths (at least for the files put there) or else it's a show stopper. What I'm wanting to find out applies when I'm restoring files. Assuming each time I run rsync (once a day) I make a new directory tree (named by the date) for the backup directory. For each file name/path in the tree, I would start with whatever is in the main tree (the rsync target) and work through the incremental trees going backwards until I reach the date of interest to restore to. If along the way I encounter a file in an incremental, I would replace the previous file at that path with this next one. So by the time I get back to a given date, I should have the version of the file which was present at that date. Do this for each file in the tree and it should be a full restore.

But ... and this is the hard part, it seems. What about files that did not exist at the intended restore date, but do exist (were created) on a date after the intended restore date? What I'd want for a correct restore would be for such files to be absent in the restored tree (just as they were absent in the source tree on that date). How can such a restore be done to correctly exclude these files? Wouldn't rsync have to store some kind of sentinel indicating that on prior dates the file did not exist?

I suspect someone might suggest I just make a complete hard-linked replica tree for each date, so that absent files will clearly be absent. I can assure you this is completely impractical, because I have actually done this before. I ended up with backup filesystems with so many directories and inodes that it could take over a day, maybe even days, just to do something like "du -s" on them. I'm intending to keep daily changes for at least a couple of years, if not more. That means the 40-million-plus files would be multiplied by over 700, making programs like "du -s" have to check over 28 BILLION file names (and that's assuming the number of files does not grow over the next two years).
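For the first part, a rough sketch of the backward walk described above, assuming a hypothetical layout with the current mirror in /backup/current and dated --backup-dir trees under /backup/increments:

Code:
#!/bin/bash
# Reconstruct the tree as of $1 (YYYY-MM-DD). The newest increments
# are applied first, so the earliest increment dated after the
# target date ends up winning for each file, as described above.
RESTORE_DATE=$1
mkdir -p /restore
cp -a /backup/current/. /restore/
for d in $(ls /backup/increments | sort -r); do
    [[ "$d" > "$RESTORE_DATE" ]] || break
    cp -a "/backup/increments/$d/." /restore/
done

As the post says, this still cannot remove files created after the target date: rsync keeps no deletion sentinels, which is exactly why snapshot schemes fall back to hard-linked replica trees.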


General :: Make Incremental Backups On A Separate Partition Of The Same Hard Drive?

Jan 22, 2010

I assume one would have to exclude certain folders/directories, but would the backup be possible with the system up and running in its native "live" state? Which directories could be excluded? Does swap need to be turned off? I would like to make incremental backups on a separate partition of the same hard drive. I will endeavour to back up the MBR/partition table using dd.
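For what it's worth, a live backup of the root filesystem is generally workable if the kernel's pseudo-filesystems and the destination itself are excluded; swap is a separate partition, so it is simply never part of the file tree (open databases are the exception and should be dumped separately). A sketch, with illustrative mount points:

Code:
#!/bin/bash
# Back up the live root filesystem to a backup partition, skipping
# pseudo-filesystems and the destination itself.
rsync -aAX --delete \
      --exclude={"/proc/*","/sys/*","/dev/*","/tmp/*","/mnt/*","/media/*","/backup/*"} \
      / /backup/rootfs/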


Ubuntu Installation :: Easily Create A Boot-Info Summary?

Aug 9, 2011

When you have boot troubles, you will often be asked to run Boot-Info-Script in order to see a summary of your boot parameters. The standard method is a little difficult, so I made a little GUI to do it very easily:

1) Boot on Ubuntu live-CD or live-USB. (or Boot-Repair-Disk which will lead you automatically to step 4)
2) Connect to the internet
3) Open a terminal and type :

Code:
sudo add-apt-repository ppa:yannubuntu/boot-repair && sudo apt-get update && sudo apt-get install -y boot-repair && gksu boot-repair
4) Click on "Create a BootInfo summary".
5) Done!

Now just give (or copy/paste) the URL that appears to the people helping you on this forum.


Ubuntu :: Create Lossless Backups Of DVDs?

May 30, 2010

I would like to create lossless backups of my DVDs, or more exactly of the main movie, including one or more audio tracks and a subtitle of my choice. I would like to have the subtitle burned into the movie so that I only have one file (container). No, I don't want a complete DVD folder (VIDEO_TS and other stuff), nor do I want to create an ISO file of the DVD. Is it possible? The way I see it there should be two options:

1. Extract the wanted audio track/tracks, one subtitle and the main movie. Keep the audio track/tracks and the movie track in their original formats, and put it all together in one file/container (subtitle burnt in). In theory it should work! I know the video tracks are mostly MPEG2 and the audio tracks mostly AC-3 or DTS.

2. Do almost exactly as above, but before putting it all together compress the video and audio tracks even further to some lossless formats. In theory this should work too! In some other forum a guy told me that since MPEG2 and AC-3/DTS are already compressed formats it probably isn't possible to compress audio and video much further without losses, which is probably true.

Is it possible to do what I want to do? How? If this process is not easy, it would be nice if some of the skilled guys would create an application that does exactly this. I believe others besides me would find it extremely useful.
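On option 1, a hedged sketch with ffmpeg: copy the MPEG-2 video, the chosen AC-3/DTS audio, and the DVD subtitle into an MKV container with no re-encoding at all. The one caveat: truly burning the subtitle into the picture forces a video re-encode, which breaks the lossless goal, so this carries it as a selectable track instead.

Code:
# Remux losslessly into MKV; nothing is re-encoded. The stream
# indexes (0:v:0, 0:a:0, 0:s:0) are examples -- inspect the input
# with `ffmpeg -i input.vob` first and pick the tracks you want.
ffmpeg -i input.vob \
       -map 0:v:0 -map 0:a:0 -map 0:s:0 \
       -c:v copy -c:a copy -c:s copy \
       movie.mkv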


General :: Create Backups From Some Copy Protected DVD's?

Apr 14, 2010

I'd like to create backups from some copy protected DVD's, for my private use only.

Does it work this way to circumvent the copy protection mechanisms?

Code:
# dd if=/dev/dvd of=dvd.iso

and then burn dvd.iso on an empty DVD.
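A caution, offered tentatively: dd copies raw sectors, so on a CSS-scrambled disc the copy will be unreadable (and some discs add deliberate bad sectors that stall dd). The usual route is a tool that decrypts through libdvdcss; a sketch assuming dvdbackup and genisoimage are installed:

Code:
# Mirror the disc, decrypting via libdvdcss:
dvdbackup -M -i /dev/dvd -o /tmp/dvdrip

# Build an ISO from the mirrored tree (dvdbackup names the
# directory after the disc title):
genisoimage -dvd-video -o dvd.iso /tmp/dvdrip/*/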


Debian Configuration :: Cron.daily Not Running Daily On Laptop?

May 31, 2011

I am running on a laptop, and cron.daily is set to run at 06:25, so I wonder what happens if my machine is not turned on at that time. At that rate it could also be off for the other periods as well (weekly, monthly). Is there a solution that will allow them to run once the machine is back online after the appointed time, without resorting to a cron entry that runs every 15, 5, or even 1 minute?
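anacron exists for exactly this case: it records when each job last ran and catches up shortly after the machine comes back up, and on Debian the stock /etc/crontab automatically defers cron.daily/weekly/monthly to anacron when it is installed (apt-get install anacron). A sketch of the relevant /etc/anacrontab entries:

Code:
# /etc/anacrontab -- period (days), delay (minutes), job id, command
1         5    cron.daily     run-parts /etc/cron.daily
7         10   cron.weekly    run-parts /etc/cron.weekly
@monthly  15   cron.monthly   run-parts /etc/cron.monthly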


General :: Can Root Create A Directory If It Does Not Exist During Backups?

Nov 8, 2010

I am getting the databases from MySQL, and my database names are of the form username_something.
I extract the username and then put the respective backups in corresponding folders, like

tar bala bla /backups/sql/username/username_something.tar.sql.gz

The problem is that the system works if the username folder is already there, but for new databases I get an error like "unknown file path".

How can I make it so that the username folder is created if it is not there?
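mkdir -p does exactly this: it creates the directory (and any missing parents) and stays silent if it already exists, so it can run unconditionally before the tar. A sketch, with an illustrative $username and dump file:

Code:
#!/bin/sh
# Ensure the per-user backup directory exists before writing to it.
mkdir -p "/backups/sql/$username"
tar -czf "/backups/sql/$username/${username}_something.tar.sql.gz" \
    "/tmp/${username}_something.sql"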


Ubuntu :: Can Create An Automatic Sequence Of Spreadsheet Backups

Jan 7, 2011

I would like to create an automatic sequence of OpenOffice backups of a spreadsheet file, where each would be the daughter of the previous version. I would like it to autosave every hour, so at the end of the day I could manually make up a 'Day' file for permanent record.
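OpenOffice's own autosave only keeps one recovery copy, but a small script run hourly from cron can keep timestamped daughters of the file (path and name are examples):

Code:
#!/bin/sh
# Hourly timestamped copy of a spreadsheet. Cron entry:
#   0 * * * * /home/user/bin/sheet-snapshot.sh
SRC="/home/user/Documents/accounts.ods"
DEST="/home/user/Backups/accounts-$(date +%Y-%m-%d_%H%M).ods"
cp -p "$SRC" "$DEST"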


Server :: Incremental Backup In Linux

Sep 18, 2010

A complete backup using tar consumes a lot of time, so is there any way to take incremental backups using tar? I also want to take incremental dumps of my databases. Any suggestions and links would be very helpful; I keep googling for this but can't find anything exact.
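GNU tar does incrementals through a snapshot file (-g, short for --listed-incremental), and for MySQL the usual incremental mechanism is the binary log rather than tar. A hedged sketch of both, with illustrative paths:

Code:
#!/bin/sh
# Files: archive only what changed since the last run.
tar -g /backup/data.snar -czf /backup/data-$(date +%F).tar.gz /srv/data

# Database: full dump that rotates the binary logs; the binlogs
# written after this point are your increments.
mysqldump --all-databases --single-transaction --flush-logs \
          --master-data=2 | gzip > /backup/db-$(date +%F).sql.gz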


Ubuntu Networking :: Create File Sharing Between MacBook Pro And Lucid Box For Time Machine Backups?

Mar 19, 2011

I've been trying to set up file sharing between my MacBook Pro and my Lucid Lynx box for Time Machine backups and media server purposes. I followed this guide: [URL]... Everything seems to work, with these exceptions: I can see my Lucid Lynx box in the Finder app on my Mac, but only after I run these commands on Ubuntu:

Code:
sudo /etc/init.d/netatalk restart
sudo restart avahi-daemon

If I restart my Lucid Lynx box, I can't see anything in Finder. I also can't log into my Lucid Lynx box from Finder: I don't get a bad username or password error, it just tells me the connection failed. (Note that if I do enter an incorrect username or password, it WILL tell me it's incorrect.) I've looked at the link below, since some people have used it in these forums, but it's a bit dated: [URL]...
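A guess, based on the symptom that a manual restart fixes it: netatalk may be starting before avahi-daemon at boot, so the share is never announced. One blunt workaround, sketched below, is to restart it at the very end of the boot sequence from /etc/rc.local:

Code:
# /etc/rc.local -- runs last at boot; re-announce netatalk once
# avahi is definitely up. Crude, but a common workaround.
sleep 10
/etc/init.d/netatalk restart
exit 0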


CentOS 5 :: Scheduled Unattended Backups With Alerts If The Backups Fail

Feb 3, 2011

I've been a DOS/Windows guy for 20 years, and recently became a SW test lab helper. My company uses CentOS for a lot, so I've become familiar with it, but obviously not as comfortable as I am with Windows.

Here's what I have planned:

machine: Core 2 Duo E8400, 8GB DDR2, 60GB SSD OS drive, ATI 4650 video card, other storage is flexible (I have 3 1TB drives and 4 750GB drives around that can be used in this machine.)

uses: HTPC, Network Storage, VMWare server host: SMTP, FTP server, and Web server virtual machines

I've figured out how to do much of this, but I haven't figured out how to do backups in Linux. I've been spoiled by Windows, with its built-in backup system that is so simple to use. I find myself overwhelmed by the array of backup software, and unable to determine which to use. None of them seem to do everything I need, but some come close, I think. I'm hoping someone here can help me figure out which program to use and how to use it.

Here is what I need the backup software to do:
1. scheduled unattended backups, with alerts if the backups fail
2. a weekly full backup with incremental every 12 hours
3. removing the old backups when the new full backup runs, I would prefer to keep 2 weeks of backups, but that's not necessary
4. a GUI would be preferable, since my arthritic fingers don't always do what I want them to do. I typo things a lot, and the worn-off label on my backspace key can attest to that.
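As a point of comparison before choosing a packaged tool (Bacula and Amanda cover the alerting and retention requirements), the schedule in items 1-3 can be sketched with plain cron and GNU tar; cron mails any error output to MAILTO, which covers the failure alerts in a rough way:

Code:
# /etc/crontab sketch -- weekly full on Sunday, incrementals every
# 12 hours via a tar snapshot file. Paths are examples.
MAILTO=admin@example.com
0 0  * * 0    root  rm -f /backup/week.snar && tar -g /backup/week.snar -czf /backup/full-$(date +\%F).tar.gz /srv
0 0  * * 1-6  root  tar -g /backup/week.snar -czf /backup/incr-$(date +\%F-\%H).tar.gz /srv
0 12 * * *    root  tar -g /backup/week.snar -czf /backup/incr-$(date +\%F-\%H).tar.gz /srv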


Server :: Making Incremental Copies/transfers With Rsync In Cygwin?

Mar 21, 2011

As an example, I have two servers, sm-i222 and fileserv. sm-i222 is a Win2k3 system running cygwin; fileserv is a Linux box running RHEL 4.7. On sm-i222, /cygdrive/c maps to the C: drive and /cygdrive/d maps to the D: drive (actually a single 4TB RAID). From the sm-i222 server's /cygdrive/c I call a small script from the crontab. The internal IP for fileserv is 10.0.0.7. See below.

cd /cygdrive/d/fileserv/home
rm -rf /cygdrive/d/fileserv/home/*
rsync -avz -e 'ssh -c blowfish' root@10.0.0.7:/home/* .

These three lines perform well in that they make a full transfer of the fileserv:/home/ directory on fileserv to the appropriate place on sm-i222 using rsync. I use rsync instead of scp because I have to traverse subdirectories and symbolic links in the /home/... filesystem on fileserv. What I'm looking to do is use rsync to do an incremental transfer/backup of only the files that have changed since the last full backup. I'll manage the times I do this manually or in crontab. A colleague says this is do-able, but not how. Rsync.org says this is do-able but not how. Cygwin says this is do-able... see rsync.org. I believe what I'm looking for is a single rsync line like I have above that only transfers the changed files on fileserv to sm-i222.
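The rm -rf plus full copy can most likely be dropped entirely: pointed at the existing tree, rsync already transfers only what changed since the last run, and --delete takes care of files removed on the source. A sketch of the same job as an incremental:

Code:
cd /cygdrive/d/fileserv/home
# No rm -rf: rsync compares against what is already here, transfers
# only new/changed files, and --delete prunes files removed on the
# source since the last run.
rsync -avz --delete -e 'ssh -c blowfish' root@10.0.0.7:/home/ .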


Red Hat :: Server Reboots Sometimes Cron.daily

Dec 31, 2008

I have an issue that doesn't happen every day, and I can't seem to trace it down. It happens at 4:02 AM; cron.daily kicks off at 4 AM. The following is from /var/log/messages: I receive a panic error, then the server shuts down.


General :: High CPU Use - Server Down Daily At The Same Time

Sep 2, 2010

I am working on Ubuntu 8.04.3, and I am having a problem: daily my server goes down at the same time, 4:00 PM. It seems the server is being taken down by the "kswapd0" process, but I am not sure. When I run the top command, I get the output below:

[code]...


Server :: Can't Get Logwatch To Email A Daily Summary?

Oct 12, 2010

I have a squid proxy server (which I am very new to) which all traffic from my office goes through. The proxy itself is working fine, but I cannot get logwatch to email me a daily summary. logrotate seems to be throwing an error:

# logrotate /etc/logrotate.conf
error: squid:1 duplicate log entry for /var/log/squid/access.log

My /etc/logrotate.d/squid file is below... My access logs are in /logs/squid, not in /var/log/squid.

[Code]...
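The "duplicate log entry" error usually means the same log path is matched by two stanzas under /etc/logrotate.d (often the distro's stock squid file plus a local copy). A hedged sketch of a single stanza pointing at the actual log location:

Code:
# /etc/logrotate.d/squid -- make sure no other file in
# /etc/logrotate.d also matches these paths.
/logs/squid/access.log /logs/squid/cache.log {
    daily
    rotate 7
    compress
    missingok
    notifempty
    sharedscripts
    postrotate
        /usr/sbin/squid -k rotate
    endscript
}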


Ubuntu :: Archive Daily Images From Webcam-Server?

May 12, 2011

I was just wondering if there is a way (and seeing as how this is Linux we are talking about, there probably is...) to make a fully functional webcam-server on Ubuntu 10.04.2 save images daily or weekly. I have webcam-server installed and running on my website: you just refresh and it gives you the current image. But I would like to keep a daily archive of those images as well, just because.
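A small cron job can do this: grab the current frame once a day and file it away under a datestamped name (URL and paths are placeholders):

Code:
#!/bin/sh
# Archive the webcam's current frame by date. Cron entry:
#   0 12 * * *  /home/user/bin/webcam-archive.sh
mkdir -p /var/www/webcam/archive
wget -q -O "/var/www/webcam/archive/cam-$(date +%Y-%m-%d).jpg" \
     http://localhost/webcam/current.jpg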


Ubuntu :: Using Server Or Desktop In Computer For Daily Work?

Aug 7, 2011

Is anyone of you using Ubuntu Server or Desktop on your PC for your daily work? Can you share how stable it is?


Server :: Logrotate Rotating Daily When Should Rotate Weekly?

Mar 18, 2010

I have an Ubuntu server, and I have a special script in /etc/logrotate.d to rotate the samba_audit logs:

/var/log/auditsamba/auditoria.log {
weekly
rotate 12
missingok

[code]....
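Two things worth checking, offered as guesses: logrotate's idea of when a log was last rotated lives in /var/lib/logrotate/status, and a stale or missing entry there can cause rotation on every daily run; alternatively, a second stanza elsewhere may match the same file and override the weekly directive. A dry run shows what logrotate intends to do:

Code:
# Show the decisions logrotate would make, without rotating anything:
logrotate -d /etc/logrotate.conf

# Check when this log was last recorded as rotated:
grep auditoria /var/lib/logrotate/status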


Server :: Cron Job That Does The Backups?

Mar 21, 2010

I have a crontab-related question which I am hoping someone can answer. I recently took over a Red Hat Enterprise 5 server, and I was told by the previous server admin that there is a cron job that does the backups. I ran the following command to get a list of all users:

Code:
grep "/home" /etc/passwd | cut -d: -f1

I then ran the following command for each of those users to see if they have any crontabs associated with them:

Code:
crontab -u USER -l

It doesn't show any crontab entries for any users (including root). But I am positive that there is a scheduled job somewhere because the backups are still running every night.
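Per-user crontabs are only one of the places cron looks; system-wide schedules live elsewhere. A sketch of where else to look for the nightly job:

Code:
# System-wide crontab (these lines carry an extra user field):
cat /etc/crontab

# Drop-in cron files installed by packages or admins:
ls -l /etc/cron.d/

# Periodic directories run via run-parts:
ls -l /etc/cron.hourly/ /etc/cron.daily/ /etc/cron.weekly/ /etc/cron.monthly/

# And anacron, if installed:
cat /etc/anacrontab 2>/dev/null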


Server :: Use FileSystem For Backups?

Jul 8, 2010

With so many filesystems available, which one should I use for backups? All I care about is reliability and stability; I don't care at all about portability.


General :: Copy Data Files From One Server To Another Online, Automatically, Daily, Using A Script?

Jun 22, 2011

How do I move data files from one Linux server to another over the Internet, automatically and daily, using a script?
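The standard recipe is rsync over ssh with a passwordless key (see the security thread further down), driven from cron; hosts and paths here are placeholders:

Code:
#!/bin/sh
# Push the day's data files to the remote server over ssh.
# Cron entry:  30 1 * * *  /usr/local/bin/push-data.sh
rsync -az -e ssh /srv/data/ backupuser@remote.example.com:/srv/data/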


General :: Use Bacula To Make Backups Of The Files Stored On The Server?

Apr 14, 2010

I have an Ubuntu (8.04.3) server where I use Bacula to back up the files stored on the server. I've been trying, with no luck, to find a way to successfully implement the following:

A backup tape for each day of the week besides Thursday, reused on a weekly basis. For the Thursday tapes we have a backup tape corresponding to the week number that the Thursday falls on, so for the first Thursday of the month it would be, for example, ThursOne. These tapes are reused on a monthly basis. We then have a monthly tape that is used on the last Thursday of the month. These tapes are reused on a yearly basis.

Another requirement is that, in case a tape is accidentally not changed, a backup should still occur regardless of what tape is in the drive (so if it's Tuesday and Monday's tape is still in the drive, it should rewrite that tape).

I did have this successfully set up, with each tape appended to after each use rather than recycled after the nightly backup. But after a few weeks I would have to manually purge tapes when they became full, which isn't ideal, as I'm not always in the office, so in my absence a backup might not take place. So I have been playing around and have now got the tapes marked as used after a maximum of 2 jobs (the backup of the files plus the catalog job each night). I also added the line 'Recycle Current Volume = yes' so that it would hopefully recycle the volume in the drive.

However, what I am finding is that the tape that should be recycled is not. In yesterday's case, Monday's tape was recycled rather than Tuesday's, although Monday's was the last one written, so I'm not even sure why it chose to recycle that tape.
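One place to look, stated tentatively: recycling is decided per pool, and when Bacula needs a volume it takes the oldest purged volume in that pool, not necessarily the tape in the drive, unless Recycle Current Volume applies and the mounted tape is actually purged. Splitting the tapes into one pool per rotation class and checking directives like these may help; a sketch of a Pool resource (directive names are from the Bacula manual, values are illustrative):

Code:
# bacula-dir.conf (sketch) -- one Pool per rotation class
Pool {
  Name = DailyPool
  Pool Type = Backup
  Recycle = yes                  # purged volumes may be reused
  AutoPrune = yes                # prune expired jobs automatically
  Volume Retention = 6 days      # eligible for recycling after this
  Maximum Volume Jobs = 2        # files job + catalog job per night
  Recycle Current Volume = yes   # prefer the tape already mounted
}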


Server :: Setting Up A Home Server (For The First Time) For Backups And File Sharing?

Jan 28, 2011

**Edit: the path for mount was incorrect.

Distro:
Server: CentOS 5.5

Clients:
Fedora(latest)
OSX(latest)

Background: I am attempting to set up a server in my house, mostly for backups and file sharing, for the first time. It is important to me that file permissions are preserved, so my understanding is that I must use idmapd for this to work. For now I'm only working with the Linux distros; OSX will be dealt with once these two work together. portmapper is up and running, along with lockd, on both machines. Firewalls are also down on both machines for now. The server side was all set up using the GUI interface with no extra options selected.

Problem: when attempting to "mount -t nfs4 10.0.0.2/$sharedfolder /mnt" I get an "operation not permitted" error, with nothing printed in /var/log/messages. If I use "mount -t nfs4 -o nolock 10.0.0.2/$sharedfolder /mnt" it mounts just fine. I've checked both machines multiple times to make sure that lockd is up and running. In the idmapd.conf file I set the domain to "localdomain" on both machines, but I doubt that is right; as I stated above, this is my first attempt at a server. I'm assuming the problem is a whole missing step that involves some kind of ID-mapping setup I need to do.
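A sketch of the usual idmapd checklist, offered as a starting point: the Domain line must be identical on both ends, and the mapping daemon must actually be running on both (on CentOS 5 and Fedora the service is called rpcidmapd):

Code:
# /etc/idmapd.conf -- the [General] Domain must match on server
# and client; it need not be a real DNS domain:
[General]
Domain = localdomain

# Restart the mapping daemon on both machines:
service rpcidmapd restart

The nolock observation points at locking rather than ID mapping, though: if mounting only works with -o nolock, it is worth confirming that portmap and nfslock are running and not firewalled on both sides.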


Security :: Secure And Automated Backups - Add Public Key To Authorized_keys File On Prod Server?

Mar 13, 2010

I'm trying to find a secure way to back up files on my Prod Server to my Backup Server. It must be automated, so I will need a cron job that logs into the Prod Server from the Backup Server and backs up the data.

1. Do you think it would be secure enough to do this by creating a passwordless RSA private key on the Backup Server and adding its public key to the authorized_keys file on the Prod Server? I can't think of another way to automate this without having to enter any passwords. Is there another, more secure way?
2. Should I create a special user for backup, which will only have read access to all files in the directory that I am backing up? If so, how can I check that this new backup user indeed has read access to ALL files in the folder I intend to back up? How can I ensure the backup process will not skip files due to some permission problem?
3. I'm thinking of using the rsnapshot tool, which uses rsync.
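On question 1, the passwordless-key approach is the standard one; the exposure can be narrowed by pinning the key to a single command on the Prod Server. A sketch (user names and paths are illustrative, and the key line is abbreviated):

Code:
# On the Backup Server: generate a key with no passphrase.
ssh-keygen -t rsa -N '' -f ~/.ssh/backup_key

# On the Prod Server, in ~backupuser/.ssh/authorized_keys, restrict
# what the key may do. rsync ships a helper script, rrsync, that
# limits the key to (read-only) rsync of one directory:
command="/usr/local/bin/rrsync -ro /data",no-pty,no-port-forwarding,no-X11-forwarding ssh-rsa AAAA... backup@backupserver

For question 2, GNU find (4.3 or later) can list anything the backup user cannot read: sudo -u backupuser find /data ! -readable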


CentOS 5 :: Daily Backup Of CentOS Server To Disk?

Oct 6, 2010

I'm looking for a simple solution to back up my CentOS server (5.x) on a daily basis to a mounted disk. I found the glastree tool, but I have no clue whether it will work on CentOS. All recommendations, tips, hints and maybe scripts are welcome. Unfortunately I'm a Linux newbie; I started with CentOS a couple of weeks ago.
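glastree isn't packaged for CentOS, but rsnapshot (available from the EPEL repository) does the same hard-linked daily trees and is widely used on CentOS 5. A sketch of the core of /etc/rsnapshot.conf; note that the fields must be TAB-separated:

Code:
# /etc/rsnapshot.conf (sketch) -- fields are TAB-separated
snapshot_root   /mnt/backupdisk/snapshots/
interval        daily   7
interval        weekly  4
backup          /etc/           localhost/
backup          /home/          localhost/

# cron entry to drive it:
# 30 3 * * *  root  /usr/bin/rsnapshot daily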


General :: Create Custom Name Server / Create Own Name Server?

Nov 4, 2009

I bought a web hosting account (cPanel) and I want to create my own name servers (ns1 and ns2.mydomain.com), so when I host add-on domains I can point them to my name servers instead of the hosting company's.







