Ubuntu :: Cron - Backup Runs To Have Incremental Number

Apr 3, 2010

I am trying to write a bash script so that backups are run by cron on the webhosting server. I want all backup runs to have an incremental number in front of them. My idea is to store the backup's incremental number in a text file on the server, so that when the next run starts it can check the number in the file. However, I am having terrible issues. My script:

[code]....
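Since the script itself was elided above, here is a minimal sketch of the counter-file idea (the paths and the tar command are assumptions, not the original):

Code:
#!/bin/bash
COUNTER_FILE="$HOME/backup_counter.txt"
BACKUP_DIR="$HOME/backups"
# Read the last run number, defaulting to 0 if the file does not exist yet.
count=$(cat "$COUNTER_FILE" 2>/dev/null || echo 0)
count=$((count + 1))
# Create the numbered archive, then persist the new counter.
tar -czf "$BACKUP_DIR/${count}-backup.tar.gz" "$HOME/public_html"
echo "$count" > "$COUNTER_FILE"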

View 7 Replies


CentOS 5 :: Backup Script With TAR - Incremental Backup With Simple FTP To Another Location And Email Status

Jan 15, 2010

After I spent some time discovering The BIG BANG of the Universe and The Meaning of Life:

I managed somehow to create a script that backs up files on the server, tars them, FTPs the archive to another FTP server, and then emails the result.

It also measures the time needed to complete, deletes archives older than XX days (set via find -mtime +20), and makes an incremental backup every weekday and a FULL backup on Sundays (which suits me because there is no heavy load then).

The files for tar to include and exclude are listed in text files, one name per line:

file: including.txt:
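(The include list and the script itself were elided above.) A minimal sketch of the overall approach described, with hostnames, credentials, and paths as placeholders:

Code:
#!/bin/bash
BACKUP_DIR=/var/backups
SNAPSHOT=$BACKUP_DIR/backup.snar
ARCHIVE=$BACKUP_DIR/backup-$(date +%Y%m%d).tar.gz
START=$(date +%s)
# Full backup on Sundays: resetting the snapshot file forces a full dump;
# on weekdays the kept snapshot makes the run incremental.
[ "$(date +%u)" -eq 7 ] && rm -f "$SNAPSHOT"
tar -czg "$SNAPSHOT" -T including.txt -X excluding.txt -f "$ARCHIVE"
# FTP the archive to the other server (curl is one portable way to do it).
curl -T "$ARCHIVE" ftp://backup.example.com/dumps/ --user ftpuser:ftppass
# Rotate old archives and email the elapsed time.
find "$BACKUP_DIR" -name 'backup-*.tar.gz' -mtime +20 -delete
echo "Backup finished in $(( $(date +%s) - START ))s" | mail -s "Backup status" admin@example.com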

View 7 Replies View Related

Ubuntu :: Rsync Like Incremental Backup

Oct 20, 2010

When rsync has finished the update (or even while it runs), I need to copy the updated files to a different location, such as a directory named with date +%Y%m%d. The reason is that, because of development, I need every modified file, not just the latest version, so I have to store them daily. But I don't want to store the whole directory, just the few files that were updated. Does that make sense?
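rsync's --backup and --backup-dir options do almost exactly this: any file that would be overwritten or deleted at the destination is saved into a dated directory instead (a sketch; the paths are assumptions):

Code:
rsync -a --backup --backup-dir="/backup/changed/$(date +%Y%m%d)" /srv/project/ /mirror/project/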

View 5 Replies View Related

Ubuntu :: How To Incremental Backup Of Folder

Mar 8, 2011

I want an incremental backup of a folder. The problem with, e.g., find and tar is that I want to back up not only files with a modification time after x:y, but also older files that were copied into the folder after the last backup.
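GNU tar's listed-incremental mode records the directory contents in a snapshot file, so files that are new to the folder are archived even when their modification times are old (a sketch; paths are assumptions):

Code:
# First run: creates folder.snar and a full archive.
# Later runs: archive anything added or changed since, regardless of mtime.
tar -czg /backup/folder.snar -f /backup/folder-$(date +%Y%m%d).tar.gz /home/user/folder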

View 6 Replies View Related

Fedora :: Incremental Automatic Backup

Jul 18, 2011

I am trying to find a backup program to incrementally back up some files to an external disk every week, for example. I would prefer not to have to write a script, as I am not really used to that.
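Tools like Back In Time or rsnapshot wrap this up with a GUI or a config file, but even without them a single crontab line using rsync's hard-linked snapshots gets weekly incrementals with no real scripting (the mount point and paths are assumptions):

Code:
# Weekly snapshot, 03:00 Sunday; unchanged files are hard links against the
# previous snapshot, so they cost almost no extra space. Percent signs must
# be escaped inside a crontab.
0 3 * * 0 rsync -a --link-dest=/media/extdisk/last /home/user/ /media/extdisk/$(date +\%Y\%m\%d)/ && ln -sfn /media/extdisk/$(date +\%Y\%m\%d) /media/extdisk/last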

View 4 Replies View Related

General :: Administrator - How To Take Incremental Backup

Mar 16, 2011

I'm trying to back up my data on RHEL, but I am not able to take a complete backup. Could anybody show me how to take incremental and full backups? What is the process?
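On RHEL the classic answer is dump/restore, which supports numbered backup levels directly (level 0 is full; a higher level captures everything changed since the last lower-level run). A sketch; the device name is an assumption, and dump operates on whole ext filesystems:

Code:
dump -0u -f /backup/home-full.dump /dev/sda3   # full backup; -u records it in /etc/dumpdates
dump -1u -f /backup/home-incr.dump /dev/sda3   # incremental since the level 0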

View 3 Replies View Related

Server :: Incremental Backup In Linux

Sep 18, 2010

A complete backup using tar consumes a lot of time, so is there any way to take incremental backups using tar? I also want to take incremental backup dumps of my databases. Any suggestions and links would be very helpful; I keep googling for this but cannot find anything exact.
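GNU tar supports incrementals directly through a snapshot file, and MySQL's usual equivalent is a periodic full mysqldump plus binary-log copies, as discussed further down this page (a sketch; paths are assumptions):

Code:
# Full: remove the snapshot file first so everything is archived.
rm -f /backup/www.snar
tar -czg /backup/www.snar -f /backup/www-full.tar.gz /var/www
# Incremental: the same command with the snapshot file kept archives
# only files added or changed since the previous run.
tar -czg /backup/www.snar -f /backup/www-incr-$(date +%Y%m%d).tar.gz /var/www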

View 14 Replies View Related

General :: Binary - Use Logs To Have Incremental Backup

May 7, 2009

I have a full backup in MySQL. Now I have added some tables, and I have new binary logs. How can I use these logs to make an incremental backup?
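The usual pattern is to flush the logs so the current one is closed, copy the closed logs as the incremental piece, and replay them on top of the full dump at restore time (a sketch; file names and paths are assumptions):

Code:
# Incremental: rotate and copy the closed binary logs.
mysqladmin -u root -p flush-logs
cp /var/lib/mysql/mysql-bin.00000* /backup/binlogs/
# Restore: load the full dump, then replay the logs over it.
mysql -u root -p < full_backup.sql
mysqlbinlog /backup/binlogs/mysql-bin.000001 /backup/binlogs/mysql-bin.000002 | mysql -u root -p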

View 3 Replies View Related

Debian :: Incremental Encyrpted Local Backup Utility?

Sep 17, 2010

I want to back up data and upload it to online hosting services. Since I'm uploading the data, I only want to upload encrypted data (so that even the hosting service admins cannot look at it). Thus, I first want to encrypt locally the data that I want to back up. Since I will be making changes to the data locally, I want some sort of incremental imaging system where the incremental changes are stored in separate files, so that I only have to upload the incremental encrypted changes.

Duplicity is an option, but it uses GPG, which makes it a bit complicated; I was wondering if there is a simpler alternative, as I am only doing the encryption and backup locally.
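One GPG-free pattern along these lines keeps tar's incremental snapshot logic and encrypts each archive symmetrically with openssl (a sketch; paths are assumptions, -pbkdf2 requires OpenSSL 1.1.1+, and the passphrase file should be chmod 600):

Code:
tar -czg /backup/data.snar -f - /home/user/data \
  | openssl enc -aes-256-cbc -pbkdf2 -salt -pass file:/root/.backup_pass \
  > /backup/data-$(date +%Y%m%d).tar.gz.enc
# decrypt later with:
#   openssl enc -d -aes-256-cbc -pbkdf2 -pass file:/root/.backup_pass \
#     < data-YYYYMMDD.tar.gz.enc | tar -xzg /dev/null -f -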

View 1 Replies View Related

Ubuntu Servers :: MySQL Incremental Backup - Binary Log File Names

Mar 15, 2011

I am currently using a script to back up my Ubuntu 10.04.1 system. The MySQL databases are backed up separately from the system / data.

My problem is with the MySQL incremental / binary log backups.

The problem is that the binary log file(s) are always named xxxx-bin.1.

Up to about a month ago the binary logs were named xxxx-bin.000001, xxxx-bin.000002, etc.

I did make some changes at about the time this change in file naming occurred, but I cannot identify what setting, if any, I may have changed that causes all of the binary log files to always have the same name.

My backup script uses both mysqldump and mysqladmin flush-logs to create the binary logs.

All of the settings for mysqldump and mysqladmin are contained in the my.cnf file.

The relevant contents of the my.cnf file are as follows:

Code:

The statements in the backup script that do the backup are:

mysqladmin flush-logs

or

mysqldump | gzip > $DB_BACKUP_DIR/$ARCHIVE_FILE #Note: delete-master-logs in my.cnf
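Since the my.cnf contents were elided above, a quick read-only check of what the server is actually configured to write can help narrow this down (credentials are placeholders), and the output can then be compared against the old my.cnf:

Code:
mysql -u root -p -e "SHOW BINARY LOGS; SHOW VARIABLES LIKE 'log_bin%'"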

View 3 Replies View Related

Software :: Incremental Encrypted Local Backup Utility Alternative To Duplicity?

Aug 10, 2010

I want to back up data and upload it to online hosting services. I first want to encrypt locally the data that I want to back up. Since I will be making changes to the data locally, I want some sort of incremental imaging system where the incremental changes are stored in separate files, so that I only have to upload the incremental encrypted changes. Duplicity is an option, but it uses GPG, which makes it a bit complicated; I was wondering if there is a simpler alternative, as I am only doing the encryption and backup locally.

EDIT: I have only ONE computer, on which the data resides and on which the backup image is made. That is, I have a directory foo on my computer, which will be backed up to back-foo on the same computer. I want back-foo to be in an encrypted form. Then back-foo will be uploaded as-is to Microsoft Live storage, SpiderOak storage, etc. Since back-foo is encrypted, my upload is secure. And since I'm uploading, I want incremental backup support; that is, the backup utility should create new files containing the incremental changes so that I can upload only those new files.

View 2 Replies View Related

Ubuntu :: Cron Command: Runs From CLI, But Not Auto?

Mar 10, 2010

I have a script called bkup_all.sh that is basically a series of rsync commands. It is located in the /root/ directory. From the command line (su'd as root), I can run the script like this:

/root/bkup_all.sh > /var/log/bkup/bkup_$(date +%Y%m%d).log

This executes perfectly, and all the rsync and script output is saved in a log file in the intended destination. However, I want this command to run automatically, so again, su'd as root, I enter:

crontab -e

then enter the following:

00 02 * * * /root/bkup_all.sh > /var/log/bkup/bkup_$(date +%Y%m%d).log

I want the script to run each night at 2:00 am. But the script does not run: no log file is generated, and I see nothing in the syslog or system messages to indicate an error.
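The unescaped percent signs are the likely culprit: in a crontab, % ends the command (everything after it is passed to the job as stdin), so the date format string is never seen. Escaping them fixes the entry:

Code:
00 02 * * * /root/bkup_all.sh > /var/log/bkup/bkup_$(date +\%Y\%m\%d).log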

View 2 Replies View Related

Ubuntu :: Surpress Logging When Cron Job Runs?

Dec 24, 2010

I have a cron job that I'm running once per minute, and I don't want /var/log/crond.log to be updated 60 times per hour. How can I suppress the logging of the job? I've tried adding the following to the cron line, but the runs get logged right along with it:

Code:
*/1 * * * * /path/to/script > /dev/null
or

[code]....
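Redirecting the job's output will never help here: the lines in the cron log are written by the cron daemon itself via syslog when it starts the job, not by the job. On Debian/Ubuntu, one hedged option is lowering the daemon's own log level (this assumes the Debian cron package, which reads EXTRA_OPTS from /etc/default/cron; see man cron for the -L values):

Code:
# /etc/default/cron
EXTRA_OPTS="-L 0"    # 0 = do not log the start of jobs

followed by a daemon restart (sudo service cron restart). Filtering the messages out in the syslog configuration is the other common approach.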

View 1 Replies View Related

General :: Cron Job Runs Every 30 Secs?

Jul 24, 2011

The problem is that I need the PHP program to send the member an email confirmation, which contains a confirmation link. Running it every minute may still make the member wait, so I would like it to run every 20 or 30 seconds.

I don't want to put the email-sending code on my sign-up page, as that's no good.

But I don't want to put a 30-second sleep in my PHP script and loop, because if it fails in the middle it may wait a while before starting again.

What can be done to achieve my goal, and what's the best way?

Making the PHP script run as a daemon process? Is that possible and okay?
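cron's resolution is one minute, so the common workaround is to list the job twice, with the second copy delayed by 30 seconds (the script path is a placeholder):

Code:
* * * * * /usr/bin/php /path/to/send_confirmations.php
* * * * * sleep 30 && /usr/bin/php /path/to/send_confirmations.php

A PHP daemon that polls its own queue is also workable, but it then needs its own supervision (restart on crash), which these two lines avoid.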

View 9 Replies View Related

Ubuntu :: Cron Runs Shell Script - But Only First Line Of It?

May 18, 2010

I have a cron job that runs a shell script, but it only runs the first line of that shell script and not the rest of the file. I'm a little stumped as to why. If I run the shell script manually, it executes every single line as it should. Do I need some additional syntax to make this run correctly?

Here is the crontab ...

Code:
root@kchlinux:~/macs# crontab -l
# m h dom mon dow command

[code]....
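Since the script body was elided, a general note: when only part of a script runs under cron, the usual suspects are a missing shebang, commands that rely on a PATH cron doesn't provide, or DOS line endings. A defensive skeleton (the rsync line is a placeholder, not the original):

Code:
#!/bin/bash
# cron's environment is minimal: set PATH and use absolute paths.
export PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
set -e   # abort loudly on the first failing command
/usr/bin/rsync -a /root/macs/src/ /root/macs/dest/

Checking for CRLF line endings with file script.sh or cat -A script.sh is also worth a minute.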

View 2 Replies View Related

CentOS 5 :: Cron Job Runs At Incorrect Time - Clocks Seem Ok

Mar 9, 2010

Within a VMware ESX virtual machine, I am running CentOS 5.2. (Actually, it is kind of a virtual appliance to run CollabNet's Teamforge, which I have installed for a trial.) I've been dabbling with Linux for a year or so, but I know I have much to learn.

I'm attempting to run a cron job that runs a backup script at 11 pm. It works great, except that it unfortunately runs at 11:30 am.

I created the cron job using 'crontab -e', while logged in as root. My cron job line is : 0 23 * * 1,2,3,4,5 /etc/tjt_backup/collabnet_backup.sh

If I type 'date', I get the correct date/time in my timezone: Tue Mar 9 16:27:12 CST 2010

If I type 'clock', I also get the correct date/time: Tue 09 Mar 2010 04:26:57 PM CST -0.463330 seconds

(Although, it appears there is a little drift)
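One CentOS-specific thing worth checking: if the timezone or clock settings were changed after crond started (appliance images often ship with UTC defaults), the running daemon can keep the old zone until restarted. Read-only checks plus a restart (standard CentOS 5 commands):

Code:
date                           # what the shell thinks
cat /etc/sysconfig/clock       # what the system is configured for
service crond restart          # make crond re-read the timezone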

View 10 Replies View Related

Ubuntu :: Script Fails In Cron But Runs Fine In Regular Shell?

Jul 11, 2011

I have an Ubuntu server running Couch Potato, Sick Beard and Sabnzbdplus. Everything "works" pretty well, in the sense that CP and SB push the NZBs to Sabnzbdplus, but Sab crashes regularly (I haven't found the cause of or solution to this problem, so advice on that is welcome too). To counter the crashes, I have a script that checks whether Sab is running and starts it if it isn't:

Code:
bart@Pyro:~$ cat CheckSabRunning.sh
#!/bin/sh

[code]....
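Since the script was cut off above, a minimal sketch of such a watchdog that also survives cron's reduced environment (the daemon path and flag are assumptions based on the Debian/Ubuntu sabnzbdplus package):

Code:
#!/bin/sh
# CheckSabRunning.sh: restart sabnzbdplus if it is not running.
# Absolute paths everywhere, because cron's PATH is minimal.
if ! /usr/bin/pgrep -f sabnzbdplus > /dev/null; then
    /usr/bin/sabnzbdplus --daemon
fi

If it works in a shell but not in cron, comparing env output from both contexts usually shows which variable (PATH, HOME) the script silently depends on.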

View 9 Replies View Related

General :: Rsync Runs Back Up But Process Is Already Running The Same Backup

Sep 15, 2010

I have a cron job that uses rsync to back up to a remote directory every hour. How do I set this up so that if the previous rsync is still running, the same rsync command is not started again, and it checks again the next hour, until the earlier process has finished or been terminated?
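flock(1) is the standard answer: it takes a lock before running the command and, with -n, simply skips the run when the previous hour's rsync still holds the lock (a sketch; paths are placeholders):

Code:
0 * * * * /usr/bin/flock -n /var/lock/rsync-backup.lock rsync -a /data/ user@remote:/backup/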

View 5 Replies View Related

Ubuntu :: Simple Cron Job Backup ?

May 4, 2010

I am very sorry if this has been asked before... I'm sure it has... but I have searched all over the net looking for an answer and I still can't find it.

I have a really simple cron job script like this:

When I run this manually it works fine, but when I run it as a cron task from my ROOT user in Plesk, it always creates a file that is just 45 bytes. Why doesn't it work? I am running it as the root user, so surely I must have permission to access the file?
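A 45-byte result usually means the file contains an error message rather than data. Capturing stderr from the cron run will show it (a sketch; the script path is a placeholder):

Code:
0 4 * * * /root/backup.sh >> /var/log/backup-debug.log 2>&1

Also note that cron treats % specially, so if the elided script line uses date +%Y%m%d directly in the crontab, the percent signs need backslashes.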

View 7 Replies View Related

Ubuntu :: Setup A Simple Backup Cron Job

Jun 10, 2010

I'm trying to set up a simple backup script with cron.

In "crontab -e" (and "sudo crontab -e"; I tried both) I enter "0 22 * * * /home/USERNAME/.backup.sh", hoping it will run the script at 10 pm each day. The script works fine if I run it in a terminal. Why won't it run from cron? It's bound to be something obvious...

View 5 Replies View Related

Programming :: Incremental Backup Using DUMP Command - Error "DUMP: Only Level 0 Dumps Are Allowed On A Subdirectory"

Sep 6, 2010

I have used the dump command to back up the application files. For a full backup, level 0 works fine. For an incremental backup I used level 1 or 2, and it fails with the error:

DUMP: Only level 0 dumps are allowed on a subdirectory
DUMP: The ENTIRE dump is aborted.

The code I used
===============================
#!/bin/bash
#Full Day Backup Script
#application folders backup
#test is the username
now=$(date +"%d-%m-%Y")
[Code]...
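That error is by design: dump supports incremental levels only on whole filesystems; a subdirectory can only ever get a level 0. If the application folder cannot be given its own filesystem, GNU tar's listed-incremental mode is the usual substitute (a sketch reusing the post's date format; the paths are assumptions):

Code:
now=$(date +"%d-%m-%Y")
# Keep the snapshot file between runs; delete it to force a full backup.
tar -czg /backup/app.snar -f /backup/app-$now.tar.gz /home/test/app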

View 2 Replies View Related

General :: DB Backup With Cron From Sh Script?

Jun 8, 2009

Maybe this is a MySQL question, maybe not...

I've written a shell script to back up a database.

But when I run it, it prompts for a password even though the script provides it. If I'm doing this manually, that's not a problem, but I want to make a cron job do it...

Here's the script: Quote: #!/bin/bash
set -xv
#First let's rotate the backup files...
/bin/mv /home/cabazio/someDB-3.tar.gz /home/some/someDB-4.tar.gz
/bin/mv /home/some/someDB-2.tar.gz /home/some/someDB-3.tar.gz

[Code]....
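mysqldump prompts whenever -p is separated from the password by a space (or the password is omitted), regardless of what the script intends. Either attach it directly or, better, keep it out of the script in a protected option file (names are placeholders):

Code:
# Option A: no space between -p and the password.
mysqldump -u backupuser -pSECRET someDB | gzip > /home/some/someDB-1.sql.gz
# Option B: put user= and password= under [mysqldump] in ~/.my.cnf
# (chmod 600), then no credentials appear in the script at all:
mysqldump someDB | gzip > /home/some/someDB-1.sql.gz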

View 2 Replies View Related

General :: Cron Backup Job Fails To Run?

Jan 5, 2011

I'm having a small issue where the backup jobs that I set to run in the crontab of the backup user do not appear to be running. Here's how I set it up (with crontab -e as the backup user):

run amanda every night (check at 2:45 and back up at 3:00)

[code]...
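When a non-root user's crontab never seems to fire, these read-only checks usually narrow it down (the log location varies by distro):

Code:
crontab -l -u backup                  # are the entries actually installed?
grep CRON /var/log/syslog             # or /var/log/cron: did cron attempt the run?
ls -l /etc/cron.allow /etc/cron.deny  # is the backup user allowed to use cron?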

View 5 Replies View Related

Ubuntu Servers :: Cron Job Backup - External HD No Longer Mounted

Dec 23, 2010

I'm running a cron job every night to dump a MySQL database to an external hard drive. It works; however, when I check on it the following morning, the external drive is no longer mounted and the XFS log file is corrupted. If I run

Code:
xfs_repair -L /dev/sdf1

it works, but then I get this issue:

Code:
XFS: Filesystem sdf1 has duplicate UUID - can't mount

I can reset the UUID, but it's difficult to have to do this every day.
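As a stopgap, XFS accepts a mount option that skips the UUID check, so the nightly job can keep running while the real problem (the drive apparently disconnecting or the log being dirtied mid-write) is chased down:

Code:
mount -o nouuid /dev/sdf1 /mnt/backup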

View 2 Replies View Related

General :: Cron Job To Create And Move Backup From Ubuntu To Windows?

Jun 15, 2010

This is Kishore. I am new to Ubuntu and SVN, and I would like help creating a cron job for my SVN backup every day at 10:30 pm. I already created a cron job that looks like:

30 10 * * * svnadmin dump /home/administrator/svnrepository >svn1

When I run the command directly I get a whole backup, 3.6 GB in size, but when it runs through cron the backup is only 9 MB. So my requests are: 1. a cron job that takes a complete SVN backup at 10:30 pm daily, and 2. a cron job that copies the SVN backup to the D drive of my Windows system every day at 11:30 pm.
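Note that 30 10 * * * fires at 10:30 am; 10:30 pm is 30 22 * * *. The size difference is typically cron's environment: a relative redirect like >svn1 lands in an unexpected working directory. A sketch with absolute paths, plus a second entry for the Windows copy (the CIFS mount point is an assumption):

Code:
30 22 * * * /usr/bin/svnadmin dump /home/administrator/svnrepository > /home/administrator/backups/svn1 2>> /var/log/svnbackup.err
30 23 * * * /bin/cp /home/administrator/backups/svn1 /mnt/win_d/svn_backups/

The second line assumes the Windows D drive is already mounted at /mnt/win_d (e.g. via a cifs entry in /etc/fstab).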

View 1 Replies View Related

Server :: Directory Backup Cron Script?

Oct 24, 2009

What's a good cron script for backing up and zipping a directory of files, or multiple directories with files, to a backup directory on my server on a daily basis?

I found an easy-to-use MySQL backup script; now I need to back up my site directory, but not all the directories in it. So the script needs a way to omit certain directories from the backup, i.e. directories that contain gigabytes worth of files.

This seems like it should be one of the most common cron jobs to set a server up with, but two pages deep into Google (and here) I have yet to find anything remotely resembling a solution.
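tar's --exclude flags cover the "omit the gigabyte directories" requirement (a sketch; the site path and the excluded names are placeholders):

Code:
#!/bin/bash
# Daily zipped site backup, skipping the heavyweight directories.
# Exclude patterns are matched against the archived names (site/...).
tar -czf /backup/site-$(date +%Y%m%d).tar.gz \
    --exclude='site/uploads' \
    --exclude='site/cache' \
    -C /var/www site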

View 9 Replies View Related

Debian Configuration :: Can't Get Cron-fired Backup Scripts To Run

Oct 15, 2010

Due to a disk crash I've had to rebuild my Debian Lenny system. For some reason I can't get my cron-fired backup scripts to run, although they run fine manually.

It looks like crond is not running. If I try to start it, here's what I get:

Pancho:/home/lloyd# /etc/init.d/cron start
Starting periodic command scheduler: crond failed!

MORE INFO:

lloyd@Pancho:~$ /etc/init.d/cron start
Starting periodic command scheduler: crond/etc/init.d/cron: line 54: start-stop-daemon: command not found
failed!

[Code].....

Clearly the failure to find start-stop-daemon is not the real problem (the non-root run simply lacks /sbin in its PATH), and I'm still in the dark.
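Since start-stop-daemon belongs to dpkg itself, it is worth confirming the rebuild left it in place, and whether crond is running at all (standard Debian checks):

Code:
ls -l /sbin/start-stop-daemon      # does the binary exist?
dpkg -S /sbin/start-stop-daemon    # which package owns it (should be dpkg)
ps aux | grep '[c]ron'             # is the daemon actually running?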

View 2 Replies View Related

Server :: Cron Backup Failing / Need To Browse Tape

Feb 9, 2010

We have a server that runs a backup (cron job) at 9:15 every night. When I log on in the morning, I have a mail message with a long list of all the files that were backed up the night before. For a couple of weeks now, the mail message has given me an empty list. Yet when I run the same job manually from a # prompt, it runs. I can't run this job with cron in the daytime because too many users are on the system. I wanted to browse the tape to see whether the backup is really failing to copy the files, or whether they are on the tape and the mail message is bogus.

Since the backup was done with cpio instead of tar, I'm not sure I can browse the tape with restore -i anyway. What would be the best way to browse the tape on /dev/rmt/1 without actually restoring anything? This is an ancient DG/UX system, not Linux, and I'm not a Unix expert; I just inherited this server recently. But a lot of things are very similar to Linux, and this looked like a good place to ask.
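cpio can list an archive without extracting anything, which answers the browse question directly (read-only; the device name is from the post):

Code:
cpio -itv < /dev/rmt/1   # -i read mode, -t list only, -v long listing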

View 3 Replies View Related

Server :: MySQL Backup Cron Job Not Executing Correctly?

Mar 2, 2011

I own a CentOS 5 VPS. I typed crontab -e, and then added the following line to automatically back up MySQL:

0 * * * * mysqldump -u root -p password --all-databases | gzip > /home/dbbackup/database_`date '+%m-%d-%Y_%H'`.sql.gz

When I go in and look, it doesn't place any files in /home/dbbackup. When I run

mysqldump -u root -p password --all-databases | gzip > /home/dbbackup/database_`date '+%m-%d-%Y_%H'`.sql.gz
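Two separate bugs here: cron truncates the command at the first unescaped % (everything after it becomes the job's stdin), and -p followed by a space makes mysqldump prompt instead of reading the password. A corrected entry (the password is shown inline only for illustration; an option file is safer):

Code:
0 * * * * mysqldump -u root -pPASSWORD --all-databases | gzip > /home/dbbackup/database_`date '+\%m-\%d-\%Y_\%H'`.sql.gz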

View 3 Replies View Related

Fedora :: Cron Backup Permission Denied On External Drive?

Aug 28, 2010

I have a cron backup scheme in which I rsync, then tar, then copy files from my internal hard drives to an external (USB) drive. When it works, it works. But I often get a "Permission denied" message for all of these tasks. I looked at how the external drive is auto-mounted and edited /etc/fstab so that the owner of the cron job is also the owner of the external drive (I think; unfortunately I'm not at that machine right now, as it's at work, so I can't give the exact fstab line; I will post it in this thread next time I am at the machine). BUT I still get times when the cron backup runs fine and other times when I get the Permission denied. This is a shared machine that is dual-booted, so what I *think* is going on is that when the machine is rebooted into Fedora but nobody logs in, I get a Permission denied for the cron backup. On days when someone has logged in as the main user and left without logging out, the cron backup seems to run fine.
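If the drive has an /etc/fstab entry it can be mounted with no desktop session involved (GNOME-style automounting only happens at login, which would match the observed pattern). A guard at the top of the cron script makes the failure loud instead of intermittent (the mount point is a placeholder):

Code:
#!/bin/bash
MOUNT=/media/backupdrive
# Mount if needed (requires an fstab entry), then verify before writing,
# so the job never silently writes into the bare mount-point directory.
mountpoint -q "$MOUNT" || mount "$MOUNT"
if ! mountpoint -q "$MOUNT"; then
    echo "backup: $MOUNT is not mounted" | mail -s "backup failed" root
    exit 1
fi
rsync -a /home/ "$MOUNT/home-backup/"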

View 8 Replies View Related






