Server :: Directory Backup Cron Script?
Oct 24, 2009
What's a good cron script for backing up and zipping a directory of files, or multiple directories with files, to a backup directory on my server, on a daily basis? I found an easy-to-use MySQL backup script; now I need to back up my site directory, but not all the directories in it. So I need a way for the script to omit certain directories from the backup, i.e. directories that contain gigabytes worth of files. This seems like it should be one of the most common cron jobs to set a server up with, but two pages deep into Google (and here) I have yet to find anything remotely resembling a solution.
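A minimal sketch of such a script, assuming GNU tar and placeholder paths (the site living in /var/www/mysite with the bulky directories uploads/ and cache/ - adjust all of these):
Code:
#!/bin/bash
# daily-site-backup.sh - example paths throughout, adjust to your layout
DEST=/backup/site
DATE=$(date +%F)
# -C sets the working directory, so the --exclude patterns are matched
# against the relative paths that end up inside the archive
tar czf "$DEST/site-$DATE.tar.gz" \
    -C /var/www \
    --exclude='mysite/uploads' \
    --exclude='mysite/cache' \
    mysite
A matching crontab entry would be something like 0 2 * * * /usr/local/bin/daily-site-backup.sh.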
View 9 Replies
Feb 9, 2010
We have a server that runs a backup (cron job) at 9:15 every night. When I log on in the morning I have a mail message that gives me a long list of all the files that were backed up the night before. For a couple of weeks now, the mail message has given me an empty list. Yet when I run the same job manually from a # prompt, it runs. I am not able to run this job with cron in the daytime because too many users are in the system. I want to browse the tape to see if the backup is really failing to copy the files, or if they are on the tape and the mail message is bogus.
Since the backup was done with cpio instead of tar, I'm not sure I can browse the tape with restore -i anyway. What would be the best way to browse the tape on /dev/rmt/1 without actually restoring anything? This is an ancient DG/UX system, not Linux, and I'm not a Unix expert; I just inherited this server recently. But a lot of things are very similar to Linux, so this looked like a good place to ask.
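If the tape was written with cpio, cpio itself can list the contents without extracting anything; something along these lines should work, though the options on an old DG/UX cpio may differ slightly from the GNU version:
Code:
# rewind the tape first, then list the archive contents
# -i = read archive from stdin, -t = table of contents only, -v = long listing
mt -f /dev/rmt/1 rewind
cpio -itv < /dev/rmt/1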
View 3 Replies
View Related
Mar 2, 2011
I own a CentOS 5 VPS. I typed crontab -e and added the following line to have my server back up MySQL automatically:
0 * * * * mysqldump -u root -p password --all-databases | gzip > /home/dbbackup/database_`date '+%m-%d-%Y_%H'`.sql.gz
When I go in and look, it doesn't place any files in /home/dbbackup. Yet when I run the same command manually,
mysqldump -u root -p password --all-databases | gzip > /home/dbbackup/database_`date '+%m-%d-%Y_%H'`.sql.gz
it works fine.
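Two things commonly bite here. In a crontab, % is a special character (everything after the first unescaped % is fed to the command's stdin), so the date format silently truncates the entry; each % needs a backslash. And -p followed by a space makes mysqldump prompt interactively for a password, which never arrives under cron; the password must be glued to -p. A corrected entry would look like this (PASSWORD standing in for the real one):
Code:
0 * * * * mysqldump -u root -pPASSWORD --all-databases | gzip > /home/dbbackup/database_`date '+\%m-\%d-\%Y_\%H'`.sql.gz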
View 3 Replies
View Related
Jul 20, 2010
This should be a quick one. I'm trying to back up a single directory and its subdirectories on my Lucid server to a FreeNAS box across my network. This is what I'm using to do that:
rsync -r -a -v -z * --delete freenas:dSIBackups
It almost works perfectly, except for one problem: when a file is deleted at the source, this command doesn't delete it on the receiving end. I assumed that --delete would do that, but apparently not. Can anyone think of a reason this would happen?
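One plausible explanation: with a shell glob as the source, a file deleted at the source no longer matches *, so rsync never hears about it and --delete has nothing to compare against. Syncing the directory itself (run from inside it) should behave as expected:
Code:
# './' means 'the contents of this directory', deletions included
rsync -avz --delete ./ freenas:dSIBackups/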
View 3 Replies
View Related
May 18, 2010
I'm using Back In Time to back up my home directory to a second HDD mounted at /media/backup. The trouble is, I can do this with Back In Time (Root), but not with Back In Time without the root option. This is definitely a permissions issue: it can't write to the folder. But when I checked by right-clicking on the backup directory and looking at the permissions tab, it said I was the owner.
View 2 Replies
View Related
Nov 10, 2010
This script simply deletes files older than a certain age (in this case 7 days) from a certain location; I use it to purge old backups nightly, and it works as expected:
# delete backups older than 7 days
find /mnt/backup/* -mtime +7 -exec rm -Rf {} \;
The problem is, every morning I get an email with an error message something like this:
find: `/mnt/backup/subfolder': No such file or directory
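The error is most likely find trying to descend into a subfolder that rm has already removed. Operating on the directory itself and telling find not to recurse avoids it; a sketch:
Code:
# -mindepth 1 spares /mnt/backup itself; -maxdepth 1 stops find from
# descending into directories that -exec rm has just deleted
find /mnt/backup/ -mindepth 1 -maxdepth 1 -mtime +7 -exec rm -Rf {} \;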
View 2 Replies
View Related
Jun 8, 2009
Maybe this is a MySQL question, maybe not...
I've written a shell script to back up a database.
But when I run it, it prompts for a password even though the script provides one. If I'm doing this manually it's not a problem, but I want to make a cron job to do it...
Here's the script: Quote: #!/bin/bash
set -xv
#First let's rotate the backup files...
/bin/mv /home/some/someDB-3.tar.gz /home/some/someDB-4.tar.gz
/bin/mv /home/some/someDB-2.tar.gz /home/some/someDB-3.tar.gz
[Code]....
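The prompt usually means mysqldump was given -p followed by a space, so it treats the next word as a database name and asks for the password interactively. One clean fix is a credentials file in the home directory of the user the cron job runs as, so no password appears in the script at all ('backupuser' is a placeholder):
Code:
# ~/.my.cnf  (chmod 600 so only the owner can read it)
[client]
user=backupuser
password=YOUR_PASSWORD
With that in place, the script can call mysqldump with no -p option at all.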
View 2 Replies
View Related
May 4, 2010
I am very sorry if this has been asked before... I'm sure it has... but I have searched all over the net looking for an answer and I still can't find it.
I have a really simple cron job script like this:
When I run this manually it works fine, but when I run it from my root user in Plesk as a cron task it always creates a file that is just 45 bytes. Why doesn't it work? I am running it as the root user, so surely I must have permission to access the file?
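A 45-byte file is suspiciously close to the size of a one-line error message, so it is worth opening it. A common cause is cron's minimal PATH: commands that resolve fine in an interactive shell are not found under cron. Hedging on the actual script name, a more debuggable entry looks like:
Code:
# absolute paths everywhere, and capture stderr so errors are visible
0 3 * * * /bin/bash /root/backup.sh >/tmp/backup.log 2>&1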
View 7 Replies
View Related
Jan 5, 2011
I'm having a small issue where the backup jobs that I set to run in the crontab of the backup user do not appear to be running. Here's how I set it up (with crontab -e as the backup user):
run amanda every night (check at 2:45 and backup at 3)
[code]...
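For reference, a typical pair of entries for that schedule, assuming the stock binary locations and a configuration named DailySet1 (substitute your own config name):
Code:
# run amanda every night (check at 2:45 and backup at 3)
45 2 * * * /usr/sbin/amcheck -m DailySet1
0 3 * * * /usr/sbin/amdump DailySet1
It is also worth confirming that user crontabs are actually being picked up, e.g. with grep CRON /var/log/syslog (or /var/log/cron, depending on the distribution).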
View 5 Replies
View Related
Jun 10, 2010
I'm trying to set up a simple backup script with cron.
In "crontab -e" (and sudo crontab-e - I tried both) I enter "0 22 * * * /home/USERNAME/.backup.sh", with the hope that it will run the script at 10pm each day. The srcipt work fine if I run in a terminal. why it won't work? It's bound to be something obvious....
View 5 Replies
View Related
Oct 15, 2010
Due to a disk crash I've had to rebuild my Debian Lenny system. For some reason I can't get my cron-fired backup scripts to run. They will run manually.
It looks like crond is not running. If I try to start it, here's what I get:
Pancho:/home/lloyd# /etc/init.d/cron start
Starting periodic command scheduler: crond failed!
MORE INFO:
lloyd@Pancho:~$ /etc/init.d/cron start
Starting periodic command scheduler: crond/etc/init.d/cron: line 54: start-stop-daemon: command not found
failed!
[Code].....
Clearly the failure to find start-stop-daemon is not the whole problem, since it fails as root too, and I'm still in the dark.
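On Debian, start-stop-daemon is shipped by the dpkg package, so if the rebuild lost it, reinstalling dpkg should bring it back; and if a half-finished bootstrap left a dummy in place, the real binary is sometimes parked alongside it as start-stop-daemon.REAL. A quick check, assuming a root shell:
Code:
ls -l /sbin/start-stop-daemon*
apt-get install --reinstall dpkg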
View 2 Replies
View Related
Apr 3, 2010
I am trying to write a bash script so that backups can be run by cron on the web hosting server. I want every backup run to have an incremental number in front of it. My idea is to store the backup's incremental number in a text file on the server, so that the next run can check the number in the file. However, I am having terrible issues. My script:
[code]....
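For what it's worth, a minimal sketch of a persistent run counter (file locations are examples):
Code:
#!/bin/bash
COUNTER_FILE=/home/user/backup/.counter
# read the previous run number, defaulting to 0 on the first run
N=$(cat "$COUNTER_FILE" 2>/dev/null || echo 0)
N=$((N + 1))
echo "$N" > "$COUNTER_FILE"
# prefix the archive name with the incremented number
tar czf "/home/user/backup/${N}-site.tar.gz" /var/www/site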
View 7 Replies
View Related
Aug 28, 2010
I have a cron backup scheme in which I rsync, then tar, then copy files on my internal hard drives to an external (USB) drive. When it works, it works. But I often get a "Permission Denied" message for all of these tasks. I looked at how the external drive is auto-mounted and edited /etc/fstab so that the owner of the cron job is also the owner of the external drive (I think; unfortunately I'm not at that machine right now (it's at work), so I can't give the exact fstab line - I will post it as an update to this thread next time I am at the machine). BUT, I still get times when the cron backup runs fine and other times when I get the Permission Denied. This is a shared machine that is dual-booted, so what I *think* is going on is that when the machine is rebooted into Fedora but nobody logs in, I get a Permission Denied for the cron backup. On days when someone has logged in as the main user and left without logging out, the cron backup seems to run fine.
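A note that may apply, depending on the drive's filesystem: desktop automounters mount a drive on behalf of whoever logs in, which would explain the login-dependent behaviour. FAT/NTFS drives take their ownership from mount options in /etc/fstab, while ext2/3/4 drives ignore those options and need the mount point chowned once while mounted (device name and user below are placeholders):
Code:
# vfat/ntfs: ownership comes from the mount options, e.g. in /etc/fstab:
# /dev/sdb1  /mnt/usbdrive  vfat  uid=1000,gid=1000,umask=022  0  0
# ext2/3/4: uid= is ignored, so set ownership on the filesystem instead:
chown -R backupuser:backupuser /mnt/usbdrive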
View 8 Replies
View Related
Jan 4, 2010
I installed a second HD and formatted it to ext4. I gave it the "/backup" label. I am trying to figure out how to mount it so that I can run cron to back up my home folder onto it once a week. This is what the fstab looks like now:
[code]...
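Since the drive already carries the /backup label, mounting by label keeps the fstab line stable even if the device name changes; a sketch, with a weekly rsync to match (the username is a placeholder):
Code:
# /etc/fstab: mount the labelled ext4 drive at /backup
LABEL=/backup  /backup  ext4  defaults  0  2
# crontab: mirror the home folder onto it every Sunday at 03:00
0 3 * * 0 rsync -a --delete /home/user/ /backup/home/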
View 9 Replies
View Related
Dec 23, 2010
I'm running a cron job every night to dump a MySQL database to an external hard drive. It works; however, when I check on it the following morning, the external drive is no longer mounted and the XFS log file is corrupted. If I run
Code:
xfs_repair -L /dev/sdf1
It works, but then I get these issues:
Code:
XFS: Filesystem sdf1 has duplicate UUID - can't mount
I can reset the UUID, but it's difficult to have to do this every day.
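Two ways around the UUID complaint, assuming standard xfsprogs: mount with the check disabled, or write a fresh UUID once and be done with it. (The underlying problem is still the drive disappearing mid-write, which is what keeps corrupting the log.)
Code:
# one-off: mount ignoring the UUID check
mount -o nouuid /dev/sdf1 /mnt/external
# permanent: generate a new UUID for the filesystem
xfs_admin -U generate /dev/sdf1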
View 2 Replies
View Related
Jun 15, 2010
This is Kishore and I am new to Ubuntu and SVN. Please can someone help me create a cron job for my SVN backup every day at 10:30 PM. I already created a cron job that looks like this: 30 10 * * * svnadmin dump /home/administrator/svnrepository >svn1. When I run the command directly I get the whole backup, and its size is 3.6 GB, but when it runs through the cron job the backup is only 9 MB. So my requests are: 1. a cron job for taking a complete SVN backup at 10:30 PM daily, and 2. a cron job to copy the SVN backup onto the D drive of my Windows system every day at 11:30 PM.
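Two details stand out: 30 10 * * * fires at 10:30 AM (cron uses a 24-hour clock, so 10:30 PM is 30 22), and under cron the dump is written with cron's minimal environment, so absolute paths and an error log help explain the truncated output. A sketch, with placeholder paths and assuming the Windows D: drive is shared and mounted somewhere like /mnt/windows_d via mount.cifs:
Code:
# 1) full dump at 22:30, absolute paths, errors logged
30 22 * * * /usr/bin/svnadmin dump /home/administrator/svnrepository > /home/administrator/svn1 2> /home/administrator/svn1.err
# 2) copy the dump onto the mounted Windows share at 23:30
30 23 * * * /bin/cp /home/administrator/svn1 /mnt/windows_d/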
View 1 Replies
View Related
Aug 14, 2011
I would like to back up important files (totaling about 400 GB) on my ext4 RAID 5 array to an ext4 external hard drive over USB (the external drive is mounted at /mnt). In the future I'd like to automate the process using rsync and cron, so for now I'm using rsync to transfer the files. My problem is that when I use rsync like this: # rsync -Pr "/dir1" "/dir2" "/dir3" "/dir4" /mnt
rsync shows me the checks and transfers for a while and then throws an I/O error (wish I had a screenshot to show, but I don't). When I ls /mnt I get a similar I/O error. I then check /dev for the drive and find that it no longer shows up. Originally the partition was /dev/sdc1. I tried unplugging the USB at this point, plugging it back in, and mounting the drive back to /mnt; however, it has now been assigned to (you guessed it) /dev/sdd1. I get the drive mounted and try the original rsync command again, hoping the first error was a fluke or some kind of one-time drive fart. This time it makes it quite a bit further and then throws the exact same error. Am I doing something terribly wrong here? As I said, I'm very new to bash, so maybe I'm making some absolutely moronic newbie mistake.
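Nothing in that rsync invocation should make the device node vanish, so this looks more like the enclosure or the drive resetting under sustained load than a command mistake. Worth checking before blaming rsync:
Code:
# kernel messages from the moment the drive vanished
dmesg | tail -n 30
# drive health report, if the smartmontools package is installed
# (USB enclosures sometimes need an extra '-d sat' option)
smartctl -a /dev/sdd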
View 9 Replies
View Related
Mar 3, 2011
How do you create a cron file that will regularly perform a level 0 backup once per month?
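One way, using the classic dump utility in the system-wide /etc/crontab (which takes a user field); the device and target below are examples:
Code:
# level 0 dump of /home at 02:00 on the 1st of every month
0 2 1 * * root /sbin/dump -0u -f /dev/nst0 /home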
View 2 Replies
View Related
Sep 29, 2009
I have a PHP script in a cron directory that generates 5 text files. After the files are generated, I want to create a script that will move the 5 text files to another folder named "web".
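A single crontab line can chain the two steps, so the move only happens if the generator succeeds; the paths below are placeholders:
Code:
# run the PHP generator, then move the freshly produced .txt files
0 6 * * * /usr/bin/php /var/www/cron/generate.php && mv /var/www/cron/*.txt /var/www/web/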
View 2 Replies
View Related
May 10, 2011
I installed and tested Restore EE Backup Server on a test PC with a basic configuration and it's working fine.
[URL]
The issue I have is: where are these backup snapshots or files being saved? I want to add separate storage to hold the backups.
View 1 Replies
View Related
Jan 19, 2011
I have noticed that in the /etc directory there are subdirectories /etc/cron.daily and /etc/cron.hourly. I have also determined that there is a service, 'crond', included as part of Fedora, which I have started with 'service crond start'. My question is this: if I drop an executable file into the /etc/cron.daily directory, will it just run? If not, which configuration files should I edit to run my executable daily? There seems to be a pretty large network of cron-related files, and I'm trying to make sense of it all.
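Broadly, yes: on Fedora the system crontab (/etc/crontab) runs everything in those directories through run-parts, with lines along the lines of 02 4 * * * root run-parts /etc/cron.daily. So a script dropped into /etc/cron.daily runs once a day, provided it is executable; some run-parts implementations skip file names containing dots, so a plain name is safest:
Code:
cp myjob.sh /etc/cron.daily/myjob
chmod +x /etc/cron.daily/myjob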
View 9 Replies
View Related
Jun 28, 2010
I would like to create a cron job that deletes all files within a directory one hour after they are created there. I found this cron entry: find /path/to/file/* -ctime +1 -exec rm {} \; but it deletes all the files. I want to make an exception: every file should be deleted except one (let's say a.zip).
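A sketch of both fixes: -ctime +1 means days, so an hour threshold needs -mmin +60, and ! -name excludes the one file to keep:
Code:
# delete regular files older than 60 minutes, sparing a.zip
find /path/to/file/ -maxdepth 1 -type f -mmin +60 ! -name 'a.zip' -exec rm {} \;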
View 16 Replies
View Related
May 10, 2010
Does anyone know of any decent enterprise-level backup solutions for Linux? I need to back up a few servers and a bunch of desktops onto one backup server. Using rsync/tar.gz won't cut it. I need bi-monthly full HDD backups and things such as that, with a nice GUI interface to add/remove systems from the backup list. Basically I need something similar to CommVault or Veritas. Veritas I've used before, but it has its issues, such as leaving 30 GB cache files. CommVault, I have no idea how much it costs, or whether it supports backing up to a hard drive rather than tape.
View 7 Replies
View Related
Oct 4, 2010
I have the following cron entry but it doesn't seem to be running:
Code:
The script does exist, and the directory /home/usr/log also exists and is writable. /var/log/syslog only has a bunch of these:
Code:
I don't see any file getting written to the log directory. That suggests cron didn't run the job, which /var/log/syslog seems to confirm.
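A quick way to separate "cron is broken" from "my job is broken" is a trivial test entry; if the file below appears and grows, cron itself is fine and the problem is in the job's command or environment:
Code:
* * * * * /bin/date >> /tmp/cron_test.log 2>&1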
View 9 Replies
View Related
Apr 13, 2011
After an upgrade gone wrong my 9.10/10.4 is unbootable and I see no way to fix it.
I can't mount the encrypted directory now (from a LiveCD or chroot); it probably has something to do with the breakage and/or pre-existing ecryptfs issues, as I have both login and mount passphrases.
And it's all on one partition too, so before reinstalling I have to back up. How can I back up the encrypted directory? And can I transplant it to a fresh Ubuntu installation?
cp complains about symlinks, returning 'not permitted', is this normal?
ETA: Now that I look at it from the LiveCD, .ecryptfs and Private appear as broken symlinks (what do they link to, anyway?), but when I tried to copy them while chrooted, they were both accessible directories (though I still couldn't mount or copy them). I don't get it.
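The broken-looking symlinks are normal when the encrypted home is not mounted; on these releases the actual ciphertext normally lives under /home/.ecryptfs/<user>/.Private, and that is what needs backing up. A sketch (the username and destination are placeholders, and ecryptfs-recover-private ships with newer versions of ecryptfs-utils):
Code:
# from the LiveCD: copy the ciphertext plus the wrapped-passphrase data
sudo rsync -a /home/.ecryptfs/USERNAME/ /media/external/ecryptfs-backup/
# on the fresh install: walk the copy and remount it readable
sudo ecryptfs-recover-private /media/external/ecryptfs-backup/.Private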
View 2 Replies
View Related
Sep 10, 2010
I need to back up my /home directory because I want to switch from Fedora to openSUSE, but I didn't put /home on a separate partition, so I need to back it up first. Problem is, I can't figure out how. I've tried tar and gzip through every Google hit I can possibly find, but not one has worked.
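For the record, a tar invocation that usually does the job, assuming the destination drive is mounted at /media/external; run it as root so unreadable files don't get silently skipped:
Code:
# -p preserves permissions; -C / avoids absolute-path warnings
sudo tar czpf /media/external/home-backup.tar.gz -C / home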
View 7 Replies
View Related
Mar 12, 2010
How do I pack a locked backup directory from the simpleconfig backup program into a tar.bz2? The backup directory is located in /home/lukasz/Downloads/backup/.
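If "locked" just means the directory is owned by another user (often root, for backup tools), sudo should be enough to read it; a sketch:
Code:
sudo tar cjf backup.tar.bz2 -C /home/lukasz/Downloads backup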
View 1 Replies
View Related
Mar 28, 2010
I'm looking for software that can backup all the files in my /home directory including hidden files.
I liked Lucky Backup, but it puts everything in a tar file, meaning that the backup fails if the file gets too large (4 GB, I think). I would prefer to avoid tar/archives anyway, as often I only need to recover one file from a backup (an archive holding my 50 GB of data would take ages to open).
Does anyone know of a program, or a way to get rsync or the like, to copy all the files in a directory, including hidden files, into another directory, so I end up with effectively a carbon copy of the original? Disk space is not an issue, so I don't need to compress anything. I'm not bothered whether it's a fancy GUI-based program or an rsync command, just so long as it can save my precious files from... myself.
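rsync does exactly this; hidden files come along automatically when you sync a directory's contents (the trailing slash on the source matters). A sketch with placeholder paths:
Code:
# -a preserves permissions, times, links and so on; --delete keeps the
# mirror exact by removing files that have vanished from the source
rsync -av --delete /home/user/ /media/backupdrive/home-mirror/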
View 9 Replies
View Related
Jun 25, 2011
I wonder if there is a file where terminal commands are kept, so that I can back up the commands I have used in the terminal before.
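With bash, past commands are kept in a hidden file in the home directory, so backing them up is a two-liner:
Code:
history -w                 # flush the current session's commands to disk
cp ~/.bash_history ~/terminal-commands-backup.txt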
View 7 Replies
View Related
Mar 27, 2011
I would like to have dump back up just my home directory, but I am having problems: the command I am using wants to back up everything and takes hours upon hours. It has been running for about 10 hours and only 21% is done. This is the command: dump -0u -f dp_hd /media/CENTON USB/ / How can I get this to back up only my home directory?
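Two likely fixes, assuming the Linux dump package: the unquoted space in "CENTON USB" splits the path into separate arguments, and the trailing / names the whole root filesystem as the thing to dump. Pointing dump at /home (and quoting the output path) should do it:
Code:
# if /home is its own partition:
dump -0u -f "/media/CENTON USB/home.dump" /home
# if /home is just a directory on /, dump accepts it as a subtree,
# but -u must be dropped (dumpdates only tracks whole filesystems):
dump -0 -f "/media/CENTON USB/home.dump" /home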
View 7 Replies
View Related