General :: Delete Backups Older Than 7 Days Through Shell Script?
Jul 15, 2011
I have my own testing server on which a MySQL database is backed up daily, in the format mysql_backup_dd/mm/yy.tar.gz, in my home folder. I need to set up a cron job to delete the backups older than 7 days. How do I do this?
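A minimal sketch of one way to do this, assuming the archives sit directly in the home directory and match a mysql_backup_* pattern (the path /home/username is a placeholder, not from the post):
Code:
# crontab entry (crontab -e): every night at 02:30, remove backups older than 7 days
30 2 * * * find /home/username -maxdepth 1 -name 'mysql_backup_*' -type f -mtime +7 -exec rm -f {} \;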
I am trying to put together a ksh script and I am new to writing scripts. How do you write a command to delete any file that is 30 days old and also not currently locked?
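A rough ksh sketch, assuming "locked" means "currently open by some process" (checked with fuser) and that /path/to/dir is a placeholder:
Code:
#!/bin/ksh
# delete regular files older than 30 days that no process currently has open
find /path/to/dir -type f -mtime +30 | while read -r f; do
    if ! fuser -s "$f" 2>/dev/null; then
        rm -f "$f"
    fi
done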
find -type f /path/* -mtime +7 -exec rm {} ;
Is this the best way to delete only files (not directories) within /path that are older than 7 days, or is there a better way?
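As written the command has the path after the tests and an unescaped -exec terminator; a corrected sketch (the -delete form needs GNU find):
Code:
# delete only regular files under /path that are older than 7 days
find /path -type f -mtime +7 -exec rm -f {} +
# or, with GNU find:
find /path -type f -mtime +7 -delete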
I have been trying to write a script that will take a directory, for example /accounts, compress it into a .tar file with a filename containing the date of compression, for example accounts030210.tar, and then place that file into a directory called /archive.
I also want the script to delete files in /archive that are older than 7 days.
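A minimal sketch, assuming the ddmmyy date format from the example name and that /archive already exists:
Code:
#!/bin/sh
# archive /accounts with the date in the file name, then prune old archives
DATE=$(date +%d%m%y)
tar -cf /archive/accounts${DATE}.tar /accounts
# remove archives in /archive older than 7 days
find /archive -type f -name '*.tar' -mtime +7 -exec rm -f {} \;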
I want to write a script to copy all folders older than 7 days from a Linux server to a Windows server over Samba, and make it run automatically using cron.
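One possible sketch, assuming the Windows share is already mounted via CIFS at /mnt/winbackup and the source directory is /data (both placeholders); note that -mtime on a directory tests the directory's own modification time, not its contents:
Code:
#!/bin/sh
# copy top-level folders under /data last modified more than 7 days ago
# to a Windows share mounted at /mnt/winbackup
find /data -mindepth 1 -maxdepth 1 -type d -mtime +7 -exec cp -a {} /mnt/winbackup/ \;

# crontab entry to run it nightly at 01:00 (script path is a placeholder)
# 0 1 * * * /usr/local/bin/copy_old_folders.sh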
I am new to Unix. I am looking for a script to delete files older than 7 days, but I also want to exclude certain directories (like arch, log, ...) and some files with certain extensions (like .ksh, .ch, ...) in directories and subdirectories.
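A sketch of the usual find pattern for this, using the directory and extension names from the post as examples (the starting path /some/dir is a placeholder):
Code:
# prune the directories that must not be descended into, then delete old files
# except those with the excluded extensions
find /some/dir \( -type d \( -name arch -o -name log \) -prune \) -o \
     \( -type f -mtime +7 ! -name '*.ksh' ! -name '*.ch' -exec rm -f {} \; \)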
I got this part of my script working so that it deletes the folder from 8 days ago:
EightDaysAgo=`(date --date="8 days ago" +%d-%m-%Y)`
rm -rf $EightDaysAgoTar
However, I need to remove files that are older than 8 days; for example, if the script isn't run for a day it should remove both the 9-day-old and 8-day-old ones, not just the 8-day-old one. If I'm making any sense lol
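Matching on file age instead of computing one exact date avoids the missed-day problem; a hedged sketch, assuming the dated archives sit directly under a /backups directory (a placeholder path):
Code:
# remove anything directly under /backups older than 7 days,
# no matter how many days the script was skipped
find /backups -mindepth 1 -maxdepth 1 -mtime +7 -exec rm -rf {} +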
I have set up a simple find-and-delete script for files older than X days. The problem is that some of the files sent to this share are transferred from an archive server; the creation/modification date is preserved when they are copied, so their age could be a year or more, and they get deleted overnight by the script. For performance reasons the RAID is mounted with noatime in fstab. Do you see any solution to this problem other than enabling atime? I'm thinking of a more advanced script that writes a list of added files once a day and marks them for deletion after some time.
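A rough sketch of that list-based idea: record each file's first-seen date in a plain-text index and delete only files that have been listed for more than N days. The share path, index path, and 7-day threshold below are assumptions, and the sketch does not prune index entries for files already deleted:
Code:
#!/bin/sh
# track when each file first appeared, independent of its mtime
SHARE=/srv/share              # placeholder share directory
INDEX=/var/lib/share-age.list # placeholder index file
KEEP_DAYS=7
NOW=$(date +%s)

touch "$INDEX"

# record newly seen files with today's timestamp (tab-separated: epoch, path)
find "$SHARE" -type f | while read -r f; do
    cut -f2- "$INDEX" | grep -qxF -- "$f" || printf '%s\t%s\n' "$NOW" "$f" >> "$INDEX"
done

# delete files first seen more than KEEP_DAYS ago
awk -F'\t' -v now="$NOW" -v keep=$((KEEP_DAYS * 86400)) \
    'now - $1 > keep { print $2 }' "$INDEX" |
while read -r f; do
    rm -f "$f"
done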
I'm trying to truncate a postfix Maildata directory for one of our users. I want to be able to move any files older than <n> days to a new location, while also copying the relevant directory structure. This should be doable in one command. I've used find to locate the files and mv to move them, but I can't figure out how to build the directory structure on the fly in the new location.
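One common approach is to let cpio recreate the directory tree and then delete the originals; a sketch assuming GNU find and cpio, with placeholder source and destination paths and 30 days standing in for <n>. An alternative is rsync -a --files-from=<list> --remove-source-files, which builds the directories and removes the sources in one pass.
Code:
# copy files older than 30 days, preserving their relative directory structure,
# then remove the originals
mkdir -p /archive/Maildata
cd /var/mail/Maildata &&
find . -type f -mtime +30 -print0 | cpio -pdm0 /archive/Maildata &&
find . -type f -mtime +30 -delete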
Possible Duplicate: How do I delete files older than a certain date on Linux? How do I delete all files in the current directory and its subdirectories that are older than one year?
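A minimal example with GNU find, treating "one year" as 365 days:
Code:
# delete files under the current directory last modified more than a year ago
find . -type f -mtime +365 -delete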
It seems my rotation part is not removing files older than 90 days. Does anybody know what is wrong?
Code:
#!/bin/sh
#navigate to the desired backup location
cd /public/backup/linux
#dump the MySQL entirely, output file is dated
mysqldump -u root -pmt1jxz68f2 --all-databases > "`date +%Y%m%d`.sql"
#backup the web folder
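The rotation step itself isn't shown in the snippet above; for comparison, a typical find-based rotation that removes dated dumps older than 90 days looks like this (a hedged sketch, not the poster's actual code):
Code:
# prune dated .sql dumps older than 90 days
find /public/backup/linux -name '*.sql' -type f -mtime +90 -exec rm -f {} \;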
How do you get Rsync to do incremental backups rather than full backups? At the moment I have a script that will create a backup folder (if it doesn't already exist) and then copy the source files into the backup directory with the command.
Target is where the files will be backed up to; Sources is the dir(s) to be backed up; Exclude files is the list of files not to back up; log file is where the output will be saved. At the moment it only does full backups, but I would like to do only incremental backups. How would this be achieved? Am I missing an option in the rsync command that is required?
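rsync is already incremental in the sense that it only transfers changed files; for dated, space-efficient snapshots the usual trick is --link-dest, which hard-links unchanged files against the previous backup. A sketch with placeholder paths (on the very first run the "latest" link won't exist yet and rsync simply does a full copy):
Code:
#!/bin/sh
# incremental snapshot backup using rsync --link-dest
SRC=/data                       # placeholder source
DEST=/backup                    # placeholder backup root
TODAY=$(date +%Y-%m-%d)

rsync -a --delete \
      --exclude-from=/etc/backup.exclude \
      --link-dest="$DEST/latest" \
      "$SRC/" "$DEST/$TODAY/" >> /var/log/backup.log 2>&1

# repoint the "latest" symlink at the snapshot just taken
rm -f "$DEST/latest"
ln -s "$DEST/$TODAY" "$DEST/latest"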
I want Firefox to delete history after several days. Can I configure it this way? "Remember my browser history for at least..." is not the option that does it, right?
I want to capture an MMS stream daily. All works well with that, but I also want to delete any .mpg files captured from the stream that are more than 7 days old. Will this command work?
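The command in question isn't shown here, but a typical form for this job would be (a sketch, with /path/to/captures as a placeholder):
Code:
# remove captured .mpg files older than 7 days
find /path/to/captures -name '*.mpg' -type f -mtime +7 -exec rm -f {} \;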
I've previously created a script to back up some iso files to DVDs, but I had to do a format and lost it. Back then I used a command prior to growisofs that closed the DVD drive (if open) and waited until the disc was ready to write before starting growisofs. I can't find that command now; does anyone know which one it is? I remember it was a one-liner that I think showed some basic info about the disc, and thus had to wait until it was ready.
When I turn on my computer, because of frequent updates it displays several versions of Ubuntu 10.10 that I can choose from. I wonder if it is possible to delete some of the older versions, and how. I think having several versions of Ubuntu uses up a lot of space on the hard drive.
I have to write a shell script that will delete all the .dat files in /var/oracle/etl/incoming whose creation date is 7 days before the current date.
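Most Unix filesystems don't record a creation date, so the modification time is usually used as the closest stand-in; a minimal sketch:
Code:
# delete .dat files in the incoming directory last modified more than 7 days ago
find /var/oracle/etl/incoming -maxdepth 1 -name '*.dat' -type f -mtime +7 -exec rm -f {} \;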
I have a problem and need help. I want to set up a schedule using a shell script with crontab on Linux SuSE SLES 10.
1. I have many servers and want to back up MySQL from all of them every day. I need advice on writing a shell script that backs up all the MySQL databases on the different servers to a backup server every day and automatically creates a folder named for the date, for example 27102009, 28102009, ..., so that over a month there will be 30 folders on the backup server.
2. I also need to write a shell script to delete the folders but keep only the last week's worth. For example, of the 30 folders on the backup server I want to keep only the last 7, and I want to schedule the deletion for every Saturday night.
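A rough sketch of both pieces, assuming the backup server can reach each database server directly with mysqldump; the host names, credentials, and paths below are placeholders only:
Code:
#!/bin/sh
# mysql_backup.sh -- dump MySQL from each server into a per-date folder
DATE=$(date +%d%m%Y)              # e.g. 27102009
BACKUPROOT=/backup/mysql
mkdir -p "$BACKUPROOT/$DATE"
for HOST in db1.example.com db2.example.com; do
    mysqldump -h "$HOST" -u backupuser -pSECRET --all-databases \
        > "$BACKUPROOT/$DATE/$HOST.sql"
done

# mysql_prune.sh -- keep only the last week's worth of dated folders
find /backup/mysql -mindepth 1 -maxdepth 1 -type d -mtime +7 -exec rm -rf {} +

# crontab: back up nightly at 01:30, prune every Saturday at 23:00
# 30 1 * * *   /usr/local/bin/mysql_backup.sh
# 0 23 * * 6   /usr/local/bin/mysql_prune.sh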
I have a server running Ubuntu. There is a folder /var/netflow containing files, and new ones are created every 5 minutes (monitoring traffic on the network). I need to delete the files older than 6 months, which I currently do manually. Can you help?
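Treating 6 months as roughly 180 days, a one-liner like this could be run by hand or dropped into cron to automate it:
Code:
# remove netflow capture files older than ~6 months (180 days)
find /var/netflow -type f -mtime +180 -exec rm -f {} \;
# cron entry to run it nightly at 03:00:
# 0 3 * * * find /var/netflow -type f -mtime +180 -exec rm -f {} \;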
I have an archive directory that needs to be cleaned up once per quarter. The top level (/data/archive/*) directory names change daily, as well as the subdirectories and the filenames (the application names everything according to date). Also, there are two top level directories, bin and incoming, that we can't touch. I want to write a shell script that loops through the 15 or 20 top level directories and deletes all files and subdirectories older than 3 days (skipping the bin and incoming folders). Can someone get me started on a script? I am kinda new to shell scripting.
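A starting point, assuming GNU find and that bin and incoming sit directly under /data/archive:
Code:
#!/bin/sh
# clean /data/archive: skip bin and incoming, remove files older than 3 days,
# then remove any directories left empty
for dir in /data/archive/*/; do
    case "$dir" in
        */bin/|*/incoming/) continue ;;   # never touch these
    esac
    find "$dir" -type f -mtime +3 -exec rm -f {} \;
    find "$dir" -mindepth 1 -type d -empty -delete
done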
I want a shell script or single-line command to delete all files with an extension specified in the script. I have bash. For example: delete all files with the extension .obj.
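A small bash sketch that takes the extension as an argument, defaulting to obj from the example (the script name is a placeholder):
Code:
#!/bin/bash
# delete_by_ext.sh -- remove all files with the given extension under the current directory
# usage: ./delete_by_ext.sh obj
ext="${1:-obj}"
find . -type f -name "*.${ext}" -exec rm -f {} +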
I have a requirement to delete some log files from a directory if the string "deletethisfile" is found, and then restart the application servers. 1. Search for the string "deletethisfile" in the server.log file under a directory; if found: 2. Stop that particular server. 3. Delete the log file. 4. Restart the server.
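A hedged outline, assuming each application server has its own directory containing a server.log plus stop/start scripts; all paths and control commands below are placeholders for whatever the application actually uses:
Code:
#!/bin/sh
# for each server directory, bounce the server if its log contains the marker string
for dir in /opt/app/servers/*/; do
    log="$dir/server.log"
    if [ -f "$log" ] && grep -q "deletethisfile" "$log"; then
        "$dir/bin/stop.sh"        # placeholder stop command
        rm -f "$log"
        "$dir/bin/start.sh"       # placeholder start command
    fi
done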
Is there a recursive shell or Perl script to delete files with the same name as their parent folder? I wish to pass the starting folder name as an argument to the script.
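A short shell sketch (the post mentions Perl as an option, but this stays with shell like the rest of the thread); it takes the starting directory as its first argument:
Code:
#!/bin/sh
# delete every file whose name matches the name of the directory it sits in
start="${1:?usage: $0 <start-dir>}"
find "$start" -type f | while read -r f; do
    if [ "$(basename "$f")" = "$(basename "$(dirname "$f")")" ]; then
        rm -f "$f"
    fi
done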
I've been a DOS/Windows guy for 20 years, and recently became a SW test lab helper. My company uses CentOS for a lot, so I've become familiar with it, but obviously not as comfortable as I am with Windows.
Here's what I have planned:
machine: Core 2 Duo E8400, 8GB DDR2, 60GB SSD OS drive, ATI 4650 video card, other storage is flexible (I have 3 1TB drives and 4 750GB drives around that can be used in this machine.)
uses: HTPC, network storage, VMware server host (SMTP, FTP server, and web server virtual machines)
I've figured out how to do much of this, but I haven't figured out how to do backups in Linux. I've been spoiled by Windows, with its built-in backup system that is so simple to use. I find myself overwhelmed by the array of backup software and unable to determine which to use. None of them seem to do everything I need, but some come close, I think. I'm hoping someone here can help me figure out which program to use and how to use it.
Here is what I need the backup software to do:
1. Scheduled unattended backups, with alerts if the backups fail.
2. A weekly full backup with incrementals every 12 hours.
3. Removal of the old backups when the new full backup runs; I would prefer to keep 2 weeks of backups, but that's not necessary.
4. A GUI would be preferable, since my arthritic fingers don't always do as I want them to do. I typo things a lot, and the label worn off my backspace key can attest to that.