Ubuntu :: BASH Script To Move Files More Than 30 Days Old?
May 12, 2011
I need to move files to a backup drive if they are over 30 days old. All I could find when looking for scripts were ways to sort files by date, and the solutions were all over the place; nothing seemed simple or good.
I've thought of rolling my own "ls -al > filelist.txt" approach and then processing the file, but I'm not sure that's the best way. An array would be easier and would eliminate the temp file, but I've never used an array in bash.
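A find-based one-liner avoids both the temp file and the array. A minimal sketch, with placeholder paths for the source and the backup mount:

Code:
# Move regular files not modified in the last 30 days to the backup drive
# (/data and /mnt/backup are assumed paths).
find /data -type f -mtime +30 -exec mv -- {} /mnt/backup/ \;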
I have a script that moves files after 60 days from the FTP folder over to a trash folder which is not accessible by the FTP. From there I can delete the files whenever the drive gets full. That works fine so far. The only thing that bothers me is that files moved to ".Trash" are not in any folder structure anymore (one big directory with the deleted files in it).
When I do an "ls -la" on the {} placeholder I see the file names including their folders, so I'm not sure whether it's the find command or the mv command that forgets about the directories.
This is the script:

Code:
bash -c 'date; find /Volumes/data1/test/ -mtime +60 -type f -exec mv {} /Volumes/data1/.Trash/ \; ; date' >> ~/mylog
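It is mv that flattens things: find hands it full paths, but mv has no option to recreate the source tree under the target. One way to keep the structure is rsync with --remove-source-files; a sketch, assuming an rsync recent enough to support that option and the paths from the script above:

Code:
#!/bin/bash
# Move files older than 60 days into .Trash, preserving the directory
# tree below the source. --files-from reads the null-separated list from
# stdin (--from0); --remove-source-files deletes each file after a
# successful copy, turning the copy into a move.
cd /Volumes/data1/test || exit 1
find . -type f -mtime +60 -print0 |
  rsync -a --from0 --files-from=- --remove-source-files . /Volumes/data1/.Trash/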
I'm totally new to Linux; in fact, I'm a Windows admin. Scenario: I need to run a script that will automatically move files that are 30 days old from one particular folder to another particular folder.
I've tried a number of suggestions found on the internet and none of them work. Here's one:

Code:
mv "$x" ~/.Trash/

...where $x is the pathname of the file passed to the script. I've also tried different paths to Trash (on the Desktop, in the Home folder, in my user folder) and it makes no difference: either nothing happens, or more often, the file is simply copied to my desktop or user folder with the name "Trash". What is the actual path to the Trash folder and how can I move files there? I'm using Ubuntu 10.04.
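On Ubuntu 10.04 there is no ~/.Trash; the per-user trash defined by the XDG spec lives under ~/.local/share/Trash. A minimal sketch:

Code:
#!/bin/bash
# Move a file into the XDG per-user trash. File managers also expect a
# matching .trashinfo metadata entry; running gvfs-trash "$x" instead
# creates that for you.
x="$1"
trash="$HOME/.local/share/Trash/files"
mkdir -p "$trash"
mv -- "$x" "$trash/"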
I am looking for a script, advice, or guidance on how to write a script so that when I use the 'del' command it removes/sends the files/folders to a location I specify, for example 'dustbin'.
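A shell function in ~/.bashrc is enough for this; a sketch, with ~/dustbin as the assumed target:

Code:
# del moves its arguments into ~/dustbin instead of deleting them
del() {
    mkdir -p "$HOME/dustbin"
    mv -- "$@" "$HOME/dustbin/"
}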
How can I write a script to copy files from one directory to another directory according to last modified date?
Code:
ls -al
-rw-r--r-- 1 user user 100 2011-05-26 12:33 ABC1234_frontcover_10344000_2011-05.doc
What exactly I want to do is: based on the date field (2011-05-26) in the ls -al output above, the ABC1234_frontcover_10344000_2011-05.doc file should be copied to /home/abcd/ABC1234/2011-05/26/. There should be some way to do it using the value of date -r $file +%m and basename *.doc | awk -F_ '{print $1}'.
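A sketch along those lines, assuming GNU date (whose -r option reads a file's modification time) and the destination layout described above:

Code:
#!/bin/bash
# Copy each .doc into /home/abcd/<prefix>/<YYYY-MM>/<DD>/ based on mtime.
for file in *.doc; do
    prefix=$(basename "$file" | awk -F_ '{print $1}')   # e.g. ABC1234
    ym=$(date -r "$file" +%Y-%m)                        # e.g. 2011-05
    day=$(date -r "$file" +%d)                          # e.g. 26
    dest="/home/abcd/$prefix/$ym/$day"
    mkdir -p "$dest"
    cp -- "$file" "$dest/"
done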
I want to capture an mms stream daily. All works well with that, but I also want to delete any mpg files captured from the stream that are more than 7 days old. Will this command work?
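For reference, the usual form of such a command is the following (the capture directory is a placeholder):

Code:
# Delete captured .mpg files not modified within the last 7 days.
find /path/to/captures -name '*.mpg' -mtime +7 -exec rm -f {} \;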
I have searched for a way to copy files less than X days old and I found this: http://www.howtogeek.com/howto/ubunt...days-on-linux/

The syntax for deleting files less than 7 days old would be like this:

Code:
find /path/to/files* -mtime -7 -exec rm {} \;

I would like to copy the files to mntas instead, and I'm not sure what the syntax should be. Would this work?

Code:
find /path/to/files* -mtime -7 -exec cp {} mntas \;
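The shape is right: cp takes the destination directory as its last argument, and the terminating \; must be escaped from the shell. A sketch with an illustrative destination:

Code:
# Copy files modified within the last 7 days to a destination directory
# (/mnt/backup is a placeholder for the real mount point).
find /path/to/files -mtime -7 -exec cp -- {} /mnt/backup/ \;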
I am a newbie in Linux. I am writing a bash script to identify files which are exactly 7 days (a week) old. I tried this command:

Code:
find /var/backup -mtime +7 -exec ls -d {} \;

but this gives me even the files which are older than 7 days.
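That is what +7 means: strictly more than 7 days. For exactly 7 days, drop the sign (find rounds a file's age down to whole 24-hour periods):

Code:
# Files whose age, in whole days, is exactly 7.
find /var/backup -mtime 7 -exec ls -ld {} \;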
I am new to Unix. I am looking for a script to delete files older than 7 days, but I also want to exclude certain directories (like arch, log, ...) and some files with extensions (like .ksh, .ch, ...) in directories and sub-directories.
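A sketch using -prune for the excluded directories and ! -name for the extensions (all names here are the examples from the question; adjust as needed):

Code:
# Skip the arch/ and log/ trees entirely, then delete regular files
# older than 7 days except .ksh and .ch files.
find /path/to/dir \( -name arch -o -name log \) -prune -o \
     -type f ! -name '*.ksh' ! -name '*.ch' -mtime +7 -exec rm -f {} \;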
I am trying to put together a ksh script and I am new to writing scripts. How do you write a command to delete any file that is 30 days old and is not currently locked?
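One approach is to let fuser decide whether a file is in use before removing it; a sketch, assuming Linux's fuser (which exits 0 when some process has the file open):

Code:
#!/bin/ksh
# Delete files older than 30 days unless a process currently holds them open.
find /path/to/dir -type f -mtime +30 | while read -r f; do
    fuser -s "$f" 2>/dev/null || rm -f "$f"
done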
Code:
find -type f /path/* -mtime +7 -exec rm {} \;

Is this the best way to delete only files (not directories) within /path that are older than 7 days? Or is there a better way?
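Two tweaks: find expects its starting paths before any tests, and GNU find's -delete avoids spawning one rm per file:

Code:
# GNU find:
find /path -type f -mtime +7 -delete
# Portable equivalent:
find /path -type f -mtime +7 -exec rm -f {} \;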
Seems my rotation part is not removing files older than 90 days. Anybody know what is wrong?
Code:
#!/bin/sh
# navigate to the desired backup location
cd /public/backup/linux
# dump MySQL entirely, output file is dated
mysqldump -u root -pmt1jxz68f2 --all-databases > "`date +%Y%m%d`.sql"
# backup the web folder
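The rotation step itself isn't quoted, but a common culprit is a missing + sign: -mtime 90 matches only files exactly 90 days old, while -mtime +90 matches everything older. A working rotation for the dated dumps above would look like:

Code:
# Remove dated dumps older than 90 days (note the + before 90).
find /public/backup/linux -name '*.sql' -mtime +90 -exec rm -f {} \;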
I'm trying to truncate a Postfix Maildata directory for one of our users. I want to be able to move any files older than <n> days to a new location, while also copying the relevant directory structure. This should be doable in one command. I've used find to locate the files and mv to move them, but I can't figure out how to build the directory structure on the fly in the new location.
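cpio in pass-through mode rebuilds directories as it copies. A sketch, with <n> set to 30 and placeholder paths:

Code:
#!/bin/bash
# Copy files older than 30 days into $DEST, recreating directories
# (-p pass-through, -d make dirs, -m preserve mtimes), then delete the
# originals from the same list.
SRC=/var/spool/maildata/user
DEST=/var/spool/maildata-archive/user
cd "$SRC" || exit 1
find . -type f -mtime +30 > /tmp/old.list
cpio -pdm "$DEST" < /tmp/old.list
xargs rm -f < /tmp/old.list   # caveat: breaks on names containing spaces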
I have set up a simple find-and-delete script for files older than X days. The problem is that some of the files sent to this share are transferred from an archive server, and the creation/modification date is preserved when they are copied, so a file can arrive already a year old (or older) and get deleted overnight by the script. For performance reasons the RAID is mounted with noatime in fstab. Do you see any solution to this problem, short of enabling atime? I'm thinking of a more advanced script that writes a list of added files once a day and marks them for deletion after some time.
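A rough sketch of that list-based idea: stamp each file with the date it was first seen on the share, then delete based on that stamp instead of mtime. All paths and the retention period are assumptions:

Code:
#!/bin/bash
SHARE=/srv/share
DB=/var/local/seen.db        # lines of: <epoch-first-seen><TAB><path>
N=30
now=$(date +%s)
touch "$DB"
# record files that appeared since the last run
find "$SHARE" -type f | while IFS= read -r f; do
    grep -qF "$(printf '\t%s' "$f")" "$DB" ||
        printf '%s\t%s\n' "$now" "$f" >> "$DB"
done
# delete files whose recorded age exceeds N days
awk -F'\t' -v now="$now" -v n="$N" '(now - $1) > n*86400 {print $2}' "$DB" |
    while IFS= read -r f; do rm -f -- "$f"; done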
I've got a bash script I'm using to download a text-file list of links via axel. What I'd like to do is automate the removal of completed links in the for loop when axel has successfully finished the download. This is what I've got. I figure I can just echo-append the line to a new file, but what is the easiest way to delete the line with the link I just downloaded?
Code:
#!/bin/bash
for i in $( cat $1 ); do
    axel --alternate --num-connections=6 $i
    export RC=$?
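A completed sketch of that loop: on success (exit code 0) the link is appended to a done-list and stripped from the input file with sed, using | as the address delimiter so the slashes in URLs don't clash:

Code:
#!/bin/bash
# $1 is the list file; one URL per line
while IFS= read -r i; do
    axel --alternate --num-connections=6 "$i"
    if [ $? -eq 0 ]; then
        echo "$i" >> "$1.done"
        # delete that exact line; regex metacharacters in the URL
        # (., ?, +) would need escaping in real use
        sed -i "\|^$i\$|d" "$1"
    fi
done < "$1"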
1. Every Sunday
2. Find all files older than 1 day
3. Gzip these files
4. Tar up the gzipped files into one tar file.
5. Name the tarball with a date stamp indicating what day it was created, so we know that week's files are in the file
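A sketch covering steps 2-5 with placeholder paths; step 1 is the cron schedule (e.g. 0 2 * * 0 runs it Sundays at 02:00):

Code:
#!/bin/bash
# gzip files older than one day, then bundle them into a dated tarball
cd /path/to/files || exit 1
find . -maxdepth 1 -type f -mtime +1 ! -name '*.gz' ! -name '*.tar' \
    -exec gzip {} \;
tar -cf "weekly-$(date +%Y%m%d).tar" ./*.gz && rm -f ./*.gz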
I want to search a directory recursively, looking for new .rar/.zip files. When a new file is found I want to extract the contents to another directory. To top things off, I would like to rename the source file to something like original.rar.extracted.
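A sketch, with assumed watch and output directories; renaming the archive afterwards keeps it from matching the find patterns on the next run:

Code:
#!/bin/bash
SRC=/path/to/watch
DEST=/path/to/extracted
mkdir -p "$DEST"
find "$SRC" -type f \( -name '*.rar' -o -name '*.zip' \) |
while IFS= read -r f; do
    case "$f" in
        *.rar) unrar x -o+ "$f" "$DEST/" ;;
        *.zip) unzip -o "$f" -d "$DEST" ;;
    esac && mv -- "$f" "$f.extracted"
done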
Being relatively new both to Linux and this forum, I am sorry if I make a post that already exists, even though I couldn't find it. My problem is that I can't move downloaded files over to the root filesystem. To change aMSN's looks I downloaded and unpacked a skin. I open root, go to usr -> amsn -> share -> skins, and now I am to copy the skin's folder over to that directory, but it won't let me. I also tried Alt+F2 and writing "sudo konqueror", following advice I got, but there was no difference.
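Directories under the root filesystem are writable only by root, so a graphical copy fails without elevated privileges. A one-line sketch from a terminal (the skin name, download location, and exact skins path are assumptions):

Code:
# copy the unpacked skin folder into aMSN's system-wide skins directory
sudo cp -r ~/Downloads/MySkin /usr/share/amsn/skins/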
Here's what I've got: from a crontab, run a script (I understand that part); this script needs to count the number of files in /outgoing/, take 30 less that number, and move that many files from /readycalls/. I need to keep the Asterisk outgoing queue full of .call files without having too many in there at any given time.
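A sketch of that top-up logic, using the directory names from the question (the real Asterisk spool is usually /var/spool/asterisk/outgoing):

Code:
#!/bin/bash
OUT=/outgoing
READY=/readycalls
count=$(ls "$OUT"/*.call 2>/dev/null | wc -l)
need=$(( 30 - count ))
if [ "$need" -gt 0 ]; then
    # move just enough .call files to bring the queue back up to 30
    ls "$READY"/*.call 2>/dev/null | head -n "$need" |
    while IFS= read -r f; do
        mv -- "$f" "$OUT/"
    done
fi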
I want to run rsync on server A to copy all files from server B that are newer than 7 days (find . -mtime -7). I don't want to delete the files on server B.
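One way is to build the file list on server B over ssh and feed it to rsync; the host name and paths below are assumptions. rsync never touches the source unless told to, so nothing is deleted on server B:

Code:
# list files newer than 7 days on server B, then pull exactly those
ssh serverB 'cd /data && find . -type f -mtime -7' > /tmp/recent.list
rsync -av --files-from=/tmp/recent.list serverB:/data/ /local/data/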