General :: Script To Move Files With Log And Email And Cron?
Jan 4, 2011
A user will be FTPing some files to an upload directory. I need to move those files to another directory, and I also need to mail a list of the just-moved files to the user. This job will need to run every 10 minutes. I also need to keep a log holding all the files moved during the day, renaming it with a date/timestamp. I have the pieces below but I just can't put them all together. Can anyone make a workable script out of this?
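A minimal sketch of the whole job, assuming placeholder paths /home/user/upload and /data/processed and a placeholder address user@example.com; the daily log name carries the date stamp:
Code:
#!/bin/bash
# move new uploads, record them in a dated log, and mail the list
SRC=/home/user/upload                     # assumed upload directory
DST=/data/processed                       # assumed destination
LOG=/var/log/moved-$(date +%Y%m%d).log    # one log file per day
MOVED=$(mktemp)

for f in "$SRC"/*; do
    [ -f "$f" ] || continue               # nothing matched, or not a regular file
    mv "$f" "$DST"/ && basename "$f" >> "$MOVED"
done

if [ -s "$MOVED" ]; then
    cat "$MOVED" >> "$LOG"                # accumulate the day's moves
    mail -s "Files moved $(date)" user@example.com < "$MOVED"
fi
rm -f "$MOVED"
The matching crontab entry would be */10 * * * * /usr/local/bin/move-uploads.sh. One caveat: a file still mid-transfer will be moved incomplete, so an upload convention such as temporary names is worth considering.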
I have a PHP script in a cron directory that generates 5 text files. After the files are generated, I want to create a script that will move the 5 text files to another folder named "web".
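Assuming the text files land in a known directory and "web" is the target (both paths below are placeholders), a single line appended to the same cron job could do it:
Code:
mv /path/to/output/*.txt /path/to/web/   # move the freshly generated text files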
Using CentOS. I have a cron job set up to run this command:
Code:
/var/test.sh | mail -s "Test Cron" mr182@somewhere.com
The email is sent, but the output of the script is not in the email body; it's just blank. I know there is some output because there are echo statements in the script. I don't want to get an email for all cron jobs, just this one.
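One common cause is that the script's messages go to stderr, which the pipe doesn't capture, or that the script behaves differently under cron's minimal environment. A hedged variation that folds both streams into the mail body (the schedule shown is a placeholder):
Code:
*/10 * * * * /var/test.sh 2>&1 | mail -s "Test Cron" mr182@somewhere.com
Leaving MAILTO unset and piping explicitly like this keeps the other jobs quiet, since cron only emails a job's leftover output and jobs that print nothing send nothing.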
I have some scripts that do the usual stuff like backing up a MySQL database, tarring and gzipping some files and putting them on FTP, or syncing with another backup/mirror system. Some scripts run quite frequently (twice or three times per hour). I send an email with the attached log output from the commands after each job completes, and it is quite a lot of email to keep track of. I want to send email only when a script fails to do something, that is, when some command in the script fails. How can I accomplish this?
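A minimal sketch of the usual pattern: abort on the first failing command, log everything, and mail the log only when the script exits non-zero. The address and the job steps are placeholders:
Code:
#!/bin/bash
set -euo pipefail        # stop at the first command that fails
LOG=$(mktemp)
trap '[ $? -ne 0 ] && mail -s "backup FAILED on $(hostname)" admin@example.com < "$LOG"; rm -f "$LOG"' EXIT

{
    mysqldump mydb | gzip > /backup/mydb.sql.gz   # stand-ins for the real steps
    tar czf /backup/files.tgz /srv/data
} >> "$LOG" 2>&1
On success nothing is sent; on failure the captured output arrives as the mail body.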
I am running a cron job which mails the contents of a log report every day. The problem is that the contents are sent as an attachment instead of as the body of the email. The strange thing is that if I run the instructions from the command line everything works fine, but from the cron job the log report is attached instead of being sent inline.
The instruction I use is:
Code:
mail -s "logfile for cron" cron@example.com < /var/log/cron-log
Following some advice I read on a blog I also tried this instruction in my cron file, but the result is the same: it works fine from the command line, but attaches the report when run from cron:
Code:
echo "Content-Type: text/plain;" | mail -s "logfile for cron" cron@example.com < /var/log/cron-log
How do I ensure the content of the log file appears inline?
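A frequent cause is that mailx treats input containing control characters or very long lines as binary and wraps it in an attachment, and logs written under cron often contain escape sequences that don't show up in an interactive test. One hedged workaround is to strip non-printable characters before mailing:
Code:
# keep only tab, newline, carriage return and printable ASCII
tr -cd '\11\12\15\40-\176' < /var/log/cron-log | mail -s "logfile for cron" cron@example.com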
This is Kishore. I am new to Ubuntu and SVN; please can someone help me create a cron job for my SVN backup every day at 10:30 pm. I already created a cron job which looks like:
Code:
30 10 * * * svnadmin dump /home/administrator/svnrepository >svn1
When I run the command directly I get the whole backup and its size is 3.6 GB, but when I run it through the cron job the backup size is only 9 MB. So my requests are: 1. a cron job taking a complete SVN backup at 10:30 pm daily, and 2. a cron job copying the SVN backup to the D drive of my Windows system, running every day at 11:30 pm.
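Two likely problems stand out: 30 10 * * * runs at 10:30 am, not pm, and under cron's limited PATH and working directory the dump can fail partway through. A hedged sketch using absolute paths everywhere (the backup directory, share name and credentials are assumptions):
Code:
# 10:30 pm daily: full dump with absolute paths for svnadmin and the output file
30 22 * * * /usr/bin/svnadmin dump /home/administrator/svnrepository > /home/administrator/backups/svn-full.dump 2>> /home/administrator/backups/svn-err.log

# 11:30 pm daily: push the dump to a Windows share (assumes smbclient and a share named "d")
30 23 * * * smbclient //winbox/d -U backupuser%secret -c "put /home/administrator/backups/svn-full.dump svn-full.dump"
Checking the error log after the first run should show why the cron copy came out truncated.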
I have installed the CPAN module Spreadsheet::WriteExcel to generate some reports. When I execute my Perl script from the command line it works fine and generates the Excel file. When I run it via cron it doesn't work and an email is generated. My entry in the crontab is as follows:
Code:
2 14 * * * perl /scripts/postpaidRecon/postpaid.pl
The email I receive in /var/spool/mail/root about the failure is:
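The usual culprit is cron's minimal environment: PERL5LIB and the working directory differ from an interactive shell, so CPAN modules installed outside the system perl tree aren't found. A hedged variation of the crontab entry (the library path is a guess; point it at wherever the module actually installed):
Code:
# run from the script's directory with an explicit module search path
2 14 * * * cd /scripts/postpaidRecon && PERL5LIB=/root/perl5/lib/perl5 /usr/bin/perl postpaid.pl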
1. Every Sunday
2. Find all files older than 1 day
3. Gzip these files
4. Tar up the gzipped files into one tar file
5. Name the tarball with a date stamp indicating what day it was created, so we know that week's files are in the file
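A sketch meeting those five points, assuming the files live in a placeholder directory /var/data and the job runs early Sunday morning:
Code:
#!/bin/bash
# gzip files older than a day, then bundle the results into a date-stamped tar
cd /var/data || exit 1
find . -maxdepth 1 -type f -mtime +1 ! -name '*.gz' -exec gzip {} \;
tar cf "archive-$(date +%Y%m%d).tar" ./*.gz && rm -f ./*.gz
# crontab entry: 0 0 * * 0 /usr/local/bin/weekly-archive.sh   (midnight on Sunday)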
I have tried MAILTO="" and adding >/dev/null 2>&1 to the end of the line, but my server is still sending an email about my cron.weekly script. After making changes to /etc/crontab I restart cron with /etc/init.d/cron restart without error. I have also tried crontab -e, but that loads an empty file (su'ed into root). What step am I missing / what am I doing wrong? Here's my crontab:
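The crontab itself didn't make it into the post, but note that root's personal crontab (crontab -e) and /etc/crontab are separate files, and the cron.weekly scripts are normally launched by a run-parts line in /etc/crontab; the redirect has to go on that line. A hedged example of the usual Debian-style entry with its output silenced:
Code:
# /etc/crontab: silence everything the weekly run-parts prints
47 6 * * 7 root test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.weekly ) >/dev/null 2>&1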
Being relatively new both to Linux and this forum, I am sorry if I make a post that already exists, even though I couldn't find one. My problem is that I can't move downloaded files over to the root filesystem. To change aMSN's looks I downloaded a skin and unpacked it. I open root, go to usr -> amsn -> share -> skins; now I am supposed to copy the skin's folder over to that directory, but it fails. I also tried Alt+F2 and writing sudo konqueror, as advised, but there was no difference.
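Copying into directories under /usr needs root privileges, which a normal file-manager session doesn't have. A hedged command-line alternative, assuming the unpacked skin folder sits in ~/Downloads (adjust both names):
Code:
sudo cp -r ~/Downloads/MySkin /usr/share/amsn/skins/   # prompts for your password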
What I've got: from a crontab, run a script (I understand that part); this script needs to count the number of files in /outgoing/, then take 30 less that number, and move that many files from /readycalls/. I need to keep the Asterisk outgoing queue full of .call files without having too many in there at any given time.
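A minimal sketch of that top-up logic, using the directory names from the post and the stated cap of 30:
Code:
#!/bin/bash
# keep /outgoing/ topped up to 30 .call files, pulling from /readycalls/
current=$(ls /outgoing/*.call 2>/dev/null | wc -l)
need=$((30 - current))
if [ "$need" -gt 0 ]; then
    ls /readycalls/*.call 2>/dev/null | head -n "$need" | while read -r f; do
        mv "$f" /outgoing/
    done
fi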
On my Linux box I have a lot of cron jobs. I need to put all the cron jobs into script files. Is it possible to do this? Like [URL]. Now, how do I change this into a script file?
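Assuming the goal is to keep the commands in a script and have the crontab just call it, a hedged sketch (the script name and commands are placeholders standing in for whatever the crontab lines currently run):
Code:
#!/bin/bash
# /usr/local/bin/nightly.sh -- placeholder name; put the old crontab commands here
/usr/bin/some-job        # hypothetical stand-in for a real job
/usr/bin/another-job
Make it executable with chmod +x and replace the individual crontab lines with a single entry such as 0 2 * * * /usr/local/bin/nightly.sh.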
I would like to create a cron job that will delete all files within a directory one hour after they are created. I found this:
Code:
find /path/to/file/* -ctime +1 -exec rm {} \;
but it deletes all files. I want to make an exception: all files should be deleted except one file (let's say a.zip).
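Note that -ctime +1 measures days, not hours; -mmin +60 matches files modified more than an hour ago, and ! -name excludes the one file. A hedged version:
Code:
# delete files older than 60 minutes, sparing a.zip
find /path/to/dir -type f -mmin +60 ! -name 'a.zip' -exec rm {} \;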
I have a directory with a bunch of scripts and other random files. I want to move the files with the executable flag set to another folder en masse (or, if it's easier, the ones without the x flag). Is there an easy way?
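With GNU find, -perm /111 matches files with any execute bit set, and negating it catches the rest. A hedged example (the destination is a placeholder):
Code:
# move regular files with any execute bit set into ../exec-scripts/
find . -maxdepth 1 -type f -perm /111 -exec mv -t ../exec-scripts/ {} +
Swapping in ! -perm /111 moves the non-executable files instead.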
In my bash script I need to move files in a folder if they are not in use.
Code:
for entry in `ls /root/shared_storage/input`; do
    echo $entry
    run=`lsof /root/shared_storage/input/$entry`
    ru=${run:0:5}
    echo $entry
    if [ "$ru" == "" ]; then
        ........
It worked fine sometimes, but sometimes it just gets stuck at lsof. Is there any other way I can check whether $entry is being used by another process?
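lsof can stall when it tries to examine network mounts, which fits the shared-storage path here. A hedged alternative is fuser, which checks only the one file (it could also be wrapped in coreutils timeout for extra safety); the destination below is a placeholder:
Code:
# fuser returns 0 when some process has the file open; -s keeps it quiet
if ! fuser -s "/root/shared_storage/input/$entry"; then
    mv "/root/shared_storage/input/$entry" /dest/
fi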
So I was wondering, if I capture this output into a file (i.e. one filename per line), can anyone help me write a command which iterates through the file and moves the files one by one to a specified directory?
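A hedged one-liner reading the list line by line (the list name and destination are placeholders); the -- guards against filenames that begin with a dash:
Code:
while IFS= read -r f; do mv -- "$f" /path/to/dest/; done < filelist.txt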
I have this script that I use to find log files in the /var/log directory that are 2 days old, move them to /var/log/tmp, rename them to the system date.filename, and move them back to /var/log. Everything seems to work as planned, except that the files don't get moved out of tmp, and they keep getting renamed. This leads to very long filenames such as:
What is it about this script that isn't moving the files back to /var/log? Also, is there a better way of doing this than what I'm doing? Basically, I'm just trying to set up an audit trail on some of the files in /var/log, so that at the end of the month I can tar them and have our syslog server pick up the one giant monthly log.
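The script itself wasn't posted, but the symptom (ever-growing names) usually means each run re-matches files it already renamed. A hedged rewrite that renames in place, skips already-stamped names, and never parks anything in a temp directory:
Code:
#!/bin/bash
# stamp 2-day-old logs with the date, in place; skip names already stamped
today=$(date +%Y%m%d)
find /var/log -maxdepth 1 -type f -mtime +2 | while read -r f; do
    base=$(basename "$f")
    case "$base" in
        [0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9].*) continue ;;  # already renamed
    esac
    mv "$f" "/var/log/$today.$base"
done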
I have Ubuntu 10.10 installed on my laptop. I recently installed the KDE desktop from the Software Center. Today I noticed something strange: I tried to move a file to the trash and got this error message: "The trash has reached its maximum size! Cleanup the trash manually." I don't have any files in the trash. I went back to GNOME and was able to delete the file. I opened Dolphin while still in GNOME and couldn't delete anything, so I know that this isn't a KDE problem.
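KDE enforces a configurable size limit on the trash under ~/.local/share/Trash, and stale entries there can trigger that message even when the trash looks empty from the desktop. A hedged manual cleanup (this permanently deletes whatever is in the trash):
Code:
rm -rf ~/.local/share/Trash/files/* ~/.local/share/Trash/info/*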
I use AVG Free as an antivirus scanner. I looked for some time for a scanner that can both scan for and remove viruses. The problem is that AVG 8.5 will only detect viruses, not remove them... I use a one-liner looking like this:
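The one-liner itself didn't survive the post. If removal is the goal, one swap-in option worth naming plainly is ClamAV, whose command-line scanner can delete what it flags; a hedged example (the path is a placeholder):
Code:
# scan recursively and delete infected files; use --move=DIR to quarantine instead
clamscan -r --remove /home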
I have to move files between two file systems, /inst and /inst2. When I perform 'cp -a /inst /inst2' it copies everything, even hidden files, and preserves access permissions. But when I perform 'mv /inst /inst2' it also preserves perms, yet moves everything besides hidden files. Questions: why is that so? What tool should I use when moving a file system's contents from one fs to another (rsync)?
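mv on the directory itself takes hidden files along, so the skipped dotfiles suggest the command actually run was something like mv /inst/* /inst2, where the shell glob * does not match names starting with a dot. rsync sidesteps the glob entirely:
Code:
# copy everything including dotfiles, preserving permissions, owners and times
rsync -aH /inst/ /inst2/
After verifying the copy, the source can be removed.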
The system currently has a directory holding all the invalid files. How bad is it to move a single file into a directory that already contains 3 million files?
I recently had data recovered and it was sent back to me on what I think is an NTFS drive. I copied all the files over to a file share I have on a Linux box, that's ext4. Now I have that share mounted on my OSX machine, and I can't move or rename most of the files. However, in a couple cases I was able to rename a folder after the third try. Another time I was able to rename a folder once, but not again. All the permissions are showing up the same on the command-line -- I can't see any differences between the permissions on any of the files/folders. Note that I can create new folders and add files no problem, and then rename and move those all I want.
I have an Ubuntu NAS set up with two 1.5TB drives in a mirrored array. We recently needed more storage and will constantly be adding to this machine. We added two 2TB drives in a striped array. What I'd like to do is find all directories totaling 10GB+ on the mirrored array and move them over to the striped array, to free space on the mirrored array for smaller, more important data. I've tried:
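Whatever was tried didn't survive the post, but du plus awk can produce the candidate list. A hedged sketch with placeholder mount points /mnt/mirror and /mnt/stripe (simple directory names without spaces assumed):
Code:
# list first-level directories of 10 GB or more, then move them to the stripe
du -BG --max-depth=1 /mnt/mirror | awk '$1+0 >= 10 {print $2}' | while read -r d; do
    [ "$d" = "/mnt/mirror" ] && continue   # du's last line is the total for the root
    mv "$d" /mnt/stripe/
done
Since the move crosses filesystems, mv copies and then deletes, so it is worth running during a quiet period.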
I'm totally new to Linux; in fact I'm a Windows admin. Scenario: I need to run a script that will automatically move files that are 30 days old from one particular folder to another particular folder.
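A hedged one-liner for that (both paths are placeholders); -mtime +30 matches files last modified more than 30 days ago:
Code:
find /path/to/source -maxdepth 1 -type f -mtime +30 -exec mv -t /path/to/dest {} +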
I'd like to move a selection of files from all the sub-directories within an overall directory to a single destination. I don't want any of the directory structure, just the files themselves. This is what I tried so far:
Code:
mv /dir1/*/igs*.sp3.Z /dir2
There are other .sp3.Z files in the * directories within /dir1, but I just need the ones that start with igs.
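If the glob fails because matching files sit more than one level deep, or the expansion grows too long, find is the safer route:
Code:
# pull every igs*.sp3.Z from anywhere under /dir1 into /dir2, flattening the structure
find /dir1 -type f -name 'igs*.sp3.Z' -exec mv -t /dir2 {} +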