Programming :: Running Shell Script Using Cron Job
Jul 8, 2011
I am using a shell script to run my Java program. I have written a cron job to invoke the shell script every day at 7pm. The cron job runs every day at 7pm, but it does not invoke my shell script, and I am not getting any error message. I am able to run the same shell script from the command prompt using bash, and I can also invoke the same script from a mainframe Universal Command job. The same shell script and the same cron job work fine on my Dev server, but on my QA server it does not work.
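A common first step in a situation like this is to make cron's failures visible, since cron runs jobs with a very small environment (PATH, JAVA_HOME and so on are usually not set) and silently discards errors unless mail is configured. A hedged example of a crontab entry that captures everything the script prints; the script and log paths are placeholders, not the original ones:
Code:
# Run at 7pm daily and append stdout and stderr to a log so failures show up
00 19 * * * /bin/bash /path/to/run_java_job.sh >> /tmp/run_java_job.log 2>&1
Comparing that log between the Dev and QA servers usually points straight at the difference (missing PATH entries, a different JAVA_HOME, or a permissions problem).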
I've been having a bit of trouble running a shell script with cron. A friend of mine does a community radio show and the station has a live stream but no podcasts, so I've set up a script to record the stream while I'm away and encode it as an MP3 using mplayer and lame - that's what I'm trying to do, anyway.
Here's the script, but it doesn't seem to run - at least, I don't see any of the files it should be outputting. Would they be in the cron.weekly directory (where I have the script) or in my home directory?
#This is a script to record 'The Unnamed Show'
#It will record the show from the live stream, then convert the output to an MP3
#Finally, it will delete any files no longer required
HOME=/home/byron/
I'm having some difficulties running a shell script from cron, which I have been unable to resolve for almost 2 days now.
For testing purposes, I'm trying this with a simple shell script, test.sh, with chmod 777.
The script is located at /var. When I type /var/test.sh it runs perfectly and prints asdasdasd. When I type /var/test.sh > /home/log it writes asdasdasd to /home/log - that works.
The problem occurs whenever I add it as a cron job to /var/spool/cron/crontab/root.
The entry is: 11 10 * * * /var/test.sh > /home/log - however, at 10:11 there is no file at /home/log.
Cron as a service works; for example, every day at 4am it makes this backup: sshpass -p xxxx rsync -avz -e ssh root@x.x.10.7:/data/backup/ /home/backups/isp_admin
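When the same command behaves differently under cron and in a shell, comparing the two environments usually explains it. One hedged way to do that (the file paths are arbitrary) is to let cron dump its environment once and diff it against the interactive shell's:
Code:
# Temporary crontab entry: capture the environment cron gives to jobs
15 10 * * * env > /tmp/cron_env.txt
# Then, from an interactive shell:
#   env > /tmp/shell_env.txt
#   diff /tmp/cron_env.txt /tmp/shell_env.txt
It is also worth adding 2>&1 to the test.sh entry itself, so that any error message from the script lands in /home/log instead of disappearing.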
My root Vixie cron crontab is set to perform a system snapshot via fsarchiver:
0 0 * * * fsarchiver savefs -o -A /backups/p30_root.fsa /dev/sda2 /dev/sda3
The command itself works fine, generating about a 7G snapshot of my Suse server. I then wish to rsync this to a NAS I have located in another building:
0 3 * * * rsync -av -e ssh --delete /backups/ root{at address}:/DataVolume/os_backups/
(Yes, I'm rsyncing as root. I absolutely loathe it, but I got thrown into a quasi-SA position with a ridiculous to-do list and no time to do it. I'm having to make things just work and then go back and try to improve them/learn how. I couldn't make it work in a non-root way quickly enough, so for now I'm having to cron rsync jobs as root because of all the differing file permissions on this Samba/MySQL server. I set the NAS to accept ssh only from the server IP, and we're behind a campus firewall... It's serious trial by fire.) The crontab also has rsync commands for the Samba areas, our specialized chemistry software, and the affiliated MySQL databases.
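If the fixed three-hour gap between the snapshot and the rsync ever becomes a problem (for example when fsarchiver runs long), one option is a single wrapper script so the transfer only starts after the snapshot has finished. This is only a sketch built from the two commands above, with the NAS address left as a placeholder:
Code:
#!/bin/bash
# snapshot_and_sync.sh - take the snapshot first, then push it to the NAS
# only if fsarchiver succeeded (NAS address is a placeholder)
fsarchiver savefs -o -A /backups/p30_root.fsa /dev/sda2 /dev/sda3 \
  && rsync -av -e ssh --delete /backups/ root@nas.example:/DataVolume/os_backups/
A single crontab entry such as 0 0 * * * /root/bin/snapshot_and_sync.sh >> /var/log/snapshot_and_sync.log 2>&1 would then replace the two separate ones.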
After that I waited until the scheduled time, but the script gives the message "no such file in directory"; when I execute the same thing manually, it runs fine and finds the files. Below are my script details. The file name is ejrename.sh. code...
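A "no such file or directory" error from cron, when the same script works by hand, usually means the script depends on the interactive shell's working directory or PATH. Since ejrename.sh itself was only referenced and not posted, the sketch below is generic; the directory, PATH and file pattern are placeholders:
Code:
#!/bin/bash
# Cron starts scripts with a minimal environment and not in the directory
# you tested from, so change to a known directory and set PATH explicitly.
cd /path/to/ej/files || exit 1
PATH=/usr/local/bin:/usr/bin:/bin

for f in EJ_*.dat; do        # placeholder pattern for the files being renamed
    mv "$f" "renamed_$f"
done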
But there is a problem here. I have to close and reopen Internet Explorer to see the effect of my changes to iptables! Is there a way to do this without having to restart IE?
1. I need to check whether the Oracle database is running; if it is running, it will run the auto mount script:
nohup /u01/app/oracle/product/11.2.0/dbhome_1/bin/dbfs_client dbfs@xxxxx -o allow_other,direct_io /u01/app/oracle/DBFS/XXXXX < /home/oracle/passwd_dbfs.txt &
nohup /u01/app/oracle/product/11.2.0/dbhome_1/bin/dbfs_client dbfs@xxxxxx -o allow_other,direct_io /u01/app/oracle/DBFS/XXXXX < /home/oracle/passwd_dbfs.txt &
2. Every 1 hour I need to check whether the above mount points are mounted; if they are not mounted, they should be mounted.
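A rough sketch of an hourly check along those lines, reusing the dbfs_client command from above; the use of pgrep and mountpoint is an assumption about how to test "is Oracle up" and "is it mounted", not something prescribed by the original poster:
Code:
#!/bin/bash
# check_dbfs_mount.sh - hourly remount check (sketch only)
MOUNTPOINT=/u01/app/oracle/DBFS/XXXXX

# 1. Only proceed if the Oracle instance appears to be running
#    (looking for a pmon background process is one common heuristic).
if ! pgrep -f ora_pmon > /dev/null; then
    exit 0
fi

# 2. If the DBFS mount point is not mounted, remount it.
if ! mountpoint -q "$MOUNTPOINT"; then
    nohup /u01/app/oracle/product/11.2.0/dbhome_1/bin/dbfs_client dbfs@xxxxx \
        -o allow_other,direct_io "$MOUNTPOINT" < /home/oracle/passwd_dbfs.txt &
fi
Scheduled hourly with something like 0 * * * * /home/oracle/check_dbfs_mount.sh >> /tmp/check_dbfs_mount.log 2>&1.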
I have a script that I would like to run using 'cron'. I want to use 'scp' to transfer files from one machine to another. I have set up the SSH keys on both machines. When I run the script from a bash terminal, it works flawlessly. But when I schedule a 'cron' job to run the same script, 'scp' does not transfer the files.
'Return value' is 0 when the script is run from bash directly. But when it runs from 'cron', the 'Return value' is 1. That means, surely, that 'scp' is throwing an error. I don't know which error is being encountered. Could anybody let me know how to make it work?
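scp failing only under cron almost always comes down to the key or agent not being available in cron's stripped-down environment. A hedged way to pin down the exact error; the key path, host and file names are placeholders:
Code:
#!/bin/bash
# transfer.sh - sketch only; key path, remote host and files are placeholders
KEY=/home/myuser/.ssh/id_rsa   # point scp at the key explicitly (no agent under cron)

scp -i "$KEY" -o BatchMode=yes -v \
    /data/outgoing/report.txt remoteuser@remote.host:/incoming/ \
    >> /tmp/scp_cron.log 2>&1
echo "scp exit status: $?" >> /tmp/scp_cron.log
With BatchMode=yes any prompt (passphrase, unknown host key) turns into a logged error instead of a silent hang, and the -v output usually shows whether the key, known_hosts or HOME is the problem.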
I'm creating a script; everything worked fine on the command line, but it does not work in cron. Below you can see the script.
[Code]...
So far I have found that when I use cron, the following part does not work - nothing goes to the processedfiles file:
ls -l /var/lct/mou2/processed | grep $TODAY | awk '{print " " $8}' > /home/trans/mou/processedfiles
ls -l /var/lct/mou2/processed | grep $YESTERDAY | awk '{print " " $8}' >> /home/trans/mou/processedfiles
This works perfectly on the command line. The cron job and the command line are run by the same user.
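A likely suspect here is the locale: under cron, LANG/LC_TIME is usually unset, so ls -l prints dates in a different format, the grep on $TODAY no longer matches anything, and the filename column can shift between $8 and $9 (the same effect described in a later post on this page). A hedged workaround, assuming $TODAY and $YESTERDAY come from date: pin the locale in the script, or skip the ls parsing entirely with find:
Code:
#!/bin/bash
# Sketch: make the matching independent of cron's locale (paths as in the post)
export LC_TIME=POSIX   # pin the locale so the script behaves the same from cron and a shell

# Alternative that avoids parsing ls at all: files modified in the last two days
find /var/lct/mou2/processed -maxdepth 1 -type f -mtime -2 -printf " %f\n" \
    > /home/trans/mou/processedfiles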
I have a cron job that runs a shell script. But it only runs the first line of that shell script and not the rest of the file. I'm a little stumped as to why. If I run the shell script manually, it runs and executes every single line as it should. I think I must need some additional syntax to make this run correctly?
Here is the crontab ...
Code:
root@kchlinux:~/macs# crontab -l
# m h dom mon dow command
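The crontab entry itself is cut off above, but when only the first line of a script runs from cron, the usual suspects are a missing #!/bin/bash line (so the file is interpreted by plain sh), later commands not being found because of cron's minimal PATH, or DOS line endings in the file. A hedged checklist, with the script path and schedule as placeholders:
Code:
# 1. Make sure the script names its interpreter on line 1:  #!/bin/bash
# 2. Strip any DOS carriage returns:
#      sed -i 's/\r$//' /root/macs/myscript.sh        (path is a placeholder)
# 3. In the crontab, name the interpreter explicitly and log the output:
0 6 * * * /bin/bash /root/macs/myscript.sh >> /var/log/myscript.log 2>&1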
Setup: 10.04 server with "bash" as /bin/sh. When I run "ls -l" in a shell I get the following format:
Code:
-rw-r----- 1 syslog adm 0 2010-06-13 06:53 /var/log/user.log
Whereas if "ls -l" executes from a cron job the format is:
Code:
-rw-r----- 1 syslog adm 0 Jun 13 06:53 /var/log/user.log
Notice the different time format. Now I could fix this by changing the cron job to
Code:
ls -l --time-style=+%Y-%m-%d %H:%M
... but I'm interested in knowing why this behavior occurs. What's different between the cron job and the shell?
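The difference is the environment: cron starts jobs with almost nothing set, so whatever locale, TIME_STYLE variable or ls alias your login shell has is absent under cron, and ls falls back to its default "Jun 13 06:53" style. If you do go the --time-style route in the crontab itself, note two pitfalls: % is special in crontab lines and must be escaped as \%, and the format string contains a space, so it needs quoting. A hedged version of such an entry (schedule and output path are placeholders):
Code:
# Escape % and quote the format string when it appears directly in a crontab line
0 7 * * * ls -l --time-style="+\%Y-\%m-\%d \%H:\%M" /var/log > /tmp/ls_cron.txt 2>&1
# Alternatively, set the style for all jobs with a variable line at the
# top of the crontab, e.g.  TIME_STYLE=long-iso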
I want to create a shell script that I can run as a cron job to do automated backups, but I'm not sure how to proceed. I have the source code and have pulled out the backup source files. Here is the link for the source code; the download is at the bottom of that page. The backup files can be found in the src folder, listed as backup.cpp and backup.h [URL]
If someone can do this, I believe it will make a nice addition to the Basket application for all.
Also, I am not running Kubuntu; I am running Karmic 9.10 64-bit.
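Without reproducing the logic in backup.cpp, a hedged starting point for a cron-driven backup is a plain tar-based script; every path below is a placeholder rather than something taken from the Basket sources:
Code:
#!/bin/bash
# backup_baskets.sh - minimal sketch, not derived from backup.cpp
SRC="$HOME/.kde/share/apps/basket"   # assumed data directory, adjust as needed
DEST="$HOME/backups"
STAMP=$(date +%Y%m%d-%H%M)

mkdir -p "$DEST"
tar czf "$DEST/basket-$STAMP.tar.gz" "$SRC"

# Keep only the 14 most recent archives
ls -1t "$DEST"/basket-*.tar.gz | tail -n +15 | xargs -r rm -f
A crontab entry such as 30 1 * * * /home/user/bin/backup_baskets.sh >> /tmp/basket_backup.log 2>&1 would run it nightly.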
I have an Ubuntu server running Couch Potato, Sick Beard and Sabnzbdplus. Everything "works" pretty well, in the sense that CP and SB push the NZBs to Sabnzbdplus, but Sab crashes regularly (I haven't found the solution or the cause for this problem, so if you have any advice regarding that, it's welcome). To counter this problem (Sab crashing) I have written a script which checks whether Sab is running and starts it if it isn't:
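The watchdog script itself was not included in the post, but a typical one looks roughly like the sketch below, run from cron every few minutes; the process pattern, install path and log file are assumptions:
Code:
#!/bin/bash
# sab_watchdog.sh - sketch only; process name, path and log are placeholders
if ! pgrep -f SABnzbd.py > /dev/null; then
    echo "$(date): SABnzbd not running, restarting" >> /var/log/sab_watchdog.log
    python /opt/sabnzbd/SABnzbd.py -d     # -d starts SABnzbd as a daemon
fi
With a crontab entry along the lines of */5 * * * * /usr/local/bin/sab_watchdog.sh.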
I am still an openSUSE noob and have been struggling to get cron jobs running for the past couple of hours... without any success. Hopefully a more experienced user can give me a hand. As root I used crontab to set up some cron jobs:
Code:
crontab -u api-cebian -e
The resulting cronjob looks like this:
Why is my second cron job not running? The first one runs fine but the second one does not run at all. If I manually run the script using ./check.sh then it does what it is supposed to, but it will not run from cron. It should run every weekday at 9:45am; it looks good to me, but clearly I am doing something wrong.
Code:
# m h dom mon dow command
#This will backup my hosted websites.
0 2 * * * /home/bob/scripts/websitebakscript.sh
#This will run check and upload its contents to the net.
45 9 * * 1,2,3,4,5 /home/bob/scripts/check.sh
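Nothing in that crontab looks obviously wrong, so the next step is usually to find out whether cron even attempts to start check.sh at 9:45. Two hedged checks; log locations and the MAILTO user are placeholders and vary by distribution:
Code:
# 1. On Debian/Ubuntu, cron logs to syslog - see whether the job was started at all
grep CRON /var/log/syslog | grep check.sh

# 2. Capture the job's own output (MAILTO at the top of the crontab, log path arbitrary)
MAILTO=bob
45 9 * * 1-5 /home/bob/scripts/check.sh >> /tmp/check_cron.log 2>&1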
Cron refuses to work for a newbie. I am trying to set up a cron job to back up a folder in my home directory ('/home/scratch') to a folder ('/home/internet_backup') that is mounted (using nfs) to a network folder. The folder ('/home/internet_backup') is mounted correctly to the network folder on system startup, so that part works.
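A hedged sketch of such a backup job; the schedule is arbitrary, and the guard against the NFS share not being mounted is an extra precaution, not part of the original setup:
Code:
#!/bin/bash
# backup_scratch.sh - copy /home/scratch to the NFS-mounted backup folder (sketch)
SRC=/home/scratch/
DEST=/home/internet_backup/

# Skip the run if the NFS share is not actually mounted, so the copy cannot
# silently fill the local disk instead.
mountpoint -q "$DEST" || exit 1

rsync -av --delete "$SRC" "$DEST" >> /tmp/backup_scratch.log 2>&1
Run, for example, with 0 3 * * * /home/user/bin/backup_scratch.sh in the user's crontab.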
When I refresh the page 'updatestatistics.php' in my web browser it does generate a new page ('statistics.html', which is included in 'statistics.php'). But when I let the cron job do the job, nothing happens. I looked into the log file, and it shows this:
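The log output itself is cut off above, but for reference, the two usual ways to drive a page like updatestatistics.php from cron are the PHP command-line interpreter or a plain HTTP request; the schedule, paths and URL below are placeholders:
Code:
# Option 1: run the script with the PHP CLI
*/15 * * * * /usr/bin/php /var/www/updatestatistics.php >> /tmp/updatestats.log 2>&1

# Option 2: request it over HTTP, exactly as the browser does
*/15 * * * * wget -q -O /dev/null http://localhost/updatestatistics.php
The CLI route often behaves differently from the browser (different working directory, no $_SERVER values), which is a common reason "nothing happens" from cron even though the page works when refreshed by hand; the HTTP option avoids that.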
After saving a few sample scripts in crontab, I discovered that cron is not running properly on my Ubuntu 9.10 (32-bit) after a recent reinstallation. Please note that the files "/var/log/cron" and "/etc/defaults/rc.conf" are empty on my system.
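One thing worth knowing is that Ubuntu does not write to /var/log/cron by default; cron messages go to /var/log/syslog, so that file being empty is normal. Some hedged checks to see whether the daemon is running at all:
Code:
# Is the cron daemon running?
ps aux | grep '[c]ron'
sudo service cron status      # "sudo /etc/init.d/cron status" also works on 9.10

# Does syslog show any cron activity?
grep CRON /var/log/syslog | tail -n 20

# Restart it if needed
sudo service cron restart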
I want to set up a cron job for my webserver, but it doesn't seem to work...
I now have the following as a test (every 16th minute of the hour, write "test" to a file), but nothing shows up. When I write out the cron, it is written to the file /tmp/crontab.QXsBIO/crontab. How do I make cron work? It must be very easy, but I'm having real trouble getting it fired up.
My Linux version is Debian Lenny and the cron file looks like this (crontab -e):
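The file contents did not survive in the post, and note that /tmp/crontab.QXsBIO/crontab is only the temporary copy the editor opens during crontab -e, not where the installed job lives. For reference, an entry for "write test to a file at minute 16 of every hour" would normally look like this (the output path is arbitrary):
Code:
# Minute 16 of every hour, append the word "test" to a file
16 * * * * echo test >> /tmp/cron_test.txt 2>&1
# Use */16 * * * * instead if you meant "every 16 minutes"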
I have created a cron job via ssh by running crontab -e and adding my script like this: 05 10 * * * /scripts/old-files-delete.sh. The file has 755 permissions, and if I run the script manually it works fine. I have checked the cron log and it does not show up as running at all.
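When the job never even appears in the cron log, it is worth confirming that the crontab really was saved for the user you expect, and that the file ends with a newline; some cron versions silently ignore a final line that doesn't. Hedged checks (the second user name is a placeholder):
Code:
# Confirm the entry is actually installed for the current user
crontab -l

# Check the crontab of another user, if the job should run as someone else
sudo crontab -l -u someuser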