Software :: Cronjob And Feh / When It's Run Via Cron It Fails?
Aug 21, 2010
I wrote a small bash script that uses feh to change my background each time it's called. I use a cron job to have it execute every 5 minutes.
When I run the script manually it works perfectly, but for some reason it fails when run via cron. I can see that it is executing, but it seems to error out somewhere.
It's kind of a drag to have a working script that suddenly stops working when run as a cron job.
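A likely culprit (an assumption here, since the error itself wasn't posted) is that cron runs with a bare environment: no DISPLAY, so feh has no X display to talk to. A minimal sketch of the usual fix, assuming a single local session on :0; the paths are placeholders:

Code:
#!/bin/bash
# Hypothetical fix: cron provides no DISPLAY, so set it before calling feh.
export DISPLAY=:0
# Some setups also need the session's X authority file:
export XAUTHORITY=/home/user/.Xauthority
# Pick one random picture and set it as the background
feh --bg-fill "$(shuf -n 1 -e /home/user/Pictures/*)"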
I'm having a small issue where the backup jobs that I set to run in the crontab of the backup user do not appear to be running. Here's how I set it up (with crontab -e as the backup user):
run amanda every night (check at 2:45 and backup at 3)
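The crontab lines themselves didn't make it into the post. For reference, a sketch of what they might look like, assuming Amanda's stock amcheck/amdump commands and a hypothetical config named DailySet1 (a per-user crontab, so no user field):

Code:
# check at 2:45, run the backup at 3:00
45 2 * * * /usr/sbin/amcheck -m DailySet1
0 3 * * * /usr/sbin/amdump DailySet1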
I have an Ubuntu server running Couch Potato, Sick Beard and Sabnzbdplus. Everything "works" pretty well, in the sense that CP and SB push the NZBs to Sabnzbdplus, but Sab crashes regularly (I haven't found the cause or a solution for this, so any advice on that front is welcome). To counter this problem (Sab crashing) I have written a script which checks whether Sab is running and, if it isn't, starts it:
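The script itself wasn't included, but a minimal watchdog along those lines might look like this; the process name, binary path and daemon flag are assumptions:

Code:
#!/bin/bash
# Hypothetical watchdog: start SABnzbd if no matching process is found
if ! pgrep -f sabnzbdplus > /dev/null; then
    /usr/bin/sabnzbdplus -d    # -d: run as a daemon
fi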
I'm using Ubuntu 10.04 LTS server and PostgreSQL 8.4. I have a .sh script that is run by cron every other hour; that works fine. The script includes an rsync command that copies a PostgreSQL dump .tar file to a remote archive location via ssh. That part fails when run by cron; I think because it is (quietly) asking for the remote user's password and not getting it. I set up the public/private ssh key arrangement, and the script succeeds when run manually as the same user the cron job uses, without asking for the password. I am also able to ssh between the two servers (both directions, same username) without getting a password prompt, so why doesn't rsync work? I even put a .pgpass file in the root of that user's home directory with that user's password, and the user/password are identical on both servers.
I think the problem is rsync is not able to use the ssh key correctly. I tried adding this to my script but it didn't help.
Code:
Here is the rsync command embedded in the .sh script.
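The command itself is missing above, but here is a hedged sketch of the usual shape: since cron runs with a minimal environment (no ssh-agent, possibly a different HOME), pointing rsync at the key explicitly and forcing non-interactive ssh often resolves exactly this symptom. The key path, file and host below are placeholders:

Code:
rsync -az -e "ssh -i /home/backupuser/.ssh/id_rsa -o BatchMode=yes" \
    /var/backups/pgdump.tar backupuser@archive.example.com:/srv/archive/

With BatchMode=yes, ssh fails loudly instead of quietly waiting for a password, which also makes any cron mail more informative.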
I have added some executable scripts to /etc/cron.daily but don't get the stdout/stderr output from them as mail (or anywhere else I have looked). At least one of them is running, because I can see that it has added a file to the disk.
The peculiar thing is that I do get the output from /etc/cron.daily/0logwatch (part of the logwatch package) as an email each day.
The MAILTO line in /etc/crontab is "MAILTO=root" (unchanged from default). Same for /etc/anacrontab.
I do have an alias at the end of /etc/aliases which redirects root's mail to my own account, but this alias works fine for mail I send manually. (It also appears to work fine for the output from the file /etc/cron.daily/0logwatch.)
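One thing worth checking (an assumption, not something confirmed above): on Debian/Ubuntu the cron.daily scripts are dispatched through run-parts, which silently skips files whose names it considers invalid, e.g. names containing a dot. Its --test flag shows what would actually be executed:

Code:
run-parts --test /etc/cron.daily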
I put in my cron entries to run my backup script, which rsyncs my data to my 2nd drive. On a hunch I checked my backup drive (which mounts automatically via fstab) and realized the backup had not run in a while. I checked cron and there were no entries for it. That got me wondering: should I ever be worried about a cron update coming down and overwriting my existing cron file with my backup entries in it?
I know crontab -e sets a cronjob in /var/spool/cron but how do I set a cronjob to run from /etc/crontab? Is there a command used for this or would I have to manually edit a certain file?
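There is no dedicated command for /etc/crontab; you edit the file directly as root. Its format differs from a per-user crontab by one extra field naming the user to run as. A sketch, with a hypothetical script path:

Code:
# m h dom mon dow user command
17 3 * * * root /usr/local/bin/backup.sh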
I want to use a cronjob to start a script that runs streamripper to record radio shows when I'm not at home. I'm running 10.04 server, 32-bit.
I've setup a cronjob via Webmin to start the recording. It runs the following script:
Code:
ORDNER=`date +%Y"_"%m"_"%d"-"%H"_"%M"(FM4)"` # 'ORDNER' means folder; names a date/time-stamped folder to save the stream into
streamripper http://mp3stream1.apasf.apa.at:8000/ -d /mnt/Samsung/Radio/$ORDNER --xs_padding=2000:500 -a -q -s
Is there any logfile or way to see why the script is killed or what stops the recording? When executing the script manually the recording runs just fine until I kill the process.
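Cron itself typically only logs that the job started, not why a child process died. A simple way to get a trace (schedule and paths are placeholders) is to redirect the script's output into a log file right in the crontab entry:

Code:
# append stdout and stderr to a log for later inspection
45 19 * * 5 /home/user/record_fm4.sh >> /var/log/streamripper.log 2>&1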
Is it possible to send a libnotify notification from a cronjob? I'd like a notify message when my daily backup is done. When I run my script from a terminal everything works fine; when (ana)cron runs it, the backup works as well, but the notification is suppressed. My guess is that this happens because the jobs are run as root. Is it nevertheless possible to get a notification message?
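notify-send (libnotify's command-line client) has to reach the logged-in user's D-Bus session bus, which a root cron job doesn't inherit. A sketch of the usual workaround; display :0 and UID 1000 are assumptions, and on older releases the bus address is located differently:

Code:
#!/bin/bash
# Hypothetical: point notify-send at the desktop user's display and session bus
export DISPLAY=:0
export DBUS_SESSION_BUS_ADDRESS=unix:path=/run/user/1000/bus
notify-send "Backup" "Daily backup finished"

If the job runs as root, the session bus may still reject it; running the notify-send line as the session's owner (e.g. via su - user -c '...') gets around that.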
I wrote a little backup script which works fine if I start it from the shell manually, but I'm not able to get it to start automatically via cron in Ubuntu 10.10.
I'm trying to change the desktop background on my fluxbox desktop every minute, using a cronjob to pick a random picture from my Pictures folder. This is my crontab -e file:
Code:
# Edit this file to introduce tasks to be run by cron.
#
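The actual job line was cut off above. A hedged sketch of what such an entry could look like, assuming feh as the wallpaper setter and a session on display :0 (cron provides no DISPLAY of its own, so it must be set in the entry; the Pictures path is a placeholder):

Code:
* * * * * env DISPLAY=:0 feh --bg-fill "$(shuf -n 1 -e /home/user/Pictures/*)"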
I'm working on a bash script that will be run regularly via a cronjob. As there is no easy way for the user to see it running in the background, I'd like to have some kind of pop-up window that says something like "Process started" and then "Process finished" when it's complete. I've read through some similar threads and came across xmessage. It kind of does what I want, with two exceptions: I'd like the "Process started" message to stay up for the whole process, but it looks like the script won't continue until the xmessage window is closed; and I can't get any xmessages to appear when the script is run from a cronjob. They only appear if I run the script manually from a terminal window, which is no good for me. If anyone has any suggestions that would be great, and it doesn't have to be xmessage, any text box would do. I'm working with Redhat 4 and 5, if that makes a difference.
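Both gripes have conventional workarounds, sketched below under assumptions (display :0 is a guess, and the work section is a stub): backgrounding xmessage with & lets the script carry on, and exporting DISPLAY lets the window appear from cron:

Code:
#!/bin/bash
export DISPLAY=:0
xmessage -center "Process started" &
MSG_PID=$!
# ... long-running work goes here ...
kill "$MSG_PID"                        # close the "started" popup
xmessage -center -timeout 10 "Process finished"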
I'm trying to set up a cronjob that needs to run every 40 minutes between 10am and 3.30pm. I have generally used cron for simple schedules, but now I'm a bit lost. Is there a way to set this up, ideally as a single line rather than several?
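Cron's minute field resets every hour, so a true 40-minute interval can't be expressed in one line; the closest is to spell out the matching minutes per group of hours. A sketch (the command path is a placeholder) covering 10:00 through 15:20:

Code:
0,40 10,12,14 * * * /path/to/job.sh
20 11,13,15 * * * /path/to/job.sh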
I have the following cron job scheduled on my Postfix mail server: Code: 00 18 * * * /usr/bin/clamscan -r --remove /home/. This just runs a scan on my entire /home/ directory and removes any infected files it finds. Since this is being run at 6pm via cron, how can I get the results of this job emailed to me as text? Can anyone recommend a command I can add to the end which will dump the results into a file or email and send it to a specific email address? This server is my company's Postfix MTA for everyone.
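Since the box already runs an MTA, one hedged approach is to pipe the scan output straight into mail (assuming a mailx-style client is installed; the address is a placeholder):

Code:
00 18 * * * /usr/bin/clamscan -r --remove /home/ 2>&1 | mail -s "clamscan report" admin@example.com

Alternatively, setting MAILTO=admin@example.com at the top of the crontab mails each job's output without any piping.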
I have to run one testcase which opens a web browser on a Linux machine. When I run it via Cygwin it works, but I can't always keep it open, so I want to schedule a cronjob to run this testcase.
Some of my cronjobs are filling up my / directory with files. How do I make this stop? One of my cron jobs uses wget: [URL].. The bexcb0.php file writes a file and then echoes a result if it is successful. These echoed results are being put into bexcb0.php files in my /root folder and are piling up.
My / folder is filling up with files: bexcb0.php, bexcb0.php.1, bexcb0.php.2, bexcb0.php.3, bexcb0.php.4, bexcb0.php.5, and so on. How do I make this stop? If I just remove the echo, will they stop writing to the / folder?
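Those numbered files are wget's default behaviour: it saves each response under the remote file name, appending .1, .2, ... when the name is already taken. Removing the echo would only make the saved files empty; they would still be created. A hedged fix (the URL and schedule are placeholders) is to discard the response body instead:

Code:
*/5 * * * * wget -q -O /dev/null http://example.com/bexcb0.php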
I have added "@daily shutdown -r now" to my root crontab (sudo crontab -e) but it does not seem to ever run. When I look at the cron log using Webmin I can see that it tried to run and there was no error. Also, when I run it manually using Webmin the system reboots fine. I also tried using reboot -f in the crontab instead; that too worked when run manually but not on schedule. The reason I know it didn't run is that Webmin shows the system uptime. This is the output of the cron log:
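A frequent cause (an assumption here, since the log output wasn't posted) is cron's minimal PATH, which usually omits /sbin, so shutdown is simply never found. Giving the full path sidesteps it:

Code:
@daily /sbin/shutdown -r now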
Currently we have to comment out an rsync task every now and then. Is there a way to have a PHP script on the server so we can simply browse to it to uncomment the rsync task, and another one to comment it out again?
The rsync job is the only job in the crontab, so even deleting and re-creating the job wouldn't be a problem. I just want to avoid connecting via SSH and VPN every time I am asked to disable it.
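One hedged alternative that avoids touching the crontab at all: leave the entry in place but gate it on a flag file, which a web script only needs to create or delete (all names below are placeholders):

Code:
# run the task only while the flag file is absent
0 * * * * [ ! -f /var/tmp/rsync_disabled ] && /usr/local/bin/rsync_task.sh

The PHP page then just needs permission to touch/unlink /var/tmp/rsync_disabled, which is much easier to secure than letting the web server rewrite a crontab.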