When run, it compares my public IP address against a file (ipAddress.txt) to determine whether I am on my VPN or not. If I am not, i.e. it finds my public IP in the ipAddress.txt file, it is supposed to exit; otherwise it launches Firefox with the given URL.
I have tested this both from the command line and from the crontab. Running from the command line, I can start the script, let it launch the first URL, then disconnect from the VPN, and the next time it goes through the loop it detects that my VPN has disconnected and kills the script. However, from the cron job, if I have disconnected from the VPN, even though it finds my public IP address in the ipAddress.txt file, it still marches right on to the next step in the script, launches the browser, and does this over and over.
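A minimal sketch of the check described above (the script layout, file path, and URL are all placeholders, and curl against icanhazip.com is just one common way to fetch the public IP):

Code:
#!/bin/bash
# vpncheck.sh -- sketch of the on-VPN check; names here are assumptions.
PUBLIC_IP=$(curl -s https://icanhazip.com)

# If the public IP appears in ipAddress.txt, we are NOT on the VPN: bail out.
if grep -qF "$PUBLIC_IP" /home/user/ipAddress.txt; then
    echo "Public IP found in ipAddress.txt -- not on VPN, exiting."
    exit 1
fi

# Otherwise we are on the VPN: launch Firefox with the given URL.
firefox "https://example.com" &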
I'm really running into a wall trying to figure this out. I have a Bash script and have narrowed it down to the one command that doesn't seem to work via cron: my PGP decrypting line. It works fine if I run the command in a terminal, but if I run it via cron it doesn't output anything. crontab -e shows the cron job, and it runs and creates the log file, but with no output. Is there maybe something else I need to run? Permissions look set, unless cron is running as a different user (I was under the assumption that if the job shows up in the crontab while logged in as that user, then it runs as that user).
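One common culprit (an assumption here, since the exact line isn't shown) is cron's minimal environment: PATH is short, and GPG may not find its keyring because HOME isn't set the way it is in a terminal session. Using absolute paths and setting the environment explicitly often helps; a sketch, with placeholder user name and file paths:

Code:
#!/bin/bash
# Sketch: run the decrypt step with an explicit environment under cron.
export HOME=/home/myuser
export GNUPGHOME="$HOME/.gnupg"    # so gpg can find the keyring
/usr/bin/gpg --batch --yes --output /tmp/out.txt --decrypt /tmp/in.txt.gpg \
    2>> /home/myuser/decrypt.log   # capture stderr; it never reaches the log otherwise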
I am trying to display text using crontab. My setting:
*/2 * * * * /bin/echo "hello"
This setting sends a mail to the user instead of displaying it on the screen. I then tried changing the setting to:
*/2 * * * * /bin/echo "hello" > /dev/tty1
Now I can see the text on the screen, but this setting comes with a catch: what happens if the user changes terminals? For instance, if he logs in on tty2, he won't have write access to tty1, so he gets a mail saying "permission denied". Is there any way I can force the user to use a particular terminal, or can a cron job be set up such that the user gets the text regardless of which terminal he logs in on?
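One workaround (a sketch, not a guaranteed fix) is to stop hard-coding tty1 and instead look up whichever terminals the user is actually logged in on with who, then write to each of those. The user name and script path below are placeholders:

Code:
#!/bin/bash
# notify.sh -- write a message to every terminal the user is logged in on.
USER_NAME=myuser
who | awk -v u="$USER_NAME" '$1 == u {print $2}' | while read -r tty; do
    echo "hello" > "/dev/$tty" 2>/dev/null
done

with the crontab entry becoming:

Code:
*/2 * * * * /usr/local/bin/notify.sh

(Alternatively, wall broadcasts a message to all logged-in terminals at once.)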
I have followed this guide to set up automatic screen rotation on a fresh installation of Maverick on a Lenovo X200t. This involves creating the file:
Code:
$ cat /etc/acpi/events/x200t-swivel-down
# /etc/acpi/events/x200t-swivel-down
# called when tablet head swivels down
event=ibm/hotkey HKEY 00000080 00005009
action=/etc/acpi/x200t-swivel-down.sh

and the script
Code:
$ cat /etc/acpi/x200t-swivel-down.sh
#!/bin/sh
/usr/bin/xrandr -o inverted
touch /home/nikos/Desktop/swivel-down
xsetwacom set "Serial Wacom Tablet stylus" Rotate half
xsetwacom set "Serial Wacom Tablet touch" Rotate half
xsetwacom set "Serial Wacom Tablet eraser" Rotate half

(similar for swivel-up)
What I find strange:
- calling the script with ./x200t-swivel-down.sh works (both with and without sudo).
- rotating the screen to tablet mode only executes the touch commands (which I entered for debugging).
Obviously, the ACPI event is registered correctly and reacted upon. So why are the xrandr and xsetwacom commands being ignored?
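One classic gotcha worth checking (an assumption, but it fits the symptoms): scripts run by acpid execute as root outside the X session, so X clients like xrandr and xsetwacom have no DISPLAY or XAUTHORITY and fail silently, while plain commands like touch still work. A sketch of the script with the X environment set explicitly (the :0 display number is an assumption):

Code:
#!/bin/sh
# Give acpid's root context access to the running X session.
export DISPLAY=:0
export XAUTHORITY=/home/nikos/.Xauthority   # must match the session owner
/usr/bin/xrandr -o inverted
xsetwacom set "Serial Wacom Tablet stylus" Rotate half
xsetwacom set "Serial Wacom Tablet touch" Rotate half
xsetwacom set "Serial Wacom Tablet eraser" Rotate half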
I've recently noticed that whenever I log in via the fingerprint reader (AES2501), my system behaves strangely. For example, the GNOME panels take almost a full minute to appear after everything else has loaded, and there's no NetworkManager icon in the panel, even though nm-applet (and thus NetworkManager) is running, as shown by top; that can be (partially) solved by killing NetworkManager, restarting dbus, and then manually running NetworkManager. None of that happens when logging in with a password.
Short of not using my fingerprint reader, what can I do?
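For reference, the partial workaround described above, written out as a script (a sketch only; the exact service commands vary by release):

Code:
#!/bin/sh
# Sketch of the manual recovery steps described above.
sudo killall NetworkManager
sudo service dbus restart      # or: sudo /etc/init.d/dbus restart
sudo NetworkManager            # start the daemon again manually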
I have added some executable scripts to /etc/cron.daily but don't get the stdout/stderr output from them as mail (or anywhere else I have found). At least one of them is running (because I can see that it has added a file to the disk).
The peculiar thing is that I do get the output from /etc/cron.daily/0logwatch (part of the logwatch package) as an email each day.
The MAILTO line in /etc/crontab is "MAILTO=root" (unchanged from default). Same for /etc/anacrontab.
I do have an alias at the end of /etc/aliases which redirects root's mail to my own account, but this alias works fine for mail I send manually. (It also appears to work fine for the output from the file /etc/cron.daily/0logwatch.)
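A quick way to narrow this down (a test sketch; the script name is hypothetical) is to drop a trivial executable into /etc/cron.daily that is guaranteed to produce output, and see whether its output arrives as mail the way 0logwatch's does. Note that on Debian-style systems run-parts silently skips files whose names contain a dot, so a stray .sh extension would explain scripts that never run at all:

Code:
#!/bin/sh
# /etc/cron.daily/zz-mailtest -- hypothetical test script.
# Must be executable, owned by root, and named without an extension.
echo "cron.daily mail test from $(hostname) at $(date)"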
I put in my cron entries to run my backup script, which rsyncs my data to my second drive. However, on a hunch I checked my backup drive (which mounts automatically via fstab) and realized the backup had not run in a while. I checked cron and there were no entries for it. That got me wondering: should I ever be worried about a cron update coming down and overwriting my existing cron file with the backup entries in it?
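For what it's worth, package updates don't normally touch per-user crontabs (crontab -e edits a copy kept under /var/spool/cron), but keeping a backup is cheap; a sketch:

Code:
# Save the current user's crontab to a file:
crontab -l > ~/crontab.backup
# Reinstall it later if the entries ever go missing:
crontab ~/crontab.backup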
Is there a terminal emulator which works well on an Ubuntu desktop and provides the following features which Mac OS X's Terminal application has?
- Re-wrapping text when the window is resized.
- A Clear command which clears scrollback (as the shell's clear does not) and does not clear the cursor's line (typically containing a prompt).
I have a favorite REXX program called fv2. When I was a Windows user, I had an icon for fv2 on the Quick Launch bar. Click that icon, and the program ran. Now, as a Linux (Ubuntu) user, it is necessary to go through several steps to run fv2:
1) Launch a terminal by clicking on the terminal icon at the top of the screen. (What's that area called? The GNOME panel?)
2) Enter: ~/Desktop/RexxScripts
3) Enter: regina fv2
I run fv2 several times per day and would really like to have the convenience of a clickable icon.
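One way to get that clickable icon is a .desktop launcher (a sketch; the username in the paths is a placeholder):

Code:
# Save as ~/Desktop/fv2.desktop and mark it executable:
#   chmod +x ~/Desktop/fv2.desktop
[Desktop Entry]
Type=Application
Name=fv2
Exec=regina fv2
Path=/home/youruser/Desktop/RexxScripts
Terminal=true
Icon=utilities-terminal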
How can I make terminal applications survive the terminal emulator being closed, while still being able to use all virtual terminal features?
UPDATE: I want my terminal application to remain alive and accessible if I accidentally close the terminal emulator. This functionality is provided by screen and tmux, but they have issues with colors and they flush the screen. Yes, I can run the shell inside screen, but I do not want the shell to remain alive unless some other program is running.
I see this must be something like screen, but without VT100 terminal emulation: something which will just apply whatever the application does with the "terminal proxy"'s terminal (like outputting something to stdout/stderr or using stty to set terminal options) to the terminal this proxy runs in.
I know about screen with altscreen on, but it produces either this (screen with TERM=screen):
or this (screen with TERM=rxvt-unicode):
while I want this (rxvt-unicode without screen):
I have figured out that everything looks fine if I compile rxvt-unicode with USE=-xterm-color (in fact vim looks like the second picture even without screen if I add this USE flag) and set TERM=screen-256color, but I do not like this workaround because it actually changes the colors and I can't be sure that it will always change them only in this way:
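One existing tool close to this "proxy without VT100 emulation" idea is dtach: it only detaches a program from the terminal, doing no screen-style redrawing and no TERM rewriting. A usage sketch:

Code:
# Start vim behind a detachable socket (creates it if absent):
dtach -A /tmp/vim.sock vim myfile.txt
# Detach with Ctrl-\ ; reattach later from any terminal:
dtach -a /tmp/vim.sock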
Right now when I start a program from a terminal I can't use that terminal instance again until I close the program.
I am a new user of Linux, and I want to know whether there is a way to execute a program/application from a terminal without blocking the terminal until the program ends.
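A few standard ways to do this from bash (firefox here is just an example program):

Code:
# Run in the background; the prompt returns immediately:
firefox &

# Background it AND keep it alive after the terminal closes:
nohup firefox > /dev/null 2>&1 &

# Or rescue an already-running foreground job: press Ctrl-Z, then
bg        # resume it in the background
disown    # detach it from the shell's job table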
I have a PHP script that checks versions of a remote stock file every 15 minutes. When it discovers that a new version is available, it downloads the new file and runs it through a series of scripts that prepare it for an e-commerce application. It takes about an hour to import and process the file. Otherwise, it echoes "file on remote host not newer than local file". The stock file is updated every three or four hours, and never at the same time. So we would like to run this script every 5, 10, or 15 minutes to make sure we have updated files.
So what happens when my PHP script finds a new file and takes an hour to do its work? Will the cron be activated three more times during this span? In other words, will it keep starting over even if it's still running from a previous cron? Is there a way to stop the cron from running while the file download and processing occur, and then resume when it's finished?
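Cron itself will happily start overlapping runs, so the usual fix is a lock around the job. With util-linux's flock, only the crontab entry needs to change (the paths here are assumptions):

Code:
# Every 15 minutes, but skip this run if the previous one still holds
# the lock (-n = give up immediately instead of waiting):
*/15 * * * * /usr/bin/flock -n /tmp/stockcheck.lock /usr/bin/php /var/www/check_stock.php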
I am trying to set up a script that uses SCP and will run via cron three nights a week to pull down the zip files of my website that are created on my hosting server. I am having trouble setting up the SCP script that will automate this. I don't want to have to enter a password each time. I know this can be done with public/private keys, but I need help with the syntax. Below is an example of what I have.
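Since the actual example isn't shown above, here is a generic sketch of the usual key-based setup (host names and paths are placeholders, not the real ones):

Code:
# One-time setup: generate a passphrase-less key pair and install the
# public half on the hosting server:
ssh-keygen -t rsa -f ~/.ssh/backup_key -N ""
ssh-copy-id -i ~/.ssh/backup_key.pub user@host.example.com

# The command cron will run, with no password prompt:
scp -i /home/me/.ssh/backup_key user@host.example.com:/path/to/site-backup.zip /home/me/backups/

# Crontab entry: 2am on Mon/Wed/Fri, for example:
0 2 * * 1,3,5 /usr/bin/scp -i /home/me/.ssh/backup_key user@host.example.com:/path/to/site-backup.zip /home/me/backups/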
I have two shell scripts; one has to run after the other has finished. I know that in cron.monthly they run in alphabetical order, but does each one run only after the previous has completed, or is it possible that the first may still be processing when the second is started?
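On Debian-style systems the cron.monthly scripts are executed by run-parts, which runs them strictly one at a time, each starting only after the previous one has exited, so alphabetical order is also completion order. To make the dependency explicit anyway, a small wrapper works (the script names here are hypothetical):

Code:
#!/bin/sh
# /etc/cron.monthly/00-run-both -- run the second only if the first succeeds.
/usr/local/bin/first-script.sh && /usr/local/bin/second-script.sh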
I'm having a script/cron problem. I'm trying to run a script that backs up a PostgreSQL database using a pg_dump command.
I've included a simplified version of my script here.
The pg_dump works fine and outputs an SQL file.
The problem is with the tar command: it fails to correctly tar, and then compress, the .sql file when the .sh script is run from cron, but when run manually from the command line (with sudo) it works well. Note the ls -l printout, where the top lines were generated by cron (and the tar and tar.gz files are minuscule), whereas in the second group the tar and tar.gz are much larger (and correct).
Note that the .sh script is owned by root, who is also the user in the cron entry. I tried using postgres as the user for all of those, but I think I then had password problems. Should I be using some kind of "credential"?
Here is the script and cron entry (cron in test mode to produce every 2 minutes):
Code:
command failed with exit status 1
/bin/bash: -c: line 0: unexpected EOF while looking for matching ``'
/bin/bash: -c: line 1: syntax error: unexpected end of file

I tried a slightly different syntax:
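Whatever the exact variant, that "unexpected EOF while looking for matching backquote" message from cron is classically caused by an unescaped % in the crontab line itself: cron turns a bare % into a newline, which can cut a backtick or $(...) expression in half. That is an assumption here, since the full entry isn't shown, but the fix would look like:

Code:
# Escape every % in a crontab command; cron otherwise treats it as a newline:
*/2 * * * * root /usr/local/bin/pg_backup.sh "$(date +\%Y\%m\%d)"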
Hello. I tried to find a good subject line, and this was the best I could come up with; anyway, I'll explain it here. Sometimes I do something like installing a new application in a Linux terminal on my office PC, but it takes a long time and I have to go home during the installation or configuration process, and it is not good to cancel it. My current solution is abandoning the process until the next day. I wanted to know whether there is any way to redirect the input and output of a terminal to another one. If that works, I can continue my abandoned process by SSHing to my Linux office PC and redirecting that terminal to my new remote SSH terminal at home.
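For a process that is already running, the tool that does roughly this is reptyr, which grabs an existing process and attaches it to the current terminal (the PID below is a placeholder):

Code:
# From the new SSH session on the office PC:
pgrep -f <process-name>   # find the PID of the abandoned process
reptyr 12345              # pull it into this terminal

For future long jobs, starting them inside screen or tmux and reattaching over SSH avoids the problem entirely.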
I have a simple script. When I run it as a cron job, I get an email saying: /bin/sh: line 1: test.tmp: command not found. Even when I took the first line out, I got the same error. The current shell I have is /bin/tcsh.
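A likely cause (an assumption, since the script isn't shown): cron runs jobs with /bin/sh regardless of the login shell, so anything written for tcsh breaks. Either give the script a tcsh shebang and invoke it by path, or force the shell in the crontab:

Code:
# In crontab -e, set the shell for all entries:
SHELL=/bin/tcsh
0 * * * * /home/me/bin/myscript   # path is hypothetical

# Or keep cron's default shell and start the script itself with:
#!/bin/tcsh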