Programming :: Keep The Bash-script Running If "exec Command" Fails
Aug 13, 2010
Is there a way to use exec, but if exec fails to go on with the script?
Example:
Code:
#!/usr/bin/env bash
exec startx
echo "Starting of X failed"
If startx fails, the echo will be seen on the screen. I tried all kinds of things, but I guess it isn't of much use to post them here. I searched the web, but searching for "exec" and "bash" in one sentence gives results that are not what I am looking for.
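One possible way to get that behavior is bash's execfail shell option, which stops a non-interactive shell from exiting when exec cannot run the named command. A minimal sketch:
Code:
#!/usr/bin/env bash
# With execfail set, a failed exec does not terminate the non-interactive shell,
# so the echo below is reached if startx cannot be executed.
shopt -s execfail
exec startx
echo "Starting of X failed"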
Within a bash script, I'm attempting to redirect file descriptors with exec, e.g. exec 3>&1 1>&2; however, I'd like to do something like exec $FD>&1 1>&2, which doesn't work because bash tries to execute the value of $FD. Various placements of eval fail, as well. Is there a way around this, or am I stuck hard-coding the descriptor?
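Two possible sketches, assuming the descriptor number lives in a variable named FD: build the redirection text and hand it to eval, or (in bash 4.1 or later) let the shell allocate and record a descriptor itself with the {varname} form:
Code:
FD=3
eval "exec $FD>&1 1>&2"   # eval expands $FD before exec parses the redirection
# Alternative for bash >= 4.1: the shell picks a free descriptor and stores its number in FD
exec {FD}>&1 1>&2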
I am trying to fix a perl script, and I really suck at perl. But I think this problem will be easy for people who know it.
The problem is, I have an old setup script someone wrote many years ago. It fails if the standard shell is dash and not bash. The only way I've gotten it to work is to point /bin/sh to bash. I looked through the script and it uses "system" in many places, and I think that's the problem.
I searched for it and found this link:url
My plan is to include this function:
Code:
sub system_bash {
    my @args = ( "bash", "-c", shift );
    system(@args);
}
Then I could simply change all calls to system into system_bash and it should work?
The parameter to the system calls is usually some variable. What if the parameter is a list already? Do I need to test for it somehow, and if it's a list, prepend "bash" and "-c" to the list? How do I do that?
In the script there are lots of places like this:
Code:
my $error = system($cmd);
if ($error) { die/warn "some error message"; }
Shouldn't there be a return in the system_bash function?
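A sketch of one way the wrapper might look, assuming a plain string should go through bash -c, an existing list should be passed straight to system (a list already bypasses /bin/sh), and the caller still needs system's return value:
Code:
sub system_bash {
    my @args = @_;
    # a single string is run through bash -c; a list is passed through untouched
    @args = ( "bash", "-c", $args[0] ) if @args == 1;
    return system(@args);
}

my $error = system_bash($cmd);   # drop-in replacement for the string form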
I had a problem with the find command in bash (which I deem close enough to a programming language; if not, please move this thread :P). I tried to reduce the command down to the problem. I want the backticks, or $() for that matter, to be evaluated by find's -exec, not by bash. Is that a caveat of find?
Code:
$ find testd -exec echo `basename {}` ; # confused me
test
test/a
test/b
[code]...
Edit: I found out what's causing this. `basename {}` gets evaluated by bash before find is invoked, returns {}, and `find . -exec echo {} ;` is run. Now my question is: how do I keep this evaluation from happening beforehand?
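One possible way to defer the evaluation is to let -exec start a small shell of its own, so the command substitution happens once per found file rather than before find runs. A sketch using the directory name from the post:
Code:
# the "_" fills $0 of the inner shell; {} becomes $1, expanded by find per file
find testd -exec sh -c 'echo "$(basename "$1")"' _ {} \;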
I'm writing bash scripts (wrote my first one this Sunday) and I'm trying to zip a group of files. It has to be in zip format, so alternatives like tar won't work here. I have my script in a folder which has a bunch of sub-directories in the format "Lab 3", "Lab 5", "Lab 6", etc. What I'd like to be able to do is take all of the files (just the contents, not the folder itself) in the "Lab 3" folder and put them in Lab3.zip. I'm really close, but no matter what I try I keep getting the folder put into the zip file instead of the folder's contents.
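A sketch of one way to keep the leading folder out of the archive: cd into the directory in a subshell and zip its contents relative to there (the "Lab 3"/Lab3.zip names are taken from the post):
Code:
# archive the contents of "Lab 3", storing paths relative to that folder
(cd "Lab 3" && zip -r ../Lab3.zip .)
# or flatten everything, dropping stored directory names entirely
zip -j Lab3.zip "Lab 3"/*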
Using xsel, I pass a selection into a variable. I then check that the variable includes an embedded newline, to be sure that the selection returned by xsel is complete. If the selection content preceding the newline is just a single word, the check fails to detect the newline.
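A sketch of a newline test that keeps working when the text before the newline is a single word; the usual pitfall is an unquoted expansion (or an echo-based pipeline) collapsing the newline before the test ever sees it, so everything below stays quoted:
Code:
sel=$(xsel -o)
# pattern match against a literal newline; $sel is not word-split inside [[ ]]
if [[ $sel == *$'\n'* ]]; then
    echo "selection contains an embedded newline"
fi
# note: $( ) strips trailing newlines, so a newline at the very end is never seen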
Is running a command in the Alt+F2 prompt possible from a bash script? I need this for a launcher for gnome-shell. For it I have written a little script to check whether the gnome-shell process is alive and act accordingly. The script works fine; I just don't know how to write "debugexit" to the Alt+F2 prompt, as that is the only decent way I have found to shut gnome-shell down and go back to the gdm desktop smoothly.
I ran into trouble when logging in to Fedora as a normal user (not root) today. When I fill in the username and the password and press the Enter key, a little window pops up at the top-left corner with a one-line message: "/usr/bin/xterm : Could not exec /bin/bash : Permission Denied". When I log in as root, it's OK. And when I open a shell and type "su username", the console prints "su: /bin/bash: Permission denied". I have checked the permissions of /bin/bash; they're 755. And I have tried everything suggested in articles found through Google, like changing the permissions of /, /root and other directories, but it failed.
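A few diagnostic sketches that may help narrow this down; namei walks every component of the path, and on Fedora an SELinux denial can produce the same message even when the file mode is 755:
Code:
namei -l /bin/bash     # permissions of every component: /, /bin, /bin/bash
mount | grep ' / '     # check the root filesystem is not mounted with unusual options
getenforce             # if SELinux tools are installed and this prints Enforcing, check for denials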
I have a bash script that runs other bash scripts. If the parent script fails, is there any way for me to also kill the child scripts? That kills any multiple instances of a script if I run it more than once. Is there any way I can modify this into something that prevents the child script from running, or continuing to run, if the parent stops because of an error?
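A minimal sketch of one approach, assuming a hypothetical child.sh: record the child's PID and use an EXIT trap so that whenever the parent stops (including on the first failing command under set -e), the child is taken down with it:
Code:
#!/usr/bin/env bash
set -e                                        # stop the parent on the first failing command
./child.sh &                                  # hypothetical child script
child_pid=$!
trap 'kill "$child_pid" 2>/dev/null' EXIT     # on any exit of the parent, kill the child too
# ... parent work that may fail ...
wait "$child_pid"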
In the Windows command prompt, the F8 key can cycle through your previously entered commands, e.g. say you enter "ping google.com" and then "pushd <dir>". Then when you type p and press F8, it brings up pushd, and the next F8 brings up the ping command. You can then hit Enter to execute the corresponding command, i.e. ping in this case. Is there anything similar in the Ubuntu Terminal running bash? It's very handy for getting back previously entered commands.
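Bash's closest built-in is Ctrl+R (reverse incremental history search). For F8-like prefix cycling, a sketch of two lines that could go in ~/.inputrc, making the Up/Down arrows step only through commands that start with what you have already typed:
Code:
"\e[A": history-search-backward
"\e[B": history-search-forward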
~$ sed s/^bb/bbbbbb/ foo1.txt
sed: -e expression #1, char 3: unterminated s command
~$
Where 'b' stands for a space. What is the error here? sed is running under bash.
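The likely culprit is word splitting: without quotes, the spaces break the s expression into separate arguments, so sed only ever sees "s/^" and reports it as unterminated. A sketch with the expression quoted (two leading spaces replaced by six):
Code:
sed 's/^  /      /' foo1.txt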
I'm having trouble trying to make a script. What I want to do is check if xScreenSaver is running in my user account. If not, run it. If it's running, kill it.
So this is the script I've made:
Code:
The problem is that I've echoed the output of $(pgrep -u $(whoami) xscreensaver) and it always seems to come back with a PID, increasing by 4 each run, even if the PID doesn't exist. What do I mean by "doesn't exist"? That no xscreensaver is running for my user: if I run pgrep -u $(whoami) xscreensaver in bash, I get no output, but if I run the command through the script, I get (for instance) 4050. If I run it again, I get 4054, and again 4058, etc. What the hell is going on with that?
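Since the original script is not shown, here is only a rough sketch of the toggle as described; -x makes pgrep/pkill require an exact process-name match, which helps avoid accidentally matching the calling script if its own name happens to contain "xscreensaver":
Code:
#!/usr/bin/env bash
# start xscreensaver for the current user if it is not running, otherwise kill it
if pgrep -x -u "$(whoami)" xscreensaver > /dev/null; then
    pkill -x -u "$(whoami)" xscreensaver
else
    xscreensaver -no-splash &
fi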
We have a custom app that runs on boot on some older hardware running DSL Linux, and its startup manager was quite simple. We purchased some newer Asus EeeBoxes which run Xandros, and things are quite stable and run nicely, with one exception: the application only runs from the root (/) location. This box auto-logs-in as 'user' and there is a /home/user/.kde/Autostart folder where you can stick scripts to run at boot. So I have a start.sh script, and with my limited bash programming I tried things such as:
sudo cd /
sudo /startapp.pl
But the errors start spewing with the basic "can't find data/xyz", as it's looking in the local directory. I thought there was a basic cwd (change working directory), but everything I try just forces the run from that location. Any ideas or suggestions are appreciated, but things like "can you change the code" can't be done, so it must be a scripting thing. The only other thought I had, but I'm not sure about: can you do a cron job with @reboot or something, so that when the box starts it runs this job as root and fires it off?
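cd is a shell builtin, so "sudo cd /" changes directory only inside the short-lived process sudo starts and then throws it away. A sketch that performs the cd and the launch in the same shell (the /startapp.pl path is taken from the post):
Code:
# change directory and start the app inside one root shell
sudo sh -c 'cd / && exec /startapp.pl'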
I have a bash script that I want to run on a "clean" screen, but when the script finishes/exits I want to see what was previously on the screen. Any thoughts? The clear command does not let me get the information back, and the screen command runs the program in the original window, so you see nothing until the new screen is exited.
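One possible approach is the terminal's alternate screen, the same mechanism pagers and editors use, driven through the smcup/rmcup terminfo capabilities. A minimal sketch:
Code:
#!/usr/bin/env bash
tput smcup          # switch to the alternate screen
clear
echo "script output on a clean screen..."
read -r -p "press Enter to finish"
tput rmcup          # restore whatever was on the screen before the script ran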
I wrote a script, shown below, that runs it easily in the same directory it was run from:
Code:
for f in *.MTS
do
    ffmpeg -i "$f" -acodec copy -vcodec libx264 -threads 2 -deinterlace -vpre slow -b 20000k -bt 3000k -refs 4 "${f%.MTS}.mp4"
done
I want to be able to use the find command so it will recurse through all the videos in my videos folder. Is there a painless way to do this? Here is the start of my find command, but it doesn't work. Any help appreciated:
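Since the attempted find command itself is not shown in the post, here is only a sketch of one possible recursive variant, assuming the same ffmpeg options as above and that the clips live under ~/Videos (adjust the path); find hands batches of files to a small shell that re-runs the loop:
Code:
find ~/Videos -type f -name '*.MTS' -exec sh -c '
    for f in "$@"; do
        ffmpeg -i "$f" -acodec copy -vcodec libx264 -threads 2 -deinterlace \
               -vpre slow -b 20000k -bt 3000k -refs 4 "${f%.MTS}.mp4"
    done
' sh {} +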
I am running a Java application on the command-line bash terminal under Mint Debian. I have JDK 1.6.0_22 installed, 64-bit, and the OS is 64-bit too. I have a few JAR files in the directory and a few native LWJGL libraries. When I run the application using the command line, all works fine. Let's assume my directory where the files are is called /home/riz/MyGame. I change to that directory and this is the command I use code...
I am trying to replicate what is happening on this page under the tcsh shell, but using the bash shell found in Wheezy. Here is the page I am referring to:[URL] The command I am trying to replicate is on page 6 under figure 2.4. The command is "prompt> ./mem &; ./mem &".
I would like to run the same program twice, concurrently, but do not know how. Note that I am not trying to use a bash script, but rather by simply using syntax on the command line.
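In bash the "&" already terminates a command, so the tcsh-style "&;" is a syntax error; dropping the semicolon is enough to start both copies in the background from a single command line:
Code:
./mem & ./mem &
wait        # optional: block until both background copies have finished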
I have around 600 empty text files, and I need to add each file's name as part of its data. I mean files from "file1.txt" to "file599.txt", all of them empty, and I need to get the name inside the file, so that when I open the file it shows the name as part of the data, e.g. "file1". These files were created on my web site. I am thinking of a small bash script.
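A minimal sketch, assuming the files sit in the current directory and the name without the .txt suffix is what should end up inside each one:
Code:
# write each file's base name (file1, file2, ...) into the file itself
for f in file*.txt; do
    printf '%s\n' "${f%.txt}" > "$f"
done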
I am running a simple script that I copied from slug.ceca.utc.edu/docs/2009-3-26-linux-server-health.pdf and edited with the names and paths of my own servers. I don't know much about scripting (read: nothing), but I wanted to try to be efficient in my new role as a Linux sysadmin. The script was saved to root's home directory and runs as part of root's crontab once a week. The script runs with no problem, but it doesn't actually seem to run all of the commands contained within it. It skips some in the middle and at the end, and I don't know why. The script itself is this:
How do you catch user input while the script is running? Or, how would you make two scripts run at the same time, but use input from one script in the other? The program I'm trying to make echoes text on the screen continuously, but while that's happening, I want the user to be able to input something, so the program can detect the input and display something else. So I thought maybe I could make two scripts run to do each task.
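One possible single-script approach, instead of coordinating two scripts: keep printing in a loop, but poll the keyboard with a short read timeout on each pass (fractional timeouts need bash 4 or later). A sketch:
Code:
#!/usr/bin/env bash
while true; do
    echo "continuously echoed text..."
    # wait up to 0.2s for a single keypress without blocking the loop
    if read -r -s -t 0.2 -n 1 key; then
        echo "detected input: $key"
    fi
done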
I am trying to grep multiple numbers from a file; grep does have the -f option for that.
Code:
grep -f <`seq 500 520` /etc/passwd
I know this could be done with
Code:
for i in `seq 500 520`; do grep "$i" /etc/passwd; done
But my question goes far beyond this example. Is it possible to redirect one command's output so that it is treated as the content of a file for another command?
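Yes: bash's process substitution gives a command's output a file-like name, which is exactly what an option such as -f expects. A sketch with the example from above:
Code:
# <(...) expands to a path (e.g. /dev/fd/63) whose contents are seq's output
grep -f <(seq 500 520) /etc/passwd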
I understand that $! is the PID of a command. For example:
Code:
#!/bin/bash
myprogram &
echo "PID of myprogram is $!"
I'd like to send the output of "myprogram" both to the console and to a log file using the "tee" command, but I also want to store the PID of "myprogram". Something like this:
Code:
#!/bin/bash
myprogram | tee ./logfile &
echo "PID of myprogram is $!"
The problem is that $! is now the PID of "tee" rather than the PID of "myprogram".
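One possible workaround is to feed tee through process substitution, so myprogram itself is the backgrounded job and $! refers to it rather than to tee. A sketch using the names from the post:
Code:
#!/bin/bash
# myprogram's stdout goes to tee running in a separate process; $! is myprogram's PID
myprogram > >(tee ./logfile) &
echo "PID of myprogram is $!"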
I often want to extract some info using awk from a variable/filename while running other things using xargs and sh. Below is an example:
Code:
ls -1 *.txt | xargs -i sh -c 'NEW=`echo $0 | awk -F'_' '{print $1}'`; echo $NEW' {}
In the above case I would like to grab just the first field from a filename (delimited by '_') from within an sh command. This is a simplified example, where normally I would be doing some further data processing with the sh command(s).
The error message that I get is:
Code:
}`; echo $NEW: -c: line 0: unexpected EOF while looking for matching ``'
}`; echo $NEW: -c: line 1: syntax error: unexpected end of file
I haven't been able to figure out how to escape the awk command properly.
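One possible way to sidestep the nested single quotes entirely is to let the inner shell cut the field itself with parameter expansion, which gives the same text-before-the-first-underscore result as the awk call:
Code:
# {} is passed as $0 of the inner shell; ${0%%_*} strips everything from the first "_"
ls -1 *.txt | xargs -I{} sh -c 'NEW=${0%%_*}; echo "$NEW"' {}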
I have a requirement to find files whose names contain ack_reply. However, there are many other files in the same directory where these reside. Now I have to remove these files from the folder after 7 days and retain the others. So I tried to write the script below with the grep command.
find $directory -type f -mtime +7 | grep ack_reply
How can I pass this output to the -exec command?
If I am not using the grep command, my script would be:
find $directory -type f -mtime +7 -exec remove.sh {} \;
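Two possible sketches, reusing remove.sh and the ack_reply pattern from the post: either let find do the name filtering itself so -exec still applies, or keep the grep and hand its surviving paths to the script via xargs:
Code:
# filter inside find, then -exec works exactly as before
find "$directory" -type f -mtime +7 -name '*ack_reply*' -exec remove.sh {} \;
# or keep the pipeline and let xargs call the script once per matching path
find "$directory" -type f -mtime +7 | grep ack_reply | xargs -r -n 1 remove.sh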