Programming :: Get Background Jobs To Execute Sequentially?
May 4, 2009
I'm looking for a general pointer, or a label for what I am trying to do, not a specific line of code. I've written a couple of small sh scripts that get some user input, then call several programs to run in the background with &. My problem is, they all run at the same time. Is there some way to get these jobs to run sequentially rather than concurrently? If yes, what is this process called? I'm thinking there should be a way to line up background jobs in a job queue, similar to how a print queue works: one job at a time. After searching for a couple of hours, I'm thinking there must be a name for this, but I don't know what it is.
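What this describes is essentially serializing the jobs: a one-slot job queue. A minimal sketch in plain sh (job1.sh, job2.sh and job3.sh are placeholders for whatever programs are being launched):
Code:
#!/bin/sh
# Run one job at a time: each is still started in the background (as before),
# but 'wait' blocks until it finishes before the next one is launched.
for job in ./job1.sh ./job2.sh ./job3.sh
do
    "$job" &
    wait $!
done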
I have 6 scripts that do after-hours tasks, mostly backup related. Is there a way for me to create a "trigger.sh" script that will invoke each of the others and let each one finish before starting the next? That would give me a single place to turn my scripts on or off, or change their order.
When I tried this, trigger.sh ran the first script in line and then exited. I don't know how to get back to trigger.sh to start the next process. Yes, I have searched around and tried several things, but this is proving harder (for me) than I thought.
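A sketch of what such a trigger.sh could look like (the script names here are made up; the key point is that each one runs in the foreground, so trigger.sh does not move on until the current script exits):
Code:
#!/bin/sh
# trigger.sh - single place to enable, disable or reorder the nightly jobs.
# No trailing '&': each script must finish before the next line runs.
/path/to/backup_files.sh
/path/to/backup_db.sh
/path/to/cleanup.sh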
I am trying to rename some files that do not have a pattern in their names to sequential names. The original file names are in the form REC92837498, REC9837449 and so on. I want to rename them to REC_1, REC_2... etc.
I used the following script:
Code:
j=1
for i in $(ls -rt REC*)
do
    mv /path/${i} /path/REC_${j}
    j=$((j+1))
done
What's wrong with this line?
Code:
sudo -u user /usr/bin/nohup sh /home/user/somescript.sh &
This should ask for the password, then execute the script in the background and return to the menu.
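A hedged workaround sketch, assuming the problem is that the password prompt gets lost once the whole command is backgrounded and that the default sudoers timestamp behaviour is in effect: validate the credentials in the foreground first, then background the actual job.
Code:
# Ask for the password up front (foreground), caching the sudo timestamp...
sudo -v
# ...then start the script in the background without needing another prompt.
sudo -u user /usr/bin/nohup sh /home/user/somescript.sh > /dev/null 2>&1 &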
What do you do if a job takes a long time to finish and you don't want to wait? Say, I ssh to a remote server from my laptop and start a long-running job. Then a few hours later I ssh in again and inspect how the job ran, its output, etc.
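Common approaches are nohup with the output redirected to a file, or a terminal multiplexer such as screen or tmux that can be detached and re-attached later. A minimal nohup sketch (the job name and paths are examples):
Code:
# On the server, start the job so it survives the ssh session ending:
nohup ./long_job.sh > "$HOME/long_job.log" 2>&1 &
echo $! > "$HOME/long_job.pid"

# Hours later, after logging back in:
tail -f "$HOME/long_job.log"                                  # inspect the output so far
kill -0 "$(cat "$HOME/long_job.pid")" 2>/dev/null && echo "still running"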
I want to write a shell script to copy database archivelog files sequentially from one directory to another within a server. I am enclosing a sample archivelog file name. Archive log filename: log_0000118432_1.arc (here the number 0000118432 is incremented by 1 for the next filename). The catch is that all archivelogs must be copied to the destination directory, but previously copied files should not be copied again.
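A minimal sketch of one way to do this, assuming the source and destination paths are placeholders and that "previously copied" simply means the file already exists in the destination directory:
Code:
#!/bin/sh
SRC=/path/to/arch_src      # where the database writes the log_*.arc files
DST=/path/to/arch_dst      # where they should be copied to

for f in "$SRC"/log_*.arc
do
    [ -e "$f" ] || continue            # nothing matched the glob
    name=$(basename "$f")
    # Skip files that were already copied on a previous run.
    [ -e "$DST/$name" ] || cp -p "$f" "$DST/$name"
done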
I use an application called redbutton-browser to access some of the things available on the redbutton digital TV channels. It compiles fine with a simple make but fails if I try to run parallel jobs with make -j4. I'd like some help altering the Makefile so that it runs a few commands sequentially before doing the rest of the build in parallel.
I need to spawn 2 processes in parallel and each takes an hour or so to finish. Is the following one of the correct ways of using `at` in a script run by crontab?
Code:
#!/bin/bash
# define the env var, cd, etc... assume everything ok up to this point
date +"The start time is %H:%M:%S"
rm -f a.fin
at now <<END_OF_AT
do_a &> a.log
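If the only goal is "run two long jobs in parallel and have the cron-invoked script wait for them", a plain background-and-wait sketch avoids at entirely (do_a comes from the post above; do_b is a guessed name for the second program):
Code:
#!/bin/bash
date +"The start time is %H:%M:%S"
do_a &> a.log &        # first job in the background
do_b &> b.log &        # second job in the background
wait                   # block here until both background jobs have finished
date +"The end time is %H:%M:%S"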
I am trying to run audio conversion on my server and I want it limited to a certain number of processes, based on the process name. I am using the following script, but it isn't limiting the number of jobs like I want it to.
Code:
#!/bin/bash
num_jobs=13
while [ $(ps -A | grep -v grep | grep -c pacpl) -ge $num_jobs ]
do
    sleep 1
done
I have a server program that accepts multiple client connections, and I am using polling: every 2 seconds it checks each client to see whether any data has been received after binding. I used setitimer, but I get a runtime error. The server accepts all client connections but doesn't process any message the clients send.
Sequentially number files based on date modified (rename cli)
I'm almost done with a larger script which takes all the pictures in a folder, converts them to a video, and emails it to me. Everything worked fine until I realized the picture filenames weren't always starting at 1, at which point ffmpeg chokes.
I have a bunch of files in a folder which I need to rename to:
I don't want to install any additional packages and I'd like this to run in a single command if possible.
If not possible, then a bash script would work too.
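A hedged sketch using only standard tools. It assumes the pictures are .jpg files in the current directory, that none of the names contain spaces or newlines, and that a zero-padded img_0001.jpg naming scheme is acceptable (the exact target pattern in the original post was cut off). Joined with semicolons, it collapses to a single command line.
Code:
n=1
for f in $(ls -rt *.jpg)         # oldest-modified first
do
    mv -- "$f" "$(printf 'img_%04d.jpg' "$n")"
    n=$((n+1))
done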
Is there, by chance, a fancy name to describe code that must be in a program but will never be executed? In one of my (Haskell) programs, I have some error-handling code that must be in the program to keep the compiler happy (due to the type checking). However I know that, due to the logical structure of the program, it is impossible for the code to be evaluated. I am curious if there is a technical name given to code that must exist but cannot be executed.
I think my title pretty much explains it. I am writing a script and I want to start a program in the background, and when that program finishes I want to check the return value to make sure there was no error. For example, normally I would do something like this:
#!/bin/sh
program
if [ ! $? -eq 0 ]; then
    echo "There was an error"
    exit 1
fi
Now I want to do something like this:
#!/bin/sh
PRTN=`program1 &`
program2
if [ ! $? -eq 0 ]; then
[code]....
In this case if program2 finishes before program1, I don't think the return value from program1 $PRTN would be valid at the time it is checked.
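One way to get the real exit status of the backgrounded program in plain sh is to remember its PID and pass it to wait, which returns that job's exit status (program1 and program2 stand for the poster's commands):
Code:
#!/bin/sh
program1 &
pid1=$!                 # PID of the backgrounded job

program2
status2=$?

wait "$pid1"            # blocks until program1 is done, returns its status
status1=$?

if [ "$status1" -ne 0 ] || [ "$status2" -ne 0 ]; then
    echo "There was an error"
    exit 1
fi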
into my php script, it works fine. However, if I put
Code:
$r = exec('myX11application'); echo $r;
it doesn't (to be precise, the script still works, but myX11application is not executed). Of course, the scripts are run by the user "apache", who doesn't have access to the X11 server and doesn't even have the DISPLAY variable defined. I installed the virtual framebuffer Xvfb and created a small bash script:
Code:
Xvfb :2 &            # start the virtual framebuffer on display :2 in the background
export DISPLAY=:2
myX11application
and called it with exec from php, but it still doesn't run since it looks like Xvfb can't start if the normal X server is running (I need that for development purposes). The reported error from Xvfb is
Code:
(EE) config/hal: NewInputDeviceRequest failed (2)
Is there a way to have PHP run X11 applications while the normal X server is also running?
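One possible workaround sketch, assuming the xvfb-run helper that ships with many Xvfb packages is available: it starts its own Xvfb on a free display number, so it does not clash with the real X server, and it could be called from PHP's exec() in place of the raw application.
Code:
# Wrapper command the PHP exec() call could run instead of myX11application:
xvfb-run -a ./myX11application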
I'm trying to compile a simple script for an ar71xx (bleeding edge / from snapshots) OpenWrt router. I have previously compiled scripts for Kamikaze 8.09: I just copied the gcc file inside the SDK dir and used it without problems.
I'm writing a program and I want it to execute some code for n seconds. For example, I put a command in the shell like 'ls % 10' and the program should run the command ls for 10 seconds. I'm trying something like this:
I am executing a "run" command in a script, and after that I need to copy files (which are the inputs for the run) into a directory. On "run", a new shell is created and the remaining commands in the script do not execute. What should I do to execute the remaining commands in the script?
Assume two applications, Application A and Application B, both written in C. I tried using the system() function in Application A to execute Application B. It succeeds, but Application A cannot continue its own tasks until Application B exits. I want Application A to execute Application B, but Application A should continue its tasks without waiting for Application B to exit.
I want to execute a bash script from a C program. The bash script returns some message and I want to store this message in the C program. Does anybody know how I can do it? I know I can use system("bash myscript.sh") in C, but I want to store the message the script returns.
I want to execute an iptables command via PHP. I can run a simple command such as echo 'iptables -h'; but I can't run iptables -L. I also created an executable file (firewall.sh); I can run it on the console, but it doesn't work from PHP.
I have some ideas about writing a small game in the terminal (just for fun) using the ncurses library. I want to use some kind of menus (in Midnight Commander's style), but there are some problems with rendering windows that I don't understand. I create a window with newwin(), assign a color pair to it by calling wattron() (for example, I want to fill a window with a blue background), and then I call my own function wnd_fill():
I have a script that calls another program/script, xxx, to run in the background. Supposedly this program should finish within five (5) minutes at most, so after five (5) minutes I run some other steps to bring the script to completion. My problem is that sometimes the program takes longer than five (5) minutes, and this causes problems when running the rest of the steps in the script. Can anyone suggest how to re-program my script? At the moment, the KSH script, i.e. test.ksh, is doing as below:
test.ksh:
.....
.....
xxx/xxx.ksh     <--- program/script called by the script
sleep 300
..... run the rest of the script .....
..... problem is sometimes xxx/xxx.ksh takes longer than 300 seconds .....
..... any way that I can monitor that xxx/xxx.ksh finishes before I run .....
..... the rest of the scripts .....
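A sketch of the usual pattern, assuming xxx/xxx.ksh can simply be waited on instead of slept past:
Code:
# test.ksh
xxx/xxx.ksh &          # start the program/script in the background
child=$!
wait "$child"          # returns only when xxx.ksh has actually finished
# ..... run the rest of the script .....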
I am writing a program that reads data from a GPS and some other devices and writes some files with all the information. When I run it normally it works fine, but if I run the program in the background (with the ampersand), the files are not created until I bring it to the foreground or close it. I am confused; the program should run the same way with and without the ampersand. (Could it be that the main process that creates all the threads does not create them when it is executed in the background? It seems as if the program is stopped until I bring it to the foreground.)
I have a set of files to copy and decompress, and want to do these operations concurrently with a script.
Manually it would be something like:
Code:
The single & is intended to background the processes, while the && is intended to execute the gzip process if and only if the cp completes successfully.
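A hedged sketch of that manual pattern (the file names are invented): grouping each cp-and-gzip pair in parentheses makes the && pair a single unit, and the trailing & backgrounds that whole unit.
Code:
( cp /src/file1.gz /work/ && gzip -d /work/file1.gz ) &
( cp /src/file2.gz /work/ && gzip -d /work/file2.gz ) &
wait    # block until every copy-and-decompress pair has finished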
My script is:
Code:
When I run it, bash gets angry with the following error: