Programming :: Using Bash Command To Execute Output Of Awk?
Mar 24, 2011
I am trying to process a column-separated data file with a few bash commands. For example, I have
Code:
file1 aaaa yes
file2 aaaa no
file3 bbbb yes
Let's say I want to create a new file from the output of the first column and do something else with the output of the third column. Of course there are many ways to process this data file, but I wish to know how I could do it using awk. I'm trying:
Code:
awk '{system("touch $1")}' datafile
but the shell command is not able to get awk's $1 value. How do I get this done? And another question: if the data file contains the name of a shell variable, how can I make use of it in the awk output? For example, I have datafile1:
Code:
server1 yes
server2 no
And in another server declaration data file, datafile2, I have:
Code:
server1=xxx1
server2=yyy1
And in my awk script, I want to achieve something like this (the syntax is definitely wrong, just to demonstrate what I assume it will look like):
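A minimal sketch of how both parts might be handled, assuming the first part only needs to touch a file named after column one (the usual fix is to concatenate $1 into the command string instead of embedding it inside the quotes), and that passing a sourced shell variable to awk with -v is acceptable for the second part; the file and variable names are taken from the question:
Code:
# touch a file named after the first column of each row
# (assumes filenames without spaces)
awk '{ system("touch " $1) }' datafile

# make the shell variables from datafile2 visible to awk:
# source the declarations, then pass the one you need with -v
. ./datafile2
awk -v s1="$server1" '$1 == "server1" { print s1, $2 }' datafile1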
I am trying to grep multiple numbers from a file; grep does have the -f option for that.
Code:
grep -f <`seq 500 520` /etc/passwd
I know this could be done with
Code:
for i in `seq 500 520`; do grep "$i" /etc/passwd; done
But my question goes further than this example. Is it possible to redirect one command's output so that it is treated as the content of a file for another command?
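A minimal sketch of one way to do this with bash process substitution, which presents a command's output to grep as if it were a file (assuming the numbers are UIDs in /etc/passwd):
Code:
# process substitution: <(...) gives grep a file-like handle on seq's output
grep -f <(seq 500 520) /etc/passwd

# anchoring each number as the third colon-separated field (the UID)
# avoids accidental matches elsewhere on the line
grep -f <(seq 500 520 | sed 's/.*/^[^:]*:[^:]*:&:/') /etc/passwd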
I have a big bash script whose goal is to download movies one by one. But I often run into a problem: when this script is executed from cron, it often does not completely download a movie. I often find the movies it downloaded are several KB while the movie is actually 20 MB. So I think it did not wait for one task to finish before jumping to the next download. Is there a way to force the bash script to wait until one movie has downloaded completely before starting to download another?
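A minimal sketch of a strictly sequential download loop, assuming wget and a hypothetical urls.txt list; each wget call blocks until the transfer finishes or fails, and -c lets a rerun resume a partial file:
Code:
#!/bin/bash
# read one URL per line and download them one after another
while IFS= read -r url; do
    # wget does not return until this download completes or errors out;
    # -c resumes a partial file if the script is run again
    wget -c "$url" || echo "failed: $url" >> failed.log
done < urls.txt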
I'm trying to start bash with a command and have it interactive, like this: bash -i -c "echo Welcome!" As in, execute the command and then let me use it as an interactive shell afterwards. (I'm doing something more complicated than echoing, but this doesn't work.) I've tried this from a running gnome-terminal, from one gnome-terminal to a new one with gnome-terminal, and from the Alt+F2 program launcher (with "Run in terminal" ticked).
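A minimal sketch of one common workaround, assuming the goal is to run a command and then drop into a normal interactive bash: feed the command in as an ad-hoc rc file, which bash sources before showing the prompt.
Code:
# run the command via a temporary rc file, then stay interactive;
# sourcing ~/.bashrc first keeps the usual environment
bash --rcfile <(echo 'source ~/.bashrc; echo Welcome!')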
I want to execute a bash script from a C program. The bash script returns some message, and I want to store that message in the C program. Does anybody know how I can do it? I know I can use system("bash myscript.sh") in C, but I want to store the message which the script returns.
Code:
$cmd
If this script is executed, an error is generated. The reason given was that "the execution fails because the pipe is not expanded and is passed to date as an argument". What is meant by expansion of the pipe? When we execute date | wc on the command line, it works fine, and | is not treated as an argument. Why?
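A minimal sketch of what is going on, assuming the elided part of the script was something like cmd='date | wc': word splitting of $cmd happens after bash has already parsed the line for pipes, so the | becomes a literal argument; eval re-parses the expanded string so the pipe takes effect.
Code:
cmd='date | wc'

# fails: by the time $cmd is expanded, bash has already looked for pipes,
# so date receives the literal arguments "|" and "wc"
$cmd

# works: eval re-parses the expanded string, so the pipe is honoured
eval "$cmd"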
What does the following shell program do? :() { :| : &} ; : Warning: my computer hung when I tried to execute this. Mod edit: THIS IS DANGEROUS CODE, DON'T TRY IT OUT UNLESS YOU WANT TO FRY YOUR MACHINE!
I'm having trouble with a bash script. I have something like this: export VAR=`command` The problem is that "command" can return this: "** NONE **". bash will then replace each of the * with the list of files in the current directory. I want the output to be uninterpreted (i.e. VAR should contain "** NONE **", not "list of files list of files NONE list of files list of files"). It shouldn't be hard, but I am unable to figure it out, and I'm not sure how to phrase the problem.
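A minimal sketch of the usual fix: double-quoting the command substitution keeps the output literal, so the asterisks are never glob-expanded (some_command is a stand-in for the real command that prints "** NONE **"):
Code:
# quoting the command substitution keeps the output literal, asterisks and all;
# some_command is a stand-in for the real command
VAR="$(some_command)"
export VAR
echo "$VAR"        # prints: ** NONE **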
I have a script that generates a bunch of output, including the expansion details provided by set -v -x. I am trying to pipe everything that is displayed to a file, in addition to displaying it on the screen. I've managed to get stderr and stdout into the file, but the expansions are only printed to the screen. Here is what I have so far:
Code:
sudo -u <user> source my_job.sh | tee my_log.txt 2>&1
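A minimal sketch of the likely fix: the trace produced by set -v -x goes to stderr, and the 2>&1 above applies to tee rather than to the script, so the merge has to happen before the pipe. It is sketched here with bash -x instead of source, since sudo cannot run a shell builtin directly; <user> is kept as the placeholder from the question.
Code:
# merge stderr (where set -x/-v tracing goes) into stdout *before* the pipe,
# then tee shows everything on screen and writes it to the log
sudo -u <user> bash -x my_job.sh 2>&1 | tee my_log.txt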
I am executing a run command in a script; after that I need to copy files into a directory, which are the inputs for the run. On running, a new shell is created and the remaining commands in the script do not execute. What should I do to execute the remaining commands in the script?
I want to execute an iptables command via PHP. I can run a simple command such as echo 'iptables -h'; but I can't run iptables -L, so I created an executable file (firewall.sh). I can run it on the console, but it does not work from PHP.
So that when I grep the local file again later, it can be printed out with the original log lines. Otherwise, the newlines are dropped and the lines become concatenated into a single line; e.g., if I rewrite the script that way, echoing $result is not a good idea.
Is there some workaround so that I can save it to a variable rather than a file but still keep the end-of-line characters? That would simplify my script and avoid all that I/O!
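A minimal sketch of the usual answer: the newlines survive inside the variable as long as every expansion of it is double-quoted (the grep pattern and file here are hypothetical):
Code:
# capture multi-line output; apart from trailing newlines being trimmed,
# embedded newlines are kept in the variable
result="$(grep 'ERROR' /var/log/syslog)"

# quoted: prints the lines exactly as captured
echo "$result"

# unquoted: word splitting collapses the newlines into single spaces
echo $result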
I have a bash script that calls a Java class method. The method prints a string to the Linux console when run independently. How can I assign the value from the Java method to a variable in the bash script? Running the script: java -cp /opt/my_dir/class.method [parameter]
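A minimal sketch using command substitution, keeping the java invocation exactly as written in the question (the variable name is made up):
Code:
# capture whatever the Java program prints to stdout
result="$(java -cp /opt/my_dir/class.method [parameter])"
echo "Java returned: $result"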
Code:
PU12829,24869;PD15733,24869;PD15733,19785;PD12829,19785;PD12829,24869;
PU4599,20915;PD9924,20915;PD9924,18898;PD4599,18898;PD4599,20915;
PU12829,24869;PD15733,24869;PD15733,19785;PD12829,19785;PD12829,24869;
PU4599,20915;PD9924,20915;PD9924,18898;PD4599,18898;PD4599,20915;
PU1723,3423; # this line is ignored - too short
[Code]...
What I'm trying to do is, while true, take each line from the file that begins with PU and that is longer than 12 characters, and write it to an increasingly numbered file, one file per line, starting with object1 etc.
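A minimal sketch using a bash read loop, assuming the input file is called plot.dat and that "longer than 12 characters" refers to the whole line; each qualifying line goes into its own objectN file:
Code:
#!/bin/bash
n=1
while IFS= read -r line; do
    # only lines that start with PU and are longer than 12 characters
    if [[ $line == PU* && ${#line} -gt 12 ]]; then
        printf '%s\n' "$line" > "object$n"
        n=$((n + 1))
    fi
done < plot.dat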
I have got a script with an outer and an inner loop. The inner loop issues loads of echos which need to be redirected to a log file determined by the outer loop. The obvious solution is to redirect every echo to >$LOG and set LOG in the outer loop.
Code:
for f in $FILES ; do
    LOG=<logfile>
    for l in $LINES ; do
[code]....
Is it possible to map stdout to $LOG in the outer loop without having to redirect every subsequent individual command's output?
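A minimal sketch of two common approaches, following the structure of the snippet above: either redirect the whole inner loop once per outer iteration, or switch stdout itself with exec inside the outer loop. The log name $f.log is made up for illustration.
Code:
# option 1: redirect the inner loop as a whole, once per outer iteration
for f in $FILES; do
    LOG="$f.log"                 # hypothetical log name
    for l in $LINES; do
        echo "processing $l"
        # ... more commands ...
    done > "$LOG"
done

# option 2: swap stdout itself with exec, restoring it afterwards
for f in $FILES; do
    exec 3>&1 > "$f.log"         # save stdout on fd 3, point stdout at the log
    echo "everything here goes to the log"
    exec 1>&3 3>&-               # restore stdout, close fd 3
done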
I am not sure if that subject really explains it; basically, I have a script that executes a CLI Java applet that requires a passphrase from the user. I can easily execute this by issuing the -p argument followed by the passphrase; however, that shows up in possible logs, or at least in the results of the ps command. If you do not supply the -p argument, it prints "Enter Passphrase:" on a new line and asks for input.
How can I provide input for the passphrase request, and is it still possible to throw the application into the background with '&' following the command? I have seen a few examples that use /bin/expect to wait for the prompt and send a response; however, I would like to refrain from any extra dependencies. Example of regular execution of the application:
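A minimal sketch of one dependency-free approach, assuming the applet reads the passphrase from stdin when -p is absent; the jar name and prompt handling here are hypothetical:
Code:
#!/bin/bash
# read the passphrase without echoing it, so it never appears in ps or logs
read -r -s -p "Enter Passphrase: " pass
echo

# feed it to the applet on stdin and push the job into the background;
# "app.jar" is a stand-in for the real applet
printf '%s\n' "$pass" | java -jar app.jar &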
I have written a one-line command that parses a file, locates the IP addresses in it, trims the output the way I want, sorts numerically and by uniqueness, and then >> appends to output.txt.
I can get all the IPs into one file, "output.txt", but what I am really looking for is some way to create a text file for each IP it finds, named xxx.xxx.xxx.xxx.txt, and also put that IP address into that file.
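A minimal sketch that starts from the existing output.txt (one unique IP per line) and fans it out into per-IP files, following the naming in the question:
Code:
# one file per IP, named after the IP and containing it
while IFS= read -r ip; do
    printf '%s\n' "$ip" > "$ip.txt"
done < output.txt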
I am writing a bash script that utilizes the output of another script (which I will refer to as script#2). Script#2 is not owned by me; I cannot modify it. All of the output from script#2 is blue, which makes it difficult for me to read.
I would like to have its output changed to grey. Is there a way I can do that in my script? A command I can pipe the output to?
Edit: One other question related to this. I put a trap function in my script that works well. Script#2 essentially runs a tail -f. When I press Ctrl+C to stop it, it stops script#2 but never calls the trap in my script. Is there any way I can work around that?
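A minimal sketch of the recolouring idea: rewrite or strip the ANSI colour codes as the output streams through sed. The escape sequences shown assume script#2 uses the standard blue code 34, 90 being the usual grey; the script name is a stand-in.
Code:
# rewrite blue (ESC[34m, and the bold variant) to grey
./script2.sh 2>&1 | sed -u $'s/\e\\[1;34m/\e[90m/g; s/\e\\[34m/\e[90m/g'

# alternative: remove all colour codes entirely
./script2.sh 2>&1 | sed -u $'s/\e\\[[0-9;]*m//g'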
This is weird. I have a shell script with no execute rights: $ chmod -x test.sh. Then I try $ test.sh, which does not work. (I have "." in PATH.) When I do $ . test.sh, it works! I can run the script even though I have no execute rights. Why is that? Another question: if I have a shell script without a hashbang, I can still execute the shell script. Why? If there is no hashbang, why is the shell script still run? What does the hashbang actually do?
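A minimal sketch that reproduces the behaviour being asked about (test.sh is taken from the question):
Code:
chmod -x test.sh

./test.sh        # fails: executing a file needs the execute bit
. ./test.sh      # works: the current shell just *reads* the file,
                 # so only the read bit matters
bash test.sh     # also works, for the same reason: bash reads it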
Is there a way to execute some command and then, after the command completes, automatically reboot the system, and then after the system reboots execute another command? For example, look at the sequence shown below:
(1) Execute command-1
(2) After command-1 in (1) is completed, reboot the system
(3) Execute command-2
(4) After execution of command-2, reboot the system
Is there a way I can automate this process so that I need not reboot the system manually?
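A minimal sketch of one way to automate this with a cron @reboot entry and a state file; the script path, state file, and step commands are hypothetical.
Code:
#!/bin/bash
# /usr/local/bin/staged-run.sh - run from root's crontab as:
#   @reboot /usr/local/bin/staged-run.sh
STATE=/var/lib/staged-run.step
step=$(cat "$STATE" 2>/dev/null || echo 1)

case "$step" in
    1)
        command-1                 # hypothetical first command
        echo 2 > "$STATE"
        reboot
        ;;
    2)
        command-2                 # hypothetical second command
        echo 3 > "$STATE"
        reboot
        ;;
    *)
        exit 0                    # all steps done
        ;;
esac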
I have a problem with SNMP answers being empty or having spaces.
What I already have:
# get all interface indexes (if you wonder: I'm working for a cable company, and different cable modems have different numbers and types of interfaces):
The problem is the physical address, which is sometimes empty, and the description, which has spaces. So I'm doing two snmpgets, which is slower than one snmpget (sometimes I have up to 18 interfaces).
Let me try to explain it a bit more simply.
Interface 5 gives me back the following lines:
Ethernet CPE Interface
Now the first line should go into the variable ifadm, the 2nd line into ifoper, the 3rd into ifspeed, the 4th into iftype, the 5th (which is empty) into ifphys, and finally the 6th (which has spaces) into ifdescr.
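A minimal sketch using mapfile, which keeps empty lines and embedded spaces intact; get_if_values is a made-up stand-in for the real snmpget pipeline (with demo values, since the actual command and OIDs are not shown in the excerpt):
Code:
# stand-in for the real snmpget pipeline: prints the six answer lines
get_if_values() {
    printf '%s\n' "up" "up" "100000000" "ethernetCsmacd" "" "Ethernet CPE Interface"
}

# one array element per line; empty lines and embedded spaces are preserved
mapfile -t ans < <(get_if_values)

ifadm=${ans[0]}
ifoper=${ans[1]}
ifspeed=${ans[2]}
iftype=${ans[3]}
ifphys=${ans[4]}     # may be empty
ifdescr=${ans[5]}    # may contain spaces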