Programming :: BASH - Output Of Snmpget With Multiple OIDs Into Separate Variables
Jul 7, 2010
I have a problem with snmp answers being empty or having spaces.
What I already have:
#get all interface indexes (if you wonder: I'm working for a cable company, and different cable modems have different numbers and types of interfaces):
The problem is the physical address, which is sometimes empty, and the description, which has spaces. So I'm doing two snmpgets, which is slower than one snmpget (and sometimes I have up to 18 interfaces).
Let me try to explain it a bit more simply.
Interface 5 gives me back the following lines:
Ethernet CPE Interface
Now the first line should go into variable ifadm,
2nd line should go into variable ifoper,
3rd line should go into variable ifspeed,
4th line should go into variable iftype,
5th line (which is empty) should go into variable ifphys and finally
6th line (which has spaces) should go into variable ifdescr
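One way to keep the empty line and the line with spaces intact is to read the whole answer into an array, one element per line, from a single snmpget. This is only a sketch: the OID names and the $community, $host and $ifindex variables are assumptions, and the exact formatting of string values can vary between net-snmp versions.
Code:
# single snmpget; -Oqv prints just the values, one per line
mapfile -t lines < <(snmpget -v2c -c "$community" -Oqv "$host" \
    ifAdminStatus."$ifindex" ifOperStatus."$ifindex" ifSpeed."$ifindex" \
    ifType."$ifindex" ifPhysAddress."$ifindex" ifDescr."$ifindex")

ifadm=${lines[0]}
ifoper=${lines[1]}
ifspeed=${lines[2]}
iftype=${lines[3]}
ifphys=${lines[4]}    # may be empty
ifdescr=${lines[5]}   # may contain spaces, e.g. "Ethernet CPE Interface"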
I have written a one-line command that parses a file, locates the IP addresses in it, trims the output the way I want it, sorts it numerically and by uniqueness, and then appends (>>) it to output.txt.
I can get all the IPs into one file, output.txt, but what I am really looking for is a way to create a text file for each IP it finds, named xxx.xxx.xxx.xxx.txt, and also put that IP address into that file.
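Since the existing one-liner is not shown above, here is a sketch that simply takes the IPs it already writes to output.txt and fans them out into per-IP files:
Code:
# one file per IP, named after the IP itself
while read -r ip; do
    echo "$ip" >> "${ip}.txt"    # e.g. 192.168.0.1 ends up in 192.168.0.1.txt
done < output.txt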
I'm trying to write a bash script and I'm having trouble with it. I have a list of DNS entries in a file called zoneExport.txt. I then want to parse a log file to see whether each DNS entry has been queried. So I'm running a grep command and trying to save its result in a variable. What I'm looking for is a variable ($varGrepQ) that holds the number of matches for the grep query. I will then run this through an if statement and do some things from there.
But my problem right now is with the grep query. It keeps outputting '0' even when I know there are matching records in that file, and when I run the same query on the command line I get the actual count. My thought is that the $record variable is not being passed correctly.
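For what it's worth, here is a sketch of how the count can be captured; $logfile is an assumed name for the log being searched, and the quoting around $record is usually the part that trips this up:
Code:
while read -r record; do
    # grep -c prints the number of matching lines; -- guards against
    # records that start with a dash
    varGrepQ=$(grep -c -- "$record" "$logfile")
    if [ "$varGrepQ" -gt 0 ]; then
        echo "$record was queried $varGrepQ times"
    fi
done < zoneExport.txt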
If I read in variables entered by the user, how can I check that the correct number of values was entered? For example, after reading in a data file and making it into an array, I have:
Code:
echo "To check the data, enter the first element number, last element number and step size as x y z:"
read x y z
It then goes on to start a loop, but what I would like now (before the loop) is a check that three values have been entered before the rest of the script continues.
I've tried referring to the values as $1, $2 and $3, but if I echo $#, the value comes out as zero, so that's obviously not working.
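$1, $2, $3 and $# refer to the script's command-line arguments, not to values obtained with read, which is why $# stays at zero. One sketch of a check is to read the answer into an array and count its elements:
Code:
read -r -p "Enter the first element number, last element number and step size (x y z): " -a vals
if [ "${#vals[@]}" -ne 3 ]; then
    echo "Expected exactly three values, got ${#vals[@]}." >&2
    exit 1
fi
x=${vals[0]} y=${vals[1]} z=${vals[2]}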
Here is what I'm trying to do: I'm writing a bash script, and inside the script I want to open a new terminal and run a bash command in it. I tried to do this, but apparently I get syntax errors.
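Since the attempted command isn't shown, here is a sketch; which terminal emulator is available is an assumption:
Code:
# run a command in a fresh terminal window; "exec bash" keeps the window open
xterm -e bash -c 'echo hello from the new terminal; exec bash' &
# or, on a GNOME desktop:
# gnome-terminal -- bash -c 'echo hello from the new terminal; exec bash'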
I have a bash variable whose content looks like this, where ;f1; and ;f2; are delimiters: ;f1;field1value1;f2;field2 value1 ;f1;field1value2;f2;field2 value2 ;f1;field1value3;f2;field2 value3
So what I need is to extract each combination of f1 and f2 and put them into variables in a loop, something like this:
#first pass of the loop I need: f1=field1value1 f2=field2 value1
#second pass of the loop I need: f1=field1value2 f2=field2 value2
# third pass of the loop I need: f1=field1value3 f2=field2 value3
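A sketch using only parameter expansion, assuming the whole string sits in a variable called $data exactly as shown above; each pass of the loop peels off one ;f1;...;f2;... record:
Code:
data=';f1;field1value1;f2;field2 value1 ;f1;field1value2;f2;field2 value2 ;f1;field1value3;f2;field2 value3'

rest=$data
while [ -n "$rest" ]; do
    rest=${rest#*;f1;}              # drop everything up to and including ;f1;
    record=${rest%%;f1;*}           # current record, up to the next ;f1;
    rest=${rest#"$record"}          # what is left for the next pass
    f1=${record%%;f2;*}
    f2=${record#*;f2;}
    f2=${f2% }                      # trim the trailing space before the next ;f1;
    echo "f1=$f1  f2=$f2"
done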
mkvmerge -o <filename without extension>_TV.mkv -S <filename> && mkvextract tracks <filename> 3:<filename without extension>.*** && perl /home/brian/Desktop/ass2srt.pl <filename without extension>.*** && rm <filename without extension>.***
The goal is to run these commands for multiple files given on the command line, so I can just type ./script.sh *.mkv in my terminal. This is what I have so far, but it doesn't work whatsoever.
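Since the attempt isn't shown above, here is a sketch of such a wrapper. The subtitle extension was censored as *** in the post, so it is left as a placeholder variable here, and the track number 3 is taken from the original command line:
Code:
#!/bin/bash
ext="xxx"                       # replace with the real subtitle extension
for f in "$@"; do
    base=${f%.*}                # filename without extension
    mkvmerge -o "${base}_TV.mkv" -S "$f" &&
    mkvextract tracks "$f" 3:"${base}.${ext}" &&
    perl /home/brian/Desktop/ass2srt.pl "${base}.${ext}" &&
    rm "${base}.${ext}"
done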
I have written quite a few separate bash scripts and PHP scripts that up to now I have run from cron jobs. However, I have to estimate how long each one takes before scheduling the next, so it probably takes much longer than necessary to run them all. They have to run in order.
Now that there are so many, I am thinking it would be better to have a master bash script that runs one after the other, but I am not sure how to make the master script wait for each script to finish before starting the next. Is this possible, and is there a command that makes the script wait between the bash and PHP scripts until they finish, before running the next?
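No special command is needed for this as long as the scripts are started in the foreground: each line of a script only runs after the previous command has exited. A sketch with made-up script names:
Code:
#!/bin/bash
/path/to/first_job.sh
php /path/to/second_job.php
/path/to/third_job.sh
# If something were started in the background with &, "wait" would pause the
# master script until those background jobs finish:
# long_running.sh & other_job.sh & wait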
I have a script that generates a lot of output, including the expansion details produced by set -v -x. I am trying to pipe everything that is displayed into a file, in addition to displaying it on the screen. I've managed to get stderr and stdout into the file, but the expansions are only printed to the screen. Here is what I have so far:
Code:
sudo -u <user> source my_job.sh | tee my_log.txt 2>&1
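Two details get in the way here: source is a shell builtin, so sudo needs to start a shell to run it in, and the 2>&1 has to come before the pipe, because the -v/-x trace goes to stderr. A sketch (note the sourcing happens in the fresh shell started by sudo, not in the current one):
Code:
sudo -u <user> bash -c 'source my_job.sh' 2>&1 | tee my_log.txt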
The idea is that when I grep the local file again later, the matches can be printed with the original log lines intact. Otherwise the newlines are dropped and the lines become concatenated into a single line; e.g., if I rewrite the script that way, echoing $result is not a good idea.
Is there some workaround that lets me save it to a variable rather than a file, but still keep the end-of-line characters? That would simplify my script and save all that I/O!
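Yes; command substitution keeps embedded newlines (only trailing ones are stripped), and they survive as long as the variable is always expanded inside double quotes. A minimal sketch with assumed $pattern and $localfile names:
Code:
result=$(grep "$pattern" "$localfile")
echo "$result"        # newlines preserved
echo $result          # unquoted: the lines collapse onto a single line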
I have a bash script that calls a Java class method. The method returns a string to the Linux console when run independently. How can I assign the value from the Java method to a variable in the bash script? Running the script: java -cp /opt/my_dir/class.method [parameter]
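Command substitution captures whatever the Java code prints to stdout; a sketch reusing the command line from the post:
Code:
# capture the string printed by the Java method into a bash variable
output=$(java -cp /opt/my_dir/class.method [parameter])
echo "Java returned: $output"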
PU12829,24869;PD15733,24869;PD15733,19785;PD12829,19785;PD12829,24869;
PU4599,20915;PD9924,20915;PD9924,18898;PD4599,18898;PD4599,20915;
PU12829,24869;PD15733,24869;PD15733,19785;PD12829,19785;PD12829,24869;
PU4599,20915;PD9924,20915;PD9924,18898;PD4599,18898;PD4599,20915;
PU1723,3423;    # this line is ignored - too short
[Code]...
What I'm trying to do is, while true, cut each line from the file that begins with PU and is longer than 12 characters, and write each such line to its own file with an increasing number, starting with object1 and so on.
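A sketch with awk, assuming the data file is called plotfile; every line that starts with PU and is longer than 12 characters is written to its own numbered file:
Code:
# object1, object2, ... each receive one qualifying line
awk '/^PU/ && length($0) > 12 { print > ("object" ++n) }' plotfile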
I am trying to process a column-separated data file with a few bash commands. For example, I have
Code:
file1 aaaa yes
file2 aaaa no
file3 bbbb yes
Let's say I want to create a new file using the output of the first column and do something else with the output of the 3rd column. Of course there are many ways to process this data file, but I wish to know how I could do it using awk. I'm trying:
Code:
awk '{system("touch $1")}' datafile
but the shell command is not able to get awk's $1 value. How do I get this done? And another question: if a data file contains the name of a shell variable, how can I make use of it in the awk output? For example, I have this datafile1:
Code:
server1 yes
server2 no
And in another server declaration data file, I got this datafile2:
Code:
server1=xxx1
server2=yyy1
And in my awk script, I want to achieve something like looking up each server's value from datafile2 while processing datafile1 (the syntax I tried is definitely wrong; it was just to show what I assume it should look like).
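Since the attempted snippet isn't shown here, the following is a sketch of both parts. For the touch example, $1 has to be concatenated into the command string outside the double quotes; inside them it is just the literal text $1 as far as the shell is concerned. For the lookup, one approach is to let awk read datafile2 first and remember the name=value pairs in an array:
Code:
# part 1: touch a file named after the first column of each line
awk '{ system("touch " $1) }' datafile

# part 2: read datafile2 into an array keyed by server name, then use it
# while processing datafile1 (NR == FNR is only true for the first file)
awk -F'=' 'NR == FNR { value[$1] = $2; next }
           { split($0, a, " "); print a[1], a[2], value[a[1]] }' datafile2 datafile1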
I have a script with an outer and an inner loop. The inner loop issues loads of echos which need to be redirected to a log file determined by the outer loop. The obvious solution is to redirect every echo to >$LOG and set LOG in the outer loop.
Code:
for f in $FILES ; do
    LOG=<logfile>
    for l in $LINES ; do
[code]....
Is it possible to map stdout to $LOG in the outer loop, without having to redirect every subsequent individual command's output?
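Yes: redirecting the inner loop as a whole sends every echo inside it to $LOG without touching the individual commands. A sketch, with a made-up log file name:
Code:
for f in $FILES ; do
    LOG="/var/log/myjob/${f}.log"     # hypothetical name
    for l in $LINES ; do
        echo "processing $l"
        # ... rest of the inner loop ...
    done > "$LOG"
done
# alternatively, exec > "$LOG" inside the outer loop redirects stdout for
# everything that follows, until it is redirected again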
I am not sure the subject line really explains it. Basically, I have a script that executes a CLI Java applet that requires a passphrase from the user. I can easily handle this by passing the -p argument followed by the passphrase, but then the passphrase shows up in logs, or at least in the output of the ps command. If you do not supply the -p argument, the applet prints "Enter Passphrase:" on a new line and asks for input.
How can I provide the input for that passphrase prompt, and is it still possible to send the application to the background with & after the command? I have seen a few examples that use /bin/expect to wait for the prompt and send a response, but I would like to avoid extra dependencies. Example of a regular execution of the application:
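The original example command isn't reproduced above, so the jar name below is made up. One dependency-free sketch is to read the passphrase once with read -s and feed it over stdin; this only works if the applet reads the answer from stdin rather than directly from the terminal (/dev/tty), in which case expect really would be needed:
Code:
read -r -s -p "Enter Passphrase: " pass; echo
printf '%s\n' "$pass" | java -jar /path/to/applet.jar --some-args &
unset pass    # the passphrase never appears on the command line or in ps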
I am writing a bash script that uses the output of another script (which I will refer to as script#2). Script#2 is not owned by me; I cannot modify it. All of the output from script#2 is blue, which makes it difficult for me to read.
I would like to have its output changed to grey. Is there a way I can do that in my script? A command I can pipe the output to?
Edit: one other question related to this. I put a trap function in my script that works well. Script#2 essentially runs a tail -f. When I press Ctrl+C to stop it, it stops script#2 and never calls the trap in my script. Is there any way I can work around that?
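On the colour: one workaround is to pipe script#2 through GNU sed, strip whatever colour codes it emits, and wrap each line in a grey code instead (90 is "bright black"); the script path below is made up, and sed -u keeps the tail -f output from being held back by pipe buffering. On the Ctrl+C point: the interrupt goes to the whole foreground process group, so script#2's tail -f dies along with it; trapping INT explicitly in the outer script and doing the cleanup in that handler is one way around it.
Code:
esc=$'\x1b'    # literal escape character
/path/to/script2.sh 2>&1 | sed -u \
    -e "s/${esc}\[[0-9;]*m//g" \
    -e "s/^/${esc}[90m/" \
    -e "s/$/${esc}[0m/"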
I have a requirement like this: cut the characters from each line of a file at the following positions: 21-24, 25-34 and 111-120. These fields then need to be placed in a tab-delimited output file. Currently this is how I am achieving it:
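The current method isn't shown above; a single-pass sketch with awk (assumed file names infile and outfile) could look like this, with substr(line, start, length) pulling each fixed-width field and OFS joining them with tabs:
Code:
awk 'BEGIN { OFS = "\t" }
     { print substr($0, 21, 4), substr($0, 25, 10), substr($0, 111, 10) }' infile > outfile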
As I'm starting to learn bash scripting, I'm trying to automate some tasks I usually perform. I have a notification mail I need to send several times a day. It has this structure:
Quote:
Dear user, blah blah blah blah
You need to contact the following people:
[code]...
To replace "user", I found this:
Code:
read -p "Please enter username: " username echo "Dear $username,"
Which proved to be very useful, as with other simple notifications like this. But I don't know how to handle the email addresses, as there is usually more than one and the number can vary from 1 to 10. They should appear one above the other. I found this: "Here is a little workaround. The only thing the user needs to do is hit Enter with nothing else on a line and it will close out."
Code:
#!/usr/bin/ksh
word=a
until [[ $word = "" ]];do
[code]....
I tried to use it and modify it for my needs, but I failed; I don't yet understand how to use it. If possible, I would like to use an until loop like the example above, just for learning purposes, but any other approach is welcome as well.
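For learning purposes, here is a bash sketch along the same lines as the ksh example: it keeps reading addresses until an empty line is entered, collects them in an array, and prints them one above the other in the notification (the surrounding text is shortened):
Code:
#!/bin/bash
read -r -p "Please enter username: " username

addresses=()
address="start"                        # anything non-empty, so the loop runs
until [ -z "$address" ]; do
    read -r -p "Enter an email address (empty line to finish): " address
    [ -n "$address" ] && addresses+=("$address")
done

echo "Dear $username, blah blah blah blah"
echo "You need to contact the following people:"
printf '%s\n' "${addresses[@]}"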