I run a third-party command-line utility and it works fine, but sometimes it says "Error blah blah blah... Connection timed out".
I want to script this utility, but I need the script to skip its commands if the utility gives me that connection timeout error.
Is there bash code to capture that response from the utility? Something along the lines of:
Code:
#!/bin/bash
3rdpartyutil > /tmp/temp.txt
if [ ! -f /tmp/temp.txt ]; then
echo "no error, run whatever you need to, man"
fi
rm /tmp/temp.txt
Unfortunately, that doesn't work, because the utility prints non-error information to the screen even when it succeeds, so it always produces output. I never need to see that output, but I do need to act on whether any of it says "error" or "connection timed out".
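One approach, sketched below: capture both stdout and stderr into a variable and grep it for the error text before doing the real work. The error strings are taken from the description above; adjust them to match the utility's exact wording.
Code:
#!/bin/bash
# Capture everything the utility prints, stderr included.
output=$(3rdpartyutil 2>&1)

# Proceed only if nothing in the output mentions an error or a timeout
# (case-insensitive match).
if ! grep -qiE 'error|connection timed out' <<< "$output"; then
    echo "no error, run whatever you need to, man"
fi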
I am capturing the responses of three URLs, which are in a file named urls.txt, using the following command, and writing the responses to an output.txt file.
Code: wget -i urls.txt -q -O - | tee output.txt
Now I am dealing with the case where a URL does not respond: that output is not available to me in the file, and on the console I get "Could not connect to host".
Now I want to modify my urls.txt to have two fields, Name and URL. Example:
URL1 | http://10.0.0.2/xsc/abc
URL2 | http://10.0.0.1/lkj/csv
URL3 | http://10.0.0.5/sdf/plk
I want to fetch each URL and print its connection status against each name. Example:
URL1 : CONNECTED
URL2 : NOT CONNECTED
URL3 : NOT CONNECTED
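A minimal sketch of such a loop, assuming the fields are separated by | exactly as in the example (the 10-second timeout is an arbitrary choice):
Code:
#!/bin/bash
# Read "Name | URL" pairs from urls.txt and test each URL.
while IFS='|' read -r name url; do
    name=${name// /}    # strip the spaces around the | separator
    url=${url// /}
    if wget -q --timeout=10 -O /dev/null "$url"; then
        echo "$name : CONNECTED"
    else
        echo "$name : NOT CONNECTED"
    fi
done < urls.txt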
I want to prefix each row with the number of ✔ characters it contains. If I use the command :.s/✔//gn, I get output like '2 matches on 1 line'. How can I extract the '2' from that message?
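Outside vim, the bare count is easy to get with awk, using ✔ as the field separator so the count per line is NF-1. This is a swapped-in shell approach rather than a vim-native one; it assumes a UTF-8-aware awk such as gawk, and file.txt is a placeholder name:
Code:
# Prefix every line with the number of ✔ characters it contains.
awk -F'✔' '{ print NF-1, $0 }' file.txt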
I am writing a bash shell script on RHEL. I need a way to analyze the output from a command, and provide a response to that command depending on what is found.
On the command line this looks like:
In other words, I want to script this: capture the output from the mlsmailbox --delete command, respond with "yes" if the mailbox was found, and move on if it was not found. There may be other responses to the mlsmailbox --delete command that I need to analyze and respond to as well.
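For full interactive back-and-forth, expect is the usual tool; for a simple found/not-found case, piping yes into the command and inspecting what it printed may be enough. In the sketch below the mailbox name and the "not found" message are guesses, not the real mlsmailbox output:
Code:
#!/bin/bash
# Answer yes to any confirmation prompt, then inspect the command's output.
output=$(yes | mlsmailbox --delete somemailbox 2>&1)
case "$output" in
    *"not found"*) echo "mailbox did not exist, moving on" ;;
    *)             echo "$output" ;;
esac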
I've made a simple PHP wrapper around scp. It works fine, but unlike when I run the scp command straight from the console, no output is returned. I've tried using passthru(), exec(), system() and shell_exec(), all to no avail. I'm already redirecting stderr to stdout.
The wrapper will scp the files to the server correctly, but it doesn't print any output; $result is just an empty array. I'd like to see the output so I can visually confirm that the files have been transferred correctly.
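One likely explanation: scp draws its progress meter only when its output is a terminal, so under exec() it stays silent even on success. Its verbose messages go to stderr regardless of a tty, so a command string along these lines (shown in shell form; the same string would go into the PHP wrapper) gives something to capture:
Code:
# -v writes connection and transfer details to stderr even without a tty;
# 2>&1 folds them into the stream exec() collects.
scp -v localfile user@host:/remote/path/ 2>&1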
then luvcview should capture the video to the .avi file. However, when I try it, no output file is created. Grabbing a raw stream or raw frames works fine, but not the creation of AVI files. Am I missing something?
I would like to capture all output spewed to a terminal session, including from terminated processes that were invoked from a script running in a terminal window. This goes beyond capturing just stderr and stdout. For example:
{ ./script; } 2> stderr.cap 1> stdout.cap
If the script is terminated (including because of memory violations), output is spewed to the terminal. I would like to capture that spewing to a file automatically, or send it to the bit bucket /dev/null. Is there another file handle that can be redirected to do this? If so, how? Or is there another way?
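One standard answer is the script(1) utility, which interposes a pseudo-terminal and records everything that reaches it, including the output spewed as a process dies:
Code:
# Everything that would have hit the terminal, stdout, stderr and
# termination messages alike, is recorded in all.cap.
script -c ./script all.cap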
I want local programmatic access to ssh output in Mac Terminal. First, I tried redirecting the output of each command to a file. The file was perfect, but of course it was on the remote server, and an sftp for each command's output seemed a little... Next, I tried to AppleScript Terminal, but it only gives access to the currently visible text in a tab (i.e. if half the output has already scrolled out of sight, it doesn't get returned, which makes it useless).
Last, I tried piping ssh to tee (e.g. ssh user@host | tee output.txt). This almost worked. I have the output in a local file, but there are a lot of unwanted characters mixed in. For example, every time I hit backspace, there's a ^H in the file. There's also text like "[0m[K" which is harder to get rid of.
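Those are terminal control sequences. As a sketch, two cleanup passes usually help: col -b resolves the backspace overstriking, and a sed expression strips ANSI escapes such as the [0m and [K remnants:
Code:
# Remove backspace/^H overstriking, then strip ANSI escape sequences.
col -b < output.txt | sed 's/\x1b\[[0-9;]*[a-zA-Z]//g' > clean.txt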
I have a backup schedule running a full backup every day, and I'm using Webmin to manage these backups now. The problem is that when the dump command sends a prompt asking whether we want to rewrite the tape, Webmin does not display the prompt, and we end up having to terminate the backup, erase the tape (which takes a long time), and then run the backup again. I was wondering whether there is a technique for passing "yes" as a parameter to the dump command, much like in Windows, or whether there is a more efficient way of getting this done.
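A common trick, sketched below, is to pipe the yes utility into dump so every prompt is answered affirmatively. The dump flags are placeholders for whatever the schedule already uses, and note that this answers yes to every question dump asks, not just the rewrite prompt:
Code:
# yes with an argument repeats that word forever; dump's prompts expect a
# literal "yes" rather than "y".
yes yes | dump -0u -f /dev/st0 /home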
I am using openSUSE 10.3. When I install software from a tarball, I record the time required by sending the output of date to beg.txt (when the installation begins) and end.txt (when it finishes). How can I append the output of date to a file so I don't need two files?
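Appending is just a matter of using >> instead of >. A sketch with a single file (times.txt is a placeholder name):
Code:
date >> times.txt    # when the installation begins
# ... configure, make, make install ...
date >> times.txt    # when it finishes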
When I try to access any page, even small HTML pages, it sits for about 3 seconds in the "HTTP request sent; awaiting response" state, even when I use Lynx locally on the server, bypassing any possible network issues. The logs don't show a thing. The server itself is a high-end machine running nothing but Apache, which is not serving any clients right now; the firewall is disabled and HostnameLookups is set to Off.
I want to run gsettings list-schemas (which returns a list of about 100 names separated by whitespace) and somehow direct each name, one at a time, as the input to this command: gsettings list-recursively. I've tried it with awk, with standard | piping, and by capturing a string variable with strvar=$(gsettings list-schemas) and using $strvar as the input, but I am missing something in between, I'm sure, like a for or while loop or the proper awk syntax.
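Two standard ways to run the second command once per name, using only the two gsettings subcommands already mentioned:
Code:
# xargs -n 1 invokes the command once for each whitespace-separated name:
gsettings list-schemas | xargs -n 1 gsettings list-recursively

# or the while-read form:
gsettings list-schemas | while read -r schema; do
    gsettings list-recursively "$schema"
done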
I want to use the output of a previous command as a parameter to another command. For example, to learn where nice is stored I typed which nice, and the output was /usr/bin/nice. The second command I then typed was ls -l /usr/bin/nice. Is there a way to do this with a single command, something like ls -l which nice?
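Command substitution does exactly this: the shell replaces $(...) (or the older backticks) with the command's output before running the outer command:
Code:
ls -l "$(which nice)"    # modern form
ls -l `which nice`       # older backtick form, same effect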
I am trying to grep multiple numbers from a file; grep does have the -f option for that.
Code: grep -f <`seq 500 520` /etc/passwd
I know this could be done with
Code: for i in `seq 500 520`; do grep "$i" /etc/passwd; done
But my question goes well beyond this example: is it possible to redirect one command's output so that it is treated as the contents of a file by another command?
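Yes: bash's process substitution, <(command), makes a command's output look like a file name. That is also why the attempt above fails, since <`...` is a plain stdin redirection to whatever words the backticks expand to, which the shell rejects as ambiguous:
Code:
# <(seq 500 520) expands to a path like /dev/fd/63 holding seq's output,
# which grep -f then reads as its pattern file.
grep -f <(seq 500 520) /etc/passwd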
I want to scan a particular directory recursively and run a particular command with each file as input. For this I am using find /dir/path. I don't want to write a long script looping over the output of find; I want a single command that runs a command on each file in the find output.
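find can do this itself with -exec; in the sketch below md5sum stands in for the particular command:
Code:
# Run the command once per file found:
find /dir/path -type f -exec md5sum {} \;

# Or batch many files per invocation, which is faster and safe with odd
# filenames:
find /dir/path -type f -print0 | xargs -0 md5sum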
My server is currently copying a large amount of data. I periodically check how much has already been copied by using the ls command in the destination folder. Is there a way to make ls update itself, like a log does, or like when using cp -v?
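The watch utility reruns a command at a fixed interval, which gives ls exactly that self-updating behaviour; the 2-second interval and paths below are placeholders:
Code:
watch -n 2 ls -l /path/to/destination     # refreshing directory listing
watch -n 2 du -sh /path/to/destination    # running total of data copied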
I write LaTeX in Emacs and then run a shell script to process the LaTeX code. I used to run a subshell buffer with M-x shell and then execute the script from within there, but this results in a lot of switching between buffers, which seems unnecessary. Then I found out about executing shell commands with M-! cmd RET, as described here: [URL]. The problem is that the output from the script splits my screen. It's a nuisance, and I would like to run the script without any output. I've tried appending > /dev/null to the command, but it doesn't work. For example, when from within Emacs I enter M-! followed by
Code: sh make.sh > /dev/null
it splits my screen so that one portion displays output from the make.sh script. I want it to run silently and leave my Emacs buffers alone.
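> /dev/null only silences stdout; the window that pops up is most likely showing stderr. Redirecting both streams usually keeps the frame from splitting:
Code:
sh make.sh > /dev/null 2>&1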
Code: man -k mail
This lists commands that contain the keyword "mail" in their description. I want to view the output of this command in less, with the matching words highlighted by grep. Something like:
Code: man -k mail | grep mail | less
The command doesn't work; how do I fix it?
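The pipeline runs, but grep drops its highlighting when writing to a pipe, and less strips raw color codes by default. Forcing color on and telling less to pass escapes through fixes both ends:
Code:
man -k mail | grep --color=always mail | less -R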
Not sure exactly how to explain it: my command prompt screens appear messed up. The resolution seems to have squeezed the command prompt vertically into just a few pixels at the top of the screen, and the prompt also repeats several times horizontally. This happens during boot-up as well. I'm still in the experimental phase of learning how Linux works, messing with GDM and installing/uninstalling boot screen tweaks here and there to see what would happen. I can't quite remember exactly WHAT I did.
This is an extremely weird issue that I can't find any help with on Google. It is minor but extremely annoying.
When I type a Linux command in the terminal (e.g. "ls -la") and then press Enter, the cursor goes to the next line and just sits there, as if it's processing some long command.
If I press Enter again, I see the ls output as well as my prompt twice. It's as if the terminal window isn't auto-scrolling, but I've also seen this happen when there wasn't even enough text on the console screen to warrant a scrollbar. Has anyone seen this before and knows what I need to do? I hope what I'm asking makes sense.
Does anybody know whether something changed in openSUSE 11.2 in the way users are logged in to X? I am running an application that uses the notify-send command to send pop-ups, and it is not working properly, although it works in openSUSE 11.1, 11.0, and 10.x. The same goes for SLED and SLES, all versions. This is what I have found so far: before, the who output was
Code: $ foo@bar:~/Desktop> who
foo      :0          2010-01-26 14:40
For example, if I type :pwd to get the current working directory, I can select the text in gvim but I can't figure out how to copy it to the clipboard. If I try the same in console vim, I can't even select it with the mouse. I would like this to work with all vim commands, for example copying the guifont=Consolas:h10:cANSI output of :set guifont.
I am running ps xo "pid,command" but I can't find my process in the results. I know the process is running, because I can see it with ps ax | grep command-name.
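The x option on its own lists only your own processes; adding a includes those of every user, which is probably why the process is missing from the first listing:
Code:
ps axo pid,command | grep command-name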