General :: Unable To Redirect Script Output To File
Aug 25, 2009
I have a Python script that, when run, outputs to the screen.
eg.
./international_sms_check.py 0403000511 919227434827
TS 21 check ok
TS 22 check ok
sms successfully delivered from 61403000511 to 919227434827
But when I try: ./international_sms_check.py 0403000511 919227434827 > test
The file test is created, but there is nothing in it. If I try ls > test, this works fine, with the output of ls redirected to the file test.
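The usual culprits here are that the script writes to stderr rather than stdout, or that Python is buffering its output. A minimal sketch of both checks (script name and arguments as in the post):
Code:
# capture stderr as well, in case the script prints there
./international_sms_check.py 0403000511 919227434827 > test 2>&1
# or run Python unbuffered, in case output is stuck in a buffer
python -u ./international_sms_check.py 0403000511 919227434827 > test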
I was trying to redirect the output of two variables to different columns of a .csv file in MS Excel, like this:
Code: echo "$a $b" > abc.csv
But I am getting both $a and $b in the same column. Is there anything I can use instead of the space to move the value of $b to the next column? Or is there a better approach altogether?
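Excel splits .csv columns on commas, so a comma separator in place of the space should do it; a minimal sketch with example values:
Code:
a="first"; b="second"       # example values
echo "$a,$b" > abc.csv      # the comma puts $b into the next column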
I'm working on some scheduled task script files to keep nightly backups of some of our database information in place, and it's a bit annoying when they blow up. I know how to redirect stdout and stderr to a flat file I can view when I come in, and I know that 2>&1 maps them both to the same file (whatever was named in 1). However, I'm running into some cron-time situations where it's easier to have the two streams together, and other cron-time situations where it's easier to have them separated. I can't really tell which is going to happen; is there some way I could create both kinds of output file for my scripts, so that I've got a std_err only file and an interleaved std_out/std_err file?
Note: I've looked at the 'tee' command, but I don't think it will work for what I'm after. 'tee' appears to only work with stdout; I'm trying to work with stderr.
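tee can in fact be pointed at stderr via bash process substitution. One possible approach, sketched below with a hypothetical backup.sh: both writers open the combined file in append mode, so the streams interleave (only approximately, because of buffering), and a cron job would need SHELL=/bin/bash in the crontab for the process substitution to work:
Code:
: > combined.log; : > stderr_only.log           # truncate old logs
./backup.sh > >(tee -a combined.log >/dev/null) \
            2> >(tee -a stderr_only.log >> combined.log)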
I am trying to grep multiple numbers from a file; grep does have the -f option for that.
Code: grep -f <`seq 500 520` /etc/passwd
I know this could be done with
Code: for i in `seq 500 520`; do grep "$i" /etc/passwd; done
But my question goes far beyond this example: is it possible to redirect one command's output so that it is treated as the content of a file for another command?
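Yes, bash's process substitution, <(...), does exactly that: the command's output appears under a pseudo-file name that the other command can open. Note that grepping bare numbers against /etc/passwd will also match them inside other fields, so anchoring the patterns helps; a sketch:
Code:
# <(...) exposes seq's output as a file for grep -f
grep -f <(seq 500 520) /etc/passwd
# anchored variant matching only the UID (3rd colon-separated field)
grep -f <(seq 500 520 | sed 's/.*/^[^:]*:[^:]*:&:/') /etc/passwd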
I am again struggling to make a script work, but hey, it is fun, I am learning new things. I discovered the set -x option, which was, for me, like the second coming. Still, what I am not able to do is redirect ALL output to a (log) file, including what is produced by the -x setting. Let's assume a very simple script:
Code:
#!/bin/bash
set -x
source="/home/atelier/Bureau/"
ls -la $source
and I am running it as . test.sh >> /var/log/test.rmcb.log
The result of ls goes indeed into the log file, but the rest still shows on the console where I am running the script:
Code:
++ source=/home/atelier/Bureau/
++ ls --color=auto -la /home/atelier/Bureau/
Is there a way to redirect EVERYTHING to the log file?
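The trace lines produced by set -x go to stderr, not stdout, so they slip past a plain >> redirection. Either redirect both streams at the call site, or do it inside the script with exec; a sketch:
Code:
# at the call site: the -x trace goes to stderr, so catch it too
. test.sh >> /var/log/test.rmcb.log 2>&1
# or inside the script, before set -x (when sourced with ".", this
# redirects the calling shell as well):
exec >> /var/log/test.rmcb.log 2>&1
set -x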
I want to redirect the output of a command to a file, but not at the end of the file, but after a line. Do you know how can I do it?
Something like:
cat file_a | grep some_text >> resulting_file
# in this file I need to place the output from grep, but not at the bottom of resulting_file, like it would normally happen, but after line 3, for example
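GNU sed's r command reads a file in after a given line, so one possible approach is to stash the grep output in a temporary file and splice it in after line 3 (file names taken from the comment above):
Code:
grep some_text file_a > /tmp/grep_out.$$        # stash the matches in a temp file
sed -i "3r /tmp/grep_out.$$" resulting_file     # GNU sed: insert its contents after line 3
rm -f /tmp/grep_out.$$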
I'd like to redirect the output to a file and to the console. I know about tee, but the issue is that it waits until the first process finishes. E.g. echo "hello world" | tee test.txt first calls echo and then tee. Is there a way to redirect "on the fly"?
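tee itself writes as input arrives; the delay usually comes from the producer block-buffering its stdout once it feeds a pipe instead of a terminal. With GNU coreutils you can force line buffering; a sketch with a hypothetical long-running command:
Code:
# force the producer to flush each line into the pipe as it is printed
stdbuf -oL some_long_running_command | tee test.txt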
I booted to the command line only and entered the following command: sudo Xorg -configure > xorglog.txt
The command seems to run just fine and does create a new xorg.conf.new file, but I would like to see all the output of the Xorg -configure command; it just scrolls by too fast and I can't go back to see it. Hence why I'm trying to use >, but it seems to be ignored.
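Xorg writes its log output to stderr, which > alone does not catch; redirecting both streams should capture everything (command as in the post):
Code:
sudo Xorg -configure > xorglog.txt 2>&1
# or page through it instead of saving it:
sudo Xorg -configure 2>&1 | less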
I have got a script with an outer and inner loop. The inner loop issues loads of echo's which need to be redirected to a log file determined by the outer loop. The obvious solution is to redirect every echo to >$LOG and set LOG in the outer loop.
Code:
for f in $FILES ; do
  LOG=<logfile>
  for l in $LINES ; do
    ...
Is it possible to map stdout to $LOG in the outer loop without having to redirect every subsequent individual command's output?
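Yes, a redirection can be attached to the inner loop's done, so everything echoed inside it lands in $LOG without touching the individual commands. A minimal sketch, assuming the log name is derived from $f:
Code:
for f in $FILES ; do
  LOG="$f.log"                 # hypothetical: derive the log name from $f
  for l in $LINES ; do
    echo "processing $l"       # no per-command redirection needed
  done > "$LOG"                # one redirection covers the whole inner loop
done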
Is there one command that will let me record an entire terminal session (with any possible errors) to a text file while also seeing all output on screen too? I know it can be done for individual commands, but I'm looking to do this for an entire session where the individual commands will be normal (i.e., not piped into tee, etc.). It would be even better if the command prompt is captured too. The obvious utility of this makes me think someone surely has come up with a solution long ago (probably in the 60's).(I'm sure it goes without saying, but subsequent output in that session should be appended to the file. The file should contain the full history, with all output and errors, of the session.)
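The script(1) utility is that long-ago solution: it records the raw terminal stream, prompts and errors included, while everything still shows on screen. A sketch:
Code:
script -a session.log    # -a appends, keeping the full session history
# ... work normally; everything displayed is also written to session.log ...
exit                     # ends the recording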
I'm doing some commands on a remote server (using ssh to log on to the remote server; I did an ssh key swap). How do I redirect the output of a command back to the local server? The person who helps me out is my HERO; I'm really stuck on this, and it would bring me a lot further if I got this to work.
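A redirection placed outside the ssh command is performed by the local shell, so the remote command's output lands in a local file; a sketch with hypothetical names:
Code:
# the command runs remotely, but > is handled by the local shell
ssh user@remoteserver 'some_command' > /tmp/local_output.txt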
I have a process launched by another app, and I want to see its output (which goes to the console) in a terminal (gnome-terminal or a tty). How can I capture the standard I/O of a process? My process (aria2) is launched by Firefox, and the output of ps is like:
...it is running, but I can't see the output (download state). How can I capture or redirect its standard I/O to my terminal, to get something like the output of:
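Once a process is already running with its stdout pointed elsewhere, it cannot simply be re-homed to your terminal, but you can watch what it writes with strace (run as the same user or root). A sketch, assuming the process name matches aria2 (the binary may be aria2c):
Code:
# dump every write() the process makes, up to 256 bytes per call
strace -p "$(pgrep -o aria2)" -e trace=write -s 256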
I am trying to automate an svnadmin dump command for a backup script, and I want to do something like this:
find /var/svn/* \( ! -name dir -prune \) -type d -exec svnadmin dump {} > {}.svn \;
This seems to work, in that it looks through each svn repository in /var/svn, and runs svnadmin dump on it.
However, the second {} in the -exec command doesn't get substituted with the name of the directory being processed. It basically just results in a single file named {}.svn.
I suspect that this is because the shell interprets > to end the find command, and it tries redirecting stdout from that command to the file named {}.svn.
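That suspicion is right: the local shell handles > before find ever runs. Wrapping the dump in a sub-shell makes the redirection happen once per matched directory; a sketch:
Code:
# sh -c receives each directory as $1, so > is evaluated per match
find /var/svn/* \( ! -name dir -prune \) -type d \
  -exec sh -c 'svnadmin dump "$1" > "$1.svn"' _ {} \;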
I have a program that writes to stdout. Is there a way that I can redirect the output to the linux diff command or do I have to write the output to a file and then compare that. For example I have a bunch of test input files for a program and the corresponding expected output in another set of files. And I'd like to do something like ./program < t1.input | diff t1.expected.
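diff accepts - to mean stdin, so the program's output can be compared without a temporary file; a sketch using the names from the post:
Code:
# '-' tells diff to read one side from stdin
./program < t1.input | diff - t1.expected
# bash alternative with process substitution:
diff <(./program < t1.input) t1.expected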
My Problem: The output redirection of a script works if the script is called in the terminal, but not when it's called via crontab.
My Situation: I have 2 scripts: ~/backup1
Code:
echo backup a to c
rsync -a -v --progress --delete --exclude=.Trash-1000 /path/a/ /path/c/backup/
echo backup b to c
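cron runs jobs with a minimal environment and no terminal, so the safest pattern is absolute paths everywhere and a redirection in the crontab entry itself; a sketch with hypothetical schedule and paths:
Code:
# user crontab entry: absolute script path, both streams into a log
0 2 * * * /home/user/backup1 >> /var/log/backup1.log 2>&1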
Apologies if I'm posting to the wrong forum; be so kind as to tell me where to better ask this question, as I'm really not finding the right words to google for. So, I have a shell application (fdb), which is a Flash debugger. I want to run it using a bash script, capture its output, and pass it commands (it can read from STDIN). The reason I want to do so is that Flash Builder (the IDE for Flash development) is plain stupid when it comes to compilation, and it won't allow me to compile any file in the project... so, I found out that I can make Eclipse run an external tool. This external tool is my *.sh file, which launches the compiler and then launches the debugger. The Eclipse console can display the compilation results, or errors. When I run the debugger, it can even pass the input from the Eclipse console to the debugger; however, the output from the debugger isn't shown.
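A plausible cause is that interactive tools buffer or suppress output when stdout is a pipe instead of a terminal, which is exactly what Eclipse gives them. Two guesses worth trying in the wrapper script, hedged as such:
Code:
# 1) make sure fdb's stderr also reaches the Eclipse console
fdb 2>&1
# 2) or give fdb a pseudo-terminal so it behaves as it does interactively
script -q -c "fdb" /dev/null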
I would really like to capture the output of scp and my file's progress. scp updates the transfer rate every second, and I would like to save the transfer rate at every update. So, for example, if the file transfer takes 30 seconds, I would like 30 reports of the transfer rate.
The output looks like: Code: file.dat 1% 3664KB 938.5KB/s 05:48
Whenever I try a simple redirect like: Code: scp file.dat 192.168.1.100:~/ &> output ... it does not save the rate at every update; it only shows the final rate.
If I try using typescript by starting "script" ... it's the same deal.
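The progress meter rewrites a single line using carriage returns, so even a successful capture looks like one final line. One possible approach: record under script (which gives scp the tty its meter wants) and then turn the \r separators into newlines (file and host as in the post):
Code:
script -q -c "scp file.dat 192.168.1.100:~/" transfer.raw   # the pty keeps the meter alive
tr '\r' '\n' < transfer.raw > transfer.log                  # one line per update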
I've got a bash script which reminds me of my friends' birthdays, and I want to run it as a cron job every day, but Linux just emails me the output. Now my question is: how do I redirect the cron output to the screen?
PS: when I run the script manually it runs very well, so the script itself is good. And I have tried:
1.30 8 * * * root /home/birth.sh >/dev/console
it shows nothing
2. 30 8 * * * root /home/birth.sh >/dev/tty1
the same as 1
3. 30 8 * * * root /home/birth.sh >/dev/tty
it shows: /bin/sh: cannot create /dev/tty: No such device or address
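That error is expected: cron has no controlling terminal, so /dev/tty does not resolve. Writing to the pseudo-terminal of the logged-in session works, though its number varies per login; run tty in the target session first. A sketch, keeping the system-crontab format with the user field:
Code:
tty      # run this in the session where output should appear, e.g. /dev/pts/0
# crontab entry (the pts number is session-specific):
30 8 * * * root /home/birth.sh > /dev/pts/0 2>&1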
Hello. I tried to find a good subject line, but this was the best I could come up with; anyway, I'll explain it here. Sometimes I do something like installing a new application in a Linux terminal on my office PC, but it takes a long time and I have to go home during the installation or configuration process, and it is not good to cancel it. My current solution is abandoning the process until the next day. I wanted to know: is there any way to redirect the input and output of a terminal to another one? If that works, I could continue my abandoned process by ssh-ing to my Linux office PC and redirecting that terminal to my new remote terminal at home.
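A plain terminal can't be re-attached after the fact, but starting long jobs inside screen (or tmux) gives exactly this workflow: detach at the office, re-attach over ssh from home. A sketch with hypothetical names:
Code:
screen -S install     # at the office: start a named session, run the long task inside it
# press Ctrl-a d to detach (or just go home); later, from home:
ssh user@office-pc
screen -r install     # re-attach and continue where it left off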
Code: curl "http://site.com/pages/{1,2,3,4,5}.html" > /home/myuser/allpages.html
I need to save each page in a separate file. By the way, I have tried this command:
Code: curl "http://site.com/pages/{1,2,3,4,5}.html" > /home/myuser/{1,2,3,4,5}.html
but it displays the error Code: ambiguous redirect. Is there any way to do that?
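curl can do this itself: with -o, the placeholder #1 expands to whatever matched the {...} list, writing each page to its own file:
Code:
# "#1" is replaced by 1..5 for each fetched page
curl "http://site.com/pages/{1,2,3,4,5}.html" -o "/home/myuser/#1.html"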
I want to compare the quality of a sound file (voice) before and after its transmission via a softphone (sjphone in my case). For this, I need to redirect the sound being played to the sound input (microphone or line-in).
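With PulseAudio, every output sink has a matching "monitor" source carrying whatever is being played, so pointing the softphone's input (or a recorder) at it gives the loopback in software; a sketch, with the sink name being system-specific:
Code:
pactl list short sources                     # find the <sink>.monitor entry
parecord --device=<sink>.monitor check.wav   # or select that monitor as sjphone's input device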