I have a doubt: suppose in the script I include an executable, e.g. ./a.out, which takes a PDB file such as 1sn3.pdb as input. I want the output text file to carry the same name, i.e. 1sn3.txt. How do I do that in the shell script?
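One way this is commonly done is with parameter expansion on the input file name; a minimal sketch, assuming the PDB file is passed as the first argument and that ./a.out writes its result to standard output:
Code:
#!/bin/bash
input="$1"                    # e.g. 1sn3.pdb
output="${input%.pdb}.txt"    # strip the .pdb suffix and add .txt -> 1sn3.txt
./a.out "$input" > "$output"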
I am new to shell scripting. What I am trying to do is write a shell script that takes an input file and produces output like the one described below. Each output line should contain the data up to SOK (marked in red) from every second line, followed by the selected data (marked in green) from the fourth line. So the selected data from lines 2 and 4 goes on the first line of the output file, and likewise the selected data from lines 6 and 8 goes on the second line. Input file:
I am creating a script to sync my important documents between two systems, and I want it to generate a log file of the last action. Can you suggest a way to achieve this? Question: if I execute the rsync command with the -v flag, it prints a lot of messages on the console. Is there any way I can redirect these logs to a file instead?
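A minimal sketch of one way to do this, with placeholder source/destination paths and log file name: redirect rsync's verbose output, and its error messages, into the log file instead of the console.
Code:
#!/bin/bash
LOG=~/sync-$(date +%F).log
rsync -av /path/to/source/ /path/to/destination/ >> "$LOG" 2>&1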
This seems so simple when doing it from the command line, but I'm not able to accomplish it inside a script. I am trying to put the output of the following command into a text file:
Code:
CMD= mysql -uroot -psecret -e 'SHOW SLAVE STATUS G;'
FIL=~/replication-`date +%F`.txt
MAILTEXT=~/mailtext.txt
touch $FIL
$CMD > $FIL
Here FIL is a variable that contains the path of the file the command's output should go to. I am running this in a shell script, from which I want to e-mail the contents of $FIL as an attachment using mutt. But I always get a 0-byte file; if I look at it in the directory, the file really is 0 bytes long.
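A rough sketch of an alternative that avoids storing the command in a variable, reusing the credentials and file name from the question: run mysql directly and capture stderr as well, so any error message lands in the file instead of leaving it empty (\G is the mysql client's vertical-output terminator).
Code:
FIL=~/replication-$(date +%F).txt
mysql -uroot -psecret -e 'SHOW SLAVE STATUS\G' > "$FIL" 2>&1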
I use 'grep -Ri "mypattern" .' to search recursively for all files in the current directory that contain "mypattern". But this command returns every single occurrence, so if a given file has several occurrences of the pattern, the screen fills up pretty quickly. More than likely there is a way to restrict the output so that it displays each file only once, no matter how many occurrences it contains, but I couldn't find how to do it.
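A small sketch of the usual approach: grep's -l option prints each matching file name once and suppresses the matching lines themselves.
Code:
grep -Ril "mypattern" .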
I was trying to redirect the output of two variables to different columns of a .csv file in MS Excel, like this:
Code:
echo "$a $b" > abc.csv
But I am getting both $a and $b in the same column. Is there anything I can use instead of the space to move the value of $b to the next column? Or is there a better approach altogether?
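A minimal sketch, assuming a comma is the delimiter Excel expects (some locales use a semicolon instead): separate the two values with the delimiter rather than a space.
Code:
echo "$a,$b" > abc.csv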
I need to monitor specific processes over a time frame in terms of the amount of memory and CPU they use. I can do this using the top -p <pid> option and using ps to retrieve the PIDs. However, since the PIDs will differ and this needs to run on about 13 different machines, I would like to write a script for it that can be run at set intervals. The problem I have is this:
- When running top -p <pid> I can specify a comma-separated list of the processes to monitor at that specific time.
- I can use ps -ef | grep <process> | grep -v grep | awk '{ print $2 }' to retrieve the list of PIDs and output it to a file.
However, how can I output these to the file as a comma-separated list without having to do it manually every time? As an example, say I want to monitor the CPU and memory usage of PostgreSQL and all its child processes: I would grep the ps output for postgres and get the list of PIDs, and that list then needs to be passed to top -p as a comma-separated list. I suspect awk or sed has options for this, but I do not know them well enough; see the sketch below.
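A rough sketch of one way to build the list, assuming pgrep is available: its -d option joins the matching PIDs with the given delimiter. The process name and log file below are placeholders.
Code:
#!/bin/bash
pids=$(pgrep -d, -f postgres)            # comma-separated PID list, e.g. 123,456,789
top -b -n 1 -p "$pids" >> monitor.log    # one batch-mode snapshot appended to the log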
I have this code that is run regularly from crontab, and basically it sends me an e-mail with most of the output, but it misses out some of it!
Here is the crontab code to automatically run the script:
Code:
So that sends me an E-mail with most of the output of the following code:
Code:
It sends me everything up to echo "*******" "Begin compressing and transferring files" "*******", but it won't output the tar part, which should give me a list of the files that have been tarred.
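A hedged sketch of one thing worth trying, with placeholder archive and directory names: redirect the tar step's stderr into stdout, since tar's -v file listing can end up on stderr and then never reach the mail that cron sends.
Code:
echo "*******" "Begin compressing and transferring files" "*******"
tar -czvf backup.tar.gz /path/to/files 2>&1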
I have a small doubt regarding assembly file compilation. I have two .s files; when I compile them I get the corresponding .o files. But when I compare the two .o files with the diff command, it reports that they differ. What commands should I use to understand the difference between the two object files' contents?
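A minimal sketch of one common approach: disassemble both object files and diff the human-readable output, which is usually easier to interpret than a raw binary diff (the file names are placeholders).
Code:
objdump -d first.o  > first.dis
objdump -d second.o > second.dis
diff first.dis second.dis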
For example, I want a file to be processed by sed and then to overwrite the file with sed's output. I would try this:
Code:
sed '<regex goes here>' myfile > myfile
But it doesn't work as expected; instead it empties the file (I am thinking that as the first byte comes out of sed, the redirection truncates the whole file and sed has nothing more to read). How can I make this work?
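A short sketch of two common ways around the truncation, with a placeholder expression standing in for the real regex:
Code:
# GNU sed can edit the file in place:
sed -i 's/old/new/' myfile
# Portable alternative: write to a temporary file, then replace the original.
sed 's/old/new/' myfile > myfile.tmp && mv myfile.tmp myfile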
What is the most primitive way to output a file to the printer? I mean a raw data transfer from the file to the printer. I suppose it must be 'cp some_file /dev/<printer device>'. For the console I know the devices are /dev/ttyN, N = 1, 2, ..., but I do not know what the devices for the printer are.
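A hedged sketch, assuming a raw-capable device node exists on the system and you have write permission to it: parallel-port printers typically appear as /dev/lp0 and USB printers as /dev/usb/lp0, and sending a file straight to one of those bypasses any print filtering.
Code:
cat some_file > /dev/usb/lp0    # or /dev/lp0 for a parallel-port printer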
I'm working on an alias/script that will make it easier to look at my environment via the set command.
On my Ubuntu system, running set also displays some 10 pages of code pertaining to something called ImageMagick at the end of the output. This code begins with the { character.
This is annoying when I want to look at my environment while working on scripts. How could I use something like grep, awk, sed, or whatever to ignore everything after the "{" character?
That seems to be the simplest way, as long as there are no leading braces elsewhere in my environment. (And if you're thinking I should just remove the ImageMagick entry, I might just do that, but I would still like to know how to do this.)
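A small sketch of one way to do it, assuming the function bodies really are the only lines in set's output that begin with "{" (as described above): stop printing at the first such line.
Code:
set | awk '/^[{]/ { exit } { print }'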
When I run a script with nohup, the output is forced into a log file (nohup.out) by default; is there a way to force the output to the screen instead of the file? I need to be able to see when my script gives me a "process started" message, and I don't want to clog up the system with log files.
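A minimal sketch, assuming the aim is just to keep the messages on the current terminal: nohup respects an explicit redirection, so pointing stdout and stderr at the controlling terminal stops it from creating the log file (the script name is a placeholder, and the output is lost once that terminal closes).
Code:
nohup ./myscript.sh > /dev/tty 2>&1 &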
I'm using Ubuntu and programming with Eclipse CDT. My goal is to execute a PHP file and read its output in my C++ program. To do so I thought I should use fork(), dup2() and execl(). In the shell, the call "php myscript.php" works just fine, but in C++ I tried: execl("usr/bin/php", "php", "home/geiger/workspace/SemiServer/server_content/myscript.php", NULL); and it didn't work (the process wasn't terminated and I got no output). I tried different versions of this call, such as dropping the "php" string and/or dropping "home/geiger" from the path string, with no better result.
I have 10 text files (edited in vi) that contain some system-related information. I need to combine the contents of all these files into a single file, and the final output should be in a tabular format.
Is there any command in vi that I can use to create a table?
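A rough sketch of one way to do it from the shell rather than inside vi, with placeholder file names: column -t aligns whitespace-separated fields into a simple table.
Code:
cat sysinfo1.txt sysinfo2.txt sysinfo3.txt | column -t > combined_report.txt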
I have a file that contains "ls -la" output. I would like to display only the filenames, without any of the other information in front of them such as permissions, ownership, size, and date. Would the cut command be the best way to do this, or should I use Vim or sed?
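A minimal sketch using awk, assuming the listing is saved in a file named listing.txt (a placeholder): print only the last field of each line, skipping the short "total" line. Note that this breaks for file names containing spaces; cut with a space delimiter has the same limitation.
Code:
awk 'NF > 2 { print $NF }' listing.txt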
It compares two files using md5; if they are the same, a corresponding character is output to a text file. The problem is that each character gets appended with a newline by default. Is there any way to output it in a normal way? The text is a message and it should be in the proper format. Here is my script:
Code:
#!/bin/bash
g=`tail -1 new.txt|head -n 1`
array=( a b c d e f g h i j k l m n o p q r s t u v w x y z )
for((i=1 ; i<$g+1 ; i++))
[code]....
The message is supposed to be "hello"; I need to get rid of the newlines somehow. A possible fix is sketched below.
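A minimal sketch of the usual fix, assuming the letters are appended to a file such as message.txt (a placeholder) and that the index expression follows whatever the loop above uses: printf, or echo -n, writes the character without the trailing newline that plain echo adds.
Code:
printf '%s' "${array[$i]}" >> message.txt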
file1: contains DNA sequences; each sequence begins with a ">" symbol.
file2: contains protein sequences; each sequence starts with a ">" symbol.
file3: the BLAST result of file2; each result starts with "query=".
My problem is that I have to make a report file combining these three, in such a way that the first sequence from file1, the first sequence from file2 and the first result from file3 are printed together in the report file.
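A rough sketch for the first record of each file, assuming the markers are ">" at the start of a line in file1 and file2 and "query=" in file3 (adjust the case and spelling of the marker if the real files differ): awk counts marker lines and prints only while inside the first record.
Code:
awk '/^>/ { n++ } n == 1' file1      >  report.txt
awk '/^>/ { n++ } n == 1' file2      >> report.txt
awk '/^query=/ { n++ } n == 1' file3 >> report.txt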
I have a Python script that prints to the screen when run.
e.g.
Code:
./international_sms_check.py 0403000511 919227434827
TS 21 check ok
TS 22 check ok
sms successfully delivered from 61403000511 to 919227434827
But when I try ./international_sms_check.py 0403000511 919227434827 > test
the file test is created, but there is nothing in it. If I try ls > test, that works fine, with the output of ls redirected to the file test.
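A hedged guess at a fix: the script may be writing to stderr rather than stdout, or buffering its output, so capturing stderr as well and disabling Python's output buffering is worth trying.
Code:
python -u ./international_sms_check.py 0403000511 919227434827 > test 2>&1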
I know how to redirect the output of a command to a file. For example, if I want to list all the files in ~/Documents and send the output to a file called test.txt, I would do this: ls ~/Documents > test.txt. The question is: can I copy the output to test.txt AFTER I have carried out the command? That way I wouldn't have to know in advance whether I want to copy the output to a file. I want to do something like this: ls ~/Documents and then this: <bash command for copying standard output to test.txt>
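One workaround, sketched rather than a true after-the-fact capture: record the whole session with script(1), so the output of every command is saved and you can copy out what you need afterwards (the file names are placeholders).
Code:
script session.txt    # start recording; everything below is captured
ls ~/Documents
exit                  # stop recording; the output is now in session.txt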
I'm having a slight dilemma about reading data from a text file, formatting it into a table and then displaying it. Basically I'm writing a shell script that takes information from text files and outputs the data in a table with 4 headings. Extracting the data is fine, but I'm having problems creating the table. I think it is possible to do with awk, but so far I'm having a lot of difficulty.
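A minimal sketch of the awk route, with placeholder heading names, field numbers and input file: printf with fixed-width format specifiers lines the four columns up.
Code:
awk 'BEGIN { printf "%-15s %-15s %-15s %-15s\n", "Head1", "Head2", "Head3", "Head4" }
           { printf "%-15s %-15s %-15s %-15s\n", $1, $2, $3, $4 }' data.txt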