General :: Read Data From Text File And Output Into A Table?
Apr 12, 2010
I'm having a slight dilemma reading data from a text file and outputting it into a table, then displaying it. Basically I'm writing a shell script that takes information from text files and outputs the data into a table with 4 headings. Extracting the data is fine, but creating the table is where I'm having problems. I think it's possible to do it with awk, but so far I'm having a lot of difficulty. My code extracts the data and outputs the string to another file, which works fine. The text file looks like this:
mr smith 1 purchase oct 2007
mrs smith 2 purchase nov 2006
and I want it to look like this
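A minimal sketch of the awk approach, assuming each extracted record has the fields in the order shown above (name, quantity, type, month, year) and that the extracted file is called purchases.txt; both the field layout and the file name are assumptions.
Code:
# print a 4-column table with a header row; printf keeps the columns aligned
awk 'BEGIN { printf "%-12s %-6s %-10s %-10s\n", "Name", "Qty", "Type", "Date" }
     { printf "%-12s %-6s %-10s %-10s\n", $1 " " $2, $3, $4, $5 " " $6 }' purchases.txt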
I've got lines of data in the following format: space1 = number of times the error has occurred, space2 = IP address, space3 = error.
I've set this out nicely with printf and made it email me. The problem is it's not entirely clear what each column/space is, and the IP and occurrence count can sometimes seem confusing. Is there any (easy) way to output this into an ASCII-like table? There will always be 5 occurrences, and the format will always be the same.
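A minimal sketch using column to line everything up, assuming the data is in errors.txt and the header names are placeholders; this works as long as the error field itself contains no spaces.
Code:
# prepend a header line, then let column -t align everything into an ASCII table
{ echo "COUNT IP ERROR"; cat errors.txt; } | column -t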
I'm trying to output a list of running processes via a shell script. At the moment I've got this, which outputs the processes to a text file called out.
echo $(ps aux) >>out
The problem is, though, the processes are all just one big block of text, which makes it hard to read. Does anyone know how to sort the output to a text file so that it prints one process per line? I know it's probably simple but I'm very new to Linux.
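A minimal sketch of one likely fix: the unquoted command substitution collapses all whitespace, including newlines, into single spaces before echo sees it. Writing ps directly to the file (or quoting the substitution) keeps one process per line. The file name out is taken from the question.
Code:
# append the process list to "out", one process per line
ps aux >> out

# equivalent fix if command substitution is required: quote it
echo "$(ps aux)" >> out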
I am new to shell scripting. What I am trying to do is write a shell script which takes the input file and produces output as described below. The output file should have the data up to SOK (marked in red) from every second line, and then the selected data (marked in green) from the 4th line. So the selected data from the 2nd and 4th lines goes on one line of the output file, and similarly the selected data from the 6th and 8th lines goes on the second line of the output file, as in the sketch below. Input File:
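A minimal sketch of the line-pairing part only; the "SOK" and "marked in green" field extraction is not visible here, so the program just joins the whole lines, and the file names input.txt and output.txt are placeholders.
Code:
# pair every 2nd and 4th line of input.txt onto one output line
awk 'NR % 4 == 2 { first = $0 }          # lines 2, 6, 10, ...: remember the line
     NR % 4 == 0 { print first, $0 }     # lines 4, 8, 12, ...: print the pair
    ' input.txt > output.txt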
I'm using Ubuntu and I'm programming with Eclipse CDT. My goal is to execute a PHP file and read its output into my C++ program. To do so I thought I should use fork(), dup2() and execl(). In the shell, the call "php myscript.php" worked just fine, but in C++ I tried: execl("usr/bin/php", "php", "home/geiger/workspace/SemiServer/server_content/myscript.php", NULL); and it didn't work (the process wasn't terminated and I got no output). I tried different versions of this call, like dropping the "php" string and/or dropping "home/geiger" from the path string, with no better result.
file1: has DNA sequences, and each sequence begins with a > symbol. file2: has protein sequences, and each sequence starts with a > symbol. file3: the BLAST result of file2, and each result starts with query=. My problem is that I have to make a report file by combining these three, in such a way that the first sequence from file1, the first sequence from file2 and the first result from file3 are printed together in the report file.
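A minimal sketch of one way to interleave the three files, assuming file1 and file2 are FASTA-style (records start with ">"), file3's results start with "query=" exactly as written in the question, and the chunk and report file names are placeholders.
Code:
#!/bin/bash
# split each file into numbered per-record chunks
awk '/^>/      { n++ } n { print > ("seq1_" n) }' file1
awk '/^>/      { n++ } n { print > ("seq2_" n) }' file2
awk '/^query=/ { n++ } n { print > ("res_"  n) }' file3

# interleave chunk 1 of each file, then chunk 2, and so on, into report.txt
total=$(ls seq1_* | wc -l)
: > report.txt
for i in $(seq 1 "$total"); do
    cat "seq1_$i" "seq2_$i" "res_$i" >> report.txt
done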
It compares two files using md5; if they are the same, a corresponding character is output to a text file. But the problem is it gets appended by default. Is there any way to output it normally? The text is a message and it should be in a proper format. Here is my script:
Code:
#!/bin/bash
g=`tail -1 new.txt | head -n 1`
array=( a b c d e f g h i j k l m n o p q r s t u v w x y z )
for (( i=1 ; i < $g+1 ; i++ ))
....
The message is supposed to be hello; I need to get rid of the newlines somehow.
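A minimal sketch of the two fixes usually involved here; the file names fileA, fileB and message.txt are placeholders, not taken from the script above.
Code:
#!/bin/bash
: > message.txt                        # truncate the output file once, instead of appending forever

if [ "$(md5sum < fileA)" = "$(md5sum < fileB)" ]; then
    # printf omits the trailing newline that echo adds, so successive
    # characters join into a single word such as "hello"
    printf '%s' "h" >> message.txt
fi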
I'm trying to write a shell script which finds bits of data in a text file. At the moment I'm using grep, and basically I need a function which will look through the text file and pull the data out of it. The file has days, months, years etc. and I want to be able to type feb 06 and have it find all of the data for feb 06.
The problem I have is that I can type feb and all the information comes back for feb, but I can't get it more precise, e.g. feb 2009 finding just feb 2009; it seems to ignore the latter half. I've tried experimenting with egrep and having two inputs, but I can't seem to fuse them together; it only takes the first input.
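A minimal sketch of one way to match both words on the same line, assuming the month and year appear in that order and the data file is called records.txt (both the script name and file name are placeholders).
Code:
#!/bin/bash
# usage: ./search.sh feb 2009
month=$1
year=$2

# -i ignores case; the pattern requires the month followed (eventually) by the year
grep -i "${month}.*${year}" records.txt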
I'm having issues getting the output from a script to be logged in a file. I need the script to output both stderr and stdout to the same text file.
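A minimal sketch, assuming the script and log file names myscript.sh and output.log are placeholders.
Code:
# send stdout to the file, then point stderr at the same place
./myscript.sh > output.log 2>&1

# bash shorthand for the same redirection
./myscript.sh &> output.log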
I'm using a Xeltek EEPROM programmer but I cannot read the chip data in the buffer file. When I read the chip using the programmer, the data is sent to the buffer; I can just see the address line, hex line and ASCII line, and then I don't know which is the actual data.
I want to write expdp output to a text file using a shell script.
If I write it like below:
it will write whatever is in the log file to the text file.
But sometimes the export fails without even starting (without generating a log file) because of a "job already exists" error. At such times we don't know about that error until we check manually... so I wrote it like below:
But it is still not writing anything to the text file using the above statement...
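A minimal sketch of one common approach: capture expdp's own stdout/stderr rather than relying on the log file, so the "job already exists" error is recorded even when no log file is created. The connect string, directory, dump file and output file names here are placeholders, not taken from the original script.
Code:
#!/bin/bash
# run the export and capture everything expdp prints, including errors
expdp user/password@orcl directory=DATA_PUMP_DIR dumpfile=exp.dmp logfile=exp.log \
    > export_output.txt 2>&1

# the exit status tells us whether the export failed before producing a log
if [ $? -ne 0 ]; then
    echo "expdp failed; see export_output.txt" >&2
fi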
I wonder about awk's capability to manipulate data in consecutive multiple files by reading one batch file. For example I have the files data1.dat, data2.dat, data3.dat and listfile.txt.
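A minimal sketch, assuming listfile.txt contains one data file name per line; the awk program itself is just a placeholder.
Code:
#!/bin/bash
# run the same awk program over every file named in listfile.txt;
# FILENAME lets the program know which file it is currently reading
while read -r f; do
    awk '{ print FILENAME, $1 }' "$f"    # placeholder program: print the first field
done < listfile.txt

# alternative: pass all the listed files to a single awk invocation
awk '{ print FILENAME, $1 }' $(cat listfile.txt)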
I have a script almost working except for one thing. What I'm trying to do is read a file that lists the files that need to be FTP'd, using a bash script. I have everything working except the reading of the file. It works outside of the FTP script I've written, but once I put it in the FTP script it doesn't.
Here's the Script:
#Here's where the problem is that I know of
I've been playing with the exclamation points to see if that could be the problem, but so far no luck.
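A minimal sketch of one common pattern, assuming the list of files is in filelist.txt and the FTP host, user and password are placeholders. Building the put commands before the here-document starts avoids the usual pitfall where a while-read loop placed inside the ftp here-document is sent to the FTP server as text instead of being run by the shell.
Code:
#!/bin/bash
HOST=ftp.example.com
USER=ftpuser
PASS=secret

# build the put commands first, while we are still in the shell
puts=""
while read -r f; do
    puts="$puts
put $f"
done < filelist.txt

# then feed a single here-document to ftp
ftp -inv "$HOST" <<END_FTP
user $USER $PASS
binary
$puts
bye
END_FTP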
I have a text file that contains a single word and I want to write a bash script that will read the word from the text file... The following is my incorrect attempt, as it assigns the name of the text file to the variable instead of the word stored within the text file (assume I have a text file value.txt whose contents are a single word, say wordone):
Code:
#!/bin/sh
for f in value.txt
do
    echo $f
done
So the output of the above script is value.txt, but I want it to be wordone. To summarise: how do I assign the value of the word contained within a text file to a variable?
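A minimal sketch of two equivalent ways, using the value.txt and wordone names from the question.
Code:
#!/bin/sh
# command substitution reads the file's contents into the variable
word=$(cat value.txt)
echo "$word"           # prints: wordone

# read can do the same without spawning cat
read -r word < value.txt
echo "$word"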
I am trying to send data through a USB port to a printer which can receive a text file. To send the data to the printer I am doing echo "Hello World" > /dev/usb/lp0,
but the data doesn't seem to be sent to the printer, because I have an analyzer monitoring the data sent to the printer. Is there anything else I need to do?
I have a problem with arrays in awk. What I want is to take some data from a file (an ssh log) and print it to an HTML table. I have managed to print some stuff (users logged in and how many times they have logged in). What I want in addition is to take all the IPs that each user logged in from and print them in a row next to the username and login count (in the code I typed blabbla where I want the IPs to be shown). How do you think I should approach that, multidimensional arrays maybe?
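A minimal sketch of the usual single-dimensional trick: keep a second associative array keyed by username and append each new IP to it as a string. The field positions ($9 for the user, $11 for the IP), the "Accepted" filter and the log path are assumptions about the ssh log format, not taken from the original script.
Code:
awk '/Accepted/ {
        user = $9; ip = $11                 # placeholder field positions
        count[user]++                        # how many times the user logged in
        if (index(ips[user], ip) == 0)       # record each IP only once per user
            ips[user] = ips[user] " " ip
     }
     END {
        print "<table>"
        for (u in count)
            print "<tr><td>" u "</td><td>" count[u] "</td><td>" ips[u] "</td></tr>"
        print "</table>"
     }' /var/log/auth.log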
I am trying to import a data file from an old DOS application into a MySQL table. The file is a clear-text file with fixed-width columns, without column delimiters.
Example file:
Code:
4444333666666
2222666555555
iiiiwwwcccccc
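A minimal sketch of one way to do the import: use awk's substr() to cut the fixed-width columns (widths 4, 3 and 6, matching the example rows) into a comma-separated file, then load that with MySQL's LOAD DATA. The input file, database, table and column names are placeholders.
Code:
# convert the fixed-width file into CSV (column widths 4, 3, 6 as in the example)
awk '{ print substr($0,1,4) "," substr($0,5,3) "," substr($0,8,6) }' dosdata.txt > dosdata.csv

# load the CSV into a placeholder table
mysql -u user -p mydb -e \
  "LOAD DATA LOCAL INFILE 'dosdata.csv' INTO TABLE old_dos_data FIELDS TERMINATED BY ',' (col_a, col_b, col_c);"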
I downloaded some movies with the 'mkv' extension but they couldn't be played. I tried other players, like mplayer, dragon, xine, and even switched OS to Windows; it didn't work. Not all of those files, but some of them. One of them is named 'the.other.man', 2GB. I opened a terminal and executed "file the.other.man.mkv"; the output is "data". I also ran "strings the.other.man.mkv", with output as follows:
I have written a long piece of code above, with a main which calls openFile( &fout, filename ). filename contains the text file name in the form "data.txt". I want to read the data from the file and output it into fout for later use. The data in that file is a vector-looking integer group. I have the following code:
I had trouble viewing the partition table using fdisk; now I've realised I just couldn't view the whole table from the rescue terminal. Please remove this thread, I can't find how to.
I have a project using Bluetooth to send data byte by byte to external devices, but I'm not familiar with using arrays to read a file from another location before sending the data. If you could, please correct my code. Here's my code,
What are the possible problems when Windows accesses a file from Ubuntu and gets "Read Only", even though it has full permission to read, write and execute the file? Accessing the file from Ubuntu to Ubuntu there is no problem; only Windows has the problem.