General :: Multiple Grep Outputs Appended To Single Row Of CSV File?
Sep 12, 2010
How can I take the values produced by a series of grep commands and append them as a single row of a CSV file? I am working in a Linux environment. The values from the grep output will be numeric.
Output should look like:
1,3,4,5,7,0,5
Each of these values will be obtained from a separate grep command piped into wc -l. Is it possible to update a single row of a CSV file? If so, please help me with the command to use to redirect the output into the CSV file.
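Not knowing the real patterns or file names, here is only a minimal sketch of one way to do it (pattern and file names are placeholders): capture each count in a variable, then append the comma-joined values as one new row.
Code:
c1=$(grep -c "pattern1" app.log)        # grep -c gives the same count as grep ... | wc -l
c2=$(grep "pattern2" app.log | wc -l)
c3=$(grep "pattern3" app.log | wc -l)
echo "$c1,$c2,$c3" >> results.csv       # >> appends the counts as one new CSV row
If an existing row has to be replaced rather than appended, something like sed -i would be needed instead of the redirection.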
I have a huge binary log file. There are, let's say, 4 IDs that I want to find in the log file. I know that those 4 IDs are present in the log file, and I also know the order in which they appear. I want to find the 1st ID in the log, then the 2nd ID, then the third, and so on.
The simple/inefficient solution is: loop through the IDs and grep the log file for each one. The problem with this solution is that for each ID, grep searches from the beginning of the file.
A better/more efficient solution would be: since I know the order in which the IDs appear in the log file, loop through the IDs, grep for the 1st ID, then move on to grep for the 2nd ID, and so on. This way I can find all the IDs in one pass. Is this solution possible?
I have 500,000+ values to find in log files, so I need an efficient solution.
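A hedged sketch of the one-pass idea, assuming the IDs are readable as text in the log (a truly binary file may need strings or grep -a first) and that ids.txt holds them one per line in the order they occur: awk loads the list, then scans the log once, only ever looking for the next expected ID.
Code:
awk 'NR == FNR { ids[n++] = $0; next }                            # first file: load IDs in order
     i < n && index($0, ids[i]) { print FNR ": " ids[i]; i++ }    # log: match only the next expected ID
' ids.txt big.log
Each line of the log is examined once, so the cost stays one pass regardless of how many IDs there are.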
I have a list of words that I want to grep for in many files to see which ones have them and which ones don't. In the text file I have all the words listed line by line, e.g. list.txt:
check
try this
word1
word2
open space
list
..
I want to grep for each line, one by one. That is, I want it to run:
grep "check" *.log
grep "try this" *.log
grep "word1" *.log
.. etc. How can I do this?
I have a requirement like this: cut the characters from each line of a file at the following positions: 21-24, 25-34, 111-120. These fields then need to be placed in a tab-delimited output file. Currently this is how I am achieving it:
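(The poster's current command did not make it into the post.) One possible approach, sketched with placeholder file names: awk's substr can pull the three fixed-width fields (lengths 4, 10 and 10 starting at columns 21, 25 and 111) and print them tab-separated.
Code:
awk 'BEGIN { OFS = "\t" }
     { print substr($0, 21, 4), substr($0, 25, 10), substr($0, 111, 10) }' infile > outfile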
I am using Xfce as the desktop enviroment and Mozilla Firefox as the webrowser. Within the webrowser window, I do File>Save Page As. I save it, and the result is almost always foo.html and directory foo_files. But I think under KDE I could choose the format, one of them being something like "Single page" (only one file; the colecction of .png, etc is embedded into that file). And this is the format I want Xfce (or Firefox) to use when downloading to hdd.
I have files a, b, c and d. They're all relatively large and are served up by a static web server optimized for this purpose. I can get requests that look like this:
/abcd /ad /bacdac ...
Each request is basically a request for a concatenation of the files in the order of the letters. The list of possible requests is finite, but large enough that disk space will run out very quickly and be very expensive if I create all possible files via concatenation. Is there a way to create a pointer file like abcd that is essentially a multi-file symlink, pointing first to a, then to b, then to c, then to d? So if the contents of the files were as follows:
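(The example file contents are cut off in the post.) There is no multi-file symlink as such in Linux filesystems; one hedged alternative, assuming the server can hand such requests to a small CGI script, is to build the response on the fly by streaming the pieces in the requested order, so no pre-concatenated files are stored. The data directory is an assumption.
Code:
#!/bin/sh
# e.g. a request for /abcd arrives with PATH_INFO=/abcd
echo "Content-Type: application/octet-stream"
echo
for letter in $(echo "$PATH_INFO" | tr -d '/' | sed 's/./& /g'); do
    cat "/var/data/$letter"     # assumed location of files a, b, c, d
done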
I've searched high and low, and can't seem to find a solution to this. I'm running 10.10 on a Dell Inspiron with HDMI output to the TV. I want HDMI sound output for VLC, but I also want S/PDIF output (to the stereo) for Musicplayer. I can test and use the HDMI output in the sound preferences (System/Preferences/Sound), but when I try to do the same for the internal card and click 'Test Speakers', the sound program closes itself. When the machine was a Windows machine, it had PowerDVD outputting to the TV and Mediaplayer outputting to the stereo. I'm aiming for a similar setup in Ubuntu.
I've searched and searched and can't find a straight answer about this. I want to send the same signal out of the digital output and one of the analog outputs on the sound card (Intel HDA) on my motherboard. I'm using ALSA and PulseAudio.
I have a question about UNIX sockets. My goal is to connect multiple sockets from a single client to a single server and keep them open... I'm not sure whether that is possible. Do you have any suggestions or an example of code?
I'm on Ubuntu 11.04. I have read around about how to use curl to download a list of URLs from a text file, and everyone says to use Code: curl -K URLlist.txt. This is what the curl man page says as well. However, for even a simple file with one URL, this command outputs a bunch of weird symbols for me instead of downloading the file. For example, I have a text file "test.txt" with one line in the following format:
Code: url = "http://www.example.com/image.jpg"
I use the curl command to download this file:
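(The exact command is cut off in the post.) A likely explanation, offered only as a guess: with nothing but url = ... in the config file, curl writes the downloaded bytes to stdout, and JPEG data dumped to a terminal looks like "weird symbols". Adding remote-name (the config-file spelling of -O) makes curl save each URL to a file named after it:
Code:
# test.txt -- read with: curl -K test.txt
url = "http://www.example.com/image.jpg"
remote-name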
Right now I have some code to catch the inputs, using a variable "z":
Code:
Then:
Code:
I'm almost positive that the problem is in the bolded line above (for one thing, it always leaves off the initial "-e"). So basically I want a string that gives me "-e input" and concatenates it as many times as necessary.
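Since the original code is not included here, this is only a hedged sketch of the general idea in bash: accumulate the inputs in an array, prefixing each with -e, then expand the whole list wherever the options are needed. The input source and the grep call are placeholders.
Code:
opts=()
for z in "$@"; do               # stand-in for however the inputs are actually gathered
    opts+=(-e "$z")             # every element gets its own -e, including the first
done
grep "${opts[@]}" somefile.log  # hypothetical consumer of the "-e input -e input ..." list
An array avoids the word-splitting and "missing first -e" problems that plain string concatenation tends to cause.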
I am having a problem with two Linux boxes I have. They are running CentOS 5.3 and CentOS 5.4. The problem is that when I log in, the file /etc/host under the etc directory gets appended with the username I am logging in as, the IP address I am logging in from and, worse, the password in clear text. This is the format it uses: username@IP (password in clear text) [Tue Jan 12 2010 13:00:26 -0500]. Is it possible for someone to tell me what this is about, and how to stop it?
if ! [ -f ${PATH}/myfile.txt ]; then
    echo $(date +%Y%m%d"_"%H%M%S)": Could not find file ${PATH}/myfile.txt"
    # kill any running "tail -f" on x.log owned by user pin
    ps -fxu pin | grep "/usr/local/coreutils/bin/tail -f ${LOG_PATH}/x.log$" | awk '{system("kill "$2)}'
    # keep only the line preceding each line that contains LONG
    cat ${LOG_PATH}/x.log | sed -n -e '/LONG/{x;1!p;g;}' -e h > ${PATH}/myfile.txt
[code]....
I have a continuously growing log file (x.log) in which I have to look for certain lines that contain "Long". The line above each line containing the word "Long" contains a time stamp. I want to extract each of these time-stamp lines into myfile.txt and check the difference between time stamps. Whenever there is a difference I need to run another script (${CAL_PATH}/${APP_NAME}), then sleep 1, then continue searching. Lines with "Long" do not appear continuously, but in bursts. The script runs fine until the first pause is encountered. Starting with the first pause, tail -f doesn't write to myfile.txt anymore. Can someone help me understand why "tail -f" stops writing into myfile.txt? Or does someone know an alternative to "tail -f" for achieving the original goal of the script?
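One common cause, mentioned only as a guess since the full pipeline is cut off: sed (and grep) buffer their output when it is not a terminal, so nothing reaches myfile.txt until the buffer fills; sed -u or grep --line-buffered disable that. As an alternative sketch, the filtering can be done in the shell so each time-stamp line is written as soon as the matching line arrives (variable names reused from the script above):
Code:
tail -F "${LOG_PATH}/x.log" | while IFS= read -r line; do
    case "$line" in
        *LONG*) printf '%s\n' "$prev" >> "${PATH}/myfile.txt" ;;   # $prev holds the time-stamp line above
    esac
    prev=$line
done
As a side note, PATH is the shell's command search path, so reusing it as a script variable can stop commands from being found; a different variable name would be safer.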
Is there any limitation on the number of transactions through a single port? If so, then if we assign multiple ports to that particular service, the performance should increase (or so I suppose).
So: is there any way to assign multiple ports to a single service? For example, on a web server the main service running is httpd (or something like that); if we assign multiple ports to that service, does the performance increase?
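On the configuration side, a service can certainly listen on several ports; the Apache example below shows the idea (the extra port numbers are arbitrary). Whether it helps is another matter: a single port is not limited to one transaction at a time, since each accepted connection gets its own socket, so extra ports rarely improve throughput by themselves.
Code:
# httpd.conf -- one httpd service accepting connections on three ports
Listen 80
Listen 8080
Listen 8081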
If I have two domains, [URL] and [URL], can I point them to the same IP address in DNS? I have already added NameVirtualHost in my Apache configuration. If this is possible, are there any risks or disadvantages?
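Pointing both domains at the same IP is exactly what name-based virtual hosting is for; Apache then picks the site from the Host: header. A sketch with hypothetical domain names and document roots (the real domains are not given in the post):
Code:
NameVirtualHost *:80

<VirtualHost *:80>
    ServerName www.example1.com
    DocumentRoot /var/www/example1
</VirtualHost>

<VirtualHost *:80>
    ServerName www.example2.com
    DocumentRoot /var/www/example2
</VirtualHost>
The main caveat is that requests without a matching Host: header land on the first virtual host listed.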
I am new to client-server programming and I have written code for a single server communicating with a single client on one port (3490), but I have no idea how my single server can communicate with different clients on different ports. How does this happen?
Can anyone help me understand the logic, or point me to a good link or a piece of code in C? I searched the net, but most of the help was for Java programming.
I need to transpose a file with over 1000 rows of 5 columns of numbers into a file with a single column of numbers. The numbers are separated by a single space and range from one digit to 5 digits each. I tried using awk, but can only get it to grab one column of numbers.
Input:
1 2 3 4 50
600 7 8 9000 10
11 12000 13 14 15
Desired output:
1
2
3
4
[code]....
Tried using: awk '{split($0,a,""); print $NF}' <filename> and got:
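(The output of that attempt is cut off in the post.) split($0, a, "") with an empty separator does not change the fields, and print $NF only ever prints the last field of each line, which would explain getting just one column. A simpler sketch that prints every field on its own line (file names are placeholders):
Code:
awk '{ for (i = 1; i <= NF; i++) print $i }' input.txt > column.txt
For single-space-separated data, tr ' ' '\n' < input.txt would do much the same.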
I have a machine with Windows on it, and I would like to be able to reboot into Linux. I am certain this is possible. How can I achieve this?
I have an unencrypted DVD that is one big title, i.e. you cannot skip next/previous. What I want to do is reauthor the DVD with titles (or chapter marks) at certain points, so that I can skip next/previous when watching it.
I have 30 Linux PCs running. I need to check the performance of all of them (memory, RAM and process usage) with a single command or in GUI mode. In Solaris we have a perf script to check performance in GUI mode; is there something similar for Linux?
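One quick command-line sketch, assuming password-less SSH keys are set up and hosts.txt lists the 30 machines (both of these are assumptions): loop over the hosts and pull memory, load and the top memory consumers from each.
Code:
while read -r host; do
    echo "===== $host ====="
    ssh "$host" 'free -m; uptime; ps aux --sort=-%mem | head -5'
done < hosts.txt
For a GUI, tools in the Nagios/Munin/Cacti family give per-host graphs, though they need an agent or SNMP configured on each machine.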
Does anyone know of any software that will convert a few CSVs into a single XML file? I have a Windows program that has been doing it up until now, but the info is pulled from one of my Linux servers and then sent back to it afterwards. I'm looking to script it all instead and keep everything on the Linux box.
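Not knowing the layout of the CSVs, this is only a rough sketch of the idea in awk (tag names and file names are made up): wrap each comma-separated field of every row in an element. It is naive, i.e. it assumes no quoted commas inside fields.
Code:
awk -F',' 'BEGIN { print "<rows>" }
     { printf "  <row>\n"
       for (i = 1; i <= NF; i++) printf "    <field%d>%s</field%d>\n", i, $i, i
       printf "  </row>\n" }
     END { print "</rows>" }' data.csv > data.xml
For anything with quoting, escaping or a fixed target schema, a small Python or xmlstarlet script would be more robust.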
I want to do something like svn add dir1 dir2; svn ci dir1 dir2 but have it be only 1 revision. Is there a way to do this? Is this the correct way to add new folders (with contents) to the repository? We are restructuring the trunk, so I cleared it out and plan on putting these directories with their contents in it.
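Adding is only a local scheduling operation in Subversion; a revision is created only by the commit, so committing both directories with one svn commit gives a single revision (the log message below is just an example):
Code:
svn add dir1 dir2
svn commit -m "add restructured directories" dir1 dir2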