General :: How To Convert A Simple Text File Into CSV Using An Awk Script?
Sep 14, 2010
I am collecting USB usage details for all users and converting them into CSV files so that I can export them into a database. The desired output is CSV format suitable for the database, produced with some batch or awk script.
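A minimal awk sketch, assuming the collected usage log is whitespace-separated (the actual field layout of the log is an assumption here):

awk 'BEGIN { OFS="," } { $1=$1; print }' usage.log > usage.csv

The $1=$1 assignment forces awk to rebuild each record with the comma output separator; a print in a BEGIN block can add a header row if the database import expects column names.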
I need to be able to convert HTML email messages saved as text files (.eml or .msg) to PDF documents, one PDF per email, retaining formatting and images.
Are there any Linux tools that will allow me to do this from the command line (so it can be scripted)?
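One possible scripted pipeline, assuming ripmime and wkhtmltopdf are installed, and assuming each message carries a single HTML part (multipart messages with inline images would need extra handling):

for f in *.eml; do
    ripmime -i "$f" -d parts/                  # unpack the MIME parts
    wkhtmltopdf parts/*.html "${f%.eml}.pdf"   # render the HTML part as PDF
    rm -rf parts/
done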
I have a PDF file on my Linux RHEL 4.7 machine. I can open the file, but when I click 'Save As' there is no option to save it in text format. I need to save the PDF file as text. Could anyone tell me how to do this? I am using KDE.
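KDE's viewer does not offer a text export, but the pdftotext utility (shipped with xpdf or poppler-utils, both packaged for RHEL) does it from the command line:

pdftotext document.pdf document.txt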
I'm trying to convert an HTML file into a text file. When I simply run "html2text <filename>", the output displayed is the way we want it, but when I redirect it using "-o" or ">>", the file contains extra characters. I even tried -ascii, but without much luck.
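The extra characters are most likely the backspace overstrikes html2text emits to simulate bold and underlined text; a terminal renders them invisibly, but they show up raw in a redirected file. Piping through col -b strips them:

html2text file.html | col -b > file.txt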
In order to make this conversion I have to use a text editor. This is tedious. Is there an easier way to do it, like some program I can run from the Linux or OSX terminal?
I want to convert many text files (copied from a Windows workstation) to UTF-8 encoding. Yes, iconv is available for this, but I have to give the source file's encoding as a command-line parameter, and in most cases I am not sure what the source encoding is. I also want to use a script to convert many files recursively.
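A sketch that lets file guess each file's encoding before handing it to iconv (the detection is heuristic, and files copied from Windows are often cp1252, so spot-check the results):

find . -type f -name '*.txt' | while read -r f; do
    enc=$(file -b --mime-encoding "$f")                 # guess the source encoding
    iconv -f "$enc" -t UTF-8 "$f" > "$f.new" && mv "$f.new" "$f"
done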
I have only just started with Linux. How can I convert a core dump file into a readable text file that includes all the information in the dump, such as all variables, thread information, the call trace for each task, and so on? I know GDB can view this, but it won't dump all the information to one text file, and sometimes people want to examine the cause of a core dump without a Linux environment at hand.
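GDB can write everything to one text file when driven in batch mode, so the result can be read later on any machine; a sketch, where prog is the executable that produced the core:

gdb -batch -ex "info threads" -ex "thread apply all bt full" ./prog core > core.txt 2>&1

bt full prints the local variables for every frame; add further -ex commands (e.g. info registers) for whatever else should end up in the file.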
I have several hundred files on my Windows machine in the .nc/.ncs formats used by a CNC machine. I need to convert them to .txt, which is as easy as opening each one in Notepad and saving it as .txt, but there are so many that doing it that way would take far too long.
The reason I am writing to LinuxQuestions is that I would feel more comfortable loading a live CD and using some sort of terminal command than downloading one of the many "freeware"-type programs I have found for Windows (all the more so since I once had a rootkit and had to start over from scratch to get rid of it).
I need to know:
1. Is this possible to do from the terminal without very advanced knowledge?
2. Can someone please point me in the right direction: something to read, or an example like the sketch below?
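It is possible and needs no advanced knowledge. Assuming the .nc/.ncs files are already plain text and only the extension needs to change, a rename loop from any live-CD terminal looks like this:

for f in *.nc *.ncs; do
    [ -e "$f" ] || continue      # skip a glob pattern that matched nothing
    mv -- "$f" "${f%.*}.txt"     # swap the extension for .txt
done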
I am working with a simulator tool that requires its input file in .BIN format; basically, I need to convert a plain text file to a BIN file. How can I do that? Is there a command (or commands) that allows this?
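That depends entirely on what byte layout the simulator expects, which isn't stated. If the text file holds hexadecimal digits, one common conversion is xxd in reverse plain mode (purely an assumption about the format):

xxd -r -p input.txt output.bin    # hex text -> raw binary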
I have a 2.2 MB gedit text file. I want to split it into two or three smaller files/volumes so I can upload them separately to web pages. Does anyone know a quick and easy way to do this?
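The split utility does exactly this; with GNU split, -C caps each piece at a given size without cutting a line in half:

split -C 800k notes.txt notes.part.
cat notes.part.* > notes.txt      # reassembling the original later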
I need to convert a very large LaTeX project (made up of many .tex and style files) into .html (or something similarly non-.pdf). Can someone recommend a quality converter program? Preferably, one that is:
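Two commonly recommended converters are tex4ht (the htlatex command, which understands many LaTeX packages) and pandoc. A minimal pandoc run, assuming main.tex is the project's root file and noting that pandoc handles only a subset of LaTeX:

pandoc -s main.tex -o main.html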
I have a number of text files (26 per database x 100+ databases) which need 'correcting' in order to import into PostgreSQL. I think I have identified all the problem characters, and I need to automate the process as much as possible. I have a script that converts the characters, and I run the files one by one (not efficient, but easier to understand).
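One way to apply every substitution to every file in a single pass, assuming the rules are gathered into a sed script (fixes.sed is a hypothetical name here, holding one s/old/new/g line per problem character; adjust the -name pattern to match the export files):

find . -name '*.txt' -print0 | xargs -0 sed -i -f fixes.sed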
I am building a text search engine, and I first need to convert binary documents to text. I want a cross-platform (we develop on both Windows and Linux), command-line tool, so that I can capture the output via Python's subprocess. What are the choices?
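Apache Tika is a common answer here: one Java jar that extracts text from PDF, Office, and many other formats, so the identical command runs on Windows and Linux and is easy to call from subprocess (the jar's exact name depends on the version downloaded):

java -jar tika-app.jar --text report.docx > report.txt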
I'm trying to output a list of running processes via a shell script. At the moment I have this, which outputs the processes to a text file called out.
echo $(ps aux) >>out
The problem, though, is that the processes come out as one big block of text, which makes it hard to read. Does anyone know how to format the output so that it prints one process per line in the text file? I know it's probably simple, but I'm very new to Linux.
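The one-big-block effect comes from the unquoted command substitution: the shell word-splits the output of ps, and echo rejoins the words with single spaces, discarding the newlines. Writing the output directly, or quoting the substitution, keeps one process per line:

ps aux >> out              # simplest: no echo needed
echo "$(ps aux)" >> out    # or keep the substitution, but quote it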
I am trying to convert my batch file into a .sh file, and I thought I had it perfect, but it just will not work, so obviously it is not perfect. This is the code for my batch file.
[Code]....
This works perfectly on my own computer without any problems. I want to host this on my Linux VPS (CentOS 5), so I need it converted into run.sh. This is the code for my run.sh.
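Without seeing the script itself, two frequent reasons a batch-to-shell conversion "just will not work" on Linux are Windows CRLF line endings and a missing execute bit, so those are worth checking first:

dos2unix run.sh      # strip carriage returns left over from Windows editing
chmod +x run.sh      # make the script executable
./run.sh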
Recently I tried to convert a .flv file to an MPEG file using FFmpeg. Although I changed to the directory in which the .flv file resided, FFmpeg said the file did not exist. Yet when I ran the "ls" command, the file was there. Where is my mistake?
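A common cause is a space or shell-special character in the filename, so the unquoted name the shell passes to FFmpeg is not the file's real name. Quoting (or tab-completing) the name usually fixes it; the filename here is made up:

ffmpeg -i "my clip.flv" clip.mpg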
How do you convert OpenOffice (ODT) documents to text files? I have written a report using LibreOffice, and now I want to continue editing the document using LyX (a LaTeX front end), so the ODT file needs to be saved as a .tex file.
I don't see an option for this in the File menu (Export/Save As). Is there some plugin or other tool that can do it?
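Two command-line routes, assuming LibreOffice and/or pandoc are available:

soffice --headless --convert-to txt report.odt    # ODT -> plain text
pandoc report.odt -o report.tex                   # ODT -> LaTeX source

The pandoc route produces .tex that LyX can then open via File > Import > LaTeX.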