General :: Xfce: Save Page As: Single File Vs Multiple Files?
Nov 28, 2010
I am using Xfce as the desktop environment and Mozilla Firefox as the web browser. Within the browser window, I do File > Save Page As. I save the page, and the result is almost always foo.html plus a directory foo_files. But I think that under KDE I could choose the format, one of the options being something like "Single page" (only one file; the collection of .png files, etc. is embedded into that file). That is the format I want Xfce (or Firefox) to use when saving to the hard disk.
How can I take the values output by several grep commands and append them as a single row of a CSV file? I am working in a Linux environment. The values from the grep output will be numeric.
The output should look like:
1,3,4,5,7,0,5
Each of these values will be obtained from a separate grep command piped through wc -l. Is it possible to update a single row of a CSV file? If so, please help me with the command to redirect the output into the CSV file.
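A minimal sketch of that, assuming a hypothetical log file app.log and hypothetical search patterns; each count comes from its own grep piped through wc -l, and the row is appended to the CSV in one echo:

#!/bin/bash
v1=$(grep 'pattern1' app.log | wc -l)
v2=$(grep 'pattern2' app.log | wc -l)
v3=$(grep 'pattern3' app.log | wc -l)
# append all counts as one comma-separated row
echo "$v1,$v2,$v3" >> results.csv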
Does anyone know of any software that will convert a few CSVs into a single XML file? I have a Windows program that has been doing it up until now, but the info is pulled from one of my Linux servers and then sent back to it afterwards. I am looking to script it all instead and keep everything on the Linux box.
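One possible approach, sketched with awk under the assumption of a simple comma-separated input.csv with a header row and no quoted commas (the element and file names here are purely illustrative):

awk -F',' 'BEGIN { print "<records>" }
           NR > 1 { printf "  <record><name>%s</name><value>%s</value></record>\n", $1, $2 }
           END { print "</records>" }' input.csv > output.xml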
I want to combine 4 PS (or PDF) pages into a single page, and for this I have tried the a2ps command:
a2ps -4 1.ps 2.ps 3.ps 4.ps -o outfile.ps
It does divide the page into four sections, but it only places one of my four inputs; the other three sections are blank. Am I doing something wrong, or is there another command that can do this?
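One possible workaround, assuming the ghostscript and psutils packages are installed: merge the four files into one PostScript stream first, then lay the pages out 4-up with psnup rather than a2ps:

gs -q -dNOPAUSE -dBATCH -sDEVICE=ps2write -sOutputFile=merged.ps 1.ps 2.ps 3.ps 4.ps
psnup -4 merged.ps outfile.ps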
I've installed wmctrl to have a terminal on the desktop, and I've configured it with a script I found online. I added the script to Xfce's startup applications interface (I don't know the English name for it because my system is in another language). When I shut down I save the session, and on the next login the script seems to run twice. If I disable the save-session option when I log off, it instead runs in its previous state (directory/position), but I want it to run fresh from the startup script. So when I save the session, where is that state saved? And how can I stop the script from running twice when it is launched at startup?
I have 10 vi files. These files contain some system-related information. I need to combine the contents of all these files into a single file; the final file should contain the contents of all 10 files, and the output should be in a tabular format.
Is there any command in vi that I can use to create a table?
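vi itself has no table command, but outside the editor a one-liner like this could do it, assuming the ten files are named info1.txt through info10.txt (names hypothetical) and share a consistent field separator; column -t aligns the fields into a table:

cat info{1..10}.txt | column -t > combined.txt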
I have been using Linux for less than 6 months, and now I have come across a problem with PDF files. I want to join different pages from different PDF files into a single PDF file. I have come across software that does this, but it works using page numbers from the PDF files, whereas I need to do it based on keywords on the different pages. For example, there are 3 PDF files;
now I have to create a PDF file language.pdf, combining the topic 'languages' from the three files america.pdf, india.pdf, and china.pdf. How can I do it? Is there any open source software for doing this?
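A rough sketch of one approach, assuming pdfgrep and pdftk are available; the keyword and intermediate file names are illustrative. For each source PDF it finds the first page containing the keyword, extracts that page, and finally concatenates the extracted pages:

keyword="languages"
parts=""
for f in america.pdf india.pdf china.pdf; do
    page=$(pdfgrep -in "$keyword" "$f" | head -n 1 | cut -d: -f1)   # first matching page number
    pdftk "$f" cat "$page" output "part_$f"                         # extract that page
    parts="$parts part_$f"
done
pdftk $parts cat output language.pdf                                # join the extracted pages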
I am quite new to script programming, and I am facing an uphill task renaming files in one folder. I have gone through similar posts, but most of them deal with renaming files by changing the file extensions. Problem: I have a folder which contains files like bild01.jpg and bild02.jpg. There are more files in the folder, which should remain untouched. I want to rename these 'bild' files as follows:
bild01.jpg -----> 1c.jpg
bild02.jpg -----> 2c.jpg
bild30.jpg -----> 30c.jpg

I would like to start the script as:

#!/bin/bash
npics=`ls -1 bild* | wc -l`
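A minimal sketch completing that idea; no file count is actually needed, since bash parameter expansion can strip the prefix, extension, and leading zero directly:

#!/bin/bash
for f in bild*.jpg; do
    num=${f#bild}         # drop the "bild" prefix
    num=${num%.jpg}       # drop the ".jpg" extension
    num=$((10#$num))      # force base 10 to strip any leading zero
    mv -- "$f" "${num}c.jpg"
done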
I have files a, b, c and d. They're all relatively large and are served up by a static web server optimized for this purpose. I can get requests that look like this:
/abcd
/ad
/bacdac
...
Each request is basically a request for a concatenation of the files in the order of the letters. The list of possible requests is finite, but large enough that disk space will run out very quickly and be very expensive if I create all possible files via concatenation. Is there a way to create a pointer file like abcd that is essentially a multi-file symlink, pointing first to a, then to b, then to c, then to d? So if the contents of the files were as follows:
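There is no standard multi-target symlink, but one workaround would be to generate the concatenation on demand instead of storing it; a purely illustrative sketch, assuming the request path has already been parsed into a string of letters:

req="abcd"                              # e.g. taken from the request path
files=$(echo "$req" | sed 's/./& /g')   # "abcd" -> "a b c d"
cat $files                              # stream the concatenation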
I have a huge binary log file. There are, let's say, 4 IDs that I want to find in the log file. I know that those 4 IDs will be present in the log file, and I also know in what order they will appear. I want to find the 1st ID in the log, then the 2nd ID, then the third, and so on.
The simple/inefficient solution is to loop through the IDs and grep the log file for each one. The problem with this solution is that grep will search from the beginning of the file for every ID.
A better/more efficient solution: since I know the order in which the IDs appear in the log file, loop through the IDs, grep for the 1st ID, then move on to grep for the 2nd ID, and so on; this way I can find all the IDs in one pass. Is this solution possible?
I have 500,000+ values to find in log files, so I have to find an efficient solution.
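A sketch of that one-pass idea in awk, assuming the log is line-oriented text and the IDs sit one per line, in order, in a file ids.txt (both names hypothetical): the IDs are loaded first, then the log is scanned once, only ever looking for the next expected ID.

awk 'NR == FNR { ids[n++] = $0; next }                           # first file: load the ordered ids
     i < n && index($0, ids[i]) { print FNR ": " ids[i]; i++ }   # single pass over the log
    ' ids.txt logfile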
I know Ghostscript can convert PDFs to JPGs and, in the case of a multi-page PDF, can rip each page to an individual JPG. But is it possible to have it rip them into one JPG, so that the pages are pasted below each other, e.g. the top half of the JPG is page 1 and the bottom half is page 2? Or do I have to use another program (can ImageMagick do this?) to combine the JPG pages into one image?
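Ghostscript alone writes one image per page, but ImageMagick (which uses Ghostscript as its PDF delegate) can do the stacking in one step; a minimal sketch, with the density value and file names as placeholders (-append stacks the pages vertically):

convert -density 150 input.pdf -append output.jpg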
I have a file with 5 columns. Column 4 contains numbers. Is it possible to split the file into multiple files using a condition on the contents of column 4, i.e. if column 4 contains a value between 0 and 10, print those lines to a new file called less_than_10.txt?
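A minimal awk sketch, assuming whitespace-separated columns; the name of the second output file is a placeholder:

awk '$4 >= 0 && $4 <= 10 { print > "less_than_10.txt"; next }
     { print > "other.txt" }' input.txt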
I have a question about UNIX sockets. My goal is to connect multiple sockets from a single client to a single server and keep them all open. I'm not sure whether that is possible. Do you have any suggestions, or an example of code?
I have a file which contains the data I retrieved through prstat, and an array that contains all the unique process IDs from that file. I want to compare every line in the file with every element of the array, so that I can create a separate file for each value in the array.
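A rough sketch, assuming the process IDs sit one per line in pids.txt and the PID is the first field of each prstat line (both assumptions may need adjusting); every matching line goes to its own out_<pid>.txt file:

awk 'NR == FNR { pids[$1]; next }                 # first file: collect the ids
     $1 in pids { print > ("out_" $1 ".txt") }    # route each line to its pid file
    ' pids.txt prstat.out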
I'm working on some scheduled-task script files to keep nightly backups of some of our database information in place, and it's a bit annoying when they blow up. I know how to redirect stdout and stderr to a flat file I can view when I come in, and I know that 2>&1 maps them both to the same file (whatever was named in 1). However, I'm running into some cron-time situations where it's easier to have the two streams together, and others where it's easier to have them separated. I can't really tell in advance which it will be; is there some way I could create both kinds of output file for my scripts, so that I've got a stderr-only file and an interleaved stdout/stderr file?
Note: I've looked at the 'tee' command, but I don't think it will work for what I'm after. 'tee' appears to only work with stdout; I'm trying to work with stderr.
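tee can in fact be pointed at stderr through bash process substitution; a sketch with the script and log names as placeholders (the two streams may interleave slightly out of order, since they are buffered separately):

./backup.sh > combined.log 2> >(tee errors.log >> combined.log)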
Being new to this area, I have been assigned a task which I am unable to do. Can anyone please help me?
I have a requirement where I have an input file XYZ_111_999_YYYYMMDD_1.TXT, with a header, a series of numbers, and a footer.
I want to create multiple output files, each with a separate code which is stored in a text file, and create XYZ_222_999_YYYYMMDD_1.TXT, with the date added in the contents next to the series of numbers, like this:
Is there any limitation on the number of transactions through a single port? If so, then if we assign multiple ports to that particular service, is the performance increased (as I suppose)?
So: is there any way to assign multiple ports to a single service? For example, for a web server the main service running on the server is httpd or something like that; if we assign multiple ports to that service, does the performance increase?
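A service can listen on several ports, but that alone is unlikely to improve performance: a single port already carries many simultaneous connections, and the usual bottlenecks are CPU, disk, and bandwidth. For Apache specifically, a minimal httpd.conf sketch with illustrative port numbers:

Listen 80
Listen 8080
Listen 8081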
A function named abc is called in many files. I want to copy all the lines containing the function call to an output file. A simple grep on the function name doesn't help me, because the function call spans multiple lines, as follows:
abc(parameter1,
    parameter2,
    parameter3);
So I want to copy all three lines (up to the semicolon) to the output file. The problem is that there are more than 200 calls to the same function, so I cannot do it manually.
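A minimal awk sketch of that: start printing at a line containing abc( and stop after the first line containing a semicolon (this assumes no stray semicolon appears inside the call's span; the *.c glob is illustrative):

awk '/abc\(/ { found = 1 }
     found   { print; if (/;/) found = 0 }' *.c > calls.txt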
If I have two domains, [URL] and [URL], can I point them to the same IP address in DNS? I have already added NameVirtualHost in my Apache configuration. If this is possible, are there any risks or disadvantages?
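Pointing both domains at one IP address is exactly what name-based virtual hosting is for, and the main cost is simply that both sites share the server's resources. A minimal Apache sketch, with hypothetical domain names and paths:

NameVirtualHost *:80
<VirtualHost *:80>
    ServerName example-one.com
    DocumentRoot /var/www/site1
</VirtualHost>
<VirtualHost *:80>
    ServerName example-two.com
    DocumentRoot /var/www/site2
</VirtualHost>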
I am new to client-server programming, and I have written code for a single server communicating with a single client on one port (3490). But I have no idea how my single server can communicate with different clients on different ports. How does this happen?
Any idea to help me understand the logic, or any good link or piece of code in C? I searched the net, but all the help was mostly for Java programming.
I need to transpose a file with over 1000 rows of 5 columns of numbers into a file with a single column of numbers. The numbers are separated by a single space and range from one digit to 5 digits each. I tried using awk, but I can only get it to grab one column of numbers.
Input:
1 2 3 4 50
600 7 8 9000 10
11 12000 13 14 15
Desired output:
1
2
3
4
[code]....
Tried using: awk '{split($0,a,""); print $NF}' <filename> and got:
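A minimal awk sketch that walks every field instead of just the last one, printing each number on its own line:

awk '{ for (i = 1; i <= NF; i++) print $i }' input.txt > single_column.txt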
I have an audiobook in 64 small MP3 files, and I need them combined into a single file of any format, in the proper order (or the order in which I add them). What program can I use?
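One command-line option, assuming ffmpeg is installed: list the files in the desired order (a plain shell glob sorts alphabetically, which often matches track numbering), then concatenate without re-encoding; the file names are placeholders:

for f in *.mp3; do echo "file '$f'" >> files.txt; done
ffmpeg -f concat -safe 0 -i files.txt -c copy audiobook.mp3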
I have a machine with Windows on it, and I would like to be able to reboot into Linux. I am certain this is possible. How can I achieve this?