The VBO file that I want to burn to a DVD is around 7.9 GB. I want to know if it's possible to split it so that I can burn the pieces onto two different DVDs.
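One approach (a rough sketch, not DVD-video authoring): split the file into two pieces, burn each piece to a disc as a plain data file, and rejoin them later with cat. The name movie.vob and the 4400m piece size are placeholders sized for a single-layer DVD:

split -b 4400m movie.vob movie.vob.part-
cat movie.vob.part-* > movie.vob     # to rejoin

Note that the individual pieces are just data chunks, so neither disc will be playable on its own.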
I have Ubuntu 9.04 and just installed Sound Converter. I am trying to convert a bunch of .ogg files to MP3 to play on my iPod, and it's not working so well. In the Sound Converter options I have it set to convert to high-quality MP3. I choose the folder that the files are in, and after a moment (slow laptop) Sound Converter populates; I hit 'Convert' and it shows that the conversion completes in two seconds. All it did was create the new folder structure of artist/album, but there is nothing in there. I'm not sure what I am missing. I used Sound Converter before and it worked fine.
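One common cause on 9.04, if I recall correctly, is a missing MP3 encoder: Sound Converter relies on the GStreamer LAME plugin, and without it the conversion can finish instantly with empty output. As a command-line fallback, here is a rough ffmpeg sketch; the folder name and the 192k bitrate are placeholders:

cd ~/Music/SomeAlbum
for f in *.ogg; do ffmpeg -i "$f" -acodec libmp3lame -ab 192k "${f%.ogg}.mp3"; done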
I'm trying to use convert; I have installed ImageMagick. I use this line: convert *.jpg test.pdf, but I'm only able to convert a single JPG file to PDF, not multiple files at once. When there's more than one file, I get the following error: Segmentation fault
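A workaround sketch, assuming the crash is ImageMagick running out of resources on many large JPEGs: either cap its memory use, or convert the files one at a time and merge the results with pdfunite from poppler-utils (run the pdfunite line before a test.pdf exists in the folder):

convert -limit memory 256MiB -limit map 512MiB *.jpg test.pdf

for f in *.jpg; do convert "$f" "${f%.jpg}.pdf"; done
pdfunite *.pdf test.pdf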
I have a lot of .flac files downloaded from several sites. Most of them come with a .cue file, the .jpg with the cover, etc. It seems the uploader's intention is that one rebuilds the original CDDA. However, if I had a stand-alone CD/DVD player with FLAC support, I would hardly see the point of converting the FLAC to CDDA. Furthermore, I could even play the FLACs with a software player, although in that case the audio quality would not be as good due to the noise picked up by the signal from the PC's digital circuits.
I'm trying to upload my tar'ed site to a server, but I have an upload limit. My FTP program extracts the tar after uploading it. How do I split this one tar into two, so I can upload the first tar and let it extract, then upload the second one and let it extract too?
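If each upload has to be a self-contained tar that extracts on its own, a binary split of one big tar won't work, because the second half isn't a valid archive by itself. A rough sketch instead is to tar different parts of the site tree separately; the directory names here are just placeholders:

tar czf part1.tar.gz images css js
tar czf part2.tar.gz uploads templates index.php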
I have a file over 1 GB in size. I want to split it into either 100 MB rar files or 100 MB zip files.
Anybody know what command I need for this?
Please keep it simple, I'm very new to Linux. So I'm hoping somebody can give me a command along the lines of: "blabla <file name here> <file size here> <folder to rar/zip here>".
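A few sketches, with bigfile and the sizes as placeholders you would swap for your own names:

rar a -v100m archive.rar bigfile       # rar volumes
zip -s 100m archive.zip bigfile        # split zip archive
split -b 100M bigfile bigfile.part_    # plain pieces; rejoin with: cat bigfile.part_* > bigfile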
I am using rsync to create rotating snapshot-style backups of my web files and sending them via SSH to a remote location in order to burn them for offsite storage. This is all working perfectly. The remote machine is a Windows Server 2003 box which has data that I combine with my web files before burning. I have Cygwin installed on the remote server in order to archive and compress the entire backup using tar. (This is not a post about Cygwin; I just thought I would mention it in case anyone was wondering how I was running Linux commands after transferring everything to the Windows box.) After compression, the backup is over 12 GB. The next step in my process is to split this tar.gz file into smaller chunks in order to burn them to DVDs. I use dual-layer DVDs, which hold 8.5 GB.
I also use Cygwin to split the tar.gz into multiple 2 GB files using the split command. When I burn them, I only put 3 files on each disc, totaling 6 GB, to leave some padding in case that was a problem. The burn completes and reports success, but it errors out during verification. I have tried this multiple times, and it seems to fail verification at the same point every time, which leads me to believe that it has something to do with the data. I have also done tests such as creating smaller backups with completely different data and burning them to a CD-R, which worked fine, so I'm convinced this process can work; I just can't get it to work in the right situation. I have also tried burning one of the 7 split files to a dual-layer DVD, which also worked fine. I'm wondering if there is a chunk of data in one of the other split files that is causing this problem?
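One way to narrow down whether it's the data or the burn (a diagnostic sketch; the file names are placeholders): checksum the split pieces on the server before burning, then verify the copies on the DVD against those checksums.

md5sum backup.tar.gz.part-* > checksums.md5
# after burning, from the mounted DVD's directory:
md5sum -c /path/to/checksums.md5

If a piece fails the checksum on disc but passes on the hard drive, the problem is the media or drive rather than the data itself.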
I'm looking for a free alternative to split files into .partXX files. I know this can be accomplished with rar, but it's shareware, and I was wondering if there's a free alternative that accomplishes the same job.
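GNU split (part of coreutils, so already installed on most Linux systems) is one free option. A sketch, with bigfile and 100M as placeholders; -d gives numeric suffixes like bigfile.part00, bigfile.part01:

split -d -b 100M bigfile bigfile.part
cat bigfile.part* > bigfile     # to reassemble

p7zip is another free option: 7z a -v100m archive.7z bigfile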
Rather simple question, but is there a way to make an archive (simple tarball, no compression needed) out of a very large file and split it into parts? Basically I need a ~1GB file in 25MB pieces.
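A sketch that tars and splits in one pass, then extracts straight from the pieces; hugefile and the 25m size are placeholders:

tar cf - hugefile | split -b 25m - hugefile.tar.part-
cat hugefile.tar.part-* | tar xf -      # to reassemble and extract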
I'm trying to reconstruct/extract a file that was too large to fit onto a floppy. I used 7-Zip to create and split the file into multiple parts in tar.bz2 format; this was done in Windows. Then I moved all the parts of the file to a tiny Linux install on a really old laptop with no CD drive, no USB, and no network, so I have to rely on the floppy drive. I know that reconstructing while extracting is possible from the command line, but it's not working. I tried tar -xMf file.tar.001 but got nothing.
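If 7-Zip produced a plain binary split (which I believe its .001/.002 volumes are), the parts aren't tar multi-volume archives, so tar -M won't recognise them. A sketch, assuming the parts really are sequential slices of one .tar.bz2:

cat file.tar.0* | tar xjf -

Piping into tar avoids having to store the rejoined archive first, which may matter on an old laptop with little disk space.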
I have a file with 5 columns. Column 4 contains numbers. Is it possible to split the file into multiple files using a condition on the contents of column 4, i.e. if column 4 contains a value between 0 and 10, then print those lines to a new file called less_than_10.txt?
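awk can do this in one pass; the input name and the 0-10 boundary are placeholders:

awk '$4 >= 0 && $4 <= 10 { print > "less_than_10.txt" }' input.txt

You can add more branches for other ranges, e.g. '{ if ($4 <= 10) print > "less_than_10.txt"; else print > "over_10.txt" }'.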
I am downloading a set of files that were split by a program called ffsj
http://www.jaist.ac.jp/~hoangle/filesj/
The Fastest File Splitter and Joiner.
I have been googling, but I am not finding anything that tells me how I might join these files using my CentOS Linux box. How can I join these files using CentOS?
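If the parts were made with FFSJ's standard (non-encrypted) split, they should be plain sequential chunks, so cat can join them. The .__a/.__b suffixes below are only my guess at FFSJ's naming, so adjust to whatever your parts are actually called, keeping them in order:

cat movie.avi.__a movie.avi.__b movie.avi.__c > movie.avi

If the split was made with FFSJ's encryption option, cat won't work and you would need the original tool (e.g. under Wine).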
I need to split up a large file on Windows so I can upload it in parts to a Linux machine. I'm looking to do the opposite of joining (splitting rather than combining), hopefully with some native utilities to keep it simple.
I understand the linux side of the equation to be cat filea fileb > file
What is the simplest way to split files on a Windows machine so that they can then be joined together via cat on a Linux machine?
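Not strictly native, but one low-friction sketch is to use a Windows port of GNU coreutils (Git for Windows' bash, for example, ships split), so the two sides mirror each other; the names and size are placeholders:

split -b 100M bigfile bigfile.part_       # on Windows, in Git Bash or similar
cat bigfile.part_* > bigfile              # on the Linux machine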
I have noticed that when I right-click on many file types, for instance ISO files, I don't have the option to add them to a compressed archive at all. I have one ISO file now that I want to compress and split into three parts so I can upload it, but as I said, right-clicking on the file doesn't help because the option isn't there. What can I do?
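From a terminal you can compress and split in one step regardless of what the right-click menu offers. A sketch with placeholder names; pick a volume size that gives you roughly three parts:

zip -s 700m image.zip image.iso
# or, with p7zip installed:
7z a -v700m image.7z image.iso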
I have a log file on Ubuntu 10.04 that has 500 lines of log data in it. What command could I use in a terminal to split the single 500-line file into ten files of 50 lines each?
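split can do this by line count; logfile.log and the output prefix are placeholders:

split -l 50 logfile.log logpart_

That should produce ten files named logpart_aa through logpart_aj, each with 50 lines.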
I am backing up parts of my computer with dd, and I was wondering if there is a quick way to split the files created into 4.4 GB files that will fit onto a DVD. Does anyone have any idea how to do this?
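A sketch using split, with the image name and source device as placeholders and 4400m chosen to leave a little headroom on a single-layer DVD:

split -b 4400m backup.img backup.img.part-
cat backup.img.part-* > backup.img        # to restore the original image

You can also pipe dd straight into split so the full-size image never has to exist on disk:

dd if=/dev/sda | split -b 4400m - backup.img.part-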
I used split -b 32m "file.bz2" "file.bz2.part-" to split a file and it created more than 50 parts. From googling, the way I found to reassemble the parts is to cat file.bz2.part-aa file.bz2.part-ab > file.bz2, while enumerating all the 50+ parts. Is there an easier way to reassemble the parts wherein I no longer need to list all those parts explicitly?
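Yes: let the shell glob enumerate them for you. split's aa, ab, ac, ... suffixes sort in the right order, so

cat file.bz2.part-* > file.bz2

expands to all the parts in sequence.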
When streaming videos I sometimes want to save them after watching. Generally they're saved as Flash video or MPEG-4 video, which is all fine. However, periodically they're split up and saved in smaller chunks with names like data_1, data_2, data_3, etc. These range from 14.0 to 44.0 MB. The file in question (currently I'm trying to save it from the cache) was from divxden, or possibly divxstage.eu; either way I think it used the Totem plugin. So, my question is: does anyone know if these files can be stuck back together, or if this behaviour can be changed so streamed files are kept intact?
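If the chunks really are sequential slices of one stream (which I can't promise for every site), simply concatenating them in order is worth a try; the output name is a placeholder:

cat data_1 data_2 data_3 > video.mp4

Be careful with globs here, since data_10 sorts before data_2; list the parts explicitly or pad the names first.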
Today's encoders are getting smarter. They can compress a Blu-ray to similar quality in 700 MB. It seems the header of a video file contains info about the frame rate, audio/video encoders, etc., which can't be guessed. In MPEG audio, every part of the file is independently playable. If a movie is binary-split into 6 parts and I don't have the first part, then it is unplayable.
Code:
ls
-rwxrwxrwx 1 root root 280M 2010-12-07 20:23 irn2-cd1.mkv
-rwxrwxrwx 1 root root  50M 2011-05-26 13:09 last-50M-cd2
-rwxrwxrwx 1 root root  50M 2011-05-26 13:44 first-50M-cd1
file *
first-50M-cd1: Matroska data
last-50M-cd2:  data
irn2-cd1.mkv:  Matroska data
I have installed this program OK, but I am new to command lines in the terminal.
I want to convert some wav files to wma files. I have the wav files currently in a folder called Test to make it easy. So I have entered the following command line:
ajpearson@ajpearson-laptop:~/Desktop/pacpl-4.0.5$ pacpl --to wma home/ajpearson/Desktop/Test and the error message I get is:
error: the following is not a file or directory: home/ajpearson/Desktop/Test
It does not matter what directory I use, I get the same error. I am sure the answer is obvious - but not to me.
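The path in the command is relative (it has no leading slash), so pacpl is looking for home/ajpearson/... underneath the pacpl-4.0.5 directory. Giving it an absolute path, or using ~, should work:

pacpl --to wma /home/ajpearson/Desktop/Test
# or
pacpl --to wma ~/Desktop/Test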
How do you convert OpenOffice (ODT) documents to text files? I have made a report using LibreOffice. Now I wish to continue editing the document using LyX (a LaTeX front end), so the ODT file needs to be saved as a .tex file.
I don't see an option to do this in the File menu (Export / Save As). So is there another plugin that does this?
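One route (a sketch, assuming pandoc is installed; report.odt is a placeholder name) is to convert the ODT to LaTeX from the command line and then import the .tex into LyX:

pandoc report.odt -o report.tex

Complex formatting may not survive the conversion perfectly; the Writer2LaTeX extension for LibreOffice is another option to look at.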
I have a large directory of .bsp files that I would like to convert to .bz2 archives. I've been searching for some time, and all I can find is the obvious: compress multiple files into one large archive. Does anyone know how to compress each file individually, while retaining the original file name (testmap.bsp would be archived as testmap.bsp.bz2)?
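bzip2 already works file-by-file, so no archiving step is needed; -k keeps the originals alongside the .bz2 copies:

bzip2 -k *.bsp
# or, for a whole directory tree:
find . -name '*.bsp' -exec bzip2 -k {} \;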
I have screen.log and putty.log files which contain keystroke and control characters like ^M, Esc, @, etc., so they are unreadable. How can I convert them to a human-readable format? The OS is RHEL 5.2.
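A rough sketch using GNU sed to strip carriage returns and the common ANSI colour/cursor escape sequences (other control bytes may remain, so treat this as a starting point); the file names are placeholders:

sed -e 's/\x1b\[[0-9;]*[A-Za-z]//g' -e 's/\r$//' screen.log > screen-clean.log

Alternatively, less -R screen.log will render the colour codes instead of showing them raw.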