General :: Reassemble Split Files Without Enumerating All Of The Parts Explicitly?
Apr 13, 2010
I used split -b 32m "file.bz2" "file.bz2.part-" to split a file, and it created more than 50 parts. From googling, the way I found to reassemble the parts is cat file.bz2.part-aa file.bz2.part-ab > file.bz2, enumerating all 50+ parts. Is there an easier way to reassemble the parts so that I no longer need to list all of them explicitly?
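Because split assigns suffixes in lexicographic order (aa, ab, ac, ...), the shell's glob expansion already lists them in the right order, so a single cat with a wildcard is enough; a minimal sketch:

Code:
cat file.bz2.part-* > file.bz2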
Encoders are getting smarter these days; they can compress a Blu-ray rip to similar quality in 700 MB. It seems the header of a video file contains information about the frame rate, the audio/video encoders, etc., which can't be guessed. In MPEG audio, every part of the file is independently playable, but if a movie is binary-split into 6 parts and I don't have the first part, it is unplayable.
Code:
ls -l
-rwxrwxrwx 1 root root 280M 2010-12-07 20:23 irn2-cd1.mkv
-rwxrwxrwx 1 root root  50M 2011-05-26 13:09 last-50M-cd2
-rwxrwxrwx 1 root root  50M 2011-05-26 13:44 first-50M-cd1

file *
first-50M-cd1: Matroska data
last-50M-cd2:  data
irn2-cd1.mkv:  Matroska data
I have a file with a size of 6 GB. I would like to compress this file and split it into smaller files. I was also thinking of using bzip2 to compress it, because it offers a good compression ratio. How can I split this file into small ones to compress it?
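One common approach (a sketch; bigfile and the 700m chunk size are placeholders) is to compress on the fly and pipe the output straight into split, so no intermediate 6 GB archive is ever written:

Code:
bzip2 -c bigfile | split -b 700m - bigfile.bz2.part-
# later, to restore the original:
cat bigfile.bz2.part-* | bunzip2 > bigfile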
I am facing a problem while splitting a text file. I need to split the file into parts of 2,000 lines each. When I do it with the split command, the mother file is kept intact, but my requirement is to cut the mother file into parts, so it should not be kept intact.
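split has no option to consume its input, but removing the original once the split succeeds gives the same end result; a minimal sketch (mother.txt is a hypothetical name):

Code:
split -l 2000 mother.txt part- && rm mother.txt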
I'm trying to upload my tarred site to a server, but I have an upload limit. My FTP program extracts the tar after uploading it. How do I split this one tar into two, so that I can upload the first tar and let it extract itself, then upload the second one and let it extract itself too?
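Since each upload has to extract on its own, a binary split of the tar won't work; instead, two self-contained archives can be made from different subsets of the site. A sketch, with hypothetical directory names:

Code:
# each archive extracts independently of the other
tar -czf site-part1.tar.gz images/ media/
tar -czf site-part2.tar.gz --exclude='./images' --exclude='./media' .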
I have a file with 5 columns; column 4 contains numbers. Is it possible to split the file into multiple files using a condition on the contents of column 4, i.e. if column 4 contains a value between 0 and 10, then print the line to a new file called less_than_10.txt?
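awk can route each line to an output file chosen by a condition on a field; a minimal sketch (data.txt and the second bucket's name are assumptions):

Code:
awk '$4 >= 0 && $4 <= 10 { print > "less_than_10.txt"; next }
                         { print > "other.txt" }' data.txt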
I need to split up a large file on Windows so I can upload it in parts to a Linux machine. I'm looking to do the opposite of the join below, hopefully with native utilities to keep it simple.
I understand the Linux side of the equation to be cat filea fileb > file
What is the simplest way to split files on a Windows machine so that they can then be joined together via cat on a Linux machine?
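Windows has no native split command, but if Git for Windows or Cygwin is available (an assumption), their bash ships GNU split, and the parts then join with a single cat on the Linux side; a sketch:

Code:
# on Windows, in Git Bash or Cygwin (any GNU split will do)
split -b 100m bigfile bigfile.part-
# on Linux, after uploading the parts
cat bigfile.part-* > bigfile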
I am backing up parts of my computer with dd, and I was wondering if there is a quick way to split the files created into 4.4 GB files that will fit onto a DVD. Anyone have any idea how to do this?
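split can read from a pipe, so the image never has to exist as one huge file first; a sketch (the device and file names are assumptions):

Code:
dd if=/dev/sda1 | split -b 4400m - sda1.img.part-
# restore with:
cat sda1.img.part-* | dd of=/dev/sda1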
The VBO file that I want to burn to DVD is around 7.9 GB. I want to know if it's possible to split it in order to burn the file onto two different DVDs.
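With a reasonably recent GNU coreutils (8.8 or later, an assumption), split can divide a file into an exact number of pieces; a sketch with a placeholder file name:

Code:
split -n 2 movie.vbo movie.vbo.part-   # two halves of ~3.95 GB each
# rejoin before using the file:
cat movie.vbo.part-* > movie.vbo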
Could someone help me find a way to rename a file to a different name containing parts of its old name?
For example:
Original file name: filename1.abc.xyz.some.other.stuff
Final file name: filename1.abc.xyz
The length of the file name is not constant. The abc.xyz part is not constant, but its format is (three digits, a dot, three digits). The .some.other.stuff is not constant, and it's what I want to get rid of.
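If the part to keep really is a name followed by three digits, a dot, and three more digits (as described above), a bash regular expression can capture it and mv can do the rename; a sketch under that assumption:

Code:
for f in *; do
  if [[ $f =~ ^(.+\.[0-9]{3}\.[0-9]{3})\..+$ ]]; then
    mv -- "$f" "${BASH_REMATCH[1]}"   # keep only the captured prefix
  fi
done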
I run a memory-hungry process (mkcromfs) which consumes more memory than I have physical memory on my laptop, so it is paging and swapping and thrashing all the time and the loadavg is about 2 (compcache is already in use, with a usual swap partition as well), but it is slowly moving forward (although I'm afraid it will eventually try to allocate >2 GB and crash, wasting 2 days of thrashing).
When I want to use the laptop for something else, I stop the process and start the X server, Firefox and other programs. The problem is that when I start Firefox, the loadavg jumps to 10 and the system becomes almost completely unresponsive (a long time to toggle Caps Lock, slow mouse cursor updates, slow switching from the X server to the Linux console, slow login).
The stopped mkcromfs still holds a lot of memory (464.8 MiB, slowly falling) and moves it to swap only when more memory is needed for some other program, which results in a great slowdown.
How can I tell Linux to swap this process out entirely (e.g. when I don't intend to resume it in the short term), possibly waking other data from swap? It would also be useful to be able to specify the exact swap device to swap the given process out to (for example, mkcromfs's memory is useless in ramzswap).
Update: for now I just write 400-600 MB of data from /dev/erandom to tmpfs, which forces mkcromfs to shrink. Is there a more proper way?
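One less brutal option (a sketch, assuming a kernel with the cgroup v1 memory controller mounted; the paths and the 64M figure are assumptions) is to put the stopped process into a memory cgroup and shrink its limit, which forces its pages out to swap:

Code:
mkdir /sys/fs/cgroup/memory/squeeze
echo $(pidof mkcromfs) > /sys/fs/cgroup/memory/squeeze/cgroup.procs
echo 64M > /sys/fs/cgroup/memory/squeeze/memory.limit_in_bytes

Choosing which swap device the evicted pages land on is not something this controls, though.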
How can I split my local mailbox into an individual file for each mail? My scenario is that I fetch emails from a mail server onto my local Linux box with the fetchmail command, but I want each fetched mail in a different individual file, for easy processing and manipulation, for example sending those emails through SMS and so on.
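Since messages in an mbox file each start with a "From " line, csplit can cut the box at every message boundary; a minimal sketch (the mbox path is an assumption):

Code:
csplit -z -f msg- /var/mail/$USER '/^From /' '{*}'
# produces msg-00, msg-01, ... one message per file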
I have a file of just over 1 GB in size. I want to split it into either 100 MB rar files or 100 MB zip files.
Anybody know what command I need for this?
Please keep it simple, I'm very new to Linux, so I'm hoping somebody can give me a command along the lines of "blabla "put file name here" "put file size here" "put folder to rar/zip here"".
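Both tools have a built-in volume option, so no separate split step is needed; a sketch (archive and file names are placeholders):

Code:
rar a -v100m archive.rar bigfile    # 100 MB rar volumes
zip -s 100m archive.zip bigfile     # or 100 MB zip volumes

Note that some unzip versions want the split zip recombined first (zip -s 0 archive.zip --out single.zip) before extraction.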
I am using rsync to create rotating snapshot-style backups of my web files and sending them via SSH to a remote location in order to burn them for offsite storage. This is all working perfectly. The remote machine is a Windows Server 2003 box which has data that I combine with my web files before burning. I have Cygwin installed on the remote server in order to archive and compress the entire backup using tar. (This is not a post about Cygwin, I just thought I would mention it in case anyone was wondering how I was running Linux commands after transferring to the Windows box.) After compression, the backup is over 12 GB. The next step in my process is to split this tar.gz file into smaller chunks in order to burn them to DVDs. I use dual-layer DVDs with an 8.5 GB capacity.
I also use Cygwin to split the tar.gz into multiple 2 GB files using the split command. When I burn them, I only put 3 files on each disc, totaling 6 GB, to leave some padding in case that was a problem. The burn completes and reports success, but it errors out during verification. I have tried this multiple times and it seems to fail verification at the same point every time, which leads me to believe that it has something to do with the data. I have also done tests such as creating smaller backups with completely different data and burning that to a CD-R, which worked fine, so I'm convinced this process can work, I just can't get it to work in the right situation. I have also tried burning one of the 7 split files to a dual-layer DVD, which also worked fine. I'm wondering if there is a chunk of data causing this problem in one of the other split files?
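One way to tell a burner problem from a data problem (a sketch; the file names and mount point are assumptions) is to checksum the split files before burning and verify them again from the mounted disc:

Code:
md5sum backup.tar.gz.part-* > parts.md5   # before burning, alongside the parts
# after burning, from the mounted disc:
cd /media/dvd && md5sum -c parts.md5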
I'm looking for a free alternative for splitting files into .partXX files. I know this can be accomplished with rar, but it's shareware, and I was wondering if there's a free alternative that accomplishes the same job.
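GNU split is free and can produce numbered .partXX-style names directly; a sketch with placeholder names:

Code:
split -d -b 100m bigfile bigfile.part   # bigfile.part00, bigfile.part01, ...
cat bigfile.part* > bigfile             # to rejoin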
Rather simple question, but is there a way to make an archive (simple tarball, no compression needed) out of a very large file and split it into parts? Basically I need a ~1GB file in 25MB pieces.
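tar can write to stdout and split can read stdin, so the two chain directly with no intermediate archive; a sketch (bigfile is a placeholder):

Code:
tar -cf - bigfile | split -b 25m - bigfile.tar.part-
# rebuild and unpack:
cat bigfile.tar.part-* | tar -xf -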
I'm trying to reconstruct/extract a file that was too large to fit onto a floppy; I used 7-Zip to create and split the file into multiple parts in tar.bz2 format. This was done in Windows. Then I moved all the parts of the file to Tiny Linux on a really old laptop: no CD drive, no USB or network, so I have to rely on the floppy drive. I know that reconstruction while extracting is possible with commands, but it's not working. I tried tar -xMf file.tar.001, but nothing.
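tar's -M flag expects tar's own multi-volume format, but 7-Zip's .001/.002 parts are a plain binary split of the archive, so they need to be concatenated back first; a sketch (assuming the part names sort correctly):

Code:
# copy every part in from its floppy, then:
cat file.tar.0* > file.tar
tar -xjf file.tar   # -j if the archive really is bzip2-compressed; plain -xf otherwise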
I am downloading a set of files that were split by a program called ffsj
http://www.jaist.ac.jp/~hoangle/filesj/
The Fastest File Splitter and Joiner.
I have been googling, but I am not finding anything that tells me how I might join these files on my CentOS Linux box. How can I join these files using CentOS?
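As far as I know, FFSJ's standard (non-encrypted) split is HJSplit-compatible, i.e. a plain binary split with .001/.002 part names, so cat can rebuild the original; a sketch under that assumption:

Code:
cat file.avi.0* > file.avi   # file.avi is a placeholder for the real name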
I have noticed that when I right-click on many file types, for instance ISO files, I don't have the option to store them in compressed files at all. I have one ISO file now that I want to compress and split into three parts so I can upload it, but as I said, right-clicking on the file doesn't help because I don't have the option there. So what can I do?
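From a terminal, 7z can compress and cut into volumes in one step; a sketch (the names and the 800m figure are assumptions — pick a volume size near a third of the expected compressed size to get three parts):

Code:
7z a -v800m image.7z image.iso   # writes image.7z.001, .002, .003
# after downloading all parts, extract directly from the first one:
7z x image.7z.001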
I have a log file on Ubuntu 10.04 that has 500 lines of log data in it. What command could I use in a terminal to split the single 500-line file into ten files of 50 lines each?
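split's -l option does exactly this; a sketch (the log file name is a placeholder):

Code:
split -l 50 -d mylog.log mylog.part-   # mylog.part-00 ... mylog.part-09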
When streaming videos I sometimes want to save them after watching. Generally they're saved as Flash video or MPEG-4 video, which is all fine. However, periodically they're split up and saved in smaller chunks with names like data_1, data_2, data_3, etc.; these range from 14.0 to 44.0 MB. The file in question (currently I'm trying to save it from the cache) was from divxden, or possibly divxstage.eu; either way I think it used the Totem plugin. So, my question is: does anyone know if these files can be stuck back together, or if this behaviour can be changed so that streamed files are kept intact?
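If the cached chunks really are consecutive slices of one stream (an assumption based on the names), concatenating them in numeric order is worth a try, then checking what came out:

Code:
cat data_1 data_2 data_3 > saved-video   # keep numeric order, not glob order
file saved-video                         # does it look like a Flash/MP4 container now?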