Rather simple question, but is there a way to make an archive (simple tarball, no compression needed) out of a very large file and split it into parts? Basically I need a ~1GB file in 25MB pieces.
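One way to do this (a minimal sketch; "bigfile" and the part prefix are placeholders, assuming GNU tar and split) is to pipe the uncompressed tar stream straight into split:
Code:
tar cf - bigfile | split -b 25m - bigfile.tar.part-   # 25MB pieces, no compression
cat bigfile.tar.part-* | tar xf -                     # rejoin and extract on the other end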
I have a file 6GB in size. I would like to compress this file and split it into smaller files. I was also thinking of using bzip2 to compress it, because it offers a good compression rate. How can I split this file into small pieces for compression?
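A rough sketch, assuming bzip2 and GNU split are available; "bigfile" and the 100MB chunk size are just placeholders:
Code:
bzip2 -c bigfile | split -b 100m - bigfile.bz2.part-   # compress and split in one pass
cat bigfile.bz2.part-* | bunzip2 -c > bigfile          # rejoin and decompress later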
The VOB file that I want to burn to a DVD is around 7.9GB. I want to know if it's possible to split it in order to burn the pieces onto two different DVDs.
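Splitting it in half with split is possible, but note that neither half will play on its own; both parts have to be copied back off the discs and joined again before use. A hedged sketch, with a placeholder filename and a chunk size chosen to fit a single-layer DVD:
Code:
split -b 4300m movie.vob movie.vob.part-   # produces movie.vob.part-aa and movie.vob.part-ab
cat movie.vob.part-* > movie.vob           # after copying both parts back to a hard disk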
I'm trying to upload my tarred site to a server, but I have an upload limit. My FTP program extracts the tar after uploading it. How do I split this one tar into two, so I could upload one tar and let it extract itself, and then upload the second one and let it extract itself too?
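Since the FTP program extracts each upload on its own, a binary split of one tar won't work here; each upload needs to be a complete tar of part of the site. A minimal sketch with made-up directory names (adjust to your own layout):
Code:
tar czf site-part1.tar.gz public_html/images public_html/media
tar czf site-part2.tar.gz --exclude='public_html/images' --exclude='public_html/media' public_html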
I have a file over 1GB in size. I want to split it into either 100MB rar files or 100MB zip files.
Anybody know what command I need for this?
Please keep it simple, I'm very new to Linux. I'm hoping somebody can give me a command along the lines of "blabla <put file name here> <put file size here> <put folder to rar/zip here>".
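Hedged examples, assuming the rar and zip command-line tools are installed (rar is not free software); "bigfile" and the 100MB volume size are placeholders:
Code:
rar a -v100m archive.rar bigfile   # split into 100MB rar volumes
zip -s 100m archive.zip bigfile    # split into 100MB zip volumes (archive.z01, archive.z02, ..., archive.zip)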
I am using rsync to create rotating snapshot-style backups of my web files and sending them via SSH to a remote location in order to burn them for offsite storage. This is all working perfectly. The remote machine is a Windows Server 2003 box which has data that I combine with my web files before burning. I have Cygwin installed on the remote server in order to archive and compress the entire backup using tar. (This is not a post about Cygwin, I just thought I would mention it in case anyone was wondering how I was running Linux commands after transferring everything to the Windows box.) After compression, the backup is over 12GB. The next step in my process is to split this tar.gz file into smaller chunks in order to burn them to DVDs. I use dual-layer DVDs, which hold 8.5GB.
I also use Cygwin to split the tar.gz into multiple 2GB files using the split command. When I burn them, I only put 3 files on each disc, totaling 6GB, to leave some padding in case that was a problem. The burn completes and says it was successful, but it errors out during verification. I have tried this multiple times and it seems to fail verification at the same point every time, which leads me to believe it has something to do with the data. I have also done tests such as creating smaller backups with completely different data and burning those to a CD-R, which worked fine, so I'm convinced this process can work, I just can't get it to work in the right situation. I have also tried burning one of the 7 split files to a dual-layer DVD, which also worked fine. I'm wondering if there is a chunk of data in one of the other split files that is causing this problem?
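One way to narrow down whether a particular split file is the problem (a hedged sketch; the filenames are placeholders) is to checksum the parts before burning and again from the burned disc, then compare:
Code:
md5sum backup.tar.gz.part-* > parts.md5   # run in the folder holding the split files, before burning
md5sum -c parts.md5                       # run again from the mounted DVD, with parts.md5 copied alongside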
I'm looking for a free alternative for splitting files into .partXX files. I know this can be accomplished with rar, but it's shareware, and I was wondering if there's a free alternative that accomplishes the same job.
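split and cat from GNU coreutils do this for free; the part size and names below are only placeholders (the -d switch gives numeric .part00, .part01, ... suffixes):
Code:
split -b 100m -d bigfile bigfile.part   # bigfile.part00, bigfile.part01, ...
cat bigfile.part* > bigfile             # join them back together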
I'm trying to reconstruct/extract a file that was too large to fit onto a floppy. I used 7-Zip to create and split the file into multiple parts in tar.bz2 format; this was done in Windows. Then I moved all the parts of the file to a tiny Linux install on a really old laptop with no CD drive, USB or network, so I have to rely on the floppy drive. I do know that reconstructing while extracting using commands is possible, but it's not working. I tried tar -xMf file.tar.001 but got nothing.
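If 7-Zip's splitter was used, the .001/.002 pieces are a plain byte-for-byte split, not a tar multi-volume archive, which is why tar -xMf does nothing. A hedged sketch (the exact filenames are guesses based on the post):
Code:
cat file.tar.0* > file.tar   # the glob sorts the parts in order: .001, .002, ...
tar xjf file.tar             # use xjf if it is bzip2-compressed, plain xf otherwise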
I have a file with 5 columns. Column 4 contains numbers. Is it possible to split the file into multiple files using a condition on the contents of column 4, i.e. if column 4 contains a value between 0 and 10, then print those lines to a new file called less_than_10.txt?
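awk can do this in one pass; a minimal sketch where input.txt, whitespace-separated columns, and the second output filename are assumptions:
Code:
awk '$4 >= 0 && $4 <= 10 { print > "less_than_10.txt" }
     $4 > 10             { print > "greater_than_10.txt" }' input.txt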
I am downloading a set of files that were split by a program called ffsj
http://www.jaist.ac.jp/~hoangle/filesj/
The Fastest File Splitter and Joiner.
I have been googling, but I'm not finding anything that tells me how to join these files on my CentOS Linux machine. How can I join these files using CentOS?
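As far as I know, FFSJ's standard (non-encrypted) split is a plain byte-for-byte split, so concatenating the parts in order should rebuild the original; the part names below are an assumption based on FFSJ's usual .__a/.__b naming, and an encrypted split would not join this way:
Code:
cat movie.avi.__a movie.avi.__b movie.avi.__c > movie.avi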
I need to split up a large file on Windows so I can upload it in parts to a Linux machine, then rejoin it on the Linux side, hopefully with native utilities to keep it simple.
I understand the Linux side of the equation to be cat filea fileb > file.
What is the simplest way to split files on a Windows machine so that they can then be joined together via cat on a Linux machine?
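There is no split equivalent built into cmd.exe, but 7-Zip's "Split file..." function does a raw byte split into .001/.002 pieces, which cat can rejoin on the Linux side. A hedged sketch with a placeholder filename:
Code:
cat bigfile.001 bigfile.002 bigfile.003 > bigfile   # on the Linux machine, parts in order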
I have noticed that when I right-click on many file types, for instance ISO files, I don't have the option to add them to a compressed file at all. I have one ISO file now that I want to compress and split into three parts so I can upload it, but as I said, right-clicking on the file doesn't help because the option isn't there. What can I do?
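From a terminal this can be done without the right-click menu. A hedged sketch where image.iso and the volume size are placeholders (pick a size close to one third of the ISO); the first line assumes p7zip is installed:
Code:
7z a -v1500m image.7z image.iso       # compress and split into 1500MB volumes
split -b 1500m image.iso image.iso.   # or: raw split with no compression; rejoin later with cat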
I have a log file on Ubuntu 10.04 that has 500 lines of log data in it. What command could I use in a terminal to split the single 500-line file into ten files of 50 lines each?
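split's -l option does exactly this; a sketch assuming the log is called mylog.log and that numeric-suffixed output names are acceptable:
Code:
split -l 50 -d mylog.log mylog.part.   # mylog.part.00 ... mylog.part.09, 50 lines each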
I am backing up parts of my computer with dd, and I was wondering if there is a quick way to split the files it creates into 4.4GB-sized pieces that will fit onto a DVD. Anyone have any idea how to do this?
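You can split the dd image as it is created, or split an existing image file afterwards. The device and file names here are placeholders, and 4400m (MiB) leaves a little headroom on a 4.7GB disc:
Code:
dd if=/dev/sda2 bs=4M | split -b 4400m - backup.img.part-   # image and split in one pass
cat backup.img.part-* | dd of=/dev/sda2 bs=4M               # restore later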
I used split -b 32m "file.bz2" "file.bz2.part-" to split a file and it created more than 50 parts. From googling, the way I found to reassemble the parts is to cat file.bz2.part-aa file.bz2.part-ab > file.bz2, while enumerating all the 50+ parts. Is there an easier way to reassemble the parts wherein I no longer need to list all those parts explicitly?
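A shell glob expands in sorted order, and split's aa, ab, ac, ... suffixes sort correctly, so the parts don't need to be listed by hand:
Code:
cat file.bz2.part-* > file.bz2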
When streaming videos I sometimes want to save them after watching. Generally they're saved as Flash video or MPEG-4 video, which is all fine. However, periodically they're split up and saved in smaller chunks with names like data_1, data_2, data_3, etc., ranging from 14.0 to 44.0 MB. The file in question (which I'm currently trying to save from the cache) was from divxden, or possibly divxstage.eu; either way I think it used the Totem plugin. So, my question is: does anyone know if these files can be stuck back together, or if this behaviour can be changed so streamed files are kept intact?
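Heavily hedged, since it depends on how the site serves the video: if data_1, data_2, ... are consecutive pieces of one progressive download, simply concatenating them in order may give a playable file; if they are separate container segments, a remux with something like ffmpeg would be needed instead. The output name is a guess:
Code:
cat data_1 data_2 data_3 > saved-video.flv   # only works if the chunks are a plain byte split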
Today encoders are getting smarter; they can compress a Blu-ray to similar quality in 700MB. It seems the header of a video file contains info about the frame rate, audio/video encoders, etc., which can't be guessed. In MPEG audio, every part of the file is independently playable. But if a movie is binary-split into 6 parts and I don't have the first part, it is unplayable.
Code:
ls -l
-rwxrwxrwx 1 root root 280M 2010-12-07 20:23 irn2-cd1.mkv
-rwxrwxrwx 1 root root  50M 2011-05-26 13:09 last-50M-cd2
-rwxrwxrwx 1 root root  50M 2011-05-26 13:44 first-50M-cd1
file *
first-50M-cd1: Matroska data
last-50M-cd2:  data
irn2-cd1.mkv:  Matroska data
Hi, I am using WinFF, as converting video is a mystery to me. I have several AVI files, about 26 of them, totalling about 8GB. Now I have to convert them to AVI again so my DivX DVD machine can read/play them.
I was wondering if I could compress them so they'd fit onto one DVD?
At present I select 16:9; would anamorphic be smaller? Would it expand properly on my cheap DivX DVD machine? I don't mind using the command line.
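File size is set by bitrate times duration, not by the aspect-ratio choice, so 16:9 versus anamorphic won't change how much fits on the disc. Since WinFF is a front end for ffmpeg, here is a hedged command-line sketch; the bitrates and filenames are placeholders to be adjusted for your total running time, and it assumes an ffmpeg build with the mpeg4 and libmp3lame encoders:
Code:
ffmpeg -i input.avi -c:v mpeg4 -b:v 900k -c:a libmp3lame -b:a 128k -aspect 16:9 output.avi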
I have a question about the best choice of filesystem. We have in our company a production webserver running Debian with nginx to serve images. There are about 15 million of them, around 2KB or 3KB each, and they are stored under a subdirectory in /home with a hashed subdirectory tree two levels deep. The images are copied over NFS from another machine from time to time, and we get high load while this copying is running. Currently we are using ext3 and are considering moving to JFS because of its relatively low load impact, but I am still curious about the performance. Serving lots of images is a commonly practiced thing, so I would be interested in what others are using as a filesystem in this scenario, or if someone could make a recommendation for our case.
I have several audiobooks that are each split into many small chapters, and I would like to string together about ten of them at a time, so that I don't have 60 4-5 minute MP3 files per audiobook. If I were to do this all by hand, with Audacity or something similar, it would get very tedious, so I'd like to know if there's already a program that would do it for me when told which files to string together.
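MP3 frames are self-contained, so plain concatenation usually produces a playable file, though ID3 tags between tracks can cause small glitches on some players (mp3wrap or an ffmpeg remux would be cleaner). A sketch with placeholder names, joining a batch of chapters:
Code:
cat chapter01.mp3 chapter02.mp3 chapter03.mp3 > part1.mp3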
I had a Samsung 1TB HDD that I used for storing data on an XP machine, so it was formatted as NTFS. I moved this HDD to another machine and installed FreeNAS on it, and the installation worked fine (FYI, I used the tutorial posted here: [URL]). During the installation, FreeNAS installed its system files to a new small UFS partition. After finishing the setup, I realised that I had changed the file system of the other partition (980GB, previously NTFS) to UFS, and now I don't know how to go back. I had about 400GB of data on it and I'm pretty sure it's still there, but I don't know how to get it back.
I tried messing around with recovery software such as R-Studio, and I was able to see some of my files, so I know they're still there. After quite a bit of googling around, the only solution I seem to find is using GParted, which is a tool to modify partitions' file systems without losing data, but I'm afraid to use it.
So is there a way to browse the NTFS data on a UFS partition and convert it so FreeNAS can see my files? Or is there a way to put the partition back to NTFS so I can back up my data to another drive before I lose something valuable?
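Heavily hedged, since any further write to the disk risks making things worse: TestDisk (free, and runnable from a Linux live CD) can scan a drive for old NTFS partition boundaries and in some cases restore the previous partition table. The device name below is a placeholder, and it is best attempted only after imaging the disk or recovering what you can with R-Studio first:
Code:
testdisk /dev/sdb   # interactive; pick the disk, then Analyse / Deeper Search for the lost NTFS partition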
Samba seems to crash and come back after some seconds if I copy a lot of small files in a short period of time over the network. How do I fix it?
I have Ubuntu 9.10 Server 64-bit running on a D945GCLF2 board, sharing two 1TB ext4-formatted HDDs to my Windows PCs using Samba. I've been having an issue with reading or writing files through Samba. It happens during copying operations or checksumming, anything that reads or writes MANY small files in a small amount of time. I am pretty sure the problem has to do with my server, because the server has run on two different LANs in different homes and will crash from activity with any of several other PCs. There is no crashing if I access the files through SSH, although when I do that the max transfer speed is less than 1MB/s.
When I induce the crashing, there is absolutely no output to the server terminal.
As an easily reproducible example of something that will crash Samba, extracting Cinebench R11.5 to the server will do the job. It always fails.
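A couple of hedged first steps to gather more information when it happens (the log path is the usual Ubuntu default and may differ on your install):
Code:
testparm -s                           # sanity-check smb.conf for broken settings
tail -n 100 /var/log/samba/log.smbd   # look for panics or errors around the crash time
dmesg | tail -n 50                    # check for kernel or ext4 errors at the same moment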