Ubuntu :: Compress Files Or Create Split Archives?
Apr 25, 2010
I have noticed that when I right-click on many file types, for instance ISO files, I don't get the option to compress them at all. I have one ISO file now that I want to compress and split into three parts so I can upload it, but as I said, right-clicking on the file doesn't help because the option isn't there. What can I do?
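A minimal command-line sketch, assuming the image is called myimage.iso (a placeholder name): gzip compresses it and split cuts the result into three numbered parts.

    gzip -c myimage.iso > myimage.iso.gz
    split -n 3 -d myimage.iso.gz myimage.iso.gz.part   # -n needs GNU coreutils 8.8+
    # reassemble and decompress later:
    cat myimage.iso.gz.part* > myimage.iso.gz
    gunzip myimage.iso.gz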
How can I group files and create archives accordingly? I have 10,000 files in a folder (no sub-folders) and I want to create 10 zip or tar.gz archives, so that every archive holds 1,000 files. How can I do this in Linux?
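A sketch that works as long as none of the filenames contain newlines: split the directory listing into ten lists of 1,000 names each, then feed each list to tar.

    ls | split -l 1000 -d - filelist.           # filelist.00 ... filelist.09
    for list in filelist.*; do
        tar czf "archive_${list##*.}.tar.gz" -T "$list"   # -T reads names from the list
    done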
I have a file 6GB in size. I would like to compress this file and split it into smaller files. I was thinking of using bzip2 to compress it, because it offers a good compression ratio. How can I split this file into small pieces for compression?
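bzip2 can write to standard output, so the compressed stream can be piped straight into split without a 6GB temporary copy (bigfile is a placeholder name):

    bzip2 -c bigfile | split -b 500M -d - bigfile.bz2.part
    cat bigfile.bz2.part* | bunzip2 > bigfile   # reassemble and decompress later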
Are there any archiving tools other than tar that preserve Linux file permissions and user/group ownership? I want to create archives of a directory tree on my Ubuntu box. The directory trees are large; a first run with tar cvpzf archive.tgz /home/foo/bar yielded a 5GB archive. I also want to keep all permissions, other flags, and special files. I'm fine with a 5GB archive; however, to look inside that archive -- since it is a compressed tar archive -- the whole 5GB has to be decompressed first! (Or so it appears when opening it with the archive viewer -- I'm happy to be corrected.) So I need a way to "back up" a directory tree while preserving full filesystem attributes and rights, creating an archive with an "index" that doesn't have to be decompressed to browse its contents. By "archive" I mean either a single file or a (small) set of files that carries full and complete information within itself. That is, it can live on any filesystem (size permitting), it can be burnt onto a DVD, you can split it (after which the point of this question is really lost -- but still), ...
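One tool often suggested for exactly this is dar (Disk ARchive): it preserves permissions and ownership, stores a catalogue inside the archive so contents can be listed without decompressing everything, and can slice the archive into DVD-sized pieces. A rough sketch, assuming the dar package is installed:

    dar -c backup -R /home/foo/bar -z         # gzip-compressed archive backup.1.dar
    dar -l backup                             # list contents via the catalogue
    dar -c backup -R /home/foo/bar -z -s 4G   # same, sliced into 4GB files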
I have 6,680 WAV files of about 500KB each in a folder and I want to merge all of them; together the files come to 1.5GB. How can I merge them and compress the result into a single MP3 or Ogg file?
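A sketch using sox to concatenate (this assumes all the files share the same sample rate and channel count) and lame or oggenc to encode:

    sox *.wav combined.wav           # last filename is the output
    lame combined.wav combined.mp3   # or: oggenc combined.wav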
I just read in my Linux+ resources that it is not a wise idea to compress tar files with gzip if my tape drive supports compression. In addition, the resources mention that using gzip on tar files runs the risk of data loss if an error occurs in the compressed stream.
Does anyone compress tar files for major backup/restore policies?
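For what it's worth, the usual recommendation when the drive compresses in hardware is to write a plain, uncompressed tar stream; a bad block then damages only the files stored at that point, rather than everything after it as it would in a gzip stream. A sketch, assuming a SCSI tape at /dev/st0:

    tar cvf /dev/st0 /path/to/data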
I want to know how to compress files to zip, rar, or 7z with a password using KDE or Dolphin, with no command line. I can compress without a password, but I haven't found an option to protect archives with one. What do I have to install or configure?
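In case a command-line fallback turns out to be acceptable after all, both zip and p7zip support passwords (this is a sketch, not a Dolphin setting):

    zip -e archive.zip file1 file2           # -e prompts for a password
    7z a -p -mhe=on archive.7z file1 file2   # -mhe=on also encrypts filenames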
I want to package my application for Linux. I use zip/tar.gz/7z/rar to do the compression on Windows. Because I compress on Windows, when I extract the package on Linux the file permissions are reset. The problem is that only tar.gz gives me runnable files after extraction; with the other formats (zip/7z/rar) the permissions are reset to 644. So my question is: how can I compress my files using zip/7z/rar while keeping the permissions at 755?
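A common workaround, since zip/7z/rar archives built on Windows carry no Unix mode bits: nest a tar archive (created where the permissions are already right) inside the outer container, or restore the execute bit after extraction. A sketch with placeholder names:

    tar czpf app.tar.gz myapp/     # tar records the 755 modes
    zip app.zip app.tar.gz         # the outer zip can be built anywhere
    # or simply fix up after extracting on Linux:
    chmod 755 myapp/bin/*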
I currently have a bash script that runs and backs up my daily files onto a separate partition using rsync, but I thought it would be good to use the Ubuntu One service as an ADDITIONAL backup for really important files.
How do I compress then encrypt those files, and can I add any commands that will do this to my existing bash script?
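A minimal sketch that could be appended to the existing script, using GnuPG's symmetric mode (the path and names are placeholders):

    STAMP=$(date +%F)
    tar czf "important-$STAMP.tar.gz" /home/user/important/
    gpg --symmetric --cipher-algo AES256 "important-$STAMP.tar.gz"
    rm "important-$STAMP.tar.gz"   # keep only the encrypted .gpg file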
I've found a website [URL] from which I downloaded Basilisk II [URL] and SheepShaver [URL]. I converted both into deb archives using the command sudo alien, and while they both successfully converted, installed, and show up in the Applications menu, only Basilisk II launches. When I click on the SheepShaver icon, nothing happens. I had this problem with another SheepShaver RPM I downloaded elsewhere. I really want to use SheepShaver to run Mac OS 9 on my laptop; is there something I can do to get it to run?
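One generic first step when a menu icon fails silently: launch the program from a terminal and look at whatever error it prints, such as a missing shared library (this assumes the binary is actually named SheepShaver, which may not be the case):

    SheepShaver
    ldd "$(which SheepShaver)" | grep "not found"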
Is there any way, with tar, zip, gzip, or any compression tool, to compress without causing high CPU load -- in other words, to limit how hard the CPU works during compression? Of course I understand this would make the compression take longer, but time isn't too big a concern.
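tar and gzip have no throttle flag of their own, but the scheduler can be told to deprioritise the job so it only uses otherwise-idle cycles; a sketch (bigfile is a placeholder):

    nice -n 19 ionice -c3 gzip bigfile   # lowest CPU priority, idle I/O class
    gzip -1 bigfile                      # alternatively, -1 simply does less work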
The VOB file that I want to burn to a DVD is around 7.9GB. I want to know if it's possible to split it in order to burn the pieces onto two different DVDs.
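split can cut it into two pieces that each fit a single-layer disc, and cat joins them again after copying back from the DVDs (video.vob is a placeholder name):

    split -b 4000M -d video.vob video.vob.part
    cat video.vob.part00 video.vob.part01 > video.vob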
Today I tried to compress some folders containing backup files from last year. I right-clicked on the folders and selected compress as tar.gz. I let it work, and found that hours later the folders were still compressing. How long is it supposed to take, anyway? I was trying to compress the two sets of backups simultaneously; together they're around 1.5GB, with many subdirectories.
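For comparison, 1.5GB normally compresses in minutes rather than hours; running the same job in a terminal at least shows progress, so a stall is obvious (the folder name is a placeholder):

    tar czvf backups.tar.gz backups/   # v prints each file as it is added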
I'm creating a script that builds files from an svn checkout and compresses them using tar.gz. The script gets the repository name from a command-line argument, and I need to capture a number from the last line of the output and build a file name from it.
svn prints the names of all the files in the repository and at the end says: revision number xxxxx. I need to get this number and then rename the tar.gz after it. How do I save the output to a variable and extract this number?
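A minimal sketch, assuming the checkout output ends with a line like "Checked out revision 12345." (the repository argument and file names are placeholders):

    REPO=$1                                       # repository URL from the command line
    OUTPUT=$(svn checkout "$REPO" working-copy)
    REV=$(echo "$OUTPUT" | tail -n 1 | grep -o '[0-9]\+')
    tar czf "checkout-r${REV}.tar.gz" working-copy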
I download a lot of files that have been compressed into separate pieces by WinRAR, so that a 4GB file shows up as a dozen or so 93MB pieces. In Windows, extracting them with WinRAR was rather easy, but in Kubuntu whatever default program handles them throws a fit. In a perfect world I would get an open-source alternative from Synaptic; alternatively, anything I can download as an RPM I can eventually figure out. I want to avoid trying to get WinRAR working under Wine.
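The usual route is the unrar package from the multiverse repository (there is also unrar-free, though it handles fewer archive versions); pointed at the first piece, it finds the rest on its own:

    sudo apt-get install unrar
    unrar x archive.part1.rar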
I have bought a virtual server to get Magento running. My server is Debian with PHP version 5.2.6-1+lenny3.
For that I need PEAR. I want to install it globally, so I tried the command "apt-get install php-pear". This is what I get:
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following extra packages will be installed: php5-cl
Suggested packages: code....
1+lenny3_all.deb 404 Not Found
E: Unable to fetch some archives, maybe run apt-get update or try with --fix-missing?
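The 404 on the .deb fetch usually just means the local package index is stale, which the error message itself hints at; refreshing it and retrying is the standard first step:

    apt-get update
    apt-get install php-pear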
I am using rsync to create rotating snapshot-style backups of my web files and sending them via SSH to a remote location in order to burn them for offsite storage. This is all working perfectly. The remote machine is a Windows Server 2003 box holding data that I combine with my web files before burning. I have Cygwin installed on the remote server in order to archive and compress the entire backup using tar. (This is not a post about Cygwin; I just mention it in case anyone was wondering how I was running Linux commands after transferring the data to the Windows box.) After compression, the backup is over 12GB. The next step in my process is to split this tar.gz file into smaller chunks in order to burn them to DVDs. I use dual-layer DVDs with 8.5GB of storage.
I also use Cygwin to split the tar.gz into multiple 2GB files using the split command. When I burn them, I only put 3 files on each disc, totaling 6GB, to leave some padding in case that was a problem. The burn completes and reports success, but it errors out during verification. I have tried this multiple times and it seems to fail verification at the same point every time, which leads me to believe it has something to do with the data. I have also run tests such as creating smaller backups from completely different data and burning them to a CD-R, which worked fine, so I'm convinced this process can work; I just can't get it to work in the right situation. I have also tried burning one of the 7 split files to a dual-layer DVD, which also worked fine. I'm wondering if there is a chunk of data in one of the other split files that is causing this problem?
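One way to pin down whether the data or the burn is at fault is to checksum the split pieces before burning and verify them from the mounted disc (names and mount point are placeholders):

    md5sum backup.tar.gz.part* > parts.md5    # before burning
    cd /media/dvd && md5sum -c parts.md5      # after burning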
I'm looking for a free alternative to split files into .partXX files. I know this can be accomplished with rar, but it's shareware, and I was wondering if there's a free alternative that accomplishes the same job.
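split is part of GNU coreutils, so it is free and already installed, and it produces exactly this kind of numbered piece:

    split -b 100M -d bigfile bigfile.part    # bigfile.part00, bigfile.part01, ...
    cat bigfile.part* > bigfile              # to reassemble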
I'm trying to reconstruct/extract a file that was too large to fit onto a floppy. I used 7-Zip to create and split the file into multiple parts in tar.bz2 format; this was done in Windows. Then I moved all the parts to Tiny Linux on a really old laptop -- no CD drive, no USB or network, so I have to rely on the floppy drive. I know that reconstructing while extracting is supposed to be possible from the command line, but it's not working. I tried tar -xMf file.tar.001, but nothing.
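7-Zip's "split to volumes" is a raw byte split, not tar's multi-volume format, so tar -M won't recognise the pieces; concatenating them first should work (disk space permitting):

    cat file.tar.001 file.tar.002 file.tar.003 > file.tar
    tar xjf file.tar    # -j because the stream is bzip2-compressed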
I'm trying to upload my tarred site to a server, but I have an upload limit. My FTP program extracts the tar after uploading it. How do I split this one tar into two, so I can upload one tar and let it extract itself, and then upload the second one and let it extract itself too?
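Splitting the finished tar with split wouldn't help here, because neither half would extract on its own; building two self-contained tars instead, for example by directory, does (the paths are placeholders):

    tar czf site-part1.tar.gz site/static site/images
    tar czf site-part2.tar.gz site/app site/config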
I have a file over 1GB in size. I want to split it into either 100MB rar files or 100MB zip files.
Anybody know what command I need for this?
Please keep it simple, I'm very new to Linux, so I'm hoping somebody can give me a command along the lines of: blabla "put file name here" "put file size here" "put folder to rar/zip here".
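Either tool can do it in one go; both commands below assume the file is called bigfile (rar is non-free but available in the multiverse repository):

    rar a -v100m archive.rar bigfile    # 100MB rar volumes
    zip -s 100m archive.zip bigfile     # split zip archive (needs zip 3.0+)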
I have a log file on Ubuntu 10.04 that contains 500 lines of log data. What command could I use in a terminal to split that single 500-line file into ten files of 50 lines each?
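split does exactly this; a sketch assuming the file is named logfile.log:

    split -l 50 -d logfile.log logfile.part
    # produces logfile.part00 through logfile.part09, 50 lines each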
Rather simple question, but is there a way to make an archive (simple tarball, no compression needed) out of a very large file and split it into parts? Basically I need a ~1GB file in 25MB pieces.
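A sketch that pipes tar straight into split so no oversized intermediate file is needed (bigfile is a placeholder name):

    tar cf - bigfile | split -b 25M -d - bigfile.tar.part
    cat bigfile.tar.part* | tar xf -    # reassemble and extract later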