I have a large list of folders that I want to compress using tar. The script I want to write should traverse all the folders and, for each one, tar its contents into a tar file with the same name as that folder, and that tar file should be stored inside the same folder.
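A minimal sketch of the loop described above, assuming the folders all live under one parent directory (here /data, a hypothetical path):

    #!/bin/bash
    # For each subdirectory of /data, tar its contents into a tar file named
    # after the folder and store that tar file inside the folder itself.
    for dir in /data/*/; do
        name=$(basename "$dir")
        tar -cf "$dir$name.tar" -C "$dir" --exclude="$name.tar" .
    done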
I want to package my application for Linux. I use zip/tar.gz/7z/rar to do the compression on Windows. Because the compression is done on Windows, the file permissions are reset when I extract the package on Linux. The problem is that only tar.gz lets me extract runnable files; the other formats (zip/7z/rar) do not (the file permissions are reset to 644). So my question is: how can I compress my files using zip/7z/rar while setting the permissions to 755?
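As a hedged aside: zip/7z/rar archives created on Windows generally carry no Unix permission bits at all, so there may be nothing to set at compression time. One workaround is to restore the executable bit after extraction on Linux, for example:

    # Hypothetical path: mark every file under the extracted bin/ directory as 755
    find ./myapp/bin -type f -exec chmod 755 {} +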
My hard disk is failing and I am not able to boot into the system! Currently I have logged into the system using a Live CD. Is there any way to compress and back up the data on my hard disk efficiently?
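A minimal sketch of one compress-while-copying approach from a live CD, assuming the failing disk is /dev/sda and an external drive is mounted at /mnt/backup (both hypothetical):

    # Image the whole disk, keep going past read errors, and compress on the fly
    dd if=/dev/sda conv=sync,noerror bs=64K | gzip -c > /mnt/backup/sda.img.gz

(GNU ddrescue is usually a better tool for a disk that is actively failing, but the pipeline above shows the basic idea.)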
Today I tried to compress some folders containing backup files from last year. I right-clicked on the folders and selected compress as tar.gz. I let it work, and found that hours later, the folders were still compressing. How long is it supposed to take, anyway? I was trying to compress the two sets of backups simultaneously; together they're around 1.5 GB. They have many subdirectories.
I am using classpath-0.98, jamvm-1.5.4, and an ARM9 Cortex processor. My question is: after installing Classpath and JamVM on the ARM9, I end up with a FAT image of around 30 MB. Can you give me some tips on how to reduce the size of that FAT image as much as possible?
I have been trying to write a script that will take a directory, for example /accounts, compress it into a .tar file whose filename contains the date of compression, for example accounts030210.tar, and then place that file into a directory called /archive.
I also want the script to delete files in /archive that are older than 7 days.
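A minimal sketch of such a script, assuming the date format shown in the example (DDMMYY) and the paths /accounts and /archive:

    #!/bin/bash
    # Archive /accounts into /archive with the date in the name, e.g. accounts030210.tar
    tar -cf "/archive/accounts$(date +%d%m%y).tar" /accounts

    # Delete archives in /archive older than 7 days
    find /archive -name 'accounts*.tar' -mtime +7 -delete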
I have a file that is 6 GB in size. I would like to compress this file and split it into smaller files. I was also thinking of using bzip2 to compress it, because it offers a good compression ratio. How can I split this file into smaller pieces and compress it?
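A minimal sketch, assuming the file is called bigfile (hypothetical) and roughly 1 GB pieces are wanted:

    # Compress first (keeping the original), then split the result into 1 GB chunks
    bzip2 -k bigfile
    split -b 1G bigfile.bz2 bigfile.bz2.part_

    # To restore later: cat bigfile.bz2.part_* > bigfile.bz2 && bunzip2 bigfile.bz2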
I want to compress all files, folders, subfolders, images, and everything else inside public_html, and have the archive saved in the same folder so I can find it easily. How can I do it?
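A minimal sketch, assuming public_html sits in your home directory and the archive should land right next to it:

    # Pack everything under public_html (files, subfolders, images) into one archive
    tar -czf ~/public_html.tar.gz -C ~ public_html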
I just read in my Linux+ resources that it is not a wise idea to compress tar files with gzip if my tape drive supports compression. In addition, the resources mention that using gzip for tar files runs the risk of data loss if an error occurs during compression.
Does anyone compress tar files for major backups/restore policies?
I want to know how to compress files to zip, rar, or 7z with a password using the KDE or Dolphin interface and no command line. I can compress without a password, but I have not found an option to protect an archive with a password. What do I have to install or configure?
I've googled until my brain went boom... So the short story goes: I have lots of IP address ranges in multiple files which need to go into an iptables firewall... Sounds simple, right?
Example of file contents:
1.0.1.0-1.0.1.255
1.0.1.0-1.1.0.255
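A minimal sketch of one way to feed ranges in that format into iptables, assuming a file called ranges.txt (hypothetical) with one range per line and the iprange match module available:

    #!/bin/bash
    # Add a DROP rule for each "start-end" range in the file
    while IFS= read -r range; do
        [ -n "$range" ] && iptables -A INPUT -m iprange --src-range "$range" -j DROP
    done < ranges.txt

(For very long lists, ipset is normally the faster option, but the loop shows the basic mapping from the file format to firewall rules.)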
I need to be able to compress into multiple pieces that can still be read by someone else who has Windows. I can only download unrar but not rar. Why is this?
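A hedged sketch using 7-Zip volumes, which the free 7-Zip program on Windows can reassemble (the folder name is hypothetical):

    # Create 50 MB volumes: archive.7z.001, archive.7z.002, ...
    7z a -v50m archive.7z myfiles/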
I currently have a bash script that runs and backs up my daily files onto a separate partition using rsync, but I thought it would be good to use the Ubuntu One service as an ADDITIONAL backup for really important files.
How do I compress then encrypt those files, and can I add any commands that will do this to my existing bash script?
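A minimal sketch of one compress-then-encrypt pipeline with tar and GnuPG symmetric (passphrase) encryption; the file list and output path are hypothetical:

    # Compress the important files and encrypt the stream in one go
    tar -czf - ~/Documents/important | gpg --symmetric --cipher-algo AES256 \
        -o ~/backup/important-$(date +%F).tar.gz.gpg

    # To restore: gpg -d important-<date>.tar.gz.gpg | tar -xzf -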
I have a compressed text file. The compression method is unknown. I can see the file's contents using Midnight Commander without a problem, but I would like to view the file just with cat. So I am trying to uncompress the file with unzip or gunzip, but it does not work. How can I check which method the file was compressed with? Is there any way to find it with Midnight Commander?
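A minimal sketch of checking the format from the command line (the file name is hypothetical):

    # Let file(1) identify the format from the magic bytes
    file mystery.txt        # e.g. "mystery.txt: gzip compressed data, ..."

    # If it turns out to be gzip, it can be viewed without renaming it
    zcat mystery.txt | less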
I need to recreate my initrd.img after having extracted its contents. Bash by itself, or pointing me to similar threads in this forum and on Google, is useless to me and a waste of everyone's time, as that has all failed. I need a working example. Apparently, I am supposed to use this bash command: "zcat ../initrd.gz | cpio -i -d". The preceding command is unintelligible to me. I cannot compress the initrd.img files and folders back into an initrd.gz file with a compression level of 9 so that I can rename it with a .img extension.
My understanding of recompressing folders back into the initrd.img: Google and this forum all point to bash involving either zcat or cpio and then gzip with a compression level of 9. However, I require exact instructions for using these commands to compress the folders that have been extracted from the initrd.img back into one homogeneous initrd.gz archive, so that the created initrd.gz can be renamed initrd.img.
Note: posting bash without an example is a waste of everyone's time, as I found that on Google and it was useless; I lack the requisite computer science degree or years of Linux guru experience needed to figure out how to specify the arguments properly. What I need is a working example, not just bash.
Note 2: To save time, the answer to why I need to edit the initrd.img is this: two different utilities (based upon the same parent system and kernel) use the same initrd and the same file paths. When they are installed on separate partitions and the one farthest from the MBR is selected for boot, it will begin to boot and then switch to the one closest to the MBR, which results in a failed boot. If one is removed, the other boots fine, so it's not a menu.lst or a lilo config problem.
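For reference, a minimal sketch of the usual reverse of the zcat/cpio extraction quoted above, assuming the extracted tree sits in a directory called initrd-root (hypothetical):

    cd initrd-root
    # Pack everything back into a newc-format cpio archive, gzip it at level 9,
    # write it one level up, then rename it to initrd.img
    find . | cpio -o -H newc | gzip -9 > ../initrd.gz
    mv ../initrd.gz ../initrd.img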
In ImageMagick, I have a PNG picture, but it seems too big to me and I want to compress it. How can I do that from the command line? Code: convert "png.png" -strip -units PixelsPerInch -density 96 -quality 60 "png2.png". This command can't compress the picture; what's worse, the resulting file is bigger...
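As a hedged note: for PNG, -quality does not work the way it does for JPEG (it selects the zlib compression level and filter rather than a lossy quality), which is why the file can come out larger. Two things that may actually shrink a PNG:

    # Recompress losslessly with the strongest zlib setting (often only a small saving)
    convert png.png -strip -define png:compression-level=9 png2.png

    # Or reduce to an 8-bit palette (lossy, but usually much smaller)
    convert png.png -strip -colors 256 PNG8:png2.png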
Is there any way, with tar, zip, gzip, or any other compression tool, to compress without causing high CPU usage? In other words, can I limit how hard the CPU works to compress it? Of course I understand that this would cause the compression to take longer, but time isn't too big of a concern.
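A minimal sketch of throttling a compression job with standard scheduling tools (the directory name is hypothetical):

    # Run the whole pipeline at the lowest CPU and I/O priority so it only uses idle cycles
    nice -n 19 ionice -c3 tar -czf data.tar.gz data/

(Tools such as cpulimit can cap a process at a fixed CPU percentage if a hard limit rather than a low priority is needed.)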
I have 6680 WAV files of about 500 KB each in a folder and I want to merge all of them. The size of the files altogether is 1.5 GB. How can I merge them and compress them to create an MP3 or Ogg file?
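A hedged sketch with sox, assuming all the WAV files share the same sample rate and channel count and that sox was built with Ogg support:

    # Concatenate every .wav in the folder (in name order) straight into one Ogg file
    sox *.wav merged.ogg

    # Or merge to a single WAV first and encode it to MP3 with lame
    sox *.wav merged.wav
    lame merged.wav merged.mp3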
I have a few .flv files now, and most are small enough to upload to my Joomla web site and play with the AllVideos extension, but a few are over the 10 MB limit; I have one that is actually 50 MB! Is there something I can use to either split the videos (make 2 or 3 out of one and upload them separately) or compress them down to under 10 MB without wrecking the quality?
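A hedged sketch with ffmpeg (file names, durations, and bitrates are hypothetical, and re-encoding is inherently lossy):

    # Cut the first 5 minutes into its own file without re-encoding
    ffmpeg -i big.flv -ss 0 -t 300 -c copy part1.flv

    # Or re-encode the whole file at a lower bitrate to shrink it
    ffmpeg -i big.flv -b:v 400k -b:a 64k smaller.flv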
I have noticed that if I right-click on many file types, for instance ISO files, I don't have the option to store them in compressed files at all. I have one ISO file now that I want to compress and split into three parts so I can upload it, but as I said, right-clicking on the file doesn't help because I don't have the option there. So what can I do?
I am trying to compress a folder and its contents while keeping the permissions the same. I then need to check whether the compressed file is corrupt or not. Based on that result, I need to transfer the file.
cd /home/ops/Desktop/temp
tar -czvpf backup-"$(date +%d-%b-%y)".tar.gz /home/ops/Desktop/dir1
gzip -t backup-"$(date +%d-%b-%y)".tar.gz
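A hedged sketch of how the integrity test could gate the transfer (the destination host and path are hypothetical):

    archive="backup-$(date +%d-%b-%y).tar.gz"
    if gzip -t "$archive"; then
        scp "$archive" user@backuphost:/srv/backups/
    else
        echo "$archive failed the integrity test, not transferring" >&2
    fi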
I'm creating a script that creates files from an svn checkout and compresses them using tar.gz. The script gets the repository name from a command-line argument. I need to capture a number from the last line of the output and create a file name from it.
svn prints all the file names from the repository, and at the end it says: revision number xxxxx. I need to get this number and then name the tar.gz after it. How do I save the output to a variable and extract this number?
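A minimal sketch of capturing the checkout output and pulling the revision number from its last line (the argument handling and paths are hypothetical):

    #!/bin/bash
    repo="$1"

    # svn's last line looks like "Checked out revision 12345."
    output=$(svn checkout "$repo" checkout-dir)
    rev=$(echo "$output" | tail -n 1 | grep -o '[0-9]\+')

    tar -czf "$rev.tar.gz" checkout-dir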
I've just noticed my Dolphin compress service menu doesn't compress anything except '*.zip'. It used to compress '*.tar.gz' and '*.tar.bz2', but now none of these work, and the bz2 option is gone altogether. Does anyone know what might have happened? I have openSUSE 11.3 and KDE 4.6, but this happened before I upgraded to KDE 4.6 just a few days ago, so it's not KDE 4.6 related; maybe 4.5? Maybe Dolphin? I also tried to install the extract/compress plugin, but it says "Could not find program 'compress_TARBZ2.sh'", even though those scripts are all in my /bin folder. I also checked that I have all the extract/compress tools installed: zip, tar, gzip, bzip2, and so on. I also tried 'Compress to...', which opens Ark, then selected either 'gz' or 'bz', and still nothing happened.