General :: Backup Script Multiple Gzips Into One Bzip2 Or Gzip?
Feb 18, 2010
I have a backup script that is basically this:
Code:
BACKUP_DIRS="/etc /boot /root /home"
BACKUP_FOLDER="/tmp/system_backup/
for DIR in ${BACKUP_DIRS}
do
[code]....
All the folders get dumped into separate gzip files. Now I want to combine all the gzip files in the backup folder into one final gzip or bzip2 file. My goal is to end up with a single file instead of many, so I can scp or ftp that one file to another file share; sending one file is easier than sending a bunch of files.
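For the final bundling step, one approach (a sketch, assuming the per-directory archives already sit in /tmp/system_backup) is to wrap the whole backup folder in a single tarball; since the members are already gzipped, further compression buys little:
Code:
# bundle the already-compressed archives into one file for transfer
tar -cf /tmp/system_backup.tar -C /tmp system_backup
# or, if one compressed container is preferred anyway:
tar -cjf /tmp/system_backup.tar.bz2 -C /tmp system_backup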
View 2 Replies
Jun 22, 2011
I sometimes get confused by the varying command line options I need to run common Unix archiving and compression software (e.g. gzip, bzip2, zip, tar).
Is there a program out there that can just Do What I Mean for common cases? For example:
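For extraction at least, GNU tar can already do much of this guessing itself; a sketch of what that looks like:
Code:
# GNU tar autodetects the compression when extracting:
tar -xf archive.tar.gz
tar -xf archive.tar.bz2
# and --auto-compress (-a) picks a compressor from the target name when creating:
tar -caf archive.tar.xz somedir/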
View 2 Replies
View Related
Jun 22, 2010
I have been using SUSE for a long time, but I have not yet experienced this strange error:
I installed 11.2 and now 11.3 RC1 on new hardware: ASUS M4N72-E with AMD Phenom II X4 965, 8GB memory, standard settings.
I experienced unusual errors when copying large files to the new system with scp/ssh (connection broken).
Even stranger, I could not read .tgz files created on the older machine, same OS 11.2, but MB ASUS M2N32-SLI with AMD Phenom X4 9850.
Testing further reveals that if I create a .gz file from the same source, it produces a different output every time:
-rw-r--r-- 1 root root 3875041280 Jun 21 01:51 wwtest.tar
-rw-r--r-- 1 root root 1326141906 Jun 21 01:55 wwtest1.tar.gz
-rw-r--r-- 1 root root 1326137319 Jun 21 03:26 wwtest2.tar.gz
-rw-r--r-- 1 root root 1326146273 Jun 22 07:32 wwtest3.tar.gz
[Code].....
I have never seen anything like it; I guess it must be something with the underlying library not working on the new processor. I tried slower memory speed, the newest BIOS, and so on; always the same errors. Otherwise the system works absolutely stably.
When I compare the generated files with one created on the older system, it looks like single bits are missing in a random number of bytes.
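To pin down the corruption pattern, one quick check (a sketch; the filenames follow the listing above) is to compare two of the runs byte by byte:
Code:
# list the first differing bytes between two supposedly identical archives
cmp -l wwtest1.tar.gz wwtest2.tar.gz | head
# single-bit flips at scattered offsets typically implicate RAM rather than gzip,
# so a memtest pass may be worth considering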
View 2 Replies
View Related
Mar 29, 2011
I have 100 files: cvd1.txt cvd2.txt ... cvd100.txt
How do I gzip 100 files into one .gz file so that after I gunzip it, I get cvd1.txt, cvd2.txt ... cvd100.txt back as separate files?
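gzip itself only compresses a single stream, so the usual answer is to let tar do the bundling; a minimal sketch:
Code:
# pack all 100 files into one compressed archive
tar -czf cvd.tar.gz cvd*.txt
# unpack: each cvd*.txt comes back as a separate file
tar -xzf cvd.tar.gz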
View 4 Replies
View Related
Apr 20, 2011
So this is really the result of another problem. There seems to be an issue with the CPU spiking to 99% forever (until reboot) if I run apt-get, Synaptic, or Update Manager while an external USB drive is plugged in. Note: other USB peripherals are no problem, just an external HD.
So my workaround was to eject the drive when doing apt-get or other installation work, then reattach it to remount. Now, on to the present problem. I'm using the basic backup script (the rotating one) found in the Ubuntu Server manual. It uses tar and gzip to store a compressed version of the desired directories on my external USB (which sits in a fireproof safe; this is for a business).
However, tar and gzip, which run nightly six days a week via cron as root, don't ever want to die, and they don't release the drive. I have to reboot the system (I can't log off) to release the drive and unplug it; then I can do update/install work.
Of course, if apt etc. worked fine without conflicts with the external device, I wouldn't care about the tar/gzip problem, other than that it generally isn't a proper way for them to function and it chews up some CPU cycles (they run at about 0.6 and 1.7 percent respectively). I also can't kill them via kill or killall. They seem undead.
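Processes that ignore kill are usually stuck in uninterruptible sleep (state D), waiting on I/O the USB drive will never complete; a quick way to check (a sketch):
Code:
# show processes in uninterruptible sleep; these ignore SIGKILL until the I/O completes
ps -eo pid,stat,cmd | awk '$2 ~ /D/'
# the kernel log often shows what the driver is stuck on
dmesg | tail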
View 6 Replies
View Related
Nov 2, 2009
I work for a school consulting company. We helped a school deploy about 1500 computers. The computers have Windows XP, but we have been using G4L for the restore partition on the drives. So far the software works great. We did, however, run into a problem in that many of the computers we deployed are missing the restore partition. The reason they are missing is long and convoluted and not really that important. What I have been charged to do is try to fix the restore partition problem. One solution I had, which I'm not even sure will work, was to back up the recovery file that G4L created to DVD and write a basic script to recreate the partition and then copy the file over. This process would need to be as automated as possible, since this disc will be inserted by the end users (the students). The backup file that G4L created is 5.9GB, so it won't fit on just one disc, and dual-layer discs are too expensive to use for this project, so the file will either need to be compressed again (not sure if that's a good idea or not) or split across two DVDs.
I have searched the forums here and was not able to find anything to fix this problem. I was able to find some info on splitting files across two discs, but I'm not sure how to use that to fix my problem.
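For the splitting part, the common approach is split/cat; a sketch, with the image filename as an assumption:
Code:
# cut the 5.9GB image into pieces that each fit on a single-layer DVD
split -b 3G recovery.img recovery.img.part_
# burn recovery.img.part_aa and recovery.img.part_ab to separate discs, then rejoin:
cat recovery.img.part_aa recovery.img.part_ab > recovery.img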
View 5 Replies
View Related
Jul 3, 2010
I am looking to use dd to compress an entire disk (sda) to an image file on sdb. A friend told me to mount sdb partition 1, then, as root, type the following command:
Code:
dd if=/dev/sda | bzip2 -9 >/media/sdb1/disk_image.img.bz2
This did not work. I just get an error message:
Code:
Invalid command line: Not enough files given. Aborting...
I was also told that, to restore (assuming sda has the image and sdb is the destination), I could do the following:
Code:
bzip2 -cd /media/sda1/disk_image.img.bz2 | dd of=/dev/sdb
Of course, since the first part didn't work, I couldn't test the second. How do I use dd to image the entire drive byte for byte and compress it (it's mostly empty space)?
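For what it's worth, the pipeline quoted above is valid dd and bzip2 syntax (the "Not enough files given" message doesn't look like output from either tool), and since the disk is mostly empty space, compression improves dramatically if the free space is zeroed first; a sketch, with the mount points as assumptions:
Code:
# zero the free space on sda's filesystem so it compresses to almost nothing
# (assumes the filesystem on sda is mounted at /mnt/sda1)
dd if=/dev/zero of=/mnt/sda1/zerofill bs=1M; rm /mnt/sda1/zerofill
# then image the whole drive and compress it
dd if=/dev/sda bs=1M | bzip2 -9 > /media/sdb1/disk_image.img.bz2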
View 8 Replies
View Related
Apr 12, 2011
Given a gzip compressed file, how do I know what compression level (1-9) was used for it?
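The exact level isn't recorded in the file; the gzip header only carries a hint in its XFL byte (per RFC 1952: 2 for maximum compression, 4 for fastest, 0 otherwise). A sketch for peeking at it:
Code:
# dump the first 10 header bytes; byte 9 (offset 8) is XFL:
#   02 = -9 (maximum), 04 = -1 (fastest), 00 = an intermediate level
od -An -tx1 -N10 file.gz
# `file` reports the same hint in words, e.g. "max compression"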
View 2 Replies
View Related
Apr 5, 2010
I want to be able to write a shell script that downloads files (only those with the *.tar extension) from multiple folders (the subfolder names may vary) on an FTP site, untars them, gzips them, and then moves them to the real folder.
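A sketch of one way to do this; the host, credentials, and destination path are all assumptions:
Code:
#!/bin/bash
# recursively fetch only *.tar files from the FTP site (hypothetical host/path)
wget -r -nd -A '*.tar' 'ftp://user:pass@ftp.example.com/pub/'
for f in *.tar; do
    tar -xf "$f"                        # unpack the contents
    gzip -9 "$f"                        # compress the tarball itself
    mv "$f.gz" /path/to/real/folder/    # hypothetical destination
done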
View 4 Replies
View Related
Feb 21, 2011
I have an existing gzipped file and I want to use the gzip command to append some files to it. I tried man gzip but can't find the keyword "append". Can anyone advise how I can do it?
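Two things worth knowing here (a sketch): gzip streams can simply be concatenated, but they decompress to one combined stream rather than separate members; to add members to an archive, tar has to do it.
Code:
# legal, but gunzip will produce one concatenated output:
gzip -c newfile >> existing.gz
# to append a member to a compressed tar archive, round-trip the compression:
gunzip archive.tar.gz
tar -rf archive.tar newfile
gzip archive.tar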
View 1 Replies
View Related
Mar 11, 2010
Does tcpdump have the capability to decode gzipped traffic? I have been beating my head against the wall with this issue. What I'm trying to do is capture traffic between a web server and clients, and I've got it set up so that it's redirected to a file for ease of grepping; however, it's seemingly incapable of decoding gzip encoding. I know I can do this with tshark; I'm curious as to whether tcpdump has this capability (i.e. only using tcpdump, and not some additional tool like tcpshow or whatnot).
I can't find much on this issue in the man page for tcpdump, but it is fairly lengthy, so it's possible that I missed something, though I don't see that as especially likely.
View 2 Replies
View Related
Jun 4, 2011
I have a script which periodically backs up a directory using the command "tar -czvf [name] [directory]", but my problem is that the script has recently been putting a lot of stress on the server (Minecraft SMP) and tends to lag players as it backs up, which recently has been taking nearly 5 minutes. So I need to know if there's a way to control the gzip compression level at the same time that it archives and backs up the files. I understand that I could first tar the files and then gzip them separately with a different compression level afterwards, but this would not work, because the script names the files with the current server time, which sometimes changes between commands.
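A couple of ways to pick the level in a single command (a sketch; the path and filename scheme are assumptions):
Code:
# pipe tar through gzip explicitly and choose the level there:
tar -cvf - /srv/minecraft | gzip -1 > "backup-$(date +%F-%H%M).tar.gz"
# classic gzip also honors a GZIP environment variable when tar invokes it:
GZIP=-1 tar -czvf "backup-$(date +%F-%H%M).tar.gz" /srv/minecraft
# nice/ionice can cut the lag further:
nice -n 19 ionice -c3 tar -cvf - /srv/minecraft | gzip -1 > "backup-$(date +%F-%H%M).tar.gz"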
View 1 Replies
View Related
Apr 7, 2010
I need to find the tcsh shell and gzip version numbers by running a script on several boxes through ssh. How can I do that? I made a script for tcsh, but it is not working over ssh; it only works on my own box. I also don't know where to find the gzip version info.
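A minimal sketch (the hostnames are placeholders):
Code:
#!/bin/bash
for host in box1 box2 box3; do
    echo "== $host =="
    ssh "$host" 'tcsh --version; gzip --version | head -1'
done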
View 5 Replies
View Related
Jun 28, 2011
I am currently working on managing multiple Linux servers in remote locations; the servers are used for web hosting. I need to back up data to a backup server, but rsync, which I am currently using, doesn't help. Is there any tool to back up every server without modifying it? Because there are hundreds of servers, installing a tool on every server is a time-consuming process.
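One option that needs nothing on the remote side beyond ssh and tar, which virtually every server already has (a sketch; host and paths are assumptions):
Code:
# stream a compressed tarball from the remote server straight to the backup host
ssh user@webserver1 'tar -czf - /var/www /etc' > webserver1-$(date +%F).tar.gz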
View 7 Replies
View Related
Jan 19, 2011
I need to back up about 25GB of stuff onto 6 disks. I was going to use Deja Dup, but it doesn't seem to have that feature. Does anybody know of a program (with a GUI) that can do this for me?
View 9 Replies
View Related
Dec 13, 2010
I made a bzip2 file with:
bzip2 -c /home/os/picture1 > /home/os/Desktop/pic.image
bzip2 -c /home/os/picture2 >> /home/os/Desktop/pic.image
But now extracting pic.image with bzip2 -d /home/os/Desktop/pic.image returns:
bzip2: Can't guess original name for pic.image -- using pic.image.out
And then it just creates one file pic.image.out.
How do I access picture1 and picture2 from pic.image?
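bzip2, like gzip, only compresses a single stream; concatenated streams decompress into one combined output, so the two pictures can't be separated afterwards. The usual fix is to let tar do the bundling (a sketch):
Code:
# bundle and compress both pictures in one archive
tar -cjf /home/os/Desktop/pic.tar.bz2 -C /home/os picture1 picture2
# extract them back as separate files
tar -xjf /home/os/Desktop/pic.tar.bz2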
View 2 Replies
View Related
Oct 23, 2010
I am trying to upload an IOS image to the Cisco NAC Appliance. The IOS version has to be updated to 4.8. I get the below error when I try: "File is not in gzip format. Child return status 1. Error exit delayed from previous errors." I am using the below command to unzip the file: tar xzvf ccca_upgrade-4.8.0-from-4.6.x.tar.gz
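"File is not in gzip format" usually means the download is corrupt or the file isn't actually gzipped despite its name; two quick checks (a sketch):
Code:
# see what the file really is
file ccca_upgrade-4.8.0-from-4.6.x.tar.gz
# compare the checksum against the one published for the download, if available
md5sum ccca_upgrade-4.8.0-from-4.6.x.tar.gz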
View 3 Replies
View Related
Jan 18, 2010
I have gzip files ABC_000023232.gzip and BCD_023232032.gzip. I want to split these files into smaller files but keep the extension the same, because I am using it as a variable in a script:
Code:
for i in *.gzip
do
    split -b 500K "$i" "$i"
done
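As written, split appends aa, ab, ... after the prefix, so the .gzip extension ends up in the middle of the new names. GNU split can re-add the suffix; a sketch, assuming a reasonably recent coreutils for --additional-suffix:
Code:
for i in *.gzip
do
    # pieces come out as, e.g., ABC_000023232_part_aa.gzip
    split -b 500K --additional-suffix=.gzip "$i" "${i%.gzip}_part_"
done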
View 3 Replies
View Related
Jun 5, 2010
I have a large collection of pictures (12GB and growing), way too big to fit on one CD or DVD. I want to back them up to CDs or DVDs in standard (I think it's ISO 9660) format that Windows can read. I know how to do this the hard way: by manually selecting a pile of pictures that will fit on one disc, burning it, and then going on to the next pile. There must be a way to tell K3b or a similar program to do this for me, to automatically make a backup of the whole thing using as many discs as necessary. Can anyone tell me how to do this?
I don't want to use tar or another archive/compression scheme because I want the pictures accessible to someone with minimal technical expertise who doesn't even know how to spell "Linux".
View 3 Replies
View Related
Sep 15, 2009
I am trying to create a backup script that will back up a single folder for a class I am in. I was wondering if I could get some help. If possible, I would also like to know how to write a script that can encrypt that same file. I will be putting the backup in my /home/usr/Backup directory. I am not trying to back up my whole system, just a single folder. I am using Fedora 11.
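A minimal sketch of both steps; the source folder is an assumption, and the encryption uses gpg's symmetric mode (it prompts for a passphrase):
Code:
#!/bin/bash
# back up one folder and encrypt the archive (hypothetical paths)
SRC="$HOME/classwork"
DEST="$HOME/Backup"
mkdir -p "$DEST"
tar -czf - "$SRC" | gpg -c -o "$DEST/backup-$(date +%F).tar.gz.gpg"
# decrypt/restore later with:
#   gpg -d "$DEST/backup-YYYY-MM-DD.tar.gz.gpg" | tar -xzf -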
View 2 Replies
View Related
May 21, 2011
I'm just setting up a partition on a separate HDD in my system. I plan to use the partition to back up the important files on my main HDD (to guard against an HD crash).
The question I have is about where the typical location to auto-mount this partition would be. Which of these would be normal to go for? (A sample fstab entry follows the list.)
1. /backup/
2. /media/backup/
3. /mnt/backup/
4. /home/chris/backup/
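Whichever mount point is chosen (/mnt/backup is the traditional spot for a fixed internal disk), the auto-mount itself is a one-line fstab entry; a sketch, assuming the partition is /dev/sdb1 and formatted ext4:
Code:
# /etc/fstab
/dev/sdb1   /mnt/backup   ext4   defaults   0   2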
View 7 Replies
View Related
Oct 31, 2010
How do I install a bzip2 file?
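Assuming this refers to software distributed as a .tar.bz2 source archive, the usual routine looks like this sketch (the package name and autotools build are assumptions):
Code:
tar -xjf package.tar.bz2    # unpack the bzip2-compressed tarball
cd package/
./configure && make         # typical autotools build
sudo make install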
View 3 Replies
View Related
Aug 25, 2009
I have a problem installing bzip2: when I run make install, something like this appears:
This is the first time I have installed anything since installing Ubuntu 9.04.
View 2 Replies
View Related
Feb 26, 2010
I have been having a recurring problem backing up my filesystem with tar, using bzip2 compression. Once the file reached a size of 4GB, an error message appeared saying that the file was too large (I closed the terminal, so I do not have the exact message; is there a way to retrieve it?). I was under the impression that bzip2 can support pretty much any size of file. It's rather strange: I have backed up files of about 4.5GB before without trouble.
At the same time, I have had this problem before, and it's definitely not a memory problem: I am backing up onto a 100GB external hard drive.
That reminds me, in fact (I hadn't thought of this), that one time I tried to move an archived backup of about 4.5GB to an external drive (it may have been the same one) and it said that the file was too large. Could it be that there is a maximum size of file I can transfer to the external in one go? Before I forget, I have Ubuntu Karmic and my bzip2 version is 1.0.5 (and tar 1.22, though maybe this is superfluous information).
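A hard 4GB ceiling is the classic symptom of a FAT32-formatted destination (FAT32 caps individual files at 4 GiB) rather than any limit in bzip2 or tar; a quick check, with the mount point assumed:
Code:
# if Type shows vfat, the drive is FAT32 and refuses files over 4 GiB
df -T /media/external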
View 4 Replies
View Related
Sep 13, 2010
Can someone give me a sample of a crontab for backing up a directory, please? The system is Ubuntu 9.04.
#!/bin/bash
# this file is an automated backup script, backup.sh.
# this backs up my domain site.
[code]....
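The crontab side is a single line; a sketch, assuming the script above is saved as /home/user/backup.sh and made executable:
Code:
# added via `crontab -e`: run the backup every night at 02:30
30 2 * * * /home/user/backup.sh >> /home/user/backup.log 2>&1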
View 7 Replies
View Related
Sep 8, 2010
I have used this guide to install a kernel on my Linux box: cyberciti.biz/tips/compiling-linux-kernel-26.html. Stage 1 worked perfectly. Now at stage 2, when I run ls in the tmp directory, I see the file linux-2.6.26.tar.bz2 (2.6.26 is the version of my kernel). And now if I run tar on the file like this: Code: tar -xjvf linux-2.6.26.tar.bz2 -C /usr/src it gives me a couple of errors like:
tar: bzip2: cannot exec: No such file or directory
tar: Error is not recoverable: exiting now
tar: Child returned status 2
tar: Error exit delayed from previous errors
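"bzip2: cannot exec" means tar is trying to run the external bzip2 program and it isn't installed; installing it should fix the extraction (a sketch; the install command assumes a Debian/Ubuntu-style system):
Code:
sudo apt-get install bzip2
tar -xjvf linux-2.6.26.tar.bz2 -C /usr/src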
View 2 Replies
View Related
Aug 18, 2009
-bash-3.2# tar -xzvf lzo-2.03.tar.gz
gzip: stdout: Cannot allocate memory
lzo-2.03/
lzo-2.03/src/
[code]....
This is on a VPS with 256MB memory. vmstat and /proc/meminfo both show that over 200MB is free.
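On container-style VPSes (OpenVZ/Virtuozzo), the host enforces memory limits that /proc/meminfo doesn't reflect, which can make gzip fail despite apparently free memory; a check worth trying, assuming such a VPS:
Code:
# non-zero failcnt values show which bean-counter limit is being hit
cat /proc/user_beancounters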
View 1 Replies
View Related
Mar 28, 2011
When I open my dec_backup folder, the following error appears:
gzip: stdin: input/output error
/bin/gtar: Unexpected EOF in archive
/bin/gtar: Error is not recoverable: exiting now
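An input/output error at this layer usually means the disk itself failed a read, rather than anything wrong with gzip; the kernel log should show it (a sketch):
Code:
# look for I/O or ATA errors around the time of the failure
dmesg | tail -50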
View 1 Replies
View Related
May 29, 2009
I am trying to use a shared bzip2 library in a program I'm writing. For the life of me, I can't figure out the correct gcc switch to link it in. I've tried pkg-config, but I don't know the name of the .pc file. Is there any rule for these things?
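For libbzip2 specifically, the library is libbz2 and it historically ships without a pkg-config file, so the switch is just -lbz2 (a sketch):
Code:
# link against the shared bzip2 library
gcc -o myprog myprog.c -lbz2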
View 3 Replies
View Related
May 4, 2011
I am going crazy with a gzip file. I can decompress the file in Windows using WinRAR, but it is impossible on any UNIX operating system; the file seems to be OK. If I do: file the_name_of_the_file.gz
I get: the_name_of_the_file.gz: gzip compressed data, from Unix, last modified: Sun Jan 30 14:10:21 2011
But if I do gunzip -f the_name_of_the_file.gz, I always get: gzip: the_name_of_the_file.gz: unexpected end of file. The same problem happens when I try to extract the file using the GUI tools in Ubuntu or Mac OS X.
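"Unexpected end of file" means the archive is truncated; WinRAR is simply more tolerant and extracts what it can. gzip can do the same salvage if the final error status is ignored (a sketch):
Code:
# verify the damage
gzip -t the_name_of_the_file.gz
# recover everything up to the truncation point, ignoring the trailing error
zcat the_name_of_the_file.gz > recovered_file || true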
View 4 Replies
View Related