Ubuntu :: Bzip2 Compressed File Too Large

Feb 26, 2010

I have been having a recurring problem backing up my filesystem with tar, using bzip2 compression. Once the file reached a size of 4 GB, an error message appeared saying that the file was too large (I closed the terminal so I do not have the exact message. Is there a way to retrieve it?). I was under the impression that bzip2 can support pretty much any file size. It's rather strange: I have backed up files of about 4.5 GB before without trouble.

At the same time, I have had this problem before, and it's definitely not a question of space: I am backing up onto a 100 GB external hard drive.

That reminds me, in fact (I hadn't thought of this), that one time I tried to move an archived backup of about 4.5 GB to an external drive (it may have been the same one) and it said that the file was too large. Could it be that there is a maximum size of file I can transfer to the external drive in one go? Before I forget: I have Ubuntu Karmic, my bzip2 version is 1.0.5, and my tar is 1.22 (though maybe that is superfluous information).
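A guess worth checking is whether the external drive is formatted FAT32, which caps individual files at 4 GiB no matter what tar and bzip2 can handle; failures right around the 4 GB mark would fit that. A sketch, with a placeholder mount point and backup path:

Code:
# check the filesystem type of the external drive (mount point is an example)
df -T /media/external

# if it reports vfat, one workaround is to split the archive into chunks below 4 GiB
tar -cjf - /path/to/backup | split -b 3900m - backup.tar.bz2.part-

# to restore, stitch the parts back together and extract
cat backup.tar.bz2.part-* | tar -xjf -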

View 4 Replies



Ubuntu :: How To Install A Bzip2 File

Oct 31, 2010

How do I install software that comes as a bzip2-compressed file?
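If the file is a source tarball (something.tar.bz2), the usual routine is roughly the following sketch; the file name is hypothetical, and the configure/make steps vary per project, so check its README or INSTALL:

Code:
# unpack the bzip2-compressed tarball (file name is just an example)
tar -xjf something.tar.bz2
cd something

# typical autotools build and install
./configure
make
sudo make install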

View 3 Replies View Related

Ubuntu :: File Extension Z - Unix Compressed Archive File?

May 6, 2010

Does anyone know how to compress a file to the .Z extension? Not tar.gz, zip, or 7zip.
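The .Z extension is produced by the classic Unix compress utility, which on Ubuntu ships in the ncompress package; a quick sketch:

Code:
# install the classic compress/uncompress tools
sudo apt-get install ncompress

# compress a file; this replaces myfile with myfile.Z
compress myfile

# decompress it again (gunzip/zcat can also read .Z files)
uncompress myfile.Z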

View 6 Replies View Related

General :: Unexpected End Of File. Gzip Compressed File?

May 4, 2011

I am going crazy with a gzip file. I can decompress the file in Windows using WinRAR, but it is impossible on any UNIX operating system, even though the file seems to be OK. If I do: file the_name_of_the_file.gz

I get: the_name_of_the_file.gz: gzip compressed data, from Unix, last modified: Sun Jan 30 14:10:21 2011

But if I do gunzip -f the_name_of_the_file.gz I always get: gzip: the_name_of_the_file.gz: unexpected end of file. The same problem happens when I try to extract the file using the GUI tool in Ubuntu or Mac OS X.
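One generic check (not specific to this file) is to test the gzip stream and look at the byte count, since "unexpected end of file" usually means the copy was truncated somewhere along the way:

Code:
# test the compressed data without extracting anything
gzip -t the_name_of_the_file.gz

# compare the size in bytes against the copy that WinRAR reads successfully
ls -l the_name_of_the_file.gz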

View 4 Replies View Related

Ubuntu :: Get A Compressed File From Mysqldump?

Jul 13, 2010

I would like the backup script I am writing to create an SQL dump of my database and send it directly into a tar file. Does anyone know how I could do this with one command?

To be more clear I would like to go from

mysqldump -u xxxx -pXXXXX tablename > currentbackup.sql
tar -czvf backup-XXXXXXXX.tgz currentbackup.sql
rm currentbackup.sql

To a single command somehow. Does anyone know how I could accomplish something like this?
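A common way to collapse this into one step (a sketch with placeholder credentials) is to pipe mysqldump straight into the compressor, which skips the intermediate .sql file entirely:

Code:
# dump and compress in one pipeline; no temporary currentbackup.sql is written
mysqldump -u xxxx -pXXXXX tablename | gzip > "backup-$(date +%Y%m%d).sql.gz"

This produces a .sql.gz rather than a .tgz; tar mainly earns its keep when bundling several files, so for a single dump a plain gzip stream is usually enough.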

View 2 Replies View Related

Ubuntu / Apple :: Backing Up Compressed File To An External Hardrive?

Apr 21, 2010

I am attempting to be careful in case my system crashes, and although that is highly unlikely, my first question is whether there is a way to compress my Linux partitions. After running the diskutil command in OS X's Terminal, I basically end up with this partition scheme:

Quote:
Macintosh HD = 130GB
disk0s3 = 1MB
disk0s4 = 30GB
Linux Swap = 1.3 GB

I am sure there is a way in the Terminal to first compress disk0s3, disk0s4, and Linux Swap, and then output the compressed partitions to my external hard drive. I have already read suggestions that only /home, /etc/fstab, a list of installed packages, /opt, and /var/cache/apt/archives/ (where all downloaded packages are stored) need to be backed up, but please correct me if I'm wrong. Wouldn't it take quite a while to install all those packages again in case of a system failure? Or would it just be easier to untar them into their directories once Linux has been reinstalled? The closest command I have found so far for this is:

Quote:

sudo tar cvf - files | (cd target_directory ; tar xpf -)

The above command is very suitable for what I am looking for because it lets you copy files to another location using tar; in my case the new location would be my external hard drive. My external hard drive already has its own Linux partition, which I am able to mount in Linux and which Linux sees as free space.
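For imaging whole partitions rather than copying files, a hedged sketch is to pipe dd through gzip onto the external drive; the device name below comes from the diskutil listing (under Linux it would be /dev/sdaN instead), the mount point is an example, and the partition must not be mounted while it is being imaged:

Code:
# image one partition and compress it on the fly onto the external drive
sudo dd if=/dev/disk0s4 bs=64k | gzip -c > /media/external/disk0s4.img.gz

# restore later by reversing the pipe
gunzip -c /media/external/disk0s4.img.gz | sudo dd of=/dev/disk0s4 bs=64k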

View 7 Replies View Related

General :: Access Compressed File Without An Intermediate?

Jul 19, 2010

I'm trying to figure out how to access compressed files without uncompressing them beforehand, and also without modifying the application/script I am using. Named pipes do the trick, but only seem to work once.

In one terminal I do this:

Code:
$ echo "This is a file I'd like to be able to read." >> my_file
$ gzip my_file
$ mkfifo my_named_pipe
$ ls
my_file.gz my_named_pipe
$ gunzip -c my_file.gz >> my_named_pipe

[Code]...
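One alternative sketch, assuming the application only needs a readable path rather than a seekable file: bash process substitution creates and feeds a temporary FIFO per invocation, or the existing named pipe can simply be refilled in a loop (some_application is a stand-in for whatever reads the data):

Code:
# option 1: a fresh FIFO is created and fed for each run
some_application <(gunzip -c my_file.gz)

# option 2: keep refilling the named pipe so every reader gets the data again
while true; do gunzip -c my_file.gz > my_named_pipe; done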

View 3 Replies View Related

General :: Need Writable Compressed File System

Mar 29, 2010

Can anyone recommend a file system similar to SquashFS but writable?

View 2 Replies View Related

Ubuntu :: Gnome Commander Does Not Add Files When Creating A New Archive/compressed File?

Feb 19, 2010

When trying to create a new compressed/archive file in Gnome Commander (GM) the file is created but the selected files are not added. I can open the new (empty) archive file and then add files to be compressed. I have tried using several different formats (zip, tar.bz and others) with the same results. The "file roller" is shown as a plugin but has no configuration other than the compressed file type.

View 2 Replies View Related

General :: Windows - Creating Compressed Iso Image File?

May 17, 2011

I want to create a compressed ISO image file, mount that file as a virtual drive, and access the content (read-only) without worrying about manual decompression/extraction. This is for both Windows and Linux (Ubuntu).
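On the Linux side one possibility (a sketch only; it does not cover the Windows half) is SquashFS, which gives a compressed read-only image that can be loop-mounted without extraction:

Code:
# build a compressed, read-only image from a directory tree
mksquashfs /path/to/content content.sqsh

# mount it and browse the contents directly
sudo mkdir -p /mnt/content
sudo mount -o loop -t squashfs content.sqsh /mnt/content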

View 1 Replies View Related

General :: Extracting A Bzip2 File Throws "Can't Guess Original Name" And Does Not Extract Separate Files

Dec 13, 2010

I made a bzip2 file by

bzip2 -c /home/os/picture1 > /home/os/Desktop/pic.image

bzip2 -c /home/os/picture2 >> /home/os/Desktop/pic.image

But now extracting pic.image by bzip2 -d /home/os/Desktop/pic.image returns

bzip2: Can't guess original name for pic.image -- using pic.image.out

And then it just creates one file pic.image.out.

How do I access picture1 and picture2 from pic.image?
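bzip2 is only a compressor, not an archiver: appending a second bzip2 stream to the same file just makes decompression return the two data streams glued together, which is why a single pic.image.out comes back. A sketch of the usual approach is to let tar do the bundling and bzip2 the compression:

Code:
# bundle both pictures into one bzip2-compressed tar archive
tar -cjf /home/os/Desktop/pic.tar.bz2 /home/os/picture1 /home/os/picture2

# list the contents, then extract them back out as separate files
tar -tjf /home/os/Desktop/pic.tar.bz2
tar -xjf /home/os/Desktop/pic.tar.bz2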

View 2 Replies View Related

General :: Make Sure The Compressed File Wont Be Larger Than 300mb?

Jul 14, 2011

I am using the following command to back up an SQL file:

tar -zcvf "$BACKUP_DST/$FILE_NAME.tgz" "$BACKUP_DST/$FILE_NAME.sql"


I want to make sure the compressed file won't be larger than 300 MB; if it exceeds 300 MB, it should be split into several files.
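One hedged sketch is to pipe tar's output through split so no single piece exceeds 300 MB, then stitch the pieces back together before extracting:

Code:
# compress and split into 300 MB chunks named *.tgz.part-aa, *.tgz.part-ab, ...
tar -zcvf - "$BACKUP_DST/$FILE_NAME.sql" | split -b 300m - "$BACKUP_DST/$FILE_NAME.tgz.part-"

# to restore, concatenate the parts and untar
cat "$BACKUP_DST/$FILE_NAME.tgz.part-"* | tar -zxvf -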

View 1 Replies View Related

General :: Security - Safely Zero Fill A File In A Compressed Filesystem?

Aug 29, 2011

I have read that shred doesn't work safely on compressed filesystems when shredding a file. How can this be accomplished on a compressed FS?

View 1 Replies View Related

OpenSUSE Install :: Mount A Live Compressed File System For Reading & Writing From A LiveCD/DVD Image?

Jul 9, 2011

On a Linux CD/DVD there are compressed filesystem images for the live version (KDE or Gnome, for example). They have no extension, but they are clearly image files: the compressed filesystems used by the live version before installation.

I was wondering how to mount these compressed filesystem images after I copy the ISO contents of the CD/DVD onto my system. I want to edit some files or packages and make some changes, for example to customize a live version of Gnome. (I know you might be tempted to tell me to use KIWI etc. for customization.) But I want to be able to mount the compressed filesystem image and open it for reading and writing while it sits in a subdirectory of its own. Is there a way to do this? These files have no extension.

Suppose I can open this compressed filesystem image and edit it for read and write before rolling it back again: what should I watch out for if and when I succeed? Will the same compressed filesystem image, slightly modified, still work?

PS: The same question could be rephrased or extended as: how do I use the unionfs/squashfs tools on the command line to mount these extension-less image files in read/write mode?
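A sketch of the usual round trip with the squashfs tools: unpack the image, edit the tree, and repack it. The path below is how Ubuntu lays out its live image and is only an example; openSUSE live media keep theirs elsewhere, so locate the file first:

Code:
# confirm what the extension-less file actually is
file livecd/casper/filesystem.squashfs

# unpack it into an editable tree (lands in ./squashfs-root by default)
sudo unsquashfs livecd/casper/filesystem.squashfs

# ... edit files or packages inside squashfs-root ...

# rebuild the compressed image and copy it back into the ISO tree
sudo mksquashfs squashfs-root filesystem.squashfs.new -noappend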

View 9 Replies View Related

General :: View A Particular Ten Lines In A Large File Where Can't Open The File In Vi

May 12, 2010

I am using RHEL 5. I have a very large test file which cannot be opened in vi. The file has some 8000 lines, and I need to view the lines between 5680 and 5690. How can I view these particular lines in a large file? What command and options do I need to use?
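Two standard ways to print just that range without opening the file in an editor (the file name is a placeholder):

Code:
# print lines 5680 through 5690, then stop reading the file
sed -n '5680,5690p;5690q' big_test_file

# the same range with head and tail
head -n 5690 big_test_file | tail -n 11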

View 1 Replies View Related

General :: Slackware LiveCD "Intinte" - Can't Mount Compressed SquashFS File System

Mar 7, 2010

I have a problem: I'm trying to make my own LiveCD, but I can't mount the compressed SquashFS file system. Here is my limited LiveCD version, if somebody would take a look: [URL]

View 5 Replies View Related

Ubuntu :: "Error Splicing File: File Too Large" - Copy To A 500G USB Drive?

Sep 10, 2010

I have seen this three times now. It's an updated Lucid system with ext4, trying to copy to a 500 GB USB drive.

View 3 Replies View Related

Ubuntu :: Large .tar.gz File That Trying To Extract?

Jan 4, 2011

I've got a large .tar.gz file that I am trying to extract. I have had a look around at the problem, and it seems other people have had it too, but I've tried their solutions and they haven't worked. The command I am using is:

Code:
tar zxvf file.tar.gz
and the error is:

[code]...
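Whatever the exact error turns out to be, a generic first step is to check that the archive itself is intact before blaming tar:

Code:
# test the gzip layer without writing anything to disk
gzip -t file.tar.gz

# list the tar contents; a truncated or corrupt download usually fails here too
tar -tzf file.tar.gz > /dev/null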

View 9 Replies View Related

Ubuntu :: Copying A Large File From The Network?

Feb 17, 2010

I am trying to copy a file of about 4.5 GB from a network resource on my Local Area Network. I copy this file through the GNOME copying utilities by first going to Places --> Network and then selecting the Windows share on another computer on my network. I open it and start copying the file to my FAT32 drive with Ctrl+C and Ctrl+V. It copies well up to 4 GB and then it hangs.

After trying it almost half a dozen times I got really annoyed and left it hung and went to bed. Next morning when I checked a message box saying "file too large to write" has appeared.

I am very annoyed. I desperately need that file. It's an ISO image and it is not damaged at all; it copies fine to any Windows system. Also, I have sufficient space on the drive to which I am trying to copy the file.
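Since the destination is FAT32, the hard 4 GiB per-file limit is the likely cause, and it fits the failure right at the 4 GB mark. If reformatting the drive is not an option, one sketch is to split the ISO into FAT32-sized pieces and rejoin them on the other side (file names are examples):

Code:
# split the image into chunks that fit under the FAT32 limit
split -b 2000m big_image.iso big_image.iso.part-

# rejoin on Linux:
cat big_image.iso.part-* > big_image.iso
# or rejoin on Windows:  copy /b big_image.iso.part-aa + big_image.iso.part-ab big_image.iso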

View 8 Replies View Related

Ubuntu :: 699MB 10.04 File Too Large For 700MB CD?

May 2, 2010

I am attempting to burn the ISO for Lucid Lynx final onto a 700MB CD. The ISO file is 699MB, but Windows reports that the size on disk is 733MB and thus CD Burner XP refuses to burn the file, stating that it's too large for a CD.

Why this discrepancy in file sizes? I've noticed it with other files as well; suddenly it's a bit of a problem, as you can see!
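A hedged guess at the discrepancy: the ISO is about 699 MiB (binary megabytes), which is roughly 733 million bytes, while an 80-minute "700 MB" CD holds about 737 million bytes of data, so the image should still fit; the two programs are simply reporting in different units. The arithmetic:

Code:
# 699 binary megabytes expressed in bytes (about 733 decimal MB)
echo $(( 699 * 1024 * 1024 ))     # 732954624

# data capacity of an 80-minute CD: 360000 sectors of 2048 bytes (about 703 MiB)
echo $(( 360000 * 2048 ))         # 737280000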

View 6 Replies View Related

Ubuntu :: 10.04 Hangs During Large File Transfers?

Aug 1, 2010

I recently built a home media server and decided on Ubuntu 10.04. Everything is running well except when I try to transfer my media collection from other PCs where it's backed up to the new machine. Here's my build and various situations:

Intel D945GSEJT w/ Atom N270 CPU
2GB DDR2 SO-DIMM (this board uses laptop chipset)
External 60W AC adapter in lieu of internal PSU
133x CompactFlash -> IDE adapter for OS installation
2(x) Samsung EcoGreen 5400rpm 1.5TB HDDs formatted for Ext4

Situation 1: Transferring 200+GB of files from an old P4-based system over gigabit LAN. Files transferred at 20MBps (megabytes, so there's no confusion). Took all night but the files got there with no problem. I thought the speed was a little slow, but didn't know what to expect from this new, low-power machine.

Situation 2: Transferring ~500GB of videos from a modern gaming rig (i7, 6GB of RAM, running Windows7, etc etc). These files transfer at 70MBps. I was quite impressed with the speed, but after about 30-45 minutes I came back to find that Ubuntu had hung completely.

I try again. Same thing. Ubuntu hangs after a few minutes of transferring at this speed. It seems completely random. I've taken to transferring a few folders at a time (10 GB or so), and so far it has hung once and been fine the other three times. Now, I have my network MTU set from automatic to 9000. Could this cause Ubuntu to hang like this? When I say hang I mean it freezes completely, requiring a reboot. The cursor stops blinking in a text field, the mouse is no longer responsive, etc.
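Jumbo frames are at least worth ruling out, since an MTU of 9000 only works if every device in the path handles it; a quick test (the interface name eth0 is an assumption) is to drop back to the default and repeat the transfer:

Code:
# temporarily revert the interface to the standard MTU
sudo ifconfig eth0 mtu 1500

# confirm the change
ifconfig eth0 | grep -i mtu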

View 4 Replies View Related

Ubuntu Servers :: Large File Transfer On LAN?

Nov 11, 2010

I'm trying to create an Ubuntu Server file server that will handle large file transfers (up to 50 GB) from the LAN with Windows clients. We've been using a Windows server on our LAN, but the file transfers will occasionally fail, though that server is used for other services as well.

The files will be up to 50 GB. My thought is to create a VLAN (or a separate physical switch) to ensure maximum bandwidth. The Ubuntu server will be 64-bit with 4 TB of storage in a RAID 5 configuration.

View 2 Replies View Related

General :: Using DD Command With Bzip2?

Jul 3, 2010

I am looking to use dd to compress an entire disk (sda) to an image file on sdb. A friend told me to mount partition 1 of sdb and then, as root, type the following command:

Code:

dd if=/dev/sda | bzip2 -9 >/media/sdb1/disk_image.img.bz2

This did not work. I just get an error message:

Code:

Invalid command line: Not enough files given. Aborting...

I was also told that to restore (assuming sda now holds the image and sdb is the destination), I could do the following:

Code:

bzip2 -cd /media/sda1/disk_image.img.bz2 | dd of=/dev/sdb

Of course, since the first part didn't work, I couldn't test the second. How do I use dd to make a byte-for-byte image of the entire drive and compress it (it's mostly empty space)?
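The pipeline syntax itself looks right for GNU dd, and "Invalid command line: Not enough files given" does not look like a dd or bzip2 message, so it may have come from some other tool or an alias. For reference, a hedged sketch of the imaging and restore commands, with sdb1 mounted at /media/sdb1 and the device names double-checked beforehand:

Code:
# image the whole disk and compress it on the fly (run from a live CD so sda is not in use)
sudo dd if=/dev/sda bs=64k | bzip2 -9 > /media/sdb1/disk_image.img.bz2

# restore the image onto the destination disk
bzip2 -dc /media/sda1/disk_image.img.bz2 | sudo dd of=/dev/sdb bs=64k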

View 8 Replies View Related

Ubuntu :: Error "File Too Large" Copying 7.3gb File To USB Stick

Nov 24, 2010

I am trying to copy a 7.3gb .iso file to an 8gb USB stick and I get the following error when it hits 4.0gb

Error while copying "xxxxxx.iso". There was an error copying the file into /media/6262-FDBB. Error splicing file: File too large. The file is to be used by a Windows user, and I'm just trying to do a simple copy, not a burn to USB or anything fancy. Using 10.04.1 LTS, AMD dual core, all latest patches.
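The /media/6262-FDBB mount point and the failure at exactly 4.0 GB both point to a FAT32 stick, which cannot hold a single file of 4 GiB or more. If the stick only has to carry this one file to a Windows machine, one option (a sketch; reformatting erases everything on it) is to reformat it as NTFS:

Code:
# confirm the filesystem type of the stick
df -T /media/6262-FDBB

# reformat as NTFS -- this ERASES the stick; /dev/sdX1 is a placeholder, check with mount or lsblk
sudo umount /media/6262-FDBB
sudo mkfs.ntfs -Q -L USBSTICK /dev/sdX1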

View 2 Replies View Related

Server :: NFS Large File Copies Fail - Error Writing To File: Input/output Error?

Jun 27, 2009

I recently upgraded my file/media server to Fedora 11. After doing so, I can no longer copy large files to the server. The files begin to transfer, but typically after about 1gb of the file has transferred, the transfer stalls and ultimately fails with the message:

"Error writing to file: Input/output error"

I've run out of ideas as to what could cause this problem. I have tried the following:

1. Different NFS versions: NFS3 and NFS4
2. Tried copying the files to different physical drives on the server.
3. Tried copying the files from different physical drives on the client.
4. Tried different rsize and wsize block sizes when mounting the NFS share
5. Tried copying the files via a different protocol. SSH in this case. The file transfers are always successful when I use SSH.

Regardless of what I do, the result is the same. The file transfers always fail after approximately 1gb.

Some other notes.

1. Both the client and the server are running Fedora 11 kernel 2.6.29.5-191.fc11.x86_64

I am out of ideas. Has anyone else experienced something similar?
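One more thing that is sometimes worth trying in this situation (a sketch only; the server name and export path are placeholders) is mounting with explicit, conservative options and watching the kernel log on both ends when the transfer stalls:

Code:
# mount the export with explicit options for testing
sudo mount -t nfs -o vers=3,proto=tcp,rsize=32768,wsize=32768,hard,intr server:/export /mnt/test

# check for NFS or block-device errors while the copy runs
dmesg | tail -n 20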

View 13 Replies View Related

Ubuntu :: Can't Copy A Large 30gig Image File?

Jan 3, 2010

I have some large image files that are 30 GB and more. I am running Ubuntu 9.10, and whenever I try to copy one of these files to another drive I get an error saying the file is too large. I am copying from an external hard drive, and a slave drive does the same thing. I have a friend who has run into the same issue. This must be a widespread bug.

View 9 Replies View Related

Ubuntu :: File System Format For Mac OSX 10.5 For Large Files?

Sep 19, 2010

Is there a file system that both Mac OS X 10.5 and Linux can read/write for large files (like 4 GB files)? My desktop is Ubuntu and I run most things from there, but I want to back up my MacBook and Linux box on the same external hard drive. It seems there are some (paid) apps for Mac that will mount NTFS, but I'm wondering if there is just a shared file system that will work for both.

View 9 Replies View Related

Ubuntu :: Large File Size With Xsane Save To PDF?

Jun 17, 2011

CanoScan LiDE 210 running under 10.10 on a Tosh Tecra M11-130 laptop. Currently trying out xsane to archive some paperwork in monochrome, as the bundled Simple Scan utility can only save in either colour or greyscale. The problem is that the same A4 page saved as monochrome has a file size about three times larger in Ubuntu than in Windoze.

The scan mode options are 'Colour', 'Greyscale' or 'Lineart'. There is no 'halftone' setting available, as shown in some of the xsane manuals; I don't know whether this is significant to the issue. Xsane's main option window shows 3508 x 2480 x 1 bit for a 300 dpi A4 monochrome scan when 'Lineart' is selected, but the intermediate file size is 8.3 MB instead of just over 1 MB before packing into the PDF. This is consistent with each pixel being recorded not as a 1 or a 0, but as a greyscale 11111111 or 00000000, i.e. monochrome/halftone stored in an eight-bit field. How do I tweak xsane to get true monochrome intermediate .pnm files and saved PDFs?
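As a workaround sketch (assuming the intermediate .pnm really is stored 8 bits per pixel), ImageMagick can force the page to true 1-bit and pack it with fax-style Group4 compression before it goes into a PDF; the file names are examples:

Code:
# convert the scanned page to genuine 1-bit black and white with CCITT Group4 compression
convert scan.pnm -monochrome -compress Group4 scan.pdf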

View 5 Replies View Related

Ubuntu :: Network Shuts Down When Trying To Transfer A Large File?

Jun 6, 2010

When I try to transfer a large file, let's say 700 MB or so, my wireless shuts down and I have to restart my Ubuntu computer. The other computer is Vista. Ubuntu is on a WUSB54G ver. 4 and Vista is going through a WRT54G-TM running DD-WRT Mega.

View 6 Replies View Related

Software :: Error 1 To Install Bzip2 1.0.5

Aug 25, 2009

I have a problem installing bzip2: when I run make install, something like this appears:

This is the first time I have installed something since installing Ubuntu 9.04.
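For reference, the documented way to build and install bzip2 1.0.5 from source is roughly the following sketch (gcc and make come from the build-essential package on Ubuntu, and "Error 1" from make usually just means an earlier compiler error scrolled past):

Code:
# unpack the upstream source tarball
tar -xzf bzip2-1.0.5.tar.gz
cd bzip2-1.0.5

# build, then install under /usr/local (needs root)
make
sudo make install PREFIX=/usr/local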

View 2 Replies View Related






