Software :: Gunzip Truncates Dd.gz File To Zero Size .dd File?
Jun 18, 2010
I am using dd to back up entire system partitions, and now I am trying to restore one. The resulting disk image from my buggy process has zero bytes. D'oh. Gunzip apparently thinks the image has trailing garbage and ignores it; it deletes the original file and replaces it with a zero-byte .dd file. I have the original copy of the image in a .dd.gz file. It's 6.3 GB, so it may still contain the data. How do I get the original image back without destroying it again?
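For what it's worth, a minimal sketch of a non-destructive restore (file names are hypothetical): zcat, equivalent to gunzip -c, streams the decompressed data to a new file and never deletes the .gz source, even when it stops at trailing garbage.

```shell
# Demo setup (hypothetical names): build a small compressed "image"
printf 'disk image data' > image.dd
gzip -c image.dd > image.dd.gz
rm image.dd

# The safe restore: zcat writes to a NEW file and leaves the
# .gz source untouched, even if decompression stops early
zcat image.dd.gz > image.dd

# Sanity-check that the result is non-empty before using it
test -s image.dd && echo "restore looks OK"
```

Only once the restored image checks out should the .gz copy be touched at all.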
View 1 Replies
Jan 15, 2010
I have a number of .tar.gz files that I have to gunzip to get a .tar, then I run tar -xvf on the .tar to get readable files. Is there a way to do all that in one command? I tried to pipe it, being new and all, and I get errors.
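A minimal sketch of the one-command version (archive name is hypothetical): tar's -z flag runs gunzip for you, and the pipe form does the same thing explicitly.

```shell
# Demo setup: build a small .tar.gz to extract
mkdir -p src && echo 'hello' > src/note.txt
tar -czf archive.tar.gz src
rm -r src

# One command: -z gunzips on the fly, -x extracts, -v lists, -f names the file
tar -xzvf archive.tar.gz

# Equivalent pipe form, for tar builds without -z:
# gunzip -c archive.tar.gz | tar -xvf -
```

In the pipe form, the trailing `-` tells tar to read the archive from stdin.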
View 2 Replies
View Related
Jun 14, 2011
Is there a way to gunzip a file that doesn't end in .gz, even though I know for a fact the file is indeed gzip-compressed?
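gunzip itself refuses unknown suffixes, but zcat (equivalent to gzip -dc) ignores the file name entirely. A small sketch with a hypothetical file:

```shell
# Demo: a gzip-compressed file with a non-standard name
echo 'payload' | gzip > data.bin

# zcat / gzip -dc decompress regardless of the suffix
zcat data.bin > data.txt

# Alternative: rename with the expected suffix, then gunzip normally
# mv data.bin data.bin.gz && gunzip data.bin.gz
```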
View 2 Replies
View Related
Jun 10, 2010
Is there software for Linux that can split a big file into smaller files?
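coreutils' split does exactly this; a quick sketch (sizes and names are arbitrary):

```shell
# Demo: make a 3000-byte file, then split it into 1 KB pieces
head -c 3000 /dev/zero > big.bin
split -b 1k big.bin part_

ls part_*                     # part_aa part_ab part_ac

# Reassembling with cat gives back a byte-identical file
cat part_* > rejoined.bin
cmp big.bin rejoined.bin && echo "identical"
```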
View 1 Replies
View Related
Apr 22, 2011
I am curious whether I am doing something wrong extracting pages from a PDF doc using pdftk and creating a new file. I am extracting only the odd pages from the file and outputting them to a new file that is now 20 pages instead of the input's 40, yet the new output file is still 1.4 MB in size, the same as the original.
It seems strange to extract only half the pages of a large document and end up with a result of the same size; presumably pdftk copies the document's shared resources (fonts, images) into the output whether or not the kept pages need them. How can I streamline the resulting PDFs using pdftk?
BTW, this is the command I am using, in case I am missing an option to optimize file size:
Code:
pdftk A=ch15.pdf cat A1-40odd output odd.pdf
View 1 Replies
View Related
Feb 23, 2009
I'm researching symbolic links used with Samba/CIFS. I'd like a user on a MS-Windows OS to be able to see my shared folder on CentOS 5 and the symbolic links inside that folder. Well, it works, but the user sees a file size bigger than the real file: apparently CIFS takes the size of the symbolic link (approx. 32 KB) and adds it to the size of the file.
Example 1: a 100 KB file in the shared folder; the MS-Windows user sees 100 KB.
Example 2: a 100 KB file reached via a symbolic link inside the shared folder; the MS-Windows user sees 132 KB (symlink + size of file).
Is there a way to let the user see only the size of the file, and not file + symbolic link?
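One possibility to experiment with (an assumption on my part, not something verified on CentOS 5): with SMB unix extensions disabled, the server resolves symlinks itself instead of reporting them to the client, which may make Windows see only the target file's size.

```ini
; Hypothetical smb.conf fragment; back up the original before editing
[global]
    unix extensions = no

[shared]
    follow symlinks = yes
    wide links = yes
```

Restarting smbd after the change would be needed for it to take effect.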
View 1 Replies
View Related
Jun 13, 2011
I was testing a per-user limit on file size and added the following to /etc/security/limits.conf:
bob soft fsize 100
This basically should say not to allow bob to create any file greater than 100 KB in size.
But the interesting thing is, if bob already has any file greater than 100 KB, it doesn't even allow him to log into the system, from either the console or SSH, and nothing is logged. How do I configure it so that bob can log in even though he has a file greater than 100 KB (but still can't create files greater than 100 KB)?
View 3 Replies
View Related
Jul 12, 2010
We have some large files with sampling data in them. We don't want to delete these files, but we do want to quickly overwrite them with 0s and/or 1s while preserving the original file size.
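A minimal sketch with dd (file name and size are hypothetical): conv=notrunc keeps dd from truncating the output file first, so the byte count is preserved while the contents become zeros.

```shell
# Demo file: 64 KiB of sample data
head -c 65536 /dev/urandom > samples.dat

# Overwrite with zeros, preserving the exact size
size=$(wc -c < samples.dat)
dd if=/dev/zero of=samples.dat bs="$size" count=1 conv=notrunc 2>/dev/null

# GNU coreutils alternative: a single zero pass with shred
# shred -n 0 -z samples.dat
```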
View 3 Replies
View Related
Jan 19, 2011
Does lvresize with the --resizefs option resize the logical volume and then resize the file system? I mean, do we not need to use resize2fs? I looked at the man pages, but they don't explain this option.
View 3 Replies
View Related
Dec 14, 2010
How can we find the maximum size of the inode table, and what decides it? And how is the maximum volume size of a file system decided?
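As a starting point: on ext2/3 the inode count is fixed when the filesystem is created (mkfs's -N/-i options decide it), and the maximum volume size then follows from the block size and the filesystem's layout limits. df -i shows the per-filesystem inode totals:

```shell
# Inode totals, usage and free counts for the root filesystem
df -i /
```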
View 4 Replies
View Related
Apr 19, 2010
I've got a VNC log file on a barely used server hitting 124 GB.
On one of our main systems it's at 5 GB.
Both are too large, but what could cause such a large log file?
And what can I do to limit it?
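To cap it, a hedged sketch of a logrotate rule (the log path is hypothetical; on a real system the file would go into /etc/logrotate.d/):

```shell
# Write a logrotate config that rotates the log at 100 MB,
# keeping 4 compressed generations
cat > vncserver.logrotate <<'EOF'
/var/log/vncserver.log {
    size 100M
    rotate 4
    compress
    copytruncate
    missingok
}
EOF
```

copytruncate matters here: it lets the running VNC server keep its open file descriptor instead of requiring a restart after each rotation.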
View 1 Replies
View Related
Dec 28, 2010
I want to read a file in C, but I don't know the size of the file. Is there any way to find the size of a file in C?
View 2 Replies
View Related
Mar 15, 2010
At some point my Wine install died. I haven't used it a lot and I update my Fedora 11 regularly, so I'm not sure what broke it. I thought, "OK, just see if there's an updated version." 'yum info wine' says there is an updated version and the file is 27k in size. Tried installing; no joy. Tried erasing Wine and then installing; no joy. Yum says that the x86_64 and the i686 versions are both 27k in size, which I know for sure is wrong. On a semi-non-Fedora note, I tried compiling my own version of Wine. It compiled fine after installing some dependencies and '-devel' files, but it gets the same crash as the Fedora version.
View 10 Replies
View Related
Jan 27, 2010
I have a large number of folders that each contain quite a few files of varying sizes (from a few bytes to 400 KB or so), mostly smaller ones. I need to get the actual size (not the disk usage) of these folders. Is there any way to do this with a command like 'du'?
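GNU du can do this directly: --apparent-size (or -b for exact bytes) sums the files' actual lengths instead of their allocated blocks. A small demo:

```shell
# Demo: a 3-byte file usually occupies a full 4 KB block on disk
mkdir -p demo && printf 'abc' > demo/tiny.txt

du -sh demo                  # disk usage, block-rounded
du -sh --apparent-size demo  # sum of actual file sizes
du -sb demo                  # the same, in exact bytes
```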
View 4 Replies
View Related
Mar 15, 2010
Is there a maximum size of file that soundKonverter can deal with? I can convert a 10.4 MB wma file to mp3, but am unable to convert a 40.2 MB wma to mp3 (it produces only a 0-byte mp3 file). I'm using it under Hardy Heron.
View 1 Replies
View Related
Jun 20, 2010
Is there any command to get file size in MB in Ubuntu?
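A few stock options (the demo file is hypothetical):

```shell
# Demo: a 2 MiB file
head -c 2097152 /dev/zero > demo.bin

ls -lh demo.bin        # human-readable size column (2.0M)
du -h demo.bin         # disk usage, human-readable
stat -c %s demo.bin    # exact bytes; divide by 1048576 for MiB
```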
View 3 Replies
View Related
Jul 7, 2010
I took 272 photos of one book; the total size was 920 MB. I then used gscan2pdf to create a PDF document and reduced the quality there. Result: 116 MB. Then I converted it to grayscale [URL] with:
Code:
gs -sOutputFile=grayscale.pdf -sDEVICE=pdfwrite \
   -sColorConversionStrategy=Gray -dProcessColorModel=/DeviceGray \
   -dCompatibilityLevel=1.4 -dNOPAUSE -dBATCH color.pdf
Result: 108MB
Is it possible to reduce the size any more? Something like Google does with their books?
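One more squeeze to try (a sketch only; the resolution values are assumptions to experiment with, and nothing here reproduces Google's processing): Ghostscript's /screen preset downsamples images aggressively.

```shell
# Demo input: a tiny stand-in PDF (the real input would be the
# grayscale.pdf from the previous step)
printf '%%!PS\nshowpage\n' > page.ps
gs -q -sDEVICE=pdfwrite -o grayscale.pdf page.ps

# Aggressive recompression: /screen preset plus explicit downsampling
gs -sOutputFile=smaller.pdf -sDEVICE=pdfwrite \
   -dPDFSETTINGS=/screen \
   -dDownsampleGrayImages=true -dGrayImageResolution=100 \
   -dCompatibilityLevel=1.4 -dNOPAUSE -dBATCH grayscale.pdf
```

/screen trades visible quality for size; /ebook is the gentler middle ground.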
View 5 Replies
View Related
May 30, 2011
After screwing up an update to Ubuntu 11.04, I decided to do a clean install. I tried downloading the AMD64 DVD image of 11.04, but I found that some of the files cannot be downloaded and appear to have a bad file size. On several mirrors and repositories I found the image size to be only 46.1 MB! (Yes, that's mega-bytes, not gigabytes. I FTP'd to the repository/mirror sites and confirmed this.) Yet many of the HTTP pages show it as 4.0 GB. I can't believe the true size is 46.1 MB, as the i386 DVD image is over 3 GB. 4.0 GB sounds right, but doesn't match the actual file size. So, how long until it gets updated?
View 6 Replies
View Related
Mar 22, 2010
There seems to be a limit of 2 GB on files. I'm trying to add my music files and have about 7 GB. Is there a way to make a file that will allow the extra info?
View 7 Replies
View Related
Sep 7, 2010
I am using FTP to read a file. Before the transfer I need to check the size of the file.
I found the keyword to check the size of a file in C, but I want to know how to use it.
The FTP command to check the size is SIZE:
ftpXfer (IPaddr, FtpUser, FtpPass,
         "", "SIZE %s", RootDir, FileName,   /* %s is replaced by FileName */
         &ctrlSock, &dataSock);

/* the "213 <size-in-bytes>" reply should arrive on the control socket */
char reply[64];
int n = read (ctrlSock, reply, sizeof (reply) - 1);
View 2 Replies
View Related
Feb 27, 2010
Firstly, I did search for this problem on these forums but didn't quite find what I was looking for, so I hope I don't get yelled at for making a duplicate post. I used rsync to back up my webroot to another *nix machine. du -hs gave me 1.3 G on the source machine and 1.1 G on the backup machine. I compared individual files and noticed a trend: the files on the backup machine were always smaller than the files on the source machine. The source uses a SATA drive, the destination IDE. So this time I rsynced locally to another folder on the source machine. Same size anomaly. Then I did a simple cp file ~/file: same size anomaly. So it's not an rsync issue.
I took a file and ran md5sum on both the source file and the destination file. To my surprise, even though the file sizes differed, they had the same md5sum. Now, let it be known that the source machine is a production server and the dir I rsynced was in use, serving pages to the web. I googled this and came up with stuff like open descriptors and holes. I don't understand this stuff and was wondering if this is really the case. What are those, if it is? And is my backup copy 100% identical? There are thousands of files and I ran md5sum on only a couple. Can I take comfort that, when the time comes, I can restore from my backup without any problems?
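The "holes" explanation can be seen in a small demo: a sparse file has a large apparent size (what ls and md5sum see) but little disk usage (what du sees), and a copy may allocate the holes differently while the content stays identical.

```shell
# Create a 1 MiB sparse file: seek past the end, write nothing
dd if=/dev/zero of=sparse.img bs=1 count=0 seek=1M 2>/dev/null

ls -l sparse.img     # apparent size: 1048576 bytes
du -k sparse.img     # allocated blocks: close to 0

# A copy has identical content even if its allocation differs
cp sparse.img copy.img
md5sum sparse.img copy.img
```

If du disagrees between source and copy while md5sum matches, the files differ only in how holes are stored; the matching checksum is the real guarantee that the data is identical.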
View 3 Replies
View Related
Mar 11, 2010
I found one weird behavior. The file I copied from a server is almost double the size it is on the server. How come?
example:
720K ./xxxx vs 360K ./xxxx
(my pc) vs (server)
I have checked both filesystem are ext3
View 2 Replies
View Related
Oct 31, 2010
How do I reduce the size of a file? Is truncate related to this?
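Yes: coreutils' truncate cuts (or extends) a file to an exact length in place, discarding everything past that offset. A quick sketch:

```shell
# Demo: a 10000-byte file
head -c 10000 /dev/zero > data.bin

truncate -s 4096 data.bin    # shrink to exactly 4096 bytes
truncate -s 0 data.bin       # or empty it entirely
```

Shell redirection (`: > file`) also empties a file without deleting it.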
View 12 Replies
View Related
Nov 30, 2010
I have this directory 'pics' with multiple images; its size is 20 MB. I want to make a .zip or .rar package of this directory, but with an increased size, so the .zip/.rar file will be 100 MB, and then when you extract it the file size is the original 20 MB.
View 10 Replies
View Related
Jul 27, 2010
I am getting a segmentation fault with a core dump running a new C program, but the core file size is set to zero ("ulimit -a"), so there is no core file to use with gdb. I tried "ulimit -c unlimited" both as myself and as root, but it doesn't change. Still zero.
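One thing worth checking (a guess, since the environment isn't shown): ulimit is per-process and inherited, so -c unlimited only helps if it is set in the same shell that then launches the program; raising it as root in another terminal changes nothing for your session, and a hard limit or /etc/security/limits.conf can cap the soft limit.

```shell
# Raise the core limit and launch the program from the SAME shell
ulimit -c unlimited
ulimit -c                  # should now report "unlimited"

# ./myprog                 # hypothetical crashing program
# gdb ./myprog core        # inspect the dump it leaves behind
```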
View 4 Replies
View Related
Apr 1, 2011
I have a Deskstar 80GB PATA drive with a 40GB Partition (sdb4) which serves as a store for BackupPC which is backing up 2 machines.
Currently File, Properties in Konqueror tells me that I have 51GB of files on this partition and 1.5GB free!
I have copied all the files (261,000) to a new hard disc (also a Deskstar 80GB) and the file size is still shown as 51 GB. This time, however, the free-space indication seems correct.
Can I assume that there is an error somewhere? Is it likely to prove fatal? Or should I delete everything and start a new set of backups?
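Probably not an error: BackupPC pools identical files by hard-linking them, and Konqueror's totals count every link as a separate file, so the sum can legitimately exceed the partition size. du counts each inode once, as a small demo shows:

```shell
# Two names for the same 8 KB of data
mkdir -p pool && head -c 8192 /dev/zero > pool/a
ln pool/a pool/b             # hard link: same inode, no extra data

stat -c %h pool/a            # link count: 2
du -sk pool                  # roughly 8 KB, not 16
```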
View 3 Replies
View Related
Feb 17, 2010
I cloned an entire disk (with one NTFS partition of 70 GB, only 15 GB used) using ddrescue with the following command...
Code:
ddrescue --no-split /dev/sda /media/usb250/ntfs.img /media/usb250/ntfs.log
The img file is around 70 GB. Now I want to restore it to another computer with 30 GB of space. How can I reduce the size of the img file?
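Since only 15 of the 70 GB is used, a sparse copy (or compression) shrinks what the image occupies on disk; note, though, that restoring onto a 30 GB drive still requires shrinking the NTFS partition itself first (e.g. with ntfsresize), which this sketch does not do. A small demo file stands in for the image:

```shell
# Stand-in for the big image: 1 MiB of zeros
head -c 1048576 /dev/zero > ntfs.img

# Punch the zero runs out as holes: same content, less disk used
cp --sparse=always ntfs.img ntfs-sparse.img

ls -l ntfs.img ntfs-sparse.img   # identical apparent sizes
du -k ntfs.img ntfs-sparse.img   # the sparse copy allocates far less
```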
View 13 Replies
View Related
Mar 25, 2010
Is there a script that will check if my log file is a certain size? E.g. I want to limit the size of my rsync log to, say, 5 MB; if it exceeds that, I would move it and create a new one.
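A minimal sketch (path and threshold taken from the question; run it from cron as needed):

```shell
# Rotate rsync.log once it exceeds 5 MB
LOG=rsync.log
MAX=$((5 * 1024 * 1024))

touch "$LOG"                                 # demo: make sure it exists
if [ "$(wc -c < "$LOG")" -gt "$MAX" ]; then
    mv "$LOG" "$LOG.$(date +%Y%m%d%H%M%S)"   # move the full log aside
    : > "$LOG"                               # start a fresh, empty one
fi
```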
View 4 Replies
View Related
Apr 3, 2010
Using bash, is it possible to get the average size of the files in a directory of ~2000 files?
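Yes; one way is to sum exact byte sizes with du -b and average in awk (demo files shown so the numbers are checkable):

```shell
# Demo: two files of 5 and 3 bytes; the average should be 4
mkdir -p files && printf '12345' > files/a && printf '123' > files/b

find files -maxdepth 1 -type f -exec du -b {} + |
    awk '{ total += $1; n++ } END { if (n) print total / n }'
```

Using `find ... -exec du -b {} +` rather than a glob keeps it working for ~2000 files without hitting argument-length limits.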
View 7 Replies
View Related
Mar 11, 2010
I'm all new to Linux. I've got Fedora Core 12; I'm an ex-Windows user. I have these 3 websites to maintain, in Finnish, so-called pikalaina sites:
pikalainat
pikavipit
vipit
And I have to add pictures to these pages. I don't know how to do even that; I don't know web programming or HTML. My images are about 1 MB in file size. I used to have Windows and Photoshop, which has a save-for-web feature that reduces the file size. I have GIMP now; it's terrible compared to Photoshop, but it's free. I can't find a feature in GIMP to reduce a file from, for example, 1 MB to 20 KB. How do I do this? Do you know any good program for it?
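If GIMP's export options feel clumsy, ImageMagick (assumed installed; file names and values here are just starting points to tweak) can do the shrinking from the command line:

```shell
# Demo input: generate a blank test image with ImageMagick itself
convert -size 1024x768 xc:gray photo.jpg

# Shrink for the web: fit within 800x600 and recompress at quality 70
convert photo.jpg -resize 800x600 -quality 70 photo-web.jpg

# Whole directory, in place:
# mogrify -resize 800x600 -quality 70 *.jpg
```

Lower -quality values and smaller -resize geometries trade image quality for file size.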
View 8 Replies
View Related