Software :: How To Edit The Initrd.img (Decompress And Compress)
Oct 18, 2010
I need to recreate my initrd.img after having extracted its contents. Bare commands, or pointers to similar threads in this forum and on Google, are useless to me and a waste of everyone's time, as all of that has already failed. I need a working example. Apparently I am supposed to use this bash command: "zcat ../initrd.gz | cpio -i -d". That command is unintelligible to me, and I cannot compress the extracted files and folders back into an initrd.gz with a compression level of 9 so that I can rename it with a .img extension.
My understanding of recompressing the folders back into the initrd.img: Google and this forum all point to bash involving either zcat or cpio and then gzip with a compression level of 9. However, I need exact instructions for using these commands to compress the folders extracted from the initrd.img back into one single initrd.gz archive, so that the created initrd.gz can be renamed initrd.img.
Note: posting bash without an example is a waste of everyone's time, as I found that on Google and it was useless; I lack the computer science degree or years of Linux guru experience needed to figure out how to specify the arguments properly. What I need is a working example, not just bash.
Note2: To save time, the answer to why I need to edit the initrd.img is this: two different utilities (based on the same parent system and kernel) use the same initrd and the same file paths. When they are installed on separate partitions and the one farthest from the MBR is selected for boot, it will begin to boot and then switch to the one closest to the MBR, which results in a failed boot. If one is removed, the other boots fine, so it's not a menu.lst or lilo config problem.
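A minimal sketch of the usual extract/repack cycle, assuming the image is a gzip-compressed cpio archive in newc format (which is what the quoted zcat/cpio command implies); directory and file names are examples:
Code:
mkdir initrd-work
cd initrd-work
zcat ../initrd.img | cpio -i -d
# ... edit the extracted files here ...
find . | cpio -o -H newc | gzip -9 > ../initrd-new.img
The first pipeline decompresses the image and unpacks the cpio archive into the current directory; the second rebuilds the cpio archive from everything under the working directory and compresses it with gzip at level 9, so the output can simply be renamed or copied over the original initrd.img.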
View 14 Replies
Feb 18, 2011
I am trying to compress a folder by right clicking and selecting Compress... I get the following error:
An error occurred while adding files to the archive. No such file or directory
I want the folder and its content to be compressed to a .ZIP file which is natively accessible by Windows.
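If the right-click action keeps failing, a command-line fallback that produces a Windows-readable .zip might look like this (assuming the zip package is installed; "myfolder" is an example name):
Code:
cd /path/to/parent
zip -r myfolder.zip myfolder
The -r switch recurses into the folder so its contents are included.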
View 6 Replies
View Related
May 18, 2010
When I compile a custom kernel with this command: make-kpkg --initrd kernel_image kernel_headers and then install the .deb, there's no initrd in /boot and I have to create it manually. I thought the --initrd option should take care of this, but somehow it doesn't.
It has behaved like this for about two years at least (since I compiled my first kernel). Of course, it's no big deal to create it manually; I was just wondering whether I'm doing something wrong or whether I should file a bug report.
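For reference, a sketch of creating the missing initrd by hand on Debian (replace 2.6.32-custom with the actual version string of the installed kernel):
Code:
sudo update-initramfs -c -k 2.6.32-custom
The -c flag creates a new initrd for the given kernel version and drops it in /boot.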
View 1 Replies
View Related
Aug 6, 2010
I'm trying to decompress the Packages file from [URL]. I get these responses:
tar zxf Packages.gz
tar: This does not look like a tar archive
tar: Skipping to next header
[code]....
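Packages.gz is normally just a gzip-compressed text file, not a tarball, which would explain the "not a tar archive" message. A sketch of decompressing it directly:
Code:
zcat Packages.gz > Packages
This writes the decompressed index to Packages and leaves the original .gz untouched.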
View 6 Replies
View Related
May 20, 2015
In the past (with Wheezy and before) I often used "decompress" via double click on compressed folders or "compress" via right click on folders (or files) in Nautilus. Since I installed Jessie this option has vanished. I added several packages like "zip", "7z", "unzip" and so forth. Now I can do similar things via command line, but I just don't find any option anywhere to enable compressing and decompressing in Nautilus again. There seem to be no options for configuring such things in Nautilus.
I have the odd feeling my Jessie installation is broken, since many little things have been missing from the beginning. Shouldn't the old behaviour of Nautilus still be standard in Jessie?
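A hedged guess at a fix: in GNOME the Compress/Extract context-menu entries come from file-roller rather than from zip/7z/unzip themselves, so installing it and restarting Nautilus may bring them back:
Code:
sudo apt-get install file-roller
nautilus -q
The second command quits Nautilus so it reloads with the archive handler available.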
View 3 Replies
View Related
Apr 18, 2010
I tried to compress a PDF document by opening it in vim and holding down D for a few seconds. That seemed to work, but now the document won't open anymore. How do I decompress it?
View 2 Replies
View Related
Jan 3, 2011
I have recently upgraded to Bugzilla3 and I wanted to restore my bugzilla database with my backup but when I attempt to tar -xvvzf file.tgz I get the error:
gzip: stdin: not in gzip format
tar: Child returned status 1
tar: Error exit delayed from previous errors
My script that creates the backup is:
#!/bin/sh
datestr=`date +%m-%d-%Y`
bakdirpart="bugzilla.backup.$datestr"
bakdir="$HOME/$bakdirpart"
mkdir "$bakdir"
(cd /etc; tar cvzf $bakdir/mysql.conf.tgz mysql)
(cd /etc; tar cvzf $bakdir/apache2.conf.tgz apache2)
(cd /usr/share; tar cvzf $bakdir/bzreport.share.tgz bzreport)
(cd /usr/share; tar cvzf $bakdir/bugzilla.share.tgz bugzilla)
(cd /var/lib; tar cvzf $bakdir/mysql.hotdb.tgz mysql)
(cd /var; tar cvzf $bakdir/www.tgz www)
(cd "$HOME"; tar cvf "${bakdir}.tar" "$bakdirpart")
Is there any way to recover my backup copy?
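Looking at the script, the outer archive is created with plain tar (cvf, no z), so the -z/gzip step is what fails; only the inner .tgz files are gzip-compressed. A sketch of unpacking it in two stages (the exact file name will carry the backup date; the one below is an example):
Code:
tar -xvf bugzilla.backup.01-03-2011.tar
cd bugzilla.backup.01-03-2011
tar -xvzf mysql.hotdb.tgz
Running file on the archive first is another quick way to confirm whether it is gzipped or plain tar.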
View 7 Replies
View Related
Feb 7, 2011
I'm running a virtual machine of CentOS 3 and I am trying to decompress a tar file, but I run out of disk space. I created the VM with 80 GB of disk space. When I look at the partitions (du command), I have /dev/sda2 with a partition of 70GB mounted on /home with < 1% used.
Here comes the n00b question: How do I use the 70GB of space on sda2? I thought working in the /home directory, where sda2 is mounted, would give me access to that disk space, but the tar files fill up the /boot partition.
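A sketch of pointing both the archive and the extraction target at the partition mounted on /home instead of the default location (paths are examples):
Code:
df -h
mkdir /home/user/extracted
tar -xvf /home/user/archive.tar -C /home/user/extracted
df -h shows which mount point each partition backs and how full it is; -C makes tar unpack into the directory on the large partition.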
View 3 Replies
View Related
Apr 8, 2011
I have been trying to plan my backups. As I want it simple, I am going to use only tar.gz archives of the files that are important. My question then is the following:
-I have a 100GB hard disk with only 20GB free. I would like to back up the other 80GB to an external hard disk. I run my scripts, which end up saving about 75GB (due to compression) to my external hard disk.
-->Then comes the time to check the contents of my archive (just to make sure that I can recover what is inside the 75GB file). Do you know if tar needs to decompress the 75GB file into some /tmp space on my hard disk just to show me the contents? In that case it would not be easy to ever look at what is inside it, as there is no 80GB of free space on my hard disk (only 20GB).
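For what it's worth, listing a tar.gz streams through the decompressor, so it does not need scratch space for the whole archive. A sketch (the archive name and path are examples):
Code:
tar -tzf backup.tar.gz
tar -xzf backup.tar.gz path/inside/archive/somefile
The first command prints the table of contents without writing anything to disk; the second pulls out a single file to spot-check, again without unpacking the full 75GB.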
View 9 Replies
View Related
May 26, 2010
I have a large list of folders that I want to compress using tar. The script I want to write should traverse all the folders and, for each one, tar its contents into a tar file with the same name as that folder, stored inside that same folder.
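A minimal sketch of such a loop, assuming it is run from the directory that holds all the folders; the archive is built in /tmp first and then moved inside, so tar never tries to archive its own output:
Code:
#!/bin/sh
for dir in */ ; do
    name=${dir%/}
    tar -cf "/tmp/$name.tar" -C "$dir" .
    mv "/tmp/$name.tar" "$dir"
done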
View 6 Replies
View Related
Jul 31, 2010
I want to package my application for Linux. I use zip/tar.gz/7z/rar to do the compression in Windows. Because I compress in Windows, the file permissions are reset when I extract the package on Linux. The problem is that only tar.gz gives me runnable files after extraction; the other formats (zip/7z/rar) reset the permissions to 644. So my question is how to compress my files using zip/7z/rar while keeping the permissions at 755.
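Archives made with zip/7z/rar on Windows generally carry no Unix permission bits at all, so one workaround is to restore the modes after extraction instead of at compression time. A sketch (names are examples):
Code:
unzip myapp.zip -d myapp
find myapp -type d -exec chmod 755 {} +
chmod 755 myapp/run.sh
The find call makes the directories traversable again; the final chmod marks each actual program executable by hand.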
View 2 Replies
View Related
Oct 4, 2010
How can I compress a file in Ubuntu 10.04?
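A few common one-liners, with file.txt and dir/ as example names:
Code:
gzip file.txt
zip file.zip file.txt
tar -czf dir.tar.gz dir/
gzip produces file.txt.gz and removes the original; zip keeps the original and is readable on Windows; the tar line archives and compresses a whole directory.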
View 5 Replies
View Related
Feb 11, 2010
I copy my partition with this command
Code:
dd if=/dev/sdb2 of=/home/sam/partition.image bs=4096 conv=notrunc,noerror
But I would like to compress at the same time, so that the output is a rar, zip, or tar.gz file.
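A sketch of piping dd straight into gzip so the image is compressed as it is written, using the same device and path as above:
Code:
dd if=/dev/sdb2 bs=4096 conv=noerror | gzip -c > /home/sam/partition.image.gz
gunzip -c /home/sam/partition.image.gz | dd of=/dev/sdb2 bs=4096
The second line is the reverse pipe for restoring the partition later.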
View 10 Replies
View Related
Oct 25, 2010
I just read in my Linux+ resources that it is not a wise idea to compress tar files with gzip if my tape drive supports compression. In addition, the resources mention that using gzip for tar files runs the risk of data loss if an error occurs during compression.
Does anyone compress tar files for major backups/restore policies?
View 1 Replies
View Related
Dec 23, 2010
When I right click on a file and click "compress" I get a .zip file that is exactly the same size as the original file.
View 2 Replies
View Related
Mar 31, 2011
I want to know how to compress files to zip, rar or 7z with a password using the KDE or Dolphin interface, with no command line. I can compress without a password, but I have not found an option to protect the archive with a password. What do I have to install or configure?
View 1 Replies
View Related
Sep 19, 2010
I've googled till my brain went boom... So the short story is: I have lots of IP address ranges in multiple files which need to go into an iptables firewall... Sounds simple, right?
Example of files:
1.0.1.0-1.0.1.255
1.0.1.0-1.1.0.255
[code]....
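A sketch of feeding ranges in that a.b.c.d-e.f.g.h form to iptables with the iprange match, looping over the files; the INPUT chain and DROP target are assumptions, so adjust them to the intended policy:
Code:
#!/bin/sh
for f in ranges/*.txt; do
    while IFS= read -r range; do
        [ -n "$range" ] || continue
        iptables -A INPUT -m iprange --src-range "$range" -j DROP
    done < "$f"
done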
View 3 Replies
View Related
May 23, 2011
I am about to decompress a source file, $$$.tar. In what place do you decompress a tarball? I do know that I will then be entering ./configure and make.
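There is no required location; a common habit is a directory under your home, e.g. (names are examples):
Code:
mkdir -p ~/src
tar -xvf package.tar -C ~/src
cd ~/src/package
./configure && make
Keeping sources under ~/src just keeps the build tree out of system directories; the tarball can be unpacked anywhere you have write access.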
View 3 Replies
View Related
Dec 19, 2010
I need to be able to compress into multiple pieces that can still be read by someone else who has Windows. I can only download unrar, but not rar. Why is this?
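One hedged alternative: p7zip can create split volumes on Linux that the free 7-Zip program opens on Windows, so no rar is needed (size and names are examples):
Code:
7z a -v100m archive.7z bigfolder/
This produces archive.7z.001, archive.7z.002 and so on; opening the .001 file in 7-Zip on Windows reads the rest automatically.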
View 2 Replies
View Related
Oct 30, 2010
I currently have a bash script that runs and backs up my daily files onto a separate partition using Rsync, but I thought it would be good to use the Ubuntu-one service as an ADDITIONAL backup for really important files.
How do I compress then encrypt those files, and can I add any commands that will do this to my existing bash script?
I am running Ubuntu 10.04
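A sketch of a compress-then-encrypt step that could be appended to such a script; the paths are examples, and gpg -c uses symmetric encryption, so it will ask for a passphrase:
Code:
tar -czf - /home/user/important | gpg -c -o "/home/user/Ubuntu One/important.tar.gz.gpg"
gpg -d "/home/user/Ubuntu One/important.tar.gz.gpg" | tar -xzf -
The first line compresses and encrypts in one pass so no unencrypted archive ever touches the disk; the second decrypts and unpacks it again.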
View 1 Replies
View Related
Feb 1, 2011
I have a compressed text file. The method of compression is unknown. I can see the file contents by using Midnight Commander without a problem, but I would like to view the file just with cat. So I am trying to uncompress the file with unzip or gunzip, but it does not work. How do I check which method the file is compressed with? Is there any way to find it with Midnight Commander?
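The file utility usually identifies the compression method from the magic bytes, which Midnight Commander also relies on. A sketch (the file name is an example):
Code:
file mystery.log.compressed
zcat mystery.log.compressed | less
bzcat mystery.log.compressed | less
Run file first; then use zcat if it reports gzip data, bzcat for bzip2, xzcat for xz, and so on.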
View 4 Replies
View Related
Feb 23, 2011
I have a huge number of .jpeg files. How can I compress these files to save space? Is there any software for this?
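JPEGs are already compressed, so zipping them gains almost nothing; real savings come from lossy re-optimisation. A hedged sketch with jpegoptim, assuming the package is installed and some quality loss is acceptable:
Code:
jpegoptim --max=80 *.jpg
This re-encodes any image stored above roughly 80% quality and overwrites it in place, so test on copies first.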
View 2 Replies
View Related
Jan 12, 2011
In ImageMagick, I have a PNG picture, but it seems too big for me and I want to compress it from the command line. How can I do that?
Code:
convert "png.png" -strip -units PixelsPerInch -density 96 -quality 60 "png2.png"
This command doesn't compress the picture; worse, the output is even bigger...
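For PNG the -quality switch does not mean JPEG-style quality (it encodes the zlib level and filter), which is probably why the output grew. One hedged way to actually shrink a PNG with convert is to force an 8-bit palette, which is lossy, so check the result:
Code:
convert "png.png" -strip -colors 256 PNG8:"png2.png"
The PNG8: prefix writes a 256-colour palette PNG, which is usually much smaller than a full 24-bit image.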
View 4 Replies
View Related
Oct 23, 2009
Is there any way, with tar, zip, gzip, or any compression type, to compress without causing high CPU load? In other words, can I limit how hard the CPU works to compress it? Of course I understand that this would make the compression take longer, but time isn't too big of a concern.
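Compression will still use a full core if one is free, but its scheduling priority can be lowered so everything else stays responsive. A sketch (file and directory names are examples):
Code:
nice -n 19 gzip -9 bigfile
ionice -c3 nice -n 19 tar -czf out.tar.gz dir/
nice -n 19 gives the compressor the lowest CPU priority, and ionice -c3 (where available) additionally puts its disk I/O in the idle class.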
View 3 Replies
View Related
Jun 18, 2011
I have 6680 wav files of about 500KB each in a folder and I want to merge all of them. The size of the files altogether is 1.5GB. How can I merge them and compress them to create an mp3 or ogg file?
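A hedged sketch using sox to concatenate and oggenc to compress, assuming the wav files all share the same format, sox and vorbis-tools are installed, and alphabetical file order is the order you want:
Code:
sox *.wav combined.wav
oggenc -q 5 combined.wav -o combined.ogg
With 6680 files the expanded command line may get close to the shell's argument limit, in which case merging in batches is a workaround.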
View 4 Replies
View Related
Aug 18, 2011
I'm looking for a way to compress a large file (~10GB) into several files that won't exceed 150MB each.
Any thoughts?
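A sketch of compressing and splitting in one pass, then reassembling later (names and the source directory are examples):
Code:
tar -czf - bigdir/ | split -b 150m - backup.tar.gz.part_
cat backup.tar.gz.part_* | tar -xzf -
split caps every piece at 150MB; the cat line glues the pieces back together and unpacks them in one go.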
View 2 Replies
View Related
Jan 9, 2010
I have a few '.flv's now and most are small enough to upload to my Joomla web site and play with AllVideos (extension), but a few are over the 10M limit; I have one that is actually 50M! Is there something I can use to either splice the videos (make 2 or 3 out of one, upload separately) or to compress them down to under 10M without wrecking the quality?
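A hedged sketch of cutting an flv into pieces without re-encoding, so the quality is untouched; assumes ffmpeg is installed, and the times and names are examples:
Code:
ffmpeg -i big.flv -vcodec copy -acodec copy -ss 0 -t 300 part1.flv
ffmpeg -i big.flv -vcodec copy -acodec copy -ss 300 -t 300 part2.flv
Each command copies a five-minute slice of the streams straight into a new file, so the pieces stay the same quality and can be uploaded separately.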
View 2 Replies
View Related
Apr 25, 2010
I have noticed that when I right-click on many file types, for instance ISO files, I don't have the option to store them in compressed archives at all. I have one ISO file now that I want to compress and split into three parts so I can upload it, but as I said, right-clicking on the file doesn't help because the option isn't there. So what can I do?
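If the context menu will not cooperate, a command-line fallback with p7zip can compress and split in one step; the volume size is an example, chosen so roughly three pieces come out:
Code:
7z a -v250m backup.7z image.iso
This creates backup.7z.001, .002 and .003, which 7-Zip (or p7zip) reassembles automatically when the first volume is opened.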
View 4 Replies
View Related
Feb 14, 2011
I am trying to compress a folder and the contents within, while keeping the permissions the same. I then need to check whether the compressed file is corrupt or not. Based on that result, I need to transfer the file.
cd /home/ops/Desktop/temp
tar cvzfp backup-"$(date +%d-%b-%y)".tar.gz /home/ops/Desktop/dir1
gunzip -tl backup-"$(date +%d-%b-%y)".tar.gz
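A sketch of testing the archive and only transferring it if the test passes; the scp destination is an example:
Code:
archive=backup-"$(date +%d-%b-%y)".tar.gz
if gzip -t "$archive" && tar -tzf "$archive" > /dev/null; then
    scp "$archive" user@backuphost:/backups/
else
    echo "archive $archive failed verification" >&2
fi
gzip -t checks the compressed stream and tar -tzf confirms the archive index is readable; either failing skips the transfer.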
View 2 Replies
View Related
Apr 30, 2011
My hard disk is failing and I am not able to boot into the system. Currently I have logged into the system using a Live CD. Is there any way to compress and back up the data on my hard disk in an efficient way?
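On a failing disk, ddrescue is usually a better fit than a plain copy because it retries bad sectors and keeps a log of what it could not read; a hedged sketch run from the live CD, with the device and the external destination as examples:
Code:
sudo ddrescue -r3 /dev/sda /mnt/usb/disk.img /mnt/usb/disk.log
gzip -9 /mnt/usb/disk.img
ddrescue needs a seekable output file, so the image is compressed in a second step rather than through a pipe.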
View 6 Replies
View Related