General :: How To Untar An Archive From One Directory To Another
Jul 19, 2010
I want to untar a package from one directory to another directory, directly from the command prompt. Specifically, I want to untar the Joomla package into the htdocs directory of XAMPP. How do I do that directly from the command prompt? The reason I am asking is that the "drag and drop" way won't work: XAMPP is stored in the /opt directory, and without super-user authentication nothing can be saved into it. You could ask why I untarred XAMPP in su mode in the first place, but that had to be done so that Apache doesn't give me any start-up problems.
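A minimal sketch of what usually works here, assuming the Joomla tarball is gzip-compressed and that XAMPP's document root is /opt/lampp/htdocs (the package filename is just a placeholder):
sudo tar -xzvf Joomla_package.tar.gz -C /opt/lampp/htdocs    # sudo because /opt is root-owned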
1. Is there a way to prohibit a program from writing data to the hdd?
2. Can I have different icons on each GNOME workspace, and how?
3. How do I untar to the current directory? "tar -xvvf blablah.tar.gz" does not work, and "tar -xvvf blabla.tar.gz -C ./" does not work either; in both cases the files go into strange, seemingly random locations (see the sketch after this list).
4. How can I change the way colors are displayed for different content in the Xfce terminal? I used to have red for archives, blue for dirs etc. in KDE, but lost all that after changing to GNOME.
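For item 3, a minimal sketch assuming a gzip-compressed tarball (the filename is a placeholder):
tar -xzvf blabla.tar.gz -C .
# or simply, since tar extracts into the current directory by default:
tar -xzvf blabla.tar.gz
If files still land somewhere unexpected, list the archive first to see what paths it actually stores:
tar -tzvf blabla.tar.gz | head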
I have been playing around with the tar command and I know this is how to use it. Code: tar -cf [filename] [directory] But what I want is to make an archive from the current directory. I thought I could just not enter a directory, but that doesn't work: I get an error about creating an empty archive. So how do I tell it to use the current directory?
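A sketch of two ways to do it (archive names and paths are placeholders); tar needs at least one path operand, and "." means the current directory. Writing the archive somewhere outside the tree avoids tar trying to include the archive inside itself:
tar -cf /tmp/current.tar .
# or name the directory explicitly from its parent:
tar -cf current.tar -C /path/to/parent dirname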
Ubuntu 10.04. As part of my nightly backup script I archive my home directory with the following command:
tar -cvpzf /quitelarge/_mirror/mirror1/home-ken.gz /home/ken 2>> /quitelarge/_mirror/tar-error.log
It seems to work fine and I have recovered files from the archive on occasion. Actually I keep 7 rolling daily backups and a monthly burn to DVD. I had an sftp connection made by Nautilus to my server. Ubuntu for whatever reason places an icon on the desktop showing the connection. When I ran the script it decided to archive everything on my server - all 1.4 TB. I caught the problem when home-ken.gz was about 5 GB. I stopped the process, closed the sftp connection, rolled back the backups and tried again. This time I got a file of the expected size - about 45 MB.
In the error log I did find that the tar process was trying to suck the entire contents of the server into the archive file.
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/asound/ICH5/pcm0c/sub0: file changed as we read it
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/asound/ICH5/pcm0c: file changed as we read it
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/asound/ICH5: file changed as we read it
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/asound: file changed as we read it
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/scsi: file changed as we read it
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/acpi/event: Cannot open: Permission denied
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/acpi/fadt: Cannot open: Permission denied
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/acpi/dsdt: Cannot open: Permission denied
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/irq/21/smp_affinity: Cannot open: Permission denied
Is there an option I can place on the tar command to tell it NOT to follow the ssh connection which is sitting on my desktop? The closest thing I see in the documentation is -h which tells tar to "follow symlinks; archive and dump the files they point to." I am NOT specifying -h so if the ssh connection is treated as a symlink by tar I would still not expect the remote contents to be tarred.
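The .gvfs entry is a GVFS/FUSE mount rather than a symlink, so -h is not the culprit; two options that should keep tar out of it (sketched against the command above, not tested on this setup):
tar -cvpzf /quitelarge/_mirror/mirror1/home-ken.gz --exclude=/home/ken/.gvfs /home/ken 2>> /quitelarge/_mirror/tar-error.log
# or tell tar to stay on the filesystem it started on:
tar -cvpzf /quitelarge/_mirror/mirror1/home-ken.gz --one-file-system /home/ken 2>> /quitelarge/_mirror/tar-error.log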
I want to untar a bunch of files located in different folders, with the folder depth unknown. I found an old post about this, but the suggestion there extracts all files into the same folder (your current one). I want to extract each archive into the same folder as the tar file itself. The solution from the old post (which extracts everything into the current folder):
find . -name "*.tar" -exec tar xvf {} \;
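With GNU find, -execdir runs the command from the directory containing each match, which puts the extracted files next to their tar file; a sketch, not tested against your tree:
find . -name "*.tar" -execdir tar -xvf {} \;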
Is there any way to untar an archive and extract only those files that are newer than a certain date, including the directory structure?
I restored a backup on a play server, but it was a few days old. However, I have a tar archive of the entire structure that is more up to date and healthy, so now I want to extract all files (including the directory structure) based on a date filter on the files, if possible.
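GNU tar's --newer option only filters when creating an archive, not when extracting, so one workaround is to list the archive with timestamps, filter on the date column, and extract just those members. A rough sketch, assuming the usual "tar -tv" output format and no spaces in file names (archive name and cutoff date are placeholders):
tar -tvf backup.tar | awk '$4 >= "2010-07-15" {print $6}' > newer.lst
tar -xvf backup.tar -T newer.lst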
Every shortcut, every program, every link: everything opens in Archive Manager, and then it reports that the archive is not supported. I have to launch everything directory-related through a terminal just to navigate around in a window manager.
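One thing that often causes this is Archive Manager having become the default handler for directories; a possible fix, assuming GNOME's Nautilus is installed (the desktop file name may differ on your system):
xdg-mime default nautilus.desktop inode/directory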
How do I archive the contents of a directory to a file, and also extract an archive from a file into a directory, just as below?
If the first argument is a directory and the second argument is a file, the user is prompted to choose whether to archive the contents of the directory to the file or to extract the archive from the file into the directory (a sketch follows).
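A minimal sketch of such a script; the archive format (plain tar) and the prompt wording are assumptions:
#!/bin/bash
# usage: ./archtool <directory> <file>
if [ -d "$1" ] && [ -f "$2" ]; then
    read -p "archive (a) directory to file, or extract (e) file to directory? " answer
    case "$answer" in
        a) tar -cf "$2" -C "$1" . ;;    # archive the directory's contents into the file
        e) tar -xf "$2" -C "$1" ;;      # extract the file's contents into the directory
        *) echo "unknown choice" >&2; exit 1 ;;
    esac
fi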
Is it possible to have Archive Manager create the specified folder when I enter it into the path field? For example, if I want to put something in "/Main/Sub1/Sub2" but "Sub2" doesn't exist, can I just have Archive Manager create it and put the files there?
I am new to the world of Linux, and when attempting to verify a tar archive I am shown an error. Running the command tar cvfW archivename.tar filename directoryname does not yield any errors.
I'm trying to install Adobe Reader 8, and the command #dpkg -1 AdobeReader_enu-8.1.3-1.i386.deb comes up with the message: error processing AdobeReader, Cannot access archive: No such file or directory. Error was encountered while processing.
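For what it's worth, the dpkg flag for installing is -i (not -1), and "Cannot access archive" usually just means the path is wrong; something like this, run from the directory the .deb was downloaded to, may work:
sudo dpkg -i ./AdobeReader_enu-8.1.3-1.i386.deb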
I'd like to ask about the archive mounter feature: can I mount a zip file in read-write mode? Can gvfsd-archive do that, or must I use fuse-zip to mount it? If I must use fuse-zip, how do I wrap it so I can use it via Nautilus or via gvfs-fuse-daemon?
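gvfsd-archive mounts are read-only as far as I know, so fuse-zip is the usual route for read-write access; a sketch (the mount point is a placeholder):
mkdir -p ~/zipmount
fuse-zip archive.zip ~/zipmount     # mounts the zip read-write
fusermount -u ~/zipmount            # unmount when done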
I would like to ask how I can convert the file permissions of a tar.gz file. I have one tar.gz file, and when I try to extract it I get the following output:
Cannot change ownership to uid 3361, gid 5000: Permission denied
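The uid/gid stored in the archive don't exist (or can't be assigned) on your system; telling tar not to try to restore ownership usually gets past this (the filename is a placeholder):
tar --no-same-owner -xzf file.tar.gz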
I'm having some difficulties with the wireless on my friend's HP dv4. It has a bcm4312 card, and so far I've downloaded the kernel headers and the devel packages, as well as the Broadcom hybrid driver. After I've untarred the file I type in make and it comes up with:
KBUILD_NOPEDANTIC=1 make -C /lib/modules/`uname -r`/build M=`pwd`
make: *** /lib/modules/2.6.34.7-61.fc13.i686.PAE/build: No such file or directory. Stop.
make: *** [all] Error 2
But as for my knowledge of Linux, I'm at my limit; if it wasn't for Archive Manager I probably wouldn't have been able to untar the file.
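The missing build directory comes from the kernel headers for the running kernel; on Fedora with a PAE kernel, something along these lines should provide it (the package name is assumed from the PAE kernel shown in the error):
su -c 'yum install kernel-PAE-devel'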
Will unzipping always append files if the directories already exist? What about tar?
I unzipped an archive; the archive had 3 folders (app, skin, js) and dozens of subfolders within those.
The folder I unzipped it to also already had those 3 folders as well as many child folders etc.
My website is still working, so I'm just wondering: is this safe, or is there anything I should worry about? The other option was to manually upload each file in those 3 folders and the dozens of subfolders.
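As far as I know, both unzip and tar merge into existing directories; the difference is in how existing files are handled: unzip asks before overwriting unless told otherwise, while tar overwrites silently by default. The relevant unzip flags, for reference (paths are placeholders):
unzip -n archive.zip -d /path/to/site    # never overwrite existing files
unzip -o archive.zip -d /path/to/site    # overwrite without asking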
I am trying to install Symantec Endpoint on a Linux server with the command rpm -ivh sav-1.0.3-8.i386.rpm, but it gives me the following error: error: unpacking of archive failed on file /opt/Symantec/bin/navdefutil;4ceb8d6b: cpio: mkdir failed - No such file or directory
I'm new to Ubuntu, and every time I've tried to download a program like iTunes, Archive Manager comes up and says "An Error Has Occurred While Loading the Archive". How do I fix this or download programs?
I am stuck trying to write a script that does the following:
1. loops through all subdirectories of a given directory
2. for each found subdirectory, first create an archive carrying the same name as the subdir itself
3. then moves the actual subdirectory to a different path
Here is what I have so far: my base dir is /home/bob/Bureau, and it contains two subdirs, "florissant 86 a" and "saule 84". I would like to create one archive for each subdir in /media/public/atelierPhotoArchive and then move the folder entirely to /tmp/photo.write.
Everything goes well until I have to either tar the directories or loop through the file, because of the spaces in the names.
The last statement outputs a separate line after each space in my .lst file.
Question 1: Is it possible to make it output once for each line in the file?
Question 2: Is it possible to do the tar via exec in the find command? I had difficulty extracting the "short" name for the archive ("saule 84") without /home/bob/Bureau; it is possible only with printf %f, but how in the world can I get that value into the exec option? (A sketch covering both questions follows.)
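A sketch of one way to handle the whole thing while surviving spaces in names, using the paths mentioned above (the .tar.gz naming is an assumption):
#!/bin/bash
src=/home/bob/Bureau
arch=/media/public/atelierPhotoArchive
dest=/tmp/photo.write

find "$src" -mindepth 1 -maxdepth 1 -type d -print0 |
while IFS= read -r -d '' dir; do
    name=$(basename "$dir")                           # short name, e.g. "saule 84"
    tar -czf "$arch/$name.tar.gz" -C "$src" "$name"   # archive carries the subdir's own name
    mv "$dir" "$dest"/                                # then move the subdir out of the way
done
The -print0 / read -d '' pairing is what keeps names with spaces intact, and basename gives the short name without needing printf %f inside an -exec.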
Is it possible to compress the mysqldump output into, say, db_backup.sql.tgz, and then add that to an existing archive (e.g. backup.tgz) in one command or on the fly, to save space, deleting the intermediate file afterwards?
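Piping the dump through gzip means it never hits disk uncompressed; the catch is that tar cannot append (-r) to an already-compressed archive, so the existing backup would have to stay a plain .tar until the end. A sketch with placeholder names:
mysqldump mydb | gzip > db_backup.sql.gz   # compressed dump, never stored uncompressed
tar -rf backup.tar db_backup.sql.gz        # append to an uncompressed tar only
rm db_backup.sql.gz
gzip backup.tar                            # compress the whole archive last, if wanted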
I'm trying to use tar to tar files before transfer, so I can keep the entire file path rather than losing it along the way. However, when I try to tar an empty folder, it tells me that it is cowardly refusing to create an empty archive. I want to keep the empty folder on the other end, but don't want to put anything else into the archive to make it non-empty. Is there any way to do this?
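tar only refuses when it is given no paths at all, for example when a glob inside the empty directory expands to nothing; naming the directory itself works, and the empty directory is recreated on extraction (paths are placeholders):
tar -cf folder.tar emptydir/
tar -tvf folder.tar     # shows the directory entry, nothing else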
I need files to be <= 5GB to put on S3. Right now I have an ugly tar / gzip / cut before upload, then cat / zcat / tar on download, but it's really awkward, and nearly every archiver should support archive splitting, right? What's the best way?
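split on the sending side and cat on the receiving side is the standard trick; a sketch keeping each piece under the 5 GB limit (the source path, piece size, and names are placeholders):
tar -czf - /data | split -b 4G - backup.tgz.part-   # produces backup.tgz.part-aa, -ab, ...
cat backup.tgz.part-* | tar -xzf -                  # reassemble and extract in one pass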
I'm searching for a tutorial on how to convert my videos (AVI, WMV and MKV formats) to a format that offers both good quality and small size. The audio part should be 128 kbit/s MP3 LAME. I'm using Fedora 14 with an RPM Fusion repo. It would be especially nice to be able to batch-convert the AVIs.
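A hedged sketch of a batch conversion with ffmpeg (H.264 video plus 128 kbit/s LAME MP3 audio); the flag spellings assume a reasonably recent ffmpeg build from RPM Fusion, and the CRF value is just a starting point:
for f in *.avi; do
    ffmpeg -i "$f" -c:v libx264 -crf 23 -c:a libmp3lame -b:a 128k "${f%.avi}.mkv"
done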
The command tar -xvf wpa_tables.tar gives this error
tar: ./xai-0/\334Regency: Cannot open: Invalid or incomplete multibyte or wide character
tar: Exiting with failure status due to previous errors
Obviously the backslash is giving tar some problem, but I've been all over the docs and can't figure out how to either make it skip this file or interpret the character literally. Here's my command history to show some of the options I've tried which don't work.
tar --no-wildcards -xkvf wpa_tables.tar
tar --exclude ./xai-0/\334Regency -xkvf wpa_tables.tar
tar --exclude "./xai-0/\334Regency" -xkvf wpa_tables.tar
tar --ignore-command-error -xkvf wpa_tables.tar
tar --ignore-failed-read -xkvf wpa_tables.tar
tar --transform 's/\/slash/g' -xvf wpa_tables.tar ./xai-0/\334Regency
I'm totally out of ideas at this point and would welcome some input from more experienced members.
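Two things that may be worth trying: excluding the entry with a wildcard so the troublesome byte never has to be typed, or running the extraction under the C locale so tar does not attempt a multibyte conversion (both are guesses, not verified against this archive):
tar --exclude='./xai-0/*Regency' -xkvf wpa_tables.tar
LC_ALL=C tar -xvf wpa_tables.tar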