General :: Extract Tar.gz File In Other Directories?
Jun 15, 2010
I am in /var/www/upload and I want to extract a file with the tar command.
The output of tar xfvz /var/www/file.tar.gz is
Quote:
tar: /var/www/esempio.tar.bz2: Cannot open: No such file or directory
tar: Error is not recoverable: exiting now
tar: Child returned status 2
tar: Exiting with failure status due to previous errors
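The "Cannot open" error means tar could not find the archive at the path it was given, so the first thing to check is the filename (the command quotes file.tar.gz while the error mentions esempio.tar.bz2). Once the path is right, tar's -C option extracts into a directory of your choice; a minimal sketch:
Code:
# extract an archive that lives elsewhere into a target directory of your choosing
tar -xvzf /var/www/file.tar.gz -C /var/www/upload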
I'm not sure if this is possible or even where to start. I assume that this can be done with an sh script using tar or similar. I have several very large zip files that contain images for all of the products in my online store. Each image is named after its 13-digit SKU (for example, 9987788000012.jpg). In order to import products into my store, all images are placed into a media directory. Unfortunately, there are over 100,000 images.
So I would like to break the images into sub-folders based on file name. For example, when I extract store_images.zip (or tar or whatever), my extract script would create directories (if they don't already exist) based on the first three digits of each image name, placing each image into the appropriate bottom-level directory. For example, "9987788000012.jpg" would be placed in the directory "media/9/9/8", with media as the root and "8" as the directory that holds any images whose names start with "998". Perhaps two sub-folders would be less cumbersome. Assuming this requires a script, particularly since it involves scanning image names, creating folders, and saving images to specific directories, which language would serve my needs best? PHP? Has anyone had to do something similar?
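For what it's worth, this doesn't need PHP; a short shell loop can do the bucketing after extraction. A minimal sketch, assuming the archive name store_images.zip and the media layout described above (adjust to taste):
Code:
#!/bin/sh
# Unpack the archive, then file each image under media/<d1>/<d2>/<d3>
# based on the first three digits of its name.
unzip -q store_images.zip -d /tmp/store_images
for img in /tmp/store_images/*.jpg; do
    name=$(basename "$img")
    d1=$(printf '%s' "$name" | cut -c1)
    d2=$(printf '%s' "$name" | cut -c2)
    d3=$(printf '%s' "$name" | cut -c3)
    mkdir -p "media/$d1/$d2/$d3"
    mv "$img" "media/$d1/$d2/$d3/"
done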
I've never really used the command line to do such things, but I'd like to learn. So, how do I extract all archives that are spread across several directories in one go?
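A sketch using GNU find, assuming the archives are .tar.gz files (swap the -name pattern and tar flags for other types); -execdir extracts each archive in the directory where it was found:
Code:
# locate every .tar.gz below the current directory and unpack it in place
find . -name '*.tar.gz' -execdir tar -xzf '{}' \;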
I know a .bin file is an executable file type in Linux. We got an error after installing one, and it refers to a file name and a line number within that file. I'm trying to find out whether the file is part of the .bin file, but I need a way to see what's inside of it or extract it.
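Without knowing the particular installer, a few generic inspection commands are a sensible first step (installer.bin is a hypothetical stand-in for the actual file):
Code:
file installer.bin            # identify what the .bin really is (shell script, ELF binary, ...)
head -n 40 installer.bin      # many .bin installers are shell scripts with an appended payload
strings installer.bin | less  # scan the readable text for the file name from the error
If the head output is readable shell, the stub often documents an extract-only option of its own.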
and I want to extract VAR15 from each line (it can be in any column, unfortunately; the columns are separated by commas in this CSV file), or VAR15 together with LATn,LONn from each line. Is it possible to do this with awk, grep or something else in Linux?
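It is; awk can walk every field of a line regardless of position. A sketch, assuming the fields look like NAME=value (an assumption, since the post doesn't show the file) and the data is in data.csv:
Code:
# print the VAR15 field from each line, wherever it sits
awk -F, '{ for (i = 1; i <= NF; i++) if ($i ~ /^VAR15=/) print $i }' data.csv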
File in question is [URL]..
meder@pc:~$ tar -xvjf wkhtmltopdf-0.10.0_rc2-static-i386.tar.bz2
bzip2: (stdin) is not a bzip2 file.
tar: Child returned status 2
tar: Error exit delayed from previous errors
I tried to unzip it as well, and I tried a slew of commands to no avail on my Debian box, which has no GUI. I downloaded this on my local desktop (Ubuntu) and was able to easily extract it with my mouse, so I'm not exactly sure what the extractor did differently...
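A likely explanation is that the downloaded file isn't actually a bzip2 tarball (a truncated download, or an HTML error page saved under the right name). Checking what the file really is, then letting GNU tar auto-detect the compression, usually settles it:
Code:
file wkhtmltopdf-0.10.0_rc2-static-i386.tar.bz2      # report the actual file type
tar -xvf wkhtmltopdf-0.10.0_rc2-static-i386.tar.bz2  # GNU tar detects the compression itself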
I'm trying to extract the contents of a zip file, but I want to extract it to my own directory. I've tried -d with unzip, but that just puts the contents of the zip into that directory.
What I want is: extract the contents of the first (root) directory in the zip if it is the only directory in the root of the zip; otherwise just extract the files/folders in the root of the zip file (if there is more than one entry).
e.g. test.zip contains the following dir structure:
test.zip
  /app_v1/   <- the contents of this directory are what I want extracted to a dir of my choice
    folder-1
    folder-2
    folder-3
    folder-4
    file1
    file2
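One way to get that behaviour is to unzip into a scratch directory first and then decide what to move. A minimal sketch, where DEST is a hypothetical placeholder for your target directory (dotfiles are ignored by the globs):
Code:
#!/bin/sh
# Extract test.zip into $DEST, flattening a single top-level directory.
DEST="/path/of/your/choice"
tmp=$(mktemp -d)
unzip -q test.zip -d "$tmp"
set -- "$tmp"/*
if [ "$#" -eq 1 ] && [ -d "$1" ]; then
    mv "$1"/* "$DEST"/    # one root dir: move its contents
else
    mv "$tmp"/* "$DEST"/  # several entries: move them all
fi
rm -rf "$tmp"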
I have an ISO CD image file and want to extract its contents to a folder. I know there are ways to mount the image and such, but that's complicated. I'm looking for a GUI tool to open up the contents and extract the needed files. On Windows I would use WinRAR to do this. K3b only allows me to burn the stuff, and Ark does not work with ISO files :( Is there a similar tool on Linux, preferably from the KDE world?
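If a command-line fallback is acceptable in the meantime, p7zip treats an ISO like any other archive (image.iso is a placeholder name):
Code:
7z l image.iso                 # list the contents
7z x image.iso -oiso_contents  # extract everything into ./iso_contents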
I am using a Red Hat Enterprise server 5.0. I would like to know if there is a way to extract a single file from inside a WAR file and display its contents on the screen. For example: I have a file labeled test.war, and inside this WAR there are multiple files/directories. I am interested in seeing the contents of one file labeled MANIFEST.MF without having to unzip the entire WAR file. Does that make sense?
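It does. A WAR is an ordinary zip archive, so unzip can print a single member to the screen; the manifest conventionally lives under META-INF/:
Code:
# print one member of the archive without unpacking anything to disk
unzip -p test.war META-INF/MANIFEST.MF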
I want to know if there is any way I can extract the first few contents of a zipped file, then the next fixed amount, and so on. For example, suppose I have a zipped file containing 1,000,000 natural numbers and I want to extract the first thousand numbers, then the next thousand (1001-2000), and so on until I reach the end. Is this possible?
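If "zipped" means gzip, yes: gzip decompresses as a stream, so arbitrary line ranges can be pulled out without inflating the whole file to disk. A sketch, assuming one number per line in a file called numbers.gz (both assumptions):
Code:
zcat numbers.gz | sed -n '1,1000p;1000q'     # numbers 1-1000, stop reading after line 1000
zcat numbers.gz | sed -n '1001,2000p;2000q'  # numbers 1001-2000
Each call restarts decompression from the beginning, but nothing past the requested range is read.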
When I delete a running executable or script, it usually (for me, pretty much always, but I don't know if it will work in every case) continues to run without any problems. So I've got two questions here: where is the running executable/script being run from? RAM? And if it's stored in RAM or wherever, is there a way to extract the executable/script from that location? If it makes any difference, I'm using Ubuntu 11.04.
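On Linux the file's data blocks stay allocated for as long as some process holds the file open, and /proc exposes them. For a binary, /proc/<pid>/exe still points at the deleted image and can be copied straight back out; for a script, exe points at the interpreter, but the script itself may still appear under /proc/<pid>/fd:
Code:
pid=1234                            # hypothetical PID of the still-running process
ls -l /proc/$pid/exe                # the original path, marked "(deleted)"
cp /proc/$pid/exe recovered_binary  # copy the deleted executable back out
ls -l /proc/$pid/fd                 # open file descriptors, useful for scripts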
There are many more of these entries in the file, over 500, all in this same format: each host has a "define host" block followed by 18 directives contained in curly braces.
If I want to know all the hosts that are in the hostgroup called SERVER_GRP, I suppose I would need to read every hostgroups line (the 8th directive in the braces) that contains SERVER_GRP and output the corresponding host_name line (the 1st directive in the braces) from that entry.
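awk suits this kind of block-record parsing. A sketch, assuming the Nagios-style layout described above and a hypothetical file name hosts.cfg:
Code:
# remember host_name inside each define host { ... } block and
# print it when that block's hostgroups line mentions SERVER_GRP
awk '/define host/ { host = ""; found = 0 }
     /host_name/   { host = $2 }
     /hostgroups/  { if ($0 ~ /SERVER_GRP/) found = 1 }
     /}/           { if (found) print host }' hosts.cfg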
I have generated a list of directories that I would like to use ls and grep on, but it is not working. I am using the command
Code:
cat directories.dat | xargs ls
and I get a whole lot of these errors:
Code:
ls: cannot access ./foo/bar/baz/grault/*: No such file or directory
but when I try the directories manually one at a time, I find that they all exist and all have files in them. Same thing if I try to grep anything. What is going wrong?
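The literal * in the error is the giveaway: the lines in directories.dat end in glob patterns, and xargs hands them to ls verbatim; only a shell expands globs. Routing each line through a shell fixes it (assuming the paths contain no spaces or quotes):
Code:
# let a shell expand the glob in each line before ls sees it
xargs -I{} sh -c 'ls {}' < directories.dat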
The current directory contains: a file called "original.txt" and many directories called "source_001", "source_002", "source_003", ... From the command line, how do you copy "original.txt" into "source_001" and "source_002" and "source_003" ...?
The total number of these source directories is unknown; it changes every week.
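Since the names share a prefix, a glob copes with however many directories exist in a given week:
Code:
# copy the file into every matching directory
for d in source_*/; do cp original.txt "$d"; done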
I have found ways to tar a directory and exclude certain directories, but is there a way to simply tar multiple directories (they are in the same directory) into one .tgz file?
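There is: tar accepts any number of paths after the archive name (bundle.tgz and the directory names below are placeholders):
Code:
# bundle several sibling directories into one compressed archive
tar -czf bundle.tgz dir1 dir2 dir3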
I've spent ages trying to build this and have had a good look around for a way to do it. I have a directory tree which contains a set of folders and files. Some of the folders contain more than one file, but most contain only a single one. I'm trying to move all of the files which are on their own in directories one level below the root into the root. E.g.:
Root is: /volume3
Single file in a sub folder: /volume3/20110103/20110103.log
File should end up as: /volume3/20110103.log
I know how to flatten the entire structure fairly easily, but it's the conditional part which I can't figure out how to do.
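One way is to count the entries in each first-level directory and move only when there is exactly one and it is a regular file. A bash sketch (hidden files are ignored by the globs):
Code:
#!/bin/bash
# For each directory directly under /volume3, move its contents up
# only if it holds exactly one regular file.
for d in /volume3/*/; do
    files=("$d"*)
    if [ "${#files[@]}" -eq 1 ] && [ -f "${files[0]}" ]; then
        mv "${files[0]}" /volume3/
        rmdir "$d"    # optional: drop the now-empty directory
    fi
done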
I am trying to write a script to pick the directory name from a list of files. Here is a detailed picture. I have a file named LIST which contains the following, for example:
/apps/oracle/product/test1
/apps/oracle/product/test2
/apps/oracle/product/test3
I need a script that reads these lines from LIST and creates folders under /backup/date: /backup/date/test1 after reading the first line, /backup/date/test2 after reading the second line, /backup/date/test3, and so on.
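A sketch that takes the last component of each listed path and recreates it under a dated backup directory (the /backup/<date>/ layout follows the post; adjust to taste):
Code:
#!/bin/sh
# For every path listed in LIST, create /backup/<today>/<last component>
today=$(date +%Y-%m-%d)
while IFS= read -r path; do
    mkdir -p "/backup/$today/$(basename "$path")"
done < LIST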
I have a dir (pub_html) with 45 sub dirs, and in each there is a file with the name file123.html. What command can I use to rename all files with this name in all sub dirs to file456.html? I'm on openSUSE 11.3.
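find can visit all 45 sub dirs and do the rename in place:
Code:
# rename file123.html to file456.html in every directory below pub_html
find pub_html -type f -name file123.html -execdir mv '{}' file456.html \;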
I am in need of Linux help. I am at college and I need this backup/restore script to pass the final part of an assessment. I require a backup script that will not only back up but also restore files to the relevant directories. E.g. users are instructed to store all word-processor files in a directory named wp. So I need to create a backup directory and 3 directories within that, and some files within the 3 directories, and then back them up and restore them. I know I should/have to do this myself, and I have been trying to get/understand info for the last few days and came up with zero.
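As a starting point only (the extra directory names are invented for the example), a tar-based script can cover both directions:
Code:
#!/bin/sh
# Minimal backup/restore sketch: "backup" archives the named dirs,
# "restore" unpacks them back into place.
case "$1" in
    backup)  tar -czf backup.tgz wp spreadsheets images ;;
    restore) tar -xzf backup.tgz ;;
    *)       echo "usage: $0 backup|restore" >&2; exit 1 ;;
esac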
I want to make a web server with multiple users allowed to log in through SFTP to a specific folder, www. Multiple users are added, let's say user1 and user2, all of them belonging to the www-data group. The www directory has owner www-data and group www-data.
I have used chmod -R 775 on the www folder, but after I create a folder test through my SFTP server (using FileZilla), the group of the created directory has only r and x permissions, and I am not able to log in with the second user, user2, and create a directory within www/test due to a lack of w permission for the group.
I also tried using chmod 2775 on the www directory, but without luck. Can somebody explain to me how I can make it so that a newly created directory inherits the root directory's group permissions?
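The setgid bit (the 2 in 2775) only controls which group new entries receive; their permission bits come from the creating process's umask, and the SFTP server's default umask of 022 strips group write. With OpenSSH 5.4 or later, internal-sftp accepts a -u umask option; combined with setgid on the tree, that gives the inheritance you want (a sketch; /var/www/www is a guess at your path):
Code:
# /etc/ssh/sshd_config: let SFTP-created entries keep group write
Subsystem sftp internal-sftp -u 0002

# make every existing directory setgid so new entries inherit the group
find /var/www/www -type d -exec chmod g+s {} \;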
I can do: mkdir messages and then: touch messages/hello.txt. Is there a command that will do both - create the directory if it doesn't exist, and then the empty file? Something like: touch -p messages/hello.txt
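touch has no -p, but GNU coreutils' install can create the leading directories and an empty file in one command (copying /dev/null yields a zero-byte file; -m 644 avoids install's executable default mode):
Code:
# create messages/ if needed, plus an empty hello.txt inside it
install -D -m 644 /dev/null messages/hello.txt

# or the portable two-step spelling
mkdir -p messages && touch messages/hello.txt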
I'm trying to convert all file extensions for files in many sub-directories from uppercase to lowercase. I have two problems. First, how to list the absolute paths to the files recursively over many sub-directories, for which so far I have this:-
Code:
find ~/Photos -print
which would be fine, except it lists the directories on their own when it finds them, rather than just the files with absolute paths. I couldn't find a switch for the "ls" command to do this, so I had to improvise with "find". And second, once I grab each absolute file name, how to change just the file extension rather than the entire file name, which is where I am stuck at the moment.
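Both problems fit in one pipeline: -type f restricts find to files, and a small loop lowercases only the extension. A sketch, assuming no file names contain newlines:
Code:
#!/bin/bash
# Recursively lowercase the file extensions under ~/Photos.
find ~/Photos -type f -name '*.*' | while IFS= read -r f; do
    ext=${f##*.}                                        # current extension
    lower=$(printf '%s' "$ext" | tr '[:upper:]' '[:lower:]')
    [ "$ext" != "$lower" ] && mv "$f" "${f%.*}.$lower"  # rename only when it changes
done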