General :: Compress .gz Folder With All Contents Inside Sub-folders / Images?
Oct 5, 2010
How can I compress my public_html folder to a .gz archive:
media/sda7/user1/public_html/
I want all files, folders, subfolders, images, and everything else inside public_html, and I want the archive saved in the same place so I can find it easily. How can I do it?
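A minimal sketch of one approach, assuming the full path is /media/sda7/user1/public_html and GNU tar is available; the archive name public_html.tar.gz is arbitrary and ends up next to the folder itself:
Code:
cd /media/sda7/user1
tar -czf public_html.tar.gz public_html    # -c create, -z gzip, -f output file; recurses by default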
Is there a way to recreate all the folders from one directory in another without copying over the folders' contents? I've been trying to do something like this,
Code:
for i in `ls $X`; do mkdir $PATH/$i; done
Unfortunately $i is delimited by whitespace inside the filenames, not by the actual folders.
$X contains only other folders, so I don't have to worry about regular files, but any kind of more "advanced" solution would work.
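A sketch of an alternative, assuming the destination directory is held in a variable such as $DEST rather than $PATH (which is the shell's command search path); the quoted glob keeps folder names with spaces intact:
Code:
# recreate the top-level folders of $X under $DEST
for dir in "$X"/*/; do
    mkdir -p "$DEST/$(basename "$dir")"
done
# or, to mirror the whole directory tree without copying any files:
# rsync -a --include='*/' --exclude='*' "$X"/ "$DEST"/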
I am trying to find a directory named 480debugerror nested somewhere under child directories. I don't know the exact path, or even whether I have the exact spelling of the directory I want to find.
Is there a Linux command to find directories with a given name fragment, for example directories named debug or debugerror with some unknown prefix or suffix?
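A sketch using find, assuming the search starts from the filesystem root; -type d restricts the match to directories, -iname ignores case, and the wildcards cover an unknown prefix or suffix:
Code:
find / -type d -iname '*debugerror*' 2>/dev/null
# or, more loosely:
find / -type d -iname '*debug*' 2>/dev/null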
I have a parent folder with a lot of subfolders. Basically, I'd like to bring all the contents of the subfolders up into the parent folder and then delete the subfolders. Is there one command for this, or do I need to cp -r over and over again and then delete manually?
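A sketch of one approach, assuming there are no name collisions between the subfolders' contents; dotglob makes hidden files move too, and rmdir only removes the subfolders once they are empty:
Code:
cd /path/to/parent        # hypothetical path
shopt -s dotglob          # include hidden files in the glob
mv -- */* .               # move each subfolder's contents up into the parent
rmdir -- */               # remove the emptied subfolders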
I have around 150+ folders in one directory, and all of them contain some PDF files. Now I want to add a numeric prefix to the folders only, not to the files inside. How can I give the prefix to all my folders? E.g., suppose I want the number 8562; then I want it as follows:
OLD FOLDER    NEW FOLDER
ABC/          8562-ABC/
AABC/         8562-AABC/
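A minimal sketch, assuming it is run from inside the directory that holds the 150+ folders; the glob matches only directories, and the trailing slash is stripped before renaming:
Code:
for d in */; do
    d="${d%/}"              # drop the trailing slash
    mv -- "$d" "8562-$d"
done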
Today I tried to compress some folders containing backup files from last year. I right-clicked on the folders and selected compress as tar.gz. I let it work, and found that hours later, the folders were still compressing. How long is it supposed to take, anyway? I was trying to compress the two sets of backups simultaneously; together they're around 1.5 GB. They have many subdirectories.
This is the command I'm running:
Code:
tar tf some.tar somefolder_insidetar
The output is a list of all folders, files, and subdirectory files. The only thing I need is to show the contents (folders and files) of the chosen directory, without listing files in subdirectories or subdirectories inside subdirectories.
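A sketch of one way to trim the listing, assuming the archive stores paths starting with somefolder_insidetar/; the regex keeps only entries exactly one level below it:
Code:
tar tf some.tar somefolder_insidetar | grep -E '^somefolder_insidetar/[^/]+/?$'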
I have a folderA that contains a folderB that contains a lot of files. I would like to get rid of folderB, but not its contents; I want those contents to be inside folderA. How can I accomplish this on the command line?
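A sketch, assuming nothing in folderB shares a name with something already in folderA; dotglob makes hidden files move too, and rmdir only succeeds once folderB is empty:
Code:
shopt -s dotglob
mv folderA/folderB/* folderA/
rmdir folderA/folderB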
I have a USB drive that has a TON of folders at its root level. I want to remove all those folders and their contents except three of them. I know that if I do rm -rf it will kill everything; is there a way to exclude three folders, say folder1, folder2, and newfolder, and do it all in one statement?
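A sketch using bash's extended globbing, assuming you cd to the drive's root first; !(...) expands to everything except the named folders, so preview the expansion with echo before deleting:
Code:
shopt -s extglob
cd /media/usbdrive                       # hypothetical mount point
echo !(folder1|folder2|newfolder)        # preview what would be removed
rm -rf !(folder1|folder2|newfolder)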
I am setting up a Lubuntu NAS with all of my music, movies, etc. on it. I want to give my kids access to my MP3 directory, so I can move all of the kid-appropriate music into the root of my MP3 dir, in the same order I have all of my music sorted. Under the Music folder, I have them sorted, in folders, by letter: A, B, C, D, etc. In those folders are the respective artists. So where there may be something appropriate in the P folder (say, Paramore), there is also something inappropriate (say, Pantera). Now, when the kids go to the P folder, I don't want them to even see the Pantera folder; I just want them to see the Paramore folder. I tried a test using chmod 711 and chmod 700 on a directory with a test user, and the user can't access the directory, but can still see it.
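A hedged sketch of why the test behaved that way: a folder's name is shown by reading its parent directory, so permissions on the Pantera folder itself can never hide its name. With hypothetical paths:
Code:
chmod 700 Music/P/Pantera   # the test user cannot enter or read Pantera...
ls Music/P                  # ...but its name still appears, because this reads P itself
chmod 711 Music/P           # would hide every name in P from others, but then they cannot browse P at all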
I currently have a bash script that runs and backs up my daily files onto a separate partition using Rsync, but I thought it would be good to use the Ubuntu-one service as an ADDITIONAL backup for really important files.
How do I compress then encrypt those files, and can I add any commands that will do this to my existing bash script?
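A minimal sketch of a compress-then-encrypt step that could be appended to an existing script, assuming gpg is installed; -c does symmetric (passphrase) encryption, and the file list is hypothetical:
Code:
tar czf - /path/to/important_files | gpg -c -o "important-$(date +%F).tar.gz.gpg"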
Is there any way, with tar, zip, gzip, or any file compression tool, to compress without causing high CPU load? In other words, to limit how hard the CPU works while compressing? Of course I understand that this would make the compression take longer, but time isn't too big of a concern.
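A sketch of one approach: run the compression at the lowest CPU scheduling priority with nice (and, optionally, idle I/O priority with ionice), so other processes are served first; the paths are hypothetical:
Code:
nice -n 19 ionice -c3 tar czf backup.tar.gz /path/to/data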
In the Linux bash shell, for a given directory, how can I list: the creation date of that directory, the number of files in that directory, and the number of subdirectories in that directory?
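A sketch, assuming GNU stat and find; note that common Linux filesystems such as ext3 do not store a creation time, so stat can only report access, modify, and change times:
Code:
stat /path/to/dir                                           # times and other metadata
find /path/to/dir -maxdepth 1 -type f | wc -l               # files directly inside the directory
find /path/to/dir -mindepth 1 -maxdepth 1 -type d | wc -l   # immediate subdirectories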
I want to do something like svn add dir1 dir2; svn ci dir1 dir2 but have it be only 1 revision. Is there a way to do this? Is this the correct way to add new folders (with contents) to the repository? We are restructuring the trunk, so I cleared it out and plan on putting these directories with their contents in it.
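A sketch of the usual sequence; a single svn commit invocation that names both directories produces a single revision, so they do not need to be committed separately:
Code:
svn add dir1 dir2
svn commit -m "Add dir1 and dir2 with their contents" dir1 dir2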
I created a tarball with multiple files. The rpm generator requires those files to be inside a folder. I don't want to move the files before generating the tar. Is there a way to create this folder while generating the tar or after it?
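A sketch, assuming GNU tar; --transform rewrites the stored paths so every file appears under a folder (the name mypackage-1.0/ here is hypothetical) without moving anything on disk:
Code:
tar czf mypackage-1.0.tar.gz --transform 's,^,mypackage-1.0/,' file1 file2 file3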
I have a folder, /home/kemal. I want to give permission to a user named kaplan so that kaplan can see the contents of the kemal folder.
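A sketch using POSIX ACLs, assuming the filesystem is mounted with ACL support and that /home itself is already traversable; rX grants read plus execute on directories only, and the -d form sets a default ACL for files created later:
Code:
setfacl -R -m u:kaplan:rX /home/kemal
setfacl -R -d -m u:kaplan:rX /home/kemal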
Recently I did a System Mechanics clean-up of invalid and other misc files. The files were moved to the recycle bin, but I couldn't open a file individually while it was in the recycle bin in order to double-check whether or not I wanted to permanently delete it. I then selected all items and moved them to a single folder, hoping to open it later to review the individual files. I don't know the folder name or where on the computer this folder was moved to. Is there a way I can find this new folder, name unknown, that was created on 10-12-10? I'm sure it's taking up valuable space on my computer, and I want to permanently delete the unwanted items individually and not as a whole batch.
I am trying to compress a folder and the contents within while keeping the permissions the same. I then need to check whether the compressed file is corrupt or not. Based on that result, I need to transfer the file.
cd /home/ops/Desktop/temp
tar cvzfp backup-"$(date +%d-%b-%y)".tar.gz /home/ops/Desktop/dir1
gunzip -tl backup-"$(date +%d-%b-%y)".tar.gz
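A hedged sketch of the check-then-transfer step, using gzip's -t integrity test and a hypothetical scp destination:
Code:
archive="backup-$(date +%d-%b-%y).tar.gz"
if gzip -t "$archive"; then
    scp "$archive" user@backuphost:/backups/    # hypothetical host and path
else
    echo "$archive appears to be corrupt" >&2
fi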
Can I use one Samba share with a folder showing the contents of another directory? Shortcuts don't work on non-Ubuntu systems, and the share won't resolve links to files that are not on the share.
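A sketch of one workaround: a bind mount makes the other directory's contents appear inside the shared tree as ordinary files rather than links the clients must resolve; the paths are hypothetical:
Code:
sudo mount --bind /data/other_directory /srv/samba/share/other
# to make it persistent, an /etc/fstab line such as:
# /data/other_directory  /srv/samba/share/other  none  bind  0  0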
I have 500 folders of templates in one folder. Each folder has a file called template_thumbnail.png. Now I want something so that all those files get copied into one folder, with each copy named after the folder it came from.
I have Cygwin installed and I can copy that folder there, so it can basically be a Linux shell script.
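A minimal sketch, assuming the 500 template folders sit under ./templates and the copies should land in ./thumbnails (both names hypothetical), renamed to <foldername>.png:
Code:
mkdir -p thumbnails
for dir in templates/*/; do
    name=$(basename "$dir")
    cp "$dir/template_thumbnail.png" "thumbnails/$name.png"
done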
Is there a way to essentially make all of one folder's contents appear as symbolic links in another folder? (I have "My Documents" on my external drive and on my computer; they contain different things. I'd like to have one "My Documents" which shows both folders' contents and hide the other two.)
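A sketch of one approach, assuming hypothetical paths for the two source folders; ln -s drops a symlink into the combined folder for every item in each source:
Code:
mkdir -p ~/CombinedDocuments
ln -s "/media/external/My Documents"/* ~/CombinedDocuments/
ln -s "$HOME/My Documents"/* ~/CombinedDocuments/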
I need to write a short script that will compress a specific folder that's on the Desktop (and all its contents) and also encrypt it with a password that is stored inside the script, meaning it won't ask for a password and verification when compressing and encrypting.
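A minimal sketch, assuming the folder is ~/Desktop/myfolder (hypothetical name) and gpg is installed; --batch with --passphrase suppresses the interactive prompt (gpg 2.1+ may also need --pinentry-mode loopback):
Code:
#!/bin/bash
PASSWORD="s3cret"                                  # password kept inside the script, as requested
tar czf - -C "$HOME/Desktop" myfolder | \
    gpg --batch --yes -c --passphrase "$PASSWORD" -o "$HOME/Desktop/myfolder.tar.gz.gpg"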