I have a set of folders in some directory /home/dir, and I'd like to generate zip files for the contents of each folder separately. I'm wondering if there's a quick way to do this with a one-liner, or what the bash script would be.
Directory structure: /home/dir/first, /home/dir/second, /home/dir/third, and I want three files: first.zip, second.zip, and third.zip. I know zip isn't the best format, but these are for distribution to users on Windows machines and I'd prefer to keep them in the zip format.
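A loop over the top-level folders should cover it; a minimal sketch, assuming the zip files should land in /home/dir itself:
Code: cd /home/dir && for d in */; do zip -r "${d%/}.zip" "$d"; done
Each iteration strips the trailing slash from the folder name to build the zip file's name.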
I have a USB drive that has a TON of folders on its root level. I want to remove all those folders and their contents except three of them. I know that if I do rm -rf it will kill everything; is there a way to exclude three folders, say folder1, folder2, and newfolder, and do it all in one statement?
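One hedged option is to let find do the filtering, run from the drive's root (test it with -print before adding the rm):
Code: find . -mindepth 1 -maxdepth 1 ! -name folder1 ! -name folder2 ! -name newfolder -exec rm -rf {} +
Bash's extglob (rm -rf !(folder1|folder2|newfolder)) is a common alternative if that shell option is enabled.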
I recently had to rescue files from a dying hard drive using PhotoRec. However, it dumped the files randomly into many different directories. Is there a way I can transfer the contents of these directories into one master directory, without keeping the directory structure? For example, I have folders 1 through 10 as subdirectories in folder X. How can I transfer the contents of all those folders into folder Y while getting rid of the subdirectories 1-10?
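With GNU coreutils, find can hand every file to a single move; a sketch where X and Y stand for the real source and destination paths:
Code: find X -type f -exec mv -t Y {} +
Adding --backup=numbered to the mv guards against recovered files that happen to share a name.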
Recently, Ubuntu was doing a standard update. It got stuck in some kind of strange loop, so I put the boot disk in, cleared the master drive, and reinstalled Ubuntu 10.04. I have a backup 500 GB drive that I used to keep the contents of my important information for my file server. After the install completed, I found the backup drive still named "FILESERVER", and it still has my folders: our pictures, our music, and our video. I opened them up and they're all empty. Am I missing some information? I swear I didn't format the drive. I couldn't have, since the folders are still there. Where are all my files?
I have a parent folder with a lot of subfolders. Basically, I'd like to bring all the contents from the subfolders into the parent folder and subsequently delete the subfolders. Is there one command for this, or do I need to cp -r over and over again and then manually delete?
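A sketch, run from inside the parent folder, assuming a move is acceptable and the subfolders contain no hidden (dot) files, which the glob would miss:
Code:
mv */* .
rmdir */
rmdir only removes empty directories, so anything that failed to move stays where it was.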
I know that you can use ls to list the files in a folder, but is there any way to list the contents of another folder? Like, say I have a folder labelled "Records", and inside are 12 folders, one for each month. Is there any way to list the contents of the 12 subfolders without going into each one with cd and using ls?
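ls happily takes the subfolders (or a glob) as arguments, so from the folder that contains Records:
Code: ls Records/*
prints each month folder's contents under its own heading; ls -R Records walks the whole tree instead.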
I've created a new group and a new user called dftp. Now I want to do one thing: if dftp connects through FTP, he should be directed to a particular location, and he shouldn't be able to see any folder except his own, including the parent folder that contains that location. I changed dftp's home folder to the location I want. However, when connecting through FTP, user dftp still has permission to see the other folders and check out their contents.
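What you're describing is usually done by chrooting FTP users into their home directories. A sketch, assuming the server is vsftpd (other FTP daemons have their own equivalent setting):
Code:
# in /etc/vsftpd.conf
chroot_local_user=YES
After restarting the service, dftp should be confined to his home directory and unable to browse above it.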
I want all the files, folders, subfolders, images, and everything else inside public_html, saved in the same folder so I can find it easily. How can I do it?
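If the goal is a single archive of everything under public_html, one hedged sketch (run from the directory that contains public_html; the archive name is just an example) is:
Code: zip -r public_html_backup.zip public_html
tar -czf public_html_backup.tar.gz public_html does the same job if zip isn't required.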
How do I list the contents of a folder to a text file? I'm trying to list all my music, including all subfolders, etc., to a text file, but I can't remember the command.
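A minimal sketch, assuming the music lives under ~/Music and the list should go to music.txt:
Code: ls -R ~/Music > music.txt
find ~/Music > music.txt gives one full path per line instead, which can be easier to search later.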
I want to do something like svn add dir1 dir2; svn ci dir1 dir2 but have it be only 1 revision. Is there a way to do this? Is this the correct way to add new folders (with contents) to the repository? We are restructuring the trunk, so I cleared it out and plan on putting these directories with their contents in it.
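A single commit that names both paths already produces just one revision, since svn add only schedules the directories locally and nothing hits the repository until the commit; a sketch:
Code: svn add dir1 dir2 && svn commit -m "Add dir1 and dir2" dir1 dir2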
What I would like to do is to print the contents of all text files in a particular directory, recursively. Problem being that there are directories and possibly binaries scattered around in the filesystem as well.
Trying cat * works as long as there are no directories in there, but when there are it gives an error instead and prints nothing.
I'm sure it's easy using file -f or something, but I can't figure it out!
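One hedged approach: let find pick out regular files and let file decide which ones look like text before cat-ing them.
Code: find . -type f -exec sh -c 'file -b "$1" | grep -q text && cat "$1"' sh {} \;
grep -rIl is another way to build a list of non-binary files, if pulling in grep is acceptable.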
I can't seem to find any info on a terminal command that lists the contents of a directory with page breaks, so that I can view the contents of a very large folder (such as /usr/bin). If I use ls, it prints so many names that scrolling up won't even cover all of them. We all know the obvious solution is to use a graphical file browser, but I tend to shy away from depending on graphical utilities, simply because the command line feels faster.
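Piping ls into a pager does exactly this:
Code: ls /usr/bin | less
less pages with Space and searches with /; more works too if less isn't installed.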
I have an Ubuntu 8.04 server running a couple of web sites using virtual hosts, Apache2, MySQL, and PHP. I have noticed that by default PHP sessions are created in /var/lib/php5 and are all stored in plain text.
I quickly created a PHP script in a separate virtual host to list and display all the contents of /var/lib/php5, and it seems incredibly easy to see what details the other accounts are storing in their sessions.
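One common mitigation is to give each virtual host its own session directory that the others cannot read. A sketch, assuming Apache with mod_php (the directory name is a placeholder):
Code:
# inside each <VirtualHost> block
php_admin_value session.save_path /var/lib/php5/site1
with /var/lib/php5/site1 owned by the appropriate user and not world-readable.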
What bash command can I use to rename or change the extension or name of a batch of files (for example, from .php to .html)?
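For the .php to .html case, a sketch using plain bash parameter expansion (the rename/prename utilities shipped by many distros are an alternative):
Code: for f in *.php; do mv "$f" "${f%.php}.html"; done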
Furthermore, is there a simple bash or python script/command that can be used to open a batch of plain text files one-by-one, search for all instances of a specific word, and replace all of those instances with another word?
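sed covers the search-and-replace part; a sketch in which oldword, newword, and *.txt are placeholders for your own values:
Code: sed -i 's/oldword/newword/g' *.txt
GNU sed's -i edits the files in place, so consider -i.bak the first time to keep backups.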
I have two text files on Linux. One contains a list of valid IDs, e.g.: abcd efgh ijkl etc.
The other contains a list of invalid IDs. But, some of these also appear on the list of valid IDs, in this example "efgh": mnop qrst efgh etc.
How can I easily construct a text file that contains all the lines from the invalid list that do not appear in the valid list? That is, I want to end up with a text file that has: mnop qrst etc. I'd like either some Linux command-line magic or some clever Vim trickery.
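Assuming the files are named valid.txt and invalid.txt, with one ID per line, grep can subtract one list from the other:
Code: grep -vxFf valid.txt invalid.txt > result.txt
-F treats each ID as a literal string, -x matches whole lines only, and -v inverts the match, keeping just the lines that never appear in the valid list.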
I installed Fedora 12 x64. Now every time I start my Linux system, the .gvfs directory in my /home/Razorblade directory is corrupted. So I have to reboot, start a Linux live CD, mount my home partition, and delete this folder. After that I can log in normally. Symptoms: I am able to log in normally, start a browser, start my mail client, and list the contents of subfolders of /home/Razorblade/... - everything is fine. But as soon as I want to list the contents of my /home/Razorblade folder itself - nothing but this spinning blue thing around the cursor. The command line does nothing after "ll /home/Razorblade", and sometimes it even crashes and closes. As root I am able to do "ll /home/Razorblade", and this is what I get:
I've managed to install Samba and I've shared a folder. I can access it from a Windows 7 machine via \ubuntupublic. I can put files in the folder from the Ubuntu machine and edit them on the Windows box. I can put files in the folder/share from the Windows box, but then I cannot edit them on the Ubuntu machine (they are read-only and have a "lock" over them). I can fix this by going to the properties of the file/folder in Windows and manually assigning "Everybody" full control (then the lock disappears and all is well). I want read/write access to all the folder's contents from both machines all the time (security is NOT a concern; I WANT the permissions wide open). What am I doing wrong?
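The usual culprit is the create/directory mask on the share, which sets the Unix permissions Samba gives to files created from Windows. A hedged sketch of the share definition (the share name and path are placeholders):
Code:
# /etc/samba/smb.conf
[public]
   path = /home/youruser/public
   read only = no
   guest ok = yes
   create mask = 0777
   directory mask = 0777
Restart Samba after editing; files created before the change may still need a one-off chmod.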
I want to list my folders and subfolders (recursively) and also show the sizes of the files in the terminal. I started using this:
Code: ls -h -R > /test.txt
I got everything but not the size of the folders. Then I tried this:
Code: du -h --max-depth=1 > test.txt
It is supposed to show me everything, but I can't see the subfolders, and this command does not recurse into them. How can I show the sizes of the files and folders like the second command does, but including the subfolders?
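du is recursive by default; the --max-depth=1 is what hides the subfolders, and -a adds the individual files as well. A sketch:
Code: du -ah > test.txt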
Is there an application available to generate a list of the files and folders at a location, like a hard drive or a folder? The list could be in any format; even a text file would be just fine.
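No separate application is needed; find can dump the whole tree into a text file. A sketch with the path and output name as placeholders:
Code: find /path/to/folder > list.txt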
Is there a way to recreate all the folders from one directory in another without copying over the contents of the folders? I've been trying to do something like this,
Code: for i in `ls $X`; do mkdir $PATH/$i; done
Unfortunately, $i is delimited by the whitespace in the filenames and not by the actual folders.
$X contains only other folders, so I don't have to worry about regular files, but any kind of more "advanced" solution would work.
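A sketch that sidesteps the word-splitting by passing null-terminated paths (here $DEST stands for the target directory; $PATH is best avoided as a variable name, since the shell already uses it for command lookup):
Code: (cd "$X" && find . -type d -print0) | (cd "$DEST" && xargs -0 mkdir -p)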
For reasons I won't get into, I need to copy directories so long as the average system load is low. Can someone help me write a BASH script that will copy the contents of a directory, but check to make sure the average system load is below X before copying each file, and if not, wait Y seconds and try again?
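A rough sketch of that loop, reading the one-minute load average from /proc/loadavg; SRC, DEST, MAX_LOAD, and WAIT are placeholders, and filenames containing newlines are not handled:
Code:
#!/bin/bash
# Sketch only: copy files from SRC to DEST, pausing whenever the load is too high.
SRC=/path/to/source
DEST=/path/to/dest
MAX_LOAD=1.5   # the "X" threshold
WAIT=30        # the "Y" seconds to wait between checks

find "$SRC" -type f | while IFS= read -r f; do
    # Block until the 1-minute load average drops below MAX_LOAD
    while awk -v max="$MAX_LOAD" '{exit !($1 >= max)}' /proc/loadavg; do
        sleep "$WAIT"
    done
    rel="${f#$SRC/}"                     # path relative to SRC
    mkdir -p "$DEST/$(dirname "$rel")"   # recreate the directory structure
    cp -p "$f" "$DEST/$rel"
done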