General :: Make All Files Non-executable Recursively Using Find Without Affecting Directories?
Sep 26, 2010
I need to strip the executable flag from all files within a certain directory and its subdirectories. Right now I'm doing it with a two-step process:
find /dir/ -type f -exec chmod ugo-x {} \;
find /dir/ -type d -exec chmod ugo+rx {} \;
Is it possible to modify the first line so that I can strip the exec flag from all non-directory files? Since this needs to be done on a fairly regular basis across a lot of directories and files, I'd prefer not to use a bash script, which would slow it down.
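A single find invocation can handle every non-directory in one pass; a sketch, assuming GNU find (the + terminator batches arguments, which is faster than \;):

find /dir/ ! -type d -exec chmod ugo-x {} +

Alternatively, chmod's symbolic X applies execute only to directories once the plain x bits have been cleared, so both steps collapse into one command:

chmod -R a-x+X /dir/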
The rm command's man page discusses removing files or directories recursively. So what is meant by deleting a file or directory recursively? And what are some reasons for doing so?
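For illustration, "recursively" means the command descends into the directory and removes everything beneath it, however deeply nested, before removing the directory itself (mydir here is a hypothetical name):

rm -r mydir    # removes mydir, every file inside it, and every nested subdirectory

A common reason is simply that rm refuses to delete a non-empty directory otherwise.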
I want to copy all directories and files, including hidden files and hidden directories, with one command. I want these items to replace any identical items in the target directory.
I have tried several things, such as:
cp -r *
cp -aR *
but I only seem to get visible files and directories. Obviously, I am missing something. (A brain, probably....)
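The shell glob * skips dotfiles by default, which explains the missing hidden entries. One possible fix (a sketch; /src and /dest are placeholder paths) is to copy via the source directory's . entry, which brings hidden files along:

cp -a /src/. /dest/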
Suppose in my current directory I have 50 subdirectories. Now, I am interested in only about 20 of those subdirectories (whose names match a pattern). I would like to recursively list the contents of these 20 subdirectories. How do I do that? I would like to do this on Solaris 10 and Linux (RHEL 5.x).
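One portable possibility (a sketch; pattern stands in for the real name pattern) is to let the shell expand the matching directories and hand them all to ls:

ls -R pattern*/

The trailing slash restricts the glob to directories, and ls -R behaves the same under Solaris 10 and RHEL 5.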
I made an account at freeshell.org and it has been very satisfactory so far; I recommend everyone get an account there. But anyway, how do I find files over, for example, 500 KB anywhere in my shell account?
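find's -size test covers this; a sketch assuming GNU find, where the k suffix means kilobytes:

find ~ -type f -size +500k

On a find without size suffixes, -size +1000 (counted in 512-byte blocks) approximates the same threshold.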
I need to replace ":" in multiple file names, since I am going to copy those files from a Linux partition, which admits ":", to a FAT32 partition, which does not.
Example:
original name: eg06_ana_21-05-06_09:21:03.JPG
desired name: eg06_ana_21-05-06_09-21-03.JPG
I have googled a lot, but I have not been able to adapt the examples people give to my case. It seems that the rename command is what I should use, but I have no idea how to build the correct Perl expression.
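With the Perl rename utility (installed as rename or prename on many distributions), a sketch:

rename 's/:/-/g' *.JPG

The s/:/-/g part is an ordinary Perl substitution: replace every ":" in the name with "-".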
I am new to Linux, as well as to awk, grep and sed. I need a find-and-replace one-liner or script that loops through an input file (file1), finds each entry at the start of a line in file2, and adds "!" in front of the found string.
Example input file (file1):
g+h=o+p
a+b=c+d
file2 (the file to search):
a+b=c+d1e105
x+y=z+s5e105
g+h=o+pabcdefg
t+r=w+qxvyderf
Output file (file3 should look like this):
!a+b=c+d1e105
x+y=z+s5e105
!g+h=o+pabcdefg
t+r=w+qxvyderf
I have tried many awk and sed methods of find and replace, but they did not work the way I wanted, mainly due to my lack of experience with awk and sed. The program should loop through file1, search file2, and write file3 for the first set (g+h=o+p), then repeat the same process for set 2 (a+b=c+d).
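One awk approach (a sketch, untested: it loads the strings from file1, then prefixes any file2 line that begins with one of them):

awk 'NR==FNR { pats[$0]; next }
     { for (p in pats) if (index($0, p) == 1) { $0 = "!" $0; break } }
     { print }' file1 file2 > file3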
I have a bunch of files on an Ubuntu box which have various characters in their filenames that Windows doesn't accept (mostly ":" and "*", but possibly others). What's the simplest way to get these all renamed and moved to a Windows machine? It's OK to replace these characters with something like "[colon]" and "[asterisk]".
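A plain bash loop can do the substitutions before the transfer; a sketch handling only the two characters named in the question:

for f in *; do
    new=${f//:/[colon]}           # replace every colon
    new=${new//\*/[asterisk]}     # replace every asterisk
    [ "$f" != "$new" ] && mv -- "$f" "$new"
done

After that, the files can be copied over with Samba, scp, or a USB drive as usual.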
Is it possible to change only directories' access permissions recursively with some Linux command? I need to set x (access) permission on directories, but not execute on files. [URL]
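The usual trick is to restrict the chmod to directories via find (a sketch):

find /dir -type d -exec chmod a+x {} +

Alternatively, chmod's capital X applies execute only to directories and to files that are already executable:

chmod -R a+rX /dir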
I have a folder that contains my group's website. The ownership of the entire directory is set to "www-data.website" (website being a group). I want to set the setgid bit on this directory so that if anyone creates a new file, either in the main directory or in subdirectories, the group ownership stays as above.
Q1: I have the setgid bit set on the main directory (drwxrwsr-x), but for some reason some of the subdirectories don't have it set. Is there a command I can use to change the setgid bit on directories only (i.e., not on the files)? Q2: Is there a similar bit I can set for the owner (not the group) so that it is always www-data?
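For Q1, a find restricted to directories can add the setgid bit everywhere (a sketch; the path is a placeholder):

find /var/www/site -type d -exec chmod g+s {} +

For Q2, no: on Linux the setuid bit is ignored on directories, so owner inheritance can't be forced this way; it is usually handled by fixing ownership after the fact, e.g. from a cron job.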
I promise I am carefully studying shell use, and much else, but right now I just need the shell instruction to search directories recursively for files with .swo and .swp extensions, deleting the files as they're found.
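find can match both extensions and delete in one pass; a sketch (swap -delete for -print first to preview what would be removed):

find . -type f \( -name '*.swo' -o -name '*.swp' \) -delete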
I would like to zip only selected directories (and their child directories as well). I have many directories in the current folder, such as app, content, db, and library, but I would like to zip only app and content and their child folders. I am trying the following.
zip -r ../backups/code/20110625 -i app/* -i content/* . *
But I am getting the following error:
zip error: Invalid command arguments (nothing to select from)
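zip's -i expects the file arguments to come first and the include patterns to be quoted, so the shell doesn't expand them before zip sees them; something like this should work (a sketch):

zip -r ../backups/code/20110625.zip . -i 'app/*' 'content/*'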
I have a question which has been answered in part many times, but nothing I found relates completely to my situation. I am sure there will be people who say RTFM, but believe me, I did, and I searched as well, to no avail. I have a situation where I want to copy files created within the last hour in one directory into another one. The problem is that the directories are at different levels in the directory tree, so the absolute paths are different, but I want to keep the relative paths the same.
I want to copy new files from /mnt/path_to_webdav/user to /home/user. So if there is a new file /mnt/path_to_webdav/user/doc/xy.txt, I want it to be copied to /home/user/doc/xy.txt. Also, if there is a new directory, say /mnt/path_to_webdav/user/newdir, I want a new directory created at /home/user/newdir with all the files in it, should there be any. I can do find with exec and copy all the files into one directory, but this is not what I want. How do I preserve the relative path and get the files copied into their corresponding directories?
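find emits paths relative to the source if run from there, and cpio's pass-through mode rebuilds the directory structure at the destination; a sketch assuming GNU find and GNU cpio:

cd /mnt/path_to_webdav/user &&
find . -type f -mmin -60 -print0 | cpio --null -pdm /home/user

Here -p is pass-through mode, -d creates the needed directories, and -m preserves modification times.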
I want to move all files and directories that are 1 month old out to a separate backup folder. There will be a lot of files and I want to make sure they copy properly. The problem I'm having is integrating an MD5SUM check to verify integrity. md5sum is not recursive, so I figured it would work in a loop: as each individual file is copied, do an md5sum on it, and delete the original once it's verified that the copy is OK.
[Code]...
I also need some sort of error handling to output all md5s that didn't pass the hash check.
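A possible shape for the loop (a sketch; /data and /backup are placeholder paths, and the log file name is made up):

src=/data
dst=/backup
find "$src" -type f -mtime +30 -print0 |
while IFS= read -r -d '' f; do
    rel=${f#"$src"/}                      # path relative to the source
    mkdir -p "$dst/$(dirname "$rel")"     # recreate the directory structure
    cp -p "$f" "$dst/$rel"
    if [ "$(md5sum < "$f")" = "$(md5sum < "$dst/$rel")" ]; then
        rm -- "$f"                        # verified, safe to remove the original
    else
        echo "md5 mismatch: $f" >> md5_failures.log
    fi
done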
I can't make files executable anymore, using chmod or from the file properties dialog; it isn't working. As far as I remember, I didn't make any changes to the user settings, and my account has administrator rights. For a non-executable file type, say a .jpg or a .txt, it can be done, but it doesn't happen with a .sh, .py or any other executable. chmod-ing shows no error, but the file isn't executable. Through the GUI, when I check the box, it's immediately unchecked again.
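When chmod reports no error but has no effect, the files often sit on a filesystem that doesn't support Unix permissions (FAT/NTFS) or one mounted noexec; one thing worth checking (the path is a placeholder):

df /path/to/script    # shows which filesystem the file lives on
mount                 # look for noexec, vfat, or ntfs on that filesystem's line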
I am in need of Linux help. I am at college and I need this backup/restore script to pass the final part of an assessment. I require a backup script that will not only back up but also restore files to the relevant directories. E.g. users are instructed to store all word-processor files in a directory named wp. So I need to create a backup directory, and 3 directories within it, and some files within those 3 directories, and then back them up and restore them. I know I should/have to do this myself, but I have been trying to get/understand the info for the last few days and came up with zero.
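One minimal shape for such a script, using tar (a sketch; the directory names other than wp are invented for illustration):

#!/bin/bash
# usage: ./backup.sh backup|restore
case "$1" in
    backup)  mkdir -p ~/backup && tar czf ~/backup/work.tar.gz wp dir2 dir3 ;;
    restore) tar xzf ~/backup/work.tar.gz ;;
    *)       echo "usage: $0 backup|restore" >&2; exit 1 ;;
esac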
I need to, through a bash script, go through a given directory (given as argument 1) and list the relative path within this directory (including $1) for each subdirectory which contains files. Directories which contain only . and .., or only subdirectories, SHALL NOT be listed. It is this last requirement that makes it difficult for me.
I have been using the tree command so far, but I have not found an easy way to ignore paths to directories which contain only other subdirs, or nothing at all. I could of course test each directory after it is listed, but this adds an extra loop, and I believe it should be possible to do it directly when creating the list. I guess it should be possible using find or ls in conjunction with the tree command, or by themselves, but I am not that conversant with nested script commands.
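GNU find can print just the parent directory of every regular file, and sort -u collapses the duplicates; directories containing only subdirectories (or nothing) never appear at all (a sketch):

find "$1" -type f -printf '%h\n' | sort -u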
I hope this post is in the right section. I have a command line I need to enter in a terminal when I want to run a program. I thought: let's put that piece of command in an .sh file and just click the file to run the program (then I don't need to open a terminal first and give the command). However, the .sh file does not open the program, so I probably need to make it executable (application/x-executable).
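Two things are typically needed: a shebang line so the system knows which interpreter to run, and the execute bit. A sketch (the file name and wrapped command are placeholders):

#!/bin/sh
# runit.sh -- wraps the long command line
exec your-program --with --its-options

Then mark it executable with chmod +x runit.sh. Depending on the file manager's settings, double-clicking may still ask whether to run or display the file.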
When I am in Nautilus, I want to be able to select a directory, then right click (or some other action) to do a file find on that directory. The gnome-search-tool would be a good candidate for this, if it could be an action in Nautilus. I know I can do a file find through other means, but Nautilus seems to be where I am when I want to search directories.
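Nautilus runs any executable placed in its scripts directory from the right-click menu, exporting the selection through environment variables; a sketch (the script name is invented, and the --path option is assumed from gnome-search-tool's help output):

#!/bin/sh
# save as ~/.gnome2/nautilus-scripts/Search-Here and chmod +x it;
# Nautilus puts the selected item's path in this environment variable
gnome-search-tool --path="$NAUTILUS_SCRIPT_SELECTED_FILE_PATHS"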
What I wanted to do was find all the files with a specific name in a tree, sort them by modification time, and have their directory shown alongside so that I knew where they were (because they all have the same name). I tried a whole bunch of different things and finally did this:
[Code]...
This did the trick pretty well, but as you can see it is far from elegant, and I think I'm doing some things wrong and kludgy.
The first thing I tried was "ls -lRt | grep world.sav", which worked except that I couldn't distinguish the files because no directories were shown. It took a lot of looking until I accepted that I couldn't make ls print the directories as well and attach them to the files so that their relationship would be clear. I tried piping ls to find, doing it in reverse, passing the results through grep, etc., until I read some more stuff online that got me using gawk and sort. The questions:
1. Is there some other, more elegant and simple way to do this kind of detection and sorting?
2. Is there any way to use a pipe after using exec? The semicolon seems to prevent this entirely, forcing me to use an intermediate file as above. I could just remove it later, but I'd prefer straight piping.
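For question 1, GNU find can print the modification time next to the full path, which then sorts cleanly (a sketch):

find . -name world.sav -printf '%T@ %p\n' | sort -n | cut -d' ' -f2-

For question 2, -exec itself cannot end in a pipe, but the output of the whole find can be piped, as above; alternatively, -exec sh -c '... | ...' \; runs a small pipeline per file.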
I'm using bash under Ubuntu. Currently this works well for the current directory: catdoc *.doc | grep "specificword". But I have lots of subdirectories with .doc files. How can I search for, let's say, "specificword" recursively?
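find can feed every .doc in the tree through catdoc (a sketch); the second form also prints which file matched:

find . -name '*.doc' -exec catdoc {} + | grep "specificword"

find . -name '*.doc' -print0 | while IFS= read -r -d '' f; do
    catdoc "$f" | grep -q "specificword" && printf '%s\n' "$f"
done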
I am writing a script. My requirement is: all the file types are stored in one directory, and from there they need to be separated into different directories based on file type.
For example, a directory (anish) holds 5 items of different types: 1 subdirectory, 2 .txt files, and 2 .sh files.
My requirement is that the subdirectory is moved to one new directory (dir) given in the script, the 2 .txt files are moved to another new directory (test) given in the script, and the 2 .sh files are moved to a third new directory (bash) given in the script. Finally, the directory anish should be empty. How is this possible using a bash script?
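A plain loop over the entries can route each one by type; a sketch using the names from the question (untested):

#!/bin/bash
src=anish
mkdir -p dir test bash
for entry in "$src"/*; do
    if [ -d "$entry" ]; then
        mv -- "$entry" dir/          # directories go to dir/
    elif [[ $entry == *.txt ]]; then
        mv -- "$entry" test/         # .txt files go to test/
    elif [[ $entry == *.sh ]]; then
        mv -- "$entry" bash/         # .sh files go to bash/
    fi
done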
I've found myself in the situation where I need to create a menu in GNOME/KDE for a directory structure full of documents. The directory structure looks like this:
I am new to Linux and am trying to find files in subdirectories using the find command, as: find . -name '*.jpg' -type f. But I am unable to get a result, as the find command is not permitted by the server administrator. Is there any way to find files without using the find command?
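If the shell is bash 4 or later, the globstar option gives recursive globbing without find (a sketch):

shopt -s globstar
printf '%s\n' **/*.jpg

On older shells, ls -R | grep '\.jpg$' lists the matches, though without their directory paths attached.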
I have created a file named pm under the path /home/ppp, i.e. /home/ppp/pm. To make it executable, I used the command chmod a+x /home/ppp/pm while in the root directory. But when trying to run it from root by typing ./pm, or from within the directory /home/ppp, it displays that the directory is not found. Please provide a step-by-step procedure so that I can run my file from root or from its own directory.
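./pm is a relative path, so it only works from the directory that actually contains pm; from anywhere else, the full path is needed:

cd /home/ppp
./pm            # relative path: works only inside /home/ppp

/home/ppp/pm    # absolute path: works from any directory, including /root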