Fedora :: Configure Grep To Search Recursively Through All Files In A Directory?
Oct 1, 2010
Somewhere lurking is a file containing the default print resolution, which is not being overridden by printer settings or CUPS management. I've asked on the CUPS forum with no success.
So here's the question:
How can I configure grep to search recursively through all files in a directory, or if need be starting from root, to find the pattern "2880"? I've looked in the man page for grep and I can't see how to do it. Is grep the right tool to use for this?
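A minimal sketch of such a search with GNU grep's -r option (the starting directory here is just an example; searching from / works the same way but takes far longer):
Code:
grep -rl "2880" /etc 2>/dev/null    # -r recurses, -l lists only the names of matching files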
This should be a simple thing to accomplish, but I can't seem to figure it out. Essentially, I want a bash alias or function that will let me recursively grep the current directory. A while back I added this to my .bashrc:
Code:
alias rg="grep -r --exclude=*/.svn/* --exclude=*.swp"
This works fine (and also ignores any .svn and vim .swp files), and I can call it like:
Code:
rg foo *
However, 99.999% of the time, I am only interested in searching in the current directory, so the "*" is a bit redundant. Also, I would say 5-10% of the time, I am typing faster than thinking and forget the "*", so grep just sits there trying to read from stdin. It's a pretty minor thing, but ideally I'd like to be able to just type:
Code:
rg foo
I've tried creating a function to handle this:
Code:
function rg(){ grep -r --exclude=*/.svn/* --exclude=*.swp $1 *; }
but it behaves exactly the same as the alias above. Escaping the "*" with single quotes doesn't work, and neither does trying `pwd` (or even a hard-coded path) in its place.
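One possible sketch of a fix: drop the "*" entirely and point grep at "." (the current directory), passing all arguments through with "$@" so extra grep options still work. The --exclude-dir option assumes a reasonably recent GNU grep:
Code:
rg() { grep -r --exclude-dir=.svn --exclude='*.swp' "$@" . ; }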
I am new to Linux and I need to extract a lot of compressed files (in different formats, e.g. tar.gz, tar.bz2) which are in subdirectories, and I do not want to go into each subdirectory and extract each file because it will take a lot of time. Is there a way to extract all the files that exist in the directories and subdirectories with a "for loop", or is there a script that can do the job automatically?
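A rough sketch of one approach with GNU find, assuming each archive should be extracted into the directory it lives in (-execdir runs the command from the matched file's directory):
Code:
find . -name '*.tar.gz'  -execdir tar xzf {} \;
find . -name '*.tar.bz2' -execdir tar xjf {} \;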
I am new to Linux as well as awk, grep and sed. I need a find-and-replace one-liner or script that loops through an input file (file1), finds each entry in file2, and adds "!" in front of the found string.
Example:
file1 (input file):
g+h=o+p
a+b=c+d
file2 (file to search):
a+b=c+d1e105
x+y=z+s5e105
g+h=o+pabcdefg
t+r=w+qxvyderf
Output file (file3 should look like this):
!a+b=c+d1e105
x+y=z+s5e105
!g+h=o+pabcdefg
t+r=w+qxvyderf
I have tried many awk and sed methods of find and replace, but they did not work the way I wanted, mainly due to my lack of experience with awk and sed. The program should loop through file1, find the pattern in file2, and write the output to file3 for the first set (g+h=o+p), then repeat the same process for set 2 (a+b=c+d).
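A possible sketch using a bash loop around sed, assuming the patterns in file1 contain no characters special to sed's basic regular expressions beyond those shown (+, = and letters are all literal):
Code:
cp file2 file3
while IFS= read -r pat; do
    # prepend "!" to every line of file3 that starts with this pattern
    sed -i "s/^$pat/!&/" file3
done < file1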
I know grep can search recursively (i.e. through all subdirectories to the bottom of the directory tree), but is it possible to ask grep to only search, say, 3 levels down? That means the current directory, any directories in the current directory, and any directories within those.
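grep itself has no depth option, but one sketch is to let find enforce the depth limit and hand the files to grep (the pattern is a placeholder):
Code:
find . -maxdepth 3 -type f -exec grep -H 'pattern' {} +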
How can I search for those files which contain the word "AM_COLLECTION=22"? I need to know all the files with this string. (I know the grep command can do it, but either
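A minimal example that lists only the names of matching files (the starting directory is a placeholder):
Code:
grep -rl 'AM_COLLECTION=22' /path/to/search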
If I pass in /home, I would like for it to return 4 files. Or, bonus points if it returns 4 files, 2 directories. Basically, I want the equivalent of right-clicking a folder on Windows and selecting properties and seeing how many files/folders are contained in that folder.
How can I most easily do this? I have a solution involving a Python script I wrote, but why isn't this as easy as running ls | wc or similar?
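One sketch with find, counting files and directories separately (/home is the example path from the question):
Code:
find /home -type f | wc -l                 # number of files
find /home -mindepth 1 -type d | wc -l     # number of directories, excluding /home itself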
I'm on Linux. By default, other users can't read anything under my home directory. Let's say my home directory is /home/superman, and I tried to use
chmod +r /home/superman
to let others access files under my home directory, but it does not work.
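A possible fix, as a sketch: the change needs to be recursive, and directories also need the execute bit so that others can enter them. The capital X grants execute only to directories (and to files that are already executable):
Code:
chmod -R o+rX /home/superman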
I'm able to use the following to remove the target directory and, recursively, all of its subdirectories and contents:
Code:
find '/target/directory/' -type d -name '*' -print0 | xargs -0 rm -rf
However, I do not want the target directory to be removed. How can I remove just the files in the target, the subdirectories, and their contents?
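One sketch that keeps the top-level directory itself: -mindepth 1 makes find skip the starting directory, and -delete removes everything below it (GNU find; -delete processes contents before their directories):
Code:
find '/target/directory/' -mindepth 1 -delete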
I would like to overwrite files in a directory tree, recursively. The ones I would like to overwrite match the filename "x_alpha*.png" and have a size of exactly 456 bytes. Is there any way to search for these recursively in a directory tree, and overwrite them with a reference file, for example "e:\mydir\good.png"?
I am using Windows 7, but I have UnxUtils, so I can use those too. What I am looking for is something like this, generated automatically:
copy /y e:\mydir\good.png e:\mydir\ac\x_alpha0023.png
copy /y e:\mydir\good.png e:\mydir\efg\x_alpha0045.png
copy /y e:\mydir\good.png e:\mydir\h\x_alpha0248.png
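Assuming the UnxUtils port of find supports -size and -exec like GNU find (where -size 456c means exactly 456 bytes), one sketch would be:
Code:
find e:/mydir -type f -name 'x_alpha*.png' -size 456c -exec cp e:/mydir/good.png {} \;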
I would like to find and back up all *.mp4 files from /Pictures and its sub-directories and move them to a single directory on a remote host. I can find and move the files, but I don't want the directory structure, just the files placed in the remote directory.
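One sketch, assuming the remote directory is reachable over ssh (the host and destination path are placeholders); since every file is sent to the same target directory, the source tree structure is flattened automatically:
Code:
find /Pictures -type f -name '*.mp4' -exec scp {} user@remotehost:/backup/mp4/ \;
To move rather than copy, each file could be removed after a successful transfer, but that is worth testing carefully first.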
I am trying to find log files by date/month/year. I then want to copy these files to another directory whose name is the date (date/month/year) on which they were found.
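A rough sketch, with the paths and date format as assumptions: create a directory named after today's date, then copy into it the files modified within the last day:
Code:
dest="/backup/$(date +%d-%m-%Y)"
mkdir -p "$dest"
find /var/log -type f -mtime 0 -exec cp {} "$dest/" \;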
I have a directory listing with many subdirectories containing many files. I want to recursively search for the oldest 5 files starting from the base directory, not 5 from each subdirectory. I am writing a shell script which sorts them using
Code:
ls -lRtur | egrep "txt|jpg" > /tmp/file1
Now from this /tmp/file1 I want to sort the files the same way the ls -ltr command does, that is from the oldest file time to the newest. How do I sort based on the Linux timestamp? The files themselves also have Linux timestamps embedded in them, so I could sort after extracting those instead, if that is easier. My /tmp/file1 has entries like below.
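A possible alternative sketch that avoids parsing ls output: have GNU find print each file's modification time as seconds since the epoch, sort numerically, and keep the first five (oldest) entries:
Code:
find . -type f \( -name '*.txt' -o -name '*.jpg' \) -printf '%T@ %p\n' | sort -n | head -5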
I'm having a little trouble copying .txt files (only) from a directory and its subdirectories to another directory. I don't think cp -R will work here, since I don't want to copy the subdirectories themselves.
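One sketch with find (source and destination paths are placeholders); because every match is copied straight into the same destination, no subdirectory structure is created:
Code:
find /source/dir -type f -name '*.txt' -exec cp {} /dest/dir/ \;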
I have a mail.log file, from which I want to redirect only two search strings to a file: the sender, from=<example.sender@exampledomain.com>, and the size, size=4537.
In every case the sender string starts with from=< and the size string starts with size=.
What would be the grep command to redirect only the two search strings to a .txt file?
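One sketch using grep's -o option, which prints only the matching part of each line instead of the whole line (the output filename is arbitrary):
Code:
grep -oE 'from=<[^>]*>|size=[0-9]+' mail.log > result.txt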
I am using a script for general users to back up USB drives to LTO4 tapes. I wish to have some error checking to verify whether there is a tape in the tape drive:
if I do a sudo mt -f /dev/st0 status, I get back mt: /dev/st0: rmtioctl failed: Input/output error if there is no tape in the drive, or sudo mt -f /dev/st0 status
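A sketch of the check, assuming mt exits with a non-zero status when the ioctl fails, as the error above suggests:
Code:
if sudo mt -f /dev/st0 status >/dev/null 2>&1; then
    echo "Tape loaded, proceeding with backup"
else
    echo "No tape detected in /dev/st0" >&2
    exit 1
fi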
I have done a bunch of searches on this but the terms seem to get tangled in the more popular search of "colouring the output of grep / awk". I am trying to find a way to grep/awk through the output of a command to find text of a specific colour. The command's output has a range of colours signifying too many different things to specify using text, with colour being the only form of grouping.
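Terminal colours are just ANSI escape sequences embedded in the text, so one sketch is to grep for the escape sequence of the colour of interest (31 is red; the command name is a placeholder, and grep -P assumes GNU grep built with PCRE support):
Code:
somecommand | grep -P '\x1b\[31m'
Note that many programs drop their colour codes when output is piped, so the command may need to be forced to keep colouring (where it supports such an option).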
I need to search for the following pattern with GREP in a text file:
So I tried already:
But none of those works... I think probably because grep doesn't like the special character > in the middle of the search pattern.
In the end I just need to know whether grep found the pattern in the file or not, so it should give me back a 0 or a 1 when I check the value of the "$?" variable after running the grep command.
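A minimal sketch (the pattern here is a placeholder, since the real one isn't shown above): > needs no escaping inside single quotes and is not special to grep's regex syntax, and -q suppresses output so only the exit status matters:
Code:
grep -q 'left>right' file.txt
echo $?    # 0 if the pattern was found, 1 if not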
I have a partition that I mount as /data on all of the distros of my multi-boot machine. I am having a bear of a time figuring out the right way to address permissions/groups so that any distro can use it (or any removable drive). I tried (in Linux Mint) making a group '/data' and assigning the users on my machine to that group, then changing the permissions/groups of the files and folders in that mount to belong to the /data group; then I booted into Fedora 15, made the /data group there, and added the users to that group. I'm not sure that this approach will work (it doesn't seem to) or whether it's the best way to proceed. Some of the things I don't get are: what is the '1000' user and group? Is the user/group info on (in, or somehow attached to) the mount itself? Does this seem like a good way to do this? Is there a way to 'apply permissions to enclosed files' recursively through the Nautilus context menu?
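One common sketch for sharing a partition across distros is to rely on a numeric GID rather than a group name, since the number is what the filesystem actually stores (the GID and group name below are arbitrary examples and must be created identically on each installed distro):
Code:
sudo groupadd -g 1500 datashare       # same GID on every distro
sudo usermod -aG datashare "$USER"
sudo chgrp -R datashare /data
sudo chmod -R g+rwX /data             # X: execute bit only on directories
As for the '1000' question: on many distros the first regular user is given UID and GID 1000, and it is these numbers, not the names, that are recorded on the files.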
I've been trying to identify all files on my cut-down version of Damn Small Linux which contain the text string "User Agent:". Because it's only 120MB in its entirety, I'm quite happy to have grep search the whole system. I'm using this command, but it just generates errors, as you can see:
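The errors are most likely permission-denied and binary-file noise; one sketch is to list only matching file names and discard the error stream:
Code:
grep -rl 'User Agent:' / 2>/dev/null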
I want to pipe the output of a command into grep as the search TERM, rather than the text to be searched, like this for example
Code:
cat /var/log/auth.log | grep date "&b &d"
so that I only see the lines in auth.log for the current day... but obviously that line doesn't work. Is there a way to do this with grep, or even another command?
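One sketch uses command substitution instead of a pipe, so that date's output becomes grep's search term. This assumes the syslog-style "Oct  1" timestamps at the start of each auth.log line; %e pads single-digit days with a space to match that format:
Code:
grep "$(date '+%b %e')" /var/log/auth.log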
To search a string pattern in all files in a directory and its subdirectories, I am using:
Code:
grep -R "myclass::my-func(" mydirectory/
Now I want grep to search only specific file types, say *.cc. Please help me. I have read the manual of grep, but could not deduce any hint. Best Regards.
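A minimal sketch using GNU grep's --include option, which restricts a recursive search to file names matching a glob:
Code:
grep -R --include='*.cc' "myclass::my-func(" mydirectory/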