I have a requirement to find the files whose names contain ack_reply. However, there are many other files in the same directory where these reside. I have to remove these files from the folder after 7 days and retain the others. So I tried to write the script below with the grep command.
find $directory -type f -mtime +7 | grep ack_reply
How can I pass this output to the -exec action?
If I were not using grep, my script would be:
find $directory -type f -mtime +7 -exec remove.sh {} \;
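A minimal sketch of one way to keep everything inside a single find invocation, assuming remove.sh takes one file path as its argument and that the -name pattern below matches how the file names actually look:
Code:
# assumes file names contain "ack_reply" and remove.sh accepts one path argument
find "$directory" -type f -name '*ack_reply*' -mtime +7 -exec remove.sh {} \;
Alternatively, the grep output can be fed to the script with xargs, e.g. find $directory -type f -mtime +7 | grep ack_reply | xargs -r -n1 remove.sh, though that variant breaks on file names containing spaces or newlines.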
I am using grep to filter out directories I am not interested in, like this:
Code:
svn stat | grep -v data/charts | grep -v lib/model | grep -v web/pics
It seems a bit "hacky". Is there a better way to specify more than one string to ignore, so that I don't have to chain multiple grep commands?
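A shorter equivalent, assuming these are literal path fragments to exclude, is to give grep several patterns at once, either with repeated -e options or with extended-regex alternation:
Code:
svn stat | grep -v -e data/charts -e lib/model -e web/pics
# or, equivalently:
svn stat | grep -vE 'data/charts|lib/model|web/pics'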
For searching for a file or directory I normally use the grep command. Can you kindly explain the difference between the grep and find commands? I have used both, but what is the difference between them? Are they the same, or is grep newer compared to find?
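Roughly speaking, find matches file names and attributes while grep matches text inside files; a small illustration (the paths and search strings are only placeholders):
Code:
find /etc -name 'hosts'                   # find: locate files by name under /etc
grep -r 'localhost' /etc 2>/dev/null      # grep: locate lines containing text inside files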
After typing "grep some_word" on terminal 6, the system doesn't do a thing; it just lets me type endlessly. I've tried "Esc", "q", [CTRL] + x, "exit", and no luck. I bet I'll kick my ass when you tell me, but at the moment I can't figure it out. Rebooting would probably solve the problem, but there must be a better way.
This has to also show the line count. I can get it to show the files but not the line count. What is the single command used to identify only the matching count of lines within files under the /etc directory that contain the word "HOST"? List only the files with matches and suppress any error messages.
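One possible answer, assuming GNU grep: -r searches /etc recursively, -c prints a per-file count of matching lines, 2>/dev/null hides permission errors, and the trailing grep -v drops files with zero matches so only files that actually match are listed:
Code:
grep -rc 'HOST' /etc 2>/dev/null | grep -v ':0$'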
When I use the find command, I almost always need to search the local drives. But I almost always have super large network shares mounted, and these are included in the search. Is there an easy way to exclude those in the find command, grep, and other similar commands? Example:
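One common approach, assuming the network shares sit on their own mount points, is to tell find not to descend into other filesystems, or to prune a specific mount path (the paths and names below are placeholders):
Code:
find / -xdev -name 'somefile'                                  # stay on the starting filesystem
find / -path /mnt/share -prune -o -name 'somefile' -print      # skip one specific mount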
I would like to know how to use the grep command to filter the log entries created between 3:00 PM and 4:30 PM from a bunch of logs covering the whole day, under different headings. The files resemble sar output in Linux.
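Plain grep is awkward for time ranges; a minimal sketch using awk instead, assuming each line begins with a zero-padded 24-hour HH:MM:SS timestamp in the first field (the file name is a placeholder):
Code:
awk '$1 >= "15:00:00" && $1 <= "16:30:00"' /path/to/logfile
The string comparison only works because zero-padded 24-hour times sort lexically in the same order as chronologically.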
I am trying to do a find/grep/wc command to find matching files, print the filename and then the word count of a specific pattern per file. Here is my best (non-working) attempt so far:
I want to know whether there is any method to grep particular data from a file without using the "cat --- | grep ' '" construct.... I need to use a system call for this functionality.
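For the shell part at least, grep can read files directly, so the cat stage is unnecessary; the pattern and file name below are placeholders:
Code:
grep 'pattern' /path/to/file
If the goal is to do this from a program rather than a shell, the usual route is to open and read the file and match the text in code, or to run the command through something like popen() in C; the original question does not say which language is involved.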
Is there a way to use the grep command in conjunction with an editor such as nano or vi, so that I can remove the commented-out lines from a conf file and then proceed to edit it? I can use grep -v "^#" squid.conf (for example), which gives me a nicely tidy conf file, but I can't edit it. Can this command be used with nano or vi?
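Two common approaches, reusing squid.conf from the example: write the filtered output to a separate file and edit that, or do the filtering inside vi itself:
Code:
grep -v '^#' squid.conf > squid.clean.conf && nano squid.clean.conf
# or, inside vi/vim, delete all comment lines with:
:g/^#/d
Note that the first variant edits a copy rather than the original file.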
Code:
man -k mail
Which lists commands that contain the keyword "mail" in their description. I want the output of this command in less and the words highlighted by grep. Something like
Code:
man -k mail | grep mail | less
The command doesn't work, how do I fix it?
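The pipeline itself is valid; if the problem is that the highlighting disappears, that is because grep turns colour off when its output is piped. A sketch that keeps the colour codes and tells less to render them:
Code:
man -k mail | grep --color=always mail | less -R
(The grep stage is redundant as a filter here, since man -k mail already matched the keyword; it is only doing the highlighting.)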
How can I search for the files which contain the word "AM_COLLECTION=22"? I need to know all the files with this string. (I know the grep command can do it, but either
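A minimal sketch listing just the matching file names, assuming the search should start from the current directory (adjust the path as needed):
Code:
grep -rl 'AM_COLLECTION=22' .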
The thing is that the sed command resembles the following:
[code]...
Now, if I want to place another command like grep or cut in the address field, how do I do it? I actually don't know the line number; the user has to give it as input. How shall I do that?
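Since the original sed command isn't shown, this is only a general sketch: a line number read from the user, or computed by another command such as grep and cut, can be placed in sed's address through a variable or command substitution (the file name and pattern are placeholders):
Code:
read -p 'Line number: ' n
sed -n "${n}p" file.txt                           # address supplied by the user
line=$(grep -n 'pattern' file.txt | head -1 | cut -d: -f1)
sed -n "${line}p" file.txt                        # address computed with grep/cut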
I want to pipe the output of a command into grep as the search TERM, rather than the text to be searched, like this for example
Code:
cat /var/log/auth.log | grep date "&b &d"
so that I only see the lines in auth.log for the current day... but obviously that line doesn't work. Is there a way to do this with grep, or even another command?
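One way to feed a command's output to grep as the pattern is command substitution; a sketch assuming syslog-style timestamps such as "Dec  5" at the start of each auth.log line (%e pads single-digit days with a space, matching that format):
Code:
grep "^$(date '+%b %e')" /var/log/auth.log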
I'm buying this unit from deal extreme: it's a BitTorrent downloader with NAS capability. I'm interested in sharing an external HD with it, for media and backup purposes. I'm afraid of power problems (don't know if this is the correct term) corrupting my mounted drives (like after a storm), so I thought about buying a UPS that sends a "signal" to my Linux box, and a script on my Linux box would unmount everything to avoid problems. Does this "UPS signal" feature exist? Do you have model suggestions?
Sometimes I type 'sudo su - user' in Linux and then realise I'm typing it from an application account rather than a user account. I want to press Ctrl+C to abort the password entry. When I do this, it always freezes for a couple of seconds before it aborts the process and returns me to the shell.
While I was trying to compile a C shared object library, I accidentally created two symbolic links which point to each other. Is there a way to get rid of them without nuking the whole directory? I read that the only way to break a symbolic link is to delete the file it points to, but I'm sure there must be another way.
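For what it's worth, rm (and unlink) operate on the link itself, never on its target, so circular links can be removed directly; the link names below are placeholders:
Code:
rm ./libfoo.so ./libbar.so     # removes the symlinks themselves, not any target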
I have ~200 C files in my makefile [$(SRCS)], and it compiles all of the files using a single gcc command. So each time I make a change in one C file, it ends up re-compiling all the files and then linking to make the binary. How can I break the compilation into individual gcc commands for each C file, so that make checks the timestamps and compiles only the modified files?
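A minimal sketch of the usual structure, assuming GNU make and a program name of myprog (the name and flags are placeholders): compile each .c to its own .o with a pattern rule, then link the objects, so only changed sources are rebuilt:
Code:
OBJS = $(SRCS:.c=.o)

myprog: $(OBJS)
	$(CC) $(CFLAGS) -o $@ $(OBJS) $(LDFLAGS)

%.o: %.c
	$(CC) $(CFLAGS) -c $< -o $@
(Recipe lines must be indented with a tab character.)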
The aim of this script is that when the folder reaches 20M, attributes will be set on that particular folder so that no new files or folders can be created in, or copied to, that sample folder. Whenever I copy a file of more than 20M to that folder, it gets copied fully and only then are the attributes applied. But I don't want this to happen; when the folder reaches its maximum, the current write operation to that folder should be stopped automatically with an error.
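A size check plus chattr can only react after the fact, which is why the oversized copy completes before the attributes take effect; the sketch below only illustrates that reactive approach, assuming the immutable attribute is what the script sets and that /path/to/samplefolder is a placeholder. Enforcing the limit during the write itself needs a filesystem-level mechanism, such as a quota or a fixed-size filesystem mounted at that folder.
Code:
size=$(du -sm /path/to/samplefolder | cut -f1)   # folder size in MB
if [ "$size" -ge 20 ]; then
    chattr -R +i /path/to/samplefolder           # make the folder and its contents immutable
fi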