I've got a script where I have to parse out the last modified time for a large number of files. Piping the output of "ls" into "cut" seems to work most of the time, but the output is unpredictable. The "fields" argument doesn't find the date-modified columns consistently, and using a character count is just as unreliable, since the output can vary in width depending on the file name.
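A minimal sketch of the contrast described, assuming GNU coreutils and findutils; stat and find -printf report the timestamp in a fixed format instead of a shifting column:
Code:
# fragile: the column position shifts with owner, size and file name widths
ls -l | cut -d' ' -f6-8
# stable alternatives: ask for the modification time directly
stat -c '%y %n' *
find . -type f -printf '%TY-%Tm-%Td %TH:%TM %p\n'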
I am writing a bash script which uses the find command (with rm via -exec) and the option -mtime -15 or +15, so that files older than 15 days are deleted from the home directory. I think that -mtime +15 should be fine, but in which case does the option -mtime -15 really come in handy? Does it mean from 15 days ago up to today's date?
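A short sketch of how the two forms differ, assuming GNU find and the home directory as the starting point:
Code:
# +15: last modified more than 15 days ago ("older than 15 days")
find ~ -type f -mtime +15 -exec rm {} \;
# -15: last modified within the last 15 days (the "from 15 days ago until today" case)
find ~ -type f -mtime -15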
I need to delete all *.trc files that are older than 30 days and I am getting an "Argument list too long" error. There are other files that should not be deleted, which is why I am using "*.trc", and newer files need to be kept as well. I have seen other postings, but they do not cover both of the conditions. Below are 2 of my many attempts at doing this, but I cannot get it to work.
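The two attempts themselves are not shown here; a minimal sketch of one way to satisfy both conditions, with the directory path as a placeholder. Because find matches the pattern itself, the shell never expands a huge glob, which is what triggers the argument-list limit:
Code:
# /path/to/dir is a placeholder; only *.trc files older than 30 days are removed
find /path/to/dir -name '*.trc' -type f -mtime +30 -exec rm {} +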
I'm very, very tired, worked all night long, and I didn't sleep for hours... So I'm like a zombie now... half awake and half asleep.
I was trying to clear a directory. I ran cd to enter the directory, and then, before thinking, I ran this dangerous command on my Linux server: "find / -mtime +1 -exec rm {} \;"
I got a lot of:
Could this command have deleted something inside these directories? I'm afraid that after the next reboot the server won't start up...
I have some files on a server dated several months ago, but they are invisible to a `find -mtime 7` search. When I list them with `ls -l` they look perfectly normal:

-rw-r--r-- 1 root root    347253 Jun 12 16:26 pedia_main.2010-06-12-04-25-02.sql.gz
-rw-r--r-- 1 root root 490144578 Nov 24 16:26 gsmforum_main.2010-11-24-04-25-02.sql.gz

Does "find -mtime" not work as expected on files with different timezones?
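For reference, a short sketch of how the three -mtime forms behave with GNU find; the bare number matches only a single 24-hour window, which may explain the "invisible" files:
Code:
find . -mtime 7     # modified exactly 7*24 hours ago (a one-day window)
find . -mtime +7    # modified more than 7 days ago
find . -mtime -7    # modified within the last 7 days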
I have 4 Linux machines in a cluster. My target is to find every kind of IP address (xxx.xxx.xxx.xxx) in every file on the Linux system. Remark: I need to scan each file on the system, verify whether the file includes an IP address, and if yes, print the IP as in the following
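A minimal sketch of one way to do the scan on a single machine, assuming grep with extended regexes; the starting directory and the rough dotted-quad pattern are only illustrative:
Code:
# prints file:IP for every match; the pattern is a rough dotted quad, not a strict validator
grep -rEo '([0-9]{1,3}\.){3}[0-9]{1,3}' / 2>/dev/null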
I am trying to combine find, grep and wc to find matching files, print each filename, and then the count of a specific pattern per file. Here is my best (non-working) attempt so far:
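The attempt itself is not shown; a minimal sketch of one way to combine the three, where PATTERN and the starting directory are placeholders:
Code:
# list files that contain PATTERN, then print "filename count" for each
find . -type f -exec grep -l 'PATTERN' {} + | while read -r f; do
    printf '%s %s\n' "$f" "$(grep -o 'PATTERN' "$f" | wc -l)"
done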
Is there a way to tell find that I only want text files (and not binary files)? Grep has an option to exclude binary files, so I thought find probably has a similar feature, but I've been unable to find it.
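As far as I know find has no text/binary test of its own; a minimal sketch of a common workaround using the file(1) utility as an -exec test:
Code:
# keep only files whose MIME type starts with text/
find . -type f -exec sh -c 'file -b --mime-type "$1" | grep -q "^text/"' _ {} \; -print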
I know how to search for normal files, but can you let me know how to "search for 5 setuid files on the system, and also explain, for each file, why the setuid mechanism is necessary for the command to function properly"?
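A minimal sketch for the searching half, assuming GNU find; the per-file explanations would still need to be written by hand:
Code:
# -perm -4000 matches files with the setuid bit set; head keeps the first five
find / -type f -perm -4000 2>/dev/null | head -n 5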
I am new to Linux and trying to find a file in subdirectories using the find command as: find . -name "*.jpg" -type f. But I am unable to get the result, as the find command is not permitted by the server administrator. Is there any way to find files without using the find command?
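A minimal sketch of two alternatives that avoid find, assuming a bash shell (globstar needs bash 4+):
Code:
# bash globstar expands ** recursively
shopt -s globstar
ls -ld **/*.jpg
# or a recursive listing filtered with grep (shows basenames without their directories)
ls -R | grep '\.jpg$'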
I want to scan a particular directory recursively and run a particular command with each file as input. For this I am using "find /dir/path". I don't want to write a long script containing a loop over the output of "find"; I want a single command that will run a command on each file of the "find" output.
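A minimal sketch of the two usual single-command forms; mycommand is a placeholder for whatever is being run:
Code:
# run the command once per file found
find /dir/path -type f -exec mycommand {} \;
# or batch the arguments with xargs
find /dir/path -type f -print0 | xargs -0 mycommand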
#!/bin/sh
LOOK_FOR="NTLMAuthenticationFilter"
for i in `find ./ -name "*jar"`
do
  echo "Looking in $i ..."
  grepjar -e $LOOK_FOR $i
done

I wrote the script above, trying to find whether anything matching LOOK_FOR exists in those jars. My question is about the line `grepjar -e $LOOK_FOR $i`: how can I check whether there were any successful results, and output them?
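A minimal sketch of one way to do that inside the loop, assuming grepjar prints its matches to stdout; the variable name matches is just illustrative:
Code:
# capture grepjar's output and only report jars that produced matches
matches=$(grepjar -e "$LOOK_FOR" "$i")
if [ -n "$matches" ]; then
    echo "Found in $i:"
    echo "$matches"
fi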
What is the difference between *.xml and '*.xml' in the find command on Linux/Mac? The results of find . -name *.xml and find . -name '*.xml' are different. But why? Also, is locate '*.xml' better than find? Which one is the most commonly used?
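A short sketch of the usual explanation, assuming the difference being asked about is shell quoting of the pattern:
Code:
# unquoted: the shell may expand *.xml against the current directory before find runs
find . -name *.xml
# quoted: the pattern reaches find untouched and is matched in every subdirectory
find . -name '*.xml'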
How can I find info about and the contents of an SD card via the Linux terminal? I found the command "mount" on the internet. According to it, the card should be under /dev/sdb..., but I didn't find that; I saw /dev/sdc/...
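A minimal sketch, assuming the card really is /dev/sdc and that /mnt is a usable mount point (both are guesses; adjust to what lsblk shows):
Code:
# list block devices and any existing mount points
lsblk
# mount the card's first partition, then browse it
sudo mount /dev/sdc1 /mnt
ls /mnt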
I have this find command to find modified files and copy them:
Code:
find $SRC_DIR -type f -ctime -1 | xargs -i cp --parents {} $BACKUP_DIR/$DAY/
It works well, but I want to exclude files that live in certain folders, such as $SRC_DIR/cash and $SRC_DIR/woks/tmp.
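A minimal sketch of one way to skip those two folders, assuming GNU find and cp; -prune stops find from descending into the pruned paths:
Code:
find "$SRC_DIR" \( -path "$SRC_DIR/cash" -o -path "$SRC_DIR/woks/tmp" \) -prune \
    -o -type f -ctime -1 -print0 | xargs -0 -I{} cp --parents {} "$BACKUP_DIR/$DAY/"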
If someone has done something wrong on a shared Linux machine, and I want to find out who that person is, or the IP address from which it was done, what are all the possible ways? One possibility I thought of was to get the PID of the command and then get other details from that PID.
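A minimal sketch of the PID idea plus login history, assuming the process is still running; 1234 is a placeholder PID:
Code:
# for a live process, ps shows the owning user, start time and command line
ps -o user,pid,lstart,cmd -p 1234
# login sessions and the hosts/IPs they came from
last
who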
It has been a long time since I used the command that displays the most widely used commands on the distribution. It was in the following format (I guess it was a combination of history, head, sort, grep or something like that):

50 ls -ltr
3 neat-tui
1 touch abc

I tried finding the command on Google but wasn't able to find it.
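One common pipeline of that shape, though the exact original command may have differed:
Code:
# count how often each command appears in the shell history, most frequent first
history | awk '{print $2}' | sort | uniq -c | sort -rn | head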
So I think I should do a fresh install, and it would be convenient if I knew the right commands to locate the files saved in /home on hdd a since 02_06_10 (not including all the hidden or deleted files), and then copy them to a memory stick using the Nautilus GUI.
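A minimal sketch of the locating step, assuming GNU find (for -newermt) and guessing that 02_06_10 means 2010-06-02; adjust the date as needed:
Code:
# regular (non-hidden) files under /home modified after the given date
find /home -type f ! -name '.*' -newermt '2010-06-02'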
My goal is to find all PDF files on a remote machine, so I resort to the useful command find. So I type find .pdf or find ".pdf" and I get nothing. I do the same on my machine and I get nothing. I do a regular search from the menu on my machine and I find quite a few PDF files. Would somebody please tell me what I am doing wrong?
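A minimal sketch of the usual invocation; find expects a starting directory plus a test, and a bare pattern is treated as a path to search:
Code:
# search from the current directory, matching on the file name
find . -type f -name '*.pdf'
# or case-insensitive across the whole system
find / -type f -iname '*.pdf' 2>/dev/null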
Is there any command in Linux which will find a particular word in all the files in a given directory (and the folders below it) and replace it with a new word?
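A minimal sketch, assuming GNU sed; OLD, NEW and the directory are placeholders:
Code:
# list files containing OLD, then edit them in place
grep -rl 'OLD' /path/to/dir | xargs sed -i 's/OLD/NEW/g'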
We often need to find an equivalent text command for a GUI operation. Just as an example, we click on a folder (say ABC) in the current directory in order to see the contents of that directory. The equivalent commands would be:

Code:
cd ABC
ls

Now the thing is that we often don't know what that equivalent command will be. So I want to know whether there is any way to find it out. What I want to do is perform any operation using the mouse in GUI mode (whatever operation it may be) and then look at a log file to see what I actually did last (or rather, what the command would have been if I had worked in text mode)...
If the filesystem is mounted with the noatime option, does it influence the behaviour of find -atime? I tested, and it looks like find is able to see the access time, but why should it if the filesystem is mounted with noatime? Or maybe it depends on the type of filesystem (I'm using XFS)?

EDIT: Looks like the answer is [URL]: If a file system has been mounted with this option, read accesses to the file system will no longer result in an update to the atime information associated with the file, as explained above. The importance of the noatime setting is that it eliminates the need for the system to make writes to the file system for files which are simply being read. Since writes can be somewhat expensive, this can result in measurable performance gains. Note that the write time information for a file will continue to be updated any time the file is written to.
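A small sketch for checking both sides of this, assuming an XFS mount and a placeholder file name:
Code:
# show the mount options in effect (noatime, relatime, ...)
mount | grep ' type xfs '
# show the access and modification times find would compare against
stat -c 'atime: %x  mtime: %y' somefile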
For searching for a file or directory I normally use the grep command. Could you kindly explain the difference between the grep and find commands? I have used both, but what are the differences between them? Are they the same, or is grep newer compared to find?
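A two-line sketch of the distinction: find matches file names and attributes, while grep matches text inside files.
Code:
find /etc -name hosts      # find: locate files by name or attributes
grep -r localhost /etc     # grep: search file contents for a pattern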