General :: Find Command With -mtime +15 Or -15 Days?
Feb 2, 2011
I am writing a bash script which includes the find command (and rm with -exec) using the options -mtime -15 or +15, so that files older than 15 days are deleted from the home directory. I think -mtime +15 should be OK, but in which case does the option -mtime -15 really come in handy? Does it go back 15 days from today's date?
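A short sketch of the usual semantics, assuming GNU find and that $HOME is the directory in question: -mtime +15 matches files last modified more than 15 days ago, while -mtime -15 matches files modified within the last 15 days (handy e.g. when backing up only recent changes).
Code:
# delete regular files last modified more than 15 days ago
find "$HOME" -type f -mtime +15 -exec rm {} \;
# list (not delete) files modified within the last 15 days
find "$HOME" -type f -mtime -15 -print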
I've got a script where I have to parse out the last modified time for a large number of files. Piping the output of "ls" into "cut" seems to work most of the time, but the output is unpredictable. The "fields" argument doesn't find the date-modified columns consistently, and using a character count is unreliable as well, since the output can vary in width depending on the file name.
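A more predictable approach than parsing ls, assuming GNU find and GNU stat, with /path/to/files standing in for the real directory:
Code:
# epoch mtime and path, one entry per line
find /path/to/files -type f -printf '%T@ %p\n'
# or per file with stat
stat -c '%Y %n' /path/to/files/*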
I need to delete all *.trc files that are older than 30 days, and I am getting an "Argument list too long" error. There are other files that should not be deleted, which is why I am using "*.trc", and newer files need to be kept as well. I have seen other postings, but they do not cover both of the conditions. Below are 2 of my many attempts at doing this, but I cannot get it to work.
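One way that avoids the shell expanding *.trc (which is what triggers "Argument list too long"), with /path/to/dir standing in for the real directory:
Code:
# -name filters only *.trc, -mtime +30 keeps newer files, and find
# passes each path to rm itself, so the shell never expands the glob
find /path/to/dir -type f -name '*.trc' -mtime +30 -exec rm -f {} +
# or, with GNU find
find /path/to/dir -type f -name '*.trc' -mtime +30 -delete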
I'm very very tired, worked all night long, and I didn't sleep for hours... So I'm like a zombie now... half awake, and half asleep.
I was trying to clear a directory. I ran cd to enter the directory, and then, before thinking, I ran this dangerous command on my Linux server: "find / -mtime +1 -exec rm {} \;"
I got a lot of:
Could this command have deleted something inside these directories? I'm afraid that after the next reboot the server won't start up...
I have some files on the server dated several months ago, but they are invisible to a `find -mtime 7` search. When I list them with `ls -l` they look perfectly normal:
-rw-r--r-- 1 root root 347253 Jun 12 16:26 pedia_main.2010-06-12-04-25-02.sql.gz
-rw-r--r-- 1 root root 490144578 Nov 24 16:26 gsmforum_main.2010-11-24-04-25-02.sql.gz
Does "find -mtime" not work as expected on files with different timezones?
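One thing worth checking, independent of timezones: without a sign, -mtime 7 matches only files modified exactly 7 twenty-four-hour periods ago, so months-old files never match. A quick comparison:
Code:
find . -mtime 7      # modified exactly 7 days ago (rounded)
find . -mtime +7     # modified more than 7 days ago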
I am using awstats for my website reports, but for the last 3 days I was not running awstats. Now when I check awstats and run "updatenow", it only shows today's report; I can't find the reports for the last 3 days.
What offline method is there for finding the number of days since a certain date? Example: how would someone find the number of days from 1-Jan-2003 to 7-Dec-2010? Could someone write a script that takes in the 2 dates and outputs the number of days?
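A minimal sketch, assuming GNU date is available (the -u flag keeps DST shifts from skewing the division):
Code:
# days between 1-Jan-2003 and 7-Dec-2010
echo $(( ( $(date -u -d 2010-12-07 +%s) - $(date -u -d 2003-01-01 +%s) ) / 86400 ))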
I'm trying to truncate a postfix Maildata directory for one of our users. I want to be able to move any files older than <n> days to a new location, while also copying the relevant directory structure. This should be doable in one command. I've used find to locate the files and mv to move them, but I can't figure out how to build the directory structure on the fly in the new location.
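A sketch using GNU cpio's pass-through mode, which recreates the directory structure as it copies; /path/to/Maildata and /new/location are placeholders, and it assumes file names without embedded newlines:
Code:
cd /path/to/Maildata
# copy files older than 30 days into the new location, recreating directories
find . -type f -mtime +30 | cpio -pdm /new/location
# once the copy is verified, remove the originals
find . -type f -mtime +30 -delete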
Can someone show me a simple method to find out number of days between 2 dates?
Example: How would I find the number of days from 25-12-2003 to 25-12-2010? Could someone write a line of code that takes in the two dates and outputs the number of days? Or is there a program that can do this?
I know of websites that do this but I'd also like an offline method.
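Besides GNU date (as in the sketch above), gawk can do the same offline with its mktime() function:
Code:
# mktime() returns epoch seconds; divide the difference by 86400
gawk 'BEGIN { d1 = mktime("2003 12 25 0 0 0"); d2 = mktime("2010 12 25 0 0 0"); print int((d2 - d1) / 86400) }'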
I have 4 Linux machines in a cluster. My target is to find every kind of IP address (xxx.xxx.xxx.xxx) in every file on the Linux system. Remark: I need to scan each file on the system, verify whether the file includes an IP address, and if yes, print the IP as the following
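A rough sketch, assuming GNU grep; the regex is loose (it also accepts numbers above 255), and /proc and /sys are skipped to avoid scanning pseudo-files:
Code:
# recursively scan every readable file, skip binaries, print file:IP
grep -rHoE --binary-files=without-match \
  --exclude-dir=proc --exclude-dir=sys \
  '([0-9]{1,3}\.){3}[0-9]{1,3}' / 2>/dev/null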
I am trying to do a find/grep/wc command to find matching files, print the filename and then the word count of a specific pattern per file. Here is my best (non-working) attempt so far:
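A sketch of one way to combine the three, with PATTERN and /path/to/dir as placeholders; grep -o emits one line per match, so wc -l counts occurrences rather than matching lines:
Code:
# print each matching file followed by the number of occurrences of PATTERN
find /path/to/dir -type f -exec sh -c \
  'n=$(grep -o "PATTERN" "$1" | wc -l); [ "$n" -gt 0 ] && printf "%s %s\n" "$1" "$n"' _ {} \;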
Is there a way to specify to find that I only want text files (and not binary files)? Grep has an option to exclude binary files, so I thought find probably has a similar feature, but I've been unable to find it.
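find itself has no text/binary test, but it can delegate the check to the file utility; a sketch, assuming a file(1) that supports --mime-type:
Code:
# keep only files whose MIME type starts with text/
find . -type f -exec sh -c 'file -b --mime-type "$1" | grep -q "^text/"' _ {} \; -print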
I know how to search for normal files, but can you show me how to search for 5 setuid files on the system, and also explain, for each file, why the setuid mechanism is necessary for the command to function properly?
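A sketch for the search part; -perm -4000 matches files with the setuid bit set, and typical hits such as passwd or sudo need it because they must act with root privileges (e.g. updating /etc/shadow) when run by an ordinary user:
Code:
# regular files with the setuid bit set; errors for unreadable dirs are discarded
find / -type f -perm -4000 2>/dev/null | head -5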
I am new to Linux and am trying to find a file in subdirectories using the find command: find . -name *.jpg -type f. But I am unable to get the result, as the find command is not permitted by the server administrator. Is there any way to find files without using the find command?
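Two possible workarounds, assuming bash 4+ for the first and a maintained locate database for the second:
Code:
# bash: ** matches any directory depth once globstar is enabled
shopt -s globstar
ls -l **/*.jpg
# or query the locate database instead of walking the filesystem
locate '*.jpg'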
I want to scan a particular directory recursively and run a particular command with each file as input. For this I am using "find /dir/path". I don't want to write a long script containing a loop over the output of "find". I want a single command which will allow me to run a command on each file of the "find" output.
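A sketch, with "process" standing in for whatever command should receive each file:
Code:
# run the hypothetical command "process" once per file
find /dir/path -type f -exec process {} \;
# or feed the names through xargs, null-terminated to survive odd filenames
find /dir/path -type f -print0 | xargs -0 -n1 process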
#!/bin/sh
LOOK_FOR="NTLMAuthenticationFilter"
for i in `find ./ -name "*jar"`
do
  echo "Looking in $i ..."
  grepjar -e $LOOK_FOR $i
done
I wrote the script above to check whether the name in LOOK_FOR exists in any of those jars. My question is about the grepjar -e $LOOK_FOR $i line: how can I check whether it returned any successful result, and output only those jars?
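A sketch, assuming grepjar follows grep's convention of exiting with status 0 when a match is found (worth verifying with your version):
Code:
#!/bin/sh
LOOK_FOR="NTLMAuthenticationFilter"
find . -name '*.jar' | while read -r i
do
  if grepjar -e "$LOOK_FOR" "$i" > /dev/null 2>&1
  then
    echo "Found $LOOK_FOR in $i"
  fi
done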
What is the difference between '*.xml' and *.xml in the find command on Linux/Mac? The results of find . -name '*.xml' and find . -name *.xml are different. But why? Also, is locate '*.xml' better than find? Which one is more commonly used?
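A brief illustration of the difference, which comes from who expands the pattern:
Code:
# the shell expands the unquoted glob first, so find only sees whatever
# .xml files happen to sit in the current directory (or errors out)
find . -name *.xml
# quoting passes the pattern to find itself, which applies it at every depth
find . -name '*.xml'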
How do I find info about and the contents of an SD card via the Linux terminal? I found the "mount" command on the internet. According to what I read it should be under /dev/sdb..., but I didn't find that; I saw /dev/sdc...
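A sketch, assuming the card really does show up as /dev/sdc with its first partition at /dev/sdc1 (check lsblk's output before mounting):
Code:
lsblk -f                      # list block devices, partitions and filesystems
sudo mount /dev/sdc1 /mnt     # mount the card's first partition
ls /mnt                       # browse its contents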
I have this find command to find modified files and copy them.
Code:
find $SRC_DIR -type f -ctime -1|xargs -i cp --parents {} $BACKUP_DIR/$DAY/
It works well, but I want to exclude files that live in some folders, like $SRC_DIR/cash and $SRC_DIR/woks/tmp.
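A sketch using find's -prune to skip those two directories before the -ctime test, keeping the rest of the pipeline as in the original:
Code:
find "$SRC_DIR" \( -path "$SRC_DIR/cash" -o -path "$SRC_DIR/woks/tmp" \) -prune \
  -o -type f -ctime -1 -print | xargs -i cp --parents {} "$BACKUP_DIR/$DAY/"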
If someone has done something wrong on a shared Linux machine and I want to find out who that person is, or the IP from which it was done, what are all the possible ways? One possibility I thought of was to get the PID of the command and then get other details from that PID.
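A few starting points, assuming the process is still running and the login records haven't rotated; <PID> is a placeholder:
Code:
last -a                          # recent logins with the originating host/IP
who                              # who is logged in right now, and from where
ps -o user,lstart,cmd -p <PID>   # owner, start time and command of a running process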
It has been a long time since I used the command that displays the most widely used commands on the distribution. The output was in the following format (I guess it was a combination of history, head, sort, grep or something like that):
50 ls -ltr
3 neat-tui
1 touch abc
I tried finding the command on Google but wasn't able to find it.
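A common sketch of that pipeline, assuming bash history is what should be counted:
Code:
history | awk '{print $2}' | sort | uniq -c | sort -rn | head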
So I think I should do a fresh install, and it would be convenient if I knew the refined commands to locate the files saved in /home on hdd a since 02_06_10 (not including all the hidden or deleted files), and then copy them to a memory stick using the Nautilus GUI.
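A sketch, assuming GNU find and that 02_06_10 means 2 June 2010; recent_files.txt is just an example name for the list, which could then be reviewed before copying with Nautilus or cp:
Code:
# files under /home modified since 2 June 2010, skipping hidden files and dirs
find /home -type f -newermt '2010-06-02' -not -path '*/.*' > recent_files.txt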
My goal is to find all PDF files on a remote machine, so I resort to the useful command find. I type find .pdf or find ".pdf" and I get nothing. I do the same on my machine and get nothing. I do a regular search from the menu on my machine and find quite a few PDF files. Would somebody please tell me what I am doing wrong?
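find expects a starting directory and a test; a bare pattern is treated as a path to search, which is why nothing comes back. A sketch:
Code:
# search the current tree, case-insensitively, for names ending in .pdf
find . -iname '*.pdf'
# or search the whole machine, discarding permission errors
find / -iname '*.pdf' 2>/dev/null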