General :: Command To Find All Of The Files Which Have Been Accessed Within The Last 30 Days?
May 14, 2010: Command to find all of the files which have been accessed within the last 30 days?
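A minimal sketch of one common answer, assuming GNU find and a filesystem that actually records access times (mounts using noatime or relatime make -atime unreliable); the search path is a placeholder:
Code:
# files read (accessed) within the last 30 days
find /path/to/search -type f -atime -30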
We have one FTP server, and a number of users connect to it remotely. My requirement is that if a user has not connected to the server over FTP for 15 days, the account should be expired/locked automatically. Is it possible?
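A hedged sketch of one approach with standard tools: read each user's last login from lastlog and lock accounts idle for 15 days or more. The 15-day cut-off and running it from cron are assumptions, and FTP daemons that bypass PAM may not update lastlog at all:
Code:
#!/bin/bash
# lock any account whose last login (per lastlog) is 15 or more days old;
# note that accounts which have never logged in are also listed by lastlog
lastlog -b 15 | tail -n +2 | awk '{print $1}' | while read -r user; do
    usermod -L "$user"     # lock the password; use "chage -E0" instead to expire the account
done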
With the find command it is easy to find files that have been modified or accessed within a given period. When a file is created, the access time is the same as the modify time, but as soon as it is accessed (read), the access time changes while the modify time does not. I need to find files that have been accessed at all, i.e. files whose access time is newer than their modify time. How do I do that?
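find cannot compare a file's own atime against its own mtime directly, so one workaround is a shell loop over stat output; a sketch assuming GNU stat:
Code:
find . -type f -print0 | while IFS= read -r -d '' f; do
    atime=$(stat -c %X "$f")    # last access, seconds since the epoch
    mtime=$(stat -c %Y "$f")    # last modification
    [ "$atime" -gt "$mtime" ] && printf '%s\n' "$f"
done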
I am writing a bash script that uses find (with rm via -exec) and the -mtime option with +15 or -15, so that files older than 15 days are deleted from the home directory. I think -mtime +15 is what I need, but in which case does -mtime -15 come in handy? Does it mean files from 15 days ago up to today's date?
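The sign flips the comparison: +15 matches files whose age in whole days is greater than 15, -15 matches files modified within the last 15 days, and a bare 15 matches exactly 15 days. A short illustration (the search path is a placeholder):
Code:
find "$HOME" -type f -mtime +15 -exec rm -- {} \;   # delete: last modified more than 15 days ago
find "$HOME" -type f -mtime -15                     # list: modified within the last 15 days
find "$HOME" -type f -mtime 15                      # modified exactly 15 days ago (rounded down)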
I'm trying to truncate a Postfix Maildata directory for one of our users. I want to move any files older than <n> days to a new location while also copying the relevant directory structure, ideally in one command. I've used find to locate the files and mv to move them, but I can't figure out how to build the directory structure on the fly in the new location.
This is what I have so far:
Code:
find /Maildata/editor -mtime +42 -type f -exec mv -v {} /EditorEmailOld/ \; > ~/editor.txt
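That flattens everything into /EditorEmailOld/. One way to keep the directory layout is GNU cp --parents, copying first and deleting the originals only if the copy succeeded; a sketch reusing the paths from the post:
Code:
cd /Maildata/editor &&
find . -type f -mtime +42 -exec cp -v --parents -- {} /EditorEmailOld/ \; > ~/editor.txt &&
find . -type f -mtime +42 -delete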
Can I delete files in the trash that are older than 10 days with a terminal command?
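A sketch assuming the usual XDG trash location under the home directory (older desktops used ~/.Trash instead); the matching entries under Trash/info are removed too, since they become stale otherwise:
Code:
find ~/.local/share/Trash/files -mindepth 1 -maxdepth 1 -mtime +10 -exec rm -rf -- {} +
find ~/.local/share/Trash/info  -mindepth 1 -name '*.trashinfo' -mtime +10 -delete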
I am trying to combine find, grep, and wc to find matching files and then print the file name together with the count of a specific pattern in each file. Here is my best (non-working) attempt so far:
wc `find . ( -name "*.as" -o -name "*.mxml" ) -exec grep -H HeightResizableList {}` ;
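A sketch of a working version: let find select the files and run grep once per file, so the count and the name stay together (grep -c counts matching lines; the grep -o variant counts every occurrence):
Code:
find . \( -name '*.as' -o -name '*.mxml' \) -type f \
    -exec grep -c -H HeightResizableList {} \;
# or, counting every occurrence rather than matching lines:
find . \( -name '*.as' -o -name '*.mxml' \) -type f -exec sh -c \
    'printf "%s: %s\n" "$1" "$(grep -o HeightResizableList "$1" | wc -l)"' _ {} \;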
Is there a way to specify to find that I only want text files (and not binary files)? Grep has an option to exclude binary files, so I thought find probably has a similar feature, but I've been unable to find it.
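find itself has no text/binary test, so the usual workaround is to let grep -I or file(1) classify each candidate; a sketch:
Code:
# list files grep considers text: -I skips binary files, -l lists matching names,
# and the empty pattern matches every line, so any non-empty text file qualifies
find . -type f -exec grep -Il '' {} +
# or classify by MIME type with file(1)
find . -type f -exec sh -c 'file -b --mime-type "$1" | grep -q "^text/"' _ {} \; -print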
View 2 Replies View RelatedI know how to search for normal files but can you let me know " How to search for 5 setuid files on the system. Also explain, for each file, why setuid mechanism is necessary for the command to function properly"
I am using AWStats for my website reports, but for the last 3 days I did not run it. Now when I check AWStats and run "Update now", it only shows today's report; I cannot find the reports for the last 3 days.
How would I find out all the files a particular process accesses?
I am using Ubuntu 9.04.
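Two standard tools cover this: lsof shows what a running process has open right now, and strace logs every file-related system call a command makes. The PID and command name below are placeholders:
Code:
lsof -p 1234                                              # files currently open by PID 1234
strace -f -e trace=open,openat -o opened.log mycommand    # log every open() mycommand makes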
My goal is to find all PDF files on a remote machine, so I resort to the useful command find. I type find .pdf or find ".pdf" and I get nothing. I do the same on my machine and get nothing, yet a regular search from the desktop menu finds quite a few PDF files. Would somebody please tell me what I am doing wrong?
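find expects a starting directory followed by tests; a bare pattern is treated as a path, which is why find .pdf matches nothing. A sketch, with the remote host name as a placeholder:
Code:
find / -type f -iname '*.pdf' 2>/dev/null                   # whole system, case-insensitive
ssh user@remotehost "find /home -type f -iname '*.pdf'"     # the same search run on the remote machine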
What offline method is there for finding the number of days since a certain date? Example: how would someone find the number of days from 1-Jan-2003 to 7-Dec-2010? Could someone write a script that takes in the two dates and outputs the number of days?
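A sketch using GNU date: convert both dates to seconds since the epoch (in UTC, to sidestep daylight-saving offsets) and divide by 86400 seconds per day:
Code:
#!/bin/bash
# usage: ./days_between.sh 2003-01-01 2010-12-07   -> prints 2897
a=$(date -ud "$1" +%s)
b=$(date -ud "$2" +%s)
echo $(( (b - a) / 86400 ))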
I have searched for a way to copy files less than X days old and I found this: http://www.howtogeek.com/howto/ubunt...days-on-linux/ The syntax for deleting files less than 7 days old would be like this: find /path/to/files* -mtime -7 -exec rm {} \; I would like to copy the files to mntas instead, and I'm not sure what the syntax should be. Would this work? find /path/to/files* -mtime -7 -exec cp {} mntas \;
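Essentially yes, once the semicolon is escaped from the shell; with GNU cp, the -t form also lets find batch many files per cp invocation. The mntas destination below simply reuses the poster's name for the target directory:
Code:
find /path/to/files -type f -mtime -7 -exec cp -- {} mntas/ \;
# more efficient with GNU cp: name the target directory first and batch the arguments
find /path/to/files -type f -mtime -7 -exec cp -t mntas/ -- {} +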
I am a newbie in Linux, and I am writing a bash script to identify files that are exactly 7 days (a week) old. I tried find /var/backup -mtime +7 -exec ls -d {} \; but this also gives me files that are older than 7 days:
[root@proxy access]# find . -mtime +7 -exec ls -d {} \;
./access.log.1.gz
./access.log.2.gz
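+7 means strictly older than 7 days; for exactly one week old (age rounded down to whole days equals 7), drop the sign:
Code:
find /var/backup -type f -mtime 7 -exec ls -ld -- {} \;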
Is there a Linux command to find files changed in the last n seconds? It could be a shell script or a one-liner that we can run from the CLI.
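GNU find can compare against an arbitrary timestamp with -newermt, or against a reference file created with touch; a sketch using 30 seconds as the example value:
Code:
find /path -type f -newermt '30 seconds ago'
# the same idea with a reference file dated n seconds in the past
touch -d '30 seconds ago' /tmp/ref && find /path -type f -newer /tmp/ref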
I have the following command which finds all files that have changed in the last day and lists them. How can I exclude hidden files like .bash_history?
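One way is a -path filter that rejects anything with a /. component, which also skips files inside hidden directories; the poster's original command is not shown, so the find below is a stand-in:
Code:
find . -type f -mtime -1 -not -path '*/.*'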
View 3 Replies View RelatedI am trying to put together ksh shell and I am new with writing scripts.How do you write a command to delete any files if it's 30 days old and also it's not currently being locked?
find -type f /path/* -mtime +7 -exec rm {} \; Is this the best way to delete only files (not directories) within /path that are older than 7 days, or is there a better way?
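The paths should come before the tests, the glob is unnecessary, and with GNU find -delete avoids spawning one rm per file; a sketch:
Code:
find /path -type f -mtime +7 -delete
# portable equivalent: rm started once per batch of files
find /path -type f -mtime +7 -exec rm -- {} +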
I'm totally new to Linux; in fact, I'm a Windows admin. Scenario: I need to run a script that will automatically move files that are 30 days old from one particular folder to another particular folder.
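A minimal sketch; the source and destination paths are placeholders, and the commented line shows how the same command could be scheduled nightly from root's crontab:
Code:
find /path/to/source -type f -mtime +30 -exec mv -- {} /path/to/archive/ \;
# crontab entry to run it every night at 02:00:
# 0 2 * * * find /path/to/source -type f -mtime +30 -exec mv -- {} /path/to/archive/ \;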
I have a directory that many files are being written to, and I would like a script that, every 30 days, archives all of the old files into one single file and then removes those old files.
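A sketch of one way to do it: tar everything older than 30 days into a dated archive and delete the originals only if the tar succeeded. The directory path and archive name are placeholders, and the script could be run monthly from cron:
Code:
#!/bin/bash
cd /path/to/dir || exit 1
find . -maxdepth 1 -type f -mtime +30 ! -name 'archive-*.tar.gz' -print0 |
    tar --null -czf "archive-$(date +%Y%m%d).tar.gz" --files-from=- &&
find . -maxdepth 1 -type f -mtime +30 ! -name 'archive-*.tar.gz' -delete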
I'm trying to find a proper command to move a certain set of files according to a date/time range. I am thinking the command should be something like:
Code:
ls -l | grep 'date/time range' | mv /folder
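mv does not read file names from a pipe, and parsing ls output is fragile; GNU find can select a date/time range directly with -newermt and hand the results to mv (both timestamps below are just examples):
Code:
find /source -type f -newermt '2010-05-01 00:00' ! -newermt '2010-05-15 00:00' \
     -exec mv -t /folder -- {} +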
The find command does not seem to find all files in my directory hierarchy. My home directory is automounted from a server. The command to illustrate this is: find | sed -e 's/^\.\///' | sed -e 's/\/.*//' | sort -u. The result misses several directories. Likewise, a find of a particular file, such as find . -iname '*sample*' -print, where sample_file.txt resides in one of the directories missing from the first command, finds nothing.
I found this command that works great for finding and replacing a simple string with another in files located in a folder and all of its sub-folders.
Code: find . -name '*.php' | xargs perl -pi -e 's/OldText/NewText/g'
The problem I have is that I need to replace a more complex string. Old string: /mnt/stor6-wc2-dfw1/627896/982574/ New string: /mnt/stor8-wc2-dfw1/369587/302589/ I don't know how to do that, since / is what separates the old string from the new one, and the strings I want to replace contain / themselves. Also, I would like to know how to specify under which folder to replace the files; for example, I want it to search and replace in all files under the /var/www/mysite/htdocs folder.
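Perl's s/// accepts any delimiter, so choosing one that does not appear in the paths (here |) removes the need to escape the slashes, and the first argument to find limits the search to a particular tree. The -print0/-0 pair is added so that file names with spaces survive the pipe:
Code:
find /var/www/mysite/htdocs -name '*.php' -print0 |
    xargs -0 perl -pi -e 's|/mnt/stor6-wc2-dfw1/627896/982574/|/mnt/stor8-wc2-dfw1/369587/302589/|g'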
A few years ago I tried a Linux distro that ran in a browser, but I do not remember which one. I'm currently trying to find a distro where the X session can be accessed via the browser.
I have searched many forums but did not find a way to find a core file without using the find command.
I'm quite new to Linux, but I have configured a simple FTP server and it's working great. I have an FTP-Shared folder with upload and download subfolders, and under each I have identical category subfolders like mp3s, movies, software, etc. As people upload, I would like to create a crontab line that moves all the content under /FTP-Shared/upload/mp3/* older than 14 days to /FTP-Shared/downloads/mp3/ recursively (as with the cp command), but the timestamp must be checked on the first-level directory and not on the files beneath it, for example /mp3/Club Dance/CD1/Hallo world.mp3. This is how far I got: [root@clients ~]# /usr/bin/find /FTP_Shared/upload/Mp3s/ -depth -mindepth 1 -mtime +14 -type d -exec mv -f {} /FTP_Shared/download/Mp3s/ \; This command moves directories and files, but it does not handle them recursively the way I want.
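Restricting the match to the first level with -maxdepth 1 makes find test only the top-level directories' own timestamps, and mv then carries each whole tree across in one go; a sketch reusing the paths from the post, with a sample crontab line:
Code:
find /FTP_Shared/upload/Mp3s/ -mindepth 1 -maxdepth 1 -type d -mtime +14 \
     -exec mv -f -- {} /FTP_Shared/download/Mp3s/ \;
# crontab entry to run it once a day at 03:00:
# 0 3 * * * find /FTP_Shared/upload/Mp3s/ -mindepth 1 -maxdepth 1 -type d -mtime +14 -exec mv -f -- {} /FTP_Shared/download/Mp3s/ \;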
Is there any way to find core files without using the find command?
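Two common alternatives, sketched here: the locate database (if updatedb runs regularly) and recursive globbing in bash 4 or newer:
Code:
locate core | grep '/core$'        # names from the updatedb database ending in /core
shopt -s globstar
ls -l /**/core 2>/dev/null         # bash 4+ recursive glob (can be slow from /)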
Is there a way (I'm on KDE) to view files, hopefully in Dolphin but I'll take anything, sorted by the date at which they were last accessed?
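From a terminal this is a one-liner, which is at least a useful cross-check for whatever the file manager shows (and note that relatime/noatime mounts make access times approximate):
Code:
ls -ltu        # long listing sorted by last access time, newest first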
I recently installed 13.37 x64 and I'm attempting to get my video card going, but I'm currently having some problems with X11. After running xorgsetup I checked the xorg.conf file with vi, and the device was configured as "modesetting". I have the "nouveau" driver blacklisted and attempted to use the more stable "nv" driver with my GeForce 9800 GT, but the X server crashes, claiming several files cannot be found or accessed. The "vesa" driver produces a highly garbled screen and crashes as well.
The driver now claims a kernel module has possession of the video card, and the "nv" driver will no longer work. Does anyone have any idea why? This did not happen when I used version 13.1 x64 with the same card and had X11 configured with "nv" until I could install the Nvidia proprietary driver.
Edit (update): I reinstalled 13.37 again, and this time xorgsetup registered the X11 server with "nv" as the driver, yet it still required the "nouveau" driver to be edited in to actually enable X11. I'm going to try the SBo Nvidia packages later tonight or tomorrow anyway; hopefully it will work fine until then.