General :: Find A Proper Command To Move A Certain Set Of Files According To Date/time Range?
Mar 18, 2009
I'm trying to find a proper command to move a certain set of files according to date/time range. I am thinking that the command should be something like:
This looks good; the files expected to be seen are output: find /usr \( -newer /tmp/empty_file -a ! -newer /tmp/empty_file1 \) -print
But this shows me files that should not be output, and likewise when I replace ls with tar it tars a whole bunch of stuff I do not want: find /usr \( -newer /tmp/empty_file -a ! -newer /tmp/empty_file1 \) -exec ls -l {} \;
In the end I would like to replace the "ls -l" with "tar cvvfp some.tar {} \;", but can't figure out what is going wrong here.
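One likely culprit: when find matches a directory, ls -l lists that directory's contents and tar archives it recursively, so extra files appear. Restricting the match to regular files avoids this. A sketch, with hypothetical boundary times and GNU tar assumed for the --null/-T options:

Code:
# boundary timestamps (the times here are hypothetical)
touch -t 200903010000 /tmp/empty_file
touch -t 200903180000 /tmp/empty_file1
# -type f keeps directories out of the match, so ls/tar no longer pull in their contents
find /usr -type f \( -newer /tmp/empty_file -a ! -newer /tmp/empty_file1 \) -print
# archive the matches in one pass instead of invoking tar once per file
find /usr -type f \( -newer /tmp/empty_file -a ! -newer /tmp/empty_file1 \) -print0 \
    | tar cvvfp some.tar --null -T -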
I need to know all files modified within a date and time range. E.g.: all files modified on 20 April 2010 between 1100 and 1200 hrs. "find / -mtime +10 ! -mtime +11" — this I found for the date, but how do I include the time as well?
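If GNU find is available, -newermt takes a full timestamp and avoids the day-only granularity of -mtime. A sketch for the example window:

Code:
# files modified on 20 April 2010 between 11:00 and 12:00
find / -newermt "2010-04-20 11:00" ! -newermt "2010-04-20 12:00"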
I would appreciate help with how to extract the date and time from at command jobs. From what I can tell, the date and time are embedded in the file name (/var/spool/atjobs). I'd be using this information in a (bash) shell script.
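The spool file name encoding is implementation-specific, so a safer route for a bash script is atq, which prints each pending job's scheduled date and time. A sketch (the exact column layout can vary between at implementations):

Code:
# first field is the job id; the remaining fields are the scheduled date/time
atq | while read -r job when; do
    echo "job $job is scheduled for: $when"
done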
I'm looking for C++ code that searches for all files on the computer between two input dates (for example, 3.3.2011 and 11.4.2011) and copies every file in that range to a new folder. The user runs the program and inputs the dates and path (on a DOS system).
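The request is for C++, but the logic is easy to validate first as a shell sketch (GNU find assumed; the destination folder is hypothetical, and the end date is bumped by one day because the -newermt boundary is exclusive at midnight):

Code:
# copy every file modified between 3.3.2011 and 11.4.2011 into /new/folder
find / -type f -newermt "2011-03-03" ! -newermt "2011-04-12" \
    -exec cp {} /new/folder/ \; 2>/dev/null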
I know find can do what I am looking for, but I am wondering if there is an alternative way to find files on the filesystem either created before/after a certain point, or at a certain time.
Typically I rely on updatedb and locate for most of my file-searching needs. The issue with those tools, though, is that they only index directory and file names, and they only build a database of local directories, not anything mounted via CIFS/NFS or via -o loop (e.g., .iso images).
So if I need to find files created after yesterday across the entire system (local and remote filesystems), I am currently needing to use find.
What other tools, if any, would accomplish this in a similar fashion?
I have tried ls and grep, but that requires (in my attempts so far) multiple searches:
ls -lR | grep Aug | grep 10
ls -lR | grep Aug | grep 11
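Grepping ls output also matches "Aug" or "10" anywhere in a line (permissions, sizes, file names). A single GNU find invocation expresses the same two-day window directly; the year below is an assumption:

Code:
find / -newermt "2010-08-10" ! -newermt "2010-08-12"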
I have an Ubuntu NAS set up with two 1.5TB drives in a mirrored array. We recently needed more storage and will constantly be adding to this machine. We added two 2TB drives in a striped array. What I'd like to do is find all directories totaling 10GB+ on the mirrored array and move them over to the striped array, to free space on the mirrored array for smaller, more important data. I've tried:
So I was wondering: if I capture this output into a file (i.e. one entry per line), can anyone help me write a command which iterates through the file and moves the entries one by one to a specified directory?
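A sketch of both steps, assuming GNU du/awk and hypothetical mount points; note that the awk field split breaks on paths containing spaces:

Code:
# list top-level directories on the mirror totaling 10 GB or more,
# excluding the mount point itself so it is not moved
du -B1G --max-depth=1 /mnt/mirror | awk '$1 >= 10 && $2 != "/mnt/mirror" {print $2}' > /tmp/bigdirs.txt
# move each listed directory to the striped array, one per line
while IFS= read -r dir; do
    mv "$dir" /mnt/striped/
done < /tmp/bigdirs.txt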
I need a little help. I want to find all files with extensions "*.tar", "*.gz" and "*.zip" and move all those files into the "/opt/old" directory. I've tried this command:
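A sketch; the parentheses and the terminating semicolon must be escaped so the shell passes them to find rather than interpreting them:

Code:
find . \( -name '*.tar' -o -name '*.gz' -o -name '*.zip' \) -exec mv {} /opt/old/ \;

With GNU mv, -exec mv -t /opt/old/ {} + does the same while batching many files per mv invocation.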
I want to move all files and directories that are 1 month old into a separate backup folder. There will be a lot of files and I want to make sure they copy properly. The problem I'm having is integrating an MD5SUM into it to check integrity. MD5SUM is not recursive, so I figured it would work in a loop: as it copies each individual file, I'll do an md5sum on each file and delete that md5 once it's verified that the file copied OK.
[Code]...
I also need some sort of error handling to output all md5s that didn't pass the hash check.
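A sketch of the copy-and-verify loop under those requirements, with hypothetical source and destination paths (GNU find/md5sum assumed); files that fail verification are appended to a log:

Code:
src=/data/current            # hypothetical source tree
dst=/data/backup             # hypothetical destination
log=/tmp/md5_failures.log
find "$src" -type f -mtime +30 | while IFS= read -r f; do
    rel=${f#"$src"/}
    mkdir -p "$dst/$(dirname "$rel")"
    cp -p "$f" "$dst/$rel"
    if [ "$(md5sum < "$f" | awk '{print $1}')" = "$(md5sum < "$dst/$rel" | awk '{print $1}')" ]; then
        rm "$f"              # remove the original only after the hashes match
    else
        echo "$f" >> "$log"  # record every file that failed the hash check
    fi
done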
I have 10 files in .dat format, say A_MCDR.dat, B_MCDR.dat, and so on up to 10 .dat files. How do I convert these 10 .dat files to .txt files using a single command or script?
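If "convert" just means giving the files a .txt extension (a common reading, since .dat contents are often already plain text), a one-line rename loop is enough. A sketch:

Code:
# rename every .dat file in the current directory to .txt
for f in *.dat; do mv "$f" "${f%.dat}.txt"; done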
I'd like to change a file's modification date only, without changing the time. I'm aware of the touch command, but it seems to only allow changing both the date and time together, not one of them. Any ideas on an easy way to change a file's modification date without also changing its time? (I have a long list of files and would like to run one command to change them all.) Example: change a file's (month) timestamp from "2010-09-23 11:59:23" to "2010-10-23 11:59:23". Background: I accidentally set the wrong month on my camera and ended up with all photos having a modification timestamp with the wrong month.
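A sketch that reads each file's current timestamp with stat, rewrites only the month with a bash substitution, and feeds the result back to touch (GNU stat/touch assumed; the September-to-October change matches the example, and the glob is hypothetical):

Code:
for f in *.jpg; do
    t=$(stat -c %y "$f")            # e.g. "2010-09-23 11:59:23.000000000 +0200"
    touch -d "${t/-09-/-10-}" "$f"  # change only the month, keep the time
done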
I am trying to do a find/grep/wc command to find matching files, print the filename and then the word count of a specific pattern per file. Here is my best (non-working) attempt so far:
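One pitfall: grep -c counts matching lines rather than matches, so a per-file match count needs grep -o piped to wc -l. A sketch with a hypothetical pattern and file glob:

Code:
find . -type f -name '*.log' -exec sh -c '
    printf "%s: %s\n" "$1" "$(grep -o "PATTERN" "$1" | wc -l)"
' _ {} \;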
Is there a way to specify to find that I only want text files (and not binary files)? Grep has an option to exclude binary files, so I thought find probably has a similar feature, but I've been unable to find it.
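find has no built-in text/binary test, but -exec can serve as a predicate that asks file(1) for each candidate's MIME type. A sketch (GNU file assumed):

Code:
# the -exec acts as a test: a file is printed only if its MIME type is text/*
find . -type f -exec sh -c 'file -b --mime-type "$1" | grep -q "^text/"' _ {} \; -print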
I know how to search for normal files, but can you let me know how to search for 5 setuid files on the system, and also explain, for each file, why the setuid mechanism is necessary for the command to function properly?
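Setuid executables carry the 04000 permission bit, which find tests with -perm. A sketch listing five of them:

Code:
# -perm -4000 matches files with the setuid bit set
find / -perm -4000 -type f 2>/dev/null | head -5

Typical results include passwd, su, sudo, mount, and ping; each needs root privileges for work an ordinary user cannot do directly (passwd must rewrite /etc/shadow, ping traditionally needs a raw socket, and so on).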
We have a script that FTPs files 3 times a day, at 02:30, 04:00 and 13:00. Once the process runs, it puts a copy of the file sent in the processed folder. What I'm trying to do is check to see if the files are there and, if not, send an alert email. The file names are IVF_20100806_*.150, PLAZ_20100806_*.151, TRAN_20100806_*.152 and TRAN_20100806_*.151.
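A sketch of the check, assuming a hypothetical processed directory and a configured mail command; ls exits non-zero when a glob matches nothing, which flags the missing file:

Code:
dir=/path/to/processed            # hypothetical location
d=$(date +%Y%m%d)                 # e.g. 20100806
for pat in "IVF_${d}_*.150" "PLAZ_${d}_*.151" "TRAN_${d}_*.152" "TRAN_${d}_*.151"; do
    if ! ls $dir/$pat >/dev/null 2>&1; then    # $pat left unquoted so the glob expands
        echo "Missing file: $pat" | mail -s "FTP file alert" admin@example.com
    fi
done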
I'd like to measure network latency for an SNMP GET request. There is a free command line tool, time, which can be used to collect timing statistics for various commands. For example, it can be used with snmpget in the following way:

Code:
$ time snmpget -v 2c -c public 192.168.1.3 .1.3.6.1.2.1.2.2.1.10.2
IF-MIB::ifInOctets.2 = Counter32: 112857973
real 0m0.162s
user 0m0.069s
sys 0m0.005s

According to the manual, the statistics consist of the elapsed real time between invocation and termination, the user CPU time, and the system CPU time.
When we enter a folder, it takes some time to load, depending on the number of entries in the folder: more entries take more time, fewer entries correspondingly less. The delay in loading the folder varies because the folder's entries are read in advance. So what I want to know is: what is the maximum number of entries read in advance while opening a folder in Linux, and how can we calculate this?
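There is no single fixed count: directory entries are fetched in getdents64 batches whose size depends on the buffer the reading program passes to the kernel. One way to observe the batching for a given folder is strace (a sketch, assuming strace is installed):

Code:
# summarize how many getdents64 calls ls makes for one directory
strace -c -e getdents64 ls /usr/bin > /dev/null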
I have to copy and move files between two systems all the time. So when I am on system 1, I simply use the command
Code:
$ scp * system2:/some_directory
There are many files in the PWD of system1 with different extensions. Of all the files in the PWD on system1, I don't need a file called *residual.dat, as it is particularly big and wastes a lot of time copying. How can I make a shortcut so that every time I do scp, it copies everything but the *residual.dat file?
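rsync expresses the exclusion directly, and an alias makes it a one-word shortcut; a sketch (the alias name is hypothetical):

Code:
# copy everything in the current directory except *residual.dat
rsync -av --exclude='*residual.dat' ./ system2:/some_directory/
# one-word shortcut for repeated use
alias scpall="rsync -av --exclude='*residual.dat' ./ system2:/some_directory/"

Staying with scp is also possible via bash's extended globbing: shopt -s extglob, then scp !(*residual.dat) system2:/some_directory.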
I wrote this little script and I need some help. I am trying to achieve the following: every day I receive a new file in the /home/denis/MyData/ folder. I don't know what the file name will be, but I want to move any file that arrives there to the new location /media/DataBackup/Linux/backup/ (/media/DataBackup/ is an external 500GB USB drive), to automatically create a new folder with a date and time stamp every day, and then to move the content of /home/denis/MyData/ into the new folder with the current date stamp. So every day there will be a new folder containing only that day's files. My script is as follows:
cd /media/DataBackup/Linux/backup/
mkdir MyData_$(date +%Y%b%d_%HH%MM)   # this creates the folder MyData_<current date and time>
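A sketch of the complete script under the paths from the question; mkdir -p tolerates reruns, and quoting guards against odd file names:

Code:
#!/bin/bash
src=/home/denis/MyData
dest=/media/DataBackup/Linux/backup/MyData_$(date +%Y%b%d_%HH%MM)
mkdir -p "$dest"        # e.g. .../MyData_2010Aug06_14H30M
mv "$src"/* "$dest"/    # move today's arrivals into the stamped folder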
I want to search my Apache log for events which occurred, say, between 11:00 AM and 2:00 PM. I have got a few scripts/commands, but they are not conclusive: some of them try to do an exact match (awk), and for some I just get the pattern wrong (egrep).
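With the default Apache log format, the hour is the second colon-separated field of each line ([10/Oct/2010:13:55:36 ...]), so a numeric range test works where exact-match patterns fail. A sketch, assuming IPv4 client addresses (IPv6 adds colons) and a hypothetical log path:

Code:
# keep lines whose hour falls in 11:00-13:59, i.e. 11 AM to 2 PM
awk -F: '$2 >= 11 && $2 < 14' /var/log/apache2/access.log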
My goal is to find all pdf files on a remote machine, so I resort to the useful command find. So I type find .pdf or find ".pdf" and I get nothing. I do the same on my machine and I get nothing. I do a regular search from the menu on my machine and I find quite a few pdf files. Would somebody please tell me what I am doing wrong?
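find takes the name pattern via -name, and the pattern must be quoted so the shell does not expand it before find sees it. A sketch:

Code:
# all pdf files under /, discarding errors from unreadable directories
find / -type f -name '*.pdf' 2>/dev/null

Using -iname instead of -name also catches .PDF and other case variants.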
I have the following command which finds all files that have changed in the last day and lists them. How can I exclude hidden files like .bash_history?
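Hidden files (and anything under hidden directories) can be excluded with a path test; a sketch for the one-day search:

Code:
find . -type f -mtime -1 ! -path '*/.*'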
We recovered a large number of files from an HD I messed up. I am attempting to move large numbers of files of a given type (e.g. .txt, .jpg) into a folder by type, to more easily sort through them.
Here are the commands I have mainly been trying with various edits:
Code:
So far the most common complaint I have gotten is "missing argument to -execdir".
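"missing argument to -execdir" almost always means the terminating semicolon never reached find, usually because it was not escaped from the shell. A sketch of the per-type move, with hypothetical destination folders:

Code:
# the escaped \; terminates -execdir; unescaped, the shell consumes it
find . -type f -name '*.jpg' -execdir mv {} /sorted/jpg/ \;
# GNU alternative: batch many files per mv with -exec ... +
find . -type f -name '*.txt' -exec mv -t /sorted/txt/ {} +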