Programming :: Find Files Older Than X And Count Them?
Nov 29, 2010
I have log files that should be parsed and then deleted by a script on a regular basis. Sometimes things don't work for a variety of reasons and the log files sit and sit and are never dealt with. What I need is a small script that can give me the files older than X days and a count of those files.
What I have so far helps me take care of things manually, but I need a little automation in my life. Here is what I have. I can count all the files in the necessary directories recursively with this:
Code:
ls -laR | wc -l
And I can find all the files older than 10 days that haven't been deleted yet with this:
Code:
find /home/mike/logs -type f -mtime +10
But how do I put both of them into a script that will just give me the end number of each?
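A minimal sketch of such a script, using the paths and age from above (find | wc -l avoids the extra total and header lines that ls -laR adds to the count):
Code:
#!/bin/sh
LOGDIR=/home/mike/logs
DAYS=10

total=$(find "$LOGDIR" -type f | wc -l)                 # all files, recursively
old=$(find "$LOGDIR" -type f -mtime +"$DAYS" | wc -l)   # files older than X days

echo "Total files: $total"
echo "Files older than $DAYS days: $old"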
View 5 Replies
Feb 25, 2010
I'm working on a bash script that will go through a directory, find the sub-directories that have been created since the last time the script ran, count the results, and output that integer (most likely '1' or less per run) to a file. Given the circumstances, my previous (and very limited) experience with bash is not sufficient for me to pull this off. Some background, since it probably has bearing: my mail server stores files that it flags as viruses in a folder, and it creates a sub-directory for each virus that it quarantines. I want to count those subdirectories and graph them with MRTG, hence the script. I'm going to post what I've got so far and the purpose of it, because I'm told I have a very odd and efficient way of doing scripting.
[Code]...
But then it dawned on me that it wouldn't work, because I would have to skip the directories that have already been counted and count only the ones that haven't been. Given that the purpose of this is to generate a graph about every 5 minutes, using find won't work because, to my knowledge, it only finds things based on whole-day values, and I need it almost down to the minute.
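For what it's worth, find is not limited to whole days: -mmin works in minutes, and -newer compares against a marker file, which matches the "since the last run" requirement directly. A minimal sketch along those lines; the quarantine directory and marker path are assumptions, not taken from the post:
Code:
#!/bin/bash
QUARANTINE=/var/spool/quarantine     # assumed location of the virus folder
MARKER=/var/tmp/quarantine.last      # touched at the end of every run

# count sub-directories that appeared since the previous run
if [ -e "$MARKER" ]; then
    count=$(find "$QUARANTINE" -mindepth 1 -maxdepth 1 -type d -newer "$MARKER" | wc -l)
else
    count=$(find "$QUARANTINE" -mindepth 1 -maxdepth 1 -type d | wc -l)
fi
touch "$MARKER"
echo "$count" > /var/tmp/quarantine.count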
View 1 Replies
View Related
Sep 11, 2009
I am trying to do a find/grep/wc command to find matching files, print the filename and then the word count of a specific pattern per file. Here is my best (non-working) attempt so far:
wc `find . \( -name "*.as" -o -name "*.mxml" \) -exec grep -H HeightResizableList {} \;`
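A sketch of one way to get the filename and a per-file count of the pattern; grep -c counts matching lines, so grep -o piped into wc -l is used instead to count every occurrence (bash and GNU find assumed for -print0 / read -d ''):
Code:
find . \( -name "*.as" -o -name "*.mxml" \) -print0 |
while IFS= read -r -d '' f; do
    printf '%s: %d\n' "$f" "$(grep -o 'HeightResizableList' "$f" | wc -l)"
done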
View 10 Replies
View Related
May 31, 2011
I used the following command to pick out log files older than one day:
Quote:
find /opt/TimesTen/tt_transaction_log/ -name "mtsDB.log*" -mtime +1 -print
The following are the log files that currently exist. I have to delete files older than one day from this location, but when I use the above command it won't print the files older than one day. As I understand it, "-mtime" is modified time and "+1" means more than one day old. Am I correct?
Code:
-rw-rw-rw- 1 ablddb dba 268435456 May 30 17:11 mtsDB.log126985
-rw-rw-rw- 1 ablddb dba 268435456 May 30 17:17 mtsDB.log126986
-rw-rw-rw- 1 ablddb dba 268435456 May 30 17:23 mtsDB.log126987
[code].....
How can I print the log files older than one day?
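A likely explanation: find rounds file age down to whole 24-hour periods, so -mtime +1 only matches files more than two days old; files between 24 and 48 hours old fall under -mtime 1. Two sketches that catch everything older than 24 hours:
Code:
# age in whole days greater than 0, i.e. older than 24 hours
find /opt/TimesTen/tt_transaction_log/ -name "mtsDB.log*" -mtime +0 -print

# the same thing expressed in minutes (GNU find)
find /opt/TimesTen/tt_transaction_log/ -name "mtsDB.log*" -mmin +1440 -print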
View 3 Replies
View Related
Jan 11, 2011
I want to count the number of files in each sub folder of a directory structure. At the moment I can do:
ls -1R /Folder | wc -l
Which lists the item count for all the folders as one. I can do:
ls -1R /Folder
Which lists all the folders in the top level and all the items. Is there any way to get the list of folders and then item count for each folder?
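A sketch that prints each folder followed by the number of files directly inside it (drop -type f from the inner find to count sub-folders as items too):
Code:
find /Folder -type d | while IFS= read -r d; do
    printf '%s: %d\n' "$d" "$(find "$d" -maxdepth 1 -type f | wc -l)"
done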
View 1 Replies
View Related
Jul 8, 2009
I need to count files in a dir which were updated yesterday.
ls -lth | grep -i 'Jul 7' | wc -l
The dir holds files from the last 15 days and the total count is about 2067476. Would it be more efficient to count the files using Perl? I have developed the following Perl script making use of system().
Code:
#!/usr/bin/perl
@months = qw(Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec);
($sec,$min,$hour,$monthday,$mon,$year,$wday,$yday,$isdst) = localtime(time);
[code]....
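With that many files, skipping ls and the grep on a formatted date is probably a bigger win than switching languages. A sketch using GNU find alone, which selects files modified during yesterday's calendar day (the directory path is a placeholder):
Code:
# -daystart makes -mtime count from midnight, so -mtime 1 means "yesterday" (GNU find)
find /path/to/dir -maxdepth 1 -type f -daystart -mtime 1 | wc -l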
View 12 Replies
View Related
Apr 7, 2011
Word Count for all files in a directory
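A sketch, assuming "word count" means wc -w over every regular file in the directory tree (the path is a placeholder):
Code:
find /path/to/directory -type f -exec wc -w {} +
# wc prints a "total" line for each batch of files it is handed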
View 1 Replies
View Related
Feb 24, 2010
I have the following perl/DBI script:
Quote:
#!/usr/bin/perl
use DBI;
my ($db, $user, $pw) = ('dbname', '****', '***********');
my $dbh = DBI->connect("DBI:mysql:$db",$user,$pw) or die "Cannot connect to $db: $DBI::errstr
[code].....
The error message is
[Wed Feb 24 13:03:27 2010] myscript.cgi: DBD::mysql::st execute failed: Column count doesn't match value count at row 1 at myscript.cgi. [Wed Feb 24 13:03:27 2010] myscript.cgi: DBI::db=HASH(0x8a30c60)->errstr
View 2 Replies
View Related
Aug 26, 2010
I am new to Perl scripting and wrote a Perl script to read the directories and files, count the number of files in each directory, and generate a log file. The problem is that it is not printing anything to the log file. I am copying the script below.
#!/usr/local/bin/perl
$dir = 'C:\My Projects\Perl Scripts\New Folder';
$directory_count = 0;
$file_count=0;
[code].....
View 14 Replies
View Related
Jan 3, 2011
I am new to unix. I am looking for a script to delete files older than 7 days, but I also want to exclude certain directories (like arch, log, .....) and also some files with extensions (like .ksh, .ch, ..............) in directories and sub-directories.
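A sketch of the kind of find command that does this, using the directory names and extensions from the post as examples; run it with -print first and only add the delete step once the listing looks right:
Code:
find /path/to/start \
    \( -type d \( -name arch -o -name log \) -prune \) -o \
    \( -type f -mtime +7 ! -name '*.ksh' ! -name '*.ch' -print \)
# once verified, change -print to -exec rm -f {} + to actually delete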
View 4 Replies
View Related
Jul 13, 2011
I'm quite new to linux but I have configured a simple ftp server and it's working great. I have a FTP-Shared folder with upload and download subfolders. Under uploads and downloads I have identical category subfolders like mp3's, movies, software etc. in both. As the guys upload, I would like to create a crontab line that moves all the content under /FTP-Shared/upload/mp3/* older than 14 days to FTP-Shared/downloads/mp3/ recursively (like with the cp command), but the timestamp must be checked on the first-level directory and not on the sub-files, for example: /mp3/Club Dance/CD1/Hallo world.mp3. This is how far I got:
Code:
[root@clients ~]# /usr/bin/find /FTP_Shared/upload/Mp3s/ -depth -mindepth 1 -mtime +14 -type d -exec mv -f {} /FTP_Shared/download/Mp3s/ \;
This command moves the directory and files, but it does not do so recursively.
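A sketch that checks the timestamp only on the first-level directories and moves each one whole, sub-folders and all (paths taken from the command above). Dropping -depth and adding -maxdepth 1 keeps both the age test and the move at the top level; mv then carries everything underneath along:
Code:
find /FTP_Shared/upload/Mp3s/ -mindepth 1 -maxdepth 1 -type d -mtime +14 \
    -exec mv -f {} /FTP_Shared/download/Mp3s/ \;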
View 4 Replies
View Related
Oct 11, 2010
I found this command that works great finding and replacing a simple string to another in files located in that folder and all sub-folders.
Code: find . -name '*.php' | xargs perl -pi -e 's/OldText/NewText/g'
The problem I have is that I need to replace a more complex string, like this: Old string: /mnt/stor6-wc2-dfw1/627896/982574/ New string: /mnt/stor8-wc2-dfw1/369587/302589/ There I don't know how to do it, since the / is what separates the old from the new strings, and the strings that I want to replace have / in them. Also, I would like to know how to specify under which folder to replace the files; for example, I want it to search/replace all files under the /var/www/mysite/htdocs folder.
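A sketch covering both points: Perl's s/// accepts any delimiter, so picking one that does not appear in the paths (here '#') avoids escaping the slashes, and pointing find at the folder limits where the replacement runs. The strings and folder are the ones given above:
Code:
find /var/www/mysite/htdocs -name '*.php' -print0 |
    xargs -0 perl -pi -e 's#/mnt/stor6-wc2-dfw1/627896/982574/#/mnt/stor8-wc2-dfw1/369587/302589/#g'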
View 1 Replies
View Related
Dec 17, 2009
What I've got: from a crontab, run a script (I understand that part). This script needs to count the number of files in /outgoing/, subtract that from 30, and move that many files from /readycalls/. I need to keep the Asterisk outgoing queue full of .call files without having too many in there at any given time.
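A minimal sketch of such a top-up script, using the paths and the 30-file ceiling from above:
Code:
#!/bin/sh
current=$(find /outgoing/ -maxdepth 1 -type f -name '*.call' | wc -l)
needed=$((30 - current))

if [ "$needed" -gt 0 ]; then
    find /readycalls/ -maxdepth 1 -type f -name '*.call' | head -n "$needed" |
    while IFS= read -r f; do
        mv "$f" /outgoing/
    done
fi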
View 3 Replies
View Related
Jul 30, 2010
Is there any way to find the core files without using the find command?
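A couple of sketches that avoid find: locate depends on its updatedb database being reasonably current, and the globstar example assumes bash 4 or later:
Code:
locate -r '/core$'              # uses the locate database (may be stale)

shopt -s globstar               # bash >= 4
ls -l **/core 2>/dev/null       # recursive glob from the current directory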
View 1 Replies
View Related
Mar 31, 2010
I already know about Ubuntu Tweak but the list of kernels seems to show only my 9.10 kernels. I checked GRUB and the 9.10 kernels are linux 2.6.31-17 and 2.6.31-19, but (according to GRUB) the ones I am looking for should be version 2.6.28-17.
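If the goal is to see whether those 2.6.28 kernels are still installed as packages, dpkg can list them directly (a sketch; the version string is the one mentioned above):
Code:
dpkg -l 'linux-image-2.6.28*' 'linux-headers-2.6.28*' | grep '^ii'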
View 9 Replies
View Related
Sep 1, 2010
I am fairly new to linux, but I want to write a function to find any file given only a partial name. I can only use the sh shell and busybox applets for this. I could do something like the sad code below...
Code:
TEST_ONE=$(find /path/to/directory -name "*$1*")
TEST_TWO=$(find /path/to/directory -name "$1*")
TEST_THREE=$(find /path/to/directory -name "*$1")
[code]....
fi
I just made that up, but obviously it is pretty bad. I'm sure there is a much better way to do it, but I just can't think of one. I also would like the file to be found even if capital letters are used and the file name is all lower case.
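A sketch that stays within plain sh and common busybox applets; since busybox find may lack -iname, the case-insensitive match is done by grep -i on the paths instead (the search directory is a placeholder):
Code:
partial_find() {
    # list every file, then filter the paths case-insensitively on the partial name
    find /path/to/directory -type f 2>/dev/null | grep -i "$1"
}

partial_find "somepart"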
View 2 Replies
View Related
Oct 22, 2010
I have spent the last hour searching for a solution to this, but I can't get it to work. Here is what I am trying to do:
I have directories for different months in one folder. So for example Code: ../folder/Jan/
../folder/Aug/
etc. Some of the folders have a dot in front of the month as so: Code: ../folder/.Sep/
../folder/.Oct/
[Code].....
I am trying to find all the csv files EXCEPT those in a folder that has a dot. For example I want all the csv files in ../folder/Jan/ but I want none in ../folder/.Oct/.
I also want to exclude all the files in the /Aug/ folder that represent days 10-31.
Here is what I have so far:
Code:
find /some_path/folder/ \( ! -name "Aug[10-31]*.csv" ! -path "/.*/" -name "*.csv" \) | more
This command lists all the .csv files except those in the /Aug/ folder. So it just ignores the /Aug/ folder completely but lists every other .csv file.
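A likely cause: [10-31] is a character class (any one of the characters 1, 0, -, 3), not a numeric range. A sketch that keeps all .csv files, skips dot-folders, and drops only the Aug files for days 10-31; the AugDD*.csv naming is an assumption based on the post:
Code:
find /some_path/folder/ -name '*.csv' \
    ! -path '*/.*/*' \
    ! \( -path '*/Aug/*' \( -name 'Aug1[0-9]*.csv' -o -name 'Aug2[0-9]*.csv' -o -name 'Aug3[01]*.csv' \) \) \
    | more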
View 3 Replies
View Related
Feb 28, 2009
I'm trying to find ssh logs from up to 6 months ago. I can only access the /var/log/secure* logs up until the beginning of this month. Any way to find the older ones? Do they get archived somewhere else?
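How far back they go is decided by logrotate, which usually keeps only a handful of rotations and then deletes (or compresses) the rest. A sketch of where to look; the logrotate.d/syslog snippet is where /var/log/secure is typically configured on Red Hat-style systems:
Code:
ls -l /var/log/secure*                 # rotated copies sit next to the live log
zgrep sshd /var/log/secure*.gz         # search any compressed rotations
cat /etc/logrotate.conf /etc/logrotate.d/syslog   # retention settings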
View 4 Replies
View Related
Oct 19, 2010
I need to make sure a security line is added to every webpage on a site, and I'm trying to find out how to list only the filenames of the pages that are missing the text. awk or grep? So what I want is to list all files NOT containing the word 'securemasthead'.
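grep -L does exactly this: it prints the names of files that do not contain the pattern. A sketch; the site path and the *.html pattern are assumptions:
Code:
grep -rL 'securemasthead' --include='*.html' /path/to/site

# or driven by find:
find /path/to/site -type f -name '*.html' -exec grep -L 'securemasthead' {} +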
View 2 Replies
View Related
Dec 22, 2010
How do I find a word in different files in Linux?
Is there any command, like (find . / -name *****), that can search files on the system for a particular word?
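find matches file names, not contents; grep -r searches inside the files. A minimal sketch with a placeholder word and path:
Code:
grep -rl 'someword' /path/to/search    # -l prints just the names of matching files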
View 5 Replies
View Related
Mar 11, 2011
I am trying to check whether a nightly backup was successfully copied over, then rename it and curl it, but the check always passes even if the file is older than specified. From the command line it does what it should. Example is here:
Code:
find /backup -type f -mmin +4440 -exec echo "found" {} \;
- nothing returned (good). Then I change the time
[code].....
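A common cause, although the rest of the script is truncated here: find exits with status 0 even when nothing matches, so testing its exit status always passes. Testing its output is the usual fix; a sketch (-print -quit is GNU find):
Code:
if [ -n "$(find /backup -type f -mmin +4440 -print -quit)" ]; then
    echo "found"
fi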
View 4 Replies
View Related
Jun 16, 2010
I want to use kmalloc() to allocate contiguous memory in RAM, but I cannot seem to find the required header file(s), like linux/slab.h. I suppose I do not have the required library, and I certainly do not know what or where to look.
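linux/slab.h is part of the kernel headers, not of glibc, so it comes with the kernel development package rather than a user-space library. A sketch of how to check and what to install; the package names are the usual ones and may differ by distribution:
Code:
ls /lib/modules/$(uname -r)/build/include/linux/slab.h   # present once kernel headers are installed

# Debian/Ubuntu:
sudo apt-get install linux-headers-$(uname -r)
# Fedora/RHEL:
sudo yum install kernel-devel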
View 14 Replies
View Related
Nov 9, 2010
I have these, but they are not working.
size_t count_words(const std::string& s)
{
    std::istringstream is(s);
    return distance(std::istream_iterator<std::string>(is),
                    std::istream_iterator<std::string>());
}
and
std::string s; // word count
unsigned int wordno = 0;
while (getline(myfile, s))
{
    // add only the words found on this line; bumping wordno per line as well
    // would inflate the total
    wordno += count_words(s);
    std::cout << s << '\n';
}
myoutput1 << "Number of words : " << wordno << '\n';
myoutput1 << "\nWord" << " " << " Occurrence" << endl;
myoutput2 << "Number of words : " << wordno << '\n';
myoutput2 << "\nWord" << " " << " Occurrence" << endl;
View 6 Replies
View Related
Oct 12, 2010
I have a really deep directory tree on my Linux box. I would like to count all of the files in that path, including all of the subdirectories.
For instance, given this directory tree:
/home/blue
/home/red
/home/dir/green
/home/dir/yellow
/home/otherDir/
If I pass in /home, I would like for it to return 4 files. Or, bonus points if it returns 4 files, 2 directories. Basically, I want the equivalent of right-clicking a folder on Windows and selecting properties and seeing how many files/folders are contained in that folder.
How can I most easily do this? I have a solution involving a Python script I wrote, but why isn't this as easy as running ls | wc or similar?
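It is nearly that easy, as long as find does the walking instead of ls (whose -R output mixes in headers, totals, and blank lines). A sketch that reports both counts for a path such as /home; for the example tree above it prints 4 files, 2 directories:
Code:
printf '%d files, %d directories\n' \
    "$(find /home -type f | wc -l)" \
    "$(find /home -mindepth 1 -type d | wc -l)"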
View 5 Replies
View Related
May 3, 2010
I would like to count all the jpgs in my home folder. I need a command like this:
~$ sudo count -R /*/*.jpg
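A sketch of the real thing; -iname also catches .JPG, and no sudo is needed for your own home folder:
Code:
find ~ -type f -iname '*.jpg' | wc -l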
View 3 Replies
View Related
Oct 18, 2010
Code:
find "$SOURCEDIR" -type f -name "*$ITEM" -printf "%P\n"
I want to apply a shell script to the files output by the find command. How can I do this? There are multiple directories and also multiple files.
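Two sketches; myscript.sh is a stand-in for whatever script should run on each file. The -print0 / read -d '' form is the safer one when file names contain spaces:
Code:
find "$SOURCEDIR" -type f -name "*$ITEM" -exec ./myscript.sh {} \;

# or, with NUL-separated names:
find "$SOURCEDIR" -type f -name "*$ITEM" -print0 |
while IFS= read -r -d '' f; do
    ./myscript.sh "$f"
done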
View 14 Replies
View Related
Feb 3, 2011
I want to find all files with a .h or .c extension and print them on the screen. How can I do it with a bash script?
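A minimal sketch; run it from the directory you want to search, or replace the . with a path:
Code:
find . -type f \( -name '*.h' -o -name '*.c' \) -print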
View 5 Replies
View Related
Dec 20, 2010
So this is my code:
Code:
Modification of code I found here. It works, but I don't really know why.
Q1: Why is each filter hit counted only when the conditional is not true?
Q2: I've tried taking the file type, (.old), and putting it into a variable for better usability, but then the script fails.
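The original code block did not survive here, so this is only a guess at Q2: when the extension lives in a variable, the glob usually has to stay inside double quotes so the shell does not expand it before find sees it. A sketch:
Code:
ext=".old"
find . -type f -name "*${ext}" -print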
View 14 Replies
View Related
Mar 24, 2009
Can anyone provide me with the path where I can find the library files: stdio.h, sys/types.h, .......?
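On most systems these headers live under /usr/include; the compiler can also report the exact search path it uses. A sketch:
Code:
ls /usr/include/stdio.h /usr/include/sys/types.h

# ask gcc for its full include search path
echo | gcc -E -Wp,-v - 2>&1 | grep '^ /'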
View 4 Replies
View Related
Feb 1, 2011
I have written code on Linux that searches a long dictionary. I have used the hsearch() function, but the problem is that it does not work. This is my code:
Code:
// Search the count values from the dictionary.
#define _GNU_SOURCE
#include<stdio.h>
#include<stdlib.h>
#include<string.h>
#include<search.h>
#include<inttypes.h>
#include<math.h>
[Code]...
I open each DIC file, get the words from it, search the hash table, and extract the key. The problem with the above code is that it builds the hash table fine, but the search returns NULL. It should not return NULL in any case, because all the words from the DIC files are in the dictionary. I am not able to figure out why.
View 1 Replies
View Related