I need to count the files in a directory which were updated yesterday.
ls -lth | grep -i 'Jul 7' | wc -l
The directory holds the last 15 days of files, with a total count of 2067476. Is it efficient to count the files using Perl? I have developed the following Perl script making use of system().
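Whatever the Perl approach looks like, find can do this directly, without parsing ls output (which depends on locale and breaks on odd filenames). A minimal sketch, assuming GNU find and that "yesterday" means the previous calendar day; the path is a placeholder:
Code:
# Count files modified yesterday (GNU find).
find /path/to/dir -maxdepth 1 -type f \
     -newermt 'yesterday 00:00' ! -newermt 'today 00:00' -printf '.' | wc -c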
I have log files that should be parsed and then deleted by a script on a regular basis. Sometimes things don't work for a variety of reasons and the log files sit and sit and are never dealt with. What I need is a small script that can give me the files older than X days and a count of those files.
What I have so far helps me take care of things manually, but I need a little automation in my life. Here is what I have. I can count all the files in the necessary directories recursively with this:
ls -laR | wc -l
And I can find all the files older than 10 days that haven't been deleted yet by doing this:
find /home/mike/logs -type f -mtime +10
But how do I put both of them into a script that will just give me the end number of both?
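A minimal sketch of such a script, using the log directory from the question (note that ls -laR | wc -l also counts directory headers, blank lines, and the . and .. entries, so find is used for both numbers here):
Code:
#!/bin/bash
# Report the total file count and the count of files older than 10 days.
LOGDIR=/home/mike/logs
total=$(find "$LOGDIR" -type f | wc -l)
old=$(find "$LOGDIR" -type f -mtime +10 | wc -l)
echo "total files: $total"
echo "older than 10 days: $old"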
#!/usr/bin/perl
use DBI;
my ($db, $user, $pw) = ('dbname', '****', '***********');
my $dbh = DBI->connect("DBI:mysql:$db", $user, $pw)
    or die "Cannot connect to $db: $DBI::errstr";
[code].....
The error message is:
[Wed Feb 24 13:03:27 2010] myscript.cgi: DBD::mysql::st execute failed: Column count doesn't match value count at row 1 at myscript.cgi.
[Wed Feb 24 13:03:27 2010] myscript.cgi: DBI::db=HASH(0x8a30c60)->errstr
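That error comes from the execute, not the connect: it means an INSERT names a different number of columns than the values it supplies. A hedged illustration (the table and column names are hypothetical):
Code:
# Hypothetical example run through the mysql client.
mysql -u user -p dbname <<'SQL'
-- Fails with "Column count doesn't match value count at row 1"
-- because three columns get only two values:
--   INSERT INTO hits (host, path, ts) VALUES ('example.com', '/index');
-- Works: three columns, three values.
INSERT INTO hits (host, path, ts) VALUES ('example.com', '/index', NOW());
SQL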
I'm working on a bash script that will go through a directory, find the sub-directories that have been created since the last time the script ran, count the results, and output that integer (most likely '1' or less per run) to a file. Given the circumstances, my previous (and very limited) experience with bash is not sufficient for me to pull this off. Since it probably has bearing: my mail server stores files that it flags as viruses in a folder, creating a sub-directory for each virus that it quarantines. I want to count those subdirectories and graph them with MRTG, hence the script. I'm going to post what I've got so far and the purpose of it, because I'm told I have a very odd and efficient way of doing scripting.
[Code]...
But then it dawned on me that it wouldn't work, because I would have to skip the directories that have already been counted and count only the ones that have not. Given that the purpose of this is to generate a graph about every 5 minutes, using find won't work because, to my knowledge, it will only find things based on whole-day values; I need it almost down to the minute.
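For what it's worth, GNU find can compare against a reference file with -newer, which gets around the whole-day limitation. A minimal sketch using a timestamp file as the "last run" marker (paths are placeholders; -newer compares modification time, and a directory's mtime is set when it is created):
Code:
#!/bin/bash
# Count sub-directories created since the last run of this script.
QUARANTINE=/var/spool/quarantine
STAMP=/var/tmp/quarantine.stamp
[ -e "$STAMP" ] || touch -d '1970-01-01' "$STAMP"
count=$(find "$QUARANTINE" -mindepth 1 -maxdepth 1 -type d -newer "$STAMP" | wc -l)
touch "$STAMP"    # mark this run for next time
echo "$count"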
I am new to Perl scripting and wrote a Perl script to read the directories and files, count the number of files in each directory, and generate a log file. The problem is that it is not printing anything to the log file. I am copying the script below.
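Whatever is in the script, a common culprit for an empty log is opening it in the wrong mode or never closing the filehandle. For comparison, the same task sketched in shell, with a placeholder base path:
Code:
#!/bin/bash
# Count the files in each sub-directory and write the counts to a log.
BASE=/path/to/base
for d in "$BASE"/*/; do
    n=$(find "$d" -maxdepth 1 -type f | wc -l)
    printf '%s: %d files\n' "$d" "$n"
done > count.log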
Here is what I've got: from a crontab, run a script (I understand that part). This script needs to count the number of files in /outgoing/, take 30 less that number, and move that many files from /readycalls/. I need to keep the Asterisk outgoing queue full of .call files without having too many in there at any given time.
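A minimal sketch of that top-up logic, assuming the two directories from the question (parsing ls is tolerable here because .call filenames normally contain no whitespace):
Code:
#!/bin/bash
# Keep /outgoing/ stocked with up to 30 .call files from /readycalls/.
count=$(ls /outgoing/*.call 2>/dev/null | wc -l)
need=$(( 30 - count ))
if [ "$need" -gt 0 ]; then
    ls /readycalls/*.call 2>/dev/null | head -n "$need" | \
    while read -r f; do
        mv "$f" /outgoing/
    done
fi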
I have a lot of PDF files and I want to convert them to a lower quality for the web. I tried to use the following command (using Ghostscript):
Code: gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/default -dNOPAUSE -dQUIET -dBATCH -sOutputFile=output.pdf input.pdf
Is there a way to make a batch to do this for all PDFs in a folder?
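A simple loop over the folder does it; this sketch writes the converted copies into a sub-folder so the originals aren't overwritten (for smaller web output, -dPDFSETTINGS=/screen or /ebook compresses harder than /default):
Code:
#!/bin/bash
# Re-compress every PDF in the current folder into ./web/.
mkdir -p web
for f in *.pdf; do
    gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/default \
       -dNOPAUSE -dQUIET -dBATCH -sOutputFile="web/$f" "$f"
done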
I want to delete all files within a specific folder without actually deleting the folder itself; what is a good bash command for this? I found this one, but encountered some errors even though I am executing it within the specific folder:
user@debian:/home/user/folder# find . -type f -exec rm -rf {} ;
[1] 5052
user@debian:/home/user/folder# find: missing argument to `-exec'
[1]+  Exit 1    find . -type f -exec rm -rf
The command as it appears is:
find . -type f -exec rm -rf {} ;
How do I delete only the files contained within the folder called "folder", for example?
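The error is the unescaped semicolon: the shell consumes a bare ; as a command separator before find ever sees it, so -exec is left without its terminator. Escaping it (or using +) fixes the command, and rm -f is enough since only files are matched:
Code:
# Delete only the files under the current folder, keeping the directories.
find . -type f -exec rm -f {} \;
# Equivalent but faster: batch many names into each rm invocation.
find . -type f -exec rm -f {} +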
I have a script that checks a folder for zip files then moves them to a different folder. I want to check every 5, maybe 10, seconds, and since cron's smallest increment is 1 minute, I'm not sure how to do that time check, probably as a loop within the script. One other thing: once the time check is in the script, how would a cron job be set up to run it? Once the script is running, cron doesn't need to run it again; is there a feature to check whether it's running and, if it's not, run it?
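One common pattern is an endless loop inside the script plus a cron entry that only starts it when it isn't already running. A hedged sketch (paths and the script name are placeholders):
Code:
#!/bin/bash
# movezips.sh -- poll for zip files every 10 seconds, forever.
while true; do
    mv /incoming/*.zip /processed/ 2>/dev/null
    sleep 10
done
Then a crontab entry such as
Code: * * * * * flock -n /tmp/movezips.lock /usr/local/bin/movezips.sh
tries to start it every minute, but flock -n exits immediately whenever another copy already holds the lock, so at most one instance ever runs.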
Can someone please help me with how to create a script that will monitor file creation in a single folder and move each newly created file to a separate folder? Only the newly created files must be transferred or copied to the other folder; the old ones remain. I urgently need this for production deployment.
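inotifywait from the inotify-tools package is built for exactly this: it blocks until the kernel reports a new file, so nothing is polled and pre-existing files are never touched. A minimal sketch with placeholder paths (close_write is used rather than create so a file is only moved once it has been fully written):
Code:
#!/bin/bash
# Move each file to /destination as soon as it is finished in /watched.
inotifywait -m -e close_write --format '%f' /watched | \
while read -r name; do
    mv "/watched/$name" /destination/
done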
Is there a way to test ownership of all files/folders in a folder, recursively, and then, if they are not user:user, do:
Code: chown user:user thatconcernedfile
The problem with that
Code: chown user:user -R /folder
is that it changes ownership on files which are already OK. If you want to maintain specific ownership on a folder, this is really not good:
Code:
while [ 1 ] ; do
    chown user:user -R /folder    # /folder contains 6.0 TB
    sleep 2s
done
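find can restrict the chown to just the entries whose owner or group is wrong, which avoids rewriting 6 TB worth of inode metadata on every pass:
Code:
# Re-own only the files and directories that are not already user:user.
find /folder \( ! -user user -o ! -group user \) -exec chown user:user {} +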
If I pass in /home, I would like for it to return 4 files. Or, bonus points if it returns 4 files, 2 directories. Basically, I want the equivalent of right-clicking a folder on Windows and selecting properties and seeing how many files/folders are contained in that folder.
How can I most easily do this? I have a solution involving a Python script I wrote, but why isn't this as easy as running ls | wc or similar?
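It very nearly is that easy; the catch is that ls | wc counts headers, directories, and symlinks along with plain files. find separates the two kinds cleanly:
Code:
# Count files and directories under a given path (recursively).
dir=/home    # placeholder
printf '%d files, %d directories\n' \
    "$(find "$dir" -type f | wc -l)" \
    "$(find "$dir" -mindepth 1 -type d | wc -l)"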
I have written some code on Linux that searches a long dictionary. I have used the hsearch() function, but the problem is that it does not work. This is my code:
//Search the count values from the dictionary.
I open each DIC file, get the word from it, search the hash table, and extract the key. The problem with the above code is that it is able to build the hash table, but it returns NULL when searching. It should not return NULL in any case, because all the words from the DIC files are in the dictionary. I am not able to figure out why.
And I'm trying to count the number of slashes in each line. I figured (with my limited knowledge of bash) that the best thing to use would be sed, so I ran this to print "not /" (eventually adding "| wc -m" to it):
sed '!s////g' file
and I got the same result as if I ran cat: no modification at all.
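The leading ! negates the (empty) address, so that substitution is applied to no lines at all, which is why the output matches cat. Deleting every character that is not a slash and measuring what's left gives the per-line count:
Code:
# Print the number of slashes on each line of the file.
sed 's|[^/]||g' file | awk '{ print length }'
# Or the total across the whole file:
tr -cd '/' < file | wc -c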
Unfortunately, the second grep is greedy, swallowing everything up to the last </ul> close tag. (The desired result is 2.) Speed is an issue, as I will be searching through 350,000 files.
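Without the first command it's hard to give the exact fix, but the usual cure for a greedy .* in grep is either a negated character class or Perl-style non-greedy matching with GNU grep's -P. A hedged sketch, assuming the goal is counting <ul>…</ul> blocks that each sit on a single line:
Code:
# Non-greedy: each match stops at the first closing tag (GNU grep).
grep -oP '<ul>.*?</ul>' file | wc -l
# Portable alternative, valid only when the block contains no other tags:
grep -o '<ul>[^<]*</ul>' file | wc -l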
I am trying to count the number of characters in a word, but it is coming out one more than it actually should be.
Code:
I can work around it by subtracting 1 from the output (6-1=5 in this case). But I am just curious to know why the character count comes out as 6 and not 5.
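Almost certainly the extra character is the trailing newline that echo appends; wc -m counts it along with the word. printf (or echo -n) avoids it:
Code:
echo "hello" | wc -m           # 6 -- the newline is counted
printf '%s' "hello" | wc -m    # 5 -- no trailing newline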
What I want to do is, from a file having blocks like
<event>
8 3 0.2685416E-02
2 -1 0
21 -1 0
[code]...
The first line after the "<event>" is its process ID, so I would like to have at the end a summary of how many "event" blocks I have of each type, i.e. how many
6 1 0.2685416E-02
or how many
7 2 0.2685416E-02
etc etc
I do not know in advance how many different kinds of blocks I will have, so it has to be a bit smart: scan the file and make a new "summary" entry for each unique type. I was using something like
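awk handles the "unknown set of keys" part naturally with an associative array: each time it sees an <event> tag it reads the following line and counts it as a key:
Code:
# Tally event blocks by the line that follows each <event> tag.
awk '/^<event>/ { getline; counts[$0]++ }
     END { for (k in counts) print counts[k], "x", k }' file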
I need a script that will search some logs and extract the IP hits from one country only, let's say the UK. I guess I need to use GeoIP or some database. I just need a very simple bash, Perl, or PHP script that will do this job: search through the (Apache) logs and then give me the number of hits found from the UK.
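With the geoip-bin package installed, geoiplookup maps an address to a country, so a short pipeline can do it. A sketch assuming a standard Apache combined log, where the client IP is the first field (geoiplookup prints lines like "GeoIP Country Edition: GB, United Kingdom"):
Code:
#!/bin/bash
# Count hits from UK addresses in an Apache access log.
awk '{ print $1 }' /var/log/apache2/access.log | \
while read -r ip; do
    geoiplookup "$ip" | grep -q ': GB,' && echo "$ip"
done | wc -l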
This also has to show the line count. I can get it to show the files, but not the line count. What is the single command used to identify only the matching count of all lines within files under the /etc directory that contain the word "HOST"? List only the files with matches and suppress any error messages.
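grep can do all of that at once: -c prints a per-file count of matching lines, -r recurses, the stderr redirection suppresses permission errors, and filtering out the :0 lines leaves only the files that actually match:
Code:
# Per-file count of lines containing HOST under /etc, errors suppressed.
grep -rc 'HOST' /etc 2>/dev/null | grep -v ':0$'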