General :: Logrotate Not Rotating Files With Date Extension?
Oct 25, 2010
I am trying to configure logrotate on the APP/DB servers. As per my backup policy, logs are compressed on a daily basis and then moved to a central storage device.
My Tomcat instance generates several application logs, some with a date extension and some with a .log extension, e.g. app.log, app.log.2010-10-23-14, catalina.out, catalina.2010-10-25.log, etc.
Currently my Tomcat log rotation config is in /etc/logrotate.d/:
# cat /etc/logrotate.d/tomcat
/usr/local/tomcat/logs/*log {
[code]....
But it is only rotating the logs with a .log extension, i.e. app.log.2010-10-23-14 (with the date extension) is not being rotated. If I put "*" instead of "*log", it rotates all files, including the already-rotated ones. How can I rotate the files that have a date extension? Also, I don't want to keep rotated logs for more than 3 days.
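One approach (a sketch, not from the original post; paths, pattern and retention are assumptions) is to let logrotate handle the plain *.log files and clean up Tomcat's own date-stamped files with a small daily cron job instead:

# /etc/cron.daily/tomcat-datestamped-logs (sketch; adjust the path and pattern to your layout)
# compress date-stamped files older than a day, then delete anything older than 3 days
find /usr/local/tomcat/logs -type f -name '*20[0-9][0-9]-[0-9][0-9]-[0-9][0-9]*' ! -name '*.gz' -mtime +0 -exec gzip {} \;
find /usr/local/tomcat/logs -type f -name '*20[0-9][0-9]-[0-9][0-9]-[0-9][0-9]*' -mtime +3 -delete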
I am not sure how I should go about rotating files that end with a date stamp. This is the configuration I have for rotating my Apache access files, but it is not working:
I am trying to find a command which will copy all files in a folder with the extension ".log" that were created one day before the current date. By going through other threads in this forum I found a partial solution to this problem:
find /mnt/hd -mtime -1 -exec scp {} /mnt/usb \;
This command copies all files created one day before (not only *.log) to the /mnt/usb folder. What modification is required to the above command?
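A sketch of the filtered version (GNU find assumed; whether cp or scp is needed depends on where /mnt/usb really is):

find /mnt/hd -name '*.log' -mtime -1 -exec cp -p {} /mnt/usb/ \;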
I need to logrotate logs in directories in /var/log/httpd/.
There are 4 directories in /var/log/httpd/: /var/log/httpd/access/, /var/log/httpd/debug/, /var/log/httpd/error/ and /var/log/httpd/required/.
Each of the access, required, error and debug directories has around 20 to 30 log files for different locations, for example mumbai-access.log, pune-access.log, etc. The same is the case for the 'error', 'required' and 'debug' directories in /var/log/httpd/.
I need to clean up the log files in all 4 directories: access, error, debug and required.
I have made a custom logrotate file as follows:
Is the above config correct?
Am I missing something? Will this rotate the files in /var/log/httpd/access, /var/log/httpd/error, /var/log/httpd/required and /var/log/httpd/debug?
Do I need to include the following line in postrotate: "/bin/kill -HUP `cat /var/run/httpd.pid 2>/dev/null` 2> /dev/null || true"?
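For reference, a hedged sketch of what such a config could look like (the poster's actual file was not included, so every option below is an assumption; the postrotate line is the one quoted above):

# /etc/logrotate.d/httpd-custom (sketch)
/var/log/httpd/access/*.log /var/log/httpd/error/*.log /var/log/httpd/debug/*.log /var/log/httpd/required/*.log {
    daily
    rotate 7
    compress
    missingok
    notifempty
    sharedscripts
    postrotate
        /bin/kill -HUP `cat /var/run/httpd.pid 2>/dev/null` 2>/dev/null || true
    endscript
}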
Are the logrotate.conf settings global, i.e. do they apply to what is in logrotate.d/? I have "olddir /var/log/old_logs" in logrotate.conf, but logrotate is not placing the old rsyslog files in /var/log/old_logs for logrotate.d/rsyslog.
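For comparison, a minimal sketch of a per-service stanza that sets olddir explicitly (the log paths and other options are assumptions, not the poster's actual rsyslog file):

/var/log/messages /var/log/maillog /var/log/secure {
    weekly
    rotate 4
    olddir /var/log/old_logs
    missingok
}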
I have CentOS 5. For some time logrotate has not been working, and maillog, for example, is very big. It is the same for all log files. I run "logrotate -d -f /etc/logrotate.conf" but nothing happens. Cron seems to be working, as I can see it with ps -ef | grep cron.
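Worth noting (not part of the original post): -d is logrotate's debug flag and implies a dry run, so no log files or state are actually changed by that command. A forced real run with verbose output would be:

logrotate -v -f /etc/logrotate.conf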
I would like to set up tcpdump to rotate its log file every hour and retain files for the last 14 days, but I don't think any combination of -C and -W allows that (at least I haven't been able to figure it out), so I am trying to rotate the files every X MB and retain the last 20 files instead. That part seems fairly simple with '-C X -W 20', but I am having trouble customizing the names of the log files. I tried '-w capture-$(date +%Y-%M-%d-%H:%M-)' thinking that each file would start with the current date and time, but all files use the date and time when the capture was started, so the only difference is the number at the end (which is added by -W). I would like to customize the file names so that each one carries the date and time when the capture of that file started. In fact, if I can do that, I don't need the numbers that -W appends at the end, but I don't know how to get rid of them.
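For the time-based variant, a sketch assuming eth0 and a tcpdump build that expands strftime patterns in -w when -G is used (note that in strftime %m is the month and %M the minute):

tcpdump -i eth0 -G 3600 -w '/var/capture/capture-%Y-%m-%d-%H:%M.pcap' &
# separate daily cleanup (e.g. from cron) to enforce the 14-day retention; path assumed
find /var/capture -name 'capture-*.pcap' -mtime +14 -delete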
I'm looking for a method of changing the last-modification date of some jpg photo files to the corresponding creation timestamp of each file. The reason is that Shotwell imports pictures into folders according to the last-modification date, which in my opinion is stupid.
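One common approach, assuming "creation date" here means the EXIF DateTimeOriginal tag and that exiftool is installed (the photo path is a placeholder):

exiftool '-FileModifyDate<DateTimeOriginal' /path/to/photos/*.jpg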
I am writing a script that should keep only the last three versions of the tcpdump files. Due to the version of tcpdump I must use -C and cannot use -G. Using -C generates a new file after X MB have been written and adds a .x suffix to each new one. The problem is that these files are filling up the disk too quickly. The main part of the script will kill tcpdump when a certain condition is met, but in the meantime I need to purge and keep only, say, the last three iterations of the dump file. So, for example, if there are dump.pcap.1, dump.pcap.2, dump.pcap.3, dump.pcap.4 and dump.pcap.5, I'd like the script to look at the date stamps and delete dump.pcap.1 and dump.pcap.2, since the other three are the newest files. How do I compare the files matching dump.pcap.*, check the dates and keep only the three 'youngest' ones?
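A sketch of that cleanup step (the capture directory and base name are assumptions based on the post):

cd /var/capture || exit 1
# list dump.pcap.* newest first, skip the first three, delete the rest
ls -1t dump.pcap.* 2>/dev/null | tail -n +4 | xargs -r rm -f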
Suppose there is a directory named mydir containing: aaa.cpp aaa.h bbb.cpp bbb.h Makefile a b, where a and b are executable files. What I want is to copy only a and b to another location. Is it possible (other than by manually copying a and b to another_dir)?
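A sketch with GNU find, assuming the execute bit is what distinguishes a and b from the sources:

find mydir -maxdepth 1 -type f -executable -exec cp -p {} another_dir/ \;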
In Fedora, I used the ls -l command to see the directory listing, but I noticed that while all the .c files were shown in green, there was one .c file shown in black. How can two files with the same extension differ, one being executable and the other a normal file?
I want to install the Forms & Reports Developer Suite on Linux, so I have downloaded Oracle Developer Suite 10g for Linux (including Forms & Reports): as_linux_x86_ids_101202_disk1.cpio and as_linux_x86_ids_101202_disk2.cpio.
The Oracle documentation site gives the following guidance: to extract the cpio file, move it to an empty directory, then do: cat filename.cpio | cpio -icd .... but it is not extracting.
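For reference, a sketch of that extraction with the file named in the post (the staging directory and download path are assumptions):

mkdir -p /opt/stage && cd /opt/stage
cpio -idcmv < /path/to/as_linux_x86_ids_101202_disk1.cpio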
I need a little help. I want to find all files with the extensions "*.tar", "*.gz" and "*.zip" and move them into the "/opt/old" directory. I've tried this command:
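One possible form, a sketch with GNU find run from the directory being cleaned up:

find . -type f \( -name '*.tar' -o -name '*.gz' -o -name '*.zip' \) -exec mv {} /opt/old/ \;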
I want a shell script or single-line command to delete all files with an extension specified in the script (I have bash). For example: delete all files with the extension .obj.
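A minimal sketch of such a script (the script name and usage line are assumptions):

#!/bin/bash
# delete_ext.sh -- delete every file under the current directory with the given extension
ext="${1:?usage: delete_ext.sh <extension, e.g. obj>}"
find . -type f -name "*.${ext}" -exec rm -f {} +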
I had a program run riot and it created hundreds of spurious files in one directory. Fortunately they are all dated 4th November, so they are easily identified. What bash command can I use from the console to delete them all?
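A sketch with GNU find's -newermt, assuming the files' modification dates are all 4 November 2010 and they sit in a single directory:

find /path/to/that/dir -maxdepth 1 -type f -newermt 2010-11-04 ! -newermt 2010-11-05 -delete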
Is there any way to untar an archive and extract only those files that are newer than a certain date, including the directory structure?
I restored a backup on a play server, but it was a few days old. However, I have a tar archive of the entire structure that is more up to date and healthy, so now I want to extract all files (including the directory structure) based on a date filter on the files, if possible.
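One hedged sketch is to extract everything to a staging area and then copy only the newer files, preserving the directory layout (the archive name, cutoff date and target path are assumptions; GNU find and coreutils assumed):

mkdir -p /tmp/staging && tar -xpf backup.tar -C /tmp/staging
cd /tmp/staging
find . -type f -newermt 2011-01-01 -exec cp -p --parents {} /restore/target/ \;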
I am running Ubuntu 10.10 and want to copy all files revised after a certain date (01.02.2011) to a certain location (a USB memory stick) for backup purposes. How do I use the "cp" command, or do I have to use another command? Or is this perhaps not possible in Linux?
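A sketch using a reference timestamp file (the source path and USB mount point are assumptions):

touch -t 201102010000 /tmp/ref_date
mkdir -p /media/usbstick/backup
find /home/myuser -type f -newer /tmp/ref_date -exec cp -p --parents {} /media/usbstick/backup/ \;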
This looks good; the files expected to be seen are output: find /usr \( -newer /tmp/empty_file -a ! -newer /tmp/empty_file1 \) -print
But this shows me files that should not be output, and likewise when I replace ls with tar it tars a whole bunch of stuff I do not want: find /usr \( -newer /tmp/empty_file -a ! -newer /tmp/empty_file1 \) -exec ls -l {} \;
In the end I would like to replace the "ls" with "tar cvvfp some.tar {} \;", but I can't figure out what is going wrong here.
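One hedged sketch that restricts the match to regular files (so ls and tar no longer pull in whole directory contents) and hands the names to GNU tar as a null-separated list:

find /usr -type f \( -newer /tmp/empty_file -a ! -newer /tmp/empty_file1 \) -print0 | tar -cvpf some.tar --null -T -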
I was going to do rsync -r -a -z -v -p -e ssh to move some files from one server to another, but then realized all I really need are the files dated from June 1, 2008 to the present. Is there a way to have rsync only sync those files? The directory structure that is my source goes all the way back to 2004.
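A sketch that builds the file list with find and hands it to rsync (the cutoff date is from the post; host and paths are placeholders):

cd /path/to/source
find . -type f -newermt 2008-06-01 -print0 > /tmp/newer.lst
rsync -azvp -e ssh --from0 --files-from=/tmp/newer.lst . user@otherserver:/path/to/dest/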
I know find can do what I am looking for, but I am wondering if there is an alternative way to find files on the filesystem either created before/after a certain point, or at a certain time.
Typically I rely on updatedb & locate for most of my file-searching needs. The issue with those tools, though, is that the database only contains directory and file names, and it only covers local directories, not anything mounted via CIFS/NFS or via -o loop (e.g., .iso images).
So if I need to find files created after yesterday across the entire system (local and remote filesystems), I currently need to use find.
What other tools, if any, would accomplish this in a similar fashion?
I have tried ls and grep, but that requires (in my attempts so far) multiple searches:
ls -lR | grep Aug | grep 10
ls -lR | grep Aug | grep 11
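For comparison, the same two-day window can be covered in one pass with GNU find's -newermt (the year is an assumption, since the ls output only shows month and day):

find / -type f -newermt 2010-08-10 ! -newermt 2010-08-12 2>/dev/null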
I have some basic experience creating simple scripts, making directories, changing permissions, etc., but I'm stumped on this one.
I have two linux boxes. I have a script set up on box 'A' to SCP into box 'B', grab a copy of a database backup and store it on box 'A'. It looks like this:
I have generated a public key on box 'A' and placed it into the authorized_keys file on box 'B', so a password is not required and the file copies over successfully when the script is run. On to my problem...
I need to know what date the 'dump.23.gz' file was originally created when I'm viewing it after it's been copied to box 'A'. If I ls -l on box 'A' it only shows me the date it was created on box 'A' when it was copied.
What would I need to add to my script to append the backup's original creation date on box 'B' to the filename, so that when it gets copied to box 'A' I know when the backup was created on box 'B'? I'm sure this is probably confusing. I've done lots of searching and can only find information on how to append the current date and time to a file name. I need to append its original creation timestamp to the filename when it copies over.
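A sketch of that addition (host names, paths and the date format are assumptions; GNU date's -r prints a file's modification time):

STAMP=$(ssh user@boxB "date -r /backups/dump.23.gz +%Y%m%d_%H%M%S")
scp user@boxB:/backups/dump.23.gz /backups/dump.23_${STAMP}.gz

Alternatively, scp -p preserves the source file's modification time on the copy, so ls -l on box 'A' would then show the original timestamp even without renaming.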
I currently have a command to back up a directory: it will zip the directory and place it where I have told it to. Now what I am after is a command I can run before that code which will delete any tar.gz files dated before today. In my ideal world it would be something like: delete <'date +%m_%d_%y'.tar, i.e. delete all files in the folder that are from before today's date.
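A sketch with GNU find (the backup path is an assumption; with -daystart, -mtime +0 means "modified before the start of today"):

find /path/to/backups -maxdepth 1 -name '*.tar.gz' -daystart -mtime +0 -delete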
I need to know all files modified within a date and time range, e.g. all files modified between 1100-1200 hrs on 20 April 2010. "find / -mtime +10 ! -mtime +11" :: this I found for the date, but how do I include the time as well?
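GNU find's -newermt accepts a date plus time, so a sketch for that window would be:

find / -type f -newermt '2010-04-20 11:00' ! -newermt '2010-04-20 12:00'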
I'm trying to write a script that searches my files and lists them by date. Can someone point me in the right direction? I've been looking through the books I have, but I'm just not finding the right commands for searching by date.
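A sketch of one way to do it with GNU find (the search path is a placeholder), printing each file's modification date and sorting newest first:

find /path/to/search -type f -printf '%TY-%Tm-%Td %TH:%TM  %p\n' | sort -r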
I want to copy all folders and files created from 01.01.2011 until today to a new place, i.e.: cp -r /home/moviecar/public_html/wp-content/uploads/ /home/teaser/public_html/wp-content/uploads
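A sketch that applies the date filter with find before copying, using the paths from the post (GNU find and coreutils assumed):

cd /home/moviecar/public_html/wp-content/uploads
find . -type f -newermt 2011-01-01 -exec cp -p --parents {} /home/teaser/public_html/wp-content/uploads/ \;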
I have a folder with hundreds of .txt files (logs of a Java application) that I have to merge into one single .txt file. The application produces a new log file every day:
day1: logFriday10September2010.txt
day2: logSaturday11September2010.txt
...
day8: logFriday17September2010.txt
... and so on
I could merge the files easily with "cat" and ">>"; however, the problem is that I have to do it taking into account the date (creation or modification) of each file.
If I simply use the cat command, the output file will receive, for example, all the Fridays in a row, then all the Saturdays, etc., and that way I'm not respecting the date order.
I've searched through the options of the find command, since the files are not modified after creation. I tried to use this, for example:
$ find . -newer <some old file>
but that lists all files newer than <some old file>, not ordered by date.
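A sketch that concatenates the logs in modification-time order, oldest first (the log directory and output name are assumptions):

ls -1tr /path/to/logs/*.txt | xargs -d '\n' cat > merged.txt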