Software :: Logrotate Duplicate Log Entry
Jan 1, 2011: having a problem with logrotate. It's complaining about a duplicate log entry, but there isn't a duplicate log entry that I can see.
Do the logrotate.conf settings apply globally to what is in logrotate.d/? I have olddir /var/log/old_logs in logrotate.conf, but logrotate is not placing old rsyslog logs in /var/log/old_logs for logrotate.d/rsyslog.
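For comparison, directives in logrotate.conf normally do apply to the included logrotate.d files unless a stanza overrides them, and a per-stanza olddir usually works even when the global one seems ignored. Below is only a sketch; the log paths and reload command are assumptions based on a stock Debian/Ubuntu rsyslog layout, not the poster's actual file. Note that olddir must already exist and, by default, has to be on the same physical device as the log.
Code:
# /etc/logrotate.d/rsyslog (sketch, assumed paths)
/var/log/syslog /var/log/mail.log {
        weekly
        rotate 4
        missingok
        compress
        olddir /var/log/old_logs    # per-stanza olddir; the directory must exist
        postrotate
                invoke-rc.d rsyslog rotate > /dev/null 2>&1 || true
        endscript
}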
Whenever I run $ aptitude update I get this error:
W: Duplicate sources.list entry http://ftp.us.debian.org/debian/ stable/non-free amd64 Packages (/var/lib/apt/lists/ftp.us.debian.org_debian_dists_stable_non-free_binary-amd64_Packages)
W: You may want to run apt-get update to correct these problems
[code]....
This error comes from the update manager, but I cannot see duplicate entries in the software sources. Is there a program that removes duplicate sources? If not, how do I find the duplicate entries? Quote: W: Duplicate sources.list entry http://archive.canonical.com/ubuntu/ maverick/partner i386 Packages (/var/lib/apt/lists/archive.canonical.com_ubuntu_dists_maverick_partner_binary-i386_Packages)
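One hedged way to hunt for the duplicate by hand (stock apt paths assumed): the same repository line is often present both in /etc/apt/sources.list and in a file under /etc/apt/sources.list.d/, or listed once for the release and again in a partner/third-party file.
Code:
# print every active deb/deb-src line from all apt source files,
# then show only the lines that occur more than once
grep -hE '^deb' /etc/apt/sources.list /etc/apt/sources.list.d/*.list 2>/dev/null | sort | uniq -d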
When I run "sudo apt-get update", it says the following:
"W: Duplicate sources.list entry [URL] Packages (/var/lib/apt/lists/archive.canonical.com_ubuntu_dists_lucid_partner_b inary-i386_Packages)
W: You may want to run apt-get update to correct these problems"
Here is my sources.list file:
# deb cdrom:[Ubuntu 10.04 LTS _Lucid Lynx_ - Release i386 (20100429)]/ lucid main restricted
# See [URL] for how to upgrade to newer versions of the distribution.
deb cdrom:[Ubuntu 9.10 _Karmic Koala_ - Release i386 (20091028.5)]/ karmic main restricted
deb [URL] lucid main restricted
deb-src [URL] lucid main restricted .....
If in Software Sources I have "Download from" set to "Server for Australia", then when I run sudo apt-get update, I get a warning at the end
Code:
Reading package lists... Done
W: Duplicate sources.list entry http://dl.google.com stable/main Packages
[code]....
When I run the update manager and start updating, I get this error:
W: Duplicate sources.list entry http://archive.canonical.com/ubuntu/ lucid/partner Packages (/var/lib/apt/lists/archive.canonical.com_ubuntu_dists_lucid_partner_binary-i386_Packages)
I'm trying to migrate a physical server to virtual hardware. The old server runs RHEL 4 with MySQL version 4.1.12. The new server runs Ubuntu 10.04 with MySQL 5.1.41.
In order to export all the MySQL databases from the old server to the new server, I ran the following command on the old server:
Code:
I then attempted to import them on the new server with:
Code:
The command successfully imports about half the databases, but then fails when it gets to a particular table in a database for one of our custom web applications. The error message is:
Code:
I've located that line in the dump.sql file, and as far as I can tell it's not actually a duplicate entry. I've also gone through dozens of bug reports and forum posts about an issue where this situation arises because a key is not set to auto-increment, but in this case the key is set to auto-increment.
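The dump and restore commands were not included above; a typical round trip looks roughly like the sketch below (user names, database name and file names are assumptions, not the poster's exact commands). The --insert-ignore and --force options are ways to get past a spurious duplicate-key abort, at the cost of silently skipping the offending rows.
Code:
# on the RHEL 4 box (sketch)
mysqldump -u root -p --all-databases > dump.sql
# on the Ubuntu 10.04 box
mysql -u root -p < dump.sql
# if the import keeps aborting on one table, either continue past errors ...
mysql -u root -p --force < dump.sql
# ... or re-dump that one database using INSERT IGNORE statements
mysqldump -u root -p --insert-ignore thedatabase > thedatabase.sql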
I need to logrotate logs in directories in /var/log/httpd/.
There are 4 directories in /var/log/httpd/... these directories are /var/log/httpd/access/ /var/log/httpd/debug/ /var/log/httpd/error/ /var/log/httpd/required/
Each of the access, required, error and debug directories has around 20 to 30 log files for different locations, for example mumbai-access.log, pune-access.log, etc. The same is the case for the 'error', 'required' and 'debug' dirs in /var/log/httpd/.
I need to clean up the logfiles in all the 4 directories access, error, debug and required...
I have made a custom logrotate file as follows:
Is the above config correct?
Am I missing something? Will this rotate the files in /var/log/httpd/access, /var/log/httpd/error, /var/log/httpd/required and /var/log/httpd/debug?
Do I need to include the following line in postrotate: "/bin/kill -HUP `cat /var/run/httpd.pid 2>/dev/null` 2> /dev/null || true"?
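The config itself did not survive in this post, so here is only a rough sketch of a stanza that would cover all four directories; every detail below (schedule, rotation count, PID file path) is an assumption. And yes, if Apache writes these logs directly, something in postrotate has to make it reopen them, whether the kill -HUP line quoted above or a service httpd reload/graceful; otherwise Apache keeps writing to the old, renamed files.
Code:
# /etc/logrotate.d/httpd-custom (sketch only)
/var/log/httpd/access/*.log /var/log/httpd/error/*.log /var/log/httpd/debug/*.log /var/log/httpd/required/*.log {
        daily
        rotate 7
        compress
        missingok
        notifempty
        sharedscripts
        postrotate
                /bin/kill -HUP `cat /var/run/httpd.pid 2>/dev/null` 2>/dev/null || true
        endscript
}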
I am looking for a good site to download logrotate for RHEL.
Recently I noticed that on my CentOS 5.4 system, yum no longer works and is giving segmentation faults. I can run "yum --help" and it works, but if I try to run something like "yum upgrade php" it will fault. I also noticed that other things are seg faulting as well, like /usr/sbin/logrotate and /usr/bin/certwatch.
I am guessing there is some sort of common library that needs fixing, but I have no idea what. I've read other posts about the yum segmentation fault and have tried various steps provided but so far no luck in getting it to work again. It used to work, and I rarely change this system so I'm not sure what could have caused it.
I have CentOS 5. For some time logrotate has not been working, and maillog, for example, is very big. It is the same for all logfiles. I run "logrotate -d -f /etc/logrotate.conf" but nothing happened. Cron seems to work, as I can see it with ps -ef | grep cron.
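One thing worth noting: -d puts logrotate in debug mode, which is a dry run, so "-d -f" will never actually rotate anything. A forced run without -d, plus a look at the state file logrotate uses to decide what is due, is usually more telling (paths are the CentOS defaults):
Code:
# force a real rotation (no -d, which only simulates)
logrotate -f /etc/logrotate.conf
# inspect the state file; a bogus "last rotated" date here can stall rotation
cat /var/lib/logrotate.status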
I have been learning the logrotate command and the logrotate configuration file (logrotate.conf). The custom logfile for a process is 'test.log'.
Code:
#cat /etc/logrotate.d/test
/var/log/test.log {
rotate 4
[code]....
Whenever the log file (test.log) exceeds 100M, a new file should be created with the name test.'date'.gz (the new file is created with the current date, in gz-compressed format, and with the permissions mentioned above). I really don't know what the role of rotate is. Will rotation be carried on only for the next 4 times, i.e. up to 400MB (4 times * the file reaching 100MB)? And what could be the purpose of postrotate?
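To the two questions: rotate 4 does not cap the total at 400MB; it simply means logrotate keeps at most 4 old copies and deletes the oldest when a fifth would be created. postrotate/endscript wraps commands that run once after rotation, typically to tell the writing process to reopen its log. A sketch of a size-triggered stanza follows; the daemon name in it is entirely hypothetical.
Code:
/var/log/test.log {
        size 100M        # rotate only when the file grows past 100M
        rotate 4         # keep at most 4 rotated copies; the oldest is removed
        compress         # gzip the rotated copy
        dateext          # name rotated files test.log-YYYYMMDD.gz instead of .1.gz
        create 0644 root root
        postrotate
                # hypothetical daemon; replace with whatever actually writes test.log
                /usr/bin/killall -HUP test-daemon 2>/dev/null || true
        endscript
}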
Say that a certain server process generates log files and names them according to the current date; e.g.
server.nov-20.2010.log
server.nov-21.2010.log
server.nov-22.2010.log
server.nov-23.2010.log
I'd like to have logrotate compress the logs that are older than 3 days. Is this possible with logrotate, or do I just schedule a cron job to bzip everything under the folder older than 3 days?
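logrotate is happiest when it does the renaming itself; for files the server already names by date, a small cron job is usually the simpler route. A sketch, with the log directory path assumed:
Code:
# crontab entry: at 01:00 bzip2 any dated server log older than 3 days
0 1 * * * find /var/log/myserver -name 'server.*.log' -mtime +3 -exec bzip2 {} \;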
I have a postfix mailserver that works fine except for the logrotate.
syslog.conf
mail.* -/var/log/mail.log
logrotate.conf
/var/log/mail.log {
[Code]....
So when cron does the logrotate, there is a new logfile but it's empty. After I restart syslogd, it goes back to its normal logging.
What am I missing? All this works with CentOS; why is Ubuntu such a pain...
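The usual cause of the new-but-empty file is that syslogd keeps its file handle on the renamed log until it is told to reopen, so a postrotate reload is needed. A sketch, assuming the classic sysklogd package (adjust the reload command if the box actually runs rsyslog); the schedule and rotation count are placeholders:
Code:
/var/log/mail.log {
        weekly
        rotate 4
        compress
        missingok
        notifempty
        sharedscripts
        postrotate
                # make the syslog daemon reopen its files instead of restarting it by hand
                invoke-rc.d sysklogd reload > /dev/null 2>&1 || kill -HUP `cat /var/run/syslogd.pid` 2>/dev/null || true
        endscript
}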
My apache2 logs aren't being rotated; I have one log nearing 100MB in size.
Error shown below when a logrotate happens on apache2 logs:
Code:
error: other_vhosts_access.log:5381 unknown option 'jack' -- ignoring line
error: other_vhosts_access.log:5381 unexpected text
"jack" is a sub-domain.
I am trying to configure logrotate on APP/DB servers. As per my backup policy, logs will be compressed on a daily basis and will be moved to a central storage device.
My Tomcat generates several application logs with a date extension as well as a .log extension, e.g. app.log, app.log.2010-10-23-14, catalina.out, catalina.2010-10-25.log, etc.
Currently my Tomcat log rotation is set up in /etc/logrotate.d/:
#cat /etc/logrotate.d/tomcat
/usr/local/tomcat/logs/*log {
[code]....
But it's rotating only logs with the .log extension, i.e. app.log.2010-10-23-14 (with the date extension) is not rotating. If I put "*" instead of "*log", it rotates all files, including already-rotated files. How can I rotate files that have the date extension? Also, I don't want to keep rotated logs for more than 3 days.
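A common split, sketched below with assumed details: let logrotate handle the plain .log files plus catalina.out (copytruncate, because Tomcat will not reopen catalina.out on its own), and clean up the files Tomcat already stamps with a date using a cron find, since logrotate's globbing would otherwise keep re-rotating its own output. The 3-day retention, schedule and patterns are assumptions made to fit the question.
Code:
# /etc/logrotate.d/tomcat (sketch)
/usr/local/tomcat/logs/*.log /usr/local/tomcat/logs/catalina.out {
        daily
        rotate 3
        compress
        missingok
        notifempty
        copytruncate      # Tomcat will not reopen catalina.out, so copy-and-truncate
}

# cron entry for the files Tomcat names with a date itself, older than 3 days
30 0 * * * find /usr/local/tomcat/logs -name '*.20[0-9][0-9]-*' -mtime +3 -delete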
We started hosting some very large content on our site, and the usage patterns in cacti have revealed that the HTTP sessions through our load-balancers drop off dramatically right at midnight.
The logrotate process runs right at midnight, and issues a reload command through the service tool (CentOS 5.5):
Code:
$ cat /etc/logrotate.d/httpd
/data/websites/logs/*_log /var/log/httpd/*log {
missingok
daily
dateext
compress
rotate 7
sharedscripts
postrotate
/sbin/service httpd reload > /dev/null 2>/dev/null || true
endscript
}
Looking at the init script reveals that the reload section is supposed to trigger a HUP of the httpd process:
Code:
reload() {
echo -n $"Reloading $prog: "
if ! LANG=$HTTPD_LANG $httpd $OPTIONS -t >&/dev/null; then
RETVAL=$?
echo $"not reloading due to configuration syntax error"
failure $"not reloading $httpd due to configuration syntax error"
else
killproc -p ${pidfile} $httpd -HUP
RETVAL=$?
fi
echo
}
In theory, Apache should reload its configuration and start the new logfile without breaking current sessions. However, that clearly isn't what is going on. I'm tempted to edit the logrotate script to trigger a HUP directly by cat'ing the PID file. Is this normal behavior for Apache when signaled with a HUP?
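For what it's worth, yes: a HUP makes Apache do a fast restart, killing its children (and their open connections) immediately, which matches the midnight drop-off. A graceful restart (SIGUSR1) lets in-flight requests finish before the children exit and still reopens the log files. A hedged tweak to the postrotate block; the graceful target exists in the stock RHEL/CentOS init script, with apachectl graceful as the fallback if it does not:
Code:
postrotate
    # graceful (USR1) instead of reload (HUP): children finish their requests first
    /sbin/service httpd graceful > /dev/null 2>/dev/null || /usr/sbin/apachectl graceful > /dev/null 2>&1 || true
endscript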
I have an Ubuntu server, and I have a special script in logrotate.d to rotate the samba_audit logs:
/var/log/auditsamba/auditoria.log {
weekly
rotate 12
missingok
[code]....
Before I start writing my own file maintenance script, maybe such a program or script already exists somewhere. I am looking for a configurable file maintenance script/application that I can use to process files under certain criteria, for example removing files that are x number of days old, gzip'ping core dump files, removing zero-sized files, etc. I am not sure logrotate is the solution I am looking for.
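The usual building blocks are a handful of find invocations under cron (all paths and ages below are placeholders); dedicated tools like tmpwatch or tmpreaper also cover the "delete anything older than N days" case on their own.
Code:
# remove files older than 30 days
find /some/dir -type f -mtime +30 -delete
# gzip core dump files
find /some/dir -type f -name 'core*' -exec gzip {} \;
# remove zero-sized files
find /some/dir -type f -empty -delete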
I have the following error on one of my servers. Is there a way to tell exactly which directory is having the problem? If there is, and I delete that directory, will that resolve the problem or not? kernel: EXT2-fs error (device md(9,0)): ext2_check_page: bad entry in directory #10158084: unaligned directory entry - offset=0, inode=605471640, rec_len=7606, name_len=177
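Roughly speaking, the directory can be located by the inode number in the message, but simply deleting it will not repair the on-disk metadata; an fsck of the unmounted filesystem is what fixes it. A sketch, where /mountpoint is a placeholder and /dev/md0 is assumed from "device md(9,0)":
Code:
# find the path of directory inode 10158084 on that filesystem
find /mountpoint -xdev -inum 10158084
# then, with the filesystem unmounted, check and repair it
umount /mountpoint
fsck.ext2 -f /dev/md0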
I finally got round to transcribing our wedding video from VCR to DVD. Now I would like to duplicate that video a few times and pass copies round the family, more to preserve it than to bore the relatives.
I dragged the DVD icon from the desktop to a Thunar (file manager) window and it created a folder containing this lot: [code]....
Total size 1.3 GB.
But I have no idea what to do with these files.
I tried this command: [code]....
and ended up with a 4.4 GB file, which Archive Manager says contains the above folders, and presumably also contains an image of 3.1 GB of formatted but unused disk.
What's the best thing to do here? The ISO file will presumably burn a good DVD copy, but it's a waste of disk space to keep it (I'd like to keep a copy on hard disk too), and I guess it will also take needlessly long to burn.
Can I create a smaller ISO from the file folders? If so, is there an app to edit the titles before I do so? The recorder put a second title "Empty Title" on the title page which I'd like to get rid of. I discovered by accident that the VOB files play in the Parole media player - one is the title screen, and the other two contain about half the video each. So I don't know what all the other files are about.
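If the folder copied off the disc contains a VIDEO_TS directory, a new ISO can be built from just that content, which should come out near the 1.3 GB of actual video instead of the 4.4 GB whole-disc image. A sketch; the folder path and volume label below are placeholders:
Code:
# build a DVD-Video compliant ISO from the copied folder
genisoimage -dvd-video -V WEDDING_VIDEO -o wedding.iso /path/to/dvd_folder
Removing the "Empty Title" entry means re-authoring the menu, which a tool such as DVDStyler can do; that part is beyond a one-liner.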
Anyone know of a good Linux application that will remove duplicate files interactively? I've recently spent a lot of time (read: weeks) pruning my music collection, basically by hand, and now I'm moving on to my family photos. Most of the work with the music was done under Windows XP. As for the photos, I have a fantastic Windows application, D'Peg, that I had actually purchased some years ago. This app rocks for Windows. In my opinion it's so good that I would happily pay double the asking price. However, I'd prefer to use Linux if possible, so what's out there? Anything that is worth its salt? Currently playing around with Picasa.
Is there a program for Linux which can show me a list of all duplicate music files in a directory? This will allow me to delete all duplicate files without searching for them manually.
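For the record, fdupes is the usual command-line answer and FSlint a common GUI one; a sketch of interactive use (the path is a placeholder):
Code:
# list duplicate files recursively and prompt for which copy to keep or delete
fdupes -r -d /path/to/music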
I tried to put a terminal in the desktop using more or less the procedure described in this article:
[URL]
It worked fine, but then I installed avant-window-navigator and I think it created some conflicts with Compiz and other applications. Both the desktop terminal and Skype started to appear in duplicate when I logged in. So I went to the startup applications menu and disabled them both. When I logged back in they both appear only once, but the terminal is in the wrong spot on the desktop. So there are probably some startup entries somewhere that are causing this. I've tried reinstalling Compiz and I uninstalled AWN, and it didn't work.
I have a network (5 Windows machines and one Ubuntu). I simply changed the hostname of my Ubuntu PC and changed the workgroup name, but the old PC name (old hostname) still shows on the network, and so does the old workgroup name.
How do you delete duplicate posts? Clicking on the edit button doesn't allow one to delete, just edit.
Only the primary monitor has the panels. How do I duplicate all the panels on the second monitor?
I am currently trying Gnome 3 on Natty. Unity didn't do it for me. One weird problem I have run into with Gnome 3 is duplicate icons. There are two of various icons: one icon is normal and the second is blurry. Other than this, Gnome 3 seems to be fine. Anyone know how to fix this?
I had a duplicate IP from another device on the network that would just not let it go; it was the same IP as my Fedora box, which had been working fine. After screwing around with the other device I finally just gave up and changed the IP on the Fedora box. Now I can't access the internet at all from the Fedora box. I looked at the routing tables in my router and the MAC was showing as the MAC of the other device. After some reboots here and there that is fixed, and the router's routing tables are now showing correctly. The Fedora box still cannot resolve any domains or get online. Is there something somewhere inside the Fedora box that is still jacked up from the duplicate IP?
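A few hedged checks from the Fedora side, since the router's table is already clean: flush any stale neighbour/ARP entries, confirm the resolver configuration survived the IP change, and separate DNS failure from routing failure. The addresses below are placeholders.
Code:
ip neigh flush all                 # drop stale ARP entries on the Fedora box
cat /etc/resolv.conf               # nameservers may still point at the old setup
ping -c 3 8.8.8.8                  # reaches the internet without DNS?
ping -c 3 www.google.com           # works only if DNS resolution is also fine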