Software :: Exclude Host / IP From SARG Reports?
Jun 9, 2010
How can I exclude a specific IP from SARG reports? I have configured exclude_hosts, but it is still not working. Does anyone have a solution for this?
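For reference, a minimal sketch of the relevant sarg.conf setting (the file path below is an assumption; check where your distribution keeps sarg.conf). Note that exclude_hosts does not take an IP directly: it names a file listing the hosts/IPs to skip, one per line, and the report has to be regenerated afterwards.

```
# sarg.conf (the list-file path is hypothetical)
exclude_hosts /etc/sarg/exclude_hosts

# /etc/sarg/exclude_hosts, one host or IP per line:
192.168.0.25
```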
How do I make SARG report host names instead of IP addresses?
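If the goal is names rather than addresses, sarg.conf has a resolve_ip switch for reverse-resolving client IPs. A sketch; it relies on working reverse DNS for your clients:

```
# sarg.conf: convert client IP addresses to DNS names in the reports
resolve_ip yes
```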
I got the following task from my boss: find out whether there is an alternative tool to SARG for creating reports from Squid. We use SARG now, but my boss told me that its main problem is that it generates a huge number of files, which causes problems when migrating our servers. He gave me the following conditions for replacing the current tool (SARG):
* is a standard Debian package
* generates fewer files; ideally, saves reports to a database
So I would like to ask whether you know of such a tool (I could not find one via Google); best of all would be if you could share some practical experience with it.
SARG seems OK, but it is not generating any reports: "Now generating Sarg report from Squid log file /var/log/squid/access.log squid and all rotated versions .... Sarg finished, but no report was generated. See the output above for details." There are also no previously generated reports to view.
I have just installed sarg-2.2.3.1.tar.gz, but after compiling I cannot find sarg.conf (in /etc/ or /etc/httpd/conf.d). Where is it? I am not sure whether the build worked correctly. Some of the output is shown here:
Quote:
[root@proxy sarg-2.2.3.1]# make install
cp sarg /usr/bin/sarg
chmod 755 /usr/bin/sarg
cp sarg.1 /usr/local/man/man1/sarg.1
chmod 755 /usr/local/man/man1/sarg.1
[Code]...
For a week or two now, Nagios has constantly been marking hosts (mainly servers, but also a few serial-over-IP converters) down for anything up to a few minutes. Typically, all services stay in OK status. Looking closer at a host in such a state, its status information is
CRITICAL - Packet Filtered (<IP address of host in question>)
in soft state. Sometimes services are in critical state with the host in OK status; almost always the status information is then "No route to host". Further checking shows no problems, and this state rarely lasts longer than one check interval. It started after a link went down, correctly putting all hosts and services on red for being unreachable. The link problems were solved within a few hours, but Nagios only showed this after 2 reboots. Since then the problems have gradually lessened in frequency, from 5 to 10 of the 34 hosts being reported down at any given moment (the same for the roughly 150 services monitored) to where I am now: 1 to 5 problem statuses (counting both hosts and services) and the occasional 'all green' screen. A week ago, when the problems had already been going on for a week, Nagios was updated from 3.2.0 to 3.2.1. This brought no apparent improvement.
Still, the Host Groups screen is not stable. A red or yellow status initially signified a problem to be looked at; right now it is more likely a false alarm that will go away. Whenever a "packet filtered" or "no route to host" is followed up with a ping test, no problems are found, not even the slightest delay or packet loss.
I set up a DNS server, and it works fine when queried from other hosts. But if I query it from localhost (using nslookup), it reports an error:
======================================
** server can't find www.example.com: NXDOMAIN
======================================
[code]....
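One common cause (an assumption, since the full output is not shown): on the server itself, nslookup uses whatever /etc/resolv.conf points at, which may not be the local daemon. A sketch of the expected resolver configuration on the DNS server:

```
# /etc/resolv.conf on the DNS server itself,
# so that local queries hit the local daemon
nameserver 127.0.0.1
```

You can also query the local server explicitly with `nslookup www.example.com 127.0.0.1` to separate resolver configuration problems from zone problems.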
I have created a folder on my server where users upload some regular status files; they connect via secure shell. Users should be able to modify or re-upload the files already stored there, but should not be able to leave any unwanted files or folders behind. For that I want to run an "rm" command as an automatic job in crontab, so that all files are removed except the required files/folders for which this upload facility is provided.
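A minimal sketch of such a cleanup job, using find rather than a bare rm so the keep-list can be expressed as a name pattern. The upload path and the report-*.txt pattern are hypothetical:

```shell
UPLOAD_DIR=/tmp/upload_demo            # hypothetical upload folder
mkdir -p "$UPLOAD_DIR"
touch "$UPLOAD_DIR/report-1.txt" "$UPLOAD_DIR/unwanted.bin"

# delete every regular file except those matching the allowed pattern
find "$UPLOAD_DIR" -type f ! -name 'report-*.txt' -delete
```

In the crontab entry the mkdir/touch lines would of course be dropped; they are only here to make the sketch self-contained.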
When I start SARG, it blows up with a segmentation fault. Linux intra.local 2.6.18-53.1.13.el5 #1 SMP Tue Feb 12 13:02:30 EST 2008 x86_64 x86_64 x86_64 GNU/Linux
I have already tried emptying all the Squid cache logs and reinstalling the sarg RPM (yum remove and yum install again), without success.
I have just set up a new Debian 6 (squeeze) server that will act as a proxy server with Squid. But I have an issue when I try to install sarg (sarg-report): I cannot find the package through aptitude/apt-get, although it was present in the lenny distribution. Is it missing from squeeze, has it been discontinued, or should I add a specific entry to /etc/apt/sources.list? (Currently: squeeze main, squeeze/updates main.)
I am trying to configure SARG to read my Squid access.log file. When I run SARG from either the command line or from Webmin, I get the following error:
Code:
SARG: Records in file: 8343, reading: 100.00%
SARG: No records found
[code]....
How do I configure SARG to monitor Squid?
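"No records found" after 100% of the records were read often means every record was filtered out rather than left unparsed. A sketch of sarg.conf settings worth checking (the paths are assumptions); in particular, if records_without_userid is set to ignore and the Squid log has no authenticated user IDs, the report comes out empty:

```
# sarg.conf settings to verify (paths are assumptions)
access_log /var/log/squid/access.log
output_dir /var/www/html/squid-reports
records_without_userid ip
```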
I have just built SARG, but something is wrong with my setup: I see no report on the website, nor any output from the command line in PuTTY. This is a brief excerpt from my sarg.conf: Quote:
#access_log /usr/local/squid/var/logs/access.log
access_log /opt/squid/logs/access.log
#output_dir /var/www/html/squid-reports
[code]....
I have an Arch Linux PC running Squid as a proxy server and SARG to read the Squid access log. The Squid log only shows 192.168.0.0 making HTTP requests; I need it to show the local IP of each PC on my network. I read on another page that I need to rebuild Squid with the follow_x_forwarded option. How can I do that? Is there another way to do it?
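A sketch of the rebuild, assuming a from-source build (on Arch you would normally do this through an ABS/PKGBUILD rather than a bare make install). Note that the front proxy must actually be sending an X-Forwarded-For header for this to help; the 192.168.0.1 address below is hypothetical:

```
./configure --enable-follow-x-forwarded-for
make && make install

# then in squid.conf, trust the header only from the front proxy:
#   acl front_proxy src 192.168.0.1
#   follow_x_forwarded_for allow front_proxy
#   follow_x_forwarded_for deny all
```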
I am using SARG for Squid report analysis, and it is working fine. But I want to know whether there is a custom configuration option to link the SARG image at the top of the page to a location of my choosing instead of its default homepage on SourceForge. For ease of understanding, I am attaching the page screenshot here; the SARG image I am talking about is the blue one at the top. It is linked to [URL] and I want it to link to my SARG homepage instead.
I am getting an error when I generate a report with Squid's report generator (sarg). Is there a tool or way to find where in the log file the error is? The log file is 61442 lines, and it would take me forever to find the error by hand.
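One quick way to narrow it down is to print the line numbers of records that don't look like Squid native-format entries. The sketch below only checks that the first field is a numeric timestamp (a simplification; sarg can trip on other malformations too), and uses a two-line sample file in place of the real access.log:

```shell
# sample file standing in for /var/log/squid/access.log
printf '1252340000.123 GET ok\nGARBAGE line here\n' > /tmp/access_demo.log

# print the number and content of any line whose first field
# is not a numeric unix timestamp
awk '$1 !~ /^[0-9]+(\.[0-9]+)?$/ { print NR ": " $0 }' /tmp/access_demo.log
# prints: 2: GARBAGE line here
```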
Many of the mails sent from my mail server sit in the queue. The main reason is deferral by domains like Yahoo, AOL, etc., but there is one more error that I keep getting: Host unknown. Below is an example from the mail log. The catch is that a test mail sent to the same address from my personal account on the same server was delivered; however, another mail containing client information sent from customercare@mycompanysdomain ended up in the queue.
There are more examples of the same; around 20 domains have this problem.
Sep 7 14:33:46 server2 sendmail[24591]: n8793jiC024589: to=<abc@xyz.fi>, delay=00:00:00, xdelay=00:00:00, mailer=esmtp, pri=163672, relay=xyz.fi., dsn=5.1.2, stat=Host unknown (Name server: xyz.fi.: host not found)
Sep 7 22:09:42 server2 sendmail[6407]: n87Gdffa006403: to=<abc@def-fgh.com>, delay=00:00:00, xdelay=00:00:00, mailer=esmtp, pri=152474, relay=def-fgh.com., dsn=5.1.2, stat=Host unknown (Name server: def-fgh.com.: host not found)
Any tips on how to filter my awk output? I want to exclude the last 5 characters of each line of my tcpdump result using awk; I don't want the trailing ".443:" included.
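One way to do it is with awk's substr()/length(): keep everything except the last 5 characters of each line. A sketch with a single sample line standing in for the tcpdump output:

```shell
# drop the final 5 characters (e.g. ".443:") from each line
echo '192.168.1.10.443:' | awk '{ print substr($0, 1, length($0) - 5) }'
# prints: 192.168.1.10
```

If the goal is really to drop a trailing port rather than a fixed character count, `awk '{ sub(/\.[0-9]+:$/, ""); print }'` would be more robust.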
I'm trying to create a backup/archive of my Ubuntu 10.04 system files (so I can restore them in case my system gets corrupted). More specifically, I'm trying to zip the important files in my root directory, not including my home directory (which holds my documents and is backed up separately and more frequently), to an external hard drive attached via USB (called 'My Book'). Since File Roller didn't give me quite the level of control I was looking for, I created a script that I can execute to back up and archive regularly.
I want to search for files while excluding NFS mounts. find / -mount -name 'filename' restricts the search to the root disc partition only, but the file can be in other partitions as well. Is there any way to exclude just the NFS mounts?
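GNU find can prune by filesystem type, which skips NFS mounts while still descending into every local partition. A self-contained sketch, searching a throwaway directory instead of / and using a hypothetical file name:

```shell
mkdir -p /tmp/findtest/sub
touch /tmp/findtest/sub/target.txt

# -fstype nfs -prune skips NFS mounts; everything else falls through to -print.
# For the real search, replace /tmp/findtest with /.
find /tmp/findtest -fstype nfs -prune -o -name 'target.txt' -print
# prints: /tmp/findtest/sub/target.txt
```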
I'm trying to sync the root home directory to a folder called backup while excluding some files.
This is what I executed:
rsync -a /root/ --exclude-from '/root/rules.txt' /root/backup/
My rules.txt is as below:
- anaconda-ks.cfg
Somehow it doesn't read rules.txt and always includes the anaconda-ks.cfg file.
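Two things worth checking (assumptions, since the full failure isn't shown): some rsync versions treat a "- pattern" line in an --exclude-from file as a literal pattern rather than an exclude prefix, so a bare pattern is safer; and since /root/backup lives inside /root, it should be excluded too, or the destination gets copied into itself. A self-contained sketch with hypothetical paths:

```shell
mkdir -p /tmp/rsync_demo/src /tmp/rsync_demo/dst
touch /tmp/rsync_demo/src/anaconda-ks.cfg /tmp/rsync_demo/src/keep.txt

# exclude-from file: one bare pattern per line, no "- " prefix
printf 'anaconda-ks.cfg\n' > /tmp/rsync_demo/rules.txt

rsync -a --exclude-from=/tmp/rsync_demo/rules.txt /tmp/rsync_demo/src/ /tmp/rsync_demo/dst/
```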
I am working on a cluster for a molecular dynamics class and I have to edit my FORTRAN code (only the newest and best for me!). To get to the cluster I have to ssh in, and the network the cluster resides on is behind a firewall, so I have to ssh through the firewall into the network first.
This is fine; I can log in and move files and folders as needed, including sftp-ing into host 1 and then into the cluster, so I can transfer files from cluster to host and then from host to me. But this gets rather tiresome, so it would be nice to edit the files in place.
The problem is that when I access my code with emacs, it launches emacs on host 1, with no mouse support. I know the purists will howl about how I should be using keyboard shortcuts, but I am a chemist and not a programmer, so the mouse is very nice for me. Is there any way I could mount the cluster using sshfs, so that when I open my code it launches a local instance of emacs? Sorry if this is the wrong forum, but I thought it was network related.
I have a bunch of machines (~10) that I share with my co-workers. I have the appropriate .ssh files set up, so I don't get prompted for a password when I ssh. Currently I ssh into each host and run top to check the load before I start using the machine, because I don't want to be on a busy host. Can someone show me how to write a script that finds the least busy host, given a list of hosts to check (hardcoded is fine)?
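A sketch of the selection logic, fed with hardcoded sample "host load" pairs; in real use each line would come from something like `ssh "$host" cat /proc/loadavg`, and the host names here are hypothetical:

```shell
# read "host load" pairs on stdin and print the host with the lowest load
least_busy() {
  awk '{ print $2, $1 }' | sort -n -k1,1 | head -n 1 | awk '{ print $2 }'
}

# hardcoded sample; real input would be one "host loadavg" line per machine
least_busy > /tmp/least_busy_out <<'EOF'
host1 0.75
host2 0.10
host3 2.31
EOF
cat /tmp/least_busy_out
# prints: host2
```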
I have two servers: one has an empty /, and the other has a subdirectory containing a large amount of data (4 GB across many, many files). I need a way to transfer the files en masse from the server with the large number of files to the one that is essentially blank. I don't have space on the full host to simply gzip all the files. I've googled this and see that there may be some combination of tar and/or gzip that will let me do this with some sort of redirection.
I really need an example line of how this can be accomplished. If my explanation seems rather sparse, I can supply more details.
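The usual trick is to pipe tar's output straight into a second tar (or into ssh), so no intermediate archive ever touches the disk. Below is a self-contained local sketch; across machines the middle of the pipe becomes ssh, e.g. `tar -C /srv/bigdir -czf - . | ssh user@emptyhost 'tar -C /srv/dest -xzf -'` (host and paths hypothetical):

```shell
mkdir -p /tmp/pipe_demo/src/sub /tmp/pipe_demo/dst
echo data > /tmp/pipe_demo/src/sub/file.txt

# create the archive on stdout and extract it from stdin in one pipeline;
# nothing is written to disk between the two ends
tar -C /tmp/pipe_demo/src -czf - . | tar -C /tmp/pipe_demo/dst -xzf -
```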
Bit of an odd one, this. I've migrated a website from my old server to a new machine. Both servers run Ubuntu + Apache2, and both serve only a single site apart from the default site. I've flipped the domain name to the new IP address. The trouble is that after moving the virtual host config into sites-available, with the necessary link in sites-enabled, Apache attempts to serve from the default web root (/var/www) rather than the actual site content (in /var/www/technology), for example when attempting to browse the site.
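A sketch of what the moved vhost usually needs (the domain name is hypothetical): if ServerName or DocumentRoot didn't survive the move, or the vhost isn't actually enabled on this Apache, requests fall through to the default site.

```
# /etc/apache2/sites-available/technology (hypothetical file name)
<VirtualHost *:80>
    ServerName www.example.com
    DocumentRoot /var/www/technology
</VirtualHost>
```

After editing: `a2ensite technology && apache2ctl configtest && service apache2 reload`; `apache2ctl -S` shows which vhost actually catches a given request.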
I'm trying to get Synergy up and running between my Windows 7 (server) host and my Arch Linux (client) host. With rare exceptions, Synergy works perfectly on my Windows host; however, every time I try to run Synergy on my Linux machine I get the following error in messages.log:
[code]...
I'm running Arch with a barebones Xorg install and SLiM with LXDE. I'm not sure what in the world is causing the problem and haven't been able to find anything of substance in a search.
I am using livecd-creator with the fedora-livecd-desktop.ks file to create live CD/USB images. The image that is created includes several languages (some are listed below). How can I include only one or two specific languages? Will I have to specify removing each language group, and if so, what is the syntax for yum?
openSUSE v11.2, Linux 2.6.31.12-0.1-desktop x86_64
ZIP v2.32
I wish to exclude some files from a zip archive. On other OSes to exclude an entire directory I would use the "-x" option like so:
Code:
zip -r archive-name * -x dir1/*
Simple. Just add "-x" options as needed (or use an exclusion file).
Not how it works here, it would seem. AFAICT all "-x" options are ignored (the entries in an exclusion file, too). For instance, "-x diy/mplayer/*" should ignore everything in the diy/mplayer directory. It does not. I have tried fully qualified paths as well; no joy.
What is different about ZIP on linux?
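A likely explanation (an assumption, since the exact shell session isn't shown): on Linux the unquoted dir1/* is expanded by the shell before zip ever sees it, so zip receives a one-level list of file names rather than the pattern, and anything in subdirectories slips through. Quoting the pattern hands it to zip intact. A self-contained sketch:

```shell
mkdir -p /tmp/ziptest/dir1/sub /tmp/ziptest/keep
echo a > /tmp/ziptest/dir1/skip.txt
echo c > /tmp/ziptest/dir1/sub/deep.txt
echo b > /tmp/ziptest/keep/keep.txt

cd /tmp/ziptest
# quote the -x pattern so the shell does not expand it
zip -r archive.zip dir1 keep -x 'dir1/*' > /dev/null
```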
I'm trying to make backups using tar, plus incremental backups using --newer. The scripts are:
Code:
tar -czvf $DISKPATH/backupRepos.tgz /home/repos/ --exclude /home/repos/Temp
for the general backup (on Saturdays)
and
Code:
tar -czvf --newer="`date -r $DISKPATH/backupRepos.tgz +%F`" $DISKPATH/backupRepos-inc.tgz /home/repos/ --exclude /home/repos/Temp/
for the incremental backups during the rest of the week.
The problem is that the first script runs OK, excluding /home/repos/Temp from the backup; the second makes the incremental backup, but does not exclude the Temp folder. Does anybody know how I can fix this script so that it keeps making incremental backups while also excluding that folder?
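One likely culprit (an assumption from the command as shown): in the second script the options are bundled as -czvf, so tar takes the --newer=... string as the archive name; -f must be immediately followed by the file name. The trailing slash on --exclude /home/repos/Temp/ can also stop the pattern from matching. A self-contained sketch with hypothetical paths and a fixed reference date:

```shell
DISKPATH=/tmp/backup_demo            # hypothetical backup location
mkdir -p "$DISKPATH" /tmp/repos_demo/Temp
touch /tmp/repos_demo/keep.txt /tmp/repos_demo/Temp/skip.txt

# -f sits right next to its archive name; the exclude pattern has no trailing slash
tar -czf "$DISKPATH/backupRepos-inc.tgz" --newer="2000-01-01" \
    --exclude '/tmp/repos_demo/Temp' /tmp/repos_demo/
```

In real use the fixed date would be replaced with `date -r "$DISKPATH/backupRepos.tgz" +%F`, as in the original script.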
I don't know why --exclude doesn't work when I use the tar command. Please see this session:
Code:
mahmood@pc:~$ l a/
1.txt 2.txt 3.txt b/
mahmood@pc:~$ tar cvjf compressed.tar.bz2 --exclude=/home/mahmood/a/b/ a/
a/
a/2.txt
a/1.txt
a/3.txt
a/b/
mahmood@pc:~$
As you can see, although I excluded b/, the tar command ignored the exclusion.
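A likely explanation (an assumption from the transcript): tar matches --exclude patterns against the names as it processes them, which here are relative (a/b, since the argument was a/), so the absolute pattern /home/mahmood/a/b/ never matches; the trailing slash prevents a match as well. A self-contained sketch using a throwaway directory:

```shell
mkdir -p /tmp/tar_demo/a/b
touch /tmp/tar_demo/a/1.txt /tmp/tar_demo/a/b/skip.txt

cd /tmp/tar_demo
# the pattern is relative and has no trailing slash,
# so it matches the member name a/b as tar sees it
tar cjf compressed.tar.bz2 --exclude='a/b' a/
```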