I have a question for you guys: how do I monitor a URL (link) on my server? I want to see how many times a page has been accessed. Webalizer has a Top 30 URLs table, but I want to monitor (count) one specific URL. Is that possible? If yes, how, and what do I need to do?
They say:
Securing and Optimizing Linux: Red Hat Edition - A Hands-on Guide, Chapter 30 (Optional components to install with Apache), Section 30.3: Configure the /etc/webalizer.conf file
The /etc/webalizer.conf file is the default configuration file for Webalizer. With it, you can specify which directories or pages of your web site to analyze, which URLs to hide, and so on. By default, the Webalizer program installs a sample configuration file named webalizer.conf.sample under the /etc/ directory. You can set your choices in this file and then rename it webalizer.conf, and the Webalizer program will find and use it. Many options exist, and it's important to read the documentation that comes with Webalizer for more information on all the different settings and parameters. Also note that only the most common and frequently used parameters are commented in this Webalizer configuration file.
[Code]...
But I don't get it: what should I type for Webalizer to monitor a specific URL?
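For what it's worth, the per-URL tables are controlled by a handful of webalizer.conf directives. A minimal sketch (the directive names come from the stock sample file; the values and the /videos/* pattern are just illustrations):

Code:
# size of the "Top URLs" table in the monthly report
TopURLs  30
# also generate a separate page listing every URL seen, with hit counts
AllURLs  yes
# optionally roll a group of pages into one named line in the table
GroupURL /videos/*  Videos

With AllURLs enabled you can find any specific page and its count in the full URL listing, instead of hoping it makes the Top 30.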
I recently installed Webalizer on my application server. I am using CentOS 4.1 and the install succeeded, but when I run the command I get this:
Code:
root@tuxserver# /usr/bin/webalizer
Warning: Truncating oversized referrer field
Warning: Truncating oversized referrer field
I'm trying to get Webalizer to analyze some log files. The server uses virtual hosts, has log rotation turned on, and also uses TurboPanel (now known as Simple Control Panel). Because of this the documentation is limited, and Webalizer works in a strange way. I found a Perl script under TurboPanel called webalizerrun.pl; the code is as follows:
Code:
#!/usr/bin/perl
$WEBALIZER = "/usr/bin/webalizer";
chomp($var = shift);
$wdir = "$var/conf/webalizer";
opendir(DIR, $wdir) or die "Unable to read $wdir: $!";
[Code]...
Here's what I want to do, and I believe I can do it with slight modifications to this code. As of now, the log file for each site is in the folder specified above, named "domain-name_access_log", and log rotation just adds a number to the end of that. I want to use this Perl script to run Webalizer for a particular site and have its output placed in a directory.
1.) Line 4: chomp($var = shift): I know chomp is used to remove trailing characters, but which character in this case? How can I find that out? Also, what does $var = shift do inside chomp?
2.) Line 8: What exactly does the readdir function do? What does it return to $domain?
The rest seems similar to csh: it checks whether each entry is a directory or a file, then changes into the directory and runs Webalizer on it.
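For reference, here is a small runnable sketch of what those built-ins do (the directory argument is illustrative). chomp removes a trailing input record separator ($/, normally a newline) if one is present, shift pops the first command-line argument off @ARGV, and readdir returns entries from an open directory handle one at a time:

Code:
#!/usr/bin/perl
# demo.pl -- run as: perl demo.pl /some/dir
use strict;
use warnings;

# shift takes the first element of @ARGV; chomp then strips a trailing
# newline if there is one (a no-op for normal argv values, but harmless)
chomp(my $dir = shift);

opendir(my $dh, $dir) or die "Unable to read $dir: $!";
# each readdir call returns the next entry name, including . and ..;
# the loop ends when there are no entries left
while (defined(my $entry = readdir($dh))) {
    next if $entry eq '.' or $entry eq '..';
    print "$entry\n";
}
closedir($dh);

So in the original script, $var holds the first argument passed to webalizerrun.pl, and (in the elided part) $domain presumably receives one directory entry per readdir call.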
I have a dedicated server with FC9 and several domains, located in /home/domainname/html. Under each /home/domainname I have added a webalizer.conf file. Recently I moved video files into a subdomain so I could track usage and generate reports on the use of these video files. The main domain's HTML pages call the subdomain to present Flash files for viewing, and .avi, .ogm, .mp4, etc. files for downloading.
So I have these directories: /home/domain/html and /home/domain_files/html
I set up domain_files just as I did the main domain, with its own access_log and error_log, and set up Webalizer to analyze every day at midnight. But the Webalizer results contain links going back to September, even though the log file's first entry is in January, and I also see many files in the report that come from a totally different domain. I have checked carefully, and in webalizer.conf I have: LogFile /home/domain_files/access_log. I'm lost; I haven't a clue what's wrong. If Webalizer can't do this, is there another simple log file analyzer that can give me video usage?
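One common cause of months you never logged and files from other domains showing up is two Webalizer runs sharing the same incremental/history state files (webalizer.current and webalizer.hist live in the output directory by default). A hedged per-subdomain sketch, with illustrative paths:

Code:
# /home/domain_files/webalizer.conf
LogFile         /home/domain_files/access_log
OutputDir       /home/domain_files/html/usage   # unique per domain
Incremental     yes
IncrementalName webalizer.current   # kept in OutputDir, so each domain
HistoryName     webalizer.hist      # gets its own state files
HostName        files.yourdomain.com

If both domains were writing reports into the same OutputDir, their state files would mix in exactly this way; separating the output directories (or the state file names) should untangle it.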
I have been trying to configure Webalizer on a CentOS 5.5 system. I did the yum install and found a helpful link [URL]. However, when I try to change the directory it puts the output into, it keeps giving me an error that it cannot cd to the directory. I changed the /etc/webalizer.conf file as described in that article and tried running it manually, with the same results:

/etc/cron.daily/00webalizer Error: Can't change directory to /home/webalizer/

When I put the original directory back in the path, it writes to /var/www, not /var/www/output like it should. I even tried a local /var/www/output directory, and then a link to /home/webalizer/output (a directory I created), with the same results (original config back in place). When I go to my www page, remotely or on the local network, I get page-not-found errors: http://mysite.com/webalizer or 192.168.1.80/webalizer, page not found.

In summary:
1. Why won't the /etc/webalizer.conf file let me change the location of the files?
2. How do I display the www page to view the data?
3. How can I secure the www page so only I can view the data?
I just saw this in my email too (header info deleted):

/etc/cron.daily/00webalizer: Error: Can't change directory to /home/webalizer
/etc/cron.daily/logrotate: Current logging target is: `- SYSLOG
Edit: I should also mention that SELinux is in permissive mode, not enforcing, while I am testing this.
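A hedged checklist, based on the stock CentOS packaging (the paths below are the package defaults; double-check them against your own files):

Code:
# 1) make sure OutputDir exists and matches what webalizer.conf says,
#    then run by hand to see the real error
grep ^OutputDir /etc/webalizer.conf
mkdir -p /var/www/usage
webalizer -c /etc/webalizer.conf

# 2) the RPM ships an Apache snippet, /etc/httpd/conf.d/webalizer.conf,
#    that maps the reports at /usage and allows only 127.0.0.1 -- which
#    would explain remote page-not-found/forbidden results. Loosen it
#    and add protection, e.g.:
#      Alias /usage /var/www/usage
#      <Location /usage>
#          Order deny,allow
#          Deny from all
#          Allow from 192.168.1.0/24   # your LAN, or use htpasswd auth
#      </Location>
#    then: service httpd reload

That would touch all three questions at once: the cron job fails when OutputDir doesn't exist or isn't writable, the report URL is whatever the Alias says, and the <Location> block (or htpasswd) is where you restrict who can view it.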
I'm using Webmin with the Webalizer plugin. A few days ago Webalizer worked fine, but starting yesterday it no longer generates reports. When I try to generate a report manually using Webmin, I get this message:

Code:
Running Webalizer to generate report from [URL]..
..
Webalizer failed! See the output above for details.

Because I didn't get any details in that message, I ran Webalizer manually with the command:
Code:
# webalizer -n mydomain.com

And as a result, I got this message:

Code:
# webalizer -n mydomain.com
Error: Unable to restore run data (10)

I have a few domains on this machine, and I get the same result for all of them. Does anyone have any idea what happened and how to cure it? A few days ago I tried to install Perl-Small-XML using CPAN and it failed, and I installed AWStats too; I don't know whether that's related or not.
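"Unable to restore run data" usually means the incremental state file (webalizer.current by default) is corrupt or unreadable. A hedged recovery sketch, assuming the state file sits in each domain's output directory (the /path/to/output placeholders are yours to fill in):

Code:
# find the state file for the affected domain
find / -name webalizer.current 2>/dev/null

# move it aside; you lose the in-progress state for the current month,
# but not the finished monthly history kept in webalizer.hist
mv /path/to/output/webalizer.current /path/to/output/webalizer.current.bad

# re-run; Webalizer rebuilds its state from whatever logs still exist
webalizer -n mydomain.com

The CPAN and AWStats installs are unlikely culprits; a half-written state file from an interrupted earlier run is the more usual cause.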
I suspect this has come up numerous times, but I am new to Linux and am setting up a new in-house server using Ubuntu 9.04, Apache, etc. I can see the welcoming "It works!" message when I visit the server in Firefox. I can see index.html when I FTP into the server with the site name and password, at /var/www. I can also see the -rw-r--r-- attributes, but I can't edit the HTML file or replace it. When I try to rename the index.html file,
I get the following message: "Request denied. Verify that the file or folder exists and that you have the necessary permissions on the server to perform the requested operation."
I haven't been able to determine where to enter the password, or what changes I need to make, to be able to work with the /var/www directory via FTP.
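The usual cause is that /var/www is owned by root, while the FTP login is an unprivileged user, so the -rw-r--r-- file is readable but not writable to you. A hedged sketch, assuming your login is named myuser (adjust):

Code:
# simplest on a single-admin box: take ownership of the web root
sudo chown -R myuser:myuser /var/www

# or keep root ownership and grant yourself access through a group
sudo addgroup webadmin
sudo adduser myuser webadmin
sudo chgrp -R webadmin /var/www
sudo chmod -R g+w /var/www

After the group change, log the FTP session out and back in so the new group membership takes effect.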
I'm trying to find a tool for generating reports from Apache access_log files (in Common Log Format). I found several (AWStats, Lire/LogReport, WebLog Expert, Apache Logs Viewer, etc.), but they generate global, general reports about the log file, and the Perl scripts I found just show the top X entries for various patterns. My question is how to generate a report with output like this:
IP | Total number of connections | Number of pages visited | Total time of connection
So basically a list of every IP in the log with the respective numbers (connections/pages/time) associated with it.
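The Common Log Format records one request per line, so requests and pages per IP are easy to count with a short script. Note that CLF contains no session or duration information, so "total time of connection" can at best be approximated (e.g. from the span between an IP's first and last timestamp); the sketch below leaves it out, and its "page" test is a crude heuristic of my own:

Code:
#!/usr/bin/perl
# per-ip-report.pl -- summarize a CLF access_log per client IP.
# Usage: perl per-ip-report.pl access_log
use strict;
use warnings;

my (%hits, %pages);
while (<>) {
    # CLF: host ident user [date] "request" status bytes
    next unless m{^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (\S+)};
    my ($ip, $url) = ($1, $2);
    $hits{$ip}++;
    # crude "page" heuristic: directory indexes and .html/.php documents
    $pages{$ip}++ if $url =~ m{(?:/|\.html?|\.php)(?:\?|$)};
}

printf "%-16s %10s %8s\n", 'IP', 'Requests', 'Pages';
for my $ip (sort { $hits{$b} <=> $hits{$a} } keys %hits) {
    printf "%-16s %10d %8d\n", $ip, $hits{$ip}, $pages{$ip} || 0;
}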
I'm sure this is something simple, but I've googled all over and can't find the answer for the life of me. I've got an Apache server and I need to redirect all requests to HTTPS on the same domain, except for one HTML file that the load balancer hits and needs to get a 200 on. Can anyone point me to some documentation, or show me what I need to add to httpd.conf to get this working properly?
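The usual tool for this is mod_rewrite with a condition excluding the health-check file. A sketch, assuming the file is /healthcheck.html (substitute the real name) and that mod_rewrite is enabled:

Code:
# in httpd.conf or the port-80 virtual host
RewriteEngine On
# let the load balancer's probe through over plain HTTP with a 200
RewriteCond %{REQUEST_URI} !^/healthcheck\.html$
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]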
I'm pretty new to Debian. I'm trying to set up Apache 2 and want to set the DocumentRoot to public_html in my home directory, but I run into some problems.
I tried to change this (/etc/apache2/sites-available/default):

Code:
<VirtualHost *:80>
    ServerAdmin webmaster@localhost
    DocumentRoot /var/www/
    <Directory />
        Options FollowSymLinks
        AllowOverride None
.....
When I restart Apache I get a 403 error, and when I change it back to the original it works fine. I want to change the DocumentRoot so I can upload files via FTP to ~/public_html. ~/public_html has mode 777.
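The 403 usually isn't about the 777 mode on public_html itself: Apache also needs a <Directory> grant for the new DocumentRoot, and execute (traverse) permission on every parent directory. A sketch, assuming the user is named me (adjust):

Code:
# /etc/apache2/sites-available/default -- Apache 2.2 syntax
<VirtualHost *:80>
    ServerAdmin webmaster@localhost
    DocumentRoot /home/me/public_html
    <Directory /home/me/public_html>
        Options Indexes FollowSymLinks
        AllowOverride None
        Order allow,deny
        Allow from all
    </Directory>
</VirtualHost>

Code:
# the www-data user must be able to traverse the whole path
chmod o+x /home /home/me

Alternatively, Debian's stock way to serve from home directories is mod_userdir (a2enmod userdir), which leaves DocumentRoot alone.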
Code:
<!--#include virtual="test.txt" -->

I have tried following the advice in this thread http://ubuntuforums.org/showthread.php?t=1510098 but it makes no difference. The file is there, but the line is delivered to the browser as-is. Using Lucid and a fresh install of Apache 2.2 from the repository.
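Apache only processes that directive when the Includes filter is active for the file being served, and by default it only parses .shtml files. A sketch of the usual Apache 2.2 setup on Ubuntu:

Code:
# enable the module once:
#   sudo a2enmod include && sudo service apache2 restart
# then, in the vhost or an .htaccess that AllowOverride permits:
Options +Includes
AddType text/html .shtml
AddOutputFilter INCLUDES .shtml

Also make sure the page doing the include is itself named .shtml (or turn on XBitHack and set the execute bit on the .html file); otherwise Apache ships the line to the browser untouched, exactly as described.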
My setup is two laptops connected by a crossover cable; one runs Windows XP, the other Fedora 13. Neither is connected to the internet. I'm using the 192.168.1.x subnet. On Fedora, eth0 is up, and Apache runs (it creates its pidfile). Everything pings fine; the Windows XP IP pings fine from the command line. I gave Windows XP a static IP of 192.168.1.1, netmask 255.255.255.0, and gateway 192.168.1.2, the same address as eth0. XP says it sees the server, and eth0 is up.
DirectoryIndex looks at index.html. I created that file with very simple code and put it in the document root. Document root permissions are 755, access_log 770, error_log 644, the Apache user 755, and Listen is 80. When I type the IP for eth0 (192.168.1.2) into Firefox, Firefox gives me an error message: can't find server. The connection status says it's connected.
The error log includes the line [warn] ./mod_dnssd.c: No services found to register, and I don't know what that means. Apache is not writing to access_log; when I cat the path to access_log I get nothing, just a command prompt. I'm looking for the part I'm missing that will let Apache serve that index.html file to Firefox, so I can see how my code looks as I go.
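The mod_dnssd warning is harmless (it concerns Avahi/Zeroconf advertising, not serving pages). "Can't find server" plus an empty access_log means the request never reaches Apache at all, and on Fedora the usual blocker is the default firewall. A hedged sketch:

Code:
# confirm Apache is listening on port 80
netstat -tlnp | grep :80

# test locally on the server first; if this works, Apache is fine
curl http://127.0.0.1/

# Fedora 13 ships an iptables firewall that blocks port 80 by default
iptables -I INPUT -p tcp --dport 80 -j ACCEPT
service iptables save

If the local curl works and the XP laptop still can't connect, the firewall (or a proxy setting in the XP browser) is almost certainly the missing piece.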
Do I need to restart the web server if I make a change to index.html, or should the change be reflected when I open the page again, without restarting the service?
I got the following task from my boss: find out whether there is an alternative to SARG for creating reports from Squid. We use SARG now, but my boss told me its main problem is that it generates a huge number of files, which causes problems when migrating our servers. He gave me the following conditions for a replacement for the current tool (SARG):
* a standard Debian package
* generates fewer files; ideally, saves its reports to a database
So I would like to ask whether you know of such a tool (I can't find one via Google); best of all would be if you could share some practical experience with it.
I have Apache working, and I have users set up under the admin group in /home/admin/username/html, that is, with an html public folder at the end. The permissions are set correctly: html is mode 777, and so are the contents inside. But every time I go to 10.0.11.25/~les I get a forbidden error, and I don't know what the problem is. This is the error I get: Forbidden 403: You don't have permission to access /~les on this server.
Code:
[Tue May 05 19:37:48 2009] [notice] Apache/2.2.11 (Unix) DAV/2 PHP/5.2.6 configured -- resuming normal operations
[Tue May 05 20:44:30 2009] [error] [client 10.3.0.254] (13)Permission denied: access to /~les denied
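(13)Permission denied on a /~user URL almost always means the apache user cannot traverse one of the parent directories, or that UserDir points at a different path than the one you prepared. A sketch matching the layout described (the username les comes from the error above):

Code:
# httpd.conf -- map /~les to /home/admin/les/html; the * is replaced
# by the username (this layout is taken from the post, adjust as needed)
UserDir /home/admin/*/html

Code:
# every directory along the path needs execute (search) permission
# for the apache user, not just the final html directory
chmod o+x /home /home/admin /home/admin/les

On distributions with SELinux enforcing, home-directory serving additionally needs setsebool httpd_enable_homedirs on, but the traverse permissions are the first thing to check.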
With F11 installed, Apache is having permission issues reading files out of the html directory; it only works with read permission set for "other":

[Thu Jun 11 23:25:28 2009] [error] [client 127.0.0.1] (13)Permission denied: file permissions deny server access: /var/www/html/index.html

I tracked down the permissions issue. Is there a good reason not to change the group to apache and remove world read?
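No strong reason not to; group access for apache plus no world read is a common hardening step on a dedicated web box. On Fedora, just remember SELinux must also agree. A sketch:

Code:
chgrp -R apache /var/www/html
chmod -R g+r,o-r /var/www/html
# directories additionally need g+x so apache can traverse them
find /var/www/html -type d -exec chmod g+x {} +

# keep the SELinux context right, or httpd is still denied
restorecon -Rv /var/www/html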
Today the power was suddenly cut in my house, and my home Ubuntu server restarted when it came back on. But when I use my laptop to view my website, index.html suddenly comes up as a blank page. I cleared the Firefox cache; that didn't work, still blank. I changed browsers to SeaMonkey and index.html was still blank, so I am sure the problem is caused by the server. Then I put the index.html file into a subdirectory, /var/www/home/index.html, and at that address, [url], I can view my website's main page index.html.
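Worth checking: after an unclean shutdown, a file that was being written when the power died can come back zero-length (ext3's journal protects metadata, not file contents). A quick check, assuming the standard path:

Code:
ls -l /var/www/index.html
# a 0-byte file here would explain the blank page in every browser;
# if so, restore the file from backup or re-upload it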
I want to put Google Analytics code on all the legacy pages on my server. Most of them don't use a template, so I was wondering whether Apache can append the code automatically.
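Apache can rewrite response bodies on the fly with mod_substitute (available from 2.2.7 on); injecting a snippet before </body> on HTML responses is the usual trick. A sketch, where /analytics.js is an illustrative name for a file holding your tracking code:

Code:
# in the vhost; requires mod_substitute (and mod_filter) to be loaded
AddOutputFilterByType SUBSTITUTE text/html
Substitute "s|</body>|<script src='/analytics.js'></script></body>|i"

The trade-off of this design is that every HTML response gets scanned and rewritten on each request, which costs some CPU; for a large legacy tree that is usually still cheaper than editing every page.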
I would like to set up a working Apache fairly quickly, to launch a pretty small web site of 10 static HTML pages. At this point there are no security concerns (even though I want to do it right), because the computer doesn't even have an Ethernet cable attached. I have some experience: eight years ago I was setting up virtual hosts on Fedora, so this process is not a black box for me.
I used wget -r to get all the web pages linked from index.html. The pages listed in index.html are all chapters, and after wget -r all the chapters are in the same folder on my local hard drive. Is there a way to build the chapters, in their proper order, into one "long"/"full" web page, rather than having each chapter as a link on the previous page?
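wget has no option for this, but once the files are local you can stitch them together with a short script. A rough sketch; it assumes the chapter files happen to sort into reading order by name (often they don't, in which case list them by hand), and it discards each chapter's <head>, keeping only the bodies:

Code:
#!/bin/sh
# concat-chapters.sh -- naive join of chapter bodies into one page
out=full-book.html
printf '<html><body>\n' > "$out"
for f in chapter*.html; do
    # keep what lies between <body> and </body>, minus the tags themselves
    sed -n '/<body/,/<\/body>/p' "$f" | sed 's|</*body[^>]*>||g' >> "$out"
done
printf '</body></html>\n' >> "$out"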
What is the best way (i.e., a standard way supported by all browsers, and ideally also followed by web crawlers) to include an HTML file, local or external, in another? Of course I've done the research, and I know about server-side includes (PHP, ASP, you name it). At the moment I'm using this:
Quote:
<script type="text/javascript" src="path to file/include-file.js"> </script>
However, I've been warned that this method may not show up in some browsers, since some ignore this tag, and that crawlers like your favorite search engine won't bother reading it. So what is the best and safest way to do the job? By the way, the reason I ruled out SSIs from the start is, among other things:
1) the included file is static HTML, and the text is included pretty much everywhere
2) the hope of reducing load time, since the code (if successfully recognized) would be treated like any other embedded external file (e.g., an image) and therefore cached, without needing to be downloaded over and over for each new page on the site.
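For completeness, the script-free client-side option is an <object> (or <iframe>) element; it needs no JavaScript, though search engines still index the included file as a separate page rather than as part of the parent, so it doesn't fully solve the crawler concern either:

Quote:
<object type="text/html" data="path-to-file/include-file.html" width="100%">
  <a href="path-to-file/include-file.html">included content</a>
</object>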
1. Web server (CentOS 5.5)
2. Mail server (CentOS 5.5)
We have successfully configured autossh to create and manage an SSH tunnel into the mail server, in order to dump all email to a localhost port.
To auto-start autossh at boot time, we have included the following in /etc/rc.d/rc.local:
Quote:
So whenever our web application wants to send out email, it dumps it all to localhost:33465. Easy peasy; it all works great.
Now we have a requirement that Logwatch reports be delivered via the same SSH tunnel, rather than installing Postfix and configuring it as a relay.
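Logwatch simply executes whatever its mailer setting names and feeds the report to it on stdin, so it can be pointed at anything that relays a message. A hedged sketch: a tiny Net::SMTP wrapper aimed at the tunnel port from above (the script and its path are my invention, not a stock tool, and the To/From parsing is deliberately crude):

Code:
#!/usr/bin/perl
# /usr/local/bin/tunnelmail -- relay an RFC822 message from stdin
# through the autossh tunnel listening on localhost:33465
use strict;
use warnings;
use Net::SMTP;

my $msg    = do { local $/; <STDIN> };        # slurp the whole message
my ($to)   = $msg =~ /^To:\s*(\S+)/mi;
my ($from) = $msg =~ /^From:\s*(\S+)/mi;

my $smtp = Net::SMTP->new('127.0.0.1', Port => 33465, Timeout => 30)
    or die "cannot reach tunnel: $!";
$smtp->mail($from || 'logwatch@localhost');
$smtp->to($to    || 'root@localhost');
$smtp->data($msg);
$smtp->quit;

Then make it executable and point Logwatch at it in /etc/logwatch/conf/logwatch.conf:

Code:
mailer = "/usr/local/bin/tunnelmail"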
A mounted second hard disk still reports 0 bytes free even after files have been deleted, on RHEL 5. I have already checked lost+found and the trash. It only happens once the disk reaches full capacity: then the space from deleted files cannot be recovered, whereas if the disk has not yet reached full capacity, deleting files does recover space. The disk is formatted ext3; I also tried NTFS via FUSE with the same problem. Once the disk is allowed to reach 0 bytes free, I can no longer recover space by deleting files and have to reformat and restore from backup.
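One classic explanation when this only happens at 100% full: a process still holds the deleted files open, and ext3 frees the blocks only when the last open handle goes away, so the space never comes back while that process runs. A quick check:

Code:
# list files that are deleted but still held open by some process
lsof +L1        # or: lsof | grep '(deleted)'

# restarting (or signalling) the process that holds them releases the space

(ext3 also reserves 5% of the blocks for root by default, which can make a "full" disk behave oddly for ordinary users; tune2fs -m shows or changes that.)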
When I do xhost + it also hangs. Without xclock I am not able to run Oracle Reports in production for barcode printing, because I am not able to start a standalone Reports server. I need help starting xclock, and pointers on where to look.
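When xhost + itself hangs, the X server named by DISPLAY is usually unreachable, so nothing X-based (xclock, the Reports engine) will start. Worth checking first:

Code:
echo $DISPLAY              # set at all, and pointing at the host you expect?
xdpyinfo > /dev/null && echo "X server reachable"

If the box is headless, the common approach for Oracle Reports is to point DISPLAY at a virtual framebuffer X server (Xvfb) rather than a real display, but confirming DISPLAY is the first step.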