General :: Software Rec - Monitoring Changes In A Webpage
Oct 14, 2010
I have some web pages that I need to check frequently. Is there any tool that will check for updates and send notifications by mail or RSS? These pages do not support RSS.
I am looking for a standalone Linux application, a browser extension, or a service that sends mail.
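A minimal sketch of one common DIY approach, using wget, diff, and mail from a cron job (the URL, paths, and address are placeholders, and mail assumes a working MTA on the box):
Code:
#!/bin/bash
# Fetch the page, compare it with the last saved copy, mail on change.
URL="http://example.com/page.html"   # placeholder
OLD=/var/tmp/page.old
NEW=/var/tmp/page.new

wget -q -O "$NEW" "$URL" || exit 1
if [ -f "$OLD" ] && ! diff -q "$OLD" "$NEW" >/dev/null; then
    diff "$OLD" "$NEW" | mail -s "Page changed: $URL" you@example.com
fi
mv "$NEW" "$OLD"
Run it from cron (e.g. hourly) and it only mails when the page content actually differs.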
I'm having issues connecting to my network; it keeps failing to connect and says the webpage cannot be displayed. I have other computers here that connect just fine. I was wondering, is there a Linux command like ipconfig or something?
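For reference, the rough Linux equivalents of ipconfig, plus a couple of quick connectivity checks (the ping targets are just examples):
Code:
ifconfig              # older equivalent of ipconfig
ip addr show          # newer equivalent: interfaces and addresses
ip route show         # check the default gateway
ping -c 3 8.8.8.8     # raw connectivity, bypassing DNS
ping -c 3 google.com  # checks DNS resolution too
If the IP pings work but the hostname ping fails, the problem is likely DNS rather than the connection itself.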
I have a webpage whose charset is ISO-8859-9, and it was prepared in Windows with ASCII character encoding. When I open it via gedit in Ubuntu over FTP, the Turkish characters change (e.g. Turkish ı becomes ý). What can I do to avoid this kind of problem?
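One hedged fix is to convert the file to UTF-8 with iconv before editing, assuming the file really is ISO-8859-9:
Code:
# Convert from ISO-8859-9 (Turkish) to UTF-8 into a new file
iconv -f ISO-8859-9 -t UTF-8 -o page.utf8.html page.html
Alternatively, gedit's Open dialog has a Character Encoding drop-down where ISO-8859-9 can be selected explicitly instead of letting gedit guess.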
I have set up a Linux box running Slackware 12.0 with Apache 2.2.4. On my LAN I have a couple of computers, and I want to force them to a webpage under the document root; the webpage will be an agreement page. Is this possible to do with Apache? This will not be a real domain, so my guess is that I would have to tell my DNS server to resolve the IP address to the hostname of my Slackware box.
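Yes; one hedged sketch with mod_rewrite in the site's VirtualHost (the page name is an assumption, and mod_rewrite must be enabled):
Code:
# Send every request to the agreement page
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/agreement\.html$
RewriteRule ^.*$ /agreement.html [R,L]
For the name resolution part, either a DNS record or an /etc/hosts entry on each client pointing the chosen hostname at the Slackware box will do.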
I have just installed Conky on my Linux Mint 9 system. I would like to know if I can set up Conky to auto-reload a webpage. FYI, the webpage does not have an RSS feed, so I cannot set Conky to use RSS.
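Conky cannot render a webpage itself, but it can poll a page's text with curl through its execi variable; a sketch for the TEXT section of ~/.conkyrc (URL and interval are placeholders):
Code:
# Re-fetch the page every 300 seconds and show the first 5 lines
${execi 300 curl -s http://example.com/status.txt | head -n 5}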
I just set up my first Linux box using [URL]; everything went fine except for one problem that I cannot seem to solve. I've set up a webpage on the box for my company's intranet for testing purposes, but I cannot get the links to work. On the server itself all the links work, although Firefox still asks me to authenticate with Adobe Flash Player (player10). When I access the page from another computer I have the following issues:
1. Even though hostname -f shows a fully qualified domain name, I have to use the IP address, e.g. 192.168.100.100.
2. I can access the page, but the links leading to the other pages do not work; I get a "Webpage cannot be found" or "HTTP 404 Not Found" error message.
3. None of the embedded pictures show up; I get the red X.
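Issues 1-3 together often mean the pages use absolute links built from a hostname the client machines cannot resolve. A hedged first step is to make that name resolvable from each client, e.g. (names are placeholders):
Code:
# /etc/hosts on each client machine
192.168.100.100   myserver.example.local   myserver
If the links and images then load by hostname, switching to a proper DNS entry (or keeping the hosts entries) fixes it for good.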
My PHP webpage is below. I would like to enter "Hello, in the main inputbox field" below "(You are editing: textfile.txt)" and click "SAVE", directly from the command line.
Sort of: wput_php "hello, in the main inputbox field", click save, and there it is; the text would be uploaded.
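Assuming the PHP form does an ordinary POST, curl can submit it from the command line; a sketch in which the field names (file, content, save) are guesses that must be replaced with the actual input names from the form's HTML:
Code:
curl -d "file=textfile.txt" \
     -d "content=Hello, in the main inputbox field" \
     -d "save=SAVE" \
     http://example.com/edit.php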
I have been trying to set up SNMP monitoring using Cricket. I believe Cricket is running fine; however, I also believe that I have SNMP set up wrong. When I do an snmpwalk I get the following:
[Code]....
This tells me I have SNMP running; however, I would expect to see a lot more data. Am I correct that I should see more data? I spoke to a friend, and he suggested that something is not set up correctly in SNMP, having to do with the community string...
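For comparison, a minimal /etc/snmp/snmpd.conf that exposes the full tree read-only to one community (community name and network are placeholders); many distributions ship a default that restricts the view to a tiny subtree, which would explain a sparse snmpwalk:
Code:
# /etc/snmp/snmpd.conf - minimal read-only configuration
rocommunity public 192.168.1.0/24
# restart snmpd, then test with:
#   snmpwalk -v2c -c public localhost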
I wonder if there is any tool that can read the health of my HDDs? There are tons in Windows, but what options are there in Linux? I would really hope there is a tool I can install without using the terminal, because I don't have web access on the Xubuntu machine right now. If anyone is aware of such a tool, preferably as a .deb file, I could install it easily without hassle. Also, I hope tools like that are not littered with tons of dependencies, because those dependencies are a major challenge without an online connection. If there are any smart terminal commands that will let me check health status, please let me know and write down the command. I have enabled S.M.A.R.T. in the BIOS. I have 4 drives running RAID 0, if that matters.
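For the record, the usual terminal route is smartmontools (a single .deb with few dependencies; GSmartControl is a GUI front-end for it). A sketch:
Code:
# offline: download the smartmontools .deb and install with dpkg -i
sudo apt-get install smartmontools
sudo smartctl -H /dev/sda   # overall health verdict for the first drive
sudo smartctl -a /dev/sda   # full SMART attribute table
Note that drives behind some RAID controllers need an extra -d option; smartctl will say so if it cannot talk to the disk directly.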
I'm trying to write a bash script to monitor for changes in web pages. I'm using wget (through a cron job) to collect the pages, then using diff to compare the previous and current page for changes, then sending a notification if a change is detected. This was all working great in a simple test environment until I tried monitoring a page that had a dynamic link, where nearly every test brought up a difference. So to my question: is there a way to do a diff on a couple of web pages but configure it to exclude dynamic links?
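One common workaround is to normalize the volatile parts before comparing, or let diff itself ignore them; a sketch assuming the dynamic links carry something like ?session=...:
Code:
# Strip the volatile query strings, then compare (pattern is an assumption)
sed 's/?session=[^"]*//g' old.html > old.clean
sed 's/?session=[^"]*//g' new.html > new.clean
diff -q old.clean new.clean >/dev/null || echo "Page changed"

# Or: ignore differing lines that all match a pattern
diff -I 'session=' old.html new.html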
I just need to ask about any existing tool in Linux that can show the CPU, memory, and swap utilization of the overall system for a particular time duration and generate graphs. I am a student of computer science and want this resource-utilization information for my project; kindly reply if any of you Linux fans knows about such tools.
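The sysstat package's sar does exactly this kind of time-windowed collection, and graphs can be layered on with tools such as kSar. A sketch (the -S swap flag exists on newer sysstat; older versions fold swap into -r):
Code:
sar -o /tmp/perf.data 5 120   # sample every 5 s, 120 times, into a file
sar -u -f /tmp/perf.data      # replay CPU utilization
sar -r -f /tmp/perf.data      # replay memory usage
sar -S -f /tmp/perf.data      # replay swap usage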
I am trying to use GKrellM [URL] to monitor my system's fans/temperatures (I am trying to undervolt the fans a bit to make the system quieter), but there don't seem to be any sensors available. I have lm_sensors installed.
Is there a 'top'-like command for monitoring the GPU and memory usage of a video card? I am most interested in Linux commands, but any OS would be interesting. I strongly suspect that for a group of my systems the video cards are being under-utilized (but I have no idea by how much) and would like to re-allocate funds to other bottlenecks. We are using higher-end cards, so the price difference between cards is significant.
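If the cards are NVIDIA, the driver's nvidia-smi utility gives a snapshot of utilization and memory; this is vendor-specific, so other cards need other tools. A sketch:
Code:
nvidia-smi              # one-off snapshot of GPU/memory usage
watch -n 2 nvidia-smi   # refresh every 2 seconds, top-style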
I'm trying to find a good host for my site, and I've been trying to get one with a fast, reliable connection - I frequently use it as a proxy server for various areas, and since my connection is frequently a 50 Mb or 100 Mb line, I need a fast network connection so it doesn't slow down too much when I switch to the proxy server.
At the moment, I've narrowed it down to a couple of providers that are all within a few hops and <6 ms away from my main location; it pretty much comes down to connection speed. One in particular that I'm trialing offers an unmetered 100/100 connection at a good price, so I'm hoping to go with them, but I want to make sure the connection is solid and doesn't drop to a much lower speed during the day if they are sharing it among many users.
I have a pretty simple speed-test procedure: I use wget on a 1 GB file hosted on a backbone with a 1 Gb/s connection and see how fast it downloads. For now, it's sticking at a steady 11.2 MB/s, or about 90 Mb/s, which is fast enough for me, but I need to make sure it can maintain that speed even in times of heavy usage.
[/BACKSTORY]
Basically, I need a script that runs wget every half hour and logs the output and time in a reasonably readable format. It's probably simple enough to do, but I'm just learning my way around the Linux command shell, so some simple instructions on how to create and run such a script would be great [CentOS 5]. I have full root access to the server, and I'm the only user on it, if that matters.
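A minimal sketch of such a script plus its cron entry (the test-file URL, log path, and the grep pattern for wget's speed figure are assumptions):
Code:
#!/bin/bash
# /root/speedtest.sh - log a timestamp and wget's measured speed
URL="http://speedtest.example.com/1GB.bin"   # placeholder test file
LOG=/var/log/speedtest.log

SPEED=$(wget -O /dev/null "$URL" 2>&1 | grep -o '([0-9.]* [KM]B/s)' | tail -n 1)
echo "$(date '+%Y-%m-%d %H:%M:%S')  $SPEED" >> "$LOG"
Make it executable with chmod +x /root/speedtest.sh, then add this line via crontab -e to run it every half hour:
Code:
*/30 * * * * /root/speedtest.sh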
Can you provide some form of monitoring on this server, or recommend any server-side applications that could monitor its status in high detail, including traffic, etc.?
I have a Dell Precision M4500, Intel Core i5 CPU, running Linux (Ubuntu Lucid), and would like to keep an eye on CPU temperature. I've tried lm-sensors: sensors-detect didn't find any sensors; following its hint ("This is relatively common on laptops, where thermal management is handled by ACPI rather than the OS.") I tried acpi -V but got nothing thermal. The Gnome panel applet "Hardware Sensors Monitor" reports on GPU temperature but nothing else.
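When ACPI owns thermal management, the kernel sometimes still exposes readings under /sys even though lm-sensors finds nothing; a hedged check:
Code:
ls /sys/class/thermal/                      # list any ACPI thermal zones
cat /sys/class/thermal/thermal_zone0/temp   # millidegrees C, if present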
I need serial port monitoring software that works on Windows 7 64-bit or GNU/Linux and supports at least RS-232. Bonus points if it's free software, or at least freeware. I just need to be able to see what goes into and out of the port. At this time, I don't need any protocol deciphering or anything else complicated.
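On GNU/Linux, one hedged approach is to interpose socat between the application and the real port and let it hex-dump the traffic (assuming the device is /dev/ttyS0):
Code:
# Create a virtual port /tmp/ttyV0 proxied to the real /dev/ttyS0;
# -x hex-dumps everything passing in either direction to stderr
socat -x PTY,link=/tmp/ttyV0,raw,echo=0 /dev/ttyS0,raw,echo=0
Point the application at /tmp/ttyV0 instead of the real device and watch the dump.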
I'm faced with standing up an open source NMS and am deep into Zenoss Core. I'm evaluating distributed collectors that will be deployed behind a customer NAT/firewall. Cool, this works. But what if the customer IP space overlaps with an existing customer's IP space? From a management perspective, Zenoss distinguishes devices by IP, so it will refuse to add duplicate addresses. Getting multi-realm IP functionality would require purchasing an enterprise license subscription.
So, my pickle: do I spend weeks or months hacking the Zenoss substructure to duplicate that? Do I somehow remap the IPs through a site-to-site VPN at the router? Or do I look for a different open source solution? Does anyone know of an open source NMS solution that addresses overlapping IP space and can do distributed collectors? I have posted a question on the Zabbix forums asking if their distributed monitoring will do this, but I hope that someone else has tackled this and succeeded.
Is there a clever way to monitor the progress (as a percentage or hash marks) of copying a large file (using pv could be an option)? Like monitoring the progress of a copy command such as this:
Code:
cp linux.iso /tmp/
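A sketch of the pv option mentioned above; pv knows the input size when reading a file, so it can show a progress bar, rate, and ETA:
Code:
pv linux.iso > /tmp/linux.iso
Note that unlike cp this does not preserve permissions or ownership, so it is best suited to plain data files like an ISO.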
File information:
3 records in input file found
3 records processed
0 records skipped
Load statistics:
3 messages loaded correctly
0 messages ignored
0 messages with errors
Details:
Destination        OK  ignored  errors  correct  incorrect  not sel.  other
house Server (def   3        0       0        0          2         1      0
2011-05-17 13:21:27 - Msg 2410 Archiving information: File /path/to/xxx.txt was archived as /path/to/xxx.txt.
Now I want to monitor this "house Server (def" line and send an alert based on the counts 3 0 0 0 2 1 0; say: if [ "$5" -gt 0 ] || [ "$6" -gt 0 ]; then <send email>.
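A minimal sketch of that check in shell, assuming the log lines look exactly as shown above; which two counters should trigger the alert is an assumption mirroring the $5/$6 above:
Code:
#!/bin/bash
# Pull the seven counters off the "house Server (def" line and alert
# when the 5th or 6th one ('incorrect' / 'not sel.') is non-zero.
LINE=$(grep 'house Server (def' /path/to/logfile)
set -- $(echo "$LINE" | grep -o '[0-9]\+')
if [ "$5" -gt 0 ] || [ "$6" -gt 0 ]; then
    echo "$LINE" | mail -s "house Server alert" admin@example.com
fi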