General :: Capture Response Of 20 Different URL Hits In Single Shell Script?
Aug 5, 2010
I need to execute 20 URLs in one shell script, display their responses on the console, and write them to a text file too. Consider the case when a URL is not responding.
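A minimal sketch of such a loop, assuming curl is available; the URL list, the timeout, and the log file name are placeholders to replace with your own 20 URLs:

```shell
#!/bin/bash
# Placeholder list - substitute your 20 URLs here
urls="http://example.com http://example.org"
logfile=url_responses.txt

check_urls() {
    for url in $1; do
        # --max-time guards against URLs that never respond;
        # curl prints 000 when it could not get any HTTP response
        code=$(curl -s -o /dev/null --max-time 5 -w '%{http_code}' "$url")
        if [ "$code" = "000" ]; then
            line="$url NOT RESPONDING"
        else
            line="$url HTTP $code"
        fi
        # tee shows the line on the console AND appends it to the file
        echo "$line" | tee -a "$logfile"
    done
}

check_urls "$urls"
```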
I have to admit that I registered on LQ after I failed to search for similar solutions. Let me see whether I can explain my problem clearly. I need to extract a single report from a big file. The big file looks something like this:
Report for yyyyyy
Your info 999-9999999
End of Report
[code]....
I need to search for a user provided string, say 999-9999999, in the big file. Then I have to extract the single report. My logic is simple,
1) find 999-9999999
2) search backward for "Report for", note down the line number
3) search forward for "End of Report", note down the line number
4) extract the record using the line numbers found in steps 2) and 3).
I am trying to do this in bash, with awk and sed (I am new to both).
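The backward/forward search can be collapsed into a single awk pass: remember where the current "Report for" block started, and print the whole block once "End of Report" is reached if the key was seen inside it. A sketch, assuming the record markers look exactly as in the sample above:

```shell
# usage: extract_report KEY FILE   e.g. extract_report 999-9999999 bigfile.txt
extract_report() {
    awk -v key="$1" '
        /^Report for/    { buf = ""; found = 0 }      # start a new record
        { buf = buf $0 "\n" }                         # accumulate its lines
        index($0, key)   { found = 1 }                # key seen in this record
        /^End of Report/ { if (found) printf "%s", buf }
    ' "$2"
}
```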
I need to create a script that checks how full a partition is and, when it hits a certain percentage (say 90%), executes another script that tars up certain files (or they could just be part of the same script). I would create a cronjob that runs this script once a day.
I have the script that tars up the files I need, sets permissions, etc. (btw, the files in question are audit logs). I just need the part that runs something like a df -h, takes the use percentage of the /var partition from that output, and if that percentage is greater than or equal to 90%, kicks off the tar script.
Here is a snippet of the df -h output with just the /var partition shown:
Quote:
So, when the cronjob sees that the Use% is >= 90%, it would kick off the tar script; if not above 90%, it exits.
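A sketch of that missing piece, assuming GNU df; /path/to/tar_script.sh is a placeholder for the existing tar script:

```shell
#!/bin/bash
THRESHOLD=90

# Print the Use% of the given mount point, digits only.
# -P keeps df output on one line per filesystem so $5 is always Use%.
usage_pct() {
    df -hP "$1" | awk 'NR==2 { gsub(/%/, "", $5); print $5 }'
}

pct=$(usage_pct /var)
if [ "$pct" -ge "$THRESHOLD" ]; then
    echo "/var is at ${pct}% - kicking off the tar script"
    # /path/to/tar_script.sh   # placeholder: call the existing tar script here
fi
```

Dropped into cron once a day, the script does nothing (exits quietly) while /var stays under the threshold.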
Some files have a list of hardware errors (we test electronic components), some have none. If a file has no errors, I still want to display a message, like so:
Code:
grep ^err R*VER && echo "No error"
FILEA.TXT:err->USB3910err
FILED.TXT:err
No Error
This grep statement works, but it seemingly overrides the find statement above if I run both at the same time. How can I combine the two statements to create a report that lists the filename and error(s) like so:
Code:
FILEA.TXT Button3320err
FILEB.TXT USB3235err
FILEC.TXT IR Remote2436err
FILED.TXT No error
Is it possible to return "No error" with the file name without error?
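One way, sketched below: loop over the files yourself instead of letting grep print its own filename prefixes, so a file with no match can get its own "No error" line. The R*VER glob is taken from the question; adjust it (and any trimming of the err-> prefix) to your naming scheme:

```shell
# Print "FILENAME errors..." or "FILENAME No error" for each file given.
error_report() {
    for f in "$@"; do
        # 2>/dev/null keeps a non-matching glob (passed literally) quiet
        errors=$(grep '^err' "$f" 2>/dev/null)
        if [ -n "$errors" ]; then
            echo "$f $errors"
        else
            echo "$f No error"
        fi
    done
}

error_report R*VER
```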
I have been given an IP address, say 172.172.200.1 with mask 255.255.255.0. Now when I ping 172.172.0.1 or 172.172.10.1 I get no response. If I ping 172.172.200.30 it responds.
I have a slave node uploading all kinds of backups to my server on the internet. Now I would like to display the actual upload and download rate to this server (not the entire NIC traffic, any protocol) in a small PHP page for easy monitoring. I had a look at quite some monitoring tools, and the one which kind of offers what I am looking for is iftop with a filter on the IP of my server. As I would like to periodically update a file with the actual rates, an interactive program won't do. A possibility would be to filter the packets myself using [...], but this seems to be quite a long shot. The optimal solution would be a program or script printing the actual upload rate to a host specified in the options to STDOUT.
Does anyone know of a way to single-step through Bash shell scripts in order to debug? (similar to the way Windows 98 used to let you single-step through autoexec.bat)
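Two options worth sketching: `bash -x` traces every command as it runs, and a DEBUG trap can pause before each command, much like stepping through autoexec.bat. The demo script name below is a placeholder:

```shell
# Write a tiny demo script; uncomment the trap line inside it for
# true interactive single-stepping (Enter runs the next command).
cat > stepdemo.sh <<'EOF'
#!/bin/bash
# trap 'read -p "about to run: $BASH_COMMAND "' DEBUG   # interactive stepping
echo "step one"
echo "step two"
EOF

# Non-interactive trace: each command is printed (prefixed with +) before it runs
bash -x stepdemo.sh
```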
I want to write a single shell script that allows me, once executed from a panel launcher, to change the image preview setting between "local files only" and "never". Right now I have two tiny scripts, one for local files only and another one for never, that is:
[Code]...
and the other says string "local_only". But that means I need to have two launchers, because I don't know how to write the condition "when set to never, change it to local only". And is it possible to make a script that also changes the launcher's icon when the preview config is set to one or the other value? That way I'd know what it is set to just by looking at it. It would act as a diagnostic and therapeutic tool XD
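A sketch of a single toggle script. It assumes the two existing scripts use gconftool-2 and the GNOME 2 Nautilus key shown below; if yours use a different key or gsettings, swap that part in:

```shell
#!/bin/bash
# Assumed key - replace with whatever your two one-setting scripts use
KEY=/apps/nautilus/preferences/show_image_thumbnails

# Pure toggle logic: print the opposite of the given value
toggle_preview() {
    if [ "$1" = "never" ]; then
        echo "local_only"
    else
        echo "never"
    fi
}

current=$(gconftool-2 --get "$KEY" 2>/dev/null)
new=$(toggle_preview "$current")
gconftool-2 --type string --set "$KEY" "$new" 2>/dev/null
echo "preview setting now: $new"
```

For the icon idea, the same script could copy one of two icon files over the one named in the launcher's .desktop entry after each toggle.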
I am trying to see whether wget can be used to generate actual URL hits on a webpage. This does not look good so far. I changed the following lines in /etc/wgetrc to:
Code:
http_proxy = http://<proxy_ip>:<port>/
use_proxy = on

Output:
2011-01-16 12:26:39 (88,9 KB/s) - `index.html.3' saved [50548]

This does NOT generate a hit on the actual web page! It does not seem like the > /dev/null part is working either. How can I get this to work?
My script looks really crap and messy, the logic isn't great, and I'm not hugely happy with it. Also it echoes $i instead of an actual IP address (line 10). How can I improve this? It basically searches through /var/log/messages for multiple FTP hits, and when the hit count is higher than a specific number the IP is added to a config file and FTP is restarted. There are some obvious flaws in my script.
if [ $HITNUMB -gt $MAXHITS ]; then
    for i in $HIGHIP; do
        echo "$i"
        # double quotes are needed so $i expands; with single quotes
        # sed appends the literal text ,$i instead of the IP
        sed -i "78s/\$/,$i/" /opt/etc/proftpd.conf
        /root/ftp restart
    done
else
    echo "not greater than $MAXHITS"
fi
I'm not even sure what will happen if I get multiple responses for my $TOPHITS. It would be cool if it could somehow check for IPs already blacklisted; it might actually be easier to just create a file with a set of blacklisted IPs or something.
I need a script that will search some logs and extract IP hits from one country only, let's say the UK. I guess I need to use GeoIP or some database. I just need a very simple bash, perl, or php script that will do this job: just search through the (apache) logs and then give me the number of hits found from the UK.
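A bash sketch of that job. It assumes a combined-format apache log (client IP in the first field) and the geoiplookup tool from the GeoIP package for the country lookup:

```shell
# Print each distinct client IP from an apache access log (first field)
extract_ips() {
    awk '{print $1}' "$1" | sort -u
}

# usage: count_country access.log "United Kingdom"
# Sums the hit counts of all IPs that geoiplookup places in the country.
count_country() {
    hits=0
    for ip in $(extract_ips "$1"); do
        if geoiplookup "$ip" 2>/dev/null | grep -q "$2"; then
            n=$(grep -c "^$ip " "$1")
            hits=$((hits + n))
        fi
    done
    echo "$hits"
}
```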
I haven't recompiled my kernel in a while, but whenever I did it, it was all pretty easy: make menuconfig; (adjust); make && make modules_install; copy over bzImage from */arch/* and System.map to /boot, and stick a new entry into the grub menu.lst. However, this time I must be missing something, because I get a kernel panic on booting the new kernel. Is there a step I'm missing? Certainly, I was looking in /etc/rc.d and there is an rc.modules<kernelver> script in there. I wondered if I need to make a new one, although when I looked it over, it seems only to be required when forcing particular modules.
In OpenSuSE 11.2 I followed this url to get it to work: Genius MousePen 8x6. It worked like a charm. In OpenSuSE 11.3 the Xorg stuff is different. In /etc/X11/xorg.conf.d there is a wizardpen.conf, but when I use the tablet the mouse hits the top left corner. Using the settings from the url above doesn't make it work either.
MACHINE: HP ProLiant DL260 G5
OS: SLES 11 SP1
kernel: Linux xserver 2.6.32.12-0.7-default #1 SMP 2010-05-20 11:14:20 +0200 x86_64 x86_64 x86_64 GNU/Linux
It is used as a remote X server in a LAN. I have configured /usr/lib/restricted/bin/.rbashrc with some environment variables, but when users log on to the system, $HOME/.bashrc is executed last and some environment vars are overwritten.
I am wondering if I can open a shell or a new terminal from within the terminal in a Unix/Linux environment, particularly on a command-line-only system where there is no GUI. Is this doable? How do I do it?
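It is doable even without a GUI: typing `bash` nests a fresh shell inside the current one ($SHLVL shows the nesting depth), and screen or tmux, if installed, give several switchable virtual terminals on one console. A small sketch:

```shell
# One-off nested shell running a single command; interactively you would
# just type `bash`, then `exit` or Ctrl-D to return to the parent shell.
bash -c 'echo "nested shell, depth $SHLVL"'

# For real extra "terminals" on a GUI-less console (if installed):
#   screen    # Ctrl-a c opens a new window, Ctrl-a n cycles through them
#   tmux      # Ctrl-b c / Ctrl-b n do the same
```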
I have a question about UNIX sockets. My goal is to connect multiple sockets from a single client to a single server and keep them open. I'm not sure if that is possible. Do you have any suggestions or an example of code?
I have a Toshiba phone system that has a feature where it spits out raw data, and I can direct that data to a serial port on the phone system or to an IP/port. The data is just ASCII data - it tells me the calls that are made, time, etc.
I would like to set up an existing Linux box to capture this data and store it for me.
Then later I can build reports based on that data (and for me a report could be just a few cats/greps/etc., so basic stuff) - but I can figure this out later.
For now I would just like to GRAB the data on a port and dump it into a file (and recycle the files every 24 hours).
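A sketch using nc (netcat) to listen on a TCP port and append everything to a file named after the current day, so a new file starts every 24 hours. The port is a placeholder, and nc flags differ between netcat flavours (traditional nc wants -l -p PORT, OpenBSD nc wants -l PORT), so adjust for your box:

```shell
#!/bin/bash
PORT=9100   # placeholder: the IP/port the phone system sends to

# File name for today's capture, e.g. calls-2010-08-05.log
logfile_for_today() {
    echo "calls-$(date +%F).log"
}

# Re-listen after each connection; the date in the file name rolls
# the capture over to a fresh file each day.
capture() {
    while true; do
        nc -l -p "$PORT" >> "$(logfile_for_today)"
    done
}

# capture &   # run in the background, or start it from an init script
```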
When I check for updates in Update Manager, instead of downloading the files, it goes through the files very quickly and says "Hit" for every file. If I try to update this information manually by running sudo apt-get update, I get this:
I do a lot of beta testing on various distros and encounter a problem during initial installs. How does one capture the screen display during installation when an error occurs? The display usually doesn't stay on long enough to copy the information, even when moving to an error console (such as Alt-F3), which often doesn't have the install data, especially errors. Screen capture programs are not up and running at that point. I've tried taking a picture of the screen with a digital camera, but the results were not very good.
If I open a terminal and hit [TAB] [TAB] it will display "Display all 2583 possibilities? (y or n)". If I press y, is there a way to capture the output and write it to a file? It's not like it is a command, so I can't just use a redirect to a file. If not, I guess I could just do an ls on all locations in $PATH and capture that to a file.
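There is a redirectable equivalent: the [TAB][TAB] list is bash completing over every command name in $PATH (plus builtins, aliases, and functions), and the compgen builtin prints the same candidates non-interactively. A one-liner, assuming bash:

```shell
# Same list the double-TAB prompt offers, written straight to a file
compgen -c | sort -u > commands.txt
wc -l commands.txt
```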
I'm trying to capture a video from a site in Japan. Unfortunately, most of the video downloader tools don't seem to work. Since I'm using Firefox, I'm guessing I can't look in my local cache. Is there any way to find it? They might block you if you're not in Japan. I used to use a proxy server when I was in the US, but after a while it didn't work.