Networking :: See Whether Wget Can Be Used To Generate Actual Url Hits On A Webpage?

Jan 16, 2011

I am trying to see whether wget can be used to generate actual url hits on a webpage. This does not look good so far. I changed the following lines in /etc/wgetrc to:

Code:
http_proxy = http://<proxy_ip>:<port>/
use_proxy = on

Output:

Code: root# wget -c <url>/ > /dev/null
--2011-01-16 12:26:38-- <url>
Connecting to <proxy_ip>:<port>... connected.

[code]....

2011-01-16 12:26:39 (88,9 KB/s) - `index.html.3' saved [50548]

This does NOT generate a hit on the actual web page! It does not seem like the > /dev/null part is working either... How can I get this to work?
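Two things are worth noting here, plus a hedged sketch. wget writes the page to a file rather than to stdout, so the trailing > /dev/null has no effect on the download; -O /dev/null is the usual way to discard it. And if the proxy answers from its cache (or the site counts hits with JavaScript analytics), the origin server may never see the request at all. A rough example using the same placeholders:

Code:
root# wget --no-cache -O /dev/null <url>

--no-cache asks intermediate caches to revalidate with the origin server. If the hit counter is JavaScript-based, wget will never register a hit regardless of options, because it does not execute scripts.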

View 4 Replies



Ubuntu :: Using Wget To Save A Frequently Updated Webpage?

Mar 3, 2010

I'm trying to figure out how to use wget to save a copy of a page that is frequently updated. Ideally, what I'd like it to do is save a copy of the page every minute or so. I don't need multiple copies; I just need to know what the most recent version was. Also, if the page disappears for whatever reason, I don't want it to save the error page, just wait until the page is up again.
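One hedged way to do this (the URL and paths below are illustrative): fetch to a temporary file and only overwrite the saved copy when wget exits successfully, so an error page or an outage never clobbers the last good version.

Code:
#!/bin/bash
URL="http://example.com/page.html"
OUT="$HOME/page-latest.html"
TMP=$(mktemp)
if wget -q --timeout=30 -O "$TMP" "$URL"; then
    mv "$TMP" "$OUT"    # keep only the most recent successful copy
else
    rm -f "$TMP"        # leave the previous copy untouched on failure
fi

A crontab entry such as * * * * * /home/user/fetch-page.sh runs it every minute; wget normally exits non-zero on server errors, which is what keeps the error pages out.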

View 2 Replies View Related

Ubuntu :: Wget Webpage And Some Links For Offline View

Apr 25, 2010

I've looked around the other threads as well as the wget man page. I also Googled for some examples. I still cannot work it out. From the page [URL] I want to download the 48 linked files and their corresponding information pages. To do this by hand (for the first file) I click on the line that says "Applications (5)", go to the first option, "Dell - Application", open and copy the linked page ("Applies to: Driver Reset Tool"), then back on the first page I click on the Download button. In the window that opens up I choose to save the file.

Then I move on to the next option (which is Sonic Solutions - Applications) and repeat this until I would have all my files. I do not want to download the many other links on this page. Just the above mentioned, so I can take it back to my internet-less place and refer to it as if I was on the net. I am using the 9.10 LiveCD at my friends place.
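A hedged starting point (the exact flags depend on how the page links its files, so treat this as a sketch rather than a recipe):

Code:
wget --recursive --level=2 --convert-links --page-requisites --no-parent [URL]

--level=2 follows the download links and their information pages without pulling in the whole site, --convert-links rewrites the links so they work offline, and --no-parent stops wget from wandering up the site. Adding --accept with the extensions you actually want (for example --accept exe,html) narrows it down further.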

View 2 Replies View Related

Server :: Wget Webpage Secured By Username / Password

Feb 18, 2011

I wish to download a webpage, which is secured by a username and password, using WGET. The thing is there are many forms on that page and I don't know how to tell WGET which one it should send the parameters to (by the POST method). This is as far as I have got:
wget --post-data="predmet=xxx" --http-user="yyy" --http-password="zzz" [URL]
It gets through the authentication but it will not submit the form.
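If the login is form-based rather than HTTP authentication, one hedged two-step approach is to POST the login form first and save the session cookie, then request the protected page with that cookie. The field names and URLs below are placeholders:

Code:
wget --save-cookies cookies.txt --keep-session-cookies --post-data="username=yyy&password=zzz" [login form action URL]
wget --load-cookies cookies.txt --post-data="predmet=xxx" [target form action URL]

In both cases the request should go to the URL named in the action attribute of the specific form, not to the page that merely displays the forms.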

View 3 Replies View Related

General :: Wget Webpage Using Post Method With Multiple Forms

Feb 19, 2011

I would like to download a webpage using WGET which needs a form submission (POST method) in order to appear. I can do that with this command.

wget --post-data="computer=hosts&style=list" http://www.example.com

The problem is there is more than one form on the requested page and I don't know how to tell WGET which one it should POST the data to.
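wget has no notion of picking a form; it simply POSTs the data to whatever URL it is given. A hedged workaround is to read the page source, find the action attribute of the form you want, and POST directly to that address (the URL and fields below are placeholders):

Code:
wget --post-data="computer=hosts&style=list" http://www.example.com/path/from/form/action

If that form also contains hidden input fields, their name=value pairs usually need to be appended to --post-data as well.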

View 3 Replies View Related

Programming :: Java Code To Implement "wget" To Save A Webpage

Feb 24, 2010

This is the code I used; there is no error in execution but no file is being saved in the working directory. I'm new to Java, so I've just started learning.

public class Hel
{
public static void main(String args[])
throws IOException

[code]....

View 5 Replies View Related

Ubuntu Networking :: Update Manager "Hits" Files?

Aug 1, 2010

When I check for updates in Update Manager, instead of downloading the files, it goes through the files very quickly and says "Hit" for every file. If I try to update this information manually by running sudo apt-get update, I get this:

Code:
thomas@THOMAS-PC:~$ !!
sudo apt-get update

[code]....

View 4 Replies View Related

Software :: Resume An Interrupted Wget Using Wget.log?

Jun 19, 2011

If a wget download is interrupted (for example if I have to shut down prematurely), I get a wget.log with the partial download. How can I later resume the download using the data in wget.log? I have searched high and low (including the wget manual) and cannot find how to do this. Is it so obvious that I did not see it? Passing wget.log as the argument to the -c option does not work. What I currently do is open wget.log, copy the URL, paste it into the command line and do another wget. This works, but the download is started from the beginning, which means nothing in wget.log is used.
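A hedged sketch: -c does not take an argument; it resumes whatever partial file is already sitting in the directory, so the only missing piece is recovering the URL from the log instead of retyping it. Assuming wget.log still contains the original URL:

Code:
url=$(grep -oE 'https?://[^ ]+' wget.log | head -n 1)
wget -c "$url"

This resumes only if the partially downloaded file (index.html, the .iso, and so on) is still present where wget is run; otherwise the transfer starts over from the beginning.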

View 2 Replies View Related

Networking :: Sometimes Not All Of A Webpage Loads?

Apr 24, 2011

I suspect the network card but I could be wrong.

Sometimes it doesn't load all of a webpage. In Firefox this results in only some of the page being displayed. In konqueror I get the "Connection to host www.page.invalid is broken" error message. The computer was doing this when running Opensuse 11.3 and continues to do it when running a clean install of Opensuse 11.4. I haven't tried a live CD of Ubuntu or something but that might be worth a go.

View 5 Replies View Related

Ubuntu Multimedia :: When Will VLC 1.1.10 Hit Natty

Jun 8, 2011

When will VLC 1.1.10 hit Natty? This is a minor security and bug-fix update, so it should arrive soon officially, shouldn't it?

View 5 Replies View Related

Ubuntu Networking :: Aireplay-ng - How To Generate ARP

May 20, 2011

How do I generate ARPs? I started aireplay-ng in ARP request replay mode using the command:

And I got no ARPs. The screen looks like this:
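ARP request replay only works if there is ARP traffic to replay, so a hedged sequence to try when -3 captures nothing is to associate with the AP first and, if necessary, briefly deauthenticate a connected client so that it re-ARPs. The interface name and MAC addresses below are placeholders:

Code:
aireplay-ng -1 0 -a <AP_BSSID> -h <YOUR_MAC> wlan0
aireplay-ng -0 1 -a <AP_BSSID> -c <CLIENT_MAC> wlan0
aireplay-ng -3 -b <AP_BSSID> -h <YOUR_MAC> wlan0

With no associated clients at all, there may simply be no ARP requests on the network to capture.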

View 1 Replies View Related

Ubuntu Networking :: 11.04 - DNS Errors / Webpage Not Available

May 12, 2011

I have recently upgraded to the new Ubuntu 11.04 but am having problems with the internet. Often when I try to access a website it takes a long time (5-10 minutes) to load, even Google, and sometimes I just get the following error message:
This web page is not available. The server at google.com can't be found because the DNS lookup failed. DNS is the web service that translates a website's name to its Internet address. This error is most often caused by having no connection to the Internet or a misconfigured network. It can also be caused by an unresponsive DNS server or a firewall preventing Chromium from accessing the network.

Here are some suggestions:
- Reload this web page later.
- Check your Internet connection. Reboot any routers, modems or other network devices that you may be using.
- Check your DNS settings. Contact your network administrator if you're not sure what this means.
- Try disabling DNS prefetching by following these steps: Go to Spanner menu > Preferences > Under the Bonnet and deselect "Use DNS pre-fetching to improve page load performance".
- Try adding Chromium as a permitted programme in your firewall or antivirus software's settings. - If it is already a permitted programme, try deleting it from the list of permitted programmes and adding it again.
- If you use a proxy server, check your proxy settings or check with your network administrator to make sure that the proxy server is working.
- If you don't believe that you should be using a proxy server, try the following steps: Go to Spanner menu > Options > Under the Hood > Change proxy settings and make sure that your configuration is set to "no proxy" or "direct".
Error 105 (net::ERR_NAME_NOT_RESOLVED): Unable to resolve the server's DNS address.

I have tried to access the internet using Chrome, Firefox and Konqueror, on the Unity desktop and the Classic desktop. I've checked my proxy settings in Chrome (my main browser) and it's set to a direct internet connection. I've disabled UFW. I've edited resolv.conf, which initially showed:

Code:
# Generated by NetworkManager
domain HG532.com
search HG532.com
nameserver 10.0.0.1
and edited it to:

Code:
# Generated by NetworkManager
domain 10.0.0.1
search 10.0.0.1
nameserver 10.0.0.1
and even tried the Google DNS server:

Code:
# Generated by NetworkManager
domain 8.8.8.8
search 8.8.8.8
nameserver 10.0.0.1

But nothing works. It seems like when I initially connect to the internet it runs fine, but over the course of 10 minutes page loading gets slower and slower, and eventually I just get the above error message. There are other computers here running Windows which connect fine through the same wireless.
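For what it's worth, the domain and search lines in resolv.conf expect a DNS suffix, not an IP address, so the edits above would not change which server answers lookups; only the nameserver lines do. A hedged version of the Google DNS test would be:

Code:
# Generated by NetworkManager
nameserver 8.8.8.8
nameserver 8.8.4.4

Note that NetworkManager rewrites this file, so the change may not survive a reconnect; the persistent place for alternative DNS servers on 11.04 is the connection's IPv4 settings in NetworkManager.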

View 5 Replies View Related

Ubuntu Networking :: Generate Syn Flood Attack In Pc?

Aug 3, 2011

I want to test a SYN flood attack on my PC, but I don't know how to generate one. Can you tell me how to generate a SYN flood attack against a PC?
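A hedged example using hping3 (the target address is a placeholder, and this should only ever be pointed at machines you own):

Code:
sudo apt-get install hping3
sudo hping3 -S -p 80 --flood <target_ip>

-S sets the SYN flag, -p 80 picks the destination port, and --flood sends packets as fast as possible without waiting for replies.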

View 2 Replies View Related

Networking :: Getting Error When I Try To Generate A Scenario / Sort It?

Nov 27, 2010

I installed BonnMotion 1.4, but when I try to generate a scenario I get the error message:

bm: command not found.

If I replace bm with ~/Bureau/bonnmotion-1.4/bin/bm, I get this error message:
Exception in thread "main" java.lang.NoClassDefFoundError: edu/bonn/cs/iv/bonnmotion/run/BM
Caused by: java.lang.ClassNotFoundException: edu.bonn.cs.iv.bonnmotion.run.BM
at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
Any idea how to resolve this?
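The NoClassDefFoundError means the JVM cannot find the BonnMotion classes on its classpath when bm is invoked from another directory. A hedged workaround (the classes/ directory name is an assumption about how the 1.4 tarball is laid out; point -cp at wherever the compiled .class files actually live) is to call Java directly:

Code:
cd ~/Bureau/bonnmotion-1.4
java -cp classes edu.bonn.cs.iv.bonnmotion.run.BM <scenario arguments>

Re-running the package's install script from inside its own directory, so that bin/bm is regenerated with a correct classpath, may also fix the plain bm invocation.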

View 1 Replies View Related

Fedora Networking :: FC9 DNS - Cannot Yum Or WGet But Can Ping And Dig

Jan 13, 2009

For some reason some command line commands are unable to resolve URLs, whereas other commands work as they should. I have checked most settings but am unable to find out what is wrong and am no closer to figuring out what and why.

[root@subzero ~]# yum update
Loaded plugins: refresh-packagekit
[URL]: [Errno 4] IOError: <urlopen error (-2, 'Name or service not known')>
Trying other mirror.
Error: Cannot retrieve repository metadata (repomd.xml) for repository: atrpms. Please verify its path and try again
[root@subzero ~]# .....
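Since ping and dig already resolve names, the libc resolver itself looks fine, which makes a proxy setting that only wget and yum honour the usual suspect. A couple of hedged, read-only checks (the mirror hostname is just an example):

Code:
env | grep -i proxy
grep ^proxy /etc/yum.conf
python -c "import socket; print socket.gethostbyname('mirrors.fedoraproject.org')"

A stale http_proxy variable, or a proxy= line in yum.conf pointing at a host that no longer resolves, produces exactly this split between ping/dig and yum/wget.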

View 11 Replies View Related

Networking :: Wget Can Access / Browser Cannot

Jun 27, 2010

I have been having a problem on my Ubuntu desktop with the wireless connection. I am now running Ubuntu 10.04, but the problem showed up immediately after upgrading to Ubuntu 9.10. This machine is used as a CUPS server so my wife can print from her laptop and get to a printer downstairs. Intermittently, I will be unable to access the CUPS server web pages (or any other web pages on the local Apache server) from a remote machine on the internal network. I also cannot connect in via SSH. However, from the wireless desktop itself the web pages are still accessible and a local browser can also access remote web sites just fine. So, the network connection is still up.

To try to determine how often this was happening, I wrote a simple Bash script that checked if a page could be accessed on the web server on the wireless machine. I used wget to access a page and then log the results to a file, running the script from a crontab entry. It turns out that even though I cannot access a web page using a remote browser, I can access the same web pages using wget from a remote machine. This has me a little confused. What could be causing this situation? I do not have a firewall running on the desktop with the wireless connection. After a while, the blockage of inbound web pages from a remote browser is "fixed" and I can again access the CUPS (and other) pages.
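For reference, a minimal sketch of the kind of cron-driven check described above (the URL and log path are illustrative):

Code:
#!/bin/bash
URL="http://wireless-desktop.local/index.html"
LOG="$HOME/webcheck.log"
wget -q -O /dev/null --timeout=10 "$URL"
rc=$?
if [ $rc -eq 0 ]; then
    echo "$(date '+%F %T') OK" >> "$LOG"
else
    echo "$(date '+%F %T') FAIL (wget exit $rc)" >> "$LOG"
fi

Worth noting for the comparison: wget and a browser send different headers and handle keep-alive connections differently, so a flaky wireless driver or a stateful device in between can plausibly let one through and not the other.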

View 3 Replies View Related

Ubuntu Networking :: Setup A Simple Webpage Using No-ip?

May 11, 2011

My plan is to set up a simple webpage using No-IP. I already made an account and downloaded and configured the client. Now I want to make the actual website (hosted on my local machine). How would I do that? I assume I would need web server software.
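No-IP only maps a hostname to your public IP address; the page itself has to be served by a web server running on the local machine. A hedged minimal setup on Ubuntu (the paths are the Apache defaults of that era):

Code:
sudo apt-get install apache2
sudo sh -c 'echo "<h1>It works from home</h1>" > /var/www/index.html'

After that, forward port 80 on the router to this machine and the page should answer at the No-IP hostname.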

View 2 Replies View Related

Networking :: Taking Passwords From The User Through Webpage?

Feb 24, 2010

I am doing Samba file sharing. I got stuck on a problem related to reading passwords. I have to read whatever password the user enters through the web interface and process it, and with that username and password the user must be able to log on from a Windows system. I have to write an appropriate shell script for this. This is all for sharing files through Samba.
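A very rough CGI sketch of the "read the credentials from the web form and hand them to Samba" step, assuming Apache with CGI enabled; the field names user and pass are placeholders, and in practice this needs HTTPS plus sudo rules, since smbpasswd requires privilege:

Code:
#!/bin/bash
echo "Content-type: text/plain"
echo
read -r POST_DATA                     # the CGI POST body arrives on stdin
user=$(echo "$POST_DATA" | sed -n 's/.*user=\([^&]*\).*/\1/p')
pass=$(echo "$POST_DATA" | sed -n 's/.*pass=\([^&]*\).*/\1/p')
# smbpasswd -s reads the password twice from stdin; -a adds the user to the Samba database
printf '%s\n%s\n' "$pass" "$pass" | smbpasswd -s -a "$user"
echo "Samba password set for $user"

URL-decoding of the form values is left out here, so this only behaves for simple alphanumeric passwords.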

View 2 Replies View Related

Networking :: Launch A Webpage With Vsftpd Ftp Server?

Apr 14, 2011

I'm kinda new to all of this, so I believe there's a simple solution for what I need; I just don't know how it's done.

So I've configured a vsftpd FTP server and enabled the anonymous user, so when I do

Code:
ftp://my_location.com

a directory-like view is displayed in my web browser, corresponding to /srv/ftp, where the files for the anonymous account are kept. Now I've made a little web page and copied it there; it can be opened by requesting main.html, so the address to access the web page would be

Code:
ftp://my_location.com/main.html

My problem is that I don't want anybody to be able to see the directory-like listing, so I'm wondering: when someone goes to ftp://my_location.com, can this be redirected to ftp://my_location.com/main.html instead, so that the listing of files and directories can never be seen from ftp://my_location.com?
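FTP has no concept of a default document or a redirect, so this cannot be done inside vsftpd itself. A hedged alternative is to serve the same directory over HTTP and let the web server pick main.html and hide listings; an Apache 2.2-style sketch (directive spelling differs slightly on 2.4):

Code:
<VirtualHost *:80>
    DocumentRoot /srv/ftp
    DirectoryIndex main.html
    <Directory /srv/ftp>
        Options -Indexes
        Order allow,deny
        Allow from all
    </Directory>
</VirtualHost>

Visitors then use http://my_location.com instead of ftp://my_location.com, and the directory listing is never exposed.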

View 7 Replies View Related

General :: Script To Tar Up Files When A Partition Hits A Certain Use Percentage

Apr 19, 2011

I need to create a script that queries how large a partition is and, when it hits a certain use percentage (say 90%), executes another script that tars up certain files (or they could just be part of the same script). I would create a cronjob that runs this script once a day.

I have the script that tars up the files I need, sets permissions, etc. (btw, the files in question are audit logs). I just need the part that runs something like a df -h, takes the use percentage of the /var partition from that output and, if that percentage is greater than or equal to 90%, kicks off the tar script.

Here is a snippet of the df -h output with just the /var partition shown:

Quote:

So, when the cronjob sees that the Use% is >= 90%, it would kick off the tar script...if not above 90%, it closes.
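A hedged sketch of the glue (the tar script's path below is a placeholder for the one already written):

Code:
#!/bin/bash
THRESHOLD=90
USAGE=$(df -P /var | awk 'NR==2 {sub(/%/, "", $5); print $5}')
if [ "$USAGE" -ge "$THRESHOLD" ]; then
    /usr/local/bin/tar_audit_logs.sh
fi

df -P keeps each filesystem on one line even when the device name is long; the awk picks the data row (NR==2), strips the % from the Use% column and leaves a plain number for the comparison. A crontab entry such as 0 3 * * * /usr/local/bin/check_var.sh runs it once a day.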

View 14 Replies View Related

General :: Grep: Return A User Message When No Hits?

Jun 13, 2011

I have ASCII files to parse that are 48 hours old or more; I can identify them like so:

Code:
find . -name "FILE*TXT" -mtime +1 -exec ls -ltas '{}' ';'

Some files have a list of hardware errors (we test electronic components), some have none. If the file has no errors, I still want to display a message, like so:

Code:
grep ^err R*VER && echo "No error"
FILEA.TXT:err->USB3910err
FILED.TXT:err No Error

This grep statement works, but it seemingly overrides the find statement above if I run both at the same time... How can I combine the two statements to create a report that lists the filename and error(s) like so:

Code:
FILEA.TXT Button3320err
FILEB.TXT USB3235err
FILEC.TXT IR Remote2436err
FILED.TXT No error

Is it possible to print "No error" along with the file name for the files that have no errors?
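A hedged way to combine the two (same find criteria as above; output formatting is approximate):

Code:
find . -name "FILE*TXT" -mtime +1 | while read -r f; do
    if grep -q '^err' "$f"; then
        grep '^err' "$f" | sed "s|^|$f |"
    else
        echo "$f No error"
    fi
done

Each file found is tested individually: files with errors print one line per error prefixed with the filename, and files with no ^err lines fall through to the No error message.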

View 11 Replies View Related

Programming :: Script For Searching Through Messages For Multiple FTP Hits

Oct 25, 2010

My script looks really messy, the logic isn't great and I'm not hugely happy with it. Also, it echoes $i instead of an actual IP address (line 10). How can I improve this? It basically searches through /var/log/messages for multiple FTP hits and, when the hit count is higher than a specific number, the IP is added to a config file and FTP is restarted. There are some obvious flaws in my script.

Code:
MAXHITS=0
TOPHITS=`grep "FTP session closed." /var/log/messages* | awk '{print $7}' | sed -e 's/^.*\[//' -e 's/\].*$//' | uniq -c | sort -nr`
HITNUMB=`echo $TOPHITS | awk '{print $1}'`
IPHIT=`echo $TOPHITS | awk '{print$2}'`
HIGHIP=`echo $TOPHITS | grep $HITNUMB | grep $IPHIT | awk '{print $2}'`

if [ $HITNUMB -gt $MAXHITS ]; then
for i in $HIGHIP;
do
echo $i
sed -i '78s/$/,$i/' /opt/etc/proftpd.conf
/root/ftp restart
done
else
echo "not greater than $MAXHITS"
fi

I'm not even sure what will happen if I get multiple responses in my $TOPHITS. It would be cool if it could check for IPs that are already blacklisted somehow; it might actually be easier to just create a file with a set of blacklisted IPs or something.
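Two hedged tweaks that address the $i problem and the blacklist idea (line number 78 and the proftpd.conf path are kept from the original script; the blacklist file name is made up):

Code:
for i in $HIGHIP; do
    # skip IPs that have already been recorded
    grep -qxF "$i" /opt/etc/ftp_blacklist 2>/dev/null && continue
    echo "$i" >> /opt/etc/ftp_blacklist
    # double quotes let $i expand; inside single quotes sed receives the literal text $i
    sed -i "78s/\$/,$i/" /opt/etc/proftpd.conf
done
/root/ftp restart

Restarting FTP once after the loop, rather than inside it, also avoids bouncing the server several times when more than one IP is over the limit.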

View 14 Replies View Related

Programming :: Script To Search And Count Hits From Some Country

Mar 25, 2011

I need a script that will search some logs and extract IP hits from one country only, let's say the UK. I guess I need to use GeoIP or some similar database. I just need a very simple Bash, Perl or PHP script that will do this job: search through the (Apache) logs and then give me the number of hits found from the UK.
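A hedged Bash sketch using the geoiplookup utility from the GeoIP package (the log path is illustrative; geoiplookup prints lines like "GeoIP Country Edition: GB, United Kingdom"):

Code:
#!/bin/bash
LOG=/var/log/apache2/access.log
total=0
for ip in $(awk '{print $1}' "$LOG" | sort -u); do
    if geoiplookup "$ip" | grep -q ': GB,'; then
        total=$((total + $(grep -c "^$ip " "$LOG")))
    fi
done
echo "Hits from the UK: $total"

Looking up each unique IP only once and then counting its lines keeps the number of GeoIP lookups manageable on large logs.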

View 5 Replies View Related

Slackware :: Lazy Kernel Compiler Hits Snag?

Feb 22, 2010

I haven't recompiled my kernel in a while, but whenever I did it, it was all pretty easy: make menuconfig (adjust), make && make modules_install, then copy bzImage from */arch/* and System.map to /boot and stick a new entry into grub's menu.lst. However, this time I must be missing something, because I get a kernel panic when booting the new kernel. Is there a step I'm missing? Certainly, I was looking in /etc/rc.d and there is an rc.modules<kernelver> script in there. I wondered if I need to make a new one... although when I looked it over, it seems only to be required when forcing particular modules.
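One step that is easy to forget on Slackware when the root filesystem driver is built as a module is the initrd. A hedged example (the kernel version, filesystem and root device are placeholders; the flags are the ones documented in Slackware's mkinitrd README):

Code:
mkinitrd -c -k 2.6.33.4 -m ext4 -f ext4 -r /dev/sda1

and then add an initrd /boot/initrd.gz line to the new menu.lst entry. Building the root filesystem driver directly into the kernel instead also avoids that particular panic.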

View 14 Replies View Related

Networking :: How Can An Agent Itself Generate A Trap When Some Particular Situation Arises?

Oct 15, 2010

How can an agent itself generate a trap when some particular situation arises?

View 1 Replies View Related

Fedora Networking :: Ping But Can't Wget Etc - Complains Of DNS?

Aug 6, 2009

I have a pretty strange problem. I can ping www.yahoo.com:

Code:
[root@localhost ~]# ping www.yahoo.com
PING www-real.wa1.b.yahoo.com (69.147.76.15) 56(84) bytes of data.
64 bytes from f1.www.vip.re1.yahoo.com (69.147.76.15): icmp_seq=1 ttl=52 time=20.1 ms
64 bytes from f1.www.vip.re1.yahoo.com (69.147.76.15): icmp_seq=2 ttl=52 time=20.7 ms
64 bytes from f1.www.vip.re1.yahoo.com (69.147.76.15): icmp_seq=3 ttl=52 time=23.3 ms

[Code]...

View 7 Replies View Related

Ubuntu Networking :: Crontab And Wget With Terminal?

Sep 13, 2010

I used crontab to start wget and download the files with the following entry:

Quote:

14 02 * * * wget -c --directory-prefix=/home/Downloads/wget --input-file=/home/Downloads/wget/download.txt

But it doesn't show a terminal, so I am not able to see the current status or stop wget. So how can I start wget with a terminal using crontab?
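cron jobs run without a terminal, so a hedged alternative is to have wget write its own log file and then follow that log (or attach to a screen session) from any terminal. The paths are the ones from the crontab entry above:

Code:
14 02 * * * wget -c --directory-prefix=/home/Downloads/wget --input-file=/home/Downloads/wget/download.txt -o /home/Downloads/wget/wget.log

Then, from a terminal, tail -f /home/Downloads/wget/wget.log shows the current status, and pkill wget stops the transfer. Alternatively, starting the job under screen -dmS wgetjob wget ... allows reattaching later with screen -r wgetjob.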

View 1 Replies View Related

Networking :: Curl And Wget Error 400 Bad Request?

Nov 9, 2010

I use Slackware current, and curl and wget give the following errors:

Code:
repo@cannabis ~]$ wget -r http://users.telenet.be/reggersjans
--2010-11-09 13:48:14-- http://users.telenet.be/reggersjans
Resolving users.telenet.be (users.telenet.be)... ::ffff:74.117.221.11, 74.117.221.11
Connecting to users.telenet.be (users.telenet.be)|::ffff:74.117.221.11|:80... connected.
HTTP request sent, awaiting response... 400 Bad Request

[Code]...
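A hedged first step is to look at the exact request each client sends, which usually shows what the server is objecting to (both options are standard):

Code:
wget -d http://users.telenet.be/reggersjans 2>&1 | less
curl -v http://users.telenet.be/reggersjans

Comparing that output against the same request from a machine where it works narrows down whether the difference is a header, a proxy, or the IPv6-mapped address visible in the log above.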

View 7 Replies View Related

Networking :: Grabbing Wiki Code Using WGet

Aug 2, 2010

I would like to grab wiki code from a wiki page using wget. Running this grabs HTML:
wget -O wikihtml.html [URL]
The first attempt at getting wiki code was to pretend to edit, and run:
wget -O wikiedit.html [URL]
but of course that grabs GUI HTML. I thought perhaps the text inside the text box would be intact, but HTML is throughout. How do I get just the raw wiki code?
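If the wiki is MediaWiki, the raw wikitext can usually be requested directly with action=raw; the hostname and page title below are placeholders:

Code:
wget -O wikitext.txt "http://example.org/index.php?title=Some_Page&action=raw"

Other wiki engines generally have an equivalent raw or export URL, but the parameter name differs.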

View 2 Replies View Related

Networking :: Yum Install / Update And WGet Do Not Work

May 6, 2010

I have a CentOS 5 server running as a web server. The web services are okay. Ping and ssh work fine both ways. But when I try to wget, or yum update or install, I get a timeout. The URLs are resolving properly. And there's no difference whether iptables is turned off or on.
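A few hedged, read-only checks that separate a proxy problem from a routing or firewall problem (the mirror URL is just an example):

Code:
env | grep -i proxy
grep ^proxy /etc/yum.conf
curl -v http://mirror.centos.org/ 2>&1 | head -20
ip route show

Since names resolve and ping/ssh pass, a stale proxy setting that only wget and yum honour, or an upstream firewall that silently drops port 80/443, are the usual suspects; the curl -v output shows exactly where the HTTP connection stalls.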

View 2 Replies View Related






