Networking :: Wget Can Access / Browser Cannot

Jun 27, 2010

I have been having a problem on my Ubuntu desktop with the wireless connection. I am now running Ubuntu 10.04, but the problem showed up immediately after upgrading to Ubuntu 9.10. This machine is used as a CUPS server so my wife can print from her laptop and get to a printer downstairs. Intermittently, I will be unable to access the CUPS server web pages (or any other web pages on the local Apache server) from a remote machine on the internal network. I also cannot connect in via SSH. However, from the wireless desktop itself the web pages are still accessible and a local browser can also access remote web sites just fine. So, the network connection is still up.

To try to determine how often this was happening, I wrote a simple Bash script that checks whether a page on the web server on the wireless machine can be accessed. I used wget to fetch a page and log the result to a file, running the script from a crontab entry. It turns out that even though I cannot access a web page using a remote browser, I can access the same web pages using wget from a remote machine. This has me a little confused. What could be causing this situation? I do not have a firewall running on the desktop with the wireless connection. After a while, the blockage of inbound web pages from a remote browser is "fixed" and I can again access the CUPS (and other) pages.
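
For reference, a minimal sketch of such a check script; the URL and log path below are placeholders rather than the real setup:

Code:
#!/bin/bash
# Fetch one page from the wireless desktop and log whether it worked.
URL="http://192.168.1.10:631/printers/"   # placeholder address of the CUPS/Apache box
LOG="$HOME/cups-check.log"

wget -q --timeout=10 --tries=1 -O /dev/null "$URL"
status=$?
if [ $status -eq 0 ]; then
    echo "$(date '+%F %T') OK   $URL" >> "$LOG"
else
    echo "$(date '+%F %T') FAIL $URL (wget exit $status)" >> "$LOG"
fi
# crontab entry to run it every five minutes:
# */5 * * * * /home/me/bin/check-cups.sh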

View 3 Replies


General :: Download Files Via Wget In Browser?

May 26, 2011

I had set two 700 MB links downloading in Firefox 3.6.3, using the browser itself, and both of them hung at 84%. I trust wget much more. The problem is this: when I click the download button in Firefox it asks to save the file, and only once the download has begun can I right-click the entry in the Downloads window and select "Copy Download Link" to discover that the link was Kum.DvDRip.avi. If I had known that earlier, as with a hotfile server where there is no script behind the download button and it points straight to the .avi URL, I could have copied it easily. I have read about 'wget --load-cookies cookies_file -i URL -o log', but I have a free account (NOT premium) on the sharing server, so all I get is an HTML page.
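
Roughly what that wget invocation looks like in practice, assuming the session cookies have first been exported from the logged-in browser; the file names and URL below are placeholders:

Code:
# Export the logged-in session cookies from Firefox (e.g. with a cookie-export
# add-on) to cookies.txt, then hand wget the direct file URL:
wget --load-cookies cookies.txt \
     --continue \
     --output-file=wget.log \
     "http://sharehost.example.com/path/to/file.avi"   # placeholder URL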

View 4 Replies View Related

Ubuntu Networking :: Cannot Access Certain Ports Via Web Browser, Outgoing Port Blocked

Aug 12, 2011

I have a vps server running certain services which can be accessed via a web browser (e.g webmin control panel), but I have recently been unable to access these services from my home machine using Firefox 5.0, running ubuntu 11.04.

Example:

I can access the server on port 80 fine, eg: [URL]

However I cannot access my webmin control panel on: [URL]

The pages take ages to load and then time out. Same with transmission-daemon on: [URL]

Everything is set up fine on my server, the ports are open in firewall etc. and I can access these pages fine from my work computer.

This has only started happening in the last day or two; it had been working fine up till then. I have not messed around at all with the firewall on my home machine, and I have tried other browsers besides Firefox with the same result.
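
One quick way to tell whether the outgoing ports are being filtered between home and the VPS, rather than something on the server, is to test the raw TCP connections from both locations; the hostname below is a placeholder and the port numbers are just the usual webmin and transmission-daemon defaults:

Code:
# Run from home and again from work; if port 80 connects everywhere but
# 10000/9091 only connect from work, something on the home side (or the ISP)
# is filtering those ports.
nc -zv vps.example.com 80
nc -zv vps.example.com 10000   # webmin default port (assumption)
nc -zv vps.example.com 9091    # transmission-daemon web UI default (assumption)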

View 4 Replies View Related

General :: Use Wget To Access A RESTful Interface?

Apr 12, 2010

I am trying to use wget to access a RESTful interface, but I cannot figure out how to do an HTTP PUT with wget. How can I do it? Or isn't it possible?
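
wget 1.12 and earlier can only send GET, HEAD and (via --post-data/--post-file) POST, so two alternatives are sketched below: curl can issue a PUT directly, and later wget releases (1.15 and newer) gained a --method option. The URL and JSON body are placeholders:

Code:
# curl: send an HTTP PUT with a request body
curl -X PUT -H "Content-Type: application/json" \
     -d '{"name": "example"}' \
     "http://api.example.com/resource/42"

# newer wget (>= 1.15) can do the same with --method/--body-data
wget --method=PUT \
     --header="Content-Type: application/json" \
     --body-data='{"name": "example"}' \
     -O - "http://api.example.com/resource/42"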

View 2 Replies View Related

General :: Wget To Access Web Resource But Not Download It?

Jul 16, 2011

Is there a way for wget not to download a file but rather just access it? I use it to access a URL that triggers a process on a web server, but the actual HTML file at that location doesn't need to be downloaded and saved. I couldn't find anything in wget's help to show if there's a way to do this. Could anyone suggest a way of doing this?
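
A few wget options fit this, sketched below; --spider requests the page without saving anything (it may use a HEAD request rather than GET, which matters if the trigger script only reacts to GET), while -O /dev/null and --delete-after perform a normal GET and discard the result. The URL is a placeholder:

Code:
# Request the page without saving it (may send HEAD instead of GET)
wget --spider "http://example.com/trigger.php"

# Normal GET, body thrown away
wget -q -O /dev/null "http://example.com/trigger.php"

# Normal GET, file saved and then deleted immediately
wget -q --delete-after "http://example.com/trigger.php"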

View 2 Replies View Related

Ubuntu :: Wget To Access Credit Card Online?

Apr 8, 2011

I am interested in making wget do a slightly different job for me. I have downloaded it, built it (1.12), and it works perfectly right out of the box. I would like to have it log in to my creditcards.citi.com HTTPS site, supply my user ID and password, "select NEXT-SCREEN label=Account Activity", and then capture the account activity that comes back.

I got these three values from my Firefox Selenium script, which runs perfectly time after time. My big-picture goal is to be able to dump my account activity from a crontab entry every night at midnight. I am not married to this idea if anyone has a better or different route.
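
In principle wget can do a form login by POSTing the credentials and carrying the session cookie forward, roughly as sketched below; the form field names and URLs are pure placeholders, and a real banking site will very likely involve JavaScript and redirects that wget cannot follow, which is why the Selenium route may remain the more practical one:

Code:
# Hypothetical sketch only -- field names and URLs are made up.
# Step 1: log in and save the session cookies.
wget --keep-session-cookies --save-cookies=cookies.txt \
     --post-data='userid=MYUSER&password=MYPASS' \
     -O /dev/null "https://creditcards.citi.com/login-placeholder"

# Step 2: fetch the account-activity page using those cookies.
wget --load-cookies=cookies.txt \
     -O "activity-$(date +%F).html" \
     "https://creditcards.citi.com/account-activity-placeholder"

# crontab entry for a nightly dump at midnight:
# 0 0 * * * /home/me/bin/fetch-activity.sh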

View 1 Replies View Related

Software :: Resume An Interrupted Wget Using Wget.log?

Jun 19, 2011

If a wget download is interrupted (like if I have to shut down prematurely), I get a wget.log from the partial download. How can I later resume the download using the data in wget.log? I have searched high and low (including the wget manual) and cannot find how to do this. Is it so obvious that I did not see it? The wget -c option with wget.log as the argument to -c does not work. What I do instead is open wget.log, copy the URL, paste it into the command line, and run another wget. This works, but the download starts from the beginning, which means nothing in wget.log is used.
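
For what it is worth, -c takes no argument: it tells wget to continue a partially downloaded file found in the current directory, so the usual pattern is to rerun wget -c with the original URL (fished out of wget.log, as described above) from the same directory as the partial file. A rough sketch, assuming the URL appears intact in the log:

Code:
# Pull the URL back out of the log and resume the partial file
# sitting in the current directory.
url=$(grep -oE 'https?://[^ ]+' wget.log | head -n 1)
wget -c "$url"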

View 2 Replies View Related

Networking :: Browser "waiting For ..." Stalling / Browser Seems To Stall For A Long Time?

Feb 7, 2011

I'm surfing along just fine when I select some link in my browser. The status line reports "... waiting for ..." and the browser seems to stall for a long time (minutes). Often, but not every time, I can select the STOP toolbar icon and reload the page, and this time things work normally.

I have a similar but different malfunction running Evolution email connected to my hosted IMAP server. I select a message, see "... formatting ...", and then Evolution seems to stall for a long time (minutes). Rarely do the stop-reload actions help with reading email; instead, all Evolution windows go dark grey and the entire desktop stalls. [Analysis -- It has the feel of a "network" issue provoking issues within Evolution itself rather than an Evolution-only malfunction.]

Watching the running system with 'htop' and similar, I do not see where some other application or service is sucking all of the available CPU time or such. [Analysis -- It has the feel that either the browser message or the email message were sent into limbo without reaching their intended destination. Stop-reload sends the bits to the right places and so I get results.]

View 7 Replies View Related

OpenSUSE :: 11.2 - Cannot Access MySQL Via Browser

Apr 21, 2010

Since I installed SUSE 11.2 I have been unable to access MySQL via any browser, although I can via MySQL Administrator. Firefox keeps asking if I want to open index.php with KWrite; it also occasionally asks this when I am on other sites. Konqueror gives an immediate timeout error.
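
The prompt to open index.php with KWrite usually means Apache is serving the PHP file as plain text instead of executing it, so one hedged thing to check is whether the PHP module is installed and loaded; the commands below assume openSUSE's stock apache2 packaging:

Code:
# Is a PHP module loaded into Apache?
apache2ctl -M | grep -i php

# On openSUSE, enabling it typically looks like this (assumption):
a2enmod php5
rcapache2 restart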

View 2 Replies View Related

General :: Access Server Via Browser

May 10, 2010

I have a Linux server with many files on it. I want to let users upload and download those files through a browser (on Windows), so that, as with the command below, a user can access the path on the server "linuxserver".

View 5 Replies View Related

Server :: Subversion Web Access Will Not Allow To Log In With Web Browser

Sep 7, 2009

I'm running Fedora 10 with Apache and I installed mod_dav_svn so that I could set up a secured remote code repository. It appeared to be working OK until I turned on SSL and basic authentication. I even verified in my subversion.conf httpd configuration file that it points to the correct password file. When I attempt to access it from my web browser, it prompts me for a user name and password, but it will not let me log in. I tried disabling SSL, thinking that might have been the problem, but I get the same result either way. Can anyone help me resolve this problem?
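
A couple of hedged checks: confirm that the AuthUserFile path in subversion.conf really is the file being edited, recreate the user's entry to rule out a bad hash, and watch the Apache error log while retrying; the password file path below is a placeholder:

Code:
# Which password file does Apache think it should use?
grep -i AuthUserFile /etc/httpd/conf.d/subversion.conf

# Recreate the user's entry in that file (path is a placeholder):
htpasswd -m /etc/svn-auth-users myuser

# On Fedora, also check the file is readable and its SELinux context:
ls -Z /etc/svn-auth-users

# Reload Apache and watch the error log while retrying the login:
service httpd reload
tail -f /var/log/httpd/error_log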

View 4 Replies View Related

CentOS 5 :: Access Browser From Windows?

Dec 21, 2009

I successfully installed R12 on Linux and it is working fine on the server itself. I want to access it from different computers; all machines are on the same network.

[Code]...

View 1 Replies View Related

Fedora :: Can Update / But No Internet Access Through Browser

Dec 8, 2010

I just installed Fedora on a customer's laptop and am trying to get everything working. I hooked up a Linksys wireless PCMCIA card, got connected to my wireless network, and went through the first 250+ updates; all of that went fine, so I know I am getting internet access. Every time I open Firefox, though, I cannot connect to any websites. Like I said, I am a noob, so my troubleshooting skills with Linux are very limited; is there anything I can try?

View 4 Replies View Related

General :: Remote Desktop - Access Box From Browser?

Aug 21, 2011

I use a hosted machine for work which has VNC and Apache servers running. To work on a shell, I connect to the VNC server, and to access files I host them using Apache and open them from my browser. It would be great if I could access my shells via the browser itself instead of using VNC or a command prompt.

I am looking for an end result like this: [URL]. What are my options? PS: I already tried [URL], but that uses a Java applet and does not run in the browser itself.

View 2 Replies View Related

General :: PWD To Use Browser - Root Access With Dolphin SU

Jun 8, 2010

I have Mepis 8.5 and I can access root Dolphin etc., but I cannot get past the ID and password panel when I try to use Firefox to see my localhost sites. When I installed, I knew I should not have made browser access password-protected, but it was too late, so I am stuck with it. As I haven't used it for a while, I have forgotten the user name and password for this area. How can I find them, or remove the protection, so that I have access once I put the localhost URL in the browser?

View 4 Replies View Related

Server :: Subversion Web-access, Not Able To Login From Any Browser?

Mar 9, 2011

I tried Subversion web access from a browser. It asks for a username and password; I enter both, and it just asks again. How do I log in? The configuration is shown below:

vi /etc/httpd/conf.d/subversion.conf
<Location /svn>
DAV svn

[code]...

View 1 Replies View Related

Fedora Networking :: FC9 DNS - Cannot Yum Or WGet But Can Ping And Dig

Jan 13, 2009

For some reason, some command-line tools are unable to resolve URLs, whereas other commands work as they should. I have checked most settings but am unable to find out what is wrong, and am no closer to figuring out why.

[root@subzero ~]# yum update
Loaded plugins: refresh-packagekit
[URL]: [Errno 4] IOError: <urlopen error (-2, 'Name or service not known')>
Trying other mirror.
Error: Cannot retrieve repository metadata (repomd.xml) for repository: atrpms. Please verify its path and try again
[root@subzero ~]# .....
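
A hedged place to start narrowing this down is to compare what the system resolver itself returns with what dig returns, and to check the resolver configuration; the hostname below is just an example:

Code:
# What the glibc resolver (the path yum/wget use) does with a name:
getent hosts mirrors.fedoraproject.org

# Which name services are consulted, and in what order:
grep ^hosts /etc/nsswitch.conf

# Which nameservers the resolver has been given:
cat /etc/resolv.conf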

View 11 Replies View Related

Fedora Installation :: Connect To Internet But No Browser Access

Mar 23, 2009

One machine was upgraded from F7 to F10 with no problems. On my second machine, I did a fresh install. I can connect to the internet via KPPP, but both Firefox and Konqueror fail to recognize that I'm online. I tried to create a network modem connection, but when I select "new" and "modem", then press the "forward" button, nothing happens.

View 5 Replies View Related

Fedora Servers :: PhpMyAdmin Alias - Cant Seem To Access It With The Browser

Jun 12, 2010

I have been trying to set up phpMyAdmin for a while now, but I just can't seem to access it with the browser.

I yum installed phpmyadmin. In my /etc/httpd/conf.d/phpMyAdmin.conf I have:

When I visit [ip]/phpmyadmin I just get a 404

I've reloaded Apache and checked the phpMyAdmin config file, and everything seems fine, but I still get a constant 404.
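
One hedged check: the packaged config normally maps the capitalized /phpMyAdmin path (sometimes only that one) and may also restrict access by source address, so it is worth confirming exactly what the Alias and access lines say and that Apache has re-read the file:

Code:
# Which URL paths does the package config actually map, and to whom?
grep -iE 'alias|allow|deny' /etc/httpd/conf.d/phpMyAdmin.conf

# Sanity-check the config and reload:
httpd -t && service httpd reload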

View 2 Replies View Related

Fedora :: Package - Allow To Keep All Bookmarks In A Library That Can Access From Any Browser

Feb 20, 2011

I'm looking for a package that will allow me to keep all my bookmarks in a library that I can access from any browser that I use such as FireFox, Chrome etc.

View 3 Replies View Related

Ubuntu :: No Internet Access On Browser While Torrent Download

Dec 18, 2010

I use Transmission to download torrents off the internet. The problem is that whenever I start a download, I cannot access the internet via my web browsers, Chrome AND Firefox, although I can use Pidgin or Empathy to communicate. What should I do? I am assuming it is causing some sort of congestion. I tried giving the downloads "low" priority and applying temporary speed limits, but it doesn't help.

View 5 Replies View Related

Ubuntu Servers :: Login To Access Files Via Web Browser?

Mar 1, 2011

I am currently new to all of this and not even sure if I am in the right place, but I am currently running a server with OpenSSH. When you go to my site's homepage, I have a link that redirects you to the files I will be sharing. They are currently access-denied, as I do not want the public to be able to download them. My question is: how would I go about having a login box pop up when someone clicks the link on my homepage to access the files folder? I currently have accounts set up, and it works when going through FileZilla with SFTP, but I'm not sure how to give access via a web browser.
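
A hedged sketch of the usual way to get that login box is Apache HTTP basic authentication in front of the shared directory; the paths and user name below are placeholders, and these accounts are separate from the SSH/SFTP accounts unless they are deliberately wired together:

Code:
# Create a password file and a first user (paths and name are placeholders):
htpasswd -c /etc/apache2/files.htpasswd alice

# Protect the shared directory with an .htaccess file:
cat > /var/www/files/.htaccess <<'EOF'
AuthType Basic
AuthName "Shared files"
AuthUserFile /etc/apache2/files.htpasswd
Require valid-user
EOF

This only takes effect if AllowOverride AuthConfig (or All) is enabled for that directory; otherwise the same directives can go into a <Directory> block in the site configuration instead.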

View 1 Replies View Related

General :: Terminal Server Site Access From Browser?

Sep 2, 2010

I'm going away for the weekend and will only be able to bring my laptop, which has the latest release of Ubuntu on it. My question is: will I be able to access my company's Terminal Server site [URL] from a browser in Linux, and be able to launch the applications (Outlook, etc.)? It seems to connect to Windows Server 2008. If this works in Linux, that would make my life a lot easier.

View 2 Replies View Related

General :: Unable To Access Port 8008 From Web Browser

Jan 22, 2011

This is my first post, so I'll put it in "Newbie". I seem to have successfully configured my httpd.conf file to listen on port 8008, and I restarted httpd OK. However, when I go to port 8008 in a web browser from another computer on the internal network, by browsing to 192.168.2.5:8008, it doesn't connect. When listening on port 80, I can browse to the IP address fine. Is this a server-side or client-side issue? I am using Fedora 12. Thanks in advance, and tell me what forum this would best fit in.
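
A hedged way to narrow down server-side versus client-side: confirm that httpd is really bound to port 8008 on an externally reachable address, then check whether the firewall accepts it; the commands assume Fedora's stock iptables setup:

Code:
# Is httpd listening on 8008 on all interfaces (0.0.0.0) or only 127.0.0.1?
netstat -tlnp | grep :8008

# Does the firewall accept connections to 8008?
iptables -L -n | grep 8008

# If not, open the port (and save so it survives a reboot):
iptables -I INPUT -p tcp --dport 8008 -j ACCEPT
service iptables save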

View 11 Replies View Related

Server :: Can't Access Squid Proxy From Firefox Browser

Oct 18, 2010

I have Fedora Core 13 running. I have successfully (I think) installed squid, although I may have it configured incorrectly. I can ssh into the box from work via PuTTY, but I can't use the proxy; I get the message "the connection to the server was reset while the page was loading". I can use the proxy from my home network, and have watched the tcpdump for port 3128 while using the proxy. I have turned off iptables completely (I'm not sure yet how to allow just squid)...
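
Two hedged things worth ruling out from the work side: whether port 3128 is reachable from there at all, and whether squid's access rules cover the work source address (a stock config usually only allows the local network); the hostname and address range below are placeholders:

Code:
# From the work machine: can the proxy port be reached at all?
nc -zv home.example.org 3128

# On the squid box: which sources are allowed?
grep -E '^(acl|http_access)' /etc/squid/squid.conf | grep -v '^#'

# Allowing the work network would look roughly like this (placeholder range):
#   acl worknet src 203.0.113.0/24
#   http_access allow worknet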

View 3 Replies View Related

Fedora Networking :: Ping But Can't Wget Etc - Complains Of DNS?

Aug 6, 2009

I have a pretty strange problem: I can ping www.yahoo.com:

Code:
[root@localhost ~]# ping www.yahoo.com
PING www-real.wa1.b.yahoo.com (69.147.76.15) 56(84) bytes of data.
64 bytes from f1.www.vip.re1.yahoo.com (69.147.76.15): icmp_seq=1 ttl=52 time=20.1 ms
64 bytes from f1.www.vip.re1.yahoo.com (69.147.76.15): icmp_seq=2 ttl=52 time=20.7 ms
64 bytes from f1.www.vip.re1.yahoo.com (69.147.76.15): icmp_seq=3 ttl=52 time=23.3 ms

[Code]...

View 7 Replies View Related

Ubuntu Networking :: Crontab And Wget With Terminal?

Sep 13, 2010

I used crontab to start wget and download files with the following entry:

Quote:

14 02 * * * wget -c --directory-prefix=/home/Downloads/wget --input-file=/home/Downloads/wget/download.txt

But it doesn't show a terminal, so I am not able to see the current status or stop wget. How can I start wget with a terminal using crontab?
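
Cron jobs never get a terminal, so two hedged workarounds are to start wget inside a detached screen session that can be attached to later, or simply to log to a file and tail it; the paths below match the crontab entry above:

Code:
# Option 1: run the job inside a detached screen session
14 02 * * * screen -dmS nightwget wget -c --directory-prefix=/home/Downloads/wget --input-file=/home/Downloads/wget/download.txt
# later, from any terminal:  screen -r nightwget   (detach again with Ctrl-a d)

# Option 2: write a log file and watch it when needed
14 02 * * * wget -c -o /home/Downloads/wget/wget.log --directory-prefix=/home/Downloads/wget --input-file=/home/Downloads/wget/download.txt
# later:  tail -f /home/Downloads/wget/wget.log
# stop the download with:  pkill -f download.txt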

View 1 Replies View Related

Networking :: Curl And Wget Error 400 Bad Request?

Nov 9, 2010

I use slackware current, and curl and wget give the following errors:

Code:
repo@cannabis ~]$ wget -r http://users.telenet.be/reggersjans
--2010-11-09 13:48:14-- http://users.telenet.be/reggersjans
Resolving users.telenet.be (users.telenet.be)... ::ffff:74.117.221.11, 74.117.221.11
Connecting to users.telenet.be (users.telenet.be)|::ffff:74.117.221.11|:80... connected.
HTTP request sent, awaiting response... 400 Bad Request

[Code]...
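
The output above shows an IPv6-mapped address (::ffff:74.117.221.11) being used; it may or may not be related, but a quick hedged test is to force IPv4 and compare what each tool actually sends:

Code:
# Force IPv4 only, to rule IPv6 handling in or out:
wget -4 -r http://users.telenet.be/reggersjans
curl -4 -O http://users.telenet.be/reggersjans

# Compare the requests each one sends:
wget -d http://users.telenet.be/reggersjans 2>&1 | head -n 40
curl -v http://users.telenet.be/reggersjans -o /dev/null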

View 7 Replies View Related

Networking :: Grabbing Wiki Code Using WGet

Aug 2, 2010

I would like to grab wiki code from a wiki page using wget. Running this grabs HTML:
wget -O wikihtml.html [URL]
The first attempt at getting wiki code was to pretend to edit, and run:
wget -O wikiedit.html [URL]
but of course that grabs the GUI HTML. I thought perhaps the text inside the edit box would be intact, but HTML is throughout. How do I get just the raw wiki code?
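
If the wiki is MediaWiki (an assumption, since the actual [URL]s are not shown), the raw wikitext of a page is exposed through action=raw on index.php, so something along these lines should return just the markup; the URL and page title are placeholders:

Code:
# Fetch only the wikitext of a page (MediaWiki assumed):
wget -O wikitext.txt "http://wiki.example.org/index.php?title=Some_Page&action=raw"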

View 2 Replies View Related

Networking :: Yum Install / Update And WGet Do Not Work

May 6, 2010

I have a CentOS 5 server running as a web server. The web services are okay, and ping and ssh work fine both ways. But when I try wget, or yum update or install, I get a timeout. The URLs resolve properly, and there's no difference whether iptables is turned off or on.
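
Since names resolve and ssh works both ways, a hedged next step is to test a raw outbound HTTP connection from the server and check whether this network expects a proxy; the mirror hostname is just an example:

Code:
# Can the server open port 80 outbound at all?
nc -zv mirror.centos.org 80

# Does a plain HTTP request complete?
curl -v http://mirror.centos.org/ -o /dev/null

# Is a proxy expected on this network?
env | grep -i proxy
grep -i proxy /etc/yum.conf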

View 2 Replies View Related