Out of nowhere, Firefox has just decided to not load web pages anymore. I've restarted my router, restarted my wifi card, restarted my computer, *completely* uninstalled Firefox and then reinstalled it, and still no luck at all. Anyone have any idea how to fix this? I'm stuck using Google Chrome now; I like it, but I need the add-ons Firefox has (Adblock Plus, Xmarks, DownThemAll, etc.).
Here are my specs: Ubuntu 10.04 64-bit, Acer Aspire 3680, Intel Celeron M 1.6 GHz (single core), 1 GiB RAM, 40 GiB hard drive.
Basically all it does is sit there saying "Connecting to www.google.com..." for about 15 seconds, then it gives me one of two pages: "The connection has timed out - The server at www.google.com is taking too long to respond." or "Unable to connect - Firefox can't establish a connection to the server at www.google.com." Google Chrome, Skype, Ubuntu's built-in messenger, System Update, and so on connect to the internet just fine.
I have an odd problem loading certain webpages (not all) from the New York Times in Firefox.
If I try to load this page, for instance: http://well.blogs.nytimes.com/2010/1...-shoes-or-not/ the page will not load and I see the text "Waiting for graphics8.nytimes.com..." in the status bar.
I did a complete reinstallation of Firefox and it worked fine (yesterday). Today the problem is back again.
I tried the page on a friend's computer also running Firefox in Ubuntu and it worked fine.
I tried it in Windows Firefox and it worked fine.
I have the same add-ons/extensions in Windows as I have in Ubuntu and, in any case, I have removed them all once and tried the page with the same non-result. I also noted that the same thing happens if I try to load the page in Google Chrome in Ubuntu.
Pages do not load on certain sites (LinkedIn and Which?, for example). The sites themselves load, but when I click on links the pages load forever. With Which? I can close FF and open it again and then the link loads, but after a few clicks they stop loading again. It happens in all browsers (at least I tried FF and Chromium) but is OK on Windows.
Yesterday I upgraded from 9.10 to 10.04 using the built-in upgrader. After upgrading, whenever I open Firefox, it just says "The connection has timed out" for every page, but I am connected to the internet. w3m will not load any web pages either. I know I am connected, because Pidgin works, apt-get works, and ping and telnet work.
I will ping google.com with success each time, then attempt to load google.com in Firefox and it will say that it cannot connect to the host. I have tried both "Connect directly to the internet" and "Connect using system proxy settings" for Firefox (the system settings are to connect directly to the internet).
I then thought that maybe something was blocking port 80, so I used telnet (telnet google.com 80), and with that I got the HTML response of [URL]. What can I do to make my computer able to browse the web again? I am running Ubuntu 10.04 with kernel 2.6.32-22 on a Compaq Presario CQ60-417DX notebook.
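When command-line tools (ping, telnet) reach the web but browsers don't, a stray proxy setting is one common culprit, since Firefox's "system proxy settings" option can pick up environment variables the other tools ignore. A minimal check, assuming nothing about this particular machine:

```shell
#!/bin/sh
# Print any proxy-related environment variables that a browser might honor.
proxies=$(env | grep -i proxy || true)
if [ -n "$proxies" ]; then
    echo "$proxies"
else
    echo "no proxy variables set"
fi
```

If anything shows up (http_proxy, all_proxy, ...), unset it and retry; if not, another frequent suspect from that era is IPv6 name resolution, which can be ruled out by toggling network.dns.disableIPv6 in about:config.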
I can't seem to pull up any web pages, but I can do everything else on my Ubuntu box. This seems to have started within the past day or so. My /etc/resolv.conf file looks solid. Again, I can ssh, VNC, and ping to the outside but just can't load any web pages.
I've just installed Midori, and overall I like this browser. But certain pages aren't loading in Midori, like this forum, Fedoraforum.org. On exactly the same laptop, I can open this page with Firefox. How can I fix this?
I seem to have stumbled on a problem regarding internet connectivity. My web browser (Firefox or any other) can't completely load the desired webpages, like yahoo.com or even google.com. But after reading different threads in this forum, I have convinced myself that there is an internet connection: when I ping different websites, I seem to get good responses from them; ifconfig responds well enough; even "route" shows my router/gateway IP correctly. Only when I try to browse a webpage do I get no results. The problem, I think, lies in the proxy settings or the DNS.
I have Apache running on my server, and also ZoneMinder, a surveillance system, running on the same machine. Both services run without glitches, and I think Apache's config as well as ZM's config are fine. I am not sure I understand how Apache works (not to mention the whole combination of ZoneMinder, Apache, and the web browser); it's pretty hard to manage when you don't know what you are doing. When I try the ZoneMinder webpage that is supposed to work, I get nothing (a blank page), or sometimes a "Not found" error message. The latter seems to come from Apache, because it is in the same font as the "It works!" message I get when I try http://localhost:80.
The only bit of information I have so far is in the Apache error log (/var/log/httpd/error_log), and it says: Code:
[Sun Mar 21 00:35:14 2010] [error] [client 192.168.0.100] script '/srv/httpd/htdocs/zm.php' not found or unable to stat
[Sun Mar 21 00:46:04 2010] [error] [client 127.0.0.1] File does not exist: /srv/httpd/htdocs/zm
It seems that "zm.php" is missing... Would that be why Apache can't find the page?
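Those errors say Apache is looking for zm.php under its DocumentRoot (/srv/httpd/htdocs), whereas ZoneMinder's web interface is normally installed elsewhere and mapped in with an Alias. A sketch of the usual mapping; the paths here are typical defaults and may well differ on your distribution, so check where your package actually put ZoneMinder's www directory first:

```apache
# Hypothetical paths; verify where zm.php actually lives before copying.
Alias /zm /usr/share/zoneminder/www
<Directory /usr/share/zoneminder/www>
    Options -Indexes +FollowSymLinks
    AllowOverride All
    Order allow,deny
    Allow from all
</Directory>
```

With an Alias like this in place, the page is reached at http://localhost/zm rather than under htdocs.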
I have been using Linux on my Acer Aspire One for a few weeks now without any problem. Suddenly I find I can get a WiFi connection, but it consistently fails to load pages. I have tried 2 connections, so I know it's my machine, and probably something I have touched. How do I reset the internet settings? Or better still, can I roll the computer back a few days (like in Windows) to resolve this problem?
I have been running Fedora 8 on my machine for some time now with no networking problems until yesterday. It sounds hard to believe, but I quite literally tried turning my machine on and no web pages would load. I haven't made any system updates in a week, I haven't installed any new software in over a week, and I haven't tweaked any network settings. Just opening firefox (or konqueror) and entering in a website brings an Address Not Found error.
* All other Windows boxes on my network can connect to the internet just fine. * I can ping any outside address. * I can load Google in a web browser by browsing directly to the IP address. * /etc/resolv.conf contains this:
" # generated by NetworkManager, do not edit! domain hsd1.wa.comcast.net. search hsd1.wa.comcas.net. nameserver 220.127.116.11 nameserver 18.104.22.168 nameserver 22.214.171.124 "
* I could NOT ping the first nameserver address, but the others pinged successfully. * I am not familiar with dig, but per several forums I read while researching this, I tried "dig www.google.com" and it doesn't seem to give any sort of "found a connection" type result. * KTorrent seems to be connecting to peers and downloading and uploading successfully.
Here's what I have looked into and found unsuccessful: * There seems to be an issue, mostly Ubuntu-related it seems, with connecting to the internet through ADSL modems. I don't connect through an ADSL modem, but I did try disabling IPv6; no success. * I tried disabling SELinux; nothing. * I tried stopping NetworkManager, but that seemed to cause more harm than good. * I tried stopping the VMware services (that's the last application I installed), but it made no difference.
In conclusion, it seems as though this is DNS-related in one way or another, but I am just not sure where to go from here. I need to get back online ASAP so that I can publish some updates for a client.
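A quick way to confirm the dead-nameserver theory is to pull the nameserver list out of resolv.conf and query each address directly. A sketch, run here against an inline sample file (since the addresses quoted above look anonymized); on the affected machine you would point awk at /etc/resolv.conf itself:

```shell
#!/bin/sh
# Extract the nameservers the resolver will actually use.
# /tmp/resolv.sample stands in for /etc/resolv.conf.
cat > /tmp/resolv.sample <<'EOF'
# generated by NetworkManager, do not edit!
domain hsd1.wa.comcast.net.
nameserver 192.0.2.1
nameserver 192.0.2.2
EOF
awk '/^nameserver/ { print $2 }' /tmp/resolv.sample
# Then query each listed server directly, e.g.:
#   dig @192.0.2.1 www.google.com
# A server that times out here is dead weight: the resolver waits on it
# before falling through to the next one, which looks like "pages won't load".
```

If the first server is the dead one, every lookup eats its timeout first, so removing it (or overriding DNS in NetworkManager's connection settings) should restore normal browsing.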
I have a serious problem related to network connectivity under all Linux distributions. I have tried a variety of browsers, but webpages either load partially or do not load at all. Surprisingly, the app manager still works well, and I can download apps at full speed from a terminal or the app manager. This issue renders Linux distributions close to useless for me. I am currently connected through a router, but a direct connection does not solve the problem. What's more, my phone, a Nokia N900, is experiencing the same problem with its web browser; I assume that's because the N900 runs a full Linux distribution. The internet works just fine under all Windows versions (XP, Vista, 7).
I have just terminated a program which used almost all my free RAM space, so most pages from other processes are now in swap. When I need to access another process (e.g. browser), it will reload its pages from swap resulting in a lag. I would like to load all the swapped-out pages into physical memory before I need that process, how can I do this (supposing that I have enough free RAM for this)? The only way I'm thinking of is swapoff -a, then swapon -a again.
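Cycling swap is indeed the standard trick for this. A small sketch of it with the safety check done first, assuming a kernel new enough to expose MemAvailable in /proc/meminfo (older kernels only have MemFree, which understates reclaimable memory, so the sketch falls back to it):

```shell
#!/bin/sh
# swapoff -a forces all swapped-out pages back into RAM; only attempt it
# when RAM can actually absorb them, otherwise it will thrash or fail.
avail_kb=$(awk '/^MemAvailable:/ { print $2 }' /proc/meminfo)
[ -n "$avail_kb" ] || avail_kb=$(awk '/^MemFree:/ { print $2 }' /proc/meminfo)
swap_total=$(awk '/^SwapTotal:/ { print $2 }' /proc/meminfo)
swap_free=$(awk '/^SwapFree:/ { print $2 }' /proc/meminfo)
swap_used=$(( ${swap_total:-0} - ${swap_free:-0} ))

if [ "$avail_kb" -gt "$swap_used" ]; then
    echo "safe to run: sudo swapoff -a && sudo swapon -a"
else
    echo "only ${avail_kb} kB available for ${swap_used} kB of used swap; skip"
fi
```

This is whole-system, so it also pulls in pages from processes you don't care about; there is no simple per-process "unswap" knob, which is why the swapoff/swapon pair is the usual answer.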
Excuse me if such a thread already exists; I carefully searched the forum so I wouldn't create a duplicate on the following problem. I downloaded the Ubuntu 9.04 distro on my new, fresh Fujitsu Siemens Li3710 running Vista (I'm a complete Linux noob, but I wanted to give it a second try) and burned it on the fast laptop, so there wouldn't be any errors caused by low memory during burning. I installed it and went to the Ubuntu desktop; I had all drivers installed, and everything but Amarok sound worked. I turned off my laptop and went to do my job. The next time I started up and reached the desktop, I tried to browse the web as usual, and I couldn't load a single page except pages addressed by direct IP. Skype was working just fine, and a friend running SuSE helped me out: we identified it as a DNS problem. But because of my noobness I couldn't deal with it and HAD TO switch back to Windows, which I regret so much.
I know that I'm a noob, but if I have the internet running at its best I can look at forums and resolve my other issues, so I'm asking for screenshots of DNS settings so I can fix it the next time I install Ubuntu. Also: back then I was using cable via PPPoE (and the only terminal command I know is "sudo pppoeconf"); now I'm behind a router, connecting my laptop via Wi-Fi. Will I have the same problem, and will I have to change settings to connect to my router? Back then I tried to connect to wireless networks, and the ones I could connect to on XP/Vista/7 I simply couldn't connect to on Ubuntu... I really want some real PC knowledge and to use Linux.
I had started another thread about this but gave it a terrible subject line; others may have this issue. I found the answer deep in the bowels of the documentation. My question now is: did version 20 mess this up for everyone on dialup?
Laptop: HP 6910p. Wireless card: BCM4312. No issues with wireless in 8.10 (Intrepid) with the fwcutter driver. I performed a clean install of 10.04 (Lucid) onto a brand new hard drive (July 2010), also with the fwcutter driver. No issues with wireless in 10.04 (Lucid) until a few days ago, when my laptop shut down because the battery died and I safe-upgraded to 2.6.32-24-generic. Currently I have chronic wireless connectivity issues - slow to non-existent. Ping tests (ping -c6 google.com or ping -c6 126.96.36.199) sometimes reveal 0% loss, sometimes 100% loss, sometimes "unknown host" - within 5 minutes of each other. Regardless of ping results, web pages are consistently slow to load. Skype will also cut out from time to time as the wireless connection vacillates. I have a MacBook Pro which I am using as my (wireless and consistently well-connected) control - and from which I am currently forced to draft this note.
I have 2 servers that are mirrored. They host 3 separate websites. Two of these websites are regular HTTP, and the other is HTTPS with Digest authentication as well. The reason there are 2 servers is that one is a primary and the other a secondary in case the primary goes down. Recently I decided to upgrade the secondary server and then make it the primary. I have done most of the configuration, and the sites using regular HTTP are working perfectly fine. The site using SSL is not: Apache fails to load it, and here are the errors I am receiving in the error log file:
Code:
[Fri Aug 13 09:27:00 2010] [warn] RSA server certificate CommonName (CN) `newserver.domain.com' does NOT match server name!?
[Fri Aug 13 09:27:00 2010] [error] Unable to configure RSA server private key
[Fri Aug 13 09:27:00 2010] [error] SSL Library Error: 185073780 error:0B080074:x509 certificate routines:X509_check_private_key:key values mismatch
For the first warning, I cannot find "newserver.domain.com" set as the CN anywhere except inside the SSL certificate itself. I have no idea where to even start with the other errors.
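The "key values mismatch" error means Apache was handed a private key that does not belong to the certificate, which typically happens when the cert is copied to the new server but the matching key is not (or vice versa). A common check is to compare the RSA moduli of the two files. The demo below generates a throwaway matching pair in /tmp so it is self-contained; in practice you would point the two openssl commands at the files named by SSLCertificateFile and SSLCertificateKeyFile in your vhost:

```shell
#!/bin/sh
# Demo files only; substitute the paths from your Apache SSL vhost config.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
    -subj "/CN=newserver.domain.com" \
    -keyout /tmp/demo.key -out /tmp/demo.crt 2>/dev/null

cert_mod=$(openssl x509 -noout -modulus -in /tmp/demo.crt | openssl md5)
key_mod=$(openssl rsa -noout -modulus -in /tmp/demo.key 2>/dev/null | openssl md5)

if [ "$cert_mod" = "$key_mod" ]; then
    echo "certificate and key match"
else
    echo "MISMATCH: this key does not belong to this certificate"
fi
```

The CN warning is a separate issue: the certificate's CommonName ("newserver.domain.com") differs from the ServerName in the vhost, so either align the ServerName with the cert or reissue the certificate for the name clients actually use.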
I am using BSNL WLL (Wireless Local Loop) on my Fedora system to connect to the internet. My modem is a Huawei ETS-1201, and I use wvdial. The problem is that, on every Linux distribution I have tried, webpages randomly don't load or load incompletely. This happens most during peak net traffic. I tried editing the /etc/ppp/options file to set my MTU and MRU to 576.
But it does not seem to be working. There is no such problem on Windows, where I can connect flawlessly. It takes several retries to connect to a site. I sometimes get frustrated with this and switch to Windows, but as you all know, Windows is not the right OS for people who want more from their computers.
Below are my specs: Intel Core 2 Duo E8400, Intel BOXDG43NB, 1 GB DDR2 800 MHz RAM, WD 320 GB 7200 rpm HDD.
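Rather than guessing a value like 576, the working MTU can be probed by sending pings with the don't-fragment flag at decreasing payload sizes; the largest payload that gets a reply, plus 28 bytes of IP and ICMP headers, is a safe MTU for /etc/ppp/options. A rough sketch, assuming Linux iputils ping; the target host and the candidate sizes are arbitrary choices:

```shell
#!/bin/sh
# Probe a few common payload sizes with fragmentation forbidden (-M do).
# Largest size that gets a reply + 28 (20 IP + 8 ICMP header bytes) is a
# reasonable MTU to configure for the PPP link.
target=google.com   # any reliably reachable host works
for size in 1472 1464 1400 1200 548; do
    if ping -c1 -w2 -M do -s "$size" "$target" >/dev/null 2>&1; then
        echo "payload $size ok -> try MTU $(( size + 28 ))"
        break
    fi
done
echo "probe complete"
```

If even small payloads fail only at peak hours, the problem is more likely congestion or line quality than MTU, which would match the "works fine off-peak" symptom.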
I am running OpenVZ. I want to use the VPS's IP on a particular port to load the server's web pages. For example, 188.8.131.52 is the VPS IP and 184.108.40.206 is the server's main IP. My files are on the main server at /usr/local/pages/.
I want to load the page from the URL: https://220.127.116.11:8888
This should load the files from the main server at /usr/local/pages/. Virtuozzo, an OpenVZ panel, does this, so how is it possible? Will DNAT do the trick: iptables -t nat -A PREROUTING -p tcp -d 18.104.22.168 --dport 8888 -i eth0 -j DNAT --to-destination 192.168.0.1:8888
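DNAT can do this, but the rule has to rewrite the destination to the address where /usr/local/pages/ is actually served, and a web server there must be listening on the rewritten port. A sketch that just prints the rule rather than applying it (the addresses are the placeholders quoted above; substitute your real VPS and hardware-node IPs, and the port your vhost actually listens on):

```shell
#!/bin/sh
# Placeholder addresses from the post; adjust before running as root.
VPS_IP=18.104.22.168
HOST_IP=184.108.40.206
echo "iptables -t nat -A PREROUTING -p tcp -d $VPS_IP --dport 8888" \
     "-j DNAT --to-destination $HOST_IP:8888"
# Also required on the node for forwarded traffic:
#   sysctl -w net.ipv4.ip_forward=1
```

Note that the URL above is https://, so whatever vhost answers on the target port must terminate TLS itself; DNAT only rewrites addresses, it cannot unwrap HTTPS.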
Whilst I like Ubuntu as an operating system and am generally fine with Firefox, could anyone explain why I sometimes get errors trying to get through to various web pages? I don't get the same errors in IE8. As an example, my telephone billing provider: I can log in on both IE8 and Firefox, and I can get into all the screens in IE8 but not always in Firefox, as happened to me 30 minutes ago (I think the version installed is 3.5, on Ubuntu 9.10). There is a message at the bottom of the screen showing that Firefox is trying to connect to a sub-page, but it just can't display it. Is there anything I can do to resolve this, and is it an encryption issue? This isn't the first time I've had issues with Firefox; in Ubuntu 9.04 the screen sometimes used to go fuzzy when I used the Firefox version there.
I've enabled LDAP authentication on my 2.2.15 Apache server, but now pages load very slowly. As in, 1.515s with it enabled, and 187.4ms without (just the base page, numbers collected via Firebug). Here's my LDAP config (other directives snipped) -
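(Config not shown above.) A delay of roughly 1.3 s per page is consistent with Apache performing a fresh LDAP connection and bind on every request. In Apache 2.2, mod_ldap (a separate module from mod_authnz_ldap) provides a shared cache for connections and search/bind results; a sketch of the relevant server-level directives, with values that are illustrative starting points rather than recommendations:

```apache
# Requires mod_ldap to be loaded alongside mod_authnz_ldap.
LDAPSharedCacheSize 500000   # bytes of shared memory for the cache
LDAPCacheEntries 1024        # cached search/bind results
LDAPCacheTTL 600             # seconds a cached bind stays valid
LDAPOpCacheEntries 1024      # cached compare operations
LDAPOpCacheTTL 600
```

With the cache warm, only the first request for a given user should pay the full bind round-trip; if every request still takes ~1.5 s, the next place to look is slow DNS resolution of the LDAP server's hostname.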
I have Kubuntu installed under Windows 7. It finds my wireless connection and says I am connected; even with a wired connection it does the same. But when I go to the web browser and try it out, it says there is a problem with the connection. I will include some pictures of "iwconfig" and other output.
I am using a fresh install of Ubuntu 10.04, connected to the internet via a VPN. The internet works fine; I can download and send mail. But for some reason I can't load some webpages, like Facebook, Adobe, etc. I installed all the newest plugins: Java, Flash, everything. I'm running Firefox 3.6.9. I tried everything: disabling IPv6, the proxy settings. But nothing solved it.
Two very similar laptops are connected to the same WiFi router, loading the same web pages. Firefox on Windows does it in 3 seconds, while Firefox on Ubuntu needs 1-3 minutes to load the same page. What could be the reason for this? Any idea how to fix it? The laptop with Ubuntu runs 10.04 LTS (Lucid) with Firefox 3.6.10, and on any other WiFi network it loads web pages really fast; why is it so much slower than Windows now? When testing the speed on URL... Ubuntu actually tests faster than Windows. The WiFi router has been reset, and Firefox has been started in -safe-mode; neither helped.
It seems that I have been running Firefox, unbeknownst to me, with a broken configuration. If I enable "Allow pages to choose their own fonts, instead of my selections above" in Edit->Preferences->Content->Fonts & Colors->Advanced, pages will no longer render properly -- they are mostly blank. When running from the command line, I notice that I get errors from Pango (whatever that is) and think there may be a connection.
I tried doing sudo aptitude reinstall libpango-perl libpango1.0-0 libpango1.0-common libpangomm-1.4-1, but it didn't fix the problems. (I haven't yet tried uninstalling Pango, since it looks like that removes some crucial components.) I also tried sudo dpkg-reconfigure fontconfig, and it didn't help. I'm guessing that the fonts cannot be scaled. This otherwise wouldn't be an issue, but it seems that some Firefox add-ons forcibly ignore the configured fonts, and those pop-up windows are mostly blank/empty and therefore useless.
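One more cheap thing to try before touching Pango itself: rebuild the fontconfig caches. Pango resolves fonts through fontconfig, and a stale or corrupt cache can produce exactly this blank-text behavior while leaving the libraries themselves intact. A sketch:

```shell
#!/bin/sh
# Force-rebuild the font caches; harmless to re-run as many times as needed.
if command -v fc-cache >/dev/null 2>&1; then
    fc-cache -f -v >/dev/null 2>&1
    echo "font cache rebuilt"
else
    echo "fc-cache not available"
fi
```

If that changes nothing, `fc-match sans` shows which font fontconfig actually resolves for the generic family; a missing or broken default there would explain why pages go blank the moment sites pick their own fonts.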
I'm having trouble with Firefox hanging. It takes at least two attempts to launch, having to force the initial blank page to quit. Often it will freeze my desktop when loading new pages, and I have to force quit. I've upgraded to 3.6.17 but no difference. The Mozilla help page suggests disabling add-ons to find the problem, but to no avail. Chrome works OK but I'd rather use Firefox.
For some reason, a few web pages don't use their full functionality under Firefox 4. For example: GMail uses the basic HTML interface. In LQ, Thread Tools is now a list of buttons at the bottom of the page instead of a menu at the top. Also in LQ, clicking a member's username on a post goes to the user's profile instead of showing the usual menu. What is the problem? Not sure if this should be in another thread, but I also found that Firefox 4 often just suddenly disappears with no warning or error messages. It already happened twice today; it never happened with the beta versions.
I have used Firefox for a long time, but today in openSUSE it just stopped working for no apparent reason! The only page it opens now is my default page... when I try to go to other ones, I get an error message like "the way the server is redirecting you, you will never reach your destination", or something like that. After this, it says the problem may be caused by blocked cookies... the problem is with my Firefox, because Konqueror is working OK!
I am using KVM and have created four guest operating systems on it. The host server is Ubuntu 10.04. I am running 4 websites in a reverse-proxy environment. One of our websites runs on a CentOS VM. Right now there is no traffic on the website's static HTML pages, and I don't have any clue as to why it takes such a long time to access them.
My computer freezes and I am trying to diagnose the issue. It happens when: using DeVeDe to convert AVI to DVD (almost always); using Firefox on pages with Macromedia Flash or Java apps; or with multiple terminal windows open. I am strongly leaning towards this being a HARDWARE issue.
I have been investigating some security precautions over the past several months. I use Ubuntu now instead of Windows, and the Firefox browser as well. I have installed BetterPrivacy, WOT, NoScript, and a few other add-ons. I have SELinux, ClamAV, AIDE, and chkrootkit installed on Ubuntu.
When we browse certain web sites, we get an error about the connection being reset. However, when I boot from the Ubuntu install CD and browse with Firefox - obviously with no add-ons or settings changed - we can browse to the same site with no problems. We are trying to be secure on the internet, and I don't want to lower or remove any of the settings and add-ons we added. What would cause connection resets when using Firefox/Ubuntu with the browser add-ons and OS hardening in place?
I've been using Firefox Portable for ages, mainly between my work (Windows XP) and home (Ubuntu) machine. Recently, just on Ubuntu (10.04), Firefox Portable has started freezing on pages containing Flash. Anyone else experiencing this? I've done a clean re-install of Firefox Portable, then just added Flash, and it still freezes.
I have Fedora 12, and I'm playing a game using Wine. To view a page from inside the game's own browser (it has an internet-browser-style menu), there is a button that says "open in external browser". I used it hundreds of times to save a particular page to a folder when I was on Windows, but now that I'm using Fedora it doesn't do anything. Is this something I have to configure in Firefox, or could it be something else?
I'm playing a game using Wine that has an internet-browser-style menu. One of the options the game offers is to open the current game page in your external browser. I used it hundreds of times when I was on Windows, and it worked fine; I was able to save the current page to a folder. Now that I'm using Fedora, the link doesn't work: it doesn't open Firefox, and it doesn't give me any message to let me know it even received the request.
There is an option to select a "printable form", but the form is much larger than any paper the printer supports. If I use "Print Preview" in Firefox, it creates only one page, containing the beginning of the form, and ignores the rest. If I do the exact same thing on a Windows machine, Firefox creates as many pages as necessary when I click "Print Preview". I feel that this problem is not because of Firefox, but because of how "Print Preview" works on Ubuntu. How can I get the print preview to create additional pages so the entire form is included?