Ubuntu :: Painfully Slow Access To Particular Website
Sep 1, 2011
I'm having difficulty accessing http://egrants.ed.state.pa.us/v2/default.asp while using Ubuntu 11.04 and Ubuntu 10.04. I've tried Chrome, Firefox and Opera, as well as three different ISPs. My Windows machines don't seem to have this problem.
A little background: my agency has several offices, some with different ISPs, but all with essentially the same setup: ISP router/modem --> Ubuntu server with Squid/DansGuardian --> switches, etc.

When we first encountered this problem, I assumed something was going on with Squid or iptables. But when I connect my laptop (running Ubuntu 11.04) directly to the ISP's modem/router, I see the same slowness. I figured maybe the site itself was having problems, so I asked the other tech in my office to try it (he's running Windows 7), and he has no slowness with the site when connected directly to the ISP's modem/router.

For the time being I've circumvented the filter box for the Windows users who need access to this site, but I'm not certain that's the best solution. I've contacted the site's tech support several times and just gotten a "we'll call you back". Any suggestions on things I can check or change with my setup to get this working?
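When a single site is slow only on Linux, a per-stage timing breakdown helps show whether DNS, the TCP connect, or the transfer itself is stalling (historically, TCP window-scaling quirks on intermediate devices have caused exactly this Linux-vs-Windows asymmetry). A minimal diagnostic sketch using curl:

```shell
# Break the request down by stage (DNS lookup, TCP connect, time to
# first byte, total). Large jumps between stages point at the culprit.
curl -o /dev/null -s -w 'dns:       %{time_namelookup}s\nconnect:   %{time_connect}s\nfirstbyte: %{time_starttransfer}s\ntotal:     %{time_total}s\n' \
  http://egrants.ed.state.pa.us/v2/default.asp
```

Running the same command from a Windows build of curl on the same network gives a direct comparison.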
Since I switched to Lucid (clean install), the interface has become very slow and unresponsive. I had no such problem in Karmic. Switching from one window to another, displaying menus, browsing files, resizing windows, etc. take ages.

What is strange is that this problem only affects native GNOME applications. For instance, Scribus and Blender run very smoothly, whereas Nautilus, Rhythmbox, GIMP and Inkscape, to name a few, suffer from horrible lags. I'm pretty sure it's not a driver problem: I'm using the latest nvidia-current drivers, and I've tried everything: disabling Compiz, disabling Metacity compositing, using nouveau, using the latest kernels, using X.Org server 1.8. No change.
So I know the problem only affects native GNOME apps, but how can I find which package is causing this mess? I just added a few repositories to get some recent graphics apps (GIMP, Inkscape, OpenShot, Scribus; that's it), but I can't see how they could have messed with my system.
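To narrow down which recently added packages might be responsible, dpkg's log records what was installed or upgraded and when. A sketch (the log path is standard on Ubuntu):

```shell
# Show the most recent package installs/upgrades recorded by dpkg,
# so they can be matched against when the slowdown started.
grep -E ' (install|upgrade) ' /var/log/dpkg.log | tail -n 40
```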
I recently upgraded from 9.04 to 9.10, and Firefox is painfully slow to use. Changing between already loaded tabs takes almost a second. Scrolling through a loaded tab is also slow and jerky. Other programs (e.g. Thunderbird, or just browsing the filesystem) also seem slower when Firefox is running.
I'm running Firefox 3.5.6. My PC is a Compaq Presario 2200 laptop (about four years old) with 768 MB RAM. As another data point, I also have the most recent version of Linux Mint installed on the same PC, running Firefox 3.5.3, with no problems. I've searched, and some similar problems have apparently been due to DNS issues, but I have problems even when I'm offline (wifi physically removed), looking at fully loaded webpages saved on my disk, so it's not a DNS problem. I did try steps 1 - 7
I have an Acer Revo R3610 nettop on which I installed openSUSE 11.3. It has an Atheros AR5001 wireless network adaptor, and I'm getting very bad latency and throughput with this card. In comparison, my Thinkpad with an Intel card achieves at least 10 times the throughput from the same location. Also, when I boot, my wifi comes up, authenticates, and then a few seconds later disconnects and reconnects. I've tried updating the kernel to 2.6.35, as one of the suggested fixes was to improve the performance of the ath5k driver. I'm using WPA2 authentication.
I am having problems with website load speeds. I have Fedora 15 and Windows loaded on my machine. When I am in Windows the internet connection is fine, but when I am in Fedora everything is slow. I am connecting wirelessly to a Netgear router using a D-Link DWA-552 XtremeN Desktop Adapter.
I have not installed anything for the wireless card.
I have tested my speeds from here [url] and I know that I am getting around 7000kb down.
I have gone into Firefox's about:config and set network.dns.disableIPv6 to true... but everything is still slow.
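The about:config flag only affects Firefox; if slow AAAA (IPv6) lookups are the cause, everything else stays slow. Disabling IPv6 for the whole system is a blunter test; a sketch for /etc/sysctl.conf on Fedora:

```
# /etc/sysctl.conf fragment: disable IPv6 system-wide as a test.
# Apply with `sysctl -p` or a reboot; revert if it makes no difference.
net.ipv6.conf.all.disable_ipv6 = 1
net.ipv6.conf.default.disable_ipv6 = 1
```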
Is there any tool I can use to test a web site over slow connectivity? E.g.: a web server is running at location A, and from location B I want to test the web site hosted at location A at various speeds. How does the site load from location B at 256 kbps, 512 kbps, etc.?
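On Linux, the kernel's own traffic shaping can emulate a slow link at location B without extra tools. A sketch using tc from iproute2 (requires root; eth0 is an assumed interface name):

```shell
# Throttle outgoing traffic on eth0 to roughly 256 kbit/s.
sudo tc qdisc add dev eth0 root tbf rate 256kbit burst 32kbit latency 400ms
# ...browse the site at location A from this machine, then remove the shaping:
sudo tc qdisc del dev eth0 root
```

Repeating the test with different `rate` values simulates each target speed.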
I just installed FC14 on 2 different PCs. It takes over 40 seconds for either of them to open some webpages, e.g. Amazon's home page. It is not my internet connection: I can open the same webpage on both my Windows PCs in less than 5 seconds. I was using FC9 up until a month ago, and I believe it took about 15 seconds to open Amazon, but I never actually timed it. I spend many hours a day on the internet and have used FC for many years, and I have been very happy with it. Until now. My PCs are a 1.8 GHz single processor and a 2.4 GHz dual processor. Do I have to revert to FC9 to fix this, or use a different distro?
I have Ubuntu 10.04 running LAMP and proftpd. I can access it over the LAN but not when I type my domain name in. I have forwarded all necessary ports to the local IP of that box and added rules to ufw to allow traffic on those specific ports. I use DynDNS for a second-level domain and have used all their tools to check whether the ports are open, and I get the results I am looking for.
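One common gotcha: many consumer routers do not loop forwarded ports back to the LAN (no NAT hairpinning), so a domain name that fails from inside the LAN may still work fine from outside. Two quick checks, with your.dyndns.example as a placeholder hostname:

```shell
# On the server: confirm the services listen on all interfaces
# (0.0.0.0), not just 127.0.0.1. Ports 80/21 assumed for LAMP/proftpd.
ss -tln | grep -E ':(80|21) '
# From a machine OUTSIDE the LAN (e.g. a phone off wifi), probe the
# forwarded port through the public name:
nc -vz your.dyndns.example 80
```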
After a battle with Ubuntu, Django, Apache and WSGI, I could reach the website I set up from another computer via IP address (10.37.129.6). I then restarted the server, and after booting tried to access the website from outside: permission to / denied, with the usual 403 error. Trying to fix that, I logged in to the server, and suddenly the website was available again. I typed logout on the server: no access, 403 again. Logged in: the website can be accessed. I somehow suspect this is some strange permission problem, but I don't have a clue where to start searching. The error logs just say that a request for / has been denied.
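One pattern that matches the login/logout symptom is a site stored in an encrypted home directory (ecryptfs), which is only mounted, and therefore only readable by Apache, while that user is logged in. A sketch for tracing permissions along the served path (/home/user/mysite is a placeholder for the actual DocumentRoot):

```shell
# Show owner/permissions for every component of the path Apache serves
# from; any component unreadable by the Apache user yields a 403.
namei -l /home/user/mysite
```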
I installed Ubuntu Server 10.10 on my old P3. Installation went fine, and I installed LAMP and SSH along with it. (Everything I do past this point is via LAN from my laptop, using the root account in PuTTY or WinSCP.)

I installed phpMyAdmin, logged in as root, duplicated the root account and gave the duplicate any-host (%) access. Next I created an account with access only to the users database (for use in my PHP to check username and password). Then I changed the Apache 000-default config to disable indexing, and after that I changed php.ini to allow 6 GB uploads (for movies). Rebooted... and that's where the problems start.

I CAN'T: access phpMyAdmin with any account, or log in on my website.
I CAN: log in to MySQL as root.
I HAVE: checked that the MySQL database and table for website login still exist (they do), and checked the MySQL user accounts (they are still there).

So something changed after the reboot, because it was working fine before I did.
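If the accounts still exist but logins fail, re-creating the limited web account cleanly can rule out a mangled grant. A sketch (the user name, database name and password are placeholders), assuming root access to MySQL still works locally:

```shell
# Recreate the restricted account used by the PHP login check.
mysql -u root -p <<'SQL'
CREATE USER 'webapp'@'localhost' IDENTIFIED BY 'changeme';
GRANT SELECT ON users.* TO 'webapp'@'localhost';
FLUSH PRIVILEGES;
SQL
```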
Our setup is 64-bit F13 with Firefox/NoScript under KDE. When trying to access www.nook.com we get a long pause followed by a "server not found" message. If we change to another hard drive and boot WinXP/Firefox/NoScript, we get a redirect to: http://www.barnesandnoble.com/nook/i...refront-_-nook
You can also see a thread about it here: http://www.linuxquestions.org/questi...-linux-835160/
I have an Apache web server on CentOS Linux. I can access it successfully by typing http://localhost, 127.0.0.1 or 192.168.0.150 in the address bar on the server itself, and the site loads normally with graphics. When I access the site from another computer on the same local network, I don't get the correct website: I see the site as plain HTML text, without graphics. Please see below the text file output from the browser: Also, from other machines I can only access the site by typing the 192.168.0.150 IP address in the address bar; when I type http://localhost or 127.0.0.1 there, the site does not come up. Do you see what I did wrong? How can I fix this problem?
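A common cause of a page that renders with styling locally but as bare HTML from other machines is asset links hard-coded to http://localhost, which only resolves on the server itself. A sketch for spotting such links, using the server IP from the post:

```shell
# Fetch the page as a remote client would and list any stylesheet or
# image URLs hard-coded to localhost; these must be made relative or
# point at the server's real name/IP.
curl -s http://192.168.0.150/ | grep -o 'http://localhost[^"]*' | sort -u
```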
I just bought the Barnes & Noble Nook eReader for my wife's birthday, and a requirement is to log on to the Nook website and establish an account. Using her Fedora 13 64-bit desktop and Firefox, she repeatedly tried to access www.nook.com to create an account, but consistently got the "server not found" message. Even using the Firefox addon User Agent Switcher and turning off NoScript did not correct this.
When we switched over to the WinXP install and Firefox we were able to access the website no problem. Anyone else unable to access that website via linux?
I have an internet and mail server running CentOS, and I want to restrict client machines from accessing certain websites. E.g., if I want to restrict users from accessing www.mydomain.com, how do I do it?
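If clients go through a proxy such as Squid, a destination-domain ACL is the usual mechanism. A squid.conf sketch with the domain from the post (the deny must come before any rule that would allow the traffic):

```
# squid.conf fragment: deny one domain for all clients
acl blockedsite dstdomain .mydomain.com
http_access deny blockedsite
# ...existing http_access allow rules follow...
```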
I want to add some web addresses to the hosts file so that, apart from these addresses, no website will open. For example, I give permission for [URL] and [URL]; users can then reach only these 2 websites, and all other websites are blocked.
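The hosts file can only remap names, not enforce an allow-list, and anything reachable by raw IP slips past it. A proxy is the usual tool for a whitelist; a Squid sketch with placeholder domains (the real URLs are elided in the post):

```
# squid.conf fragment: allow only two sites, deny everything else.
# Order matters: the allow rules must precede the final deny.
acl allowedsites dstdomain .example1.com .example2.com
http_access allow allowedsites
http_access deny all
```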
My company is using Squid proxy 2.6 for its internet connection. Recently we implemented a webmail system; this is the link: [URL]. Computers connected directly to the internet can access it, but those using the Squid proxy are unable to log in, although the login screen appears. Is this caused by the squid.conf settings or something else?
I set up an Apache web server on a Red Hat Enterprise Linux 6 server last week. It works fine on localhost; however, the webpage can't be accessed from other computers. I didn't modify anything related to 'allow,deny' in httpd.conf. The only thing I've done is add a rule in iptables to allow access from a computer with a specific IP address. Since I am quite new to iptables, I don't know if there is anything wrong with my setup.
Even when I stop iptables, the problem is still there. I don't know if my iptables setup is correct, or whether there is anything else I should do.
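Since the problem persists even with iptables stopped, it is also worth confirming that Apache's `Listen` directive binds all interfaces rather than 127.0.0.1. If the firewall is the issue, the HTTP rule has to appear before the distribution's catch-all REJECT; a sketch in the RHEL 6 /etc/sysconfig/iptables format:

```
# /etc/sysconfig/iptables fragment: the port-80 ACCEPT must come
# BEFORE the final catch-all REJECT line, or it never matches.
-A INPUT -m state --state NEW -m tcp -p tcp --dport 80 -j ACCEPT
-A INPUT -j REJECT --reject-with icmp-host-prohibited
```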
I have set up a CentOS 5.5 installation in a VirtualBox VM. The VirtualBox network preferences (not the VM's) have the DHCP server disabled, and the IP address and network mask are, I believe, valid for my home network. My VM is up and running; it has gotten an IP address from my home router and everything seems fine. From any machine in my house I can ssh into my CentOS installation. I went ahead and started the httpd service and the website is up; in other words, I can see it from my CentOS box when I navigate to http://localhost. However, I cannot see it from any other machine on the network. The IP address of the CentOS box is 192.168.0.141. From the CentOS box I can go to http://192.168.0.141 and see the Apache 2 test page, but I cannot see that from anywhere else, although I can ssh to that machine.
I am trying to block a few websites on Lucid Lynx. I tried editing /etc/hosts, and that blocks access via URL, but the site still opens if I enter its IP in the browser. How can I block IP access as well (without using any extra software beyond what Ubuntu 10.04 has by default)?
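Since /etc/hosts only intercepts name lookups, the browser can still reach the server by raw IP. iptables is part of a default Ubuntu 10.04 install and can drop that traffic too; a sketch with a placeholder address:

```shell
# Drop all outgoing traffic to the site's IP (1.2.3.4 is a placeholder
# for the address the hosts-file entry was meant to block).
sudo iptables -A OUTPUT -d 1.2.3.4 -j DROP
# To undo:
sudo iptables -D OUTPUT -d 1.2.3.4 -j DROP
```

Note that rules added this way do not survive a reboot unless saved.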
I'm trying to make a copy of my website on a local Ubuntu server. I have very limited access, i.e. no shell access. What is the best way to make a copy of my site? I have the ftp CLI, lftp, wget... I'm just not sure what to use and how.
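With no shell on the remote host, wget can mirror the rendered site over HTTP, or lftp can mirror the raw document tree over FTP. Sketches with placeholder host, credentials and paths:

```shell
# Mirror over HTTP: recursive, timestamped, with links rewritten for
# local browsing and page assets (CSS/images) included.
wget --mirror --convert-links --page-requisites --no-parent http://example.com/
# Or mirror over FTP into a local directory:
lftp -u user,pass -e 'mirror -c / /var/www/localcopy; quit' ftp.example.com
```

The FTP mirror gets the original source files (e.g. .php), while the HTTP mirror only captures their rendered output.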
I'd like to know: is it a common security flaw, or normal practice, to open up FTP to the public (protected with a password, of course) so that a 3rd party can maintain our public-facing / production website? If it is acceptable, what sort of FTP application should be installed on a Linux web server?
I have a Sitecom MD-253 NAS using RAID-1, equipped with two 1 TB WD hard drives. The NAS firmware is Linux of some kind, and I am running the latest firmware as of today. However, while mounting the NAS was no challenge, the response time is at the shameful end of the scale: even listing folder contents is deadly slow, taking one to three seconds before the list is shown. Both the Linux laptop and the NAS are connected by cable via the router; the XP machine is wireless, but access is no problem there. I found a few tutorials on mounting the NAS drive, but few dealt with the speed issues and none solved my problem. I did see one post in another forum discussing whether the problem could be caching, but they had no solution.
I have used different mount commands, but at the moment I use this one in fstab:

Code:
//192.168.0.190/Projects /mnt/nas cifs username=zainka,password=********,_netdev,uid=zainka,gid=users 0 0

Response time is not affected, though. With FTP the response time is actually worse, and I run into other issues: mounting it like a disk is difficult and the link must be kept alive constantly. I have also seen some comments about using NFS, but isn't that a proprietary MS protocol for file access?
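For reference, the fstab line as posted appears to have lost characters: the option string must begin with `username=`, not `sername=`. A corrected sketch (password elided as in the post, with `noatime` added to spare some metadata round-trips). Also, NFS is not a Microsoft protocol (it originated with Sun), so it is a legitimate alternative if the NAS offers it; SMB/CIFS is the Microsoft-lineage one:

```
# /etc/fstab: corrected CIFS mount line
//192.168.0.190/Projects /mnt/nas cifs username=zainka,password=********,_netdev,noatime,uid=zainka,gid=users 0 0
```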
My first post, having only installed 11.04 at the weekend on one of my old PCs. Everything is up and running pretty well. I have a 100BT home network with 4 Windows XP PCs and a Synology DS211j server. I can see and access these other shares from Ubuntu, but it's very slow to connect, approx. 45 seconds each time, compared to almost instant from any of the XP machines. Could someone suggest how this performance could be improved?
Accessing photos on my Nokia N97 mini via Bluetooth was lightning fast on 10.10, but on 11.04 each click on a folder (via Nautilus) takes well over 1 minute to display the next level, and swapping from 'Icon View' to 'List View' with around 90 photos in a folder hadn't completed after several minutes, at which point I cancelled it. Selecting 5 photos took 1-2 minutes, but after the initial wait the actual transfer was fairly fast. I have an MSI U100; the Bluetooth connection itself works without problems.
I'm running Ubuntu 10.04 as a full install from a USB flash drive. In other words, I've installed to the flash stick just as though it were a normal hard drive. This is not a Live USB/Persistent install.
The drive is an off-the-shelf 8GB Gigaware stick, and its read/write performance is pretty slow. Any time I do anything that requires disk access, it's very sluggish and tends to hang.
I'm looking for advice on things I could do to minimize the amount of disk-access made in the course of using the system, so that it will feel snappier and more responsive.
Some things I've done already: installed 'preload', a daemon that monitors which programs you use frequently and pre-loads them into RAM to reduce startup time; mounted /tmp as tmpfs (a RAM disk) and moved my Firefox and Chrome browser caches into RAM; and set noatime for my root and home partitions.
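An fstab sketch of the tmpfs and noatime tweaks described above (the device name, filesystem type and tmpfs size are examples, not taken from the post):

```
# /etc/fstab fragments: /tmp in RAM, and noatime on the root filesystem
# so reads don't trigger access-time writes to the flash stick.
tmpfs      /tmp   tmpfs  defaults,noatime,size=512m  0  0
/dev/sdb1  /      ext4   defaults,noatime            0  1
```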
Should I be trying to disable the filesystem journal as well? I'm less concerned with potentially burning out the flash drive with too many writes than I am with just making the system more responsive and nicer to use.
One other thing I was reading about is the so-called "Laptop Mode" that appears to be kernel settings to allow you to spin down a laptop hard drive: [url]
Obviously a flash drive doesn't spin, but it seems like some of those same techniques could be helpful here. Is there anyone who has experience running Linux in a situation with a very slow hard drive?
The computers I'm using this flash drive with all have between 2 and 8 GB of RAM, so moving more stuff into RAM is unlikely to be an issue.
I've got an old PC I am using as a game server (Counter-Strike and Left 4 Dead): a dual-core Athlon 5000+ with 2 GB RAM. The motherboard is rather old and has a SiS chipset with terrible graphics, causing the video to flicker at higher resolutions (anything above 1280x1024 causes problems). I decided to add a GeForce 8400GS just so I wouldn't have to deal with that terrible onboard VGA. Upon installing the card, I noticed that VNC is unusable, awfully slow even over a 1 Gbps LAN. I have installed the kmod drivers and it hasn't changed anything.
I have installed Kubuntu Lucid on my Samsung NB30 netbook. Got everything working, but the disk access seems extremely slow and noisy. Whenever larger files are being processed (software installation), the disk makes a kind of rattling sound - looks like the head is moving back and forth between two or more locations. The disk seems o.k. so far - no io errors under Linux and Windows XP boots without problems.
When I am running KTorrent it takes about 30 seconds for a web page to open, instead of 1 or 2 seconds. The download speed is very slow (200 B/s) and the status of most of the torrents is "Stalled". I have opened TCP and UDP ports 36477, 4444, 6881, 7881, 8881 and added them to the port-forwarding list in my router. No difference.
Slow access to a web site using Squid and Internet Explorer. I am trying to troubleshoot an issue I am stuck on. We have a website that loads .htm documents extremely slowly when using Internet Explorer 8 behind Squid. When we bypass the proxy and go directly out to the internet, all is fast and pages load fine, but with the proxy on, documents sometimes take up to 6 minutes to load. The issue only appears with Internet Explorer 8; I do not see it when using Firefox with Squid. I tried the no_cache directive, thinking it might be the cache, but that didn't work either. I am attaching our access.log, store.log and squid.conf.
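On the cache angle: in Squid 2.6 and later, `no_cache` is a deprecated alias for `cache`, and it only acts through an ACL. A squid.conf sketch with a placeholder domain:

```
# squid.conf fragment: bypass caching for the slow site entirely
acl slowsite dstdomain .example.com
cache deny slowsite
```

Since only IE 8 is affected, it is also worth checking IE's advanced setting "Use HTTP 1.1 through proxy connections"; without it, IE falls back to HTTP/1.0 toward the proxy, which interacts badly with some persistent-connection setups.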
The shares get mounted correctly and you can navigate through the directories and open files. The only problem is that it randomly starts going really slow, taking 30 seconds or longer to open a directory that has 2 or 3 files in it. I have tried quite a few things to fix this without any luck. It's getting to the point where I am having to consider recommending that we use Windows instead, which I would rather not do, as I think it's good for students to experience different operating systems during school.