Ubuntu :: Sites That Use A Webcam Cannot Connect To Server?
Feb 4, 2011
I'm finding that websites that require webcams, like chatrt, Omegle, MeBeam, etc., won't allow me to connect. They just say "connecting to server" or "initializing" and never get past that stage. Is this an Ubuntu Flash/Java issue, or is it something else entirely?
I have an odd Linux problem. Changed ISP recently -- Windows NT4 & W95 work fine. Linux has worked fine on previous ISPs, but on the newest ISP, I can connect successfully on Linux, but can only communicate w/maybe one out of 10 sites (Google, for ex.). Mostly I get "cannot connect to site". Some points:
- Linux is Slackware with kernel build 2.4.26.
- No settings changed from the Linux setup that worked fine on previous ISPs.
- The /var/log log file says PPP connects via CHAP successfully, no problems.
- I don't see any similarity among the few sites I can connect to; it seems random.
- The ISP is useless; they say they "don't support Linux".
- Using Firefox & Konqueror browsers.
I installed 10.10 yesterday on my old HP Pavilion ze2000 as a dual boot with Windows XP, and the wireless was working normally. I updated everything it asked to update and rebooted a few times for some of the updates. Today when I turned it on, it connected to the router normally, but when I opened Mozilla or tried to ping the router IP in Network Tools, neither worked. I couldn't open any site in Mozilla or ping any IP, although it says I'm connected to my wireless router as it normally would be. When I boot into XP, Firefox there works just fine.
I have just installed Ubuntu 9.10 in VMware Workstation 6.5.1 and am unable to connect to external sites or servers (using NAT). I can ping by host name and by IP, but can't resolve either in the browser (I get the message "the connection has timed out"). I can't ping the host (Windows 7 Professional) IP from the guest, and vice versa. I use a mobile wireless network card, and my Windows operating systems work successfully in VMware. Currently in my VMnet8 properties I just have "Obtain IP address and DNS automatically" (same as the host).
DM9, 2GB RAM, 32GB SSD, Ubuntu 10.04 UNR. At Panera and at my local library, an agreement page comes up when I try to connect to the Internet. It comes up on my iPod Touch and on my MacBook. On my DM9 that page does not come up for me to sign in, so I can't get on the Internet. I have had Firefox and Chromium running at different times, with the same results. When there is a WEP password or no password, it connects. What do I need to do to get connected to the Internet at Panera and the like?
Over the last 3 or 4 days, I have been unable to load sites that serve their images, scripts, and what have you from Amazon's CloudFront domain.
[URL]
I have made no changes to any of my networking files, hosts.{allow,deny}, or DNS settings.
Connections don't produce any errors; pages just continually fail to load. Stopping the page load after a while reveals the raw HTML in some cases (Quora and Blekko).
Tested in Firefox, Chromium, Midori and Vimprobable.
I have booted into another distro and pages resolve immediately.
I have disabled IPv6 and my firewall, to no effect.
I'm running squeeze/testing. Two or three days ago I became unable to get my mail from pop.gmx.net. A bit of investigation and I found that the error message I was getting was: "could not resolve pop.gmx.net: Name or service not known". Now the funny thing is that I can ping, traceroute, or nslookup the site successfully. Telnet, rlogin and ssh give the same: "could not resolve pop.gmx.net: Name or service not known" error message.
I have tried using the IP address instead of the pop.gmx.net and everything works fine. There is no issue with connecting to pop.gmail.com on the same port. Obviously there is some cache somewhere with a messed up pop.gmx.net. Any idea as to how I can track this down?
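When ping and nslookup succeed but ssh/telnet fail with "Name or service not known", the usual difference is that nslookup queries the DNS server directly, while most programs resolve through the C library's NSS path (and any nscd cache sitting in front of it). A short sketch of where to look — pop.gmx.net is the name from the post; whether nscd is installed is an assumption to check:

```shell
# Resolve through the same NSS path that ssh/telnet use (unlike nslookup,
# which talks to the DNS server directly):
getent hosts pop.gmx.net || echo "NSS-path lookup failed (matches the symptom)"

# Confirm the lookup order the C library uses, e.g. "hosts: files dns":
grep '^hosts:' /etc/nsswitch.conf

# If nscd is running, a stale hosts cache causes exactly this; flush it:
nscd -i hosts 2>/dev/null || echo "nscd not running here (run as root on the affected box)"
```

If the getent line fails while nslookup works, the problem is in /etc/nsswitch.conf, /etc/hosts, or nscd's cache rather than in DNS itself.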
In the past week or so I've noticed some weird network behaviour. I find accessing some sites, such as Amazon, PayPal, and Bigstockphoto, really slow. Sometimes the page will not load at all. Other sites are fine. The problem sites are not a problem for others on my LAN at home. When I try to open the problem sites, I can see blocked connections in Firestarter coming from 2.1(8/9).xxx.xxx on various ports such as 36007. This only happens for the problem sites. I attached a typical output from Firestarter.
This happens with Firefox or Chrome. Using Ubuntu 10.10.
My sites on the server are loading very slowly, though my server load is under 1.00. What could be the problem? Whenever I restart httpd, the sites start loading instantly again.
Recently I've been earning money doing web development: PHP, HTML/CSS, MySQL, and so on. What I have encountered a lot are clients that need a complete solution. They need their site built, but they also need a hosting solution. I've sent more than just a few clients off to GoDaddy, and quite frankly, I'd like to cash in on some of that.
It would do wonders for my business if I could offer them a hosting solution with full support on top of building their site. My problem is I have no idea how to do this. So I'd like to know how I can host multiple sites on the same server. Does anyone know of a nice guide I can follow to set this up? It's really important that I can add sites fairly easily over the Internet, since I will be away at school and won't have direct access to the server.
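The standard way to host several sites on one server is Apache name-based virtual hosts: one DocumentRoot and one short config file per site, all sharing the same IP. A sketch using a hypothetical domain (on Debian/Ubuntu the file goes in /etc/apache2/sites-available/ and is enabled with a2ensite):

```
# /etc/apache2/sites-available/example1.com.conf  (hypothetical domain)
<VirtualHost *:80>
    ServerName example1.com
    ServerAlias www.example1.com
    DocumentRoot /var/www/example1.com
</VirtualHost>
```

Repeat one such block per site, then `a2ensite example1.com` and reload Apache. Adding a site later over SSH from school is then just a new file, a new directory, and a reload.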
I have a server with 8 GB of RAM, a quad-core Q9550 CPU, 500 GB of hard-disk space, and a 100 Mbps net connection. I have separated it by virtualization into 3 parts of 2 GB of RAM each, so there are 3 VPSes. The problem is that one of the VPSes, created with CentOS and WHM/cPanel, would go down very often, as many as 60 times per day! I asked one of my friends to have a look at it, and he told me that Tomcat was the reason. Things got a little better (ask me what I changed about Tomcat!), and after that the server would go down only about 20 to 30 times per day. I asked the friend again, and he told me that the cPanel-integrated antivirus was causing the high load periodically. (Ask me again what I changed!) After that, the server would go down only about 4 to 10 times per day. Now, as far as I can tell, there is nothing more that would or should cause the server to go down.
About 30 minutes before writing this post, the hosted sites were unavailable again, for about 30 seconds. I checked the load through SSH and saw that the load was 3, so I went to the main server, checked the VPS RAM, and noticed that only 10% of the CPU was in use and only 307 MB of the available RAM was used. Everything shows that the server should be functioning OK, but it still seems to become unavailable far too often.
I used the top command and saw nothing using the CPU or the RAM; everything was at 0. I used the mysqladmin proc command, and again nothing noticeably weird was apparent at the time. I also checked the other VPSes, and they were likewise in relatively low-resource-consumption states. Also, sometimes when the load goes high and I run top, I see that all of the users' sites are using CPU, even sites with no visitors, as if some refreshing or updating task hits all of the accounts at once (MySQL, for example). Please help me figure out what else might be causing these high loads and the unavailability.
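For intermittent stalls like this, a snapshot taken at the moment of trouble is worth more than inspecting the box afterwards. A minimal sketch that can be run from cron every minute so a spike leaves a trace (the log path is my choice, not from the post):

```shell
#!/bin/sh
# Append one snapshot of load, memory, and the top CPU consumers to a log.
# Run from cron, e.g.:  * * * * * /usr/local/bin/load-snapshot.sh
LOG=${1:-/tmp/load-snapshot.log}
{
    date
    uptime
    free -m
    ps aux --sort=-%cpu | head -n 5
    echo "---"
} >> "$LOG"
```

When the sites next go unavailable, the entries around that timestamp show which process (Tomcat, the antivirus scan, a cron job touching every account) was actually eating the CPU.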
After a recent CentOS upgrade, my PHP has gone nuts. About 50% of the time, the websites show just fine from my web server. Most of my sites are WordPress-related; all have the same issue. Standard HTML sites load okay all the time, though.
I have a dedicated server with two sites, site1 and site2, which need to share the same images folder. The situation is a shopping site (site1) with one category of products that we have decided to set up as a dedicated site (site2).
Our system uploads the images to the site1 directory, i.e. /home/site1/public_html/images/image1.jpg
I don't want to duplicate the images into site2's images directory, so is there a way of sharing them via the path /home/site2/public_html/images/image1.jpg so that it pulls the site1 image?
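Assuming both document roots live on the same filesystem (as the paths in the post suggest), a symlink avoids duplicating the files. The sketch below runs in a scratch directory so it is self-contained; on the real box the two roots would be /home/site1 and /home/site2:

```shell
# Demo in a scratch tree; in production the roots are /home/site1 and /home/site2.
ROOT=/tmp/shared-images-demo
rm -rf "$ROOT"
mkdir -p "$ROOT/site1/public_html/images" "$ROOT/site2/public_html"
echo "jpeg bytes" > "$ROOT/site1/public_html/images/image1.jpg"

# The actual trick: make site2's images directory a symlink to site1's, so
# .../site2/public_html/images/image1.jpg serves site1's copy.
ln -s "$ROOT/site1/public_html/images" "$ROOT/site2/public_html/images"

cat "$ROOT/site2/public_html/images/image1.jpg"   # same file, no duplication
```

For Apache to follow the link, site2's vhost needs `Options FollowSymLinks` on that directory; alternatively, an `Alias /images /home/site1/public_html/images` inside site2's vhost achieves the same mapping without touching the filesystem.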
For some reason my DNS servers aren't able to resolve certain names. Most names resolve fine; there are just a few that don't work. Nslookup doesn't work either, of course, and curiously neither does "whois".
I was just wondering what the best approach is to serve mobile-device versions of sites on my web server. It's an Apache 2 server running on CentOS, and I am very curious how best to present a site in plain standard output for mobile, i.e. with no server-side programming or scripts (though I would like to add some eventually). I actually have someone looking for a person on his development team who can do this, and I would love to be that person, lol.
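With no server-side scripting, the lightest-weight approach is a separate static mobile page plus a mod_rewrite rule keyed on the User-Agent header. A sketch for an .htaccess file or vhost block — the /mobile/ path and the agent list are illustrative only, not a complete set:

```
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} "iphone|ipod|android|blackberry|opera mini" [NC]
RewriteRule ^$ /mobile/index.html [L,R=302]
```

This redirects only the front page; extending the pattern to other paths, or moving the logic into server-side code later, builds on the same rule.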
Is there a way to prevent users of an OpenVPN service from accessing restricted sites? I know this can be done through a proxy server, but through a VPN there seems to be no way of preventing traffic to porn sites or other restricted content, as the traffic is encrypted. I am using a VPN in the same fashion as a proxy server, except that the VPN is necessary because some video sites use RTMP on port 1935, which a proxy server cannot route.
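One thing worth noting: the traffic is only encrypted inside the tunnel. At the VPN server it emerges as ordinary IP traffic, so it can be filtered there like any other. For example, plain HTTP arriving from the tunnel interface can be redirected into a local Squid so the same ACLs apply to VPN clients; a one-rule sketch, where the interface name tun0 and the ports are assumptions about the setup:

```
iptables -t nat -A PREROUTING -i tun0 -p tcp --dport 80 -j REDIRECT --to-port 3128
```

HTTPS cannot be transparently proxied this way without interception; for encrypted destinations the usual fallback is blocking by destination IP or by DNS at the VPN server.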
When I start it for the first time, my sites load fast when requested through the browser, but slow down gradually. What could be the reason? My CPU load seems OK. I have another site hosted through Apache on the same server, and it loads fast, so what could be the issue?
I have different subnets running on my network, 192.168.1.0 through 192.168.13.0, and I want to restrict the subnets to different sites. For example: for 1.0 to 6.0 I want to give them 3 sites to visit; for 8.0, some different sites to visit; and for 13.0, allow all traffic but block a few sites. I want to use url_regex for this and place the site names in different files.
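Squid expresses exactly this with src ACLs per subnet group and url_regex ACLs read from files. A squid.conf sketch — the list filenames are my own placeholders, and the /24 masks are an assumption about the subnets:

```
# Subnet groups (adjust masks to match the real networks):
acl net1to6 src 192.168.1.0/24 192.168.2.0/24 192.168.3.0/24 192.168.4.0/24 192.168.5.0/24 192.168.6.0/24
acl net8    src 192.168.8.0/24
acl net13   src 192.168.13.0/24

# Site lists, one regex per line in each file:
acl sites1to6 url_regex -i "/etc/squid/sites-1to6.txt"
acl sites8    url_regex -i "/etc/squid/sites-8.txt"
acl banned13  url_regex -i "/etc/squid/banned-13.txt"

# 1.0-6.0: only their three sites; 8.0: only its list; 13.0: everything except a few.
http_access allow net1to6 sites1to6
http_access deny  net1to6
http_access allow net8 sites8
http_access deny  net8
http_access deny  net13 banned13
http_access allow net13
```

Squid evaluates http_access rules top-down and stops at the first match, so each subnet's allow line must come before its deny.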
I have seven departments in my office. I want to restrict web sites for all the departments, but not the same web sites for every department, i.e. different sites for different departments. I have no idea how to approach this.
We are using Red Hat Linux 4. We blocked certain sites through Squid for certain IP addresses. We want to unblock these sites at particular intervals, i.e. during noon time and after 4:00 pm.
Here we have given
In sites.txt we have given the names of those sites that should be blocked.
The problem is how to unblock the sites in sites.txt for particular intervals.
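Squid can do this directly with time ACLs combined with the existing url_regex list. A sketch assuming the blocked names live in /etc/squid/sites.txt as described (the exact windows are illustrative; MTWHF means Monday through Friday):

```
acl blockedsites url_regex -i "/etc/squid/sites.txt"
acl lunch   time MTWHF 12:00-13:00
acl evening time MTWHF 16:00-23:59

# Allow during the unblock windows, deny the rest of the day:
http_access allow blockedsites lunch
http_access allow blockedsites evening
http_access deny  blockedsites
```

Because rules are read top-down, the two allow lines must precede the deny; outside the named windows neither time ACL matches, so the deny takes effect.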
I have a Linux server running Debian, and I want to put a site on it. I installed Apache, PHP, MySQL, and phpMyAdmin. I really thought it would be much more difficult to load my site's database, but I can now load my database onto the server with phpMyAdmin after a very simple installation. I could not believe it. I'm studying how to make FTP work (apt-get site-ftp?) and waiting for my host to install DirectAdmin. But I tried to manage the Linux console, and it was very exciting for a newbie like me!
Now I can write text files on my server with echo, etc., and I have a database, but I have no site files yet. While studying tutorials to make Apache work, I frequently read about the directory /var/www, and my Debian console confirms that it is an existing directory. How can I use this directory?
Is there a list of Debian Linux server commands to copy, delete, and paste a file from my PC desktop to the folder /var/www?
Is it possible without FTP? Another question: what bad mistakes can a newbie like me make configuring Apache on Linux? I ask because I've read everywhere that I can't manage a server system if I'm not a sysadmin, but I only need to put up a site... what is the real risk?
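Yes, it is possible without FTP: scp ships with the OpenSSH client and copies files over the same SSH connection used for the console. A sketch where "me" and "myserver" are placeholders for the real login and hostname (the `|| true` just keeps the sketch harmless until the placeholders are edited):

```shell
# Copy a file from your desktop into /var/www on the server:
scp ~/Desktop/index.html me@myserver:/var/www/ || true

# Copy a whole directory:
scp -r ~/Desktop/mysite me@myserver:/var/www/ || true

# Delete a file on the server:
ssh me@myserver 'rm /var/www/oldfile.html' || true
```

Note that /var/www is usually owned by root, so either copy as a user with write access there or adjust the directory's ownership first.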
I have set up an SSL site for my default Apache server, but I want to set up multiple SSL sites for multiple IP-based as well as name-based virtual hosts. Is there a way I can include definitions for SSL certificates and keys within the VirtualHost directive in httpd.conf, so that I can specify a separate key and cert file for every virtual host?
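Yes: mod_ssl directives are valid inside a <VirtualHost> block, so each SSL vhost can name its own certificate and key. A sketch with hypothetical IPs and paths:

```
<VirtualHost 192.0.2.10:443>
    ServerName site1.example.com
    SSLEngine on
    SSLCertificateFile    /etc/pki/tls/certs/site1.crt
    SSLCertificateKeyFile /etc/pki/tls/private/site1.key
</VirtualHost>

<VirtualHost 192.0.2.11:443>
    ServerName site2.example.com
    SSLEngine on
    SSLCertificateFile    /etc/pki/tls/certs/site2.crt
    SSLCertificateKeyFile /etc/pki/tls/private/site2.key
</VirtualHost>
```

This works cleanly for IP-based vhosts. Several name-based SSL hosts on a single IP additionally require SNI support in both Apache/OpenSSL and the clients; browsers without SNI will be handed the first vhost's certificate regardless of the name requested.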
I dual-boot XP and FC14 and have 2 routers. I can connect to and ping one of these routers from FC, and I have an IP address; I just can't load any websites. When I connect to the other router (my main router), it works fine. When I boot into XP and connect to the problem router, I can load pages fine. It's only when I'm on FC14 connected to the problem router that I can't load pages, even though I have an IP and can ping around.