I am using Squid 2.6 as a proxy server on my CentOS 5 box. The client computers are fetching web pages successfully, and the firewall (iptables) is already disabled. The problem is that we have an internal web-based application in which users enter data. When a user types the address in the browser, i.e. http://10.1.7.21:81/mis, Squid shows:

Code:
ERROR
The requested URL could not be retrieved

While trying to retrieve the URL: http://10.1.7.21:81/mis/
The following error was encountered:
Access Denied.
Access control configuration prevents your request from being allowed at this time. Please contact your service provider if you feel this is incorrect.
Your cache administrator is root.

We have another proxy server, MS ISA 2006, and by switching the proxy from the Linux Squid to MS ISA we can access the page.
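A likely cause (an assumption, since the squid.conf isn't shown) is that port 81 is missing from Squid's default Safe_ports list, so the stock "http_access deny !Safe_ports" rule fires. A minimal sketch of the fix in squid.conf:

Code:
# add port 81 to the allowed destination ports;
# this acl line must appear before "http_access deny !Safe_ports"
acl Safe_ports port 81

Then reload with `squid -k reconfigure` and retry from a client.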
I installed DansGuardian and it is now working fine, but I have a small problem. People are bypassing the proxy settings in Firefox: they go into the settings and change the proxy to "no proxy". How do I prevent this? How can I force people to use the proxy to connect to the Internet? I did some googling but was unable to find a solution.
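One common approach (a sketch, assuming the proxy box is also the gateway and Squid listens on port 3128; adjust addresses and interfaces to your network) is to use iptables so direct web traffic is either dropped or transparently redirected into the proxy:

Code:
# drop forwarded web traffic that tries to go around the proxy
iptables -A FORWARD -p tcp --dport 80 -j DROP
# or instead, transparently redirect it into Squid
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 3128

With the DROP rule, users must set the proxy to get out at all; with REDIRECT they go through it whether they configure it or not (transparent mode also has to be enabled in squid.conf, e.g. "http_port 3128 transparent" on Squid 2.6).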
I have tested the networks at several schools in the area, and at the town hall. It is not possible to surf the web on any of these networks using a PC running Linux. My conclusion is that there has to be some kind of traffic filtering that excludes PCs running Linux.
From the same PC I can send and receive email, I can ping and trace (mtr) addresses on the web, and I can view web pages on servers inside the filtering gateway. The filter used is InterScan Web Security Virtual Appliance from Trend Micro. I have also demonstrated to the admins at the town hall that a Linux PC on a "clean" network surfs without problems. With these small tests I have shown that Linux is not the problem.
Tomorrow I'm going to visit the network provider's admins so they can see what happens when a PC running Linux tries to access the web. What kinds of things should I test to document or pinpoint the problems? So far I have used mtr to document slow responses, wget --no-proxy to document that web requests hang and time out, ifconfig to show NIC settings, and route...
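Since the appliance sits between the browser and the web, one thing worth testing (an assumption about how such filters classify clients) is whether it discriminates on the HTTP User-Agent header. A quick comparison from the Linux PC:

Code:
# baseline request
wget --no-proxy -O /dev/null http://www.example.com/
# the same request pretending to be IE on Windows; if this one
# succeeds, the filter is keying on the User-Agent string
wget --no-proxy -O /dev/null \
     --user-agent="Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1)" \
     http://www.example.com/

Saving both outputs with timestamps gives you something concrete to show the admins.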
The network provider is the same company that refused to turn on IMAP on the Exchange servers, leaving our school without mail for three weeks. All the other schools had to upgrade Outlook in order to connect to the new Exchange server with MS MAPI settings.
I'm using Fedora 10 as a proxy server running Squid, but I recently noticed that some users use the ISP's DNS to bypass the proxy and surf the web freely. So my question is: is this a problem with Squid, or can I solve it with iptables?
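Squid can only control traffic that actually reaches it, so this is really a gateway/iptables problem. A minimal sketch (assuming the Fedora box is the gateway, the LAN is 192.168.0.0/24, and your own resolver sits at 192.168.0.1; all of these are assumptions):

Code:
# only let DNS queries go to the local resolver
iptables -A FORWARD -p udp --dport 53 ! -d 192.168.0.1 -j DROP
iptables -A FORWARD -p tcp --dport 53 ! -d 192.168.0.1 -j DROP
# block web traffic that doesn't pass through the proxy itself
iptables -A FORWARD -p tcp --dport 80 -s 192.168.0.0/24 -j DROP

Proxied traffic is unaffected because Squid's own requests originate on the gateway and never traverse the FORWARD chain.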
I've attached my squid.conf. I'm building a proxy server with a content filter for work using DansGuardian and Squid. Recently my boss asked me to allow a few IPs to bypass Squid authentication. I thought my config would do it, but it seems reluctant to let that computer through without authenticating.
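Without seeing the attachment, rule order is the usual culprit: Squid evaluates http_access lines top-down and stops at the first match, so the IP exemption has to come before the proxy_auth rule. A sketch with made-up addresses:

Code:
acl noauth_hosts src 192.168.1.50 192.168.1.51
acl authenticated proxy_auth REQUIRED

http_access allow noauth_hosts      # exempt IPs, checked first
http_access allow authenticated     # everyone else must authenticate
http_access deny all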
I have Squid and DansGuardian set up on a pass-through box with two NICs intercepting port 80 requests. Everything is working great. I need to know if there is a way to set up DansGuardian so that a user can enter a password on a blocked page to access it.
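DansGuardian doesn't prompt for a password on the block page as such, but its temporary-bypass feature comes close: with it enabled, the block page carries a "bypass" link that unblocks the page for a limited time. A sketch (option names are from the filter-group config, e.g. dansguardianf1.conf; the values are examples):

Code:
# allow a 5-minute bypass link on the block page
bypass = 300
# shared secret so bypass links can't be forged
bypasskey = 'changeme'

If you need true per-user passwords rather than a shared link, you would have to pair this with authentication in front of DansGuardian.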
I want to block some sites (social networking) through my newly configured Squid proxy, but it always allows them. How do I block those sites? My squid.conf is configured as follows (a blocking sketch follows the config):
#Recommended minimum configuration:
acl all src all
acl manager proto cache_object
acl localhost src 127.0.0.1/32
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32
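Those lines only define the standard ACLs; nothing in them denies the social sites, and any "http_access allow" rule that matches first will let the traffic through. A sketch (the domains are examples) that must sit above your allow rules:

Code:
acl social dstdomain .facebook.com .twitter.com .orkut.com
http_access deny social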
Every time I try to visit this website I get taken to my default localhost page. I've pinged the website and my terminal reads:
64 bytes from localhost (127.0.0.1): icmp_seq=1 ttl=64 time=0.019 ms
64 bytes from localhost (127.0.0.1): icmp_seq=2 ttl=64 time=0.027 ms
64 bytes from localhost (127.0.0.1): icmp_seq=3 ttl=64 time=0.030 ms
64 bytes from localhost (127.0.0.1): icmp_seq=4 ttl=64 time=0.026 ms
64 bytes from localhost (127.0.0.1): icmp_seq=5 ttl=64 time=0.028 ms
I've also looked the domain up at [URL] to make sure it wasn't something local... again it came back as 127.0.0.1. Is there any way I'd be able to access this site?
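Resolving to 127.0.0.1 usually means either a local override or a DNS record that genuinely points there. Two quick checks (substitute the real domain for the example):

Code:
# is the name overridden locally?
grep -i thesite.example.com /etc/hosts
# what does a public resolver say, versus your configured one?
dig @8.8.8.8 thesite.example.com +short
dig thesite.example.com +short

If /etc/hosts is clean and even 8.8.8.8 returns 127.0.0.1, the site's own DNS record is the problem and there is nothing to fix on your end.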
I was saddled with the job of maintaining my department's website (I work at a college). When I still used Windows I would access my department's folder on the web server with the following procedure (in Windows XP): go to the Start menu > click 'Run' > enter the folder address, and I would be prompted for my login and password. The folder, and in fact the whole server, would then be visible in the Windows file browser under the 'Networks' icon. I could then navigate to my department's folder and modify the files I needed to update the website.
How do I do this in openSUSE (using GNOME)? I tried going to 'Network' in Nautilus and then 'Open location', but no luck. I also tried 'Connect to server' in Nautilus (in the 'File' menu), but again no luck. I could stomach this if my college provided reliable access to computers on campus, but they don't, so I have to use my laptop, which is now Windows-free. My current job is only going to last a few more months, so having only recently got rid of Windows I am reluctant to re-install it just for this purpose (it's just about the only reason I currently have for using Windows; the other is downloading audiobooks from the public library, but that's another matter).
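The Windows 'Run' trick usually means the server is shared either over SMB (a \\server\share path) or WebDAV (an http:// path). In GNOME's 'Connect to Server' dialog or the Nautilus location bar (Ctrl+L), the equivalents would be, with hypothetical server names:

Code:
smb://webserver.college.edu/department      # if the Run path was \\webserver\department
davs://webserver.college.edu/department     # if it was an https:// WebDAV folder

Or from a terminal, mounting the SMB case directly:

Code:
sudo mount -t cifs //webserver.college.edu/department /mnt/dept \
     -o username=yourlogin

Which one applies depends on what the original Windows path looked like.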
I am using a Squid proxy server to block some websites in my organization. While testing I found that when I block Google, Gmail automatically gets blocked too. Is there any method by which I could block Google but keep Gmail accessible?
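Gmail gets caught because it lives under google.com (mail.google.com), so a deny on .google.com matches it. Since Squid stops at the first matching http_access rule, allow the mail hosts first (a sketch; the exact Gmail hostnames may vary):

Code:
acl gmail dstdomain mail.google.com .gmail.com
acl google dstdomain .google.com
http_access allow gmail     # must come before the deny
http_access deny google

In practice the Gmail login flow touches other Google hosts too, so expect to extend the allow list as you find them in access.log.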
My company is using Squid 2.6 for its Internet connection. Recently we implemented a webmail system, at this link: [URL]. Users on computers connected directly to the Internet can access it, but those going through the Squid proxy are unable to log in, although the login screen appears. Is this caused by the squid.conf settings or something else?
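If the login screen loads but the login itself fails, one common cause is the login form posting to HTTPS on a non-standard port, which Squid's default CONNECT rules refuse. It is worth checking what port the form submits to; if it isn't 443, a sketch of the fix (8443 is just an example):

Code:
acl SSL_ports port 443 8443          # add the webmail's HTTPS port
http_access deny CONNECT !SSL_ports

Comparing access.log entries for a failed login would confirm whether Squid is denying anything at all.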
I have seven departments in my office. I want to restrict websites for all the departments, but not the same websites for each, i.e. different sites for different departments. I have no idea how to approach this.
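The usual pattern is one src ACL per department (by subnet or IP list) paired with a per-department domain blacklist file. A sketch for two departments (subnets and file paths are made up; repeat for all seven):

Code:
acl dept1 src 10.1.1.0/24
acl dept2 src 10.1.2.0/24
acl dept1_blocked dstdomain "/etc/squid/dept1_blocked.txt"
acl dept2_blocked dstdomain "/etc/squid/dept2_blocked.txt"

http_access deny dept1 dept1_blocked
http_access deny dept2 dept2_blocked
# ...then your existing allow rules

Each *_blocked.txt file is just one domain per line, e.g. ".facebook.com".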
I would like to configure Squid and DansGuardian as a proxy with authentication via a web page. That means: a new notebook gets its network information (IP address etc.) via DHCP. When it then tries to open an Internet connection, the system should check whether the user is authenticated, and if not, it should (if the request comes from a browser) present an HTTP login screen. It should also be impossible to have an Internet connection without being logged in. The clients are Windows, Mac and Linux. My question: what programs/daemons are there for doing this kind of authentication? Would you choose another program instead of Squid?
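Squid itself can do a splash/login page with its session helper: an external ACL tracks which source IPs have an active session, and deny_info redirects everyone else to the login page. A sketch (the helper name and path vary between Squid versions, and the login URL is made up):

Code:
external_acl_type session ttl=60 %SRC /usr/lib/squid/squid_session -t 3600
acl logged_in external session
http_access deny !logged_in
deny_info http://portal.example.local/login logged_in

The login page then has to mark the session as started. For a full captive portal that also covers non-HTTP traffic, dedicated software such as ChilliSpot/CoovaChilli is the more common choice.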
I am using FC11. My problem is that every application needing Internet access is blocked by the company's proxy server. So the setup looks like this:
my_machine ---> firewall/proxy ---> outside world (Google etc.)
Now, in Firefox I have configured the proxy server, login details, and so on. But Eclipse, ssh, git and the rest need an Internet connection as well... Is there anything that sets all these details (proxy server, user name and password) system-wide, so that I don't have to configure each application separately?
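Most command-line tools (git, wget, curl, and Eclipse if told to use native proxy settings) honour the standard proxy environment variables, which you can set system-wide from a profile script. A sketch with placeholder values:

Code:
# /etc/profile.d/proxy.sh
export http_proxy="http://username:password@proxy.example.com:3128/"
export https_proxy="$http_proxy"
export ftp_proxy="$http_proxy"
export no_proxy="localhost,127.0.0.1"

ssh is the exception: it doesn't speak HTTP proxy natively and needs a ProxyCommand helper (e.g. corkscrew or nc) in ~/.ssh/config.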
I have a lifetime premium account at Megaupload.com. Recently I found that Megaupload has blocked all IPs in my area. I have sent emails to their tech support but received no reply! I have tried using proxies to download from Megaupload, but none of them works, presumably for bandwidth reasons, not to mention that a proxy would likely be slow anyway.
An application that supports the SOCKS 5 protocol can forward its network connection over ssh and dynamically forward to any host you specify, right? That means the Firefox web browser, which supports SOCKS 5, can be used to bypass proxy settings, can't it? So how can a sysadmin remotely detect that a particular user is bypassing the proxy via SOCKS 5?
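From the network side you can't see inside the ssh tunnel, but you can see the tunnel itself: a single long-lived outbound ssh connection carrying web-like traffic volumes. Logging new outbound ssh flows at the gateway gives you something to review (a sketch; assumes the gateway runs iptables):

Code:
# log every new outbound SSH connection from the LAN
iptables -A FORWARD -p tcp --dport 22 -m state --state NEW \
         -j LOG --log-prefix "SSH-OUT: "

Unusually high byte counts on port 22, or ssh sessions to non-work hosts, are the tell-tale; blocking outbound 22 except to known servers is the enforcement step.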
My router is crap. If I use DHCP it sets all the computers' DNS to itself, and all DNS requests get cached in the router. It even starts to lose some DNS requests if too many are made at once. On my Windows PCs this isn't a problem: I just set DNS to Google's public DNS servers (8.8.8.8 & 8.8.4.4) and bypass my router and ISP altogether. But when I go to Preferences > Network Connections I have to choose either DHCP or manual; there is no option for DHCP with custom DNS. I'm sure there must be a way to do this in the terminal; can someone tell me how? I'm using Ubuntu 10.10.
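From the terminal you can tell dhclient to override whatever DNS the router hands out. A sketch (the file is /etc/dhcp3/dhclient.conf on Ubuntu 10.10; on newer releases it is /etc/dhcp/dhclient.conf):

Code:
# /etc/dhcp3/dhclient.conf
prepend domain-name-servers 8.8.8.8, 8.8.4.4;

After a network restart, /etc/resolv.conf should list 8.8.8.8 first. (In the NetworkManager GUI the equivalent is the "Automatic (DHCP) addresses only" method, which leaves the DNS field editable.)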
I am using Squid to control access to the Internet, and everything works fine except for one user who uses a portal outside the organization to connect to the Internet. Whenever he tries to enter the portal by typing its (EXAMPLE) URL, he gets a permission-denied error from Squid.
How can I allow this portal in Squid, so that Squid will permit access to it?
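An explicit allow ACL for the portal's domain, placed above whatever deny rule is currently catching it, should do it (the domain is a placeholder for the real one):

Code:
acl portal dstdomain .portal-example.com
http_access allow portal

If the portal runs on a non-standard port, that port may also need adding to the Safe_ports ACL.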
My Squid server works fine on a Fedora 11 system. Is there any web-based interface that lets admins create, change and modify Squid users and view their logs?
I would like to ask for some help and a tutorial on setting up and configuring a Squid proxy server on my home PC server. I am a newbie to Linux and CentOS; I have already installed CentOS 5.5 on my system. Now I want to configure it as my Internet server: all four of my Windows systems, including the laptop, should connect through the CentOS PC with username authentication. I assign all IP addresses statically; see the attachment for my setup. [url] I just want to know what I need to change and add in my Squid config file, and how to configure CentOS properly with two LAN cards as an Internet server.
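For the username part, the standard starting point on CentOS 5 is Squid's NCSA basic-auth helper with an htpasswd file. A minimal sketch (the paths match the stock CentOS squid package; verify them on your box, e.g. /usr/lib64 on 64-bit):

Code:
# create the password file and a first user
htpasswd -c /etc/squid/passwd firstuser

# squid.conf
auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/passwd
auth_param basic realm Squid proxy
acl authenticated proxy_auth REQUIRED
http_access allow authenticated
http_access deny all

The two-NIC part is separate: one card faces the Internet, the other the LAN, and the Windows machines point their browsers at the LAN-side IP on port 3128.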
I have run into the following problem: there is a website I cannot access. When I try to connect to this site I get an error message. I have tried all the browsers installed on my system (SeaMonkey, Firefox, Konqueror) and all failed. I don't fully understand this, as I otherwise have no problem connecting to the Internet. Now I am wondering whether I somehow unintentionally blocked access to this particular site myself. Is that possible? Under Windows I have no problems at all.
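It is possible, and easy to check. Taking the browsers out of the picture narrows it down (substitute the real site for the example):

Code:
# does a plain command-line fetch fail the same way?
wget -O /dev/null http://blocked-site.example.com/
# any local firewall rule or hosts-file override in play?
sudo iptables -L -n
grep -i blocked-site /etc/hosts

If wget works, the block is at the browser level (e.g. a shared proxy setting); if it fails too, look at the firewall and hosts output.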
How would I set up a website that is only accessible locally? There's a router machine (server) that provides Internet access for a number of client machines. I need to set up a learning platform (Moodle) locally: the server machine runs the Moodle server (Apache), and students should have access to their accounts locally (no need for it to be accessible outside the LAN). First of all, what would be the best network configuration for this? And sorry for a dumb question, but could I just make up any domain name if everything stays within the LAN?
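Yes: as long as every client can resolve the name to the server's LAN address, the name can be anything (best to pick something that isn't a real Internet domain, e.g. a .lan suffix). A sketch, assuming the server's LAN IP is 192.168.1.1:

Code:
# on each client (or in the router's local DNS): /etc/hosts
192.168.1.1   moodle.school.lan

# on the server: Apache virtual host bound to the LAN address only
<VirtualHost 192.168.1.1:80>
    ServerName moodle.school.lan
    DocumentRoot /var/www/moodle
</VirtualHost>

Binding the vhost to the LAN IP (and not forwarding port 80 on the router) keeps it invisible from outside.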
I have an Ubuntu 10.10 server hosting one website, and I have messed up the NIC configuration. I have two Internet connections with static IPs and two network cards, as follows: eth0 119.155.152.140 (1st connection, static IP, no firewall) and eth1 203.135.30.240 (2nd connection, static IP, no firewall). When I restart networking it fails with an error.
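A common cause of this on Ubuntu is defining a gateway on both interfaces in /etc/network/interfaces: only one default gateway is allowed, and ifup errors out on the second. A sketch of a working layout (the netmasks and gateway are assumptions; use your providers' values):

Code:
# /etc/network/interfaces
auto eth0
iface eth0 inet static
    address 119.155.152.140
    netmask 255.255.255.0
    gateway 119.155.152.1      # default gateway on ONE interface only

auto eth1
iface eth1 inet static
    address 203.135.30.240
    netmask 255.255.255.0
    # no gateway line here

Routing replies out of the correct interface for both links needs policy routing (ip rule / ip route tables) on top of this.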
I am trying to create a shell script (CGI) that builds a web page with an embedded Flash MP3 player. At the same time I want the script to overwrite the file currently in the playlist with another file, which lets me change the song playing in the Flash player. The reason I am doing this is so that I can copy the song I want to play into this file at a low bitrate, reducing its size. The code I am using in my script is:
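(The poster's actual snippet was not included. A hypothetical sketch of the overwrite step, with every path and the bitrate being assumptions, might look like:)

Code:
#!/bin/sh
SONG="$1"                                   # path to the source mp3 (assumption)
TRACK=/var/www/html/current.mp3             # the one file the player's playlist points at (assumption)

# re-encode at 64 kbps to shrink the file, replacing the current track
lame --mp3input -b 64 "$SONG" "$TRACK"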
I have several web servers (say myserver1, myserver2 and myserver3) behind a firewall that the higher-ups run. They recently changed the firewall to block all ports except the ones they want open. www.myserver1.com is viewable from the outside world, but www.myserver2.com is not. I was wondering if there is a way to set things up so that people could go to [URL] and view the [URL] and [URL] website content.
myserver1 is running Solaris 7, which will (hopefully) be upgraded to Ubuntu 10.04. myserver2 is running Ubuntu 8.04. myserver3 is running Windows XP. I was wondering if this is possible with any OS, not just mine.
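This is the textbook case for a reverse proxy on the one reachable machine: myserver1 accepts the request and fetches the content from the internal servers. A sketch for Apache with mod_proxy enabled (the internal hostnames are assumptions):

Code:
# in www.myserver1.com's Apache config
ProxyRequests Off
ProxyPass        /server2/ http://myserver2.internal/
ProxyPassReverse /server2/ http://myserver2.internal/
ProxyPass        /server3/ http://myserver3.internal/
ProxyPassReverse /server3/ http://myserver3.internal/

The OS barely matters as long as it can run Apache (or nginx, or Squid in accelerator mode); the backends just need to be reachable from myserver1 inside the firewall.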
My hosting server is running Linux/Apache. It would be very nice to be able to link some files (preferably hard links, but symbolic links would also help), but I haven't a clue how to do so. I would be willing to write a server-side PHP script if that would do the trick.
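With shell access it's just the ln command; without it, PHP exposes the same calls (link() for hard links, symlink() for symbolic ones). A sketch with made-up paths:

Code:
# hard link (same filesystem only), then a symlink
ln    /home/user/public_html/files/original.pdf /home/user/public_html/files/copy.pdf
ln -s /home/user/public_html/files/original.pdf /home/user/public_html/docs/alias.pdf

One caveat: Apache only follows symlinks if the directory allows it (Options FollowSymLinks), and many shared hosts disable that.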