I'm running a Squid proxy on my Ubuntu server, and I must have messed up the Squid configuration. Users cannot access HTTPS pages. Can you tell me what to change in my squid.conf to fix this?
Here is my squid.conf (which is a friend's conf that I have changed for my needs...)
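For reference, a minimal sketch of the stanza that usually governs HTTPS through Squid; if the CONNECT method is denied to port 443, HTTPS breaks in exactly this way (these are standard Squid directives, but check them against your version):
Code:
# HTTPS goes through Squid as CONNECT tunnels to port 443,
# so these ACLs must exist and be allowed
acl SSL_ports port 443
acl Safe_ports port 443
acl CONNECT method CONNECT
# deny CONNECT to anything except the SSL ports; allow the rest
http_access deny CONNECT !SSL_ports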
This started yesterday. I haven't made any recent changes. I can't access any pages beginning with https. It's just my computer because my girlfriend's laptop doesn't have any issues. I'm using OpenDNS, but I have been for a long time and this is the first time this has ever happened. I'm not using a router, I connect straight to the modem, which I've already reset.
A freshly installed Ubuntu 10.10 amd64 on an Asus K50IN laptop refuses to show secure HTTPS pages, while posting this here is okay. It says the connection timed out. I tried several browsers: Firefox, Arora, and Epiphany.
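One quick check worth posting the results of (this is only my guess that it's an MTU problem, which often shows up as HTTPS-only timeouts):
Code:
# send a full-size, don't-fragment ping; if this fails while
# smaller sizes work, the path MTU is likely the culprit
ping -c 3 -M do -s 1472 www.google.com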
I connect to the internet at work through an authenticating proxy, and to avoid having to enter the proxy info into every app I use (e.g. Firefox, wget, KDE, etc.) I have set up Squid as a local transparent proxy which authenticates and routes all traffic to the work proxy. It has been working fine, but lately I haven't been able to connect to any HTTPS sites. I don't think I have changed the configuration, so perhaps it is the result of an upgrade, or something badly configured on my system from the start. I have tried connecting to HTTPS sites without Squid and iptables, and it works fine. My system is Arch Linux, and my squid.conf file is: Code:
acl all src all
acl manager proto cache_object
acl localhost src 127.0.0.1/32
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443
acl Safe_ports port 80   # http
acl Safe_ports port 21   # ftp
acl Safe_ports port 443  # https
[Code]....
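In case it helps whoever answers: a common pattern for this kind of setup (an assumption on my part, since the conf is truncated) is to redirect only plain port-80 traffic into the local Squid, because a plain http_port cannot unwrap intercepted port-443 traffic; HTTPS then has to reach the parent proxy another way. A sketch of the iptables side ("proxy" is assumed to be the user Squid runs as):
Code:
# redirect only plain HTTP into the local squid, skipping squid's
# own traffic so it doesn't loop back into itself
iptables -t nat -A OUTPUT -p tcp --dport 80 \
    -m owner ! --uid-owner proxy -j REDIRECT --to-ports 3128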
I managed to configure my W890i phone to access the internet through an Ubuntu-based computer. It's very easy to use the phone to give internet access to the computer, but the opposite is quite a bit trickier. For that I've done the following:
----On the phone---
-Set the USB network option to "through computer", so that the phone uses the computer's internet connection and not the opposite.
-Decide on and set the "Shared Network" parameters: user, password and workgroup.
-In "conectivity-> internet connection" set "allow local network" to "yes"
----On Ubuntu 10.04---
-Install samba, samba-client, smbfs, smbclient, firestarter and dhcp3-server
-Configure Samba (System -> Administration -> Shared Folders): same workgroup as on the phone, add a new user (the phone), and set a password for this new user. In my case the user was called "w890i" and the password given was the same.
-Once the phone is connected to the computer through USB (then select "phone mode"), a new connection appears in NetworkManager: usb0. The aim is to create a shared network that gives internet access to this device. Edit the IPv4 parameters of this new connection, set them to Manual, and give an IP address (192.168.0.1) and a subnet mask (255.255.255.0); the rest of the fields are left empty. Connect this network.
-Set firestarter to use dhcp3: sudo ln -sf /etc/init.d/dhcp3-server /etc/init.d/dhcpd
-Launch firestarter and follow the wizard. Set "allow internet shared connection", choose the device for the primary internet access, and then the device for the shared network (usb0). Then change the settings for firestarter: activate DHCP for local network, set IP to the one we gave before (192.168.0.1).
-Open the dhcp3-server config file (sudo gedit /etc/default/dhcp3-server) and set INTERFACES="usb0" (a dhcpd.conf sketch follows this list).
-Set the policies of Firestarter: in incoming connections, allow connections from the IP address given to the phone (192.168.0.1). Then add rules for the ports that need to be open for this connection. I opened HTTP, HTTPS, SMB, SMTP, POP3, IMAP, IMAPS, and DHCP for all the connections in the local network.
-Apply policies and start the firewall.
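For completeness, a minimal /etc/dhcp3/dhcpd.conf sketch for the usb0 subnet described above (the address range is my own assumption; adjust to taste):
Code:
# /etc/dhcp3/dhcpd.conf -- serve addresses on the usb0 subnet
subnet 192.168.0.0 netmask 255.255.255.0 {
    range 192.168.0.10 192.168.0.50;
    option routers 192.168.0.1;
    option domain-name-servers 192.168.0.1;
}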
------------
After all this, the phone can access the internet through the computer. Two problems appeared:
1. I couldn't access https sites, like webmail. The phone gave a "communication error". But then I tried Opera instead of the browser built into the phone's firmware, and I could finally get to https sites.
2. I couldn't retrieve mail, neither POP nor IMAP nor IMAPS. I thought it was a firmware problem again, and I tried several mobile phone email clients written in Java, but none of them worked.
So this is the problem at the moment. If I connect from the phone to the internet directly through 3G, the email clients work for all my accounts. I don't think it's a firewall problem, because the ports are open for this connection.
My problem is that Firefox only shows https:// pages and not regular pages like Google. I'm stumped; I have no clue why. I've tried other versions of Linux and different computers, and I still get the same result. If I turn quiet off, I see all the traffic in my terminal.
Up until yesterday I was able to access these devices through the web interfaces that they use. I'm running Ubuntu Karmic 9.10. I have 3 PCs running on my network, and 2 NAS devices:
Ubuntu (main computer, also has an XP partition) - static IP 192.168.1.30
Ubuntu netbook - DHCP IP
Windows XP (HTPC) - static IP 192.168.1.50
Linksys NSLU2 (was running Debian; the problem arose when trying a different configuration, now back to stock firmware) - static IP 192.168.1.100
Dlink DSM-G600 - static IP 192.168.1.120
I used to be able to access these just fine using my main Ubuntu setup. Now, it will no longer display the pages. Internet access is fine; I can even access my DSL router's internal config page. The netbook and the Windows HTPC can both log into these devices, as can the XP partition of the Ubuntu system. I have tried using a VM of XP within Ubuntu with a bridged network device, and I have the same symptoms - internet is fine, but I cannot access the local network web logins.
Access to fileshares among all machines remains unchanged. Another odd behavior is that I can SSH into the NSLU2 device from all the machines, but I get odd things from this computer - it will let me log in, asks for a username and pass, but if I run anything like mc or htop, it just blanks the terminal in an odd way. From the other computers the login and display are fine.
I can connect to anything locally and I can SSH in locally, but nothing works from outside my network (can't access pages in Apache, can't SSH, etc.). I've checked the logs I can think of, but what should I be looking for? The last thing I did today was add a firewall rule to log and drop traffic from an IP, and I saved my iptables to a file in /etc (this was at 10:45am). The last time I successfully logged in remotely was 1:03pm. I connected multiple times in between. I've rebooted remotely once or twice. That's it. I've already flushed iptables.
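If it helps, a couple of checks I'd post the output of (standard tools, nothing specific to this box):
Code:
# show every rule with packet counters; a lingering DROP would show hits
sudo iptables -L -n -v
# confirm sshd and apache are still listening on external interfaces
sudo netstat -plnt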
I'm trying to set up an Apache web server on my computer in order to practice HTML5/CSS3 for an upcoming competition I'm in. I'm able to access my site from inside my network, but I cannot from outside. I've had several people try, and they all report that the server just times out. I'm running Ubuntu 10.04 and Apache 2.2.17.
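A sketch of the checks that usually narrow this down, assuming the typical home setup where the router or modem must forward port 80 (YOUR_PUBLIC_IP is a placeholder):
Code:
# is apache listening on all interfaces (0.0.0.0:80) or only 127.0.0.1?
sudo netstat -plnt | grep ':80'
# from outside the network (e.g. a phone on 3G), test the public address
curl -v http://YOUR_PUBLIC_IP/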
I want the Squid error pages to have the company logo. I edited the page called ERR_ACCESS_DENIED, but the image doesn't show (the image is in the same directory as ERR_ACCESS_DENIED).
I tried png and gif formats to no avail.
Also, I found this, but I don't get what the writer is saying, so I can't do it.
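For what it's worth, the usual gotcha is that the error page is HTML rendered by the client's browser, so the <img> must point at a URL the client can fetch, not at a file sitting next to ERR_ACCESS_DENIED on the proxy. A sketch (the host and path are assumptions; substitute a web server your clients can reach):
Code:
<!-- inside ERR_ACCESS_DENIED; the browser fetches this URL itself -->
<img src="http://intranet.example.com/images/company-logo.png" alt="Company logo">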
I am using Squid to control access to the internet. All is working fine except for one user, who uses an outside organization's portal to connect to the internet. Whenever he tries to enter the portal by typing the (EXAMPLE) URL, a permission-denied error from Squid occurs.
How can I allow this portal in Squid, so that Squid will allow access to it?
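A minimal sketch of the usual whitelist pattern, with portal.example.com standing in for the elided portal domain (order matters: the allow must come before your deny rules):
Code:
# portal.example.com is a placeholder for the real portal domain
acl portal dstdomain .portal.example.com
http_access allow portal
# ... existing deny rules stay below this line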
I am experiencing a problem which I lack the knowledge to crack at this stage. I run a home server (2.6.28-19-server #64-Ubuntu SMP) which firewalls and proxies for my LAN (including 2 Linux boxes, a Vista box and a Win7 box). Most of the time this works well. Very occasionally we will run into a website which will not display at all on the LAN boxes. Examples: for a long time I could hit the forums.egosoft.com site but not the egosoft.com site; now my wife can't hit www.boden.co.uk. I'm pretty sure that the problem is in the Squid3 config. Isolation testing: run the Vista box directly into the ADSL modem - I get the website. Hit the website using Lynx through SSH on the server box - I get the website. Try to get the site in a browser (tested with IE8 and Chrome) on any of the Win boxes through the proxy - no joy, DNS error. Firefox on the Ubuntu desktop from the Linux client boxes gives an error message about an unrecognised form of compression. I can only assume that I've misconfigured the Squid3 proxy in some way - it works for 99% of sites but fails utterly on others.
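A diagnostic sketch that may pin down whether the proxy is mangling the reply (the proxy address and port are assumptions; adjust to the real ones):
Code:
# fetch the problem site's headers directly from the server box
curl -sI http://www.boden.co.uk | head
# fetch them again through the local squid3 instance
http_proxy=http://127.0.0.1:3128 curl -sI http://www.boden.co.uk | head
# a differing Content-Encoding header would point at the proxy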
Suppose I install a Squid proxy server: can I view the web pages visited by a particular user/machine in a particular session? I think we can analyse the information in Squid's log files, but can I view the (static) page itself?
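The visit list, at least, comes straight out of the access log; the page body itself is only on disk if it happened to be cached. A sketch for pulling one client's history (the IP is an assumption, and this assumes Squid's default native log format):
Code:
# every URL requested by 192.168.1.55, with timestamps
# (field 3 is the client address, field 7 the URL)
awk '$3 == "192.168.1.55" {print $1, $7}' /var/log/squid/access.log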
We have a simple office network set up that we also use to connect to the internet; however, of late the number of users has increased, thus slowing internet access. A bandwidth upgrade is not an option, so I have to do bandwidth shaping on our Linux router. The question is: how do I set the Squid configs to allow a certain IP range a certain percentage of bandwidth, e.g. 60%, and further divide the rest? Alternatively, how can I allow certain IPs to have higher bandwidth access?
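Squid's delay pools cover this; a minimal sketch, assuming a 1 Mbit/s line where one address range gets roughly 60% (rates are in bytes per second, and all numbers here are illustrative):
Code:
# two class-1 pools: one aggregate bucket per group
acl fast_users src 192.168.1.0/25
acl slow_users src 192.168.1.128/25
delay_pools 2
delay_class 1 1
delay_class 2 1
# pool 1: ~75 KB/s aggregate (about 60% of a 1 Mbit/s line)
delay_access 1 allow fast_users
delay_parameters 1 75000/75000
# pool 2: ~50 KB/s aggregate for everyone else
delay_access 2 allow slow_users
delay_parameters 2 50000/50000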
My problem is: I want to stop Gmail access without blocking HTTPS. In my Squid proxy, normal http://gmail.com is not accessible, but Gmail recently started an HTTPS service through which users can still get access to Gmail. I don't want to stop HTTPS, because it's being used by my company's Google Mail program.
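A sketch of the usual approach: HTTPS to Gmail arrives as a CONNECT request naming the Gmail hostnames, so deny those by destination domain while leaving CONNECT alone otherwise (the domain list is from memory; verify what your clients actually resolve):
Code:
acl gmail dstdomain .gmail.com .googlemail.com mail.google.com
# block the CONNECT tunnel to gmail hosts without touching other https
http_access deny CONNECT gmail
# and block plain http to the same hosts
http_access deny gmail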
I am running Ubuntu 10.04 with Apache2, SVN and SSL. Both HTTP and HTTPS are working correctly with my website, although the SVN setup I have is not working. This configuration gives me a 403 error.
Code:
<Location /svn>
    DAV svn
    SVNParentPath /srv/svn/repos
    SVNListParentPath On
[Code]...
This issue is driving me up the walls. If there is any additional information, I will be more than happy to provide it.
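Since the posted block is truncated, here is a complete sketch of a working mod_dav_svn Location for comparison (the auth file path is an assumption); a 403 often means the block lacks any auth/Require directives, or the repository isn't readable by the Apache user:
Code:
<Location /svn>
    DAV svn
    SVNParentPath /srv/svn/repos
    SVNListParentPath On
    AuthType Basic
    AuthName "Subversion repositories"
    AuthUserFile /etc/apache2/dav_svn.passwd
    Require valid-user
</Location>
# the repos must also be readable/writable by the apache user:
#   sudo chown -R www-data:www-data /srv/svn/repos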
I have an old FC2 box running Squid version 2.5. It has been running since 2003 so I am in the process of replacing it. I have a new machine with FC11, iptables, and Squid 3.0 installed.
On the old machine I use iptables to intercept Port 80 traffic and send it to Squid. By default I block all internet access and allow only sites that are in an Allowed_Sites.txt file. Within Squid I also have statements to allow certain users to bypass Squid based on their IP address.
I have set up the same thing on the new box. I have iptables intercepting the Port 80 traffic and sending it to Squid. That is working because if I remove the redirect statement from iptables all internet access is blocked.
The problem I am having is that Squid is not blocking any websites. It acts like the ACL is set to http_access allow all. I have worked on this for several hours and am stumped.
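For comparison, a minimal Squid 3.0 sketch of the whitelist-only setup described above (the file path and bypass IPs are assumptions; rules are evaluated top-down, so a stray http_access allow above these would explain the symptom):
Code:
# interception also needs: http_port 3128 transparent
# users allowed to bypass filtering entirely
acl bypass_users src 192.168.1.10 192.168.1.11
http_access allow bypass_users
# everyone else: only domains listed in the file, one per line
acl allowed_sites dstdomain "/etc/squid/Allowed_Sites.txt"
http_access allow allowed_sites
http_access deny all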
I have set up certain portions of my website to be forced to https://. How do I force non-https:// for the rest? I know this sounds confusing, so let me give you an example.
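A sketch with mod_rewrite, assuming Apache and that /secure stands in for the part that must stay on HTTPS:
Code:
RewriteEngine On
# anything outside /secure that arrives over https gets bounced to http
RewriteCond %{HTTPS} on
RewriteCond %{REQUEST_URI} !^/secure
RewriteRule ^ http://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]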
On the command line, I get the man page for perl, which lists all of the sections into which the perl man pages have been split. How do I access one of these sections? For instance, the first section is perlintro, but if I type
Code: man perlintro
I get an error saying there is no such manual entry. How do I access these parts of the manual?
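For anyone else hitting this, a couple of standard checks (assuming man-db; on Debian/Ubuntu the perl sub-pages ship separately):
Code:
# list which perl* pages are actually installed
man -k perl | less
# on Debian/Ubuntu, perlintro/perlre/... come from the perl-doc package
sudo apt-get install perl-doc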
I'm about to create a CSR and was reading this page in the Ubuntu docs: [URL] A couple of things:
* There's no date on the article. The documentation needs dates because this information gets out of date! Check the MySQL docs, for instance - they are organized by version.
* The instructions for generating a cert only specify 2048 bits. I believe that's kind of out of date? The Verisign site has big red warnings saying you need 2048 bits if you want your cert to last past 2013 - and that article is 4 years old!
* The instructions are confusing when discussing the passphrase. We enter a passphrase only to remove it immediately. We need some clarity here. Why do this?
How do I find out the current best practices for generating an HTTPS cert for Apache and/or mail access?
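On the passphrase question, a sketch of the flow the docs describe (stock OpenSSL commands): the key is created encrypted, then the passphrase is stripped so Apache can restart unattended; otherwise it prompts for the passphrase on every boot.
Code:
# generate an encrypted RSA key (pick 2048+ bits)
openssl genrsa -des3 -out server.key 2048
# strip the passphrase so apache can start without prompting
openssl rsa -in server.key -out server.key.insecure
mv server.key server.key.secure
mv server.key.insecure server.key
# create the CSR from the unencrypted key
openssl req -new -key server.key -out server.csr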
I'm using Squid for internet sharing, and I'm facing a problem while accessing public FTP. There is no problem accessing local FTP, but if I try to access a public FTP site like ftp://125.125.20.2, I get an error:
' An FTP authentication failure occurred while trying to retrieve the URL: ftp://125.125.20.3/
Squid sent the following FTP command:
PASS <yourpassword>
and then received this reply: User anonymous cannot log in. Your cache administrator is root.'
If I try to access a local FTP site (ftp://10.185.200.12), I get no error.
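A sketch of the squid.conf knobs that usually matter here (these are standard Squid directives): many public servers reject anonymous logins whose password doesn't look like an email address, which is what Squid sends by default.
Code:
# password squid offers for anonymous FTP; some servers insist on
# a syntactically valid email address here
ftp_user anonymous@example.com
# public servers behind NAT usually need passive mode
ftp_passive on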