Security :: Using Squid To Restrict Access During Certain Hours But Only To Certain Websites?
Jan 21, 2011
I have been trying to get Squid to work so that I can restrict access to a particular web site during certain hours every night. I can't seem to get it working, however. I am still able to access the site. The following are the relevant lines from my squid.conf file:
acl restricted-domain dstdomain "/etc/squid/denied_domains.acl"
acl test time 19:00-20:00
acl bedtime time 22:00-23:59
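For reference, acl lines only define objects; nothing is blocked until they are used in an http_access rule, and Squid acts on the first http_access line that matches. A sketch combining the ACLs above (assuming the goal is to deny the listed domains at bedtime):

Code:
# deny the listed domains during the bedtime window; this must come before
# any broader "http_access allow" line that matches the same clients
http_access deny restricted-domain bedtime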
I have configured Squid to allow only limited access to websites, but some websites were still accessible via HTTPS, so I removed transparent mode from Squid. Now what changes do I have to make in iptables?
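For context, transparent mode normally depends on a NAT rule that redirects web traffic into Squid; if such a rule is still in place it would need to go. A sketch of removing it (the interface and Squid port are assumptions):

Code:
# delete the transparent-proxy redirect, assuming LAN interface eth0 and Squid on port 3128
iptables -t nat -D PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 3128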
My team works on a network where their terminals run Windows and my server is Linux (CentOS); it's a simple network without a domain. My users work on files on the server. Can I demand a username and password when they try to change the shared files on the server? And can I monitor which user changed a file? I also have a CSS developer who is only allowed to create and modify CSS files; can I do this?
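Assuming the files are shared over Samba, a sketch of a share that requires a valid account and logs who changed what (share name, paths, group, and username are placeholders), with a POSIX ACL confining the CSS developer to the stylesheet directory:

Code:
# smb.conf: authenticated access plus an audit trail of writes
[projects]
   path = /srv/projects
   valid users = @staff
   writable = yes
   vfs objects = full_audit
   full_audit:prefix = %u|%I
   full_audit:success = write rename unlink

# shell: user "cssdev" may modify only the css directory
setfacl -R -m u:cssdev:rwX /srv/projects/site/css
setfacl -R -m u:cssdev:rX  /srv/projects/site

Restricting by file extension rather than by directory is much harder to enforce reliably, so a dedicated css directory is the practical route.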
I heard we can set security in /etc/hosts.allow and /etc/hosts.deny on a per-user basis as well, with something like user@domain. If so, how can I restrict a user from accessing a particular service, by his/her username, on a particular host via /etc/hosts.allow or /etc/hosts.deny?
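TCP wrappers do accept user@host client patterns, but the username comes from an ident (RFC 931) lookup against the client machine, so the client must run identd and the answer is only as trustworthy as that daemon. A sketch (the names are placeholders):

Code:
# /etc/hosts.deny: refuse ssh logins by user "bob" coming from client.example.com
sshd : bob@client.example.com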
I'm running Natty and have made two logins on the system: one for myself and family, and one for the kids (teens, 14-15 yr) to play in without Internet access, set up via the Admin "Users and Groups" tool. I have hidden the Internet software icons on their screen, amongst others I don't want them to see on the menus. On our screen I use a Firefox addon called "Web Of Trust" that can be configured easily for the kids, and another addon called 'Blocksite' that I can use selectively for them, myself, etc.
I have found out that they have still been able to get onto the net somehow under their login. I will have to observe again!! In the user settings for the kids, the tick boxes for 'Internet' and 'use modem' access are un-ticked, so I presumed that would be enough! Not so!!
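Those tick boxes typically just toggle group memberships; they don't actually firewall anything. One way to genuinely cut off a specific account is an iptables rule that matches the owner of outgoing packets (the username below is a placeholder):

Code:
# reject all outbound traffic generated by processes running as user "kids"
iptables -A OUTPUT -m owner --uid-owner kids -j REJECT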
I'm trying to tighten up my network a bit. I've given my DHCP server a list of static MAC addresses and IPs for computers I know, and a very short range of DHCP addresses that are redirected to kittenwar. My dilemma is that if someone has my wireless network password, or an Ethernet cable, they could set the IP address manually and gain access. How can I deny them this pleasure? I'm running dhcpd3 and iptables on a Debian Lenny Intel 2.4 box; dd-wrt is running on a Linksys WRT54G and is handling the wireless security.
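One approach on the router is to forward traffic only for known MAC/IP pairs, so a manually configured address on an unknown machine leads nowhere (the MAC, IP, and interface below are placeholders; repeat the ACCEPT line per known host):

Code:
# allow a known machine only when its MAC is paired with its assigned IP
iptables -A FORWARD -i eth1 -m mac --mac-source 00:11:22:33:44:55 -s 192.168.1.10 -j ACCEPT
# drop everything else arriving from the LAN
iptables -A FORWARD -i eth1 -j DROP

Note that MAC addresses can themselves be spoofed, so this raises the bar rather than closing the door completely.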
I tried changing the SFTP server port, but it's not working. Besides, how can I restrict users to particular IPs? E.g., user a can ssh only from 192.168.*.*, and user b can sftp only from 200.*.*.
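sshd can express exactly that with AllowUsers user@pattern entries in sshd_config (using the patterns from the question):

Code:
# /etc/ssh/sshd_config -- per-user source restrictions
# caution: once AllowUsers is set, users not listed here cannot log in at all
AllowUsers a@192.168.*.* b@200.*.*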
I'm using Ubuntu x64 (not sure which version, but I don't think it matters) and I'm concerned about security with PHP. I remember using lighttpd with some mystic configuration, and the security was good enough for me: if one website got hacked, the others were still safe (kinda). Now with apache2, even if I enable safe mode I'm still able to go outside the web directory, and in fact I can go really far, until the user/group matches. I tested the system with r57shell and I was able to mess up other websites. Is there a way to disallow access to other websites?
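One common mitigation with mod_php is a per-vhost open_basedir, which confines PHP's file access to that site's tree. It is not a complete sandbox, but it stops the simple directory-walking case (paths are placeholders):

Code:
# inside each <VirtualHost> block
php_admin_value open_basedir /var/www/site1:/tmp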
I'd like to be able to limit access to a particular website based on the time of day, and I would also like to be able to password-protect this if possible. So for instance, from 7am until 10pm daily I can access URL... but after 10pm it redirects to 127.0.0.1 or something. And can this configuration be protected by only allowing a certain user (other than root) to change the config?
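If the machine's web traffic already passes through Squid, a time ACL covers the schedule part; a sketch (the domain is a placeholder for the elided URL):

Code:
acl allowedhours time 07:00-22:00
acl timewaster dstdomain .example.com
http_access deny timewaster !allowedhours

For the protection part, the config file itself can simply be owned by the trusted non-root user and locked down with chown and chmod so other ordinary users cannot edit it.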
I don't know if this is the right place to ask, but I must ask some questions. Here's my problem: I'm a student in high school, and here we use the Linux (Ubuntu) OS. Every classroom has about 30 PCs connected to the main computer (the teacher's one). Three days ago we were forbidden access to some websites; it says "This domain is Blocked". By the way, the Linux version installed is 7.04 (Feisty Fawn). I tried disabling cookies, which did not work, and I also tried to whitelist some websites, which also didn't work out.
I am trying to configure my Linux router to restrict Internet access for one computer on my LAN. It needs to be restrictive based on the time of day and the days of the week. I am using the MAC address of the computer to single out the one computer that needs to be blocked. However, this is my first attempt at making any rules with iptables, and I am not sure if I am doing this right. If someone could take a look at this, I would greatly appreciate it. This is what I have done so far.
Here is my thinking: create a new target; check the MAC address, and if it is NOT the offending computer, return to the default chain. If it is the offending computer, check that we are between the allowed hours and dates, and ACCEPT. If we are not within the time/date range, then drop the packet.
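Rules matching that description might look something like this sketch (the MAC address, hours, and weekdays are placeholders, not the actual values):

Code:
# create the checking chain
iptables -N blocked_access
# anything NOT from the offending machine goes straight back to the calling chain
iptables -A blocked_access -m mac ! --mac-source 00:11:22:33:44:55 -j RETURN
# the offending machine is accepted only inside the permitted window
# (note: the xt_time match interprets times as UTC by default)
iptables -A blocked_access -m time --timestart 07:00 --timestop 21:00 --weekdays Mon,Tue,Wed,Thu,Fri -j ACCEPT
# outside the window, drop
iptables -A blocked_access -j DROP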
Here I am trying to route all packets regardless of the computer on the LAN into the blocked_access chain for checking.
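Something like this would do that routing step:

Code:
# send all forwarded traffic through the checking chain
iptables -A FORWARD -j blocked_access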
Is it a good idea to route all traffic through the blocked_access chain? I do run other servers that are accessible from the Internet, so I am not sure how this setup will affect that. I also use shorewall on the router to setup iptables for me. How would I integrate this with shorewall?
I am using Squid to block access when he is using the web browser. However, he is still able to play games (World of Warcraft) and the like.
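Squid only sees traffic that actually goes through the proxy; game clients talk to their servers directly, so they have to be blocked at the firewall instead. A sketch, assuming the classic WoW port of TCP 3724 and a placeholder MAC address:

Code:
# drop World of Warcraft traffic from the one machine (TCP 3724 was the classic WoW port)
iptables -A FORWARD -m mac --mac-source 00:11:22:33:44:55 -p tcp --dport 3724 -j DROP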
I am using Debian sid, iptables (1.4.6), shorewall (4.4.6), kernel 2.6.32-trunk-686.
I am not sure whether it's possible or not. We run a Squid proxy server for our office, and we restrict users' Internet access using ACLs. There are some who do the following:
1. Create their own proxy on a box that has Internet access.
2. Other users then use that box as a proxy to access the Internet.
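The underlying hole is that some machines can reach the web directly, so any of them can relay for the others. One fix at the gateway is to let only the Squid box originate web traffic (addresses are placeholders); if the privileged box gets out through Squid itself, its ACL entry would also need tightening:

Code:
# only the Squid server (192.168.0.2) may originate web traffic
iptables -A FORWARD -s 192.168.0.2 -p tcp -m multiport --dports 80,443 -j ACCEPT
iptables -A FORWARD -p tcp -m multiport --dports 80,443 -j DROP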
I have seven departments in my office. I want to restrict websites for all the departments, but not the same websites for every department, i.e. different sites for different departments. I have no idea how to approach this.
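With Squid this is usually one src ACL per department plus a per-department dstdomain file; a sketch for two of the seven (subnets and file paths are placeholders):

Code:
acl sales src 192.168.1.0/24
acl sales_blocked dstdomain "/etc/squid/sales_blocked.acl"
http_access deny sales sales_blocked

acl hr src 192.168.2.0/24
acl hr_blocked dstdomain "/etc/squid/hr_blocked.acl"
http_access deny hr hr_blocked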
I have configured Squid for my LAN. My LAN has three Red Hat 5.3 web servers. Now, using the proxy server, I wish to give external clients access to my web servers while restricting local clients from accessing the WAN through port 80.
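Publishing internal web servers through Squid is normally done with an accelerator (reverse-proxy) setup; a sketch for one of the three servers (hostname and address are placeholders):

Code:
# listen for outside clients and pass requests to the internal origin server
http_port 80 accel defaultsite=www.example.com
cache_peer 192.168.1.10 parent 80 0 no-query originserver name=web1
acl our_site dstdomain www.example.com
http_access allow our_site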
I have been looking up ways to block websites in Linux, but most of the free tools only block the sites the software makers want to block (or you can't block just the sites you enter without also blocking the sites the tool wants to). I need to know how to block only the websites I enter, with Squid, using Webmin.
I want to restrict some sites (social networking) through my newly configured Squid proxy, but it always allows those sites. How do I block them? My squid.conf file is configured as follows:
#Recommended minimum configuration:
acl all src all
acl manager proto cache_object
acl localhost src 127.0.0.1/32
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32
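The usual cause of "it always allows" is rule ordering: Squid acts on the first http_access line that matches, so the deny has to come before the general allow. A sketch (domains are placeholders):

Code:
acl social dstdomain .facebook.com .twitter.com
# this deny must appear before any http_access allow that matches the same users
http_access deny social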
I am using Squid to control access to the Internet. All is working fine except for one user, who uses an outside organization's portal to connect to the Internet. Whenever he tries to enter the portal by typing the (EXAMPLE) URL, a permission-denied error comes from Squid.
How can I allow this portal in Squid, so that Squid will permit access to it?
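An allow rule for that one destination, placed before the deny rules, should do it; a sketch (the domain below stands in for the elided portal URL):

Code:
acl portal dstdomain .portal.example.com
# must precede the http_access deny line that currently catches this user
http_access allow portal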
I'm a terrible procrastinator; it's awe-inspiringly annoying and stressful. This, combined with being an information-holic, makes the Internet fairly lethal to me. I risk failing my college course because of it, so trust me when I say I'm deadly serious about this.
However, I think you guys may be able to help out, and maybe this will also help some people here with similar problems:
Because so much of my time is taken up with the Interwebz, I thought to carefully restrict my Internet use. It's not perfect, but it's part of a solution.
To date: I have Firefox with the ProCon extension, which uses a whitelist of websites I can access. The extension cannot be uninstalled/disabled, and I use a long hex password split into 3 parts, two of which my friends hold (so I have to ask my friends for the password parts in order to update the whitelist, hence making it socially awkward to fritter away time online).
So far, it has worked a treat and I'm really pleased with it.
However, this is the problem:
I need to restrict web access so *only* Firefox can access the web. That way I cannot use Chrome/Opera, or even (shudder) use wine to run Internet Exploder.
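iptables can't match by application, but it can match by user, so a common trick is to run Firefox under a dedicated account and let only that account's packets out (the username is a placeholder):

Code:
# create the dedicated account once:  sudo adduser webonly
# allow web traffic only for that account; leave DNS alone so name lookups still work
iptables -A OUTPUT -p tcp -m multiport --dports 80,443 -m owner --uid-owner webonly -j ACCEPT
iptables -A OUTPUT -p tcp -m multiport --dports 80,443 -j REJECT
# then start the browser as that user:  sudo -u webonly firefox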
I typed this into the command line:

Code:
sudo iptables -A INPUT -p tcp --dport 80 -m time --timestart 12:00:00 --timestop 23:59:59 --days Sat,Sun -j ACCEPT

I get this error:

Code:
iptables v1.4.4: unknown option '--days'

How do I do something similar, in which I allow the Internet to start at 12 o'clock on Saturdays and Sundays?
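In the 1.4.x time match, the weekday option is spelled --weekdays rather than --days (and note that the kernel's xt_time match interprets the times as UTC by default):

Code:
sudo iptables -A INPUT -p tcp --dport 80 -m time --timestart 12:00 --timestop 23:59 --weekdays Sat,Sun -j ACCEPT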
I've installed Ubuntu Desktop Edition 9 and I want to add a user account that would be very restricted. I would only want them to access the Internet and run several programs. I do not want them to have access to the desktop or anything under Preferences, Administration, etc... Is this possible?
I would like to allow a user to log in through SSH, but with different permissions depending on the source IP address.
For example, the user "tester" logs in to SSH from 192.168.1.1, and someone else logs in with the same login id "tester" but from a different IP, 192.168.1.2.
How do I restrict 192.168.1.2 to only viewing the content of the home directory, while giving 192.168.1.1 full access?
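sshd_config Match blocks can key on both user and source address; a sketch that leaves 192.168.1.1 untouched but forces a read-only, chrooted SFTP session from 192.168.1.2 (the -R read-only flag needs a reasonably recent OpenSSH):

Code:
# /etc/ssh/sshd_config
Match User tester Address 192.168.1.2
    # ChrootDirectory must be root-owned and not group/world-writable
    ChrootDirectory /home/tester
    ForceCommand internal-sftp -R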
Here's the beginning of the issue: I'm running Fedora 12 with httpd and sshd. I want to create a user with an scponly shell for SFTP access, but this user should ONLY be able to view /the/http/base/dir and its subdirectories. The user should not be able to see or get into directories above the httpd base. Someone mentioned creating a chroot jail for sshd and binding the httpd base to that directory, but this seems like more work than necessary for what I want. Also mentioned was creating a user, say user1, with an SELinux user setting of staff_r. I have read the articles, and creating a staff_r user isn't overly difficult, but how would I make it so staff_r is restricted to where I want? If I'm not mistaken, that would require changing the context of /the/httpd/base/dir?
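For what it's worth, the chroot approach is smaller than it sounds if OpenSSH's built-in internal-sftp is used instead of scponly; a sketch (the username is a placeholder, the path as in the question):

Code:
# /etc/ssh/sshd_config
Match User user1
    # the chroot target must be owned by root and not group/world-writable
    ChrootDirectory /the/http/base/dir
    ForceCommand internal-sftp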
I'm running Ubuntu Server 10.04 32-bit. I'm looking to find out if there is any way I can lock down Ubuntu so that remote access, whether SSH, FTP, Apache, etc., can only be reached from a certain IP range or a certain set of IPs. Essentially, say the server IP is 192.168.1.32, and I want the IP addresses 192.168.1.33-50 to be able to access the server, but no other IPs. I am in a switched environment, routers are not allowed to be placed on the network, and I do not have access to a DNS or DHCP server. Is there a way to do this on the server via a configuration of some sort?
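iptables on the server itself can enforce this with the iprange match (the ports below are the usual FTP/SSH/HTTP ones; adjust as needed):

Code:
# accept the listed services only from the allowed range, drop everyone else
iptables -A INPUT -p tcp -m multiport --dports 21,22,80 -m iprange --src-range 192.168.1.33-192.168.1.50 -j ACCEPT
iptables -A INPUT -p tcp -m multiport --dports 21,22,80 -j DROP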
If there is a general NFS share on the LAN, and for example this share has three files - a, b, c - is there any way to restrict file access for the root user of a particular host (falcon) in the same LAN environment, while the normal users from the same host (falcon) are still able to access the NFS share and files a, b, c?
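This is exactly what the root_squash export option does (it is the default on Linux): root on the client is mapped to an unprivileged user, while ordinary users are unaffected. A sketch of /etc/exports (the path is a placeholder):

Code:
# falcon mounts read-write, but its root is squashed to nobody
/srv/share  falcon(rw,root_squash,sync)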
I have a question about Samba and would like to ask you for the solution. Is there any way we can restrict SMB share access to a particular domain name, say allowing access for "example.com" domain users only?
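If "domain" here means the clients' DNS domain, smb.conf's hosts allow takes a leading-dot suffix; a sketch on the share (if it instead means accounts in a Windows domain, valid users with a domain group would be the usual route):

Code:
[share]
   path = /srv/share
   hosts allow = .example.com
   hosts deny = ALL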