General :: How To Automatically Delete Squid Cache
Oct 23, 2010
I'm using Squid as a proxy in our organization. My problem is that whenever the Squid cache folder exceeds 3 GB, our proxy server starts to hang. Is there a way to write a script that will automatically delete the Squid cache when it exceeds the 3 GB limit?
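A minimal cron-able sketch of that idea, assuming the default cache path /var/spool/squid (adjust to match the cache_dir line in squid.conf). Note that lowering the cache_dir size limit in squid.conf is usually the cleaner fix than wiping the cache:

```shell
#!/bin/sh
# clean_squid_cache.sh - wipe the Squid cache when it grows past a limit.
# Assumption: the cache lives in /var/spool/squid.
CACHE_DIR=/var/spool/squid
LIMIT_MB=3072   # 3 GB

# Size of a directory in megabytes.
cache_size_mb() {
    du -sm "$1" | cut -f1
}

if [ -d "$CACHE_DIR" ] && [ "$(cache_size_mb "$CACHE_DIR")" -gt "$LIMIT_MB" ]; then
    squid -k shutdown          # stop Squid cleanly
    sleep 5
    rm -rf "$CACHE_DIR"/*      # delete the cached objects
    squid -z                   # recreate the swap directories
    squid                      # start Squid again
fi
```

Run it from root's crontab, e.g. `0 * * * * /usr/local/sbin/clean_squid_cache.sh` to check hourly.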
I installed Squid on my Ubuntu Server 10.10 and it works fine, but I want to know how to make it cache all file types, such as .exe, .mp3, .avi, etc. The other thing I want to know is how to make my clients fetch files from the cache at full speed, since I am using a MikroTik system to provide PPPoE for clients and I pair it with my Ubuntu Squid box.
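The knobs that control this in squid.conf are maximum_object_size (large downloads are skipped if it is too small) and refresh_pattern (how long matching objects stay fresh). A sketch; the size and the refresh times below are illustrative assumptions, not recommended values:

```
# squid.conf fragment (values are illustrative)
maximum_object_size 200 MB
refresh_pattern -i \.(exe|mp3|avi|zip|rar)$  1440  90%  43200
```

Full-speed delivery to clients is mostly a question of the MikroTik queue setup, not Squid: cache hits are served at LAN speed unless a queue on the PPPoE side throttles them.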
I recently finished installing Squid 3.1.9 and I think the installation went correctly, needing only minor configuration changes. It accepts requests on port 3128 and created some numerical or binary (I guess) files in /usr/local/squid/var/logs. My problem is: how can I fully verify that the cache is really storing Internet files? Some forum replies suggested trying the command: cat /usr/local/squid/var/logs/cache.log. I tried it and it gives me this output:
2010/11/11 18:04:49| store_swap_size = 0
2010/11/11 18:04:50| storeLateRelease: released 0 objects
2010/11/11 18:05:41| Squid is already running! Process ID 5458
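`store_swap_size = 0` in that output means nothing has been written to the disk cache yet; the counter grows as objects are stored, so watching it in cache.log is one simple verification (another, if squidclient is installed, is `squidclient mgr:info`). A small sketch using the log path from the post above:

```shell
#!/bin/sh
# Print the most recent store_swap_size value from a Squid cache.log.
# A value above 0 means objects have been swapped out to the disk cache.
last_swap_size() {
    grep 'store_swap_size' "$1" | tail -n 1 | sed 's/.*store_swap_size = //'
}

# Example: last_swap_size /usr/local/squid/var/logs/cache.log
```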
I installed Squid 2.6.STABLE21 on CentOS 5.3. It starts normally from /usr/local/squid/sbin/squid, but when I start it from init.d with /etc/init.d/squid start, it fails with:
Starting JunkBuster & Squid
2010/08/02 19:05:52| ACL name 'all' not defined!
FATAL: Bungled squid.conf line 179: http_reply_access allow all
Squid Cache (Version 2.6.STABLE21): Terminated abnormally.
[FAILED]
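That FATAL error means squid.conf uses the `all` ACL without defining it; Squid 2.6 does not define it implicitly (3.x does). Adding the definition near the top of squid.conf, before the first line that references it, should fix the init.d start:

```
# squid.conf (Squid 2.6 syntax)
acl all src 0.0.0.0/0.0.0.0
```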
I'm using a leased line with a real IP for the internet connection in my office, shared through Squid 2.6.STABLE6 to almost 200 PCs. For the last few days Squid has been causing a problem: as soon as I restart it, it works fine, but after a few minutes it becomes extremely slow and almost dies. In the cache log I found the error that the cache is running out of file descriptors. Once I increased the number of file descriptors from 1024 to 4096, the problem was temporarily sorted out, but after a few days the same problem repeated, and it still persists.
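A `ulimit` raised in an interactive shell does not survive a reboot or apply to the init script, which may be why the fix keeps evaporating; the limit has to be raised in the environment Squid actually starts from. A sketch (the value 8192 is an assumption, sized loosely for ~200 clients):

```shell
#!/bin/sh
# Show the per-process file-descriptor limit Squid will inherit
# from this shell.
ulimit -n

# To raise it persistently, put a line like this in the init script,
# *before* the squid binary is launched:
#   ulimit -HSn 8192
# Squid 2.6 also has a compile-time ceiling; if it was built with the
# default, it may need rebuilding with a matching value:
#   ./configure --with-maxfd=8192
```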
I have a script running as a cron job. It writes logs on each run to /var/log/mylog.log. Is there any way I can delete or compress this file when it gets too large? A cheap and dirty way is to set up another cron job to delete the log every X interval, although I'm not sure that's the proper way.
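The proper way is logrotate, which most distributions already run daily from cron. A sketch dropped into /etc/logrotate.d/mylog (the size threshold and rotation count are assumptions):

```
/var/log/mylog.log {
    size 10M
    rotate 4
    compress
    missingok
    notifempty
}
```

This rotates the file once it exceeds 10 MB, keeps four compressed generations, and deletes older ones automatically.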
I want to know whether there is any way to check how full the Squid cache directory is. Normally, when the cache reaches 2 GB, I have to clean it.
Code: # cache_dir ufs /var/spool/squid 2000 16 256 I assigned 2000 MB to the cache size. A month has passed while Squid has been running, but I don't know how to check the current size of my Squid cache.
I am using Squid on my CentOS 5 box. The cache directory is almost full.
Code: # cache_dir ufs /var/spool/squid 1000 16 256 Kindly guide me on how to flush or clean the data inside Squid's cache directory from the terminal.
I have a Squid server that I'm using for caching. It has three Ethernet interfaces. Ether1: direct internet from ISP1, 2 Mbps. Ether2: direct internet from ISP2, 512 KBps. Ether3: connected to the LAN. I want all download file formats (MP3, RAR, ZIP, AVI, etc.) fetched via ether1 (ISP1), and web pages (HTML, ASP, CGI, etc.) fetched via ether2 (ISP2). I don't know how to configure that with my two ISP connections.
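One common approach is tcp_outgoing_address with ACLs: Squid picks which local interface address to bind for each request, and the OS routing table (source-based routing, e.g. via ip rule) sends each source address out the matching ISP. A sketch; the addresses are placeholders for whatever is configured on ether1 and ether2:

```
# squid.conf sketch - addresses are placeholders
acl downloads urlpath_regex -i \.(mp3|rar|zip|avi|exe)$
tcp_outgoing_address 192.0.2.1 downloads   # address on ether1 (ISP1)
tcp_outgoing_address 198.51.100.1          # everything else via ether2 (ISP2)
```

Without the matching source-routing rules on the host, both source addresses would still leave via the default gateway, so the squid.conf half is only part of the setup.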
What I need: I have two servers for about 4000 users and 300 servers, and the previous admin never set up DNS caching right, so I'm redoing it. My goals:
1) DNS cache 2) Transparent Squid cache only 3) Load balancing at the switch level
Hardware: upgraded drives to 2x 32 GB SSDs per server, 4 GB of RAM, 2x Dell PowerEdge 850s (P4 2.8, single core). Any advice, pointers, experiences, or best ways to do this, given that both servers will do both DNS caching and Squid? Also, is BIND9 the best for this? I've seen stuff about dnsmasq; which performs better? (I don't need DHCP.)
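For a caching-only resolver with no DHCP, dnsmasq is considerably lighter than BIND9 and simpler to run; BIND9 mainly earns its keep when you also host authoritative zones. A minimal /etc/dnsmasq.conf sketch (the listen address is a placeholder, and DHCP stays off as long as no dhcp-range is configured):

```
# /etc/dnsmasq.conf - caching-only resolver sketch
cache-size=10000
listen-address=127.0.0.1,192.0.2.10
```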
I have installed Debian to run Squid as a caching proxy. I've been bashing away for two days now, and I have managed to install Squid (I first tried manually, but that did not work, so I used the Synaptic package manager, from the Administration menu, to install it). That went well; afterwards I installed Webmin to work with Squid through a GUI.
I have managed to start Squid, added my range of IP addresses to the ACL list, and added the proxy restriction too.
Now I tried to test it. I opened the Iceweasel web browser (on the same machine) and set it to use the proxy server localhost on port 3128. That works fine.
But when I change the proxy setting to my machine's IP (where Squid is installed), proxy server 10.0.0.35 and port 3128, it does not work. Am I missing something? I then tried to point another Windows PC on the network at proxy server 10.0.0.35, port 3128; that also does not work.
I also edited the conf file to http_access allow all, but I do not know whether I did it correctly; maybe there is another problem?
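Localhost working while LAN clients fail usually means the LAN is never allowed before the final deny: http_access rules are evaluated top to bottom, so an `allow all` added after an earlier `deny all` never takes effect. A sketch, assuming a 10.0.0.0/24 LAN:

```
# squid.conf sketch (the network range is an assumption)
acl lan src 10.0.0.0/24
http_access allow localhost
http_access allow lan
http_access deny all
```

If that still fails, check that `http_port 3128` is not bound only to 127.0.0.1 and that no firewall on the server blocks port 3128.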
I am administrating a lab in a university, and every semester we need to delete all the home folders of the accounts for the next semester. I would like to make a bash script that does this automatically, but I'm having trouble with it. Note that this is my very first bash script. What I need is a script to delete the following:
Delete everything in /home/$exp$num/$dir, where "exp" can be "rt", "ic", or "sp"; "num" runs from 1 (single digit) to 45; and "dir" is "profile" or "work".
This is what I tried to write:
Code:
#!/bin/sh
# "rt ic sp" must be unquoted so the loop sees three separate words,
# and ${exp}${num} (with braces) keeps the two variables distinct.
for exp in rt ic sp; do
    for num in $(seq 1 45); do
        for dir in profile work; do
            rm -rf "/home/${exp}${num}/${dir}"
        done
    done
done
The problem seems to be that "$exp$num" is read as one joint expression.
My /var/cache/apt/archives directory has almost 9000 items and is over 12 GB. All it contains is a bunch of .deb files. Do I need this directory, or can I delete its contents to save hard drive space?
I really want to free up disk space in my Ubuntu install, so I opened the disk analyzer and saw that /var/cache/apt/archives takes about 1.2 GB of my disk. Can I delete those packages to free up space? They are *.deb files; when I click on some of them, Ubuntu Software Center opens and says a newer version is already installed.
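Yes: everything in /var/cache/apt/archives is a previously downloaded .deb installer, and apt will simply re-download a package if it is ever needed again. The supported commands are `sudo apt-get clean` (remove every cached .deb) and `sudo apt-get autoclean` (remove only versions that can no longer be downloaded). A small sketch that reports what the cache holds before you clear it:

```shell
#!/bin/sh
# Report how many cached .deb files a directory holds and their total size.
report_deb_cache() {
    dir=${1:-/var/cache/apt/archives}
    [ -d "$dir" ] || return 0
    count=$(find "$dir" -name '*.deb' | wc -l | tr -d ' ')
    size=$(du -sh "$dir" | cut -f1)
    printf '%s .deb files, %s\n' "$count" "$size"
}

report_deb_cache
```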
Is it possible to configure a cron script to update the packages in /var/cache/apt-cacher/packages? When a client machine updates a package, apt-cacher checks that its cached package is up to date, and downloads a new version if it is not.
I'd like apt-cacher to check its cached packages every night and download any updated ones, on the premise that if a package exists in the apt-cacher cache, someone has it installed and is going to want to update it. Is this possible? Does apt-cacher do this anyway without my noticing?
I have just installed Squid and SARG; both are working well, but I cannot get Squid to start when the machine boots. I installed Webmin and configured the setup through the Webmin interface, where I checked the "start at boot" option. I think the problem is that the network card responsible for the web connection is wireless, so when Squid tries to load during boot, it runs before the wireless card is up and fails. I have tried multiple methods of adding a script to the "dispatcher.d" folder in NetworkManager to load Squid after wlan0 is up, but all have failed to bring up the Squid proxy without my intervention. I have not found any two sites that describe the dispatcher.d syntax alike, so I may have the command wrong, but I have no idea what else to do.
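NetworkManager runs every executable file in /etc/NetworkManager/dispatcher.d with two arguments: the interface name and the action ("up", "down", ...). A sketch, saved as e.g. /etc/NetworkManager/dispatcher.d/99-squid and made executable with chmod +x; the interface name wlan0 and the init-script path are assumptions taken from the post:

```shell
#!/bin/sh
# NetworkManager dispatcher hook: (re)start Squid once the wireless
# interface comes up.  $1 = interface name, $2 = action.

should_start() {
    [ "$1" = "wlan0" ] && [ "$2" = "up" ]
}

if should_start "$1" "$2"; then
    /etc/init.d/squid restart
fi
```

The script must be owned by root and not world-writable, or NetworkManager will refuse to run it.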
One of our users informed me about problems viewing a site: he does not get fresh objects from the site's server even though the site is updated every day. After some investigation and digging into the headers, I found that the "Date" field in the response headers is from two or three weeks ago. 1- Does Squid cache response headers? If so, when the user sends "If-None-Match" or "If-Modified-Since" in the request headers, will the cached response headers be updated?
I am on Ubuntu 10.04, running Squid 2.7.STABLE7. Squid refuses to start on my machine unless I give it the -D argument (disable initial DNS tests). I think this is because my machine is offline while I'm testing it, so there is no DNS server. How do I set Upstart to start Squid with the -D argument?
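On 10.04, Squid may still be launched through the SysV script /etc/init.d/squid rather than a native Upstart job; whichever exists, the -D flag goes on the line that actually executes the squid binary. A sketch, assuming an Upstart job file exists at /etc/init/squid.conf (the path and the existing flags are assumptions; match them to the file on disk):

```
# /etc/init/squid.conf - adjust the existing exec line, e.g.:
exec /usr/sbin/squid -N -D
```

If only /etc/init.d/squid exists, add -D to the daemon invocation in that script instead.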
Does Squid automatically split bandwidth between connected clients? I'm wondering, if someone was downloading a lot of data and someone else connected, whether it would split the access 50:50 between them. I have one user who is using a lot of bandwidth, but the server doesn't seem to split it between all connected clients, so others are getting slow access. I don't have this client's IP address, but I do have NCSA auth enabled. Will delay_pools work with an NCSA username?
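Squid does not balance bandwidth on its own; delay_pools is its mechanism for that, and since delay_access takes ordinary ACLs, a proxy_auth ACL built on the NCSA usernames works without knowing the client's IP. A sketch capping one named user (the username and the rate are assumptions):

```
# squid.conf sketch
acl heavy_user proxy_auth baduser
delay_pools 1
delay_class 1 1
delay_parameters 1 64000/64000   # ~64 KB/s for traffic in this pool
delay_access 1 allow heavy_user
delay_access 1 deny all
```

Class 1 is a single aggregate bucket; a class 2 pool with per-host buckets is the usual choice when the goal is a fair share per client rather than capping one user.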