Ubuntu Servers :: Apt-Get File Size Mismatch Due To Squid?
Sep 9, 2010
I am trying to get apt-get to work on a server that's behind a Squid proxy server. I have added exceptions in squid.conf to allow all on [URL]. apt-get can find updates, but when it tries to download/install I get:
Code:
Failed to fetch http://gb.archive.ubuntu.com/ubuntu/pool/main/g/gdebi/gdebi_0.6.0ubuntu2_all.deb Size mismatch
but if I run
Code:
wget http://gb.archive.ubuntu.com/ubuntu/pool/main/g/gdebi/gdebi_0.6.0ubuntu2_all.deb
it works.
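One hedged workaround: a direct wget succeeding suggests either that Squid is serving a stale or truncated copy, or that wget simply bypassed the proxy. You can tell apt not to accept cached copies from the proxy at all. A sketch; the file name is an assumption:

```
// /etc/apt/apt.conf.d/99no-proxy-cache (file name is an assumption)
Acquire::http::No-Cache "true";
Acquire::http::Pipeline-Depth "0";
```

After adding this, run apt-get update again so the package lists are refetched through the proxy without caching.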
View 1 Replies
Dec 10, 2010
sudo apt-get install ubuntu-desktop fails and has been failing for the last two days. I installed both the LTS and the newest 10.10; it starts out OK and runs for about five minutes, then finishes with many 'Failed to fetch [URL] Size mismatch' errors. This is the first failure after many installs over the last two years. Is something going on with the US servers? On closer examination, running sudo apt-get update seems fine until it hits the pool directory on the server: lucid main and lucid-updates are fine, but it breaks when it goes past those.
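When a single regional mirror misbehaves, one thing worth trying is pointing sources.list at the main archive instead. A sketch of the sed rewrite, shown on a sample line rather than the live file (in practice run the same substitution with `sed -i` against /etc/apt/sources.list, after backing it up):

```shell
# Rewrite the US mirror hostname to the main archive on a sample
# sources.list line; the live file is /etc/apt/sources.list.
echo "deb http://us.archive.ubuntu.com/ubuntu lucid main" \
  | sed 's|us\.archive\.ubuntu\.com|archive.ubuntu.com|'
```

Then run `sudo apt-get update` before retrying the install.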
View 3 Replies
View Related
Sep 1, 2011
I've been experiencing problems with my squid3 recently; I am using 10.04.3 LTS. I configured Squid the way I always have since 8.04.4, but it's not working as it should. I've been having issues with .bz2 files: on my LAN, when I try to do an apt-get update, it says some indexes could not be downloaded because of a hash sum mismatch. If you check the Squid log, for that kind of file it says TCP_REFRESH_UNMODIFIED/206. Googling around, I read that Squid keeps the cached files longer than it should, so I added this hoping to solve the problem:
refresh_pattern -i .bz2$ 0 0% 60 override-lastmod refresh-ims override-expire
I don't know whether that's well written or not, but it hasn't solved the problem; the Squid log now shows TCP_REFRESH_UNMODIFIED/304, but the hash sum mismatch behaviour is the same. Please, if someone could throw a light on this. The only fix so far is deleting the whole cache and recreating it every morning, which is far from a solution.
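For comparison, a tighter version of that refresh_pattern: the dot in the regex should be escaped, and dropping the maximum age to zero while forcing If-Modified-Since revalidation is the usual way to stop Squid serving stale index files. Treat this as a sketch to test, not a confirmed fix:

```
# Always revalidate compressed package indexes instead of serving
# possibly-stale cached copies:
refresh_pattern -i \.bz2$ 0 0% 0 refresh-ims
```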
View 1 Replies
View Related
Jun 24, 2010
My Squid server's on-disk cache size is automatically increasing and decreasing. How can I resolve this issue?
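If the fluctuation simply tracks Squid's normal eviction cycle, it is controlled by the swap watermarks in squid.conf, and disk usage is expected to oscillate between them. A sketch (these values are the defaults; whether this matches the setup is an assumption):

```
# Eviction starts when a cache_dir passes 95% of its configured size
# and stops once usage drops back under 90%, so on-disk size moves
# between these two percentages by design:
cache_swap_low 90
cache_swap_high 95
```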
View 6 Replies
View Related
Feb 21, 2011
When I run the df command, the sum of free and used space doesn't tally with the total space. I found an explanation in this thread: URL... The amount of "reserved" space seems a little too high, as the OP of that thread commented, but maybe that's what is needed. Now my question is: is there a way to change/specify this amount of "reserved" space manually? Also, is there a way to see the actual disk usage for every mounted partition, including the hidden "reserved" space? Sometimes this mismatch makes the output of df very confusing!
I am attaching output of df, and fdisk from my NFS server. The amount of reserved space seems to be close to 5% of the total partition size.
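The df gap is the ext filesystem's reserved-blocks pool (5%, held back for root, by default), so used + available deliberately falls short of the total. A quick arithmetic sketch with illustrative numbers (1K blocks, not taken from the attached output):

```shell
# total = used + available + reserved, so the "missing" space is:
total=10000000   # 1K blocks, illustrative
used=6000000
avail=3500000
echo $(( total - used - avail ))   # reserved blocks, about 5% of total
```

To change the reserve itself, `tune2fs -m 1 /dev/sdXN` drops it to 1%, and `tune2fs -l /dev/sdXN` lists the current "Reserved block count" (substitute the real device for /dev/sdXN).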
View 3 Replies
View Related
Jun 19, 2011
I have Ubuntu 10.10 installed, and a few weeks ago I tried upgrading to 11.04 using Update Manager, but after a few hours of fetching all the files from the internet except one, I got the following error:
Code:
Could not download the upgrades
The upgrade has aborted. Please check your Internet connection or installation media and try again. All files downloaded so far are kept.
Failed to fetch [URL]
I thought there was something wrong with the internet connection, even though everything else works well. I tried upgrading at different times on different days; I redid the process again today and got the same error. Is there any other way to upgrade, or to ignore this package?
View 7 Replies
View Related
Nov 29, 2010
When I try to update my software through update manager, I get this error:
W: Failed to fetch [URL] Size mismatch
As a side note, when I enter Update Manager, it does not show 10.10 as an available upgrade. I have checked my settings. I am quite befuddled by all of this.
View 7 Replies
View Related
Mar 16, 2010
I've just installed an Ubuntu 9.10 server at my office and configured Apache, PHP and MySQL without problems.
Now I need to share some directories so that web developers can access the website files. But when I try to share, Ubuntu tries to install Samba services, and I keep getting these errors:
"W: Failed to fetch url Size mismatch
W: Failed to fetch url Size mismatch"
View 6 Replies
View Related
Jun 15, 2010
This is at the root of the problems I'm about to describe: whenever I try to upgrade my Debian installation, I get repeated "size mismatch" errors that force me to restart the upgrade process. Since I'm using Debian Squeeze, there are lots of upgrades to be done every week. Today, for instance, because I've not upgraded for a few weeks, I have to download over 300MB of files to complete the upgrade. However, because of the size mismatch problem this download may well turn into something between 1GB and 2GB, as I repeatedly have to re-download files that were successfully downloaded in the last failed upgrade. With my ISP limiting me to 7GB per month, merely keeping Debian up to date is using up most of my allotted bandwidth.
I've tried setting the Aptitude preferences not to "remove unused packages automatically" and not to "remove obsolete package files after downloading package lists", but for some reason files downloaded in the last failed upgrade still have to be downloaded again (and again, ...). Does anyone know how to avoid the package size mismatch problem (I can't change my ISP, which has a monopoly on providing internet to my area), or alternatively how to set Aptitude not to forget the packages it has downloaded successfully during each failed upgrade, so that I can avoid such huge bandwidth usage on my frequent upgrades?
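Successfully downloaded .debs normally stay in /var/cache/apt/archives until something cleans them, so repeated re-downloads usually mean a cleanup option is firing between attempts. A sketch of the apt-side switch (the file name is an assumption, and the option only exists in newer apt releases; on older systems the equivalent is simply never running clean/autoclean between attempts):

```
// /etc/apt/apt.conf.d/99keep-downloads (file name is an assumption)
APT::Keep-Downloaded-Packages "true";
```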
View 14 Replies
View Related
Jun 7, 2010
I have a command line server that logs to stdout, which I start along the lines of ./server > log.txt
What I want to do is limit the size of log.txt, without modifying the server.
I am assuming there must already be some kind of tool that lets me do this, something where I can pass in my server, the output file and a size limit? If so, can anyone enlighten me?
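A minimal way to cap the file without touching the server is to pipe its output through head, which exits once the limit is reached. Note the assumption: once head exits, the server receives SIGPIPE on its next write, so this is a hard stop rather than rotation. A sketch, with `seq` standing in for ./server and a 1 KiB cap:

```shell
# Cap captured output at 1024 bytes; substitute ./server for `seq`.
seq 1 1000 | head -c 1024 > log.txt
wc -c < log.txt   # the log never exceeds the cap
```

If you want rotation rather than a hard stop, logrotate (with its copytruncate option) or Apache's rotatelogs are the usual tools.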
View 3 Replies
View Related
May 22, 2011
I keep getting a very frustrating error after reinstalling 11.04 on a 64-bit server:
Code:
W: Failed to fetch gzip:/var/lib/apt/lists/partial/us.archive.ubuntu.com_ubuntu_dists_natty_universe_source_Sources Hash Sum mismatch
which follows
Code:
bzip2: Data integrity error when decompressing.
[Code]...
I originally installed 11.04 on this system with the same disc, so I'm confused about what is happening. I have tried different CDs and even installed off a USB stick. The installer's memory test came back okay. The box has two PCI NICs that are recognized and functioning. The only package I have installed is openssh, as this is a headless box. What makes this even odder is that a newly installed 32-bit 11.04 server, set up with the same procedure on the same network, does just fine.
I hope I have been thorough enough in my search and investigation that someone can point me in the right direction. After reinstalling three times I am at a loss, and I know 11.04 can run on my box without problems (as it did before the reinstall). Tonight I'm going to reinstall with the built-in interface unplugged and the two NICs removed; I'm hoping this will prevent apt from bricking itself during the install, which is the only thing left I can think of. I suspect the issue has to do with the NICs: despite their appearing to function properly, I am unable to uncompress the archives in question when they are downloaded on the server, but can successfully uncompress the same files when downloaded on my workstation.
SOLUTION: I pulled both NIC's and enabled the integrated interface and it is all working. I wonder why they didn't work though...
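The diagnostic that exposed this ("bzip2: Data integrity error when decompressing") can be run directly on any downloaded index: `bzip2 -t` verifies an archive's integrity without unpacking it, which is a quick way to tell a corrupt download from an apt problem. A self-contained demo on a freshly made archive:

```shell
# Make a small bz2 archive, then verify its integrity in place.
echo "sample data" > sample.txt
bzip2 -f sample.txt            # produces sample.txt.bz2
bzip2 -t sample.txt.bz2 && echo intact
```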
View 2 Replies
View Related
Jun 10, 2010
Is there software that can split a big file into smaller files in Linux?
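The standard tool is split, with cat to reassemble. A self-contained sketch on a 100 KB dummy file:

```shell
# Cut a file into 40 KB pieces (chunk_aa, chunk_ab, ...) and rejoin.
head -c 100000 /dev/zero > big.bin
split -b 40000 big.bin chunk_
cat chunk_* > rebuilt.bin
cmp -s big.bin rebuilt.bin && echo identical
```

Glob order matches split's suffix order, so `cat chunk_*` restores the original byte-for-byte.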
View 1 Replies
View Related
Nov 26, 2010
I'm using Squid 2.7 STABLE9 and DansGuardian 2.10.1.1; I compiled both. I have enabled follow_x_forwarded_for in Squid to make client IPs visible to it, and have also set x_forwarded_for=on in DansGuardian. This is working fine and client IPs are visible to Squid. Now I want to set a downloadable file size limit of 50 MB in Squid using the ACL reply_body_max_size 52428800 allow mynetwork for every user except a few, but the ACL is not working properly. mynetwork is our private network, 192.168.0.0/16.
When I set the ACL reply_body_max_size 52428800 allow localhost, it works, but only for localhost. I want to allow up to 50 MB of downloads for every user on my network, except a few users who should be allowed up to 500 MB.
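In Squid the first matching reply_body_max_size line wins, so the exceptions have to come before the general rule. A sketch (the ACL name and the addresses of the privileged users are assumptions):

```
# Exceptions first: these hosts may fetch up to 500 MB
acl bigdownloads src 192.168.0.10 192.168.0.11
reply_body_max_size 524288000 allow bigdownloads
# Everyone else on the LAN is capped at 50 MB
reply_body_max_size 52428800 allow mynetwork
```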
View 2 Replies
View Related
Jan 20, 2010
I have set up a Squid server. My cache directories are configured with the following statements:
cache_dir ufs /Cache1/squid 10000 16 256
cache_dir ufs /Cache2/squid 10000 16 256
Now the problem is that /Cache1 and /Cache2 have each reached about 8GB and will soon reach the 10GB maximum. I just want to know whether I need to delete the contents of these directories or not.
View 1 Replies
View Related
Jan 19, 2011
Does lvresize with the --resizefs option resize the logical volume and then resize the filesystem? I mean, do we not need to use resize2fs separately? I looked at the man pages but they don't explain this option.
View 3 Replies
View Related
Dec 14, 2010
How can we find the maximum size of the inode table, what decides it, and how is the maximum volume size of a filesystem decided?
View 4 Replies
View Related
Jun 11, 2010
I was creating an OpenGL+GLUT+Perl Simulator, and I got this error on starting it:
Error: API mismatch: the NVIDIA kernel module has version 195.36.15, but this NVIDIA driver component has version 195.36.24. Please make sure that the kernel module and all NVIDIA driver components have the same version.
The app still works fine, but I have no clue what to do about this.
View 1 Replies
View Related
Jan 6, 2010
My Squid shows this when I try to create the swap directory:
[Code]....
View 2 Replies
View Related
Feb 23, 2011
I have tried many times to set up a Squid proxy cache server on 10.10, but in vain.
View 3 Replies
View Related
Jun 3, 2010
I am an avid Ubuntu desktop user, but I started working at a company that runs its firewall, mail and proxy server on CentOS 5.2. All was working well, so we never needed to tamper with it. However, we (myself and the administrator) decided to install squid3 for its features. The install went well, but squid3 didn't run as we wanted it to. After looking into it and reading some material on the net, we realised how stupid we were not to test it first, so we purged and removed it and reverted to 2.6. Now we have issues with that too: we install Squid 2.6 and start the service, and we get errors on stop and start (starting fails), yet the status of Squid shows it is running. How do we get it back to working order?
View 3 Replies
View Related
Mar 12, 2011
I have installed Ubuntu on my system; Squid is in it by default. I want to compile the delay_pools feature into Squid. Normally we download the source code as a tar.gz, uncompress it and then install it, and if we want to compile anything in we can do so at that point. But in my case Squid was already installed during the installation process.
View 1 Replies
View Related
Jan 18, 2010
I'm migrating an old Squid installation that used the httpd_accel_host directive for the acceleration configuration. What I have is a lot of internal web servers published through Squid to the rest of the world, some HTTP and others HTTPS. I tried using https_port, but when I restart Squid I get the following message: parseConfigFile: squid.conf:1202 unrecognized: 'https_port'
[Code]....
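In Squid 2.6 and later, the httpd_accel_* directives were replaced by accel options on http_port plus cache_peer with originserver, and https_port is only recognized when Squid was built with SSL support (the --enable-ssl configure flag), which would explain the "unrecognized" error. A sketch with placeholder hostnames and addresses:

```
# Plain-HTTP accelerator (hostnames/addresses are placeholders):
http_port 80 accel defaultsite=www.example.com
cache_peer 10.0.0.5 parent 80 0 no-query originserver name=web1
# https_port additionally requires a Squid built with --enable-ssl:
# https_port 443 accel cert=/etc/squid/cert.pem key=/etc/squid/key.pem defaultsite=www.example.com
```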
View 1 Replies
View Related
Feb 17, 2010
How do I exclude some URLs from caching in squid.conf? I don't want to cache ....., mediacafe etc. I know I can block a site by creating a file like bad-sites.acl and adding it to squid.conf, but that blocks it entirely rather than just not caching it.
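Squid distinguishes blocking (http_access deny) from caching (the cache access list): the latter still serves the site, it just never stores it. A sketch (the domain names are placeholders, and older Squids spell the directive no_cache deny):

```
# Serve these domains but never cache them:
acl nocache dstdomain .mediacafe.example .videosite.example
cache deny nocache
```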
View 2 Replies
View Related
Apr 28, 2010
I've been having a hard time googling and trying to get ALL network connections redirected to the Squid proxy. I couldn't find a proper configuration for ufw or iptables. The ideas are:
1. The redirection rule should NOT depend on a specific network interface, but should work with any connection type, e.g. ppp0 or eth0. 2. Firewall rules can be for firehol, iptables, or ufw (the same as iptables, just tell me where to place them); preferably ufw or gufw. 3. It should not interfere with the CUPS web interface or the lighttpd server.
BTW: it's not a ubuntu server install
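One interface-agnostic way to express the redirect is in iptables-restore format, which never names ppp0 or eth0. A sketch (assumptions: Squid listens on 3128 in transparent mode and runs as the 'proxy' user; the RETURN rule keeps traffic addressed to this box itself, i.e. lighttpd, out of the redirect, and CUPS on port 631 is untouched anyway):

```
# Load with: iptables-restore < this-file   (all values are assumptions)
*nat
# Don't redirect connections aimed at this machine's own web server:
-A PREROUTING -p tcp -d 192.168.1.1 --dport 80 -j RETURN
-A PREROUTING -p tcp --dport 80 -j REDIRECT --to-ports 3128
# Redirect locally generated traffic too, but not Squid's own fetches:
-A OUTPUT -p tcp --dport 80 -m owner ! --uid-owner proxy -j REDIRECT --to-ports 3128
COMMIT
```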
View 6 Replies
View Related
Aug 2, 2011
I am trying to set up a SQUID server on my Red Hat Linux 7.3. to act as a cache server.
The machine has 2 network interface:
eth0 172.30.254.4/23
eth1 172.3.254.65/23
Default gateway is 172.30.254.19/23
The hostname is "amelie".
[Code].....
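A minimal cache-only squid.conf sketch for that layout (the subnet comes from the interfaces above; the cache size and path are assumptions):

```
http_port 3128
cache_dir ufs /var/spool/squid 1024 16 256
acl lan src 172.30.254.0/23
http_access allow lan
http_access deny all
visible_hostname amelie
```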
View 2 Replies
View Related
Apr 22, 2011
I am curious whether I am doing something wrong extracting pages from a PDF with pdftk and creating a new file. I am extracting only the odd pages, so the output is 20 pages instead of the input's 40, yet the new file is still 1.4MB, the same size as the original.
It seems strange to extract only half the pages of a large document and end up with a result the same size. How can I streamline the resulting PDFs using pdftk?
BTW this is the command I am using, in case perhaps I am missing an option to optimize file size or something:
Code:
pdftk A=ch15.pdf cat A1-40odd output odd.pdf
View 1 Replies
View Related
Jan 5, 2010
I've been all around the net and can't find a "simple" answer on how to block our LAN users from downloading torrents. Is it really that difficult?
Here's our setup:
1. The Server's Configs:
2. sudo gedit /etc/squid/squid.conf
3. sudo gedit /etc/rc.local (to start Firewall rules on bootup)
4. Server NOT a DHCP Server
5. No other iptables rules are configured, just the above ones.
Before, in a 1-NIC setup, I blocked workstation MAC addresses in the router plus a Squid proxy server (not transparent). It worked, but some online Java apps didn't work and users couldn't send/receive email, so I abandoned the method.
Now I have installed a transparent Squid proxy with 2 NICs. It works, but workstations can still download torrents! I know Squid doesn't block ports, right? So the answer must lie in the iptables firewall? I basically use Squid just to deny access to Facebook, Friendster and other "unproductive sites".
Quote:
How to block torrent downloading by using a Firewall? Or is there another "simple" way?
I've heard that it's better just to allow the regular ports (80, 22, 465, etc.) and then block all the rest; this way you shut off unnecessary ports.
I'm not an Iptables/Firewall expert so can you pls. explain it a bit more detailed if that's the case.
I'm also aware of just telling our users NOT to download torrents, but I just want to prohibit it entirely.
I know I will be the most "uncool" employee in our office.
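Torrent clients hop ports and encrypt their traffic, so matching them directly is unreliable; the allow-known-ports-then-drop approach mentioned above is the usual practical answer. A sketch in iptables-restore format for the gateway's FORWARD chain (the port list is an assumption to adjust for the office's needs):

```
# Load with: iptables-restore < this-file
*filter
:FORWARD DROP [0:0]
# Let replies to already-allowed connections through
-A FORWARD -m state --state ESTABLISHED,RELATED -j ACCEPT
# DNS, SSH, web, SMTPS -- everything else (torrents included) is dropped
-A FORWARD -p udp --dport 53 -j ACCEPT
-A FORWARD -p tcp -m multiport --dports 53,22,80,443,465 -j ACCEPT
COMMIT
```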
View 9 Replies
View Related
Jan 29, 2010
I would like to configure Squid and DansGuardian as a proxy with authentication via a website. That means: a new notebook gets its network information (IP address etc.) over DHCP. When it then tries to open an internet connection, the proxy should check whether it is authenticated; if not, it should get a login screen over HTTP (if the attempt comes from a browser). It should also not be possible to have an internet connection without being logged in. The clients are Windows, Mac and Linux. My question: what programs/daemons are there for doing this authentication? Would you choose another program instead of Squid?
View 2 Replies
View Related
Mar 5, 2010
I am using a Squid proxy server along with DansGuardian, dhcp3 and ClamAV on my local network. Everything works fine except for .flv files like [URL]. The problem is that Squid wants to download the whole .flv file into its cache first and only then serve it to the client.
It has an advantage that the whole video loads at once on the client's browser but that is not what our users want.
What they want is that these files load on the fly as they do on a normal internet connection. How do I configure squid to serve .flv files on the fly to the client PCs?
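If the buffering is happening on the Squid side (note that DansGuardian's antivirus scanning can also hold back a whole file before delivering it, which is worth checking separately), a sketch that stops Squid holding onto large media objects; treat both directives as assumptions to test:

```
# Serve .flv straight through without storing them:
acl flv urlpath_regex -i \.flv$
cache deny flv
# Never fetch more of an object than the client actually requested:
range_offset_limit 0
```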
View 9 Replies
View Related
Mar 8, 2010
Ubuntu Server 8.04. I have two servers in the same rack, on the same subnet, using the same DNS servers and built from the same media. On one of them the following command fails:
Code:
apt-get install squid
On the other, the package installed without any problems. I have checked the /etc/apt/sources.list files and they are identical. Actually, it could not find mutt either.
View 5 Replies
View Related