Ubuntu Installation :: Cannot Connect To Software Archives?
Feb 25, 2011
I receive the following error when trying to install packages using Software Center or Synaptic Package Manager. It was working fine until just the other day:

Failed to fetch 4.5-1_i386.deb Could not connect to my.archive.ubuntu.com:80 (203.106.62.8) - connect (111: Connection refused)

I'm using Ubuntu 10.10 (Maverick Meerkat), released in October 2010.
I tried the following:
sudo apt-get update
sudo dpkg --configure -a
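Since the error says the country mirror itself is refusing connections, one common workaround (an assumption here, not a confirmed fix for this case) is to point apt at the main archive instead, by editing /etc/apt/sources.list:

Code:
# back up the current sources list first
sudo cp /etc/apt/sources.list /etc/apt/sources.list.bak
# swap the country mirror (my.archive.ubuntu.com) for the main archive
sudo sed -i 's/my.archive.ubuntu.com/archive.ubuntu.com/g' /etc/apt/sources.list
sudo apt-get update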
I've found a website [URL] from which I downloaded Basilisk II [URL] and SheepShaver [URL]. I converted both into .deb archives using alien, and while they both converted and installed successfully, and show up in the Applications menu, only Basilisk II launches. When I click the SheepShaver icon, nothing happens. I had the same problem with another SheepShaver RPM I downloaded elsewhere. I really want to use SheepShaver to run Mac OS 9 on my laptop; is there something I can do to get it to run?
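For reference, the conversion went roughly like this (the RPM filename and the SheepShaver binary name are placeholders); launching from a terminal instead of the menu icon should at least print whatever error is silently killing it:

Code:
# convert the RPM to a .deb and install it (filename is hypothetical)
sudo alien --to-deb SheepShaver.rpm
sudo dpkg -i sheepshaver_*.deb
# run from a terminal so any startup error is visible
SheepShaver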
During the installation I kept getting tons of errors. Eventually a message came up saying that I had to abort the installation, and it did some cleanup. I tried running an application afterwards but got an error, so I restarted the system.
The normal boot menu came up where I had to choose between the Ubuntu entries (I'm new to Ubuntu). I noticed it had gone back to 8.10, which I had before installing 9.04 (that upgrade went great). I chose the first entry and the system failed to start. I rebooted and tried all the other options, but they all gave errors. For now I'm booted into Windows.
I installed Ubuntu from a CD I created, but it is now outdated because it is 8.10, and I had already upgraded to 9.04. The 9.04 to 9.10 upgrade is where things went wrong.
A few releases ago things used to work this way, but seemingly since Jaunty it no longer does:
This used to work: I have several Ubuntu machines to upgrade and limited (and highly expensive) bandwidth available. In the past I would upgrade/update one machine and copy the contents of /var/cache/apt/archives to the other machines to be upgraded. When updating/upgrading the distro, packages already present in /var/cache/apt/archives would not be downloaded from the Internet; the locally cached .deb files would be used instead, saving time and bandwidth.
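The old workflow was essentially this (hostname and user are placeholders); apt would skip any download whose cached .deb already matched the version it wanted:

Code:
# pull the cached packages from the already-upgraded machine
mkdir /tmp/debs
scp 'user@updated-box:/var/cache/apt/archives/*.deb' /tmp/debs/
sudo cp /tmp/debs/*.deb /var/cache/apt/archives/
# then upgrade; cached packages should not be re-downloaded
sudo apt-get update
sudo apt-get dist-upgrade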
The Problem: I have noticed that since 9.10 (and with 10.04) this no longer appears to work. Even though the .deb packages exist in /var/cache/apt/archives, the same packages are downloaded from the Internet again, regardless of the identical .deb file already being present there.
Question: What is wrong and how can I restore the functionality present in copying cached packages from machine to machine in order to save bandwidth?
Why I need this to work: At work (and I have no control over this...) we are limited to a total bandwidth of 3 GB per month for multiple users. Needless to say, a single upgrade of a recent Ubuntu release can consume a large proportion of our monthly bandwidth. Upgrading multiple machines is absolutely out of the question if the packages have to be downloaded every time.
An apt-cache server is not an option: A local apt-cache server is not feasible due to the same 3 GB constraint. An apt-cache server requires roughly 15 GB of storage per Ubuntu release, and downloading 15 GB worth of packages to populate it is impossible within the 3 GB limit.
I have bought a virtual server to get Magento ready. The server runs Debian with PHP version 5.2.6-1+lenny3.
For that I need PEAR. I want to install it globally, so I tried "apt-get install php-pear". This is what I get:
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following extra packages will be installed: php5-cl
Suggested packages: ...
...1+lenny3_all.deb 404 Not Found
E: Unable to fetch some archives, maybe run apt-get update or try with --fix-missing?
E: Could not open lock file /var/lib/dpkg/lock - open (13: Permission denied)
E: Unable to lock the administration directory (/var/lib/dpkg/), are you root?
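The two errors point in different directions: the lock error suggests the command was run without root, and the 404 suggests stale package lists pointing at a .deb that has since been replaced. A minimal retry under those assumptions would be:

Code:
# refresh the package lists so apt stops requesting a stale .deb URL
sudo apt-get update
# then retry the install as root
sudo apt-get install php-pear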
I have been trying to install MATLAB 2008b as a standalone version; however, every time the process reaches 18% it shows the following error: "There was an error extracting the archives for MATLAB Compiler. Check that you have enough disk space and rerun the installer."
I did check the disk space, which is much more than enough. I have also tried a few different disks with plenty of room, at least 5 GB, but the error still pops up at 18%.
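One thing worth ruling out (an assumption, since the installer doesn't say which path it extracts to) is that a temporary directory, not the install target, is the filesystem that's short on space:

Code:
# check free space on every mounted filesystem, including /tmp
df -h
# point temp files at a roomier location before rerunning the installer
# (whether this particular installer honours TMPDIR is an assumption)
export TMPDIR=/home/user/matlab-tmp
mkdir -p "$TMPDIR"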
I have downloaded a petri net tool, petrify.tar.gz, and extracted it using Archive Manager, but I am unable to install it. How can I install it? Are there any commands or instructions for installing it?
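Tarballs don't follow a single install convention, but a common pattern (an assumption for this particular tool) is either a source tree built with configure/make or a prebuilt binary you copy onto your PATH:

Code:
tar -xzf petrify.tar.gz
cd petrify*            # directory name is a guess
# if the tarball ships sources with a configure script:
./configure && make && sudo make install
# if it ships a ready-made binary instead, copy it somewhere on your PATH:
# sudo cp bin/petrify /usr/local/bin/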
I cannot download any package from the Greek Ubuntu server, and a friend of mine has the same problem. Is there a problem with these servers? The error is 404 Not Found, and this is something very important to me.
Would it cause any issues to delete the cached archives from the installed packages? I have almost a gig of space being taken up and would like to get the space back.
My /var/cache/apt/archives directory has almost 9000 items and is over 12 GB big. All it contains is a bunch of .deb files. Do I need these files, or can I delete them to save hard drive space?
I really want to free up disk space in my Ubuntu install, so I opened Disk Usage Analyzer and saw that /var/cache/apt/archives takes about 1.2 GB of my disk. Can I delete those packages to free up space? They are all *.deb files; when I click on some of them, Ubuntu Software Center opens and says a newer upgrade is already installed.
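For all three questions above: that directory is only apt's download cache, and apt itself provides commands to empty it safely:

Code:
# delete every cached .deb (installed software is unaffected)
sudo apt-get clean
# or delete only cached .debs that can no longer be downloaded
sudo apt-get autoclean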
I've never really used the command line for such things, but I'd like to learn. So, how do I extract all archives that are spread across several directories in one go?
I want to run tar -zxvf on several dozen .tar.gz archives. I only know how to extract one archive at a time. If possible, I would like to extract all the archives in one command so I don't have to type "tar -zxvf [filename]" several dozen times. What's the syntax? I looked at the tar man page but couldn't find it.
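A shell loop or find covers both of the questions above (all archives in one directory, or archives scattered across a tree):

Code:
# all archives in the current directory
for f in *.tar.gz; do tar -zxvf "$f"; done
# archives in subdirectories, each extracted where it lives
find . -name '*.tar.gz' -execdir tar -zxvf '{}' \;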
Are there any archiving tools other than tar that preserve Linux file permissions and user/group ownership? I want to create archives of a directory tree on my Ubuntu box. The trees are large; a first run with tar cvpzf archive.tgz /home/foo/bar yielded a 5 GB archive. I also want to keep all permissions, other flags, and special files.

I'm fine with a 5 GB archive; however, to look inside that archive, since it is a compressed tar archive, the whole 5 GB has to be decompressed first (or so it appears when opening it with the archive viewer; I'm happy to be corrected). So I need a way to "back up" a directory tree while preserving full filesystem attributes and rights, creating an archive with an "index" that doesn't have to be decompressed to browse its contents.

By "archive" I mean a single file, or a small set of files, that carries full and complete information within itself. That is, it can live on any filesystem (size permitting), it can be burnt onto a DVD, you can split it (after which the point of this question is rather lost, but still), ...
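One candidate that fits these constraints (offered as an option, not the only answer) is a squashfs image: it stores ownership and permissions, is compressed per-block, and can be browsed or loop-mounted without unpacking the whole thing:

Code:
sudo apt-get install squashfs-tools
# build a compressed, read-only image of the tree
mksquashfs /home/foo/bar bar.sqfs
# browse it without extracting anything
sudo mkdir -p /mnt/bar
sudo mount -o loop -t squashfs bar.sqfs /mnt/bar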
I have a wonderfully working Kubuntu Karmic 9.10 box and love it. Now I've done another install of 9.10 on an old box that doesn't have any network card. The old thing is running pretty well, but I want it set up like the machine I'm on now, so I thought I'd try AptOnCD. The trouble is that I have run apt-get clean a few times since setting this box up, so what's left in /var/cache/apt/archives is a very incomplete set of packages. I'm trying to find a way to reload /var/cache/apt/archives with all installed packages.
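One way to refill the cache (a sketch; it re-downloads every installed package, so it needs a decent connection on the networked machine) is to ask apt to download, but not install, everything dpkg reports as installed:

Code:
# list installed packages and re-download their .debs into the cache
dpkg --get-selections | awk '$2 == "install" {print $1}' \
  | xargs sudo apt-get install --reinstall --download-only -y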
I have noticed that when I right-click on many file types, for instance ISO files, I don't get the option to add them to a compressed archive at all. I have one ISO file now that I want to compress and split into three parts so I can upload it, but as I said, right-clicking on the file doesn't help because the option isn't there. What can I do?
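The same thing can be done from a terminal regardless of what the right-click menu offers; for example (the filename and piece size are illustrative, pick a size that yields three parts):

Code:
# compress the ISO and cut the result into 700 MB pieces
tar -czf - file.iso | split -b 700M - file.iso.tgz.part-
# later, on the receiving end, reassemble and extract:
cat file.iso.tgz.part-* | tar -xzf -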
I ran into some complications with my Ubuntu 10.10 install and couldn't boot into the desktop. I used a live CD to get at my downloaded cached packages and copied them to my Windows partition. Now I've reinstalled Ubuntu and manually restored the packages into the /archives/ directory. I've installed a lot of apps from that directory already, but AptOnCD refuses to properly reflect the packages I have there. Is there something I'm overlooking?
I have a large number of zip archives which I would like to extract and then delete, keeping only the extracted files. The files are located in a number of directories, and I issued the following command, hoping it would solve my problem (it didn't):
Code:
find . -name "*.zip" -execdir unzip '{}' \; -delete

What happened was that find continued to extract the files as intended, but then it deleted not only the zip archives but also the newly extracted files. It only affected a portion of the files, which I had already backed up. I don't have enough space on my hard drive to extract all the files without simultaneously removing data, and I don't have the energy to go through all the directories one by one manually. Surely there must be a better way. Is there a way to fix the snippet above so that it works as I intended, or is there a better way of doing this?
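A safer variant (a sketch) couples each deletion to the success of its own extraction, so an archive is only removed after it unpacked cleanly:

Code:
# extract each zip in its own directory; delete it only if unzip succeeded
find . -name '*.zip' -execdir sh -c 'unzip -o "$1" && rm -- "$1"' sh '{}' \;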
I have, on my PC (Ubuntu 10.10, 32-bit), a folder with about 10,000 files. It is a Samba shared folder, and so far I could browse those files quickly and easily from another old PC (Pentium 4, 1.5 GHz) running Windows XP. I installed Ubuntu 10.10 on it, but browsing is now far too slow, even though the old PC's overall performance is considerably faster than before under WinXP. Also, although the old PC's speed is satisfactory now, would it become even faster if I installed an older Ubuntu release?
Each time I try to open a password-protected archive with Ark, it just shows a "Loading file" progress bar, and nothing happens even after several minutes. If I right-click and select "Extract", nothing is extracted. It doesn't ask for the password at any point. I also checked the help by typing "man Ark" in the console, but found nothing related to passwords. Has anyone successfully opened password-protected archives with Ark, or is it still buggy?
And on a side note: I can open normal .rar files, but cannot compress files. It says "failed to locate program RAR in the PATH". I thought Ark was self-contained. Do I need to install something else?
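Ark delegates .rar handling to external binaries, so that message usually means the rar tool itself is missing (it lives in the multiverse repository, being non-free):

Code:
# unrar handles extraction; rar is needed for creating archives
sudo apt-get install unrar rar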
How can I group files and create archives accordingly? I have 10,000 files in a folder (no sub-folders) and I want to create 10 zip or tar.gz archives, so that each archive holds 1,000 files. How can I do this in Linux?
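A sketch of one way to do it (assumes the filenames contain no newlines): split the file list into 1,000-line chunks and feed each chunk to tar:

Code:
# write the file list, cut it into 1000-line chunks named chunk_aa, chunk_ab, ...
ls > /tmp/filelist
split -l 1000 /tmp/filelist /tmp/chunk_
# build one archive per chunk
i=0
for c in /tmp/chunk_*; do
  i=$((i + 1))
  tar -czf "archive_$i.tar.gz" -T "$c"
done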
Someone has sent me a zip file containing some fonts. I extracted it using unzip under Linux; there are empty files in the top level of the archive, and similarly named files beginning with ._ in a __MACOSX subdirectory.
I understand that the __MACOSX contents should be metadata, and normally I'd delete it. In this case, however, all of the data seems to be in there! Is there a tool that I can use to reassemble the original data?
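If these are classic Mac fonts, the ._ files are AppleDouble containers holding the resource fork, which is where the font data actually lives; fondu (in the Ubuntu repositories) can read them and emit usable font files. A sketch, assuming that's what these are:

Code:
sudo apt-get install fondu
cd __MACOSX
# fondu reads the resource forks and writes .ttf/.pfb files alongside
fondu ._*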
I have a dedicated server and I am having email issues, among other things, seemingly because the /var directory is 97% full.
I would like to know if it is safe to clear it and how to clear it (assuming it will not disrupt/kill server services to do so).
I have a 'Matrix' control panel so I can view the storage etc., but it does not have a way of clearing the /var directory.
I have PuTTY access to root but do not know much about the command line.
I found a few threads but the information is not clear to me as there seems to be an assumption of (basic?) knowledge I don't yet have.
My Linux support guru, who usually does this kind of thing for me, is away and not contactable, and my server is grinding to a halt, unable to store or send email.
I have only a very basic understanding of command line but really need to get this sorted ASAP.
Is it possible, and SAFE, to delete files via FTP from /var/cache/apt/archives?
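For what it's worth, the files in /var/cache/apt/archives are only apt's download cache, and from a root shell over PuTTY two commands cover both finding the space hog and clearing that cache:

Code:
# show which subdirectories of /var hold the space
du -h --max-depth=1 /var | sort -h
# empty apt's package cache; installed software is untouched
apt-get clean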