Ubuntu :: Cannot Fetch Archives - 404 Not Found
Aug 5, 2010
I cannot download any package from the Greek Ubuntu server, and a friend of mine has the same problem. Is there an issue with these servers? The error is "404 Not Found" on something quite important.
View 1 Replies
Mar 3, 2011
I have set up a virtual server to get Magento running. My server is Debian with PHP version 5.2.6-1+lenny3.
For that I need PEAR. I want to install it globally, so I tried the command "apt-get install php-pear". This is what I get:
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following extra packages will be installed:
php5-cl
Suggested packages:
code....
1+lenny3_all.deb 404 Not Found
E: Unable to fetch some archives, maybe run apt-get update or try with --fix-missing?
This is my sources.list:
deb url
deb url
deb url
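Since the failure is a 404 on one specific .deb (...1+lenny3_all.deb), the usual cause is stale package lists: the mirror has replaced that exact version with a newer build, so apt asks for a file that no longer exists. Refreshing the lists first normally clears it (a sketch; needs root and network access):

```shell
sudo apt-get update
sudo apt-get install php-pear
```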
View 2 Replies
View Related
Aug 3, 2011
Code...
Why does it come out like this?
Should I make any changes in /etc/apt/sources.list?
View 3 Replies
View Related
Feb 18, 2011
[code]...
What should be done in this situation? Shall I tell Ubuntu to use another server? If yes, how?
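Telling Ubuntu to use another server means editing the mirror hostname in /etc/apt/sources.list (or via System > Administration > Software Sources), then running apt-get update. A minimal sketch, demonstrated on a throwaway copy so it can run anywhere; the hostname and release name below are made-up examples:

```shell
# Demo on a throwaway copy; on the real system edit /etc/apt/sources.list
# with sudo, then run "sudo apt-get update". Hostname/release are examples:
echo 'deb http://us.archive.ubuntu.com/ubuntu maverick main restricted' > sources.list.demo
# Point the line at the main server instead of the country mirror:
sed -i 's|http://[a-z]*\.archive\.ubuntu\.com|http://archive.ubuntu.com|' sources.list.demo
cat sources.list.demo
```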
View 3 Replies
View Related
Sep 6, 2010
I am trying to upgrade the version of dovecot on a rarely-maintained Debian machine that almost never gives me problems (ain't Linux wonderful?). The current version is 1.0.rc15-2, and I think the latest is 1.2-something, but the point is that I want to be able to use the Pigeonhole sieve plugin. Anyway, when I try "apt-get install dovecot-imapd dovecot-pop3d", it gets to this:
Err [URL].. etch/main gnomemeeting 2.0.3-6 404 Not Found
Failed to fetch [URL]..-6_all.deb 404 Not Found
E: Unable to fetch some archives, maybe run apt-get update or try with --fix-missing?
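If the machine is still on etch, that would explain the 404s: etch reached end of life in 2010 and its packages were moved off the regular mirrors to archive.debian.org. A sources.list line along these lines (components are an assumption; match your existing line) lets apt find the old .debs again after an apt-get update:

```
deb http://archive.debian.org/debian etch main contrib non-free
```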
View 5 Replies
View Related
Aug 24, 2010
Cannot install FileZilla. For the last two days I have been trying to install FileZilla from the standard repositories, and I get an error:
Quote:
Failed to fetch http://archive.getdeb.net/ubuntu/poo...etdeb2_all.deb 404 Not Found
View 1 Replies
View Related
Oct 24, 2010
I have been having trouble with my updates, specifically these code...
When I try to install these updates I get the following error message code...
View 6 Replies
View Related
Feb 5, 2011
While I am trying to install updates, the following error is displayed:
W:Failed to fetch url 404 Not Found
, W:Failed to fetch url 404 Not Found
, E:Some index files failed to download, they have been ignored, or old ones used instead.
My internet connection is good, and I successfully installed some updates a few months ago.
View 2 Replies
View Related
Jul 13, 2011
Did Ubuntu remove the files for the 9.04 repository from their server? When I do Quote: sudo apt-get update I get the following warnings
[code]...
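Ubuntu 9.04 reached end of life in October 2010, and EOL releases are moved off the regular mirrors to old-releases.ubuntu.com, which is exactly what these warnings tend to look like. A sketch of the fix, demonstrated on a sample file; on the real machine run the sed against /etc/apt/sources.list with sudo, then apt-get update:

```shell
# Sample line standing in for /etc/apt/sources.list
echo 'deb http://us.archive.ubuntu.com/ubuntu jaunty main' > sources.list.sample
# Rewrite the mirror hostname to the EOL archive:
sed -i 's|[a-z.]*archive\.ubuntu\.com|old-releases.ubuntu.com|' sources.list.sample
cat sources.list.sample
```

Any security.ubuntu.com lines for jaunty should be pointed at old-releases.ubuntu.com as well.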
View 3 Replies
View Related
Jan 12, 2010
I've found a website [URL] from which I downloaded Basilisk II [URL] and SheepShaver [URL]. I converted both into .deb archives using sudo alien. While they both converted and installed successfully, and show up in the Applications menu, only Basilisk II launches; when I click the SheepShaver icon, nothing happens. I had this problem with another SheepShaver RPM I downloaded elsewhere. I really want to use SheepShaver to run Mac OS 9 on my laptop. Is there something I can do to get it to run?
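When a menu icon silently does nothing, the quickest diagnostic is usually to launch the program from a terminal and read whatever it prints before dying. A sketch; the package and binary names below are guesses based on the alien conversion, so check what was actually installed first:

```shell
dpkg -L sheepshaver | grep bin   # find the executable's real name and path
SheepShaver                      # run it and watch for error output
```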
View 7 Replies
View Related
May 19, 2011
I just can't get any updates from [URL] and [URL]. In both cases it says "404 Not Found". Here's the terminal output after typing "sudo apt-get update":
W: Failed to fetch [URL] 404 Not Found
W: Failed to fetch [URL] 404 Not Found
E: Some index files failed to download. They have been ignored, or old ones used instead.
View 9 Replies
View Related
Feb 13, 2011
Lately, when I try to update my packages, many of them don't update. I see nothing out of the ordinary when using the terminal, but when I try to update via the Update Manager I get this error message:
Code:
Error message:
Fetch failed: W:Failed to fetch bzip2:/var/lib/apt/lists/partial/mirror.steadfast.net_debian_dists_testing_main_source_Sources Hash Sum mismatch
W:Failed to fetch bzip2:/var/lib/apt/lists/partial/mirror.steadfast.net_debian_dists_testing_main_binary-amd64_Packages Hash Sum mismatch
W:Failed to fetch http://mirror.steadfast.net/debian/dists/testing/updates/main/source/Sources 404 Not Found [IP: 208.100.4.53 80]
[Code]...
I've tried several different mirrors but get the same error with every one. Same result with ftp as well. Yesterday I upgraded to Wheezy and every package successfully upgraded, but today the problem started again.
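The usual remedy for a Hash Sum mismatch that persists across mirrors is to discard apt's partially downloaded index files and fetch fresh copies; a sketch, using the stock apt paths:

```shell
sudo rm -rf /var/lib/apt/lists/partial/*
sudo apt-get update
```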
View 10 Replies
View Related
Apr 13, 2011
I have downloaded a petri net tool, petrify.tar.gz, and extracted it using Archive Manager, but I am unable to install it. How can I install it? Are there commands or instructions for installing it?
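For a generic source tarball, the command-line routine is: extract it, then read the project's own README or INSTALL for build steps (often ./configure && make && sudo make install, though petrify may ship prebuilt binaries; its docs will say). A runnable sketch using a stand-in tarball, since the real contents of petrify.tar.gz are unknown here:

```shell
# Stand-in tarball for the demo; you already have the real petrify.tar.gz
mkdir -p petrify && echo "build instructions here" > petrify/README
tar czf petrify.tar.gz petrify

# Extract it and look for install instructions:
tar -xzf petrify.tar.gz
ls petrify
cat petrify/README   # typically explains the configure/make/install steps
```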
View 9 Replies
View Related
May 11, 2010
Would it cause any issues to delete the cached archives from the installed packages? I have almost a gig of space being taken up and would like to get the space back.
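Deleting them is safe: the cache only holds copies of .deb files whose contents are already installed, and apt has built-in commands for exactly this (a sketch; needs root):

```shell
sudo apt-get clean       # removes every cached .deb
sudo apt-get autoclean   # or: removes only .debs no longer downloadable
```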
View 3 Replies
View Related
May 19, 2010
This worked a few releases ago, but since Jaunty it seemingly no longer does:
This used to work: I have several Ubuntu machines to upgrade and limited (and highly expensive) bandwidth available. In the past I would upgrade/update one machine and copy the contents of /var/cache/apt/archives to the other machines to be upgraded. When updating/upgrading the distro the packages that already exist in /var/cache/apt/archives would not be downloaded from the Internet while utilising the locally cached .deb files, saving time and bandwidth.
The Problem: I have noticed that since 9.10 (and with 10.04) this no longer appears to work. While the .deb packages may exist in /var/cache/apt/archives, the same packages would be downloaded from the Internet regardless of the same .deb file existing in /var/cache/apt/archives/.
Question: What is wrong and how can I restore the functionality present in copying cached packages from machine to machine in order to save bandwidth?
Why I need this to work: At work (and I have no control over this...) we are limited to a total bandwidth of 3Gb per month for multiple users. Needless to say, a single upgrade of a recent Ubuntu distro can consume a large proportion of our monthly available bandwidth. Upgrading multiple machines is absolutely out of the question if the packages have to be downloaded every time.
An apt-cache server is not an option: The option of a local apt-cache server is not feasible due to the same 3Gb bandwidth constraint. An apt-cache server requires 15Gb storage per version of Ubuntu and the downloading/upgrading of 15Gb worth of packages for that storage is not an option due to the 3Gb limitation.
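For what it's worth, the cache-copying approach should still work in principle: apt reuses a cached .deb only when its filename, version, and checksum match what the freshly updated package lists ask for. So run apt-get update on every machine against the same mirror first, then copy; if apt still re-downloads, the cached versions probably no longer match the target's lists. A sketch (the hostname is a placeholder):

```shell
# After 'apt-get update' on both ends against the same mirror:
rsync -av /var/cache/apt/archives/*.deb root@target-host:/var/cache/apt/archives/
```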
View 3 Replies
View Related
May 23, 2010
My /var/cache/apt/archives directory has almost 9000 items and is over 12 GB. All it contains is a bunch of .deb files. Do I need them, or can I delete them to save hard drive space?
View 2 Replies
View Related
Jan 12, 2011
I really want to free up disk space on my Ubuntu system, so I opened the disk analyzer and saw that /var/cache/apt/archives takes up about 1.2GB of my disk. Can I delete those packages to free up space? The packages there are .deb files; when I click on some of them, Ubuntu Software Center opens and says the newer upgrade is installed.
View 5 Replies
View Related
Feb 2, 2011
I've never really used command line to do such things but I'd like to learn. So, how do I extract all archives that are spread across several directories in one go?
For instance:
Quote:
Work/
Work/Today/a.rar
Work/Tomorrow/b.rar
Work/Yesterday/c.rar
View 2 Replies
View Related
Feb 25, 2011
I receive the following error when trying to install packages using Software Center or Synaptic Package Manager; it was working fine until just the other day: Failed to fetch 4.5-1_i386.deb Could not connect to my.archive.ubuntu.com:80 (203.106.62.8) - connect (111: Connection refused). I'm using Ubuntu 10.10 (the Maverick Meerkat), released in October 2010.
I tried the following:
sudo apt-get update
sudo dpkg --configure -a
[code]....
View 3 Replies
View Related
Nov 3, 2010
I want to tar -zxvf several dozen .tar.gz archives, but I only know how to tar -zxvf one archive at a time. If possible, I would like to extract all the archives in one command so I don't have to type "tar -zxvf [filename]" several dozen times. What's the syntax? I looked at the tar man page but couldn't find it.
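A simple shell loop handles all of them in one command. A runnable sketch that builds two sample archives first (in practice you already have yours):

```shell
# Demo setup: two sample archives
echo hello > a.txt && tar czf one.tar.gz a.txt && rm a.txt
echo world > b.txt && tar czf two.tar.gz b.txt && rm b.txt

# Extract every .tar.gz in the current directory in one go:
for f in *.tar.gz; do
  tar -zxvf "$f"
done
```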
View 3 Replies
View Related
Oct 29, 2010
Are there any archiving tools other than tar that preserve Linux file permissions and user/group ownership? I want to create archives of a directory tree on my Ubuntu box. The directory trees are large; a first run with tar cvpzf archive.tgz /home/foo/bar yielded a 5GB archive. I also want to keep all permissions, other flags, and special files.

I'm fine with a 5GB archive; however, to look inside that archive (since it is a compressed tar archive) the whole 5GB has to be decompressed first! (Or so it appears when opening it with the archive viewer; I'm happy to be corrected.)

So I need a way to "back up" a directory tree while preserving full filesystem attributes and rights, creating an archive with an "index" that doesn't have to be decompressed for browsing its contents. An archive here is either a single file or a (small) set of files that carries full and complete information within itself. That is, it can live on any filesystem (size permitting), it can be burnt onto a DVD, you can split it (after which the point of this question is rather lost, but still), ...
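One workaround with plain tar is to pay the decompression cost once, at creation time, by saving a verbose listing next to the archive; browsing then never touches the 5GB file. (Tools such as dar, which pair a compressed archive with a separate catalog, are built around exactly this idea and may be worth a look.) A sketch on a tiny demo tree:

```shell
# Demo tree standing in for /home/foo/bar
mkdir -p bar/sub && echo data > bar/sub/file.txt

# Archive with permissions preserved, plus a browsable index alongside:
tar cpzf archive.tgz bar
tar tzvf archive.tgz > archive.tgz.index   # full listing, written once
grep file.txt archive.tgz.index            # instant "look inside" later
```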
View 3 Replies
View Related
Jan 26, 2010
I have a wonderfully working Kubuntu Karmic 9.10 box and love it. Now I've done another install of 9.10 on an old box that doesn't have any network card. The old thing is running pretty well, but I want it set up like the one I'm on now, so I thought I'd try AptOnCD. The trouble is that I have run apt-get clean a few times since setting this box up, so what's left in /var/cache/apt/archives is a very incomplete set of packages. I'm trying to find a way to reload /var/cache/apt/archives with all installed packages.
[Code]...
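One way to refill the cache on the networked box is to ask apt to re-download every installed package without actually reinstalling anything; AptOnCD can then see the full set. A sketch (assumption: every installed version is still available in the configured repositories, which may not hold for manually installed .debs):

```shell
sudo apt-get install --reinstall --download-only $(dpkg-query -W -f '${Package} ')
```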
View 9 Replies
View Related
Apr 25, 2010
I have noticed that when I right-click on many file types, for instance .iso files, I don't get the option to compress them at all. I have one .iso file now that I want to compress and split into three parts so I can upload it, but as I said, right-clicking on the file doesn't help because the option isn't there. What can I do?
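From the command line this is two small steps: compress if you like, then cut the file into fixed-size pieces with split; the receiver glues them back together with cat. A runnable sketch on a small stand-in file (use something like -b 700M for a real image):

```shell
# Small stand-in for the real .iso
head -c 2500 /dev/urandom > big.iso

split -b 1000 big.iso big.iso.part_    # -> big.iso.part_aa, _ab, _ac
cat big.iso.part_* > rejoined.iso      # how the recipient reassembles it
cmp big.iso rejoined.iso               # verify: no output means identical
```

If the recipient is on Windows, a split zip (zip -s 700m out.zip big.iso) may be friendlier, since no cat step is needed there.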
View 4 Replies
View Related
Jan 9, 2011
I ran into some complications with my Ubuntu 10.10 install and couldn't boot into the desktop. I decided to use a live CD to get access to my downloaded cached packages and copied them to my Windows partition. Now I've re-installed Ubuntu and manually restored the packages into the /archives/ directory. I've installed a lot of apps from the directory already, but AptOnCD refuses to properly reflect the apps I have there. Is there something I'm overlooking?
View 2 Replies
View Related
Apr 23, 2011
I have a large number of zip archives which I would like to extract and then delete, keeping only the extracted files. The files are located in a number of directories, and I issued the following command, hoping that it would solve my problem (it didn't):
Code:
find . -name "*.zip" -execdir unzip '{}' ; -delete
What happened was that find extracted the files as intended, but then deleted not only the zip archives but also the newly extracted files. It only affected a portion of the files, which I had already backed up. I don't have enough space on my hard drive to extract all the files without simultaneously removing data, and I don't have the energy to go through all the directories one by one manually. Surely there must be a better way. Is there a way to fix the snippet above so that it works as I intended, or is there a better way of doing this?
View 4 Replies
View Related
Aug 4, 2010
There is this directory with a lot of .deb files in it. /var/cache/apt/archives
Can I delete all the files in that directory? Is it safe? They are temporary files, right?
View 5 Replies
View Related
Dec 4, 2010
I have, on my PC (Ubuntu 10.10 32-bit), a folder with about 10,000 files. It is a Samba shared folder, and until now I could browse those files quickly and easily from an old (Pentium 4, 1.5GHz) PC running WinXP. I installed Ubuntu 10.10 on it, but browsing is now far too slow, even though the old PC's overall performance is considerably faster than before (when I had WinXP). Also, since the old PC's speed is otherwise satisfactory, would it become even faster if I installed an older Ubuntu distro?
View 4 Replies
View Related
Jul 31, 2011
Each time I try to open a password-protected archive with Ark, it just shows a "Loading file" progress bar and nothing happens, even after several minutes. If I just right-click and select "Extract", it extracts nothing. It never asks for the password at any point. I also checked the help by typing "man ark" in the console, but found nothing related to passwords. Has anyone successfully opened password-protected archives with Ark, or is it still buggy?
And on a side note, I can open normal .rar files but cannot compress files. It says "Failed to locate program rar in the PATH". I thought Ark was all self-contained. Do I need to install something else?
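That PATH message is the clue: Ark is only a front-end and shells out to command-line archivers for each format. Creating .rar files requires the proprietary rar tool (extraction uses unrar); a sketch:

```shell
sudo apt-get install unrar rar   # rar lives in the non-free "multiverse" section
```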
View 8 Replies
View Related
Jul 20, 2010
How can I group files and create archives accordingly? I have 10,000 files in a folder (no sub-folders) and I want to create 10 zip or tar.gz archives. This means every archive has 1,000 files. How can I do this in Linux?
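One approach: dump the file names to a list, cut the list into 1,000-name chunks with split, and feed each chunk to tar via -T. A runnable sketch on a 20-file demo folder (assumption: no newlines in filenames; with 10,000 files the very same commands produce 10 archives):

```shell
# Demo folder standing in for the real one
mkdir -p bigdir
for i in $(seq 1 20); do echo "$i" > "bigdir/file$i"; done

cd bigdir
ls > ../filelist
split -l 1000 ../filelist ../chunk_    # 1,000 names per chunk file
n=0
for list in ../chunk_*; do
  tar czf "../archive_$n.tar.gz" -T "$list"   # one archive per chunk
  n=$((n+1))
done
cd ..
```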
View 1 Replies
View Related
Mar 28, 2011
Someone has sent me a zip file containing some fonts. I extracted it using unzip under Linux, and there are empty files in the top level of the archive, and some files similarly named but beginning with ._ in a __MACOSX subdirectory.
I understand that the __MACOSX contents should be metadata, and normally I'd delete it. In this case, however, all of the data seems to be in there! Is there a tool that I can use to reassemble the original data?
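A hedged suggestion: the ._ files are AppleDouble containers holding each file's resource fork, and classic Mac fonts often live entirely in the resource fork, which would explain why the top-level files look empty. The fondu utility (package "fondu") is designed to pull fonts out of Mac resource data and may be able to convert them; the filename below is hypothetical:

```shell
cd __MACOSX
fondu ._SomeFont.suit   # emits any fonts it can recover from the resource fork
```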
View 1 Replies
View Related