Ubuntu :: Deleting Cached Archives?
May 11, 2010

Would it cause any issues to delete the cached archives from the installed packages? I have almost a gig of space being taken up and would like to get the space back.
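For reference, the standard way to reclaim that space is apt's own cache-cleaning commands; a minimal sketch (the cache only holds downloaded installer files, so removing them does not touch installed software):

Code:
sudo apt-get clean      # delete every downloaded .deb from /var/cache/apt/archives
sudo apt-get autoclean  # or: delete only packages that can no longer be downloaded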
I have a large number of zip archives which I would like to extract and then delete, keeping only the extracted files. The files are located in a number of directories, and I issued the following command, hoping that it would solve my problem (which it didn't):
Code:
find . -name "*.zip" -execdir unzip '{}' ; -delete

What happened was that find continued to extract the files as intended, but then it deleted not only the zip archives but also the newly extracted files. It only affected a portion of the files, which I had already backed up. I don't have enough space on my hard drive to extract all the files without simultaneously removing data, and I don't have the energy to go through all the directories one by one manually. Surely there must be a better way. Is there a way to fix the code snippet above so that it works as I intended, or is there a better way of doing this?
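A safer pattern, sketched here as a suggestion rather than a definitive fix: the terminating semicolon has to be escaped from the shell, and deleting each archive only after unzip reports success avoids losing anything:

Code:
# extract each archive inside its own directory; remove it only if unzip succeeded
find . -name '*.zip' -execdir sh -c 'unzip -n "$1" && rm -- "$1"' sh '{}' \;

The -n flag tells unzip never to overwrite existing files, so a rerun is harmless.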
I've found a website [URL] that I've downloaded Basilisk II [URL] and SheepShaver [URL] from. I converted both into deb archives using the command sudo alien, and while they both converted and installed successfully and show up in the Applications menu, only Basilisk II launches. When I click on the SheepShaver icon, nothing happens. I had this problem with another SheepShaver RPM I downloaded elsewhere. I really want to use SheepShaver to run Mac OS 9 on my laptop; is there something I can do to get it to run?
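A common first diagnostic is to find out what the converted package actually installed and then launch the binary from a terminal, where any error output becomes visible; the package and binary names below are assumptions, so check them first:

Code:
dpkg -l | grep -i sheep    # find the real package name
dpkg -L sheepshaver        # list the files it installed (name is an assumption)
SheepShaver                # run the binary by hand to see error messages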
I have heavy swapping going on; top and free indicate a lot of free memory in cached form. Why does the kernel not use this memory instead of killing my desktop by swapping like crazy?
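One knob commonly pointed to in this situation is vm.swappiness, which biases the kernel between reclaiming cache and swapping out application pages; a sketch:

Code:
cat /proc/sys/vm/swappiness     # Ubuntu's default is 60
sudo sysctl vm.swappiness=10    # prefer dropping cache over swapping (immediate)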
Apparently, Ubuntu (and Xubuntu) keep copies of all thumbnails ever loaded, cached in ~/.thumbnails. To me, this is creepy and can be bad from a security standpoint. Is there any way to have all thumbnails generated on the fly, but never cached on disk? I tried symlinking ~/.thumbnails to /dev/null, but this disabled thumbnails entirely.
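One workaround, sketched under the assumption that keeping thumbnails in RAM only (and losing them at every reboot) is acceptable:

Code:
# mount a small RAM-backed filesystem over ~/.thumbnails: thumbnails still work,
# nothing is written to disk, and the cache vanishes at shutdown
sudo mount -t tmpfs -o size=64m,mode=700,uid=$(id -u),gid=$(id -g) tmpfs ~/.thumbnails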
I am experiencing severe DNS delays today, so I tried to install DNSMASQ by following the instructions given on several pages on the web to make it function as a local DNS cache.
The installation was successful, and after editing the configuration files as instructed, I now have a working DNS cache on my computer.
However, it seems that the addresses are not cached for long; for a given domain, the speedup lasts only a few minutes. If I try to access a previously visited domain again after several minutes, a new (slow) external lookup is made.
Since websites' IP addresses do not change every five minutes, is there a way to tell DNSMASQ to keep the IPs in the cache for much longer (several hours at least)?
This is very important for me because the DNS lag that I am experiencing makes external lookups last 10 to 20 seconds.
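Recent dnsmasq versions have an option aimed at exactly this; a sketch, assuming a version new enough to support it (dnsmasq caps the value at one hour unless recompiled, so several hours is not reachable with a stock build):

Code:
# /etc/dnsmasq.conf
min-cache-ttl=3600   # ignore upstream TTLs shorter than one hour
cache-size=1000      # keep more entries cached (the default is 150)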
I have an LDAP server holding user/pass/group information for many users. Due to network issues, the server is sometimes unreachable and clients cannot log in; current sessions usually freeze after a while. All clients run Ubuntu 10.04.2 x64.
I went through the outdated howto to cache the LDAP credentials.
I set up the required packages,
added a daily cron job running "nss_updatedb ldap",
and edited '/etc/nsswitch.conf' to have "files ldap [NOTFOUND=return] db" for both passwd and group.
[Code]....
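For reference, the package set usually named for this kind of offline-capable setup on Ubuntu (a sketch; package names as of 10.04):

Code:
# nss-updatedb mirrors LDAP users/groups into a local db (read via libnss-db);
# libpam-ccreds caches password hashes so logins still work while LDAP is down
sudo apt-get install libpam-ccreds nss-updatedb libnss-db
sudo nss_updatedb ldap     # run once by hand, then daily from cron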
I am asking about my iPhone 3GS here because it runs Linux too; I think it is Debian-based. Answer if you happen to know. I want to keep the cache downloaded from Cydia for reinstalling later.
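Cydia is built on top of APT, so the familiar cache layout should apply; a sketch, assuming SSH access to the jailbroken device (the path and address are assumptions and may vary by iOS/Cydia version):

Code:
mkdir -p ./cydia-debs
ssh root@<iphone-ip> 'ls /var/cache/apt/archives/'    # check what is cached
scp 'root@<iphone-ip>:/var/cache/apt/archives/*.deb' ./cydia-debs/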
I have 2 questions:
1. How can I set the TTL to 2 days for all stuff cached by dnsmasq?
2. When I go to [URL] from my browser and then execute [URL], the query time returned is > 0 ms. When I execute it again it is 0, as it should be. So is dnsmasq not caching the domains looked up by my browser, or what? In /etc/resolv.conf I have only 127.0.0.1, and the upstream servers are listed in a different file that is only used by dnsmasq.
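One way to take the browser out of the equation and confirm whether dnsmasq itself caches correctly (example.com is a placeholder for any domain):

Code:
dig example.com @127.0.0.1   # first query: forwarded upstream, time > 0 ms
dig example.com @127.0.0.1   # repeat: should come from cache in ~0 ms

As for question 1: dnsmasq honours upstream TTLs, and its min-cache-ttl override (where available) is capped at one hour, so a two-day TTL is not possible with a stock build.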
I've been banging my head on this for a week... I finally got AD login working, but I can't get cached logins working. I installed SADMS and let it configure everything, and though I can now log in, I still cannot log in with my AD username when my machine is not connected to the AD network. I need to be able to log in at home, connect to the VPN (if I can ever get that working), then sign on to services at work using my AD username.
Also, I cannot log in to local accounts when the system is not connected to the AD network. Plus, home drive mapping is not working; our shares are \\FILESERVER\user\username, so this does not work. UPDATE: I installed likewise-open, and now I can't log in unless I use the full domain name when logging in via ssh, but I cannot log in on the desktop, which is not what I want. Now my username doesn't match the previous UID mapping, and my home directory is mapped to /home/likewise-open/DOMAIN/user instead of /home/DOMAIN/user, like it was before.
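For comparison, the winbind-based route to cached AD logins is normally enabled with the settings below; this is a sketch of the commonly cited options, not a drop-in fix for the SADMS/likewise-open setup described above:

Code:
# /etc/samba/smb.conf
[global]
   winbind offline logon = yes

# /etc/security/pam_winbind.conf
[global]
cached_login = yes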
I've set cairo-dock to start automatically, but KDE also somehow caches the desktop state while shutting down. Can I prevent cairo-dock from being cached at shutdown in KDE?
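KDE's session manager keeps an exclusion list for exactly this case (also reachable through System Settings > Session Management); a sketch for KDE4's config location:

Code:
# ~/.kde/share/config/ksmserverrc
[General]
excludeApps=cairo-dock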
I'm using openSUSE 11.2/KDE 4.4, and Konqueror has been behaving strangely lately. When I click an entry in the RSS Now widget, Konqueror opens, but instead of going to the web it displays the cached version of the page (i.e. from /var/tmp/kdecache-[username]/krun/...). Initially I thought it was related to the widget (RSS Now), but the same thing happened when Konqueror was called by another program (Google Desktop > preferences): it opened the cached version of the page instead of what I expected (localhost on some port). As I see it, there seems to be a setting making Konqueror/KDE preload some pages (I guess it's related to a KDE service). The problem is that it subsequently displays the cache, not the online version.
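One low-risk experiment is to clear that krun cache and see whether the behaviour resets; the path follows the pattern reported above:

Code:
rm -rf /var/tmp/kdecache-$USER/krun   # recreated on demand by KDE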
Closest analogy I can compare what I want to is the `sync` command, which writes out all the disk buffers, freeing them. Instead of disk buffers, I want to 'clean out' my RAM and swap of any and all junk that has accumulated over the time my PC has been up. I've long wondered about this but never asked, though I recall searching around several times.

When I first boot cold and log in, the memory usage bar on my desktop is near zero and the swap is empty. But after two or three weeks or more of uptime, with Firefox always running with a dozen or so tabs at any given time, I end up with all the memory full or 'filled with cached stuff', and the swap space filled to capacity.

Curiosity: I blame Firefox for leaking memory, but even if that's still the case today (historically it was), can this all be blamed on Firefox? Or what else causes this, besides Firefox: just... everything?
Here are the current stats:
Code:
sasha@reactor: uptime
21:21:42 up 30 days, 10:07, 3 users, load average: 0.02, 0.05, 0.01
sasha@reactor: free
total used free shared buffers cached
[code]...
So 3.8 of 4 GiB of RAM is occupied, and the 1 GiB swap space is 100% full. This must slow things down to some degree, yes? I mean, the kernel does have to keep track of all this, right? Of course, closing all the applications doesn't make a difference (not an appreciable one anyhow), and the only way I have found to start fresh is to reboot.
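For the record, both halves of this can be flushed by hand; a sketch of the usual incantation (harmless in the sense that the kernel simply repopulates caches as needed, but swapoff needs enough free RAM to absorb everything currently in swap):

Code:
sync                                         # write out dirty buffers first
echo 3 | sudo tee /proc/sys/vm/drop_caches   # drop page cache, dentries and inodes
sudo swapoff -a && sudo swapon -a            # force everything back out of swap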
When we enter a folder, it takes some time to load, depending on the number of entries in the folder: more entries take more time to load, fewer entries correspondingly less. The delay in loading the folder varies due to reading the folder entries in advance. So what I want to know is: what is the maximum number of entries read in advance while opening a folder in Linux, and how can we calculate this?
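One way to observe how much of that delay is disk access versus cache (the directory path is a placeholder):

Code:
echo 3 | sudo tee /proc/sys/vm/drop_caches   # start with cold caches
time ls -f /path/to/big/dir > /dev/null      # cold run: entries read from disk
time ls -f /path/to/big/dir > /dev/null      # warm run: served from the dentry cache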
So every time I try to use the "yum" command, for some reason it doesn't go past this point: "loading mirror speeds from cached host files". I cleaned up the cache and rebuilt the OS again, and I'm still getting this problem. Would this be a problem with my internet connection? I'm using CentOS 5.5.
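That message comes from yum's fastestmirror plugin, so one quick test is to take the plugin out of the picture:

Code:
sudo yum clean all                                   # discard cached mirror data
sudo yum --disableplugin=fastestmirror check-update  # retry without the plugin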
I have downloaded a petri net tool, petrify.tar.gz, and extracted it using Archive Manager, but I am unable to install it. How can I install it? Are there any commands or instructions for installing it?
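The generic recipe for a source tarball, sketched under the assumption that this one follows the common layout (always check its README first; some tools ship prebuilt binaries and need no compiling at all):

Code:
tar -zxvf petrify.tar.gz
cd petrify*        # the extracted directory name is an assumption
cat README         # the shipped instructions override everything below
./configure && make && sudo make install   # only if a configure script exists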
I cannot download any package from the Greek Ubuntu server, and a friend of mine has the same problem. Is there a problem with these servers? The error is 404, not found, which is something very important.
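If the regional mirror itself is broken, pointing apt at the main archive is a quick workaround; a sketch (gr.archive.ubuntu.com is the usual hostname for the Greek mirror):

Code:
sudo sed -i 's|gr.archive.ubuntu.com|archive.ubuntu.com|g' /etc/apt/sources.list
sudo apt-get update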
When I try to install anything with yum, it gives the following error.
Loaded plugins: fastestmirror
Loading mirror speeds from cached hostfile
[code]...
What can be the problem? I think it is related to the proxy, but I am unable to resolve it.
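If a proxy really is in the way, yum has to be told about it explicitly; a sketch (host, port and credentials are placeholders):

Code:
# /etc/yum.conf, under the [main] section
proxy=http://proxy.example.com:3128
proxy_username=youruser
proxy_password=yourpass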
A few releases ago things used to work this way, but since Jaunty it no longer seems to:
This used to work: I have several Ubuntu machines to upgrade and limited (and highly expensive) bandwidth available. In the past I would upgrade/update one machine and copy the contents of /var/cache/apt/archives to the other machines to be upgraded. When updating/upgrading the distro, the packages that already existed in /var/cache/apt/archives would not be downloaded from the Internet; the locally cached .deb files were used instead, saving time and bandwidth.
The Problem: I have noticed that since 9.10 (and with 10.04) this no longer appears to work. While the .deb packages may exist in /var/cache/apt/archives, the same packages are downloaded from the Internet again, even though identical .deb files already sit in /var/cache/apt/archives/.
Question: What is wrong, and how can I restore the ability to copy cached packages from machine to machine in order to save bandwidth?
Why I need this to work: At work (and I have no control over this...) we are limited to a total bandwidth of 3 GB per month for multiple users. Needless to say, a single upgrade of a recent Ubuntu distro can consume a large proportion of our monthly bandwidth. Upgrading multiple machines is absolutely out of the question if the packages have to be downloaded every time.
An apt-cache server is not an option: a local apt-cache server is not feasible due to the same 3 GB bandwidth constraint. An apt-cache server requires 15 GB of storage per version of Ubuntu, and downloading 15 GB worth of packages to fill that storage is not an option under the 3 GB limitation.
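For what it's worth, one frequent cause of this symptom is version skew rather than changed behaviour: apt only reuses a cached .deb whose version and checksum match the current package lists, so archives copied from a machine updated weeks earlier may no longer match what the mirrors now offer. A sketch of a copy procedure that lets apt reuse whatever still matches:

Code:
sudo cp /media/usb/archives/*.deb /var/cache/apt/archives/   # source path is a placeholder
sudo chown root:root /var/cache/apt/archives/*.deb
sudo apt-get update        # apt verifies each cached .deb against the package lists
sudo apt-get dist-upgrade  # and downloads only the ones that do not match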
My /var/cache/apt/archives directory has almost 9000 items and is over 12 GB in size. All it contains is a bunch of .deb files. Do I need them, or can I delete them to save hard drive space?
I really want to free up disk space in my Ubuntu install, so I opened the disk analyzer and saw that /var/cache/apt/archives takes up about 1.2 GB of my disk. Can I delete those packages to free up disk space? The packages there are *.deb files; when I click on some of them, Ubuntu Software Center opens and describes the newer upgrade as installed.
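To see how much space actually comes back (the same cleanup command as in the first question applies here):

Code:
du -sh /var/cache/apt/archives   # before
sudo apt-get clean
du -sh /var/cache/apt/archives   # after: only the lock file and partial/ remain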
I've never really used the command line to do such things, but I'd like to learn. So, how do I extract all archives that are spread across several directories in one go?
For instance:
Quote:
Work/
Work/Today/a.rar
Work/Tomorrow/b.rar
Work/Yesterday/c.rar
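For the layout shown, a find one-liner along these lines works (it needs the unrar package; -execdir runs the command inside each file's own directory, so every archive extracts in place):

Code:
find Work/ -name '*.rar' -execdir unrar x '{}' \;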
I receive the following error when trying to install packages using Software Center or Synaptic Package Manager; it was working fine until just the other day:

Failed to fetch 4.5-1_i386.deb Could not connect to my.archive.ubuntu.com:80 (203.106.62.8) - connect (111: Connection refused)

I'm using Ubuntu 10.10 (the Maverick Meerkat), released in October 2010.
I tried the following:
sudo apt-get update
sudo dpkg --configure -a
[code]....
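Connection refused (error 111) means the mirror machine answered but nothing was listening on port 80, so a sanity check plus a workaround is to test the regional mirror and, if it is down, swap it for the main archive:

Code:
nc -zv my.archive.ubuntu.com 80   # confirm whether the mirror accepts connections
sudo sed -i 's|my.archive.ubuntu.com|archive.ubuntu.com|g' /etc/apt/sources.list
sudo apt-get update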
I want to tar -zxvf several dozen .tar.gz archives. I only know how to tar -zxvf one archive at a time. If possible, I would like to tar -zxvf all the archives in one command so I don't have to type "tar -zxvf [filename]" several dozen times. What's the syntax? I looked at the man tar page but couldn't find it.
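tar itself only accepts one archive per run, so the shell has to supply the loop; a sketch for the current directory and, alternatively, for a whole tree:

Code:
for f in *.tar.gz; do tar -zxvf "$f"; done          # every archive in this directory
find . -name '*.tar.gz' -execdir tar -zxvf '{}' \;  # recursive, extracting in place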
Are there any archiving tools other than tar that preserve Linux file permissions and user/group ownership? I want to create archives of a directory tree on my Ubuntu box. The directory trees are large; a first run with tar cvpzf archive.tgz /home/foo/bar yielded a 5 GB archive. I also want to keep all permissions, other flags, and special files. I'm fine with a 5 GB archive; however, to look inside that archive (since it is a compressed tar archive) the whole 5 GB has to be decompressed first! (Or so it appears when opening it with the archive viewer; I'm happy to be corrected.) So I need a way to "backup" a directory tree while preserving full filesystem attributes and rights, creating an archive with an "index" that doesn't have to be decompressed for browsing its contents. An archive is either a single file or a (small) set of files that carries full and complete information within itself. That is, it can live on any filesystem (size permitting), it can be burnt onto a DVD, you can split it (after which the point of this question is really lost, but still), ...
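A minimal workaround with tar itself is to save the listing at creation time, so browsing never has to touch the archive; separately, dar (Disk ARchive) is often suggested for this use case, since it preserves permissions and stores an internal catalogue that can be browsed without decompressing the data:

Code:
# create the archive and capture the file list as a browsable table of contents
tar cvpzf archive.tgz /home/foo/bar > archive.toc
less archive.toc   # instant browsing, no decompression needed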
I have a wonderfully working Kubuntu Karmic 9.10 box and love it. Now I've just done another install of 9.10 on an old box that doesn't have any network card. The old thing is running pretty well, but I want it set up like the one I'm on now, so I thought I'd try AptOnCD. The trouble is that I have done apt-get clean a few times since setting this box up, so what's left in /var/cache/apt/archives is a very incomplete set of packages. I'm trying to find a way to reload /var/cache/apt/archives with all installed packages.
[Code]...
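apt can re-download every installed package into the cache without touching the installed system; a sketch (note this fetches every package again, so it is bandwidth-heavy, and the odd package may fail and can be skipped):

Code:
# list installed packages and pull each .deb back into /var/cache/apt/archives
dpkg --get-selections | awk '$2 == "install" {print $1}' \
  | xargs sudo apt-get install --reinstall --download-only -y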
I have noticed that when I right-click on many file types, for instance ISO files, I don't have the option to compress them at all. I have one ISO file now that I want to compress and split into three parts so I could upload it, but as I said, right-clicking on the file doesn't help because I don't have the option there. What can I do?
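From a terminal this doesn't depend on the right-click menu at all; two sketches (the 700m volume size is a placeholder, and split -n needs a reasonably recent coreutils):

Code:
zip -s 700m backup.zip file.iso    # option 1: zip split into fixed-size volumes
gzip -c file.iso > file.iso.gz     # option 2: compress, then cut into 3 pieces
split -n 3 file.iso.gz file.iso.gz.part
# reassemble later with: cat file.iso.gz.part* | gunzip > file.iso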
I ran into some complications with my Ubuntu 10.10 install and couldn't boot into the desktop. I decided to use a live CD to get access to my downloaded cached packages and copied them to my Windows partition. Now I've re-installed Ubuntu and manually restored the packages into the /archives/ directory. I've installed a lot of apps from the directory already, but AptOnCD refuses to properly reflect the apps I have there. Is there something I'm overlooking?
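One thing worth ruling out, though it is only a guess at the cause: files that pass through a Windows partition lose their Unix ownership and permissions, and tools scanning the cache may skip entries that look wrong; normalizing them is cheap:

Code:
sudo chown root:root /var/cache/apt/archives/*.deb
sudo chmod 644 /var/cache/apt/archives/*.deb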
There is this directory with a lot of .deb files in it: /var/cache/apt/archives
Can I delete all these files in that directory? Is it safe? They are temporary files, right?
Code...
Why does it come like this?
Should I make any changes in /etc/apt/sources.list?