I am helping a friend start with Ubuntu and he doesn't have as fast an Internet connection as I do. I was wondering how I could easily download all the deb packages for the software I want to install for him. It seems doing:
sudo apt-get install -d --reinstall <package>
will download the package for me, but it doesn't fetch the dependencies, because they are already installed on my machine... is there a way to get apt-get to fetch the dependencies as well?
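One workaround, sketched below with nfs-common as a stand-in package: ask apt-cache for the direct dependencies and pass them to the same download command, since --reinstall forces apt to fetch packages it considers already installed. Note that apt-cache depends only lists direct dependencies, so deeper trees need another pass.

```shell
# Sketch, assuming "nfs-common" as the target package.
# List its direct dependencies, dropping virtual packages (shown as <foo>):
deps=$(apt-cache depends nfs-common \
        | awk '/Depends:/ { print $2 }' | grep -v '^<')

# -d (--download-only) fetches without installing; --reinstall makes apt
# fetch packages even though they are already installed locally.
sudo apt-get -d install --reinstall nfs-common $deps

# The fetched .debs end up in /var/cache/apt/archives/, ready to copy.
ls /var/cache/apt/archives/*.deb
```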
I know I can build a local repository, but I'd like to try just moving the appropriate .deb files. My problem is not knowing which files I need and in what order. Example: I want to install nfs-common.
Doing apt-get install nfs-common does it all for me when I'm online. So I looked in /var/cache/apt/archives to see what was installed. I found two nfs files: nfs-common_1.2.0-4ubuntu4.1_amd64.deb and nfs-kernel-server_1.2.0-4ubuntu4.1_amd64.deb.
But when I tried to install those on another machine, I found I was missing additional files: libgssglue1_0.1-4_amd64.deb, libnfsidmap2_0.23-2_amd64.deb, librpcsecgss3_0.19-2_amd64.deb and portmap_6.0.0-1ubuntu2.1_amd64.deb.
For future installations, how do I find all the dependencies and the ORDER they need to be installed in, so I can write my own script and install them on a machine that is offline?
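A sketch of the two halves of that job, assuming nfs-common as the example and that the separate apt-rdepends helper package is available:

```shell
# Flatten the full recursive dependency tree into a plain package list.
# apt-rdepends prints package names unindented and "Depends:" lines
# indented, so keep only the unindented lines:
apt-rdepends nfs-common | grep -v '^ ' > pkg-list.txt

# Install order is rarely a problem in practice: hand dpkg ALL the .debs
# at once and it works out a valid configuration order by itself.
sudo dpkg -i *.deb
```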
Is there a way to get the URLs of the packages that have been updated and then download them on another computer? Like this feature of Ubuntu: "HOWTO: Download package dependencies for offline installation - Ubuntu Forums".
It's a simple feature and it's present in Smart and Synaptic, yet it's not in YaST (or I haven't found it yet).
I would use the Smart package manager, but on my home connection YaST is better for checking for updates (Smart downloads filelist.xml.gz, which is far bigger than what YaST downloads, though it does let Smart show a package's file list BEFORE installing). So at home I can check for package updates with YaST, but downloading them is very hard. My connection is very bad (I live in Iran) and the YaST mirrors are NOT the best of servers, so YaST gets interrupted in the middle of downloading an RPM and the whole process sits waiting for me to press retry, so I can't run updates and installs overnight. By the way, is there some way to tell it to always retry, or to retry a set number of times, automatically?
I need the URLs of the RPMs so I can download them separately and install them.
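As a sketch of the retry side of this (urls.txt is a placeholder for however you collect the RPM links), wget can resume and retry on its own:

```shell
# -c            resume a partially downloaded file instead of restarting
# -t 0          retry indefinitely (use e.g. -t 20 for a fixed count)
# --waitretry=30  back off up to 30 seconds between retries
# -i urls.txt   read the URLs to fetch, one per line, from a file
wget -c -t 0 --waitretry=30 -i urls.txt

# Then install the downloaded RPMs on the offline machine:
sudo rpm -Uvh *.rpm
```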
I am about to lose my internet soon, and I am not sure for how long, but I am curious: can I go to another computer that has internet, download updates for my computer, take them back to my computer and install them, so I can stay up to date?
I am away for two weeks in an Internet-free zone, unless I can get it back on. I would like to download the wiki, if possible, to browse and try new things. I can update my computer, but will need to take it to a friend's to connect.
I use Firefox 3.0.7 and GNU Wget 1.11.4. I have a question about downloading web pages. If I download with "Web Page, Complete", will I be able to open the pages without being online, or will there be some pages that I will still need to log in for? If a web browser is not sufficient, is there some command I can use with Wget to accomplish this?
For some reason it seems to be downloading too much and taking forever for a small website. It seems it was following a lot of the external links that page linked to.
It downloaded too little. How much depth should I use with -r? I just want to download a bunch of recipes for offline viewing while staying in a Greek mountain village. Also, I don't want to be a prick and keep experimenting on people's web pages.
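A polite, bounded mirror might look like the sketch below; the site URL is a placeholder and the depth is a starting guess to tune:

```shell
# -r -l 2   recurse, but only 2 levels deep (raise -l if pages are missed)
# -np       "no parent": never climb above the /recipes/ directory
# -k        rewrite links in saved pages so they work offline
# -p        also fetch the images/CSS each page needs to display
# -w 1      wait 1 second between requests, to be kind to the server
# During recursion wget stays on the starting host by default, so
# external links are not followed.
wget -r -l 2 -np -k -p -w 1 http://example.com/recipes/
```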
I need to download emails via IMAP in Thunderbird. However, I want the downloaded emails to be accessible from other computers offline. How can I download them so that the downloaded copy is readable by Thunderbird on any computer?
I have used apt-cacher successfully for a long time on my home LAN. It's been great, but after a recent server crash I decided to switch to apt-cacher-ng due to reports of a lower memory footprint and faster response. That's all well and good, and after just a little mucking around it seems to be working.
But there are a few old (but still current) debs that I am having trouble importing into the apt-cacher-ng cache. Most of them were picked up after adding all the repos to my client and running apt-get update, but a few still aren't (the two biggest ones are from the same repo). My client machine runs apt-get update without fault, but when I import (using the apt-cacher-ng web interface) it lists all the repos (including the one these debs should be in) yet doesn't import any more of the debs. There are no errors showing.
I've been compiling stuff with the usual ccmake ., make and make install for a while now, but now I'm starting to compile suites of applications, such as Calligra (KOffice). Is there any easy way to have the compilation spit out several .debs for the various components?
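Proper Debian packaging would use dpkg-buildpackage with a debian/ directory, but for locally compiled software a quick alternative is checkinstall, which wraps make install and emits one .deb per run, so a suite means one run per component. The names and paths below are illustrative, not taken from Calligra's real build layout:

```shell
# Build one component as usual...
cd calligra/build
cmake ..        # or ccmake .. for the interactive configurator
make

# ...then let checkinstall run "make install" and package the result
# as a .deb instead of scattering files over the system:
sudo checkinstall --pkgname=calligra-local --pkgversion=1.0 make install

# The resulting .deb can be carried to another machine and installed
# with: sudo dpkg -i calligra-local_1.0-1_amd64.deb
```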
I'm running Lucid, and I want GIMP, because it's what I used when I was running Windows. But I'm not sure what I need as far as the .debs go, or the order to install them in, as I'm sure there are things that have to be installed before others, and I have no internet at my home, which is why I'm doing things the hard way.
I'm in a bit of a rush so thought I'd ask two questions in one thread here.
1. Will a .deb made for Ubuntu likely run into much trouble if I run it on Lenny? 2. Does the official Nvidia proprietary driver come by default with a full 5-DVD install of x64 Debian 5.0.4, or do I need to install it myself? If so, is it enabled by default, or do I need to enable it? How?
I have a folder that contains about 2.5 GB of debs, ranging from VLC, GStreamer, Jokosher and Audacity etc. I backed those programs up with dpkg-repack, and now I want to restore them, but sudo dpkg -i *.deb breaks my system. I wonder if there is a way to install them while skipping those whose dependencies are not met.
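A rough sketch of such a filter; this is not a real dependency solver, and the folder path is an example:

```shell
cd ~/deb-backup
for pkg in *.deb; do
    # --dry-run with -i makes dpkg report problems (including unmet
    # dependencies) without changing the system at all.
    if sudo dpkg --dry-run -i "$pkg" >/dev/null 2>&1; then
        sudo dpkg -i "$pkg"
    else
        echo "skipping $pkg (unmet dependencies?)"
    fi
done
# Afterwards, "sudo apt-get -f install" can repair anything left
# half-configured.
```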
Tell me some way to upgrade from Ubuntu 10.04 LTS to 10.10, but in offline mode. Is there any package, ISO image or CD with which I can upgrade my offline PC?
I was able to do this before, but I can't remember where I found the link. I think it was a script or something. It would get all the files that the .deb needs and make an installer.
P.S. I have Ubuntu 64-bit; the PC that needs the install is 32-bit. Both are 10.04.
Does anyone have information about a collective repository for Ubuntu packages that provides a single download for all packages, or for a selection of them, maybe via some sort of torrent link? I know of packages.ubuntu.com, but there the packages are dispersed, and downloading each one of them takes a huge amount of time. Does anyone know where a one-click download of packages is available?
I have Ubuntu 10.10 installed on my offline PC. How do I get an updated repository index for Synaptic using an online PC elsewhere? I need this so that I can generate package download scripts to be used elsewhere to fetch software.
I installed Ubuntu normally before, but due to some problems I uninstalled it. I want to install using Wubi this time, but I cannot install while online. I don't want to install the normal way because the last time I uninstalled it, Windows could not boot. So how do I install Wubi while offline?
How do I install WinFF offline in Ubuntu 9.04, in a detailed way? Also tell me where I have to download WinFF from. Before this I installed winff_1.2.0-1~ppa1l_i386, but WinFF does not open, warning that no ffmpeg was found. Therefore I request you to please explain offline WinFF installation in Ubuntu 9.04.
I wish to install Debian on a number of boxes and have settled on a network install. I'll first do a minimal install using the network install ISO on a USB stick, then reboot and complete the installation using a local caching repo (apt-cacher) on the LAN. As a way of further minimizing bandwidth usage, I wonder if I could extract the .debs from a full installation CD and use them to populate the local repo?
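A sketch of harvesting the CD pool, assuming apt-cacher's usual import directory and mount point (check your own configuration):

```shell
# Mount the install CD/DVD and copy every .deb from its package pool
# into apt-cacher's import area:
sudo mount /dev/cdrom /media/cdrom
sudo find /media/cdrom/pool -name '*.deb' \
     -exec cp -v {} /var/cache/apt-cacher/import/ \;

# Then trigger the import from apt-cacher's web/report page so the
# files are checksummed and filed under the correct repository paths.
```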
Is there any easy way to do offline package upgrades in Ubuntu? I was using Debian's repository for the longest time to get individual packages, then found Launchpad. Is there a script or something that will tell you what the dependencies are and then let you copy them to a thumb drive or something?
I know online upgrades are great, but there are some cases where online isn't an option. Here's an example: getting Wine. There used to be a repository of .debs on the Wine website, but now I can't find it. Launchpad has it, but it's all individual files.
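One sketch that covers this workflow, with wine as the example package: have apt print the URLs it would fetch (without downloading anything), grab them on a connected machine, and install the lot with dpkg:

```shell
# On a machine with the same release and architecture: --print-uris
# lists the download URLs apt would use, one quoted URL per line.
apt-get install --print-uris -qq wine \
    | cut -d"'" -f2 > wine-urls.txt

# On the connected machine: fetch the list (resumable with -c).
wget -c -i wine-urls.txt

# Back on the offline machine: install everything at once; given all
# the .debs together, dpkg sorts out the configuration order itself.
sudo dpkg -i *.deb
```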
I installed Ubuntu 10.04 on a RAID 0 array. The install went fine, but when I shut down my machine and boot it back up, it shows no RAID volumes defined and my hard drives as offline, then the disk boot error message. I am new to Ubuntu and I am sure the solution is simple; I just need to know what to do.
I'm brand-new to Linux and Ubuntu, and I need to install Wine. I'm installing it on an offline computer (I'm using Windows on the online PC), and I need the extra libraries required for a proper Wine installation. The easiest way I've seen is to download and install the libraries over the internet through the terminal, but I can't, because the computer's offline.
How do I force recompilation of the kernel .deb packages after a small change I make to the sources, without having to clean the sources and recompile the whole kernel again?
Code:
$ fakeroot make -f debian/rules.gen setup_i386_none_686
$ fakeroot make -f debian/rules.gen binary-arch_i386_none_686 binary-indep
Calling the second command again does not recompile the modified code, just recreates the .deb packages.
If I use:
Code:
$ make -f debian/rules clean
before the build command, then it will recompile everything, which takes ages.
How can I force recompilation of the files/objects I changed (and dependencies)?
I use this guide: [URL] .... ('Building only a single kernel variant' section)
I'm using Debian 5.0.3 Lenny with GNOME. My machine, She-Beast, is a 2001 Compaq Presorry-o with a 1.1 GHz Celery. With every, that is, E V E R Y, attempt to install any .deb, it fails. When I double-click the .deb icon, or when I right-click and choose Open with Archive Manager, I get a dialog window: "Could not open "(filename).deb". Archive type not supported." If I use a root terminal and dpkg -i, similar responses come up.
Here at home I have several Ubuntu installations: mine, the kids' computers and a couple of laptops. What I'm looking for is a solution, or a pointer in the right direction, for setting up a sort of cache on our local Ubuntu server. Each day, each Ubuntu machine on the network checks for updates, downloads them and installs them. What I want is a way for one machine to download an update and for the others to then fetch it from the local resource.
A sort of local cache, to try to minimise everyone downloading pretty much the same updates straight from the net. I set up an emerge cache many years ago when I was using Gentoo, so I'm wondering what I can use or do here with Ubuntu, as we are all loving this distro now.
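One common way to get exactly this is apt-cacher-ng, mentioned earlier in the thread. A minimal sketch, assuming the server is reachable on the LAN as server.lan:

```shell
# On the server: apt-cacher-ng listens on port 3142 by default.
sudo apt-get install apt-cacher-ng

# On each client: point apt at the cache with a one-line config file.
echo 'Acquire::http::Proxy "http://server.lan:3142";' \
    | sudo tee /etc/apt/apt.conf.d/01proxy

# From now on, every apt-get update/upgrade goes through the cache, so
# each .deb is fetched from the internet only once for the whole LAN.
```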
For testing I made a Debian 8 DVD 1 installation on a USB stick. I selected the GNOME and LXDE desktop GUIs. Both work. An Ethernet 100 Mb cable connection works. I can open Iceweasel and surf websites.
If I open the Synaptic package manager and select VLC for installation, or press "Mark all upgrades", I get this message: insert the disk "Debian GNU/Linux 8 Jessie Official DVD" in the drive.
Synaptic does not try to get packages from the internet.
Is a Debian 8 DVD 1 installation an offline installation? Will the package system not connect to the internet?
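If the installer recorded only the DVD as a package source, a sketch of switching apt to the network looks like this; the mirror URL is the stock one and may need adjusting for wherever jessie packages are currently hosted:

```shell
# Comment out the "deb cdrom:" line so apt stops asking for the disc:
sudo sed -i 's/^deb cdrom:/# deb cdrom:/' /etc/apt/sources.list

# Add a network mirror for jessie and refresh the package index:
echo 'deb http://ftp.debian.org/debian jessie main' \
    | sudo tee -a /etc/apt/sources.list
sudo apt-get update
```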
Administering offline Linux boxes can be a serious pain. The Debian flavours now have Keryx to make life easier. Keryx is a cross-platform application, which means one can fetch the dependencies from Windoze too. Is there any similar package for RPM/Fedora-based flavours? In the absence of a proper offline manager, I was also wondering if there is a way to collect the output of:
Code:
yum deplist <package>
... condense or sieve out the duplicate listings, and pipe that to a text file? One can copy the output and run
Code:
yum reinstall <paste them here> --downloadonly
and get all the required dependencies into the yum cache. If all that can be accommodated in one script... then that's pretty cool. I don't have the scripting know-how to dive into this.
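A minimal sketch of that script, assuming yum's --downloadonly support is installed and using gimp as a stand-in package. The deplist parsing is an assumption about the output format ("  provider: name.arch version" lines), so check it against your yum version:

```shell
#!/bin/sh
# List a package's dependencies with yum deplist, de-duplicate the
# providers, and feed them to a download-only yum run.
PKG=gimp

# Keep just the provider package names, strip the architecture suffix,
# and drop duplicate listings:
yum deplist "$PKG" \
    | awk '/provider:/ { print $2 }' \
    | sed 's/\.\(i686\|x86_64\|noarch\)$//' \
    | sort -u > "$PKG-deps.txt"

# Download (but do not install) everything on the list; the RPMs land
# in the yum cache under /var/cache/yum/.
sudo yum reinstall --downloadonly $(cat "$PKG-deps.txt")
```

Note that yum reinstall only applies to packages already installed; for packages not yet present, yum install --downloadonly is the matching form.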