OpenSUSE :: Disable Downloading From Multiple Servers In ZYpp?
Mar 14, 2011
Whenever I do anything in zypper, there is always a huge lag before it starts downloading a file, but then the download proceeds at normal speed. I think this is because it now tries to download from multiple servers, and I would like to turn that off. Where do I go for that?
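One place to look (a sketch; the option name is from libzypp's commented zypp.conf, so check the defaults shipped with your version): the multi-server behavior comes from libzypp's metalink/MultiCurl downloader, which can be reined in via zypp.conf.

```
## /etc/zypp/zypp.conf
## Limit the metalink downloader to a single connection:
download.max_concurrent_connections = 1
```

Setting ZYPP_MULTICURL=0 in the environment before running zypper is reported to disable the metalink code path entirely; treat both as things to test rather than guaranteed fixes.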
By default, when zypp does an update, it downloads a deltarpm and applies it immediately, during which time the internet connection is open but unused. Can I make zypp apply the deltarpms at the end, only after it has downloaded all of them? I'm used to this behavior from Fedora/yum, and it shortens the window during which I must keep my internet connection open.
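A sketch, assuming /etc/zypp/zypp.conf is in the usual place: the commit.downloadMode option controls when packages are fetched relative to installation. DownloadInAdvance does not defer the delta rebuild itself, but it makes zypp fetch (and rebuild) everything before installing anything, so the connection is only needed for the first phase.

```
## /etc/zypp/zypp.conf
## Fetch everything before installing anything:
commit.downloadMode = DownloadInAdvance
## Or skip deltarpms entirely, if rebuilding them is the bottleneck:
# download.use_deltarpm = false
```

Recent zypper versions also accept this per invocation, e.g. `zypper update --download-in-advance`.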
I want to download a reasonably large file from, say, SourceForge. The problem is that some mirrors top out at 40 kB/s, and I have been looking at options to increase this. I experimented with download managers, and axel and lftp's pget seem to work somewhat. Now I am wondering how I might download the same file from several servers at once (say, SourceForge's various mirrors). I tried axel with all the server addresses concatenated on the command line, but I'm not sure it is working. How do I verify that it is indeed using all the servers specified?
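One tool worth trying alongside axel is aria2, which accepts several URIs for the same file and shows per-connection progress, which answers the "is it really using all the mirrors?" question. The mirror URLs below are placeholders:

```shell
# Each URI must point at the same file; aria2c splits the download
# across them (one connection per server here, three pieces total):
aria2c --max-connection-per-server=1 --split=3 \
  http://mirror1.example.org/pub/big-file.tar.gz \
  http://mirror2.example.org/pub/big-file.tar.gz \
  http://mirror3.example.org/pub/big-file.tar.gz
```

While it runs, the status line lists every active connection and its server, so you can see directly whether all mirrors are in use.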
in KDE:Factory:Desktop you can find a new version of kupdateapplet. In the changes you can find the following:
- dropped zypp plugin (bnc: 590192) - V 0.9.11
I have no access to bug 590192 in Bugzilla, so can anybody tell me why the only really working backend was dropped? The PackageKit backend does not show updates from third-party repositories (Packman or OBS). Is there any way to get the working 0.9.10 back?
I am trying to create a local repository of the RPMs installed on my workstation. Using 'rpm -qa' I can list the installed RPMs. Is there a way to download all of them using that output? 'yum --downloadonly' gives me the option of downloading without installing. Since the installed RPMs are no longer cached on my workstation, I would have to download them again. With a local repo it would be easier to restore my installation without rediscovering which packages are needed. I am looking for a way to download multiple RPMs.
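A sketch of the re-download step, assuming a yum-based system where the downloadonly plugin is available; /srv/localrepo is an example path. The echo/sed line just demonstrates, on a sample line, how to reduce plain 'rpm -qa' output to bare package names ('rpm -qa --qf' does it directly):

```shell
# Reduce a full name-version-release.arch line to the package name
# (sample input shown):
echo 'gcc-c++-4.8.5-39.el7.x86_64' | sed 's/-[^-]*-[^-]*$//'   # gcc-c++

# The real run (network required; needs the yum-downloadonly plugin):
#   mkdir -p /srv/localrepo
#   rpm -qa --qf '%{NAME}\n' | xargs yum -y reinstall \
#       --downloadonly --downloaddir=/srv/localrepo
#   createrepo /srv/localrepo   # turn the directory into a repo
```

Once createrepo has run, point yum (or zypper) at /srv/localrepo as a plain local repository and a restore no longer touches the network.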
I'm looking at setting up a couple of automated systems. Here are a few examples:
* An internal accounting system to download and process emails
* A public web server for people to visit
I could put each system on its own separate box -- for example, it's generally good practice to separate anything external users have access to (such as a web server) from internal processes such as accounting. But rather than shelling out the money for two separate servers, could I get away with running each system in its own VMware virtual machine on the same box?
To give you an idea, these are not large-scale, computationally intensive systems. The accounting one simply downloads and tallies emails, and the latter is just a web server with maybe 5 hits per day on a good day. I could certainly pick up a new box for, say, $50, but I wanted to know the general practice: VMware on one box versus two separate boxes.
I'm trying to create an archive of a website's images because the site tends to go offline now and then. The problem is that clicking through to an image in full view opens it on a PHP page. I've tried 'wget -m -A.jpg', but it only saves the thumbnails from the menu page instead of the actual images.
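A sketch, assuming the full-size images are served through PHP URLs without a .jpg suffix (which is why -A.jpg keeps only the thumbnails); example.org is a placeholder. Newer wget builds can filter on the whole URL instead of the file extension:

```shell
# Recurse two levels (menu page -> PHP viewer page -> image) and keep
# anything whose URL looks image-related, wherever the extension sits:
wget -r -l 2 -p -e robots=off \
     --accept-regex '(jpg|jpeg|png|image)' \
     http://example.org/gallery/
```

If your wget predates --accept-regex, dropping -A entirely and pruning afterwards also works; --content-disposition helps when the PHP script sends a proper filename header.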
I have an Ubuntu 9.10 server set up at my house with Apache2 and PHP5 installed on it. Every time I browse to the server and try to load the PHP index page, the file downloads instead of being displayed.
I have virtual servers set up and have the files stored at /home/cusinndzl. If anyone needs to take a look I can let them into the webmin panel.
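A sketch for Ubuntu 9.10's Apache2; the package and module names are the usual ones for that era, but verify against your release. A PHP page downloading instead of rendering almost always means Apache has no PHP handler registered:

```shell
# Install and enable the PHP module, then restart Apache:
sudo apt-get install libapache2-mod-php5
sudo a2enmod php5
sudo /etc/init.d/apache2 restart
```

If it still downloads, check the virtual host configuration for anything that removes the PHP handler (e.g. a stray RemoveType or php_admin_flag engine off), and make sure the virtual servers under /home/cusinndzl inherit the module.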
The great clock applet shows tasks and appointments from the Evolution calendar. That's a great, fast overview of my agenda. But clicking a task or appointment opens a new instance of Evolution every time, which is enormously annoying and really destroys productivity. I don't want 15 calendars open; I want the applet to behave like an index, reusing the open instance and bringing focus to the clicked appointment or task. How do I disable multiple instances of Evolution?
I have been using Totem as my default video player. I've abstained from using VLC, considering that it has had some security issues. Of late, I installed and was mighty impressed by MPlayer because it can play .flv files, which Totem couldn't. The only thing I find irksome is its multiple instances. Can you please help me disable multiple instances in MPlayer? (I've had the same problem in VLC too, but right now I'm trying to stick with MPlayer.)
I'm curious if anybody can shed some light for me in this department. We're in a large environment with a Windows DHCP server. We have been tinkering with LTSP on Edubuntu as thin and fat clients. It works great, but right now we just have 1 server handling the lab, which works fine unless we want to expand, which may very well happen.
These are the instructions I received:
1. Log in to your Windows server and load the DHCP configuration screen.
2. Create a DHCP reservation for the MAC address you obtained.
3. Add the configuration options below to enable the machine to boot from the LTSP server:
017 Root Path: /opt/ltsp/i386
066 Boot Server Host Name: <ip address>
067 Bootfile Name: ltsp/arch/pxelinux.0 # Specify CPU architecture in place of 'arch', for instance 'i386'
I'm curious, what if I want to have multiple Ubuntu servers on the network that I want to have bootable? For example, let's say I have 3 labs, and 3 servers. Server A to Lab A, Server B to Lab B, and Server C to Lab C. I want all C's computers to boot to C, and B to B, A to A, etc.
1 - How would I add multiple entries on the Windows DHCP Server to allow all 3 (A B C) servers to boot?
2 - How would I be able to isolate the clients so ONLY Lab A clients boot to Server A, etc?
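A hypothetical sketch of the usual approach, to be checked against your DHCP console: give each lab its own subnet/scope, and set options 066/067 per scope rather than server-wide, so Lab A's clients can only ever learn about Server A. The addresses below are invented:

```
:: Run in cmd.exe on the Windows DHCP server (local server assumed).
:: The Lab A scope points at Server A...
netsh dhcp server scope 10.0.1.0 set optionvalue 066 STRING "10.0.1.10"
netsh dhcp server scope 10.0.1.0 set optionvalue 067 STRING "ltsp/i386/pxelinux.0"
:: ...and the Lab B scope points at Server B, and so on for Lab C.
netsh dhcp server scope 10.0.2.0 set optionvalue 066 STRING "10.0.2.10"
netsh dhcp server scope 10.0.2.0 set optionvalue 067 STRING "ltsp/i386/pxelinux.0"
```

If all three labs share one subnet, per-reservation options (as in the instructions you received) are the fallback: one reservation per client MAC, each carrying its lab's 066/067 values.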
I currently have a group of 3 servers connected to a local network: one is a web server, one is a MySQL server, and the third is used for a specific function on my site (calculation of soccer matches!).
Anyway, I have been working on the site a lot lately but it is tedious connecting my USB hard drive to each computer and copying the files. This means I am not backing up as often as I should...
I have a laptop connected to this same network that I use for development, so I can SSH into the computers. Is there any software for Ubuntu that can back up files I choose on multiple computers? I know I could use rsync, but is there something with more of a GUI?
Then every 2 days I can just move the most recent backup from my laptop to the USB drive, so the backup is stored in 2 places if things go kaboom somewhere.
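If a config-driven tool is acceptable, rsnapshot (in the Ubuntu repos) pulls from several machines over SSH on a schedule; luckybackup and grsync are rsync front-ends with more of a GUI. A sketch of an rsnapshot.conf with example hostnames and paths (the file requires TAB separators between fields):

```
# /etc/rsnapshot.conf sketch -- older versions spell 'retain' as 'interval'
snapshot_root	/home/me/backups/
retain	daily	7
backup	root@webserver:/var/www/	webserver/
backup	root@dbserver:/etc/mysql/	dbserver/
backup	root@calcserver:/opt/soccer/	calcserver/
```

Run from cron on the laptop, this keeps 7 rotated daily snapshots of all three servers in one tree, which is then easy to copy to the USB drive in one go.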
I'm running Ubuntu Server 10.04 and have a secure (SSL/TLS) FTP server on it. However, I'd like to use this FTP server to distribute updates for programs I made with Microsoft Visual Studio. Unfortunately, in Microsoft's infinite wisdom, secure FTP servers cannot be used. Rather than run an insecure FTP server alone, I want to keep my secure FTP server with full access to the machine, and add an insecure FTP server that only has access to the directory where I put my update files. I am currently using vsftpd. Is there any way to set up two FTP servers on this single machine?
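vsftpd can run multiple instances, one per configuration file, each passed on the command line. A sketch of a second, plain-FTP instance confined to the updates directory (the port, path, and settings are examples to adapt):

```
# /etc/vsftpd-plain.conf -- second, unencrypted instance
listen=YES
listen_port=2121
ssl_enable=NO
anonymous_enable=YES
anon_root=/srv/updates
write_enable=NO
```

Start it with 'vsftpd /etc/vsftpd-plain.conf' alongside the existing SSL instance on port 21. Since this instance is read-only and rooted in /srv/updates, the exposure of running it unencrypted is limited to the update files themselves.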
On my 11.3 KDE system, if I leave the computer alone to download something and come back after a while, I sometimes find that torrents or Usenet downloads were stopped long ago and the KDE applet displays the wrong time, anything from half an hour to ten hours late. I usually open the NTP configuration screen, check that the server is accessible, click OK out of it, and everything goes back to normal by itself: the time is corrected and the downloads resume. What could be stopping the downloads, and how can I correct this problem? Is NTP the cause or just another symptom of a deeper problem? NTP is set to "now and on boot" with the interval at 5 min; I think those are all default settings.
How do I stop YaST from downloading the repodata every time I open the software manager? Even if I have opened it 5 minutes before, YaST still downloads the repomd.xml again. This is also a real pain if I am offline and trying to install an rpm, because I have to wait for it to realise that it can't access the data before it allows me to skip the auto-refresh. By the time that happens I could have installed the software already if I was using a Debian or Windows based system.
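A sketch via /etc/zypp/zypp.conf: repo.refresh.delay is the number of minutes a repository's metadata is considered fresh before an autorefresh re-downloads repomd.xml (the option ships commented out with a small default). The 480 below is just an example value:

```
## /etc/zypp/zypp.conf -- treat repodata as fresh for 8 hours
repo.refresh.delay = 480
```

Alternatively, 'zypper mr --all --no-refresh' turns off autorefresh per repository; you then refresh manually with 'zypper ref' when you are online.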
My KPackageKit is not downloading any updates: it shows the updates, I click Apply, I authenticate, it asks me to accept an agreement, and then it returns to an empty KPackageKit (as if there were no updates). If I open it again, the updates are listed again (naturally, since nothing was downloaded).
I thought I would try out ClamAV. After installation and a reboot, as su I issued freshclam. It reported:
ClamAV update process started at Mon Apr 18 22:29:22 2011
Downloading main.cvd [72%]
The time at that point was 23:32, for 19 MB. The download is so slow that I wonder if ClamAV is really meant to be used.
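Slow main.cvd downloads are usually a mirror problem rather than ClamAV itself; freshclam.conf lets you prefer a nearby country mirror. The "de" below is an example country code, so substitute your own:

```
# /etc/freshclam.conf -- try a regional mirror first
DatabaseMirror db.de.clamav.net
DatabaseMirror database.clamav.net
```

The second line is the round-robin fallback. Rerun freshclam afterwards and compare speeds; subsequent runs are also much faster because only incremental .cdiff updates are fetched once main.cvd is in place.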
I would like to run a small file server at home that I could connect to both remotely and from within my own network. I was thinking of using something like a cheap Dell OptiPlex (Pentium 3 or 4 at 2 GHz, with 256 MB RAM and a 40 GB hard drive; I will do something about the lack of space later). The file-server part should be straightforward, but I wanted advice on how to manage downloads on the machine. On my laptop I currently use both Firefox's built-in download manager and JDownloader. Sometimes JDownloader isn't the ideal solution for all downloads; for example, a single connection through Firefox sometimes gets a faster download speed. I also occasionally download torrents through Miro.
If anyone has setup something like what I'm suggesting, could you please give me a general idea of how best to go about this?
I have removed the IcedTea packages, but I am unable to run Sun Java after downloading the recommended version from Sun's website as per the instructions at Installing Sun's Java on openSUSE - openSUSE. I do a lot of web games and need Sun Java installed. In addition, I need help with:
1. Installing fdisk or similar.
2. Read and write support for NTFS partitions on the same hard disk. I have Samba, but SUSE can't see the 2 NTFS partitions on the same hard disk.
3. Installing the GNOME/LXDE desktop.
4. Installing TestDisk.
5. Installing GParted.
6. Converting .deb packages to .rpm and installing them. I used to convert .rpm to .deb in Ubuntu with the alien tool.
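For item 6, alien works in the rpm direction as well; a sketch (the .deb filename is an example):

```shell
# Convert a Debian package to RPM; --scripts also converts the
# maintainer scripts. alien bumps the release number unless you
# pass --keep-version.
alien --to-rpm --scripts some-package_1.0-1_i386.deb
```

Expect this to work best for simple, dependency-light packages; anything tightly coupled to Debian's file layout is usually better found prebuilt in an openSUSE repository.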
I currently mirror the updates repository to my computer using rsync. Could I save space and bandwidth by rsyncing only the .delta.rpm files? Are there any disadvantages to this, or do zypper/YaST handle updates just fine that way?
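The include/exclude sketch below mirrors only the .delta.rpm files plus the directory tree (the mirror URL and paths are examples). One disadvantage up front: without the metadata in repodata/ and the full .rpm files as a fallback, zypper cannot use the mirror on its own, and it falls back to a full download whenever a delta does not apply to the installed version.

```shell
# Filter order matters: keep directories, keep deltas, drop the rest.
rsync -av --include='*/' --include='*.delta.rpm' --exclude='*' \
  rsync://mirror.example.org/opensuse/update/11.4/ /srv/mirror/update/
```

Adding a second --include for 'repodata/**' keeps the metadata current while still skipping the full packages, which is a reasonable middle ground.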
I've downloaded new versions of SUSE many times, and on average it takes 3 to 4+ attempts before one succeeds: bad downloads (bad checksum), or the download stops for no reason. I've used both a direct link and BitTorrent. BitTorrent takes 2-3 times as long, so I usually use the direct link and can do the download in about 6-8 hours. The first time, it stopped at 334 MB using the direct link. Using BitTorrent, it took about 16 hours and I got a bad checksum. Is there a way I can fix the file without downloading it again? Suse 10.4
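rsync can usually repair a corrupt ISO in place instead of re-downloading it: it checksums the file block by block and transfers only the blocks that differ. The mirror URL and ISO name below are placeholders; any openSUSE mirror that offers rsync will do.

```shell
# -P resumes and shows progress; the existing (bad) ISO must already be
# in the destination directory so rsync can patch it rather than refetch.
rsync -avP \
  rsync://mirror.example.org/opensuse/distribution/iso/openSUSE-DVD.iso .
```

Re-checking the file in a BitTorrent client achieves the same thing: the client re-verifies every piece against the torrent's hashes and re-downloads only the pieces that fail.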