Software :: Downloading A File From Multiple Servers (Mirrors)?
Sep 13, 2010
I want to download a reasonably large file from, say, SourceForge. The problem is that some mirrors give speeds of at most 40 kB/s, and I was considering options to increase this. Download managers seem to help somewhat; I experimented with axel and lftp's pget. Now I am wondering how I might download the same file from several servers at once (say, SourceForge's various mirrors). I tried axel by concatenating all the server addresses, but I am not sure it is working. How do I verify that it is indeed using all the servers specified?
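On the verification question: a minimal sketch, with hypothetical mirror URLs. axel treats every URL on its command line as a mirror of the same file, and in verbose mode it logs each connection it opens, so the different hostnames should show up there; netstat gives an independent check.

    # give axel all the mirrors; -n sets the connection count, -v logs each connection
    axel -v -n 8 http://mirror1.example.com/big.iso http://mirror2.example.com/big.iso

    # in another terminal, list the remote hosts actually in use
    netstat -tn | grep ':80'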
I'm trying to download an ISO image that's over 3 GB. The file is hosted on several FTP mirrors. Is there a way to download the file from several of these mirrors simultaneously so that I can reduce the download time? At present it will take close to 24 hours to download from just one mirror. I have a 25 Mbps connection, so my bandwidth is not the issue.
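aria2 is built for exactly this: when several URIs on the command line point at the same file, it downloads different pieces from all of them at once. A sketch with placeholder mirror URLs:

    # -s sets the number of pieces, -x the connections allowed per server
    aria2c -s 12 -x 4 \
        ftp://mirror1.example.com/pub/dist.iso \
        ftp://mirror2.example.com/pub/dist.iso \
        ftp://mirror3.example.com/pub/dist.iso

On a 25 Mbps line this should get close to the sum of what the individual mirrors will give you.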
Whenever I do anything in Zypper, there is always a huge lag before it starts downloading a file, but then it downloads at normal speed. I think this has to do with the fact that it is now trying to download from multiple servers, and I would like to turn that off. Where do I go for that?
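If the lag really is the multi-source (metalink) download logic, one knob worth testing is libzypp's ZYPP_MULTICURL environment variable, which forces the plain single-server download backend; treat this as an assumption to verify rather than a guaranteed fix:

    # force libzypp back to simple single-server downloads for this run
    sudo ZYPP_MULTICURL=0 zypper refresh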
I am looking for a version of VMware that manages two servers at the same time and mirrors them, such that if one goes down we can still work on the second, so it also works as a backup. It must also run on an Ubuntu server. I have looked, but there are so many versions that I don't know which one is best.
I would like to run a small file server at home which I could connect to both remotely and from within my own network. I was thinking of using something like a cheap Dell OptiPlex machine (Pentium 3 or 4 at 2 GHz, with 256 MB of RAM and a 40 GB hard drive [I will do something about the lack of space later]). The file server part should be straightforward, but I wanted advice on how to manage downloads on the machine. On my laptop I currently use both Firefox's built-in download manager and JDownloader. Sometimes JDownloader isn't the ideal solution for all downloads; e.g., sometimes a single connection through Firefox gets a faster download speed. I also occasionally download torrents through Miro.
If anyone has set up something like what I'm suggesting, could you please give me a general idea of how best to go about it?
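One way to get JDownloader-style management on a headless box is aria2 running as a daemon with its RPC interface enabled, controlled from the laptop through one of the aria2 web front ends; plain HTTP downloads and torrents then share one queue. A sketch (directory and secret are placeholders):

    # on the file server: start aria2 in the background with remote control on
    aria2c --enable-rpc --rpc-listen-all=true --rpc-secret=changeme \
           --dir=/srv/downloads --daemon=true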
I am trying to create a local repository of the RPMs installed on my workstation. Using 'rpm -qa' I can list the installed RPMs. Is there a way for me to download all of these RPMs using the output of 'rpm -qa'? 'yum --downloadonly' gives me the option of downloading without installing, but since the installed RPMs are not cached on my workstation I would have to download them again. If I have a local repo, it will be easier to restore my installation without having to rediscover the packages needed. I am looking for a way to download multiple RPMs.
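yumdownloader (from the yum-utils package) can take the name list straight from rpm -qa; a sketch, with the repo directory as a placeholder:

    mkdir -p ~/localrepo
    # feed the bare package names to yumdownloader
    rpm -qa --qf '%{NAME}\n' | xargs yumdownloader --destdir ~/localrepo
    # turn the directory into a repository yum can use later
    createrepo ~/localrepo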
So, this is a simple download-scheduler program. It creates multiple parallel instances of the downloader process, wget (I could also have used 'curl' instead of 'wget'). Can you debug this code?
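The code itself isn't included above, so for comparison, here is a minimal sketch of a scheduler of that shape in bash: it caps the number of concurrent wget processes and waits for the stragglers at the end (urls.txt is a placeholder input file, one URL per line).

    #!/bin/bash
    MAX_JOBS=3
    while read -r url; do
        # block while the maximum number of wget jobs are already running
        while [ "$(jobs -rp | wc -l)" -ge "$MAX_JOBS" ]; do
            sleep 1
        done
        wget -q "$url" &
    done < urls.txt
    wait   # let the last downloads finish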
I have Fedora 13 on a VPS and I cannot get yum to work. Running '# yum list' gives this error:
[Errno 256] No more mirrors to try
Error: Cannot retrieve repository metadata (repomd.xml) for repository: fedora. Please verify its path and try again.
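Fedora 13 is past end of life, so its metadata is gone from the normal mirror network, which commonly produces exactly this error. One fix to try is pointing the repos at the archive servers; the sed expressions below assume the stock repo files, so check them before running:

    # comment out mirrorlist= and enable baseurl=, aimed at the archive
    sudo sed -i -e 's|^mirrorlist=|#mirrorlist=|' \
        -e 's|^#baseurl=http://download.fedoraproject.org/pub/fedora|baseurl=http://archives.fedoraproject.org/pub/archive/fedora|' \
        /etc/yum.repos.d/fedora.repo /etc/yum.repos.d/fedora-updates.repo
    sudo yum clean all && sudo yum makecache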
I'm looking at setting up a couple of automated systems. Here are a few examples:
* Internal accounting system to download and process emails
* Public web server for external visitors
I could put each system on its own separate box; for example, it's generally good practice to separate anything that external users have access to (such as a web server) from internal processes such as accounting. Now, rather than dishing out the money for two separate servers, could I get away with just running each system in its own VMware virtual machine on the same box?
To give you an idea, these are not large-scale, computationally intensive systems. The accounting one simply downloads and tallies emails, and the latter is just a web server with maybe 5 hits per day on a good day. I could definitely pick up a new box for, say, $50, but I wanted to know the general practice: VMware guests on one box versus two separate boxes.
I'm trying to create an archive of a website's images because it tends to go offline now and then. The problem is that when you view an image at full size, it opens on a PHP page. I've tried using 'wget -m -A.jpg', but it only saves the thumbnails from the menu page instead of the actual images.
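One approach, sketched on the assumption that the full-size images are linked from those PHP viewer pages: let wget recurse through the viewer pages first, then harvest the real image URLs out of the saved HTML (the img-src pattern is a guess; inspect the page source for the real one):

    # grab the gallery plus the viewer pages it links to
    wget -r -l2 http://example.com/gallery/
    # pull the full-size image URLs from the saved pages and fetch them
    grep -rhoP '<img src="\K[^"]+' example.com/ | sort -u | \
        wget -B http://example.com/ -i -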
I'm really desperate, as I have spent the better part of the last 10 hours trying to sort this out before my boss finds out.
When I try to browse to one of our websites, the browser wants to download the file because the server will not process the PHP file; the download it offers is of type application/x-httpd-php.
What's really odd is that from inside the network where the server is located everything works fine; it's only from the outside that this happens.
Everything was fine until I ran a system update from Webmin that updated a ton of things, including Apache2 and PHP5.
It's a self-hosted server that was running Ubuntu Server 9.10, but I have since upgraded to 10.04.2 LTS with no luck.
Apache version 2.2.14, PHP version 5.3.2-1ubuntu4.9, Joomla 1.5 (latest).
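Since it broke right after the Apache2/PHP5 upgrade, the usual suspect is the php5 module having been disabled or left half-configured by the update; a first thing to try (standard Ubuntu packages, only the sequence is my assumption):

    sudo apt-get install --reinstall libapache2-mod-php5
    sudo a2enmod php5
    sudo service apache2 restart

The works-inside/fails-outside detail also makes it worth checking whether external requests land on a different virtual host than internal ones, one that lacks the PHP handler.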
I have an Ubuntu 9.10 server set up at my house with Apache2 and PHP5 installed on it. Every time I go to the server in a web browser and try to load the PHP index page, it downloads instead of displaying.
I have virtual servers set up and the files are stored at /home/cusinndzl. If anyone needs to take a look, I can let them into the Webmin panel.
I downloaded the Ubuntu file via the website and it was a RAR file. I extracted it and there is no ISO file in there. Was it supposed to be a RAR file, and where the hell is my ISO file? I want to know because I want to test Ubuntu from a disc first before installing it.
I'm curious if anybody can shed some light on this for me. We're in a large environment with a Windows DHCP server. We have been tinkering with LTSP on Edubuntu with both thin and fat clients. It works great, but right now we just have 1 server handling the lab, which works fine unless we want to expand, which may very well happen.
These are the instructions I received:
1. Log in to your Windows server and load the DHCP configuration screen.
2. Create a DHCP reservation for the MAC address you obtained.
3. Add the configuration options below to enable the machine to boot from the LTSP server:
017 Root Path: /opt/ltsp/i386
066 Boot Server Host Name: <ip address>
067 Bootfile Name: ltsp/arch/pxelinux.0 # specify the CPU architecture in place of 'arch', for instance 'i386'
From: [url]
I'm curious: what if I want multiple bootable Ubuntu servers on the network? For example, let's say I have 3 labs and 3 servers: Server A for Lab A, Server B for Lab B, and Server C for Lab C. I want all of Lab C's computers to boot to Server C, B's to B, A's to A, etc.
1 - How would I add multiple entries on the Windows DHCP Server to allow all 3 (A B C) servers to boot?
2 - How would I be able to isolate the clients so ONLY Lab A clients boot to Server A, etc?
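I believe the usual answer is one DHCP scope per lab (i.e., one subnet or VLAN per lab), with options 066/067 set at scope level rather than server level, so each lab's clients can only ever see their own boot server. A sketch using netsh with made-up addresses; verify the syntax against your server version:

    rem Lab A's scope points at Server A, Lab B's at Server B, and so on
    netsh dhcp server scope 10.0.1.0 set optionvalue 066 STRING "10.0.1.10"
    netsh dhcp server scope 10.0.1.0 set optionvalue 067 STRING "ltsp/i386/pxelinux.0"
    netsh dhcp server scope 10.0.2.0 set optionvalue 066 STRING "10.0.2.10"
    netsh dhcp server scope 10.0.2.0 set optionvalue 067 STRING "ltsp/i386/pxelinux.0"

The subnetting answers question 2 as well: a Lab A client physically lands in Lab A's scope, so it never receives Server B's options.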
After downloading the Fedora-14-i386-DVD.iso file (3.3 GB), I cleared the window with the list of downloaded files. When I opened the directory where they are always stored, the ISO file was not there. I can't find it anywhere, wastepaper basket included.
I know the question may sound weird, but I was wondering if it is possible to download just one or more parts of a file.
For example, the first 10 MB, or the last 10 MB.
I know that there are some apps that do segmented downloads, but is there one that lets you choose which segment to download? If not, can this be accomplished with any Linux command-line application?
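curl can do this directly with its --range option, provided the server supports byte-range requests; a sketch with a placeholder URL:

    # first 10 MB (bytes 0 through 10485759)
    curl -r 0-10485759 -o first10mb.part http://example.com/big.iso
    # last 10 MB (a negative range counts back from the end of the file)
    curl -r -10485760 -o last10mb.part http://example.com/big.iso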
I am, as the forum title suggests, new to Linux and to programming, and I'm having trouble figuring out how to do this. I have a very large XML file with a lot of information in it. I'm trying to get a single tag out of the file; each of these tags contains a single web link, and I want to download the file at every one of those links. I really don't know how to do this. My thought, though it's probably not the most efficient or correct way, was to use Vim to search the document, somehow extract every instance of this one particular tag, and then use wget on the links.
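No Vim gymnastics needed: grep can pull the tag contents out, and wget reads a list of URLs from a file. A sketch assuming the tag is literally <link>...</link>; adjust the name to match the real file:

    # extract everything between <link> and </link>, one URL per line
    grep -oP '(?<=<link>)[^<]+' data.xml > urls.txt
    # download every file on the list
    wget -i urls.txt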
I want to install Ubuntu on client machines over the network. I tried installing using an Apache server; I installed it fine and tested that it is working well. I did all the configuration as in this link: [url]
But when I give the installer the image server's IP address, it shows a message saying that the Release file cannot be downloaded...
I don't know why I'm getting this error.
In that link there is an image called the netboot installer. I booted from that .iso; is that correct, or did I misunderstand something?
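One quick check, sketched with a placeholder address and a lucid-style mirror layout (so the exact path is an assumption): confirm from another machine that Apache really serves the Release file the installer asks for.

    curl -I http://192.168.1.10/ubuntu/dists/lucid/Release
    # 200 OK means the mirror path is right; 404 means the directory
    # layout under the web root is not what the installer expects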
This is what I tried: wget -A rar [-r [-l 1]] <mediafireurl>
That is to say, I tried it both with and without the recursive option. It ends up downloading an HTML page a few KB in size, while what I want is a RAR file in the 90-100 MB range.
For those who may not be aware, what happens with MediaFire is that the page first says
Processing Download Request...
After a second or so, this text turns into the download link and reads
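Because that final link is generated on the page, plain wget only ever sees the interstitial HTML. If the direct link is embedded in the page source, a two-step fetch can work; the grep pattern below is a guess to adjust after inspecting the page, and if the link is built purely by JavaScript this will not work at all:

    url='http://www.mediafire.com/?xxxxxxxx'
    # fetch the page, fish out the generated .rar link, then download it
    wget -qO- "$url" | grep -oP 'http://download[^"]+\.rar' | head -n1 | xargs wget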
The installation appears to complete successfully. However, when I click the new Opera logo in the panel menu, nothing happens, even after a restart.
What could I be doing wrong? How should I launch Opera?
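A quick diagnostic: launch it from a terminal, where any error the menu entry silently swallows gets printed.

    which opera    # confirm the binary actually landed on the PATH
    opera          # run it and watch for error output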
I want to let a client download a file using Apache only. The idea is to tell Apache to generate a session and let the client download the file using that session. Is that possible? What method or module should I use?
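Two directions I know of: the third-party mod_auth_token module is built for exactly this kind of token-protected download, and a cruder Apache-only trick is publishing the file under a random, unguessable name for the life of the session. A sketch of the latter (paths are placeholders, and it assumes FollowSymLinks is enabled for the download directory):

    # create a one-off token and expose the file under it
    token=$(head -c16 /dev/urandom | md5sum | cut -d' ' -f1)
    ln -s /srv/files/big.iso "/var/www/dl/$token.iso"
    echo "send the client: http://example.com/dl/$token.iso"
    # a cron job can then delete links in /var/www/dl older than, say, an hour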
I've been tinkering with Linux over the past couple of weeks. I've played around with both Ubuntu 10.10 and Linux Mint 10, and it's been a 99% positive experience, but I have noticed an issue with my internet connection.
I've noticed that when I download a file in Ubuntu (whether from the Software Manager, BitTorrent, or just a regular download in Firefox), browsing the web on my computer becomes nearly impossible. The download itself runs at a good speed for our DSL connection (around 200 KB/s), but everything else grinds to a near standstill. Just loading ESPN.com on my PC can take a minute or two. Someone else playing a game in the house will go from a regular 100-200 ms ping to 5000+ right when the download starts. When I stop the download, browsing and gaming speeds go right back to normal.
The weird thing is that the same thing happened when I installed Linux Mint 10, yet when I boot into my Vista partition, there is no issue. Downloading a file in Vista doesn't affect anything, but it happens every time in both Ubuntu and Linux Mint.
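This reads like the classic saturated-uplink symptom (bufferbloat): the download's ACK traffic fills the thin DSL upstream and every other connection stalls; Vista may simply be pacing its transfers differently. Two mitigations to try, with made-up numbers:

    # cap individual downloads below the line rate
    wget --limit-rate=150k http://example.com/file.iso
    # or shape the whole interface (wondershaper: interface, downlink, uplink in kbit/s)
    sudo wondershaper eth0 1800 256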
I am planning to install Ubuntu or Kubuntu Netbook Remix on my netbook. Wubi has a problem downloading the .iso file. So do you know where the Wubi directory in Windows is? I already have the .iso file.
I currently have a group of 3 servers connected to a local network: one is a web server, one is a MySQL server, and the other is used for a specific function on my site (calculating soccer matches!).
Anyway, I have been working on the site a lot lately, but it is tedious connecting my USB hard drive to each computer and copying the files. This means I am not backing up as often as I should...
I have a laptop connected to the same network that I use for development, so I can SSH into the computers. Is there any software for Ubuntu that can back up files that I choose on multiple computers? I know I could use rsync, but is there something with more of a GUI?
Then every 2 days I can just move the most recent backup from my laptop to the USB drive. Then I will have the backup stored in 2 places if things go kaboom somewhere.
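For the GUI angle, BackupPC (web interface, pulls from many hosts over SSH) and rsnapshot are the usual suggestions. Even without them, a small pull script on the laptop covers the workflow described; hostnames and paths below are placeholders:

    #!/bin/bash
    # pull the chosen directories from each server over SSH
    for host in webserver mysqlserver calcserver; do
        rsync -az --delete "$host:/var/www/" "$HOME/backups/$host/"
    done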
I'm running Ubuntu Server 10.04 and have a secure (SSL/TLS) FTP server on it. However, I'd like to use this FTP server to update programs I made using Microsoft Visual Studio. Unfortunately, in Microsoft's infinite wisdom, secure FTP servers cannot be used. Rather than use an insecure FTP server, I want to set up my secure FTP server to be able to access whatever I need to on the machine, and then add an insecure FTP server that only has access to the directory where I put my update files. I am currently using vsftpd as my FTP server. Is there any way that I can set up two FTP servers on this single machine?
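vsftpd takes a config file as its argument, so running two independent instances on one machine is straightforward: keep the existing SSL/TLS instance as-is and start a second, plain-FTP instance on another port, restricted to the updates directory. A sketch (file names, port, and paths are placeholders):

    sudo cp /etc/vsftpd.conf /etc/vsftpd-updates.conf
    # in /etc/vsftpd-updates.conf set, for example:
    #   listen=YES
    #   listen_port=2121
    #   ssl_enable=NO
    #   anon_root=/srv/updates    # serve only the update files
    sudo vsftpd /etc/vsftpd-updates.conf &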
I am new to openSUSE. I want to install multimedia codecs on my openSUSE 11.4 system. Is there any way to download the multimedia codecs as an ISO file and then burn a CD to install them? Right now I am downloading the "NonOSS CD" in the add-on downloads section; does this ISO file contain the multimedia codecs?