OpenSUSE :: Download Packages With Wget From That Mirror With A String "alsa"?
Nov 30, 2010
Index of /distribution/openSUSE-stable/repo/oss/suse/i586 How can I download packages with wget from that mirror whose names contain the string "alsa"?
Such as:
alsa-1.0.23-2.12.i586.rpm
alsa-devel-1.0.23-2.12.i586.rpm
alsa-oss-1.0.17-29.2.i586.rpm
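wget's accept list can filter on the filename. A minimal sketch, assuming the directory index above (the hostname is a placeholder for whichever mirror you use):

```shell
# Fetch only files whose names start with "alsa" from the index page.
# -r -l1: recurse just this one directory level; -nd: don't recreate
# the directory hierarchy locally; -np: never ascend to the parent;
# -A: accept-list glob applied to filenames.
wget -r -l1 -nd -np -A 'alsa*.rpm' \
  http://mirror.example.com/distribution/openSUSE-stable/repo/oss/suse/i586/
```

The -A pattern is a shell-style glob matched against the filename, so all three example packages above qualify.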
View 9 Replies
Jan 21, 2010
After reading a lot of docs, I'm still having problems using wget to download a Centos repo from a mirror. Here's my best attempt so far:
$cd /repos/centos/5.4
$wget -r -nH --cut-dirs=3 -np [URL]
Of course I get all the unwanted index files etc, but I seem to get a lot of other downloads from the mirror, not just their 5.4 directory. It's like it's following other links on the web pages. Maybe I should be using "ftp://" instead of "http://" considering it's an ftp site, but I seem to have connection problems that way.
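A likely culprit for the extra downloads is the auto-generated index pages, whose sort links (?C=M;O=A and friends) lead the crawl sideways. Rejecting them and ending the URL with a trailing slash (so -np knows the directory boundary) usually contains it. A sketch with a placeholder mirror:

```shell
# Mirror only the 5.4 tree from the mirror into the current directory.
# -nH: no hostname directory; --cut-dirs=3: strip the 3 leading path
# components; -np: never ascend; -R: reject the generated index pages.
cd /repos/centos/5.4
wget -r -np -nH --cut-dirs=3 -R 'index.html*' \
  http://mirror.example.com/pub/centos/5.4/
```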
View 1 Replies
Nov 4, 2010
I am trying to wget a site so that I can read stuff offline. I have tried
Code:
wget -m sitename
wget -r -np -l1 sitename
[code]....
View 7 Replies
Sep 14, 2010
I need to mirror a particular website (all the pages under that particular domain) and any pages (but not whole sites) that the website links to.
How can I do this?
wget -r --level=inf (or some other variant) will mirror the site.
wget -r -H --level=1 will get all the links (from all domains) to the first level.
Anyone have any ideas on how I could combine these, to get the entirety of the main site and one level deep into external sites? I've been banging my head against the manual all afternoon.
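wget has no single option for "infinite depth at home, one hop abroad", but a two-pass approach works: mirror the main site first, then harvest the external links from the downloaded pages and fetch just those. Everything below (the host name, the crude href extraction) is illustrative, not a polished solution:

```shell
#!/bin/sh
# Pass 1: full recursive mirror of the main domain only
# (www.example.com stands in for the real site).
wget -r --level=inf -p -k -np http://www.example.com/

# Pass 2: pull every external href out of the mirrored pages and
# fetch only those pages (with their page requisites, via -p).
grep -rho 'href="http[^"]*"' www.example.com/ \
  | sed 's/^href="//; s/"$//' \
  | grep -v '^http://www\.example\.com' \
  | sort -u \
  | wget -p -i -
```

The grep/sed extraction is deliberately naive; it only catches absolute double-quoted hrefs.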
View 1 Replies
Dec 21, 2010
Can we use wget's recursive download to fetch all the wallpapers on a web page?
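Yes, assuming the wallpapers are linked directly from the page: a one-level recursive fetch with an image accept-list does it (the URL is a placeholder):

```shell
# Grab just the images linked from one page: one level deep, no parent
# directories, flat local layout, keep only common image extensions.
wget -r -l1 -np -nd -A jpg,jpeg,png,gif http://www.example.com/wallpapers/
```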
View 5 Replies
Jul 12, 2010
There is a partnering website that provides an RSS feed to display on the website I am working on. The website displays information on the feed every time a user accesses the website. The feed changes almost every day. For bandwidth considerations and speed, I would like to download the feed once by the server using a crontab job (my website is in a linux shared hosting environment). The problem exists with the URL structure, which I have no control over.
Here is the URL:
Code:
[code]....
I am aware that there are characters that need escaping, and this is where I am getting my errors. I have never written a shell script, but I am assuming some of the characters are keywords in the shell scripting language or in Linux. I am also aware that I can avoid having to escape by enclosing the URL in single or double quotes. You will notice that the URL has BOTH single and double quotes, so it's not that simple.
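One way around the mixed quotes is to single-quote the whole URL and splice each literal single quote in as '\'' (close the quote, backslash-escape a quote, reopen the quote). The URL below is only a stand-in, since the real one is not shown in the post:

```shell
#!/bin/sh
# Hypothetical feed URL containing both a double quote and a single
# quote; the '\'' sequence embeds each literal single quote.
url='http://feeds.example.com/rss?q="news"&tag='\''daily'\'
# Quoting "$url" here keeps the shell from re-interpreting & " or '.
wget -q -O /var/www/cache/feed.rss "$url"
```

In a crontab entry the same quoting applies, but remember cron also treats % specially, so any % in the URL must be written as \%.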
View 1 Replies
Jun 11, 2011
In YaST > Software Management, how can I configure it to download all packages first and then install them (as Ubuntu does), rather than downloading each package and immediately installing it?
View 3 Replies
Jul 8, 2011
Zypper seems unable to reach the openSUSE servers. I am currently on a network that may or may not block certain traffic, so I don't know if their firewall is in the way. I am able to browse the web without any problems. Sometimes Zypper downloads proceed eventually, but the download is doggone slow. I did this code...
Apparently Zypper has trouble downloading this URI, but if I open it in Chrome it loads perfectly without any problems. How can I find the root of this problem?
View 9 Replies
Jan 3, 2011
Server A is running CentOS 5.5; Server B is running CentOS 5.5. What is the simplest way to grab the package list from Server A, then use yum/rpm to make sure the same packages are installed on Server B, and install them if not?
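A minimal sketch, assuming matching by package name is enough (it ignores versions and architectures): dump sorted name lists on both machines, diff them with comm, and feed the missing names to yum.

```shell
# On server A: dump the installed package names, one per line.
rpm -qa --qf '%{NAME}\n' | sort -u > packages-a.txt

# Copy packages-a.txt to server B (scp, USB, ...), then on server B:
rpm -qa --qf '%{NAME}\n' | sort -u > packages-b.txt

# comm -23 prints lines only in A's list; install whatever B lacks.
comm -23 packages-a.txt packages-b.txt | xargs -r yum -y install
```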
View 1 Replies
Jul 6, 2011
I need a newer version of the nvidia drivers for opensuse 11.4. ftp://download.nvidia.com/opensuse/11.4 has version 270.41.06 available, but I found that I need the 270.41.19 version for CUDA 4.0 to run (yes I know CUDA 4.0 does not officially support 11.4, but the same 270.41.06 is available for 11.2 which CUDA 4.0 does support).
Where can I download the files x11-video-nvidiaG02-270.41.06-5.1.nosrc.rpm and nvidia-gfxG02-270.41.06-4.1.nosrc.rpm that these binaries were created from? Once I have the source rpms, I can easily update them to the newer version.
View 9 Replies
Jun 2, 2010
How do you download a whole distribution at once from an FTP mirror? I've never used FTP to download more than one file at a time from Konsole. I tried mget and get, as well as wildcards like get /slackware/*/*/*/*. I've been looking for how-tos but can't find any that deal with what I'm looking for. I know there is probably a simple solution, but I can't find it.
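The classic ftp client's mget only globs within one directory, but wget recurses over FTP as well as HTTP. A sketch with a placeholder mirror and release path:

```shell
# Recursively fetch the whole Slackware tree from an FTP mirror;
# -np keeps wget from climbing above the given directory.
wget -r -np ftp://mirror.example.com/pub/slackware/slackware-13.1/
```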
View 8 Replies
Jun 25, 2011
A friend recently introduced me to Linux, and I've experimented with a few different distros; I now have two working Puppy installs: one system I slapped together from misc. parts lying around, and my netbook, which boots Puppy via USB. I have had to play around with formatting using GParted.
I very recently acquired a server unit with a Pentium II 200 MHz, and I would like to LEARN Linux. Through careful research I have figured out that Slackware is the best OS for those who want to learn the ins and outs. I guess my question would be: which Slackware release would be best for this somewhat older system? And where can I find a mirror to download the ISO?
View 5 Replies
Nov 28, 2010
At home I do not have an Internet connection, but at work I do. At home I installed Fedora 14 for my six-year-old daughter, and she uses it to play games like SuperTux and OpenArena. Now I want to install openSUSE for her and test it. For Fedora, I downloaded all the packages with rsync at work, moved them home on a USB flash drive, made a local repo at home, and installed all the packages I needed. I want to do the same for openSUSE. As we all know, the DVD does not have all the packages I need, so I have to download all the packages and make a local repo at home. Can I do this for openSUSE or not? I want to download all the packages openSUSE needs via rsync and make a local repo. How can I do this?
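A sketch of the same Fedora-style workflow transplanted to openSUSE; the mirror module, release number, and local paths below are all assumptions to adapt:

```shell
# At work: copy the oss repository tree from an rsync mirror onto USB.
rsync -av rsync://mirror.example.com/opensuse/distribution/11.3/repo/oss/ \
      /media/usb/opensuse-oss/

# At home: copy it off the stick, then register the directory as a
# local repository and refresh the metadata.
zypper ar dir:///srv/repo/opensuse-oss local-oss
zypper refresh
```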
View 9 Replies
Oct 6, 2010
I'm doing this wget script called wget-images, which should download images from a website. It looks like this now:
wget -e robots=off -r -l1 --no-parent -A.jpg
The thing is, when I run ./wget-images www.randomwebsite.com in the terminal, it says
wget: missing URL
I know it works if I put the URL in the script file and then run it, but how can I make it work without adding any URLs to the file? I want to pass the link on the command line and have the script understand that I want the pictures from the link I just gave as a parameter.
View 1 Replies
Mar 14, 2011
I use this command to download: wget -m -k -H URL... but if some file can't be downloaded, it retries again and again. How can I skip that file and download the other files?
View 1 Replies
Feb 21, 2010
I'm trying to download a set of files with wget, and I only want the files and paths "downwards" from a URL, that is, no other files or paths. Here is the comand I have been using
Code:
wget -r -np --directory-prefix=Publisher http://xuups.googlecode.com/svn/trunk/modules/publisher
There is a local path called 'Publisher'. The wget works okay: it downloads all the files I need into the Publisher/ path, but then it starts loading files from other paths. Looking at [URL]..svn/trunk/modules/publisher, I only want those files, plus the paths and files beneath that URL.
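Adding --include-directories pins the crawl to that one subtree: any URL whose path falls outside the listed prefix is refused even if a page links to it. This extends the command from the question (a trailing slash on the URL also helps -np recognize the directory boundary):

```shell
# Restrict the recursion to the publisher/ subtree only.
wget -r -np --directory-prefix=Publisher \
  --include-directories=/svn/trunk/modules/publisher \
  http://xuups.googlecode.com/svn/trunk/modules/publisher/
```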
View 2 Replies
Mar 6, 2011
I would like to use wget to download a file from a Red Hat Linux server to my Windows desktop. I tried some parameters, but it still does not work. Can wget download a file from a Linux server to a Windows desktop? If yes, how do I do it?
View 14 Replies
Oct 16, 2010
I have a link to a pdf file, and I want to use wget (or python) to download the file. If I type the address into Firefox, a dialog box pops up asking if I want to open or save the pdf file. If I give the same address to wget, I receive a 404 error. The wget result is below. Can anyone suggest how to use wget to save this file?
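One frequent reason a URL works in Firefox but returns 404 to wget is a server-side check on the User-Agent or Referer headers. This is only a guess at the cause here; the header values and URL below are placeholders:

```shell
# Pretend to be a browser and supply a plausible referring page.
wget --user-agent="Mozilla/5.0 (X11; Linux i686)" \
     --referer="http://www.example.com/papers/" \
     -O paper.pdf "http://www.example.com/papers/paper.pdf"
```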
View 1 Replies
Nov 8, 2010
Can I download a Linux iso image from a Windows mirror? I don't see any problems, but my IT guy tells me that it just can't happen because a Linux download server uses a different protocol. But, I could be wrong...
View 7 Replies
Nov 19, 2010
I have recently been forced to do a hardware upgrade (my previous mobo died). Now, sound works OK with Amarok because KDE has recognized the new hardware and switched to it.
Flash sound, however, does not work, likely because Flash uses alsa-oss, which is probably not configured automatically. I have tried uninstalling and reinstalling both alsa-oss and Flash, but it didn't solve the problem.
View 1 Replies
Jul 16, 2011
I chose openSUSE as my first distro. The problem is, whenever I invoke a one-click installation from any website (for example VLC), the YaST manager tries to download packages other than VLC, amounting to about 1.5 GB, even though I can see that VLC itself comes to merely 40 MB. How do I remove those unwanted downloads and install only what I want? I am running openSUSE 11.4 with GNOME on my notebook.
View 6 Replies
Jun 21, 2010
Is it recommended to download an ISO file of Fedora 13, or can the file get corrupted? I did it twice and it does not seem to be working.
View 6 Replies
Jun 29, 2010
I'm trying to download two sites for inclusion on a CD: URL... The problem I'm having is that these are both wikis, so when downloading with e.g.: wget -r -k -np -nv -R jpg,jpeg,gif,png,tif URL.. Does somebody know a way to get around this?
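If the snag is the wiki machinery (every page sprouting edit/history/diff links that balloon the crawl), newer wget (1.14+) can drop those URLs during recursion with --reject-regex. The regex below assumes MediaWiki-style action= query parameters:

```shell
# Skip edit/history pages while mirroring a wiki for offline reading.
wget -r -k -np -nv \
  --reject-regex '[?&]action=(edit|history)' \
  -R jpg,jpeg,gif,png,tif http://wiki.example.com/
```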
View 2 Replies
May 14, 2011
Let's say there's a URL. This location has directory listing enabled, therefore I can do this:
wget -r -np [URL]
To download all its contents with all the files and subfolders and their files. Now, what should I do if I want to repeat this process again, a month later, and I don't want to download everything again, only add new/changed files?
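wget's timestamping mode covers this: with -N it compares the remote timestamp and size against the local copy and skips files that haven't changed. A sketch with a placeholder URL; the first run and every later run use the same command:

```shell
# Re-running this a month later downloads only new or changed files.
wget -r -np -N http://files.example.com/pub/
```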
View 1 Replies
May 6, 2011
Is there a mirror I could use to download a recent version of Ubuntu (e.g. Natty)? I'd like to use wget but can't find an address for a mirror.
View 3 Replies
Jul 28, 2011
I want to try to download an image of the Earth with wget, located at [URL], which is refreshed every 3 hours, and set it as a wallpaper (details here for anyone interested). When I fetch the file with Code: wget -r -N [URL], the JPEG is only 37 bytes, and of course too small and not readable.
View 5 Replies
Jul 2, 2010
I'm trying to download all the data under this directory, using wget: [URL] I would like to achieve this using wget, and from what I've read it should be possible using the --recursive flag. Unfortunately, I've had no luck so far. The only files that get downloaded are robots.txt and index.html (which doesn't actually exist on the server), but wget does not follow any of the links on the directory list. The code I've been using is: Code: wget -r *ttp://gd2.mlb.***/components/game/mlb/year_2010/
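wget honors robots.txt during recursive retrieval, and a robots.txt that disallows crawling produces exactly this symptom: only robots.txt and a synthesized index.html arrive, and no links are followed. Assuming that is the cause here (the host below is a placeholder for the obfuscated one in the post, and overriding robots.txt should only be done where the site permits it):

```shell
# Ignore robots.txt so the recursive crawl can follow the index links.
wget -r -np -e robots=off http://gd2.example.com/components/game/mlb/year_2010/
```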
View 4 Replies
Dec 10, 2010
Is it possible to configure yum so that it will download packages from repos using wget? Sometimes, with some repos, yum will give up and terminate with "no more mirrors to retry", but when I use "wget -c" to download the same file, it succeeds.
View 2 Replies
May 26, 2011
I had queued two 700 MB links for download in Firefox 3.6.3 using the browser itself. Both of them hung at 84%. I trust wget much more. The problem is: when we click the download button in Firefox it says "save file", and only once the download has begun can I right-click in the Downloads window and select "copy download link" to discover that the link was Kum.DvDRip.avi. If I had known that earlier, as with the Hotfile server where there is no script attached to the download button and it points straight to the .avi URL, I could have copied it easily. I read about 'wget --load-cookies cookies_file -i URL -o log', but I have a free (NOT premium) account on the sharing server, so all I get is an HTML page.
View 4 Replies
Jul 16, 2011
Is there a way for wget not to download a file but rather just access it? I use it to access a URL that triggers a process on a web server, but the actual HTML file at that location doesn't need to be downloaded and saved. I couldn't find anything in wget's help to show if there's a way to do this. Could anyone suggest a way of doing this?
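Two common approaches, with a placeholder URL: --spider makes the request without ever writing a response body to disk, or the body can simply be discarded.

```shell
# Hit the URL to trigger the server-side process, save nothing.
wget -q --spider "http://www.example.com/cron/trigger.php"

# Alternative: fetch the page but throw the body away.
wget -q -O /dev/null "http://www.example.com/cron/trigger.php"
```

Note --spider issues a HEAD-style check on some servers, so if the trigger only fires on a full GET, prefer the -O /dev/null form.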
View 2 Replies