Software :: Download Music From An External Website Onto Ubuntu?
Jul 29, 2011
What is the best and easiest way to download music from an external website onto Ubuntu? I have both Rhythmbox and mplayer installed, but it's not clear to me how you download music so that it is recognized by those players. Would it be easier to use a terminal for downloading?
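If the site exposes a direct link to the file, a terminal download is straightforward. A minimal sketch, with a placeholder URL; if Rhythmbox's library folder is set to ~/Music (the common default), saving there should make the track show up:

    # save the file into ~/Music; the URL is a hypothetical example
    wget -P ~/Music "http://example.com/song.mp3"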
Alternative to Internet Download Manager (IDM) to download movies from any website. OK, so one of the cool things about IDM was that I was able to download movies from ..... and other sites that have video clips on their site, but now that I have switched all my computers over to Ubuntu Linux, I need an alternative, because IDM will not work with Firefox on Ubuntu Linux. So my question is: do you guys know of alternative software for downloading movies from any site, such as ..... and other sites?
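For clips that have a direct URL, wget's resume flag gives IDM-like behaviour from the terminal. A hedged sketch; the URL and filename are placeholders:

    # -c resumes a partial download, similar to what IDM does
    wget -c -O clip.flv "http://example.com/clip.flv"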
I am thinking of using rsync to sync my Music folder to another folder called Music on an external USB drive. I will be using the Scheduled Tasks front end to schedule the syncs. What should the syntax look like when I put it in Scheduled Tasks? I want this to be as simple as possible.
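A minimal sketch of the rsync line to paste into Scheduled Tasks, assuming the USB drive mounts at /media/usb (adjust the mount point and home directory to yours):

    # trailing slash on the source copies the folder's contents;
    # --delete makes the copy an exact mirror (drop it to keep removed files)
    rsync -a --delete /home/yourname/Music/ /media/usb/Music/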
I have a strange problem with my browser(s), even Epiphany. When I go to my music site [URL], log in, go to my page, and click on a link to play a song, the browser crashes. It never did this before and I don't know what I could have done to cause it. It only does this on this website. Could it be that they use a QuickTime plugin to play the music? I don't know, because I used to be able to play music on there before. It doesn't do this on my ReverbNation page, just iCompositions. If I click on the icon to restart the browser, a box comes up saying "We're sorry... your browser crashed for no apparent reason" and offers a choice to either reload the page (which repeatedly crashes) or start a new session. It isn't a big deal, but this never used to happen before. Could it be some bad code on that website?
I'd like to download a whole website for offline reading. I know wget does this, but how would it work with a site that requires a login to view content? Of course I've got the password to log in; I just don't know how to let wget know that.
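If the site uses HTTP authentication, wget can log in directly; if it uses a web login form, you can post the credentials once and reuse the session cookies for the mirror run. A sketch with placeholder names (the real form field names have to be read out of the site's login page source):

    # HTTP authentication:
    wget -r --user=myname --password=mypass "[URL]"
    # form-based login, then mirror with the saved session cookie:
    wget --save-cookies cookies.txt --keep-session-cookies \
         --post-data 'username=myname&password=mypass' "http://example.com/login"
    wget -r --load-cookies cookies.txt "[URL]"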
I'm new to Linux; I use Solaris all the time. On Solaris, I can go to sunfreeware.com to download most third-party software. But for Red Hat Linux, what's a good website to download RPMs from?
I want to download every page on a particular website that has [URL]... as the beginning, so any page that begins with that will be downloaded; for example, [URL]... I tried "wget -r [URL]..
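If the goal is to stay under a given directory prefix, the flag plain -r is missing is likely --no-parent, which stops wget from climbing above the starting directory. A sketch:

    # recursive download that never ascends above the starting path
    wget -r --no-parent "[URL]"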
The original thread was closed because "Sounds as if you are trying to steal a service which you have not paid for. We do not support that kind of activity here on Ubuntu Forums." However, it's not stealing, since I am only going to use this with accounts that I have legitimately paid for. This might not be the right place to post this; if that's the case, I apologize, please move it to the correct location. I know a guy who has a website set up where he can download files from Megaupload with his premium account without signing in. He takes the MU link's ID, e.g. http://www.megaupload.com/?d=xxxxxxxx, and adds it to the end of the URL (http://192.168.1.199/mu/?d=xxxxxxxx), and it downloads using his premium account, which is logged in on the computer that hosts his site. We don't get along well and I would rather not ask him how he does it.
How would I set this up on my own computer to use my premium account? I can see this being extremely useful if I need to download some of my artwork or projects from MU but don't want to sign in, because I'm on a public computer or because the computer has MU blocked. I want this to be a private site that only I have access to, since it's my premium account and my money. I am not asking how to circumvent Megaupload's download limit at all (I've already paid for it... no need to circumvent it).
I just need a nudge in the right direction. Thanks in advance for any help you can provide. I already have everything installed on my computer to host a site. I have a simple "Hello World" page running on my web server right now. I don't need help getting that part set up, just the rest of it. I assume this has something to do with setting up a proxy server; I just don't know how to do that and make it work the way I need.
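One way such a passthrough could be built, sketched under heavy assumptions: log in once from the server with curl and save the session cookies, then have a small CGI script forward the ?d=... query using those cookies. The paths and the login form field names are guesses, and whether the session survives this way at all is an open question; the page should also be locked down (e.g. with htpasswd) since it carries your account:

    #!/bin/bash
    # hypothetical CGI script, e.g. /usr/lib/cgi-bin/mu
    # assumes you logged in beforehand with curl and saved the cookies:
    #   curl -c /home/me/mu-cookies.txt -d 'username=me&password=secret' \
    #        'http://www.megaupload.com/?c=login'   # form fields are a guess
    printf 'Content-Type: application/octet-stream\r\n\r\n'
    curl -s -b /home/me/mu-cookies.txt "http://www.megaupload.com/?${QUERY_STRING}"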
I found out where Ubuntu One downloads music from the music store: /home/user/.local/share/ubuntuone/Purchased from Ubuntu One/. I would like to change this folder to /home/user/Music/. Is this done in Rhythmbox or Ubuntu One? Also, I moved the MP3s from the default location and they disappeared from my cloud storage; is it possible to turn this off, with a custom or default location? When does the sync take place? I know it's after the purchase, but it seems to take a while to start.
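One hedged workaround, untested given that moving the files outright broke the cloud sync: move the folder into ~/Music and leave a symlink at the old path, so both Ubuntu One and Rhythmbox keep seeing the location they expect:

    mv "/home/user/.local/share/ubuntuone/Purchased from Ubuntu One" /home/user/Music/
    ln -s "/home/user/Music/Purchased from Ubuntu One" \
          "/home/user/.local/share/ubuntuone/Purchased from Ubuntu One"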
I need to mirror a particular website (all the pages under that particular domain), plus any pages (but not whole sites) that the website links to.
How do I do this?
wget -r --level=inf (or some other variant) will mirror the site.
wget -r -H --level=1 will get all the links (from all domains) to the first level.
Does anyone have any ideas on how I could combine these, to get the entirety of the main site and one level deep into external sites? I've been banging my head against the manual all afternoon.
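One way that avoids fighting wget's flags is a two-pass sketch: mirror the main site first, then harvest the external links out of the mirrored HTML and fetch just those pages. Here example.com is a placeholder, and the grep-based link extraction is deliberately quick and dirty:

    # pass 1: full mirror of the main site (wget stays on-host without -H)
    wget -r --level=inf "http://example.com/"
    # pass 2: pull external hrefs out of the mirrored pages and fetch them
    grep -rhoE 'href="https?://[^"]+"' example.com/ | cut -d'"' -f2 \
        | grep -v '//example.com' | sort -u > external.txt
    wget --input-file=external.txt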
I have a website named [URL]... Now I want to access this website through a proxy server (Squid or similar) under my personal server named [URL]..., meaning that [URL]...
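If Squid is the choice, a reverse-proxy (accelerator) sketch for squid.conf, with placeholder hostnames standing in for the two [URL]s:

    # squid.conf fragment; www.origin.example is the site being fronted
    http_port 80 accel defaultsite=www.origin.example
    cache_peer www.origin.example parent 80 0 no-query originserver name=origin
    acl oursite dstdomain www.origin.example
    http_access allow oursite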
I'm looking to implement a website where business partners can download/upload documents. The files and the "partner areas" should be password protected. Are there open source projects or Ubuntu packages readily available for implementing this type of web-based file-sharing service?
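Before reaching for a full application, plain Apache basic authentication already covers password-protected partner areas. A minimal sketch; the account name and paths are placeholders:

    # create the password file and a first partner account
    sudo htpasswd -c /etc/apache2/.htpasswd partner1
    # then protect the area in the vhost config or an .htaccess file:
    #   AuthType Basic
    #   AuthName "Partner Area"
    #   AuthUserFile /etc/apache2/.htpasswd
    #   Require valid-user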
I searched the FAQ and did a search online for this, etc. The reason I ask is that the openSUSE 11.3 download page says not to download directly, since it is very slow and has no checksum; the ISO may therefore be flawed after you have spent 10 hours downloading it. My "fast" internet through Qwest is 1.5 Mbps download, so I assume that is why the 4.7 GB ISO is so slow to download. Now for the question: the download page suggests you use BitTorrent, since it does the checksum, but it will download the ISO no faster than the direct method. The second option is to check the metalink with the DownThemAll add-on in Firefox. I did this and had quite a time getting the ISO link for 11.3 to start downloading. I would start DownThemAll from the Firefox toolbar, and I thought, wow, it downloaded that ISO fast, how cool. Well, it was not the ISO but the website for the download page! At one point I got it to start downloading the ISO, but I came in this morning and it was still going after 17 hours, with several hours to go. I gave up and deleted it.
This afternoon I just gave in and started the download again using BitTorrent, and I am waiting for it to finish. I did everything the site indicated, but after an hour I noticed under the logger it said **** openSUSE-11.3-DVD i586 iso: PIECE 2206 FAILED HASH CHECK. It has shown this same message from the beginning of the download 7 hours ago to the last few minutes, on each section. It states it is 75% complete? Is this normal, or am I wasting 10 more hours? Sorry if this is not the place for this question, but I never had trouble with downloading 11.2 and burning it to disk. I think I just used the direct-link method and crossed my fingers. It downloaded OK!
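Whichever download method wins, the ISO can also be verified by hand afterwards, which settles the "is it flawed?" question. A sketch with a hypothetical filename; compare the output against the checksum published next to the ISO on the download page:

    sha1sum openSUSE-11.3-DVD-i586.iso
    # or, if an MD5 is published instead:
    md5sum openSUSE-11.3-DVD-i586.iso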
I need to download a file from a website which has a URL formatted like:
[URL]
This redirects to a .zip file which has to be saved. There is also a need to authenticate based on username and password.
I tried to use wget, curl and lynx with no luck.
UPDATE:
wget doesn't work with the redirection; it simply downloads the web page instead of the zip file. curl gives the error "Maximum redirection exceeded > 50". lynx also gives the same error.
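A common cause of that redirect loop is a session cookie being set on the first response and never sent back, so the site keeps bouncing the client to the same place. A sketch that follows redirects, authenticates, and keeps cookies across hops (all names are placeholders):

    curl -L --max-redirs 20 -u username:password \
         -c cookies.txt -b cookies.txt \
         -o file.zip "[URL]"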
I usually use wget to download stuff from websites when scripting. I have a new requirement that requires me to authenticate and then select some options to execute the download. How would I go about this? The first thing that comes to mind is using keyboard macros in the Windows world, but I need to do this in bash or perl.
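In bash, the usual pattern is to replay what the browser does: POST the login form once, keep the session cookies, then request the download URL with the chosen options encoded as form parameters. The field names below are hypothetical and have to be read out of the real pages (e.g. via view-source):

    # step 1: log in and save the session cookies
    wget --save-cookies cookies.txt --keep-session-cookies \
         --post-data 'user=myname&pass=mypass' 'http://example.com/login'
    # step 2: request the download with the desired options
    wget --load-cookies cookies.txt \
         --post-data 'format=zip&range=all' \
         -O result.zip 'http://example.com/export'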
Most cellphones can play music. How does one sync Ubuntu One with the music player on a cellphone? Alternatively, how does one download music from Ubuntu One to a cellphone? Access to the web/internet is assumed in this case.
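If the phone mounts as USB mass storage, the local copy of the purchased tracks can simply be rsynced across. A sketch assuming a hypothetical mount point and the purchase folder mentioned earlier on this page; --modify-window helps on FAT filesystems, whose timestamps are coarse:

    rsync -rv --modify-window=2 \
        "/home/user/.local/share/ubuntuone/Purchased from Ubuntu One/" \
        /media/PHONE/Music/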
There is a piece of streaming music I want. When I open it, the song plays using the Totem browser plugin. Is there a way I can capture the song and keep it as a file (or put it on my MP3 player), preferably without downloading anything?
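If MPlayer is installed (it is mentioned near the top of this page), it can dump a stream to disk instead of playing it; you need the stream's actual URL, which can usually be dug out of the page source. A sketch with a placeholder URL:

    # write the raw stream to a file instead of playing it
    mplayer -dumpstream -dumpfile song.dump "http://example.com/stream"

The dump is the container exactly as sent, so it may need renaming or remuxing before an MP3 player accepts it.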