When we download Ubuntu ISO images from the download mirrors, MD5SUMS, SHA1SUMS and SHA256SUMS files are provided alongside the images. But there are also some .gpg files provided. Are they used to verify a signature or something? How do I use them to verify the signatures of the downloads?
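The .gpg files are detached signatures for those checksum lists. A minimal sketch of the verification flow, assuming the SHA256SUMS naming used on most mirrors (the key ID below is the Ubuntu CD Image Automatic Signing Key; confirm it against the official Ubuntu documentation before trusting it):

Code:
# import the signing key, then check the signature on the checksum file
gpg --keyserver hkp://keyserver.ubuntu.com --recv-keys 0xFBB75451
gpg --verify SHA256SUMS.gpg SHA256SUMS
# if the signature is good, verify the ISO against the signed checksums
sha256sum -c SHA256SUMS 2>&1 | grep OK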
How do I add files (and where) for anonymous download? I installed vsftpd and configured /etc/vsftpd.conf with just a few common options: allowing anonymous access, downloads, and uploads. I can now log in as anonymous, but I don't know what to do next; I want to try downloading and uploading files.
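For reference, a sketch of the vsftpd.conf directives involved (these are real vsftpd options; the directory layout is an assumption):

Code:
anonymous_enable=YES
anon_upload_enable=YES
write_enable=YES
anon_root=/srv/ftp          # anonymous sessions are rooted here
# world-readable files under /srv/ftp become downloadable; give anonymous
# users one writable subdirectory for uploads:
#   sudo mkdir -p /srv/ftp/incoming && sudo chown ftp:ftp /srv/ftp/incoming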
I was having trouble getting PHP files to display properly on my Ubuntu 10.10 LAMP setup. Everything was installed with defaults and working properly. testphp.php worked as long as it was in the site's parent directory, but PHP files in user directories did not work: every browser I tried downloaded the PHP files located in /home/user/public_html instead of rendering them. I tried to use the help documents here: [URL]..
Finally I was browsing around in the /etc/apache2/mods-available directory and looked at the php5.conf file. Here is the relevant information from the file:
Code:
# To re-enable php in user directories comment the following lines
# (from <IfModule ...> to </IfModule>.) Do NOT set it to On as it
# prevents .htaccess files from disabling it.
<IfModule mod_userdir.c>      # <-- comment out this line
I tried to edit the help document linked above but it says not to do so! I couldn't find a reference for this fix anywhere else, so I decided to post it here.
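In case it helps the next person, a sketch of what the edited block looks like once commented out (the directives are as they appear in the stock Maverick php5.conf; double-check against your own copy):

Code:
# <IfModule mod_userdir.c>
#     <Directory /home/*/public_html>
#         php_admin_value engine Off
#     </Directory>
# </IfModule>
# then reload Apache: sudo /etc/init.d/apache2 restart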
The original thread was closed because "Sounds as if you are trying to steal a service which you have not paid for. We do not support that kind of activity here on Ubuntu Forums." However, it's not stealing, since I am only going to use this with accounts I have legitimately paid for. This might not be the right place to post this; if so, I apologize - please move it to the correct location. I know a guy who has a website set up where he can download files from Megaupload with his premium account without signing in. He takes the MU link's ID, e.g. http://www.megaupload.com/?d=xxxxxxxx, adds it to the end of his own URL (http://192.168.1.199/mu/?d=xxxxxxxx), and it downloads using the premium account logged in on the computer his site is hosted on. We don't get along well, and I would rather not ask him how he does it.
How would I set this up on my own computer to use my premium account? I can see this being extremely useful when I need to download some of my artwork or projects from MU but don't want to sign in because I'm on a public computer, or because the computer has MU blocked. I want this to be a private site that only I have access to, since it's my premium account and my money. I am not asking how to circumvent Megaupload's download limit at all (I've already paid for it... no need to circumvent it).
I just need a nudge in the right direction. Thanks in advance for any help you can provide. I already have everything installed on my computer to host a site. I have a simple "Hello World" page running on my webserver right now. I don't need help getting that part set up, just the rest of it. I assume this has something to do with setting up a proxy server - I just don't know how to do that and make it work the way I need it to.
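One plausible approach, sketched below, is an Apache reverse proxy that forwards /mu/ requests and attaches your own premium session cookie. Everything here is an assumption about how such a setup might work, not a known recipe, and the cookie value is a placeholder (needs mod_proxy, mod_proxy_http and mod_headers: sudo a2enmod proxy proxy_http headers):

Code:
<Location /mu/>
    ProxyPass http://www.megaupload.com/
    ProxyPassReverse http://www.megaupload.com/
    RequestHeader set Cookie "user=YOUR_PREMIUM_SESSION_COOKIE"
    # keep the proxy private to your own machines (Apache 2.2 syntax)
    Order deny,allow
    Deny from all
    Allow from 192.168.1.0/24
</Location>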
I have tried 3 times to download DVD-7 from http://cdimage.debian.org/debian-cd/...md64/jigdo-dvd, and every time it has failed with just 5 files left to download.
I cannot begin to describe the frustration - all those hours of downloading for nothing! What is happening here? When I try to just continue on, I get "error code 3" aborts and have to start all over.
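If it helps: jigdo-lite is resumable, so re-running it only fetches the missing pieces, and a failing mirror can be swapped out. A sketch, with a hypothetical .jigdo filename:

Code:
# re-running resumes where the last attempt stopped
jigdo-lite debian-XXX-amd64-DVD-7.jigdo
# if the same few files keep failing, pick a different mirror when prompted,
# or edit the debianMirror= line in ~/.jigdo-lite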
I have installed 9.10 Server on an old machine at home that I want to use purely for managing software downloads.
I used to use the Firefox add-on DownThemAll in my Ubuntu desktop environment, so I am looking for a web-based download manager for the server with similar features (username/password-restricted downloads, scheduling, pausing/restarting downloads).
I basically want to be able to add a bunch of downloads to a list and have the server download them. I need to be able to save a username/password combination for certain sites. I would also like to be able to see progress on the downloads and pause/resume them.
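Until a proper web front-end turns up, a crude stand-in sketch with plain wget (the paths and the single shared credential are assumptions; wget applies --user/--password to every URL in the list):

Code:
# one URL per line in queue.txt; --continue makes the queue resumable
wget --continue --input-file=queue.txt \
     --directory-prefix=/srv/downloads \
     --user=NAME --password=SECRET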
I'm using vsftpd on my server. When I connect (using FileZilla) from other computers on the same network, I can't download any files. I can upload, create directories, and delete things, but I can't download. I've disabled anonymous access and enabled local user log-in. My /etc/vsftpd.conf:
Code:
# Example config file /etc/vsftpd.conf
#
# The default compiled in settings are fairly paranoid. This sample file
# loosens things up a bit, to make the ftp daemon more usable.
# Please see vsftpd.conf.5 for all compiled in defaults.
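For that upload-works-but-download-doesn't symptom, these are the directives worth checking (all real vsftpd options), plus file permissions on the server side:

Code:
download_enable=YES    # if this is NO, retrievals are refused outright
local_enable=YES
write_enable=YES
anonymous_enable=NO
# also confirm the files are readable by the logged-in user:
#   ls -l /home/username/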
I just installed Ubuntu Server on a spare PC I have, and I plan on hosting forums on it. I don't want to install a GUI and waste resources, as the PC is not all that new. But I don't know how to download things onto the PC and store them in folders through the CLI. I just want an easy way to at least download Webmin onto it, and I think I can figure out the rest. I know how to install Webmin from the CLI.
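wget is the usual tool for this; a sketch, with an illustrative Webmin version number (check webmin.com for the current release and download URL):

Code:
mkdir -p ~/downloads && cd ~/downloads
wget http://prdownloads.sourceforge.net/webadmin/webmin_1.530_all.deb
# the install step itself is already known: sudo dpkg -i webmin_*.deb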
Whenever a client tries to download a file from my server via FTP, Samba, Teamspeak 3 file transfer, etc., they report very slow download speeds, around 3-6 kb/s. If I try an FTP transfer locally, the upload speeds are normal, but I still see slow download speeds.
My server is connected to a router, which connects to the internet. All other machines connected to that router can upload and download files at normal speeds. It seems to be a server problem; I just don't know where to start.
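Since it affects every protocol, the NIC itself is a reasonable first suspect; a duplex mismatch classically produces exactly this one-direction slowness. A quick check (eth0 is an assumption; substitute your interface):

Code:
sudo ethtool eth0 | grep -E 'Speed|Duplex'
# "Duplex: Half" against a full-duplex switch port is the classic culprit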
I'm trying to download a set of files with wget, and I only want the files and paths "downwards" from a URL, that is, no other files or paths. Here is the command I have been using:
Code:
wget -r -np --directory-prefix=Publisher http://xuups.googlecode.com/svn/trunk/modules/publisher

There is a local path called 'Publisher'. The wget run works okay, downloads all the files I need into the Publisher path, and then it starts loading files from other paths. If you see [URL]..svn/trunk/modules/publisher, I only want those files, plus the paths and files beneath that URL.
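-np only stops wget from ascending into the parent; it does not stop it following links that point elsewhere on the same site. Restricting recursion to the subtree should do it - a sketch:

Code:
wget -r -np --directory-prefix=Publisher \
     --include-directories=/svn/trunk/modules/publisher \
     http://xuups.googlecode.com/svn/trunk/modules/publisher/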
I recently installed vsftpd and can't see any files I put in the nopriv_user folder.
I want to be able to login anonymously without a password, see files to download and upload files to a pub folder.
After installing and configuring vsftpd I created the ftp_priv user by doing "sudo useradd -d /home/ftp_priv -m ftp_priv". Then I added a folder /home/ftp_priv/pub with permissions 777 and a test file.
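One thing that may be going wrong: nopriv_user only names the unprivileged identity vsftpd runs as; it does not decide which directory anonymous clients see. Anonymous sessions are rooted at the home directory of the ftp user, or wherever anon_root points (a real vsftpd option), so a sketch of pointing it at the directory above:

Code:
# in /etc/vsftpd.conf
anon_root=/home/ftp_priv
# and make sure the path is traversable by everyone:
#   chmod 755 /home/ftp_priv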
I just set up an Ubuntu server with WebSVN, WebDAV, etc. It works really nicely, but I would additionally like an easy web interface for sharing some files with friends/colleagues - something like WebShare from Helios, or MyDrive, with easy access and user control.
I have been searching for several hours now and haven't found anything really useful. Is there any open source tool or Apache plugin that can do this?
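If nothing dedicated turns up, stock Apache can fake the basics with a directory listing behind basic auth - a minimal sketch (paths and user names are assumptions):

Code:
Alias /share /srv/share
<Directory /srv/share>
    Options +Indexes
    AuthType Basic
    AuthName "File share"
    AuthUserFile /etc/apache2/share.passwd
    Require valid-user
</Directory>
# add users with: sudo htpasswd -c /etc/apache2/share.passwd friend1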
I know that you can use wget and cron to schedule downloads on a regular basis; I just don't know how. I wanted to download videos from this website: eed/M...?subshow=false, but I don't know exactly how.
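The pattern is simply a crontab entry that invokes wget. A sketch - the feed URL in the post is truncated, so VIDEO_URL below is a placeholder:

Code:
# crontab -e, then add a line like this to fetch every night at 02:00
0 2 * * * wget --continue --directory-prefix=$HOME/videos "VIDEO_URL"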
I couldn't find an answer myself. The problem is that every single BitTorrent program I have tried downloads files I marked to be skipped, so I know this is not a program-specific issue. For example, if a torrent contains two pictures and I select only one, in the end I have both downloaded.
I can't download files from my server. Code: ...
It worked before, when the server used port 21, but now that they have changed it to 4700 it doesn't. I have emailed them, and they got it to work with FlashFXP, but that is freeware and it is for Windows.
In FileZilla I get "Error: Failed to retrieve directory listing".
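A failed directory listing on a non-standard port is usually a passive/active mode problem rather than the port number itself. A sketch of testing both modes from the command line with lftp (username and host are placeholders):

Code:
lftp -p 4700 -u username ftp.example.com
lftp> set ftp:passive-mode off     # toggle on/off and retry the listing
lftp> ls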
I recently downloaded Ubuntu 10.04 and have just solved my network problem. I enjoy listening to music and I want to download it... but how? I have the Firefox browser and have tried to download MP3 files (usually off BeeMP3). Once I press download, it immediately brings me to a new window or tab and just plays the file for me. There is no download, no matter what application I install for Linux or Firefox. Is it just Linux that doesn't let me download MP3 files, or is it something to do with Firefox?
On Lucid with Sun Java installed (and its open source cousins removed), I've installed Vuze and opened the incoming port in my firewall. All the configuration and tests run fine, except:
* The links on the Getting Started page don't work.
* Search doesn't work.
* Even from the Vuze HD Network tab, I can click to download and nothing happens.
I have Vuze installed on a Windows machine wired to the same router, and the application there worked fine right from the start. What more do I need to do to make it work on Lucid?
Well, a bit earlier today I was trying to install the latest VLC following steps from this link. It instructed:

Code:
sudo add-apt-repository ppa:gnome-media-player-development/development
sudo apt-get update

But after the sudo apt-get update I started getting errors at the end. Here is my log after running it:
Code:
sendblink23@sendblink23-Ubuntu:~$ sudo apt-get update
Hit http://us.archive.ubuntu.com lucid Release.gpg
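If the errors are 404s from the newly added PPA (i.e. it publishes nothing for Lucid), removing its sources entry makes apt-get update clean again. A sketch - the exact filename under sources.list.d varies, so list the directory first:

Code:
ls /etc/apt/sources.list.d/
sudo rm /etc/apt/sources.list.d/gnome-media-player-development-development-lucid.list
sudo apt-get update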
Suppose we want to install a program, so we run: sudo apt-get install program_name. After installing, we can indeed run that program, but where can we actually find the originally downloaded file in Ubuntu?
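apt keeps the fetched .deb packages in its cache, which is a documented location:

Code:
ls /var/cache/apt/archives/
# apt-get clean deletes the cached .debs once you no longer need them
sudo apt-get clean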
W: Failed to fetch http://ppa.launchpad.net/alexftimie/...source/Sources 404 Not Found
W: Failed to fetch http://ppa.launchpad.net/alexftimie/...-i386/Packages 404 Not Found
E: Some index files failed to download. They have been ignored, or old ones used instead.
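The 404s mean that PPA no longer publishes packages for this release, so commenting out its entry stops the errors. A sketch - the filename pattern under sources.list.d is an assumption based on the PPA owner's name:

Code:
sudo sed -i 's/^deb/#deb/' /etc/apt/sources.list.d/alexftimie*.list
sudo apt-get update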