Fedora Networking :: Download All The Files In An Http:// Folder?
Sep 18, 2009
Yahoo! is shutting down GeoCities and I need to download all the files in my web folder there. Is there a program that will download all the files automatically?
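A common approach is wget's recursive mode; a minimal sketch, assuming the folder is publicly reachable over HTTP (the URL below is a placeholder for your actual GeoCities folder):
Code:
# -r recurse, -np don't ascend to the parent directory, -nH drop the hostname
# from the local paths
wget -r -np -nH http://www.geocities.com/yourusername/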
I have a question about using Ubuntu to download files from an HTTP server remotely and didn't know where to put it, so hopefully it falls under general support. I am about to move into a place with an incredibly slow internet connection and a tiny data allowance. My brother has said that, if possible, I can use his internet connection to download any large files to a box I leave at his place; then I can come over every few weeks, copy those files to a hard drive, and all will be well. The problem is that I am not sure how to do this.
Today I went out, bought a few parts, and built a cheap computer with a HDD big enough to hold whatever I need. However, when I got home I realised I had no idea how I was going to handle the software side of this. Is there any way I can access that computer remotely over the internet and schedule fairly large downloads from an HTTP server? Also, after talking to a friend I was told that I need to install the server version of Ubuntu for this to work; is that correct? If it's relevant, the computer uses an "Intel Desktop Board D510MO + Intel Atom Processor D510", which is 64-bit.
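You don't need the server edition for this; any install running an SSH server will do. A minimal sketch: log in over SSH and start the download inside a screen session so it survives the disconnect (the hostname and URL are placeholders):
Code:
ssh user@brothers-box        # reachable via router port forwarding or a dynamic-DNS name
screen                       # detachable terminal session
wget -c http://example.com/big-file.iso   # -c resumes partial downloads
# detach with Ctrl-A d; the download keeps running after you log out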
How would I make the files in /home/user01/file available on the web as [URL]? Is it possible to require anyone who accesses that link to log in as user01?
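One common way, assuming Apache is serving the site, is an Alias plus basic authentication; the paths and names below are illustrative:
Code:
# in httpd.conf or a conf.d/ snippet (Apache 2.2-era syntax)
Alias /user01 /home/user01/file
<Directory /home/user01/file>
    AuthType Basic
    AuthName "user01 files"
    AuthUserFile /etc/httpd/htpasswd   # created with: htpasswd -c /etc/httpd/htpasswd user01
    Require user user01
</Directory>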
I'm trying to move font files (.ttf and .otf) from the download folder to a folder Inkscape can find them in. I tried dragging and dropping them in Dolphin but I don't have permission! So I tried in the terminal:
Code:
~$ mv ~/downloads/fonts/*.*tf /usr/share/fonts
mv: cannot stat `/home/bryan/downloads/fonts/*.*tf': No such file or directory
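Two things are likely going on here: the default download directory is usually capitalized (~/Downloads), and writing to /usr/share/fonts requires root. A sketch, assuming the fonts really do live under ~/Downloads/fonts:
Code:
sudo mv ~/Downloads/fonts/*.*tf /usr/share/fonts/
sudo fc-cache -f    # rebuild the font cache so applications see the new fonts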
Using netcat, nc(1), craft a valid HTTP/1.1 request for getting the HTTP headers (not the HTML file itself!) for the main index page of www dot aalto dot fi. What request method did you use? Which headers did you need to send to the server? What was the status code for the request? Which headers did the server return? Explain the purpose of each header.
Code:
nc -v www dot aalto dot fi 8080
HEAD / HTML/1.1
host: www dot aalto dot fi
And it returns:
Code:
200 OK
Content-Length: 858
Content-Type: text/html
Last-Modified: Thu, 02 Sep 2010 12:46:01 GMT
[Code]....
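For reference, the protocol token is HTTP, not HTML, and plain web servers normally listen on port 80, not 8080; a corrected sketch (HEAD is the method that returns headers only, and HTTP/1.1 requires a Host: header):
Code:
printf 'HEAD / HTTP/1.1\r\nHost: www.aalto.fi\r\nConnection: close\r\n\r\n' | nc -v www.aalto.fi 80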
I really don't know what that output means. Question 2: Using netcat, nc(1), start a bogus web server listening on the loopback interface, port 8080. Verify with netstat that the server really is listening where it should be. Direct your browser to the bogus server and capture the User-Agent: header. It's this last part, directing the browser to the bogus server and capturing the User-Agent: header, that I don't understand.
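A minimal sketch of what the exercise is asking for (flag syntax differs between netcat variants):
Code:
# terminal 1: listen on the loopback interface, port 8080
nc -l 127.0.0.1 8080        # traditional netcat wants: nc -l -p 8080
# terminal 2: confirm it is listening
netstat -tln | grep 8080
# then point a browser at http://127.0.0.1:8080/ ; the raw request, including
# the User-Agent: line, is printed by the nc in terminal 1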
I am trying to set up my web server and make a website run under suexec, but Apache fails to start and SELinux is giving me errors. I don't really know what to do with it; it suggests a command to type, but I'm not sure whether that will make my server less secure. The SELinux error is as follows:
Code: Summary: SELinux prevented httpd reading and writing access to http files.
Detailed Description: SELinux prevented httpd reading and writing access to http files. Ordinarily httpd is allowed full access to all files labeled with http file context. This machine has a tightened security policy with httpd_unified turned off; this requires explicit labeling of all files. If a file is a cgi script it needs to be labeled with httpd_TYPE_script_exec_t in order to be executed. If it is read-only content, it needs to be labeled httpd_TYPE_content_t; if it is writable content, it needs to be labeled httpd_TYPE_script_rw_t or httpd_TYPE_script_ra_t. You can use the chcon command to change these contexts. Please refer to the man page "man httpd_selinux" or FAQ [URL]. "TYPE" refers to one of "sys", "user" or "staff", or potentially other script types.
Allowing Access: Changing the "httpd_unified" boolean to true will allow this access: "setsebool -P httpd_unified=1"
Fix Command: setsebool -P httpd_unified=1
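A more targeted alternative to flipping the boolean is to label the files explicitly with chcon, as the message suggests; a sketch, assuming TYPE=sys and the illustrative paths below:
Code:
chcon -R -t httpd_sys_content_t /var/www/html              # read-only content
chcon -t httpd_sys_script_exec_t /var/www/cgi-bin/app.cgi  # an executable script (illustrative path)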
I will write down how I set up my server so maybe you can see a mistake I made. First I changed my Apache httpd.conf by adding the following:
Code:
NameVirtualHost 192.168.1.2:80
<VirtualHost 192.168.1.2:80>
    ServerName localhost
    DocumentRoot /var/www/html
    DirectoryIndex index.html index.shtml index.php
</VirtualHost>
Then I created the username "ulyaoth" with the group "ulyaoth", as specified in my suexec configuration. I then created all the directories specified in my httpd.conf and ran "chown ulyaoth:ulyaoth (dirname)" on them to set the right user and group.
I am used to Ubuntu's simple sharing with Samba: just install it, reboot, and then share the files. Then I click on the network folder and see all the files shared by the computers on the network.
How do I install it so that I only need to go into the network folder to see the other computers' shared files? And then, how do I share files myself?
I hope it's not so difficult that I have to change config files.
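If it does come down to a config file, a share definition in /etc/samba/smb.conf is only a few lines; a sketch with an illustrative path:
Code:
[shared]
   path = /home/user/shared
   read only = no
   guest ok = yes
# then restart Samba, e.g. sudo /etc/init.d/samba restart (the service name varies by release)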
I have Fedora 11 and have successfully set up VNC so that I can connect within the network (a direct VNC client/server setup). I am in the middle of working on a problem with a friend who is in another city and was trying to set up VNC so that he could connect via HTTP, which I know is possible (it listens on 5800 + the display number; mine is 3 in this instance). For the life of me I cannot get the Java HTTP viewer to work. Here is what I have done so far. My vncservers file:
[Code]...
My router is set up to forward incoming requests on the ranges 5900-5910 and 5800-5800 to this computer. I have opened ports 5902, 5903, 5802, and 5803 on my Linux firewall. So far as I know, this is all that is necessary. Yet whenever I attempt to connect via [URL]:5803 I get a timeout error. What could be wrong?
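A quick way to check each hop, assuming display :3 (Java viewer on 5803, VNC itself on 5903):
Code:
# on the VNC host: is anything actually listening on those ports?
netstat -tln | grep -E ':(5803|5903)'
# from outside the network: does the port get through the router at all?
nc -vz your-public-ip 5803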
I can ping certain websites, such as adobe.com, but I cannot access them via HTTP (i.e. through Firefox or yum). Some websites work through HTTP, like Google, while others don't. The ones that don't are always the same.
What really hurts is that I cannot yum from all the repos I'd like to. Since the same sites cannot be accessed through Firefox, I imagine there is some underlying problem with my system's HTTP setup.
My Windows machine on the same network works fine. I have had this problem since I installed Fedora 10 about 4 months ago. I'd rather not reinstall, as nothing else really seems broken (aside from this HTTP issue) and my system is completely up to date.
If I use a public proxy website I can get to the sites I can't connect to directly. I've followed a FAQ from Mozilla for Firefox that hasn't helped, but I don't think it's a Firefox issue since yum suffers as well. I also followed the Fedora FAQ, and I have been using OpenDNS servers.
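Symptoms like this (ping works, a fixed subset of HTTP sites hangs, a proxy works) are often a path-MTU problem rather than DNS; a couple of hedged checks:
Code:
curl -v http://www.adobe.com/ > /dev/null   # watch where the connection stalls
ping -c 3 -M do -s 1472 www.adobe.com       # 1472 + 28 header bytes = 1500; failures suggest an MTU issue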
I configured a website on a dedicated server using WHM/cPanel. The site was uploaded using the master account for the website.
The security issue is that public users are able to upload files onto my server via the website. They could even access the root and execute whatever they want on the server.
I have consulted with 2-3 Linux experts. According to them, the PHP user has rights to execute anything on the server and to upload and store files in whichever folder it wants.
Can I protect my folders to prevent file uploads via the website? The application has security vulnerabilities, but I want to keep attackers out of my site until the vulnerabilities are fixed.
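As a blunt stopgap you can take write access to the docroot away from the web-server user entirely; a sketch with illustrative paths (legitimate uploads through the site will also stop working until you revert it):
Code:
chown -R root:root /home/site/public_html
find /home/site/public_html -type d -exec chmod 755 {} \;
find /home/site/public_html -type f -exec chmod 644 {} \;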
I'm quite lost and not even 100% sure which forum I should aim at. I have a relatively simple task, yet for some reason it's proving difficult! Our server is running Fedora (hosting a live commerce site). Recently some security updates were applied, so I can no longer connect remotely via SSH as root. Using PuTTY, I now connect under a different username and then su to root. All is fine; now all I want to do is download a file in the tmp2 directory: /tmp2/apache2-gdb-dump. Can anyone tell me: 1) How would I download this file using PuTTY/SSH commands? 2) I'd much rather use a GUI tool for this kind of work, but the substitute-user step doesn't seem to be supported by common apps such as FileZilla. With this in mind, is there some software, or steps I can take, so I can connect, run su, and still use a nice GUI to transfer these files?
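One workable pattern: after su'ing to root, stage the file where your unprivileged login user can read it, then pull it with any SCP/SFTP client. The username 'myuser' below is a placeholder:
Code:
# in the PuTTY session, as root:
cp /tmp2/apache2-gdb-dump /home/myuser/
chown myuser:myuser /home/myuser/apache2-gdb-dump
# then from your desktop (pscp ships with PuTTY):
pscp myuser@server:/home/myuser/apache2-gdb-dump .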
I'm having trouble understanding how to verify the download of the Fedora ISO files. I know how to do this on a Windows system. I have been looking in the help section for checking the ISO files, but I'm not sure where to find the right hashes, like MD5, SHA1, etc.
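The hashes are published in a *-CHECKSUM file alongside the ISOs on the Fedora mirrors; a sketch, with illustrative file names:
Code:
sha256sum Fedora-14-x86_64-DVD.iso       # compute the hash locally and compare by eye
# or let sha256sum do the comparison against the published file:
sha256sum -c Fedora-14-x86_64-CHECKSUM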
When I try to view PHP files on my Linux box, they want to download instead of being displayed. I configured Apache for PHP as the manual said, but for some reason it doesn't want to parse the PHP. I was told the httpd.conf file might need to be changed because the line "AddModule mod_php4.c" was missing; however, the AddModule and ClearModuleList directives no longer exist in the newer versions of Apache. These directives were used to ensure that modules could be enabled in the correct order. The new Apache 2.0 API allows modules to explicitly specify their ordering, eliminating the need for these directives.
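On Apache 2.x, PHP is wired up with a LoadModule line and a handler mapping instead; a sketch, with module paths that vary by distribution:
Code:
LoadModule php5_module modules/libphp5.so
AddType application/x-httpd-php .php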
I have tried 3 times to download DVD-7 from http://cdimage.debian.org/debian-cd/...md64/jigdo-dvd, and every time it has failed with just 5 files left to download.
The errors it gives I cannot begin to describe. All those hours of downloading for nothing! What is happening here? When I try to just continue on, I get error code 3 aborts and have to start all over.
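jigdo-lite is normally resumable: re-running it in the directory containing the partial download retries only the missing pieces, and you can give it a different mirror when prompted. The .jigdo file name below is a placeholder:
Code:
jigdo-lite debian-506-amd64-DVD-7.jigdo   # re-run in the same directory to resume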
I'm having a weird problem with GNOME. Inside a folder I have several files, and when I open the folder graphically I can't see some of them. If I use my terminal I can see those files. It's very strange! I've rebooted my computer and the problem is still there (in the same folder).
I'm setting up a network between 2 PCs, where one should act as a "file server" and the other as a normal PC to surf the internet. The server is called ORLA-DESKTOP and the other PC, which connects to the server and automounts the shared folder to the desktop, is called OLGA-DESKTOP. Both PCs run Ubuntu 10.04 Lucid Lynx. The shared folder is located on the server in /home/orla/svenson.
ORLA-DESKTOP has 2 users, "olga" and "orla", in a group called "svenson". OLGA-DESKTOP has 1 registered user, "olga", also in a group called "svenson".
Users on ORLA-DESKTOP can read/write/append and fully manage everything in the shared folder. But on OLGA-DESKTOP, the user can create a file on the PC and then drag and drop it into the shared folder, and can also delete files in the shared folder, but cannot create a file directly in the folder as on ORLA-DESKTOP. I have 3 configuration files: 2 for automounting, located on OLGA-DESKTOP, and 1 for the Samba server configuration, located on the server, ORLA-DESKTOP.
The first one is /etc/fstab
Code:
# /etc/fstab: static file system information.
#
# Use 'blkid -o value -s UUID' to print the universally unique identifier
# for a device; this may be used with UUID= as a more robust way to name
# devices that works even if disks are added and removed. See fstab(5).
[code]....
To sum it all up, the real problem is that OLGA-DESKTOP can't create or append to files in the shared folder, while users on the server have no trouble doing it.
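When a cifs mount behaves like this, the usual culprit is missing ownership and permission options on the client-side mount; a sketch of the kind of fstab entry that fixes it, with illustrative paths:
Code:
//ORLA-DESKTOP/svenson  /home/olga/Desktop/svenson  cifs  credentials=/etc/samba/creds,uid=olga,gid=svenson,file_mode=0664,dir_mode=0775  0  0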
I keep downloading tar.gz files into my downloads folder and I can't do anything with them. What do I need to do to install the files so I can use them? As an example, I am trying to install Frets on Fire, and am failing badly.
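A tar.gz is just a compressed archive; unpack it and read the bundled instructions. The file name pattern below is an assumption:
Code:
cd ~/Downloads
tar xzf FretsOnFire-*.tar.gz    # x extract, z gunzip, f file
cd FretsOnFire-*
less README                     # most tarballs document their own install/run steps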
I am having problems seeing files and folder names using Nautilus, but they are there as I can access them using CLI commands. Is there a way to get Nautilus to update its database or whatever it uses? I am using Nautilus 2.32.2.1. As is shown in my signature I am using F14 and Gnome 2.32.0.
I was happily running F10 and, against my better judgement, when it offered to upgrade me to F11 I decided to give it a try. The F11 install hung at 893 of 1626 packages installed; some SELinux package was in the process of installing. On the next boot the installer fails to upgrade because of a corrupted root. It says I can backtrack and do a full install, but crashes with a bug when I choose backtrack.
So here is my question: can I edit the command line and tell the installer to do a full install rather than an upgrade? Or am I stuck with downloading a DVD ISO and doing a full install that way? I've done F8, F9 and F10, so a full install from DVD doesn't bother me.
In the Linux bash shell, for a given directory, how can I list: the creation date of that directory, the number of files in that directory, and the number of subdirectories in that directory?
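A sketch for a directory $d. One caveat: ext3 does not store a true creation time, so stat can only report change/modify times:
Code:
d=/path/to/dir
stat -c '%z' "$d"                                  # last status-change time (no birth time on ext3)
find "$d" -maxdepth 1 -type f | wc -l              # files directly inside
find "$d" -mindepth 1 -maxdepth 1 -type d | wc -l  # immediate subdirectories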
When I used Windows there was this wonderful editor named Notepad++. It was perfect (it still is). Some of its best and most useful features (for me) were:
1- Open all files in a folder by dragging and dropping the folder onto it
2- Search and replace a statement in all open files
3- An extended search mode which includes special characters like
and so on. Is there an editor with these features in Ubuntu?
I've been using RedHat/Fedora for years now, and every now and then I encounter the following situation:
I open a folder and it's empty. The folder contained files and I'm 100% sure I didn't delete them myself. Each time, the folder is deep in the hierarchy and sits among other untouched folders. Sometimes it's a folder I never use, sometimes it's a folder I use almost every day. The missing content is not large (a few regular files).
I'm currently running F13, but I've seen this behaviour on previous versions too. This is kind of scary: all my work is there, and my backups are also done on a Linux backup server.
I'm puzzled. I cannot see anything specific about these folders, and I had no crash or cold reboot; nothing I can see explains it. Could it be related to ext3?
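It is worth ruling out filesystem damage before anything else; a hedged check for an ext3 root on a sysvinit-era system:
Code:
sudo touch /forcefsck    # schedules a full fsck on the next reboot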