Software :: Unable To Get The File In Http Server Through Wget Command?
Jun 24, 2011
I am using an FC9 server on which I installed the Apache web server. I put a data file in my html folder; when I try to download it remotely through a web browser I can get the file, but when I try to fetch it remotely with the wget command I cannot get it; wget fails with "Connection timed out. Retrying". Below are the steps I tried:
my target file is http://X.X.X.X/test.zip
wget -T 0 http://X.X.X.X/test.zip
wget http://X.X.X.X/test.zip
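If a browser on the same remote machine can fetch the file but wget cannot, one thing worth ruling out (this is an assumption, not something the post confirms) is a proxy environment variable that wget honours but the browser does not, plus the server-side firewall. A minimal sketch:

Code:
# Would wget go through a proxy?
env | grep -i proxy
# Retry bypassing any proxy, with a short timeout and limited retries
wget --no-proxy -T 30 -t 3 http://X.X.X.X/test.zip
# On the Apache server: confirm something is actually listening on port 80
netstat -tln | grep ':80 '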
I need to write a script that executes a wget. The difficulty is that if wget just sits there with no reply for a very long time, I don't want the script to run that long.
How can I time out the wget command in the script if it doesn't give me a reply in a minute or so?
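Two common ways to bound the wait are wget's own timers and the coreutils timeout(1) wrapper (the latter needs a reasonably recent coreutils). The 60-second values and the URL below are just placeholders:

Code:
# Let wget give up on its own: 60 s per network operation, a single attempt
wget --timeout=60 --tries=1 http://example.com/file.tar.gz
# Or wrap the whole command so the script never blocks longer than a minute
timeout 60 wget http://example.com/file.tar.gz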
We have a Red Hat server and I'm using wget in crontab to run some PHP scripts. We've been doing this for some time now and it's been working fine. I tried to add another script using wget to run a PHP script behind HTTP authentication. However, despite the fact that the URL works fine and the username and password are correct, we get Connection Timed Out errors every time. What might cause wget to work for unauthenticated URLs, but not authenticated ones?
I've tried --user=/--password=, --http-user=/--http-password and Username:Password@ in the URL and all three fail the same way. Here's the command in question:
Again, wget works, the file with authentication works, but wget calling the file with authentication does not work.
UPDATE: Actually, I get the same timeout if I access the authenticated URL without authentication. Could that mean that Apache is rejecting wget requests for authentication outright?
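Since the timeout happens even without credentials, the server may be treating wget's requests differently from the browser's, for example by User-Agent; that is only a guess here. A sketch of two things to try, with USER, PASS and the URL as placeholders:

Code:
# Send credentials preemptively instead of waiting for a 401 challenge
wget --http-user=USER --http-password=PASS --auth-no-challenge http://example.com/protected/script.php
# Same request pretending to be a browser, to test for User-Agent filtering
wget --user-agent="Mozilla/5.0" --http-user=USER --http-password=PASS http://example.com/protected/script.php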
I am having a problem with CentOS 5.4 that I did not have with 4.5. The problem happens only sometimes, but in specific instances. In summary, certain network transactions time out. The specific instances are with wget, rpm, and http, and the problem usually, but not always, occurs with pptp-related downloads (not running pptp, just fetching pptp packages). For instance, the following command, which finishes in seconds on non-5.4 operating systems: wget [URL] downloads about 20% and then gets stuck; about 5 minutes later it downloads another 20% and gets stuck again, and so on. The same thing happens with rpm: rpm -ivh [URL] waits about 3 minutes and then gives an error. I think it does the same thing as wget, but wget keeps trying while rpm gives up. The error from rpm: Retrieving [URL] ...and five minutes later:
I can wget the package as I mentioned before and install it that way. Before I do that, yum works fine; afterwards, yum exhibits the same timeout behavior (because it is using the pptp repository). Visiting the pptp web site from Firefox also times out on certain pages. I originally thought it was some problem with the pptp site, but I noticed that logging into hotmail.com does the same thing (and is fine on other operating systems). A Wireshark capture of the wget (pptp) shows my machine receiving a reassembled TCP PDU from 216.34.181.96 (SourceForge), sending an ACK, receiving a reassembled PDU, sending an ACK, receiving, sending, followed by roughly 5 minutes of nothing. Then SourceForge sends an RST and a SYN and the process repeats.
When I put the machine directly on an AT&T IP connection (12.147.X.Y) everything worked fine. Same with Comcast on a direct link. The times I have problems are when our router is hooked up to a Comcast IP (70.88.X.Y) and assigns 192.168.5.X addresses to our machines. So when I was doing the above from 192.168.5.27, going through the router and then Comcast, is when I had the problem. It is probably something with the router, but it is hard to figure out, since CentOS 4.5 and Fedora do not exhibit this behavior, nor does 5.4 on most sites (mail.yahoo.com for instance). I did verify, as far as I could, that ICMP types 3 and 4 are not being blocked; if they were, the same problem would happen on other operating systems. And I was able to ping, albeit just locally, but we looked at the router settings and ping was not blocked.
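The symptoms (stalls behind the router, fine on a direct link) would be consistent with a path-MTU problem, but that is only a guess from the description. A capture plus a quick MTU probe from the affected machine would help confirm or rule it out; eth0 is assumed to be the interface in use:

Code:
# Capture one stalled transfer for comparison with the Wireshark trace
tcpdump -i eth0 -w stall.pcap host 216.34.181.96
# Probe the largest unfragmented payload that gets through the router
# (1472 bytes of payload + 28 bytes of headers = a 1500-byte Ethernet MTU)
ping -M do -s 1472 216.34.181.96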
I would like to find out how to use both curl and wget to send an HTTP POST to get the hostnames of a few servers. I know I am not showing any work of my own, but the reason is that I am really lost and do not even know how to start.
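A starting point, with the endpoint URL and the form field made up purely for illustration; the real servers would have to expose something that answers a POST with their hostname:

Code:
# With curl: -d sends a POST body and the response is printed
curl -d "action=hostname" http://server1.example.com/info.php
# The same request with wget, writing the response to stdout
wget --post-data="action=hostname" -O - http://server1.example.com/info.php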
I tried: scp -r -P 1133 root@XX.XX.XX.XX:/home/images Shouldn't that recursively copy /home/images from the server XX.XX.XX.XX through SSH on port 1133? By the way, I know you can do it with tar or just a regular FTP program, but the folder I am trying to copy is 40 GB and there isn't enough free space to make a tar (if the server could even do it).
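scp needs a destination as its last argument; without one it has nowhere to put the files locally. A sketch, where the trailing '.' means "into the current directory":

Code:
# -P sets the SSH port, -r copies the directory tree recursively
scp -r -P 1133 root@XX.XX.XX.XX:/home/images .
# rsync over SSH is an alternative that can resume and show progress,
# which is handy for 40 GB
rsync -av -e "ssh -p 1133" root@XX.XX.XX.XX:/home/images .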
I am trying to install Fedora 12 on a machine from my HTTP server, but the install fails on the client-side machine with an error like "unable to retrieve". Here are the steps I followed:
1) I copied all the RPMs from the F12 DVD to /var/www/html/rpm.
2) I started the HTTP server (it starts without any problem).
3) On the client-side machine I booted from the F12 DVD and, after askmethod, selected URL.
4) I entered the IP addresses: 192.168.10.2 (client IP), 192.168.10.1 (gateway), and 192.168.10.1 (nameserver).
5) After that I entered the [URL]. This is my httpd.conf file:
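An "unable to retrieve" error at this stage often comes down to the web server not actually serving the copied tree (permissions or SELinux context), though that is only a guess here. A couple of quick checks, assuming 192.168.10.1 is the HTTP server's own address:

Code:
# On the HTTP server: make sure the copied tree is readable and labelled for Apache
ls -ld /var/www/html/rpm
restorecon -R /var/www/html/rpm
# From any other machine on the LAN: confirm Apache really serves the directory
wget -S -O /dev/null http://192.168.10.1/rpm/

If the directory comes back over HTTP, the next thing to check is that the URL typed into the installer matches the path Apache serves.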
I have a Debian Lenny box running Apache 2 and PHP 5.2.6.
When a request is made via https, php displays the content fine. If the request is made over HTTP the file is offered for download, rather than displaying it.
I know it's probably something trivial, but I've never seen this issue.
The plot thickens: I can display PHP over HTTP in some directories but not in others (which offer the file for download instead).
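Behaviour that differs between the HTTP and HTTPS virtual hosts, or between directories, usually means the PHP handler is only configured in some of them; that is an assumption here, not something the post confirms. A quick way to compare on Debian:

Code:
# Is mod_php loaded at all?
apache2ctl -M 2>/dev/null | grep -i php
# Compare what each enabled vhost and conf fragment says about PHP
grep -ri "php" /etc/apache2/sites-enabled/ /etc/apache2/conf.d/ 2>/dev/null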
Using netcat, nc(1), craft a valid http/1.1 request for getting http headers (not the html file itself!) for the main index page of www dot aalto dot fi. What request method did you use? Which headers did you need to send to the server? What was the status code for the request? Which headers did the server return? Explain the purpose of each header.
Code:
nc -v www dot aalto dot fi 8080
HEAD / HTML/1.1
host: www dot aalto dot fi

And it returns:

Code:
200 OK
Content-Length: 858
Content-Type: text/html
Last-Modified: Thu, 02 Sep 2010 12:46:01 GMT
[Code]....
I really don't know what that output means. Question 2: "Using netcat, nc(1), start a bogus web server listening on the loopback interface port 8080. Verify with netstat(8) that the server really is listening where it should be. Direct your browser to the bogus server and capture the User-Agent: header." I don't understand the part about directing my browser to the bogus server and capturing the User-Agent: header.
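For question 1, the protocol name is HTTP (not HTML), the default HTTP port is 80 rather than 8080, and an HTTP/1.1 request needs a Host: header followed by a blank line. For question 2, "bogus server" just means an nc process listening on 127.0.0.1:8080 so you can read what a real browser sends to it. A sketch (nc option syntax varies slightly between netcat versions):

Code:
# Question 1: HEAD asks for the headers only, not the page body
printf 'HEAD / HTTP/1.1\r\nHost: www.aalto.fi\r\nConnection: close\r\n\r\n' | nc www.aalto.fi 80
# Question 2: listen on the loopback interface, port 8080
nc -l 127.0.0.1 8080        # traditional netcat needs: nc -l -p 8080
# verify it is listening, then point a browser at http://127.0.0.1:8080/
netstat -tln | grep 8080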
I am running Apache httpd-2.2.3-43.el5.centos.3. When I restart httpd, it reports the following error: "Invalid command 'JkSet', perhaps misspelled or defined by a module not included in the server configuration".
Do I need to install anything, like Tomcat, or include some configuration setting in Apache? Kernel version: 2.6.18-194.32.1.el5.
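JkSet and the other Jk* directives come from mod_jk, the Apache/Tomcat connector, so the error usually means the module is referenced in the configuration but not loaded; that is the assumption behind this sketch:

Code:
# Which config file references the Jk directives?
grep -ri "jk" /etc/httpd/conf /etc/httpd/conf.d
# Is mod_jk actually loaded?
httpd -M 2>/dev/null | grep -i jk
# If it is not: either install/build mod_jk and add its LoadModule line,
# or comment out the Jk* lines if Tomcat integration is not needed.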
I'm running a web server and I've uploaded several .txt files. I want them to be downloadable; for example, if someone opens [URL], the file should start downloading rather than just open in the browser.
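One common way to do this with Apache is to mark the .txt files as attachments. This sketch writes an .htaccess file into the directory holding the files; the path is a placeholder, and it assumes mod_headers is enabled and AllowOverride permits .htaccess there:

Code:
# /var/www/files is a placeholder for the directory with the .txt files
cat > /var/www/files/.htaccess <<'EOF'
<FilesMatch "\.txt$">
    Header set Content-Disposition "attachment"
</FilesMatch>
EOF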
I installed Nagios on my Ubuntu 10.04 server using apt-get, and when I accessed the web console everything was OK. I then made some changes to Apache (creating some new virtual sites), and since then Nagios gives me a warning for HTTP with the message "HTTP WARNING: HTTP/1.1 404 Not Found". The sites that I created are working perfectly. I noticed that the attempts are at 4/4. Does this need to be reset, or does Nagios reset it automatically once it detects the issue is resolved?
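Nagios clears the warning by itself once the check comes back OK, so nothing needs a manual reset; the warning persists because the check keeps getting a 404. It may help to run the same check by hand and see which URL it actually requests; the plugin path below is the usual Ubuntu location and the hostnames are placeholders:

Code:
# Run the HTTP check manually, the way Nagios does
/usr/lib/nagios/plugins/check_http -H localhost
# If the default site no longer answers on /, point the check at a real vhost
/usr/lib/nagios/plugins/check_http -H www.example.com -u /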
We're having an issue with HTTP POST file uploads on our two Ubuntu PCs. For some reason, whenever one of our users attempts to submit a file in an HTML form, the request times out, usually with a 500 Internal Server Error message. This problem is not limited to one site, but occurs on all sites that use file uploads. Also, the problem does not appear to be with our network, as a Windows 7 PC on the same network can upload files to the same sites without any difficulties. The problem is not browser-specific; we have tested with Firefox, Epiphany, and Google Chrome and all produce the same results. The issue is relatively new, and was first observed within the last month; before this time, both machines had no problems uploading files.
Does anyone have ANY idea what could be causing this? I've tried a number of things, including rebooting the PCs, rebooting the network, disabling IPv6, etc. I'm not very experienced in Linux system administration, but I can use the terminal and am familiar with some terminal-based diagnostic tools, so if you need any additional info or want me to try something, please let me know! I've exhausted my own computer knowledge with regards to finding a solution to this problem.
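It may help to reproduce the upload outside the browser from one of the affected PCs; the URL and form field name below are placeholders for a real upload form:

Code:
# -F sends a multipart/form-data POST, like an HTML file-upload form
curl -v -F "upload=@/tmp/test.txt" http://www.example.com/upload
# Try a tiny file and then a larger one: if small uploads work and large
# ones stall, that points away from the web sites themselves.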
I have written a batch file which goes to the website, waits for input (download button/exit), moves on to the next algorithm, and repeats. My problem is getting the batch file to click the download button. Can I use wget for this, and if so, can you show me how to use it or point me to a really good API?
Code:
@ECHO OFF
ECHO INSTALLING ADOBE FLASH PLAYER PLUGIN UPDATE
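Rather than trying to make a batch file click a button, wget can usually fetch the installer directly if you know its URL; there are native Windows builds of wget, so it can be called straight from the .bat. The URL below is a placeholder, not Adobe's real download link:

Code:
wget -O install_flash_player.exe "http://example.com/path/to/flash-installer.exe"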
When I wget aro2220.com it displays:

Code:
--2011-03-02 16:35:58-- url... 127.0.1.1
Connecting to aro2220.com|127.0.1.1|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 177 [text/html]
Saving to: 'index.html'
100%
2011-03-02 16:35:58 (12.4 MB/s) - 'index.html' saved [177/177]
However, when I look at the file it contains essentially nothing but the default page text, something like "It works! This is the default web page for this server", which can't be correct since that is not what aro2220.com actually displays.
Furthermore, when I try to wget files I've put on the server for myself it returns a 404, file not found.
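The output shows aro2220.com resolving to 127.0.1.1, i.e. this machine itself, which is why wget gets the local Apache default page and a 404 for the files; 127.0.1.1 is typically an /etc/hosts entry for the machine's own hostname on Debian/Ubuntu. A quick check:

Code:
# How is the name resolved on this machine?
getent hosts aro2220.com
grep aro2220 /etc/hosts
# Compare with what DNS itself says (host is in the dnsutils/bind-utils package)
host aro2220.com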
I am trying to sync file server data to a backup server machine with the command: rsync -avu path/of/data ipaddress-of-backup-server:/path/where/to/save. After running it, it asks for the root password, and done manually it is successful, but I want to make it automatic. For that I also tried a cron job and generated an authentication key, but I have not succeeded in logging in automatically. Does anybody know how to authenticate as root so that the login works for storing data on the backup server?
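The usual way to let a cron job rsync to the backup server without a password is key-based SSH authentication, set up once by hand; a minimal sketch, with the paths and address as placeholders:

Code:
# One-time setup on the file server (press Enter for an empty passphrase)
ssh-keygen -t rsa
ssh-copy-id root@BACKUP-SERVER-IP
# After that this runs non-interactively, e.g. from cron
rsync -avu -e ssh /path/of/data root@BACKUP-SERVER-IP:/path/where/to/save
# If it still prompts, check permissions on the backup server:
# ~root/.ssh must be 0700 and ~root/.ssh/authorized_keys 0600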
I am unable to use the ncftp command, even though I have defined all the variables used. I have to copy the data to another server over FTPS. When I execute the command it throws this error:
ncftp -u : option unknown
I am pasting the complete script that I execute on my server. Please can someone tell me whether there is any mistake in how I am using the ncftp command, or suggest other commands to copy data to a remote server?
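The "-u : option unknown" message can appear when the variable holding the username expands to nothing, so quoting the variables and echoing the final command line is worth a try. Also, ncftp itself has no FTPS/TLS support, so if the remote end really requires FTPS a different client such as lftp is needed. A sketch with placeholder values in capitals:

Code:
# Scripted upload with the NcFTP batch tools
ncftpput -u "USERNAME" -p "PASSWORD" REMOTE-HOST /remote/dir /local/path/file
# If the server requires FTPS (FTP over TLS), lftp can do that:
lftp -u USERNAME,PASSWORD -e "set ftp:ssl-force true; put /local/path/file -o /remote/dir/file; bye" REMOTE-HOST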
I am working on a Linux server with the specifications below: Linux EDT 2008 i686 i686 i386 GNU/Linux. I check the status of the server using the command 'opmnctl status', but when the server is down the output does not get redirected to the file. I am using the command: opmnctl status > abc.txt.
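When the server is down, opmnctl most likely writes its message to stderr rather than stdout; that is an assumption, but it is the usual reason a plain '>' redirect captures nothing. Redirecting both streams avoids it:

Code:
# Capture stdout and stderr in the same file
opmnctl status > abc.txt 2>&1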
In order to run an application on an RHEL 3 ES server, I forcibly created a link as root using the following commands:
# cd /lib64/tls/
# ln -sf libc-2.3.4.so libc.so.6
Before that I copied the file libc-2.3.4.so from a workstation running RHEL 4 WS so that the link could be created. Now I am unable to run any command except cd and pwd, and I get an error message like this: ls: relocation error: /lib64/tls/libc.so.6: symbol _rtld_global_ro, version GLIBC_PRIVATE not defined in file ld-linux-x86-64.so.2 with link time reference.
Before running this command, libc.so.6 pointed to the libc-2.3.2.so file in /lib64. I am now unable even to open a new window on the server. Please send me a solution as early as possible, because this server holds production data and many users run the application on it.
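One recovery path that is sometimes possible here, assuming the original RHEL 3 libc-2.3.2.so is still present in /lib64/tls and glibc's statically linked sln tool is installed, is to restore the symlink with a tool that does not itself need the (now broken) dynamic loader:

Code:
# sln is a statically linked "ln -sf" shipped with glibc:
# it recreates libc.so.6 pointing back at the original library
/sbin/sln /lib64/tls/libc-2.3.2.so /lib64/tls/libc.so.6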
If a wget download is interrupted (for example if I have to shut down prematurely), I get a wget.log recording the partial download. How can I later resume the download using the data in wget.log? I have searched high and low (including the wget manual) and cannot find how to do this. Is it so obvious that I did not see it? Passing wget.log as the argument to the -c option does not work. What I do instead is open wget.log, copy the URL, paste it into the command line and do another wget. This works, but the download starts from the beginning, which means nothing in wget.log is used.
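wget -c does not take the log file as an argument; it looks at the partially downloaded file itself. So the way to resume is to rerun the same URL with -c from the directory that holds the partial file; the log is only useful for recovering the URL. The URL below is a placeholder:

Code:
# Pull the URL back out of the old log (the pattern is just a rough filter)
grep -o 'http[^ ]*' wget.log
# Resume, run from the directory containing the partial download
wget -c http://example.com/path/to/bigfile.iso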
I am using wget to grep some data from a .htaccess-protected website. I don't want to use the --http-user= and --http-password= options in the script, so I tried to create a ~/.wgetrc file. However, whenever I run my wget script, it never uses the http_user and http_password entries to log in to the website.
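The ~/.wgetrc settings that correspond to --http-user and --http-password are http_user and http_password, written with an '=' as below, and the file should not be world-readable; "alice" and "secret" are placeholders:

Code:
cat > ~/.wgetrc <<'EOF'
http_user = alice
http_password = secret
EOF
chmod 600 ~/.wgetrc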
I would like to use wget to download a file from a Red Hat Linux server to my Windows desktop. I tried some parameters but it still doesn't work. Can you advise whether wget can download a file from a Linux server to a Windows desktop, and if yes, how to do it?
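wget pulls files to the machine it runs on, so to land the file on the Windows desktop you would run a Windows build of wget on the Windows side and point it at something the Linux server already exposes over HTTP or FTP; the address and path below are placeholders:

Code:
wget http://LINUX-SERVER-IP/files/report.tar.gz

If the file is not under a web server's document root on the Linux box, it is usually simpler to copy it there first, or to skip wget and use an SSH-based tool such as WinSCP or pscp from the Windows side.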