General :: Using Curl And WGet To Send HTTP Post
Oct 11, 2010
I would like to find out how I would use both curl and wget to send an HTTP POST to get the hostnames of a few servers. I know I have not shown any work or anything I have done, but the reason is that I am really lost, and I do not even know how to start.
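For what it's worth, here is a minimal sketch of an HTTP POST with each tool; the endpoint and field names are hypothetical, since the actual service is not given:
Code:
# hypothetical endpoint and field names; adjust to the real service
curl -d "action=gethostname&server=web01" http://example.com/api
wget --post-data="action=gethostname&server=web01" -O - http://example.com/api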
View 4 Replies
Jan 29, 2011
I am trying to connect to the web interface found at [URL] using curl. This first requires login information to be entered at [URL], but I am having an issue with the login process. I am trying to submit the following form via POST:
Code:
<form action="j_security_check" method="post" id="login_form" name="login_form">
  <center>
    <table style="background: #cac1cf; FONT-SIZE: 12px;">
      <tr>
        <td align="center" colspan="2">Please enter your username and password:</td>
      </tr>
      <tr>
        <td align="right">Username</td>
        <td><input name="j_username" style="width: 250px" id="j_username" type="text"/></td>
      </tr>
      <tr>
        <td align="right">Password</td>
        <td><input style="width: 250px" name="j_password" id="j_password" type="password"/></td>
      </tr>
      <tr>
        <td colspan="2" align="center">
          <input value="Enter" name="enter" type="submit"/>
          <input value="Clear" name="Clear" type="reset"/>
        </td>
      </tr>
    </table>
  </center>
</form>
The command that I am using for this is the following:
Code:
curl -c cookies -b cookies -L -d "j_username=user%40domain.com&j_password=pass" [URL]
The command is properly formatted as far as I can tell. I tested it with another website using a similar authentication scheme using different POST variables specific to the form and it worked fine.
When I run the above command with the -v flag, it reveals this:
Code:
* Connected to lcl.uniroma1.it (151.100.4.74) port 80 (#0)
> POST /sso/j_security_check HTTP/1.1
> User-Agent: curl/7.21.0 (i686-pc-linux-gnu) libcurl/7.21.0 OpenSSL/0.9.8o zlib/1.2.3.4 libidn/1.18
> Host: lcl.uniroma1.it
> Accept: */*
> Content-Length: 44
> Content-Type: application/x-www-form-urlencoded
>
} [data not shown]
< HTTP/1.1 408 The time allowed for the login process has been exceeded. If you wish to continue you must either click back twice and re-click the link you requested or close and re-open your browser
< Date: Sat, 29 Jan 2011 15:26:41 GMT
< Server: Apache-Coyote/1.1
< Content-Type: text/html;charset=utf-8
< Content-Length: 1554
< Connection: close
<
{ [data not shown]
103 1554 100 1554 0 52 5081 170 --:--:-- --:--:-- --:--:-- 10223*
Closing connection #0
I cannot tell why the login timeout has expired when I try to do this, and my investigation has been fruitless. I saw a brief snippet on Google that vaguely suggested the underscores in the domain name were at fault, but replacing these with their encoded counterparts did nothing to resolve the issue (that, and underscores should be fine when sent unencoded according to the standards). I have extensively perused the man pages and have come up with nothing to adequately explain this behavior. I also talked to a friend who has worked with curl in his line of work, but he mostly has experience in the context of PHP and has not dealt with this issue before. I am running GNU/Linux 2.6.35-22-generic-pae.
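For reference, Tomcat's form authenticator returns exactly this 408 message when j_security_check is POSTed without an existing session, because the container has no saved request to return to. The usual workaround is to request a protected page first so the server issues a JSESSIONID, then POST the credentials within that same session. A sketch (the protected path here is an assumption; j_security_check under /sso/ is taken from the verbose output above):
Code:
# 1. establish a session (and its JSESSIONID cookie) by requesting a protected page
curl -c cookies -b cookies -L http://lcl.uniroma1.it/sso/
# 2. POST the credentials against that same session
curl -c cookies -b cookies -L -d "j_username=user%40domain.com&j_password=pass" http://lcl.uniroma1.it/sso/j_security_check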
View 3 Replies
View Related
Jun 9, 2010
I'm trying to send files from a Unix server using HTTP/curl to a Linux web server running Apache. I get the following error message and the file does not send:
<title>405 Method Not Allowed</title>
</head><body>
<h1>Method Not Allowed</h1>
<p>The requested method PUT is not allowed for the URL
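For reference, Apache rejects PUT unless something on the server side is configured to accept the method (mod_dav, a CGI handler, or similar); many setups use a POST upload instead. A sketch of both approaches with curl, using hypothetical paths:
Code:
# HTTP PUT: requires the server to allow and handle the method
curl -T localfile.txt http://server.example.com/uploads/
# multipart POST alternative, handled by an upload script on the server
curl -F "file=@localfile.txt" http://server.example.com/upload.php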
View 2 Replies
View Related
Nov 23, 2010
I have a really weird (but consistent) problem with my Kubuntu 10.10 install: I cannot post some HTTP forms.
First off, this is a client PC problem. My SquirrelMail on the server works fine; I just use SquirrelMail 1.4.17 to troubleshoot the Ubuntu desktop problem.
I used an old (7.04) Ubuntu install which worked fine. Then I wiped the disk and installed Kubuntu 10.10 on the same hardware. Everything works, but **some** HTTP POSTs do not (I can log in but not send mail or save a draft). I noticed I cannot log in to Yahoo, for example.
My webhosting account can display the Apache access_log. When I hit the <Send> button, the POST request never arrives at the web server.
I use a router (Dlink DL-604) behind a DSL modem and ooma box. There is a Windows 7 PC and a Kubuntu PC connected to the router. I can use squirrelmail just fine from the Windows PC.
I tried several steps:
- reinstalled Kubuntu
- installed Firefox and Chromium (on top of rekonq)
- ran from a CD on my other (Windows 7) PC
- installed Wireshark and compared the traffic (but was unable to pinpoint a problem)
The result was the same: the <Send> button just keeps waiting; the POST request never makes it to the web server.
This sounds (and is) scary and suspect. The fact that the "demo" Kubuntu install (run from the CD on my other, Windows PC) using rekonq exhibits the same problem on totally different hardware leads me to believe this may be related to Kubuntu. For example, I had to type this very message on the Windows PC, as I could not post it to the forum from my Kubuntu box.
View 2 Replies
View Related
Jul 8, 2010
To make an RPC call I need to send an XML file as POST data. I know how to do this with wget; it works fine when I have the XML already filled in (depending on the node values, the response from the call is different). However, I want to be able to edit part of this file and then send that as POST data using wget. I can edit this file using sed (I don't want to rewrite the file each time this gets used; and it gets used a lot, with a lot of different values).
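A sketch of one way to chain the two, assuming a template file and a hypothetical node value; sed rewrites a copy on the fly and wget POSTs the result:
Code:
# replace the node value in a copy, leaving the template untouched
sed 's|<value>OLD</value>|<value>NEW</value>|' request-template.xml > /tmp/request.xml
wget --header='Content-Type: text/xml' --post-file=/tmp/request.xml -O response.xml http://example.com/rpc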
View 1 Replies
View Related
Feb 19, 2011
I would like to download a webpage using WGET which needs a form submission (POST method) in order to appear. I can do that with this command.
wget --post-data="computer=hosts&style=list" http://www.example.com
The problem is there is more than one form on the requested page, and I don't know how to tell wget which one it should POST the data to.
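For what it's worth, wget does not parse forms at all: --post-data is always sent to the URL given on the command line. The usual approach is to open the page source, find the action attribute of the form you want, and POST directly to that URL (the action path below is hypothetical):
Code:
# POST to the chosen form's action URL rather than the page that displays the forms
wget --post-data="computer=hosts&style=list" http://www.example.com/hosts/search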
View 3 Replies
View Related
Nov 9, 2010
I use slackware current, and curl and wget give the following errors:
Code:
[repo@cannabis ~]$ wget -r http://users.telenet.be/reggersjans
--2010-11-09 13:48:14-- http://users.telenet.be/reggersjans
Resolving users.telenet.be (users.telenet.be)... ::ffff:74.117.221.11, 74.117.221.11
Connecting to users.telenet.be (users.telenet.be)|::ffff:74.117.221.11|:80... connected.
HTTP request sent, awaiting response... 400 Bad Request
[Code]...
View 7 Replies
View Related
Jul 7, 2010
We have a Red Hat server and I'm using wget in crontab to run some PHP scripts. We've been doing this for some time now and it's been working fine. I tried to add another script using wget to run a PHP script behind HTTP authentication. However, despite the fact that the URL works fine and the username and password are correct, we get Connection Timed Out errors each time. What might cause wget to work for unauthenticated URLs, but not authenticated ones?
I've tried --user=/--password=, --http-user=/--http-password=, and Username:Password@ in the URL, and all three fail the same way. Here's the command in question:
[blahblah user]# wget -t 5 -O /dev/null 'http://Username:Password1!@test.example.com/sub/dir/file-name.php'
--2010-07-07 10:11:55-- http://Username:*password*@test.example.com/sub/dir/file-name.php
Resolving test.example.com... 000.000.000.000
[code]....
Again, wget works, the file with authentication works, but wget calling the file with authentication does not work.
UPDATE: Actually, I get the same timeout if I access the authenticated URL without authentication. Could that mean that Apache is rejecting wget requests for authentication outright?
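One thing worth trying, assuming the site uses HTTP Basic authentication: wget 1.11 and later only send credentials after receiving a 401 challenge, and some setups never issue one; --auth-no-challenge sends the credentials preemptively:
Code:
wget -t 5 -O /dev/null --auth-no-challenge --http-user='Username' --http-password='Password1!' 'http://test.example.com/sub/dir/file-name.php'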
View 2 Replies
View Related
Nov 29, 2010
I've been pulling my hair out trying to get wget to POST data to a webpage to automatically download some files. I've tried many variations of the syntax, but wget always downloads the HTML for the login page. A snippet of code I found in the login HTML page is below. Some of the characters are Japanese, because it's a Japanese website.
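In case it helps, the usual shape of a scripted form login with wget is to save the session cookies from the POST and reuse them for the download. The field names and URLs below are hypothetical and must be read out of the login page's HTML:
Code:
# log in and keep the session cookies
wget --save-cookies cookies.txt --keep-session-cookies --post-data='username=USER&password=PASS' -O /dev/null http://example.com/login
# reuse the session for the actual download
wget --load-cookies cookies.txt -O file.dat http://example.com/download/file.dat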
View 7 Replies
View Related
Jun 24, 2010
Is there a curl API to configure only the required protocols? If I have a proper OpenSSL installed, the installed curl will support all the protocols (like HTTP, HTTPS, FTP, FILE, etc.) by default. Is there any way to allow or disallow only some of the protocols at runtime? Say I need to support only HTTPS and FILE, and I don't want to allow HTTP. Is there any way to do this?
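In libcurl, CURLOPT_PROTOCOLS (available since 7.19.4) takes a CURLPROTO_* bitmask for exactly this purpose; the command-line equivalent is --proto. A sketch limiting curl to HTTPS and FILE only:
Code:
# deny everything, then permit only https and file
curl --proto '-all,https,+file' https://example.com/
curl --proto '-all,https,+file' file:///etc/hostname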
View 1 Replies
View Related
Sep 10, 2010
There are a few web databases (including my own PHP-based PDF manipulator) where I need to fill in an HTML form and upload file attachments.
About one year ago, these sites stopped working correctly when using Firefox (but they work from Internet Explorer). The problem concerns file upload.
Other users here have also experienced this, and no Firefox update has corrected the problem in the past year (I am using Firefox 3.6.9 now, and the problem is still there).
When debugging my PDF creator, I found that the content type of any file upload made by Firefox is "text/html", irrespective of the type of the uploaded file, whereas files uploaded by IE have the correct content type.
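A quick way to take the browser out of the equation while debugging the server side is to upload with curl, which lets you force the part's content type explicitly (the script name and field below are hypothetical):
Code:
# force the multipart content type of the uploaded file
curl -F 'attachment=@report.pdf;type=application/pdf' http://example.com/upload.php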
View 1 Replies
View Related
Apr 28, 2011
I am having a problem with 5.4 that I did not have with 4.5. The problem happens only sometimes but in specific instances. Basically a summary of the problem is that certain network transactions timeout. The specific instances are with wget, rpm, http. The problem usually, but not always, occurs with pptp stuff. (NOT running pptp but getting pptp stuff). For instance, the following command, which finishes in seconds on non-5.4 OS's:
wget [URL]
downloads about 20% then gets stuck. About 5 minutes later it downloads another 20% and then gets stuck, etc. The same thing with rpm:
rpm -ivh [URL]
waits about 3 minutes and then gives an error. I think it does the same thing as the wget but
wget will keep trying, while rpm gives up. The error from rpm:
Retrieving [URL]
..five minutes later:
I can wget the above as I mentioned before and install it that way. Before I do it, yum works fine. Afterwards, yum exhibits the same behavior of timing out (because it is using the pptp repository). Also, visiting the pptp web site from Firefox times out on certain pages. I originally thought it was some problem with the pptp site, but I notice that logging in to hotmail.com does the same thing (it is fine on other operating systems). A view with Wireshark on the wget (pptp) shows my machine receiving a reassembled TCP PDU from 216.34.181.96 (SourceForge), sending an ack, receiving a reassembled PDU, sending an ack, receiving, sending, followed by the 5 minutes or whatever of nothing. Then SourceForge sends an RST and a SYN and the process is repeated.
When I put the machine directly on an AT&T IP connection (12.147.X.Y) everything worked fine. Same with Comcast on a direct link. The times I have problems are when our router is hooked up to a Comcast IP (70.88.X.Y) and assigns 192.168.5.X addresses to our machines. So when I was doing the above from 192.168.5.27, going through the router through Comcast, is when I had the problem. So it is probably something with the router, but it is hard to figure out, since CentOS 4.5 and Fedora do not exhibit this behavior, nor does 5.4 on most sites (mail.yahoo.com for instance). I did verify, at least as far as I could, that ICMP types 3 and 4 are not being blocked. If they were, the same problem would happen on other operating systems. And I was able to ping, albeit just locally, but we looked at the router settings and ping was not blocked.
View 1 Replies
View Related
Aug 7, 2010
I am using the following software stack: Linux version 2.6.32-21-generic (buildd@yellow) (gcc version 4.4.3 (Ubuntu 4.4.3-4ubuntu5)) #32-Ubuntu SMP Fri Apr 16 08:09:38 UTC 2010 (Ubuntu 2.6.32-21.32-generic 2.6.32.11+drm33.2) (Kubuntu), Cisco AnyConnect VPN client 2.3.2016, Mozilla Firefox 3.6.8.
The problem I have is that once I join my company VPN, I have full access to corporate services: Confluence, JIRA, servers, etc. However, when I use Firefox to try to resolve a JIRA issue and post a body of text, the connection times out.
If I use any other browser it works fine, if slowly; if I transition workflows it works fine; and if I use Windows and Firefox with the same Cisco client it works fine.
This appears to be an issue specific to Firefox. I have noticed that in general Firefox is slower on Ubuntu than on any other platform.
View 3 Replies
View Related
Jun 24, 2011
I am using an FC9 server with the Apache web server installed. I keep a data file in my html folder. When I try to download it remotely through the web, I can download the file; but when I try to get the file remotely through the wget command, I am unable to get the file and it fails with "Connection timed out ... Retrying". Below are the steps I tried.
my target file is http://X.X.X.X/test.zip
wget -T 0 http://X.X.X.X/test.zip
wget http://X.X.X.X/test.zip
[code]...
View 1 Replies
View Related
Jan 31, 2011
[url]
I ran across the above article, which described a DoS attack in which requests are sent very slowly to the Web server. I'm running lighttpd 1.4.28 on a Gentoo Linux server, and I'm wondering if there is anything I could do in preparation to defend against such an attack.
A bug report [url] seems to indicate that a patch was already in place against this sort of attack, but I wanted to be sure it addresses the same thing, and whether there is anything else I need to do.
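For what it's worth, lighttpd 1.4 exposes a few idle-timeout knobs that bound how long a slow client can hold a connection open. A sketch for lighttpd.conf; the values are illustrative assumptions, not recommendations:
Code:
# seconds lighttpd tolerates a slow-reading / slow-writing client
server.max-read-idle = 30
server.max-write-idle = 60
server.max-keep-alive-idle = 5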
View 3 Replies
View Related
Jun 10, 2011
I have a proxy running. I have seen LAN machines sending packets via iftop -P -F 192.168.10.0/24
[Code]....
How do I set up my iptables so that only HTTP, SMTP, SSH, DNS, and DHCP requests can be sent and received in and out of the proxy? (A sketch follows the code below.)
[Code]...
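A minimal sketch of such a ruleset; the default-deny policy and port choices are assumptions that would need adapting to the actual proxy setup:
Code:
# default-deny, then allow established traffic back in
iptables -P INPUT DROP
iptables -P FORWARD DROP
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
# SSH(22), SMTP(25), HTTP(80) over TCP; DNS(53) and DHCP(67,68) over UDP
iptables -A INPUT -p tcp -m multiport --dports 22,25,80 -j ACCEPT
iptables -A INPUT -p udp -m multiport --dports 53,67,68 -j ACCEPT
iptables -A INPUT -p tcp --dport 53 -j ACCEPT   # DNS over TCP as well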
View 3 Replies
View Related
Sep 23, 2010
We're having an issue with HTTP POST file uploads on our two Ubuntu PCs. For some reason, whenever one of our users attempts to submit a file in an HTML form, the request times out, usually with a 500 Internal Server Error message. This problem is not limited to one site, but occurs on all sites that use file uploads. Also, the problem does not appear to be with our network, as a Windows 7 PC on the same network can upload files to the same sites without any difficulties. The problem is not browser-specific; we have tested with Firefox, Epiphany, and Google Chrome and all produce the same results. The issue is relatively new, and was first observed within the last month; before this time, both machines had no problems uploading files.
Does anyone have ANY idea what could be causing this? I've tried a number of things, including rebooting the PCs, rebooting the network, disabling IPv6, etc. I'm not very experienced in Linux system administration, but I can use the terminal and am familiar with some terminal-based diagnostic tools, so if you need any additional info or want me to try something, please let me know! I've exhausted my own computer knowledge with regards to finding a solution to this problem.
View 3 Replies
View Related
May 7, 2010
I'm posting this from my Vista laptop. I recently upgraded to Lucid Lynx and I can't access my Gmail account via the web. I can download emails with Thunderbird, but I can't send emails. I can log into chat boards and read posts, but I can't make posts of my own. Basically, Ubuntu won't let me push data out to the internet.
When I tried to post to this board from my Ubuntu box, it ended up asking where to save the file newthread.php.
View 7 Replies
View Related
Jul 8, 2011
Hi, is it possible to send a request to an HTTP server without using sockets?
View 7 Replies
View Related
Jun 19, 2011
If a wget download is interrupted (for example, if I have to shut down prematurely), I get a wget.log with the partial download. How can I later resume the download using the data in wget.log? I have searched high and low (including the wget manual) and cannot find how to do this. Is it so obvious that I did not see it? The wget -c option with wget.log as its argument does not work. What I do now is open wget.log, copy the URL, paste it into the command line, and do another wget. This works, but the download is started from the beginning, which means nothing in wget.log is used.
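For reference, -c resumes from the partially downloaded file itself, not from the log: re-running wget -c with the same URL in the directory holding the partial file continues where it left off. If the URL only survives in the log, it can be extracted from there first (the grep pattern is a rough sketch):
Code:
# resume using the partial file already on disk
wget -c http://example.com/big.iso
# or recover the URL from the log, then resume
url=$(grep -o 'http://[^ ]*' wget.log | head -n 1)
wget -c "$url"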
View 2 Replies
View Related
Apr 29, 2011
My issue is that I'm trying to send emails with Postfix and Gmail as the mail relay. I'm testing by sending emails to myself with sendmail -bv user@gmail.com
From the logs, I can see that it appears to have been delivered to the destination.
taken from: /var/log/mail.log:
Apr 30 00:05:23 moni postfix/pickup[10490]: 9C7552170C: uid=0 from=<root>
Apr 30 00:05:23 moni postfix/cleanup[10495]: 9C7552170C: message-id=<20110429210523.9C7552170C@moni.localdomain>
Apr 30 00:05:23 moni postfix/qmgr[10491]: 9C7552170C: from=<root@moni.localdomain>, size=283, nrcpt=1 (queue active)
code....
When logging in to my Gmail account I can't see anything under the Sent / Inbox / Spam folders.
It seems like the mail has been sent, but nothing is happening.
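One detail worth checking: sendmail's -bv flag only verifies the recipient address and reports deliverability; it does not actually submit a message, which would explain why nothing ever arrives. A sketch of a real delivery test:
Code:
# -bv verifies only; this actually sends a message through Postfix
echo "test body" | sendmail -v user@gmail.com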
View 1 Replies
View Related
Jul 12, 2010
I'm having trouble using variables in curl. Here's my code:
transfer_to_pcid="AAAAAAAA"
transfer_from_pcid="BBBBBBBB"
basic_password=`ssh rsync@some_test_domain 'curl --silent
[code]....
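A common pitfall with this pattern, assuming the truncated command above: shell variables do not expand inside single quotes, so a $variable inside the quoted remote curl command reaches ssh literally. One fix is to break out of the single quotes around the variable (the URL here is hypothetical):
Code:
transfer_to_pcid="AAAAAAAA"
# close the single quote, splice the local variable in double quotes, reopen
basic_password=$(ssh rsync@some_test_domain 'curl --silent "http://host/api?pcid='"$transfer_to_pcid"'"')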
View 2 Replies
View Related
Jun 14, 2010
We got our GoDaddy VPS working; however, the strange thing is that we don't have curl installed by default. We should have gotten it with the server itself, but we didn't. We have root SSH access. I saw this post: [URL] but it didn't work for me.
SYSTEM INFO:
PHP Built on: Linux ip-myiphere.ip.secureserver.net 2.6.18-028stab059.6 #1 SMP Fri Nov 14 14:01:22 MSK 2008 i686
PHP Version: 5.3.2
Web Server: Apache/2.0.63 (Unix) mod_ssl/2.0.63 OpenSSL/0.9.8e-fips-rhel5 mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635
Web Server to PHP interface: cgi-fcgi
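Given the RHEL5-era stack shown above, installing the command-line tool is usually a one-liner, while the PHP extension depends on how the host built PHP (cPanel-style servers typically recompile PHP, e.g. via EasyApache, with the curl option enabled). A sketch:
Code:
yum install curl            # CLI tool and library
php -m | grep -i curl       # check whether the PHP curl module is present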
View 8 Replies
View Related
Apr 1, 2010
I would like to pipe a raw email from cPanel to curl, using curl to send the raw email via a POST variable. However, I am unsure of the command-line syntax that would receive the piped email and POST it using curl. Ideally, the email would pipe to a curl command like curl -d 'emailvar=RAWEMAILHERE'
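One way this could work, as a sketch: curl's --data-urlencode can read a field's value from stdin with name@-, which fits cPanel's convention of piping the raw message to the forwarder's stdin (the endpoint and field name are hypothetical):
Code:
# in the cPanel forwarder entry, pipe the raw message into curl
curl --data-urlencode "emailvar@-" http://example.com/receive.php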
View 2 Replies
View Related
Jan 20, 2011
In the middle of the script, I need to send the output of the send command on line 8 to a file.
#!/usr/bin/expect
spawn telnet 172.20.64.133
expect "ENTER USERNAM <"
[code]....
I tried the following on line 8:
1- send "show command;\r" > logfile.txt : gives an error, extra characters after the "
2- logsave logfile.txt 'send "show command;\r"' : error, invalid command
3- I simply tried to send the output of the whole script to a file, logsave /home/logfile ./script : it seems logsave works under root only
4- ./script > logfile : the problem with this is that the output of echo, or of a (read "enter your id") command, will not be displayed on the screen (actually nothing will be displayed; I have to open the log file to see the output).
Is there any way to save the log of the "send"? Or to save the log of the complete script without hiding the output on the screen?
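Two approaches that may help here: expect has a built-in log_file command, which starts copying session output to a file from the point where it is called inside the script; and option 4's blank screen can be avoided by piping through tee, which writes to the terminal and the file at once:
Code:
# keeps the output visible on screen while also saving it
./script | tee logfile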
View 2 Replies
View Related
May 6, 2011
I'm using cURL on Ubuntu to download a sequence of files. Some files may be missing from the sequence, and when I just use curl -O [url], cURL will download a 404 error page for those missing ones. How can I avoid this?
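curl's -f/--fail flag is meant for exactly this: on an HTTP error such as 404, curl exits with an error instead of saving the error page (the URL pattern below is illustrative; the [...] range syntax is curl's own):
Code:
curl -f -O "http://example.com/images/img[001-100].jpg"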
View 1 Replies
View Related
Aug 9, 2011
I need to download a file from a website which has a URL formatted like:
[URL]
This redirects to a .zip file which has to be saved. There is also a need to authenticate based on username and password.
I tried to use wget, curl and lynx with no luck.
UPDATE:
wget doesn't work with the redirection. It simply downloads the webpage instead of the zip file. curl gives the error "Maximum redirection exceeded > 50". lynx also gives the same error.
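A redirect loop like that often means a session cookie set along the way is being discarded, so each hop bounces back to the start. A sketch that keeps a cookie jar across the redirects and authenticates with HTTP Basic (whether the site actually uses Basic auth is an assumption):
Code:
curl -L -c cookies.txt -b cookies.txt -u username:password -o file.zip "[URL]"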
View 1 Replies
View Related
May 16, 2011
I'm on Ubuntu 11.04. I have read around about how to use curl to download a list of URLs from a text file, and everyone says to use Code: curl -K URLlist.txt. This is what the curl man page says as well. However, for even a simple file with one URL, this command outputs a bunch of weird symbols for me instead of downloading the file. For example, I have a text file "test.txt" with one line in the following format:
Code:
url = "http://www.example.com/image.jpg"
I use the curl command to download this file:
[code]....
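For reference, with -K each url entry is written to stdout unless the config names an output, which is why the terminal fills with the raw bytes of the file. Adding the config-file form of -O after each URL writes proper files instead:
Code:
url = "http://www.example.com/image.jpg"
remote-name
With that in test.txt, curl -K test.txt saves image.jpg instead of dumping it to the terminal.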
View 7 Replies
View Related
Sep 27, 2010
Using netcat, nc(1), craft a valid http/1.1 request for getting http headers (not the html file itself!) for the main index page of www dot aalto dot fi. What request method did you use? Which headers did you need to send to the server? What was the status code for the request? Which headers did the server return? Explain the purpose of each header.
nc -v www dot aalto dot fi 8080
HEAD / HTTP/1.1
Host: www dot aalto dot fi
And it returns:
200 OK
Content-Length: 858
Content-Type: text/html
Last-Modified: Thu, 02 Sep 2010 12:46:01 GMT
[Code]....
I really don't know what it means. Question 2: Using netcat, nc(1), start a bogus web server listening on the loopback interface port 8080. Verify with netstat(8) that the server really is listening where it should be. Direct your browser to the bogus server and capture the User-Agent: header. It is the part "Direct your browser to the bogus server and capture the User-Agent: header" that I don't understand.
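For question 2, a sketch of the bogus-server exercise (netcat flag syntax varies between the OpenBSD and traditional builds):
Code:
# listen on the loopback interface, port 8080 (traditional nc: nc -l -p 8080)
nc -l 127.0.0.1 8080
# in another terminal, confirm it is listening
netstat -tln | grep 8080
# now browse to http://127.0.0.1:8080/ ; the request, including its
# User-Agent: header, appears in the nc terminal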
View 2 Replies
View Related
Aug 4, 2010
I installed Nagios on my Ubuntu 10.04 server using apt-get, and when I accessed the web console, everything was OK. I made some changes to Apache (creating some new virtual sites) and since then Nagios gives me a warning for HTTP with the message HTTP WARNING: HTTP/1.1 404 Not Found. The sites that I created are working perfectly. I noticed that the attempts are 4/4. Does this need to be reset, or does Nagios automatically reset it once it detects the issue is resolved?
View 1 Replies
View Related