I've been pulling my hair out trying to get wget to POST data to a web page so I can automatically download some files. I've tried many syntax variations, but wget always downloads the HTML for the login page. A snippet of code I found in the login HTML page is below. Some of the characters are Japanese, because it's a Japanese website.
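For reference, a minimal wget form login usually looks something like the sketch below. The field names have to match the name= attributes in the login form's HTML; everything here is a placeholder.
Code:
wget --save-cookies cookies.txt --keep-session-cookies \
     --post-data='username=me&password=secret' \
     -O after-login.html 'http://example.jp/login.php'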
To make an RPC call I need to send an XML file as POST data. I know how to do this with wget, and it works fine when I have the XML already filled in (depending on the node values, the response from the call is different). However, I want to be able to edit part of this file and then send it as POST data using wget. I can edit the file using sed (I don't want to rewrite the files by hand each time this gets used, and it gets used a lot, with a lot of different values). A sketch of the idea is below.
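Here sed writes the edited XML to a temporary file and wget POSTs that file; the node name, file names, and URL are placeholders. A temp file is used because --post-file needs a regular file, not a pipe.
Code:
# patch one node value, POST the result, then clean up
sed 's|<amount>[^<]*</amount>|<amount>42</amount>|' template.xml > /tmp/req.$$.xml
wget --header='Content-Type: text/xml' \
     --post-file=/tmp/req.$$.xml \
     -O response.xml 'http://example.com/rpc'
rm -f /tmp/req.$$.xml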
I would like to find out how I would use both curl and wget to send an HTTP POST to get the hostnames of a few servers. I know I haven't shown any work of my own, but that's because I am really lost and don't even know how to start.
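As a starting point, the same POST with both tools looks like this; the URL and the form field are assumptions about the server's API.
Code:
curl -d 'action=gethostname' 'http://example.com/api'
wget --post-data='action=gethostname' -O - 'http://example.com/api'
Both send an application/x-www-form-urlencoded body; the reply goes to stdout (wget needs -O - for that, curl does it by default).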
I am using Ubuntu 10.04. I downloaded a page using wget, as in wget [URL], and I also installed the documentation for Perl with sudo apt-get install perl-doc (I have the same for PostgreSQL). How do I use this Perl documentation for learning Perl?
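perl-doc installs the perldoc reader along with the manuals, so the docs are read from the terminal. A few standard starting points:
Code:
perldoc perlintro      # gentle introduction to the language
perldoc perltoc        # table of contents for all installed docs
perldoc -f sprintf     # docs for a single built-in function
perldoc List::Util     # docs for an installed module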
I'm trying to download all the data under this directory using wget: [URL] From what I've read, this should be possible using the --recursive flag. Unfortunately, I've had no luck so far. The only files that get downloaded are robots.txt and index.html (which doesn't actually exist on the server), but wget does not follow any of the links in the directory listing. The command I've been using is:
Code:
wget -r *ttp://gd2.mlb.***/components/game/mlb/year_2010/
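One likely cause, for reference: wget honors robots.txt during recursive fetches, and "downloads robots.txt first, then stops" is exactly what that looks like. Telling wget to ignore it (plus -np to stay below the starting directory) may be all that's needed; the URL is left obfuscated as in the post.
Code:
wget -r -np -e robots=off '*ttp://gd2.mlb.***/components/game/mlb/year_2010/'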
I'm trying to use wget to retrieve some data from our tape backup utility (HP Command View 1/8 G2 Autoloader). The URL requires two parameters for the info I want to retrieve. I have searched for a few hours and tried numerous combinations, but the parameters never seem to reach the device, even though I have escaped the URL.
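One very common culprit here: an unquoted & in the URL makes the shell background the command and throw away the second parameter, so the device never sees it. Quoting the whole URL avoids that; the host and parameter names below are made up.
Code:
wget -O report.xml 'http://autoloader.local/status.cgi?action=inventory&format=xml'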
I am trying to connect to the web interface found at [URL] using curl. This first requires login information to be entered at [URL], but I am having an issue with the login process. I am trying to submit the following form via POST:
Code:
<form action="j_security_check" method="post" id="login_form" name="login_form">
  <center>
    <table style="background: #cac1cf; FONT-SIZE: 12px;">
      <tr>
        <td align="center" colspan="2">Please enter your username and password:</td>
      </tr>
      <tr>
        <td align="right">Username</td>
        <td><input name="j_username" style="width: 250px" id="j_username" type="text"/></td>
      </tr>
      <tr>
        <td align="right">Password</td>
        <td><input style="width: 250px" name="j_password" id="j_password" type="password"/></td>
      </tr>
      <tr>
        <td colspan="2" align="center">
          <input value="Enter" name="enter" type="submit"/>
          <input value="Clear" name="Clear" type="reset"/>
        </td>
      </tr>
    </table>
  </center>
</form>
The command that I am using for this is the following:
Code:
curl -c cookies -b cookies -L -d "j_username=user%40domain.com&j_password=pass" [URL]
The command is properly formatted as far as I can tell. I tested it against another website with a similar authentication scheme, using the POST variables specific to that form, and it worked fine.
When I run the above command with the -v flag, it reveals this:
Code:
* Connected to lcl.uniroma1.it (151.100.4.74) port 80 (#0)
> POST /sso/j_security_check HTTP/1.1
> User-Agent: curl/7.21.0 (i686-pc-linux-gnu) libcurl/7.21.0 OpenSSL/0.9.8o zlib/1.2.3.4 libidn/1.18
> Host: lcl.uniroma1.it
> Accept: */*
> Content-Length: 44
> Content-Type: application/x-www-form-urlencoded
>
} [data not shown]
< HTTP/1.1 408 The time allowed for the login process has been exceeded. If you wish to continue you must either click back twice and re-click the link you requested or close and re-open your browser
< Date: Sat, 29 Jan 2011 15:26:41 GMT
< Server: Apache-Coyote/1.1
< Content-Type: text/html;charset=utf-8
< Content-Length: 1554
< Connection: close
<
{ [data not shown]
103  1554  100  1554    0    52   5081    170 --:--:-- --:--:-- --:--:-- 10223
* Closing connection #0
I cannot tell why the login timeout expires when I try this, and my investigation has been fruitless. I saw a brief snippet on Google vaguely suggesting that the underscores in the domain name were at fault, but replacing them with their encoded counterparts did nothing to resolve the issue (and underscores should be fine unencoded according to the standards). I have extensively perused the man pages and come up with nothing that adequately explains this behavior. I also talked to a friend who has worked with curl in his line of work, but he mostly has experience with it in PHP and has not dealt with this issue before. I am running GNU/Linux 2.6.35-22-generic-pae.
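One pattern that may explain the 408: j_security_check only accepts credentials for a login the servlet container is already waiting on, so POSTing to it cold finds no session and reports the timeout. Fetching a protected page first creates the JSESSIONID, and the same cookie jar then carries it into the POST. A sketch, where the protected path is a placeholder:
Code:
curl -c cookies -b cookies -L 'http://lcl.uniroma1.it/protected/page'
curl -c cookies -b cookies -L \
     -d "j_username=user%40domain.com&j_password=pass" \
     'http://lcl.uniroma1.it/sso/j_security_check'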
I am writing a bash script that needs to download a few files from a server, but the glitch is that authentication is performed by an SSO/SiteMinder server. Is anyone aware of an option or trick with wget or curl to authenticate against SSO and then download the file from the server?
The standard http-user and http-password options definitely do not cover this need.
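One common pattern with SiteMinder, sketched below: POST the login form once to its .fcc target to collect the SMSESSION cookie, then reuse the cookie jar for the real download. The form-field names and URLs vary per site and are assumptions here.
Code:
curl -c sso.jar -d 'USER=myuser&PASSWORD=mypass' \
     'https://sso.example.com/login.fcc'
curl -b sso.jar -O 'https://files.example.com/path/to/file.tar.gz'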
I have written a batch file that will go to the website, wait for input (download button/exit), move to the next algorithm, and repeat. My problem is getting the batch file to click the stupid download button. Can I use wget for this (see the sketch after the snippet), and can you show me how to use it or point me to a really good API?
Code:
@ECHO OFF
ECHO INSTALLING ADOBE FLASH PLAYER PLUGIN UPDATE
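If the URL behind the download button can be read from the page source (the link's href, or the form it submits), wget can fetch the file directly and no click is needed; wget for Windows drops into a batch file as just another command. A one-line sketch, with a placeholder URL standing in for whatever the button points at:
Code:
wget -O flash_update.exe "http://get.example.com/flashplayer/latest.exe"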
I am attempting to "export" the progress bar from wget's display using sed. Basically, we have an app that starts wget to download a large file and we want to show a progress bar. Our application has a D-Bus interface for receiving the download progress.
So we were thinking of a pipeline like: wget [] | sed [] | dbus-send []. The problem at the moment is: how do you get the matched string out of sed and into dbus-send? I can match the progress string with: sed -u 's/[0-9]*%/&/'
This populates '&' with the correct percentage, but I cannot seem to get that value out of sed. A sketch of one way to wire it up is below.
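wget writes its progress with carriage returns rather than newlines, so the sketch translates \r first; sed -n with /p then emits only the captured percentage, one per line, for the read loop. The D-Bus destination, object path, and method are placeholders, and the sed pattern assumes the percentage is not at the very start of the line.
Code:
wget "$URL" 2>&1 | tr '\r' '\n' \
    | sed -un 's/.*[^0-9]\([0-9]\{1,3\}\)%.*/\1/p' \
    | while read -r pct; do
          dbus-send --session --type=method_call --dest=com.example.App \
              /com/example/App com.example.App.SetProgress int32:"$pct"
      done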
If a wget download is interrupted (for example, if I have to shut down prematurely), I get a wget.log with the partial transfer record. How can I later resume the download using the data in wget.log? I have searched high and low (including the wget manual) and cannot find how to do this. Is it so obvious that I did not see it? The wget -c option with wget.log as its argument does not work. What I do now is open wget.log, copy the URL, paste it into the command line, and run wget again. This works, but the download starts from the beginning, which means nothing in wget.log is used.
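For reference: wget.log only records the session; the partial data lives in the output file itself. Re-running wget with -c (which takes no argument) in the directory holding the partial file resumes it, assuming the server supports ranged requests.
Code:
wget -c 'http://example.com/big.iso'   # URL copied back out of wget.log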
I am using read() in C++ to get data from a serial port. However, if no data is available on the serial port, the function blocks until data arrives. Example code:
I wrote some code for login verification. I got output with GET, but I need it to work with POST since that is more secure. Is there any error in my code?
javascript code:
var xml;
function verifyusernamepasswd(pass) { // pass is the password parameter
    xml = new XMLHttpRequest();
    var url = "http://localhost/loginvalidate.php";
    var para = "q=" + username + "&p=" + pass; // username is global
    xml.open("POST", url, true);
    // setRequestHeader() must be called AFTER open(), or it throws;
    // Content-Length and Connection are set by the browser itself and
    // cannot be overridden from script.
    xml.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
    xml.onreadystatechange = statechanged1;
    xml.send(para);
}
.....
The verification does not return anything, because my alert is not displayed at all.
I have some data files that should be distributed with my program. Using dist_pkgdata_DATA in Makefile.am, these files get installed to /usr/local/data/share/package-name. The problem is that the data is read-only, and my program needs to modify it. Playing with the dist_sharedstate_DATA, dist_localstate_DATA, and dist_data_DATA variables, I got different installation directories, like /usr/local/com and /usr/local/var, but the data is always read-only.
How can I distribute modifiable data files with my package? I need either a common directory writable by all users, or local data in each user's directory.
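One common convention is to leave the installed copy read-only and have the program copy it into a writable per-user directory on first run. A shell sketch of the idea; the paths are assumptions:
Code:
pkgdata="/usr/local/share/package-name"                       # installed, read-only
userdata="${XDG_DATA_HOME:-$HOME/.local/share}/package-name"  # writable copy
if [ ! -d "$userdata" ]; then
    mkdir -p "$userdata"
    cp -R "$pkgdata/." "$userdata/"
fi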
My friend has a website where, once you have logged in on one page, you are redirected to another page with a URL similar to:
[URL]
The random string changes each time you log in, but the login page has a static URL. What I was attempting to do is run a script to get some data from the members page (after you've logged in). However, I've been having some trouble doing this, as the URL with the random string becomes invalid after a certain time, and I did not want to constantly change it.
While reading through some documentation I read that wget should be able to log in to a form-based login website; however, I've had no luck. The command I was attempting to use was:
and even both combined. However, neither has worked: the HTML downloaded is simply the login page. I cannot post a direct link to the website as it is private, but I've looked at the source code and extracted (what I think is) the relevant bit, which is:
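Since the extracted snippet didn't survive the post, here is the general shape of a wget form login as a sketch: POST the form once, keep the session cookie, then fetch the members page with that cookie, letting wget follow the redirect with the random string. The field names and URLs are guesses to be replaced from the form's source.
Code:
wget --save-cookies cj.txt --keep-session-cookies \
     --post-data='username=me&password=secret' \
     -O /dev/null 'http://example.com/login.php'
wget --load-cookies cj.txt -O members.html 'http://example.com/members.php'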
Just curious, and this might not be possible, but is there any way with AJAX to open an FTP connection, download a file, and then turn around and POST it to a web server? The reason I ask is that I'm writing a script on a shared hosting plan that doesn't give me permission to talk outbound on the FTP port, but I need to synchronize a file from a CSV on an FTP server. This means I have to do the process manually. Is there any way I can just click a button and let my browser do the work?
I think what I need to do is update the certificate for Apache 2, but I'm not sure how to do this, where to put the certificate, or which of the thousand Apache config lines needs to be changed.
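A sketch for a Debian-style layout; the file names and the vhost file are assumptions, while the directive names are standard mod_ssl:
Code:
sudo cp new-cert.pem /etc/ssl/certs/server.crt
sudo cp new-key.pem  /etc/ssl/private/server.key
# then point the SSL vhost (e.g. /etc/apache2/sites-available/default-ssl) at them:
#   SSLEngine on
#   SSLCertificateFile    /etc/ssl/certs/server.crt
#   SSLCertificateKeyFile /etc/ssl/private/server.key
sudo /etc/init.d/apache2 reload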
I am trying to find a generic way to convert a string to another primitive data type. To achieve this, I used templates, but I am getting an error I cannot resolve, and the reported error message is also clueless.
For some odd reason, I cannot post on the Ubuntu forum or the Linux Mint forum. Yeah, I know... the irony... I am using Mozilla and have tried Chromium, but that did not fix my problem. When I click on "submit" to post a thread, the page just says "loading..." and nothing happens for a really long time. Does anyone know what is up? I tried posting on one other forum that I visit often and it seems to work fine. I haven't tried any other forums, though.
I'm just starting out on a project related to web search, to be done in C++. Which library should I use to help with downloading web pages into memory so that I can process them? The big thing is that I want to be able to download the pages into variables/structures without actually putting them on the hard disk. I googled and saw libcurl, but I was confused by some of the examples and am wondering if this is really what I want.
I have a tag in an XML file on Unix like this: <EmailAddress>abc@gmail.com</EmailAddress>. This tag appears multiple times in the file, and the data is all on one continuous line, like this: <State>UN</State><Zip/><CompanyName/><EmailAddress>FDF@gmail.COM</EmailAddress><PromoType>UNKNOWN</PromoType></Promotion></PromotionList<State>UN</State><Zip/><CompanyName/><EmailAddress>zd4946@gmail.com</EmailAddress> I have to check whether the data between the EmailAddress tags is valid, i.e., whether it is an e-mail address, and also find the length of each value. The script is in ksh. Sorry if this has been asked before; I checked but did not find an exactly matching result.
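A ksh sketch of the check, assuming GNU grep for -o; the file name and the (deliberately loose) e-mail shape test are placeholders:
Code:
#!/bin/ksh
# pull out every <EmailAddress> value, test its shape, report its length
grep -o '<EmailAddress>[^<]*</EmailAddress>' promos.xml |
sed 's/<[^>]*>//g' |
while read -r addr; do
    case "$addr" in
        *@*.*) echo "$addr : looks valid, length ${#addr}" ;;
        *)     echo "$addr : not an e-mail address, length ${#addr}" ;;
    esac
done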
I'm trying to work out the best way to achieve the following.
1) A PHP page that grabs data from a local database (not a problem).
2) It then needs to send this data to a C program/service running on a remote server (it probably needs to handle 4+ million records in an array).
3) The C service then needs to process the data and send it back to the initial PHP script that called it; I was hoping this could be in an array-like structure of some kind.
4) Update the DB with the results.
I was thinking of using gSOAP to write a simple C SOAP service that PHP can communicate with. Would this be the right way of doing it, or would something like sockets in PHP be a better way of sending this volume of data, as an array or struct, to a Linux C socket, if that's even possible?
I need to read data from a socket, but it should always be listening because data arrives continuously. I thought something like this would do it, but it doesn't work. I have already set the socket options beforehand.
I declared a variable as the int data type as a placeholder for a result status (based on the result status, the variable takes the values 0/1/2/3; it can contain only these four values). int is a 4-byte integer in C#, but I could declare this variable as byte, which represents a 1-byte integer. Since the variable only ever contains one of those four small values, declaring it as int is a waste of memory space. Before declaring any variable, we need to think about the memory space that variable needs for our requirement.
I'm looking for a way to access the raw data on a CD. I have a small program that I'm using to play raw PCM data pushed into stdin. I'd assumed that I could just use: play-pcm < /dev/cdrom. But this isn't producing any data. Will I need to do this programmatically, or is there a simple way for me to grab raw data from an audio CD in the same way I might for a data CD?
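Audio CDs have no filesystem, which is why reading /dev/cdrom yields nothing useful; a ripper has to pull the audio frames. A sketch with cdparanoia, assuming play-pcm accepts 16-bit 44.1 kHz stereo:
Code:
# "1-" means track 1 through the last, "-" writes to stdout;
# -r emits raw 16-bit PCM in host byte order
cdparanoia -r "1-" - | play-pcm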