Programming :: Attempting To Record The Redirect URL Using Wget And Logging In On Separate Page?
Feb 10, 2009
My friend has a website whereby once you have logged in on one page, you are redirected to another page, with a URL similar to:
[URL]
The random string changes each time you log in; the login page, however, has a static URL. What I was attempting to do is run a script to get some data from the members page (after you've logged in), but I've been having some trouble working out how to do this, as the variable URL with the random string becomes invalid after a certain time, and I did not want to constantly change it.
While reading through some documentation I read that wget should be able to log in to a form-based login website, but I've had no luck. The command I was attempting to use was:
and even both combined. However, neither has worked: the HTML downloaded is simply the login page. I cannot post a direct link to the website as it is private, but I've looked at the source code and extracted what I think is the relevant bit, which is:
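For reference, the usual wget approach to a form login is to POST the form fields once, save the session cookie, and then reuse it. A minimal sketch, in which the login URL and the field names (username/password) are placeholders that must match the name= attributes in the site's actual login form:

Code:
# 1) POST the login form and save the session cookie
wget --save-cookies cookies.txt --keep-session-cookies \
     --post-data 'username=USER&password=PASS' \
     -O /dev/null http://example.com/login.php

# 2) reuse the saved cookie to fetch the members page
wget --load-cookies cookies.txt -O members.html http://example.com/members.php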
I've been googling this problem a lot these last couple of days, with no luck. The thing is, I need to record audio from an old Tascam four-track cassette recorder. I have three tracks on the tape and I want to record them to three separate tracks on the computer. I don't have and cannot afford a decent multi-track soundcard (one of the reasons I'm using the cassette recorder, another being the really cool drum sound). This means I cannot record the tracks separately and sync them afterwards, because the speed of each playback isn't 100% reliable.
I have a USB guitar link from Behringer, which I could use and which has one mono plug. PulseAudio picks that up as a separate input, and with Jokosher I can assign line-in left and right to two separate tracks and the USB link to a third one. The problem, however, is that Jokosher constantly freezes up and I've never been able to make it work properly. So my question is: is there any other way/software I could use to record from two separate audio sources?
I have a crontab that wgets a PHP page every five minutes (just to run some PHP code), and I want to send the output to /dev/null. I couldn't find how in the wget manual.
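A minimal sketch of such a crontab entry (the URL is a placeholder): -q silences wget's own messages, and -O /dev/null discards the fetched page:

Code:
*/5 * * * * wget -q -O /dev/null http://example.com/page.php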
Using these parameters, the main HTML page and all the images download into the same folder. Instead, I would like to have the HTML page in one folder and all the images, CSS, etc. in a subdirectory. For example, I want to have:
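As far as I know wget has no option to re-home page requisites into an arbitrary subdirectory of your choosing, but -p (page requisites) together with -k (convert links) preserves the site's own layout, which often already keeps assets in subfolders. A sketch, with the URL and target folder as placeholders:

Code:
wget -p -k -P mypage/ http://example.com/index.html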
After reading this PDF on the top 5 things to log for security, I've decided to attempt this for my webserver, and I'm wondering how I might set up some logging systems to do these tasks. Basic things I need to be able to do: record things like password attempts on htaccess-protected files, from what IP address, and how many attempts there were. Any useful links anyone can think of to get me started? I'm a student programmer at university, so I should be able to cope fine with any programming.
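As a starting point, Apache records failed basic-auth attempts in its error log; a sketch that counts failures per client IP (the log path varies by distro and is an assumption here):

Code:
grep "authentication failure" /var/log/apache2/error.log | \
  grep -o 'client [0-9.]*' | sort | uniq -c | sort -rn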
My question is how I can redirect http://www.thispage.com to http://my.page.com and still keep displaying http://www.thispage.com. The issue is that if I point httpd.conf at the directory where my.page.com lives, it doesn't work, because the site knows its own address.
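One way to keep the original address in the browser is a reverse proxy rather than a redirect; a minimal sketch for the www.thispage.com virtual host, assuming mod_proxy and mod_proxy_http are loaded:

Code:
ProxyRequests Off
ProxyPass        / http://my.page.com/
ProxyPassReverse / http://my.page.com/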
I am trying to grep multiple numbers from a file; grep does have the -f option for that.
Code: grep -f <`seq 500 520` /etc/passwd I know this could be done with
Code: for i in `seq 500 520`; do grep "$i" /etc/passwd; done But my question goes far beyond this example: is it possible to redirect one command's output so that it is treated as the content of a file by another command?
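This is exactly what bash's process substitution provides: <(command) expands to a file name whose content is the command's output, so grep -f can read it directly:

Code:
grep -f <(seq 500 520) /etc/passwd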
I am attempting to connect my new Brother HL-2230 printer over CUPS. However, when I try to log in to http://localhost:631/admin it asks me for authentication. When I enter the user name for my administrator account (limao) and password, it just reprompts me for my username and password again.
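One common cause, a sketch assuming a Debian/Ubuntu-style setup: the CUPS web admin only accepts members of the group named in the SystemGroup directive (often lpadmin), so the account may simply need to be added to it:

Code:
sudo usermod -aG lpadmin limao
# log out and back in, then retry http://localhost:631/admin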
Is there a way to configure Apache so that if a user tries to access a webpage that doesn't exist under mydomain.com, it redirects the user to index.php?
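A minimal sketch using Apache's standard error-handling directive, placed in httpd.conf or an .htaccess file:

Code:
ErrorDocument 404 /index.php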
I wanted to know if I can execute commands on a Linux console through a web page and have the output redirected back to my web page. For example: if I send a query "ls", it should execute this command on my Linux console and also redirect the list of files back to the web page from which I gave the command.
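The classic mechanism for this is CGI; a minimal sketch of a shell CGI script (the path is an assumption, CGI must be enabled in the web server, and letting a web page run arbitrary user-supplied commands is a serious security risk):

Code:
#!/bin/bash
# e.g. /usr/lib/cgi-bin/list.cgi - runs a fixed command and returns its output
echo "Content-Type: text/plain"
echo
ls -la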
Since I upgraded to Lucid (and thus to Mozilla 3.6.6), I have been experiencing a bunch of annoyances with Mozilla.
1) Logging into my GoDaddy account keeps displaying the login screen and not my accounts page (yes, I checked the password).
2) Creating an account in Bugzilla resulted in an "Invalid OpenID transaction" error message instead of a confirmation page.
3) The [URL] page popped up a parser error instead of the website.
Midori and Mozilla 3.0.8 work just fine. Does anybody else have problems with Mozilla 3.6.6 and above?
I am exploring the Python 3 standard library and am currently attempting to test the bin function. It converts an integer into a binary string. I believe the module I wrote is flawed somehow. Here's the source code:
Code:
#!/usr/bin/python3.1
#This module tests the bin() function.
import sys

def get_input():
    x = input("Enter an integer: ")

def use_bin():
[code]....
As you can see, the binary form given is always 0b10111. I'm no expert on binary code (or hexadecimal notation), but surely 9000 and two would have different results?
EDIT: Added a line in the module to repeat back what integer the user entered, and then the binary form. It would appear that no matter what integer the user enters, Python thinks it's "23".
Example output:
Code:
>>> Enter an integer: 1
You entered 23
The binary form of this integer is 0b10111
>>>
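For what it's worth, an output that is always 0b10111 (decimal 23) suggests bin() is being called on some fixed value or default argument rather than on the value the user typed; note also that get_input() as shown never returns x. A minimal working sketch of what the module seems intended to do:

Code:
#!/usr/bin/python3.1
# read an integer, echo it back, then print its binary form
def main():
    x = int(input("Enter an integer: "))  # input() returns a str; convert it
    print("You entered", x)
    print("The binary form of this integer is", bin(x))

main()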
I configured Squid to work with squidGuard, and everything works properly, but there is a problem. First, look at this squidGuard.conf:
dbhome /usr/local/squidGuard/db
logdir /usr/local/squidGuard/log
I am writing a bash script where I need to download a few files from a server, but the glitch is that authentication is performed by an SSO/SiteMinder server. Is anyone aware of an option or trick with wget or curl to authenticate against the SSO and then download the file from the server?
The standard http-user and http-password options definitely do not suffice.
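A sketch of the usual curl approach, assuming a form-based SiteMinder login: the login endpoint (login.fcc) and the USER/PASSWORD/target field names are typical SiteMinder conventions but site-specific, so they are assumptions here. The idea is to capture the SMSESSION cookie once and reuse it:

Code:
# log in once and store the SMSESSION cookie in a cookie jar
curl -c sso_cookies.txt -L \
     -d 'USER=myuser' -d 'PASSWORD=mypass' -d 'target=/protected/file' \
     https://sso.example.com/login.fcc

# the saved cookie then authenticates the actual download
curl -b sso_cookies.txt -O https://server.example.com/protected/file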
I have written a batch file which will go to the website, wait for input (download button/exit), move to the next algorithm, and repeat. My problem is getting the batch file to click the stupid download button. Can I use wget, and can you show me how to use it, or point me to a really good API?
Code:
@ECHO OFF
ECHO INSTALLING ADOBE FLASH PLAYER PLUGIN UPDATE
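wget cannot click buttons (it is not a browser), but if the button is just a link to a file, you can fetch that file directly; a sketch with a placeholder URL (in a browser, right-click the button and copy the link address to find the real one):

Code:
wget -O install_flash_player.exe "http://example.com/path/to/installer.exe"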
I've been pulling my hair out trying to get wget to POST data to a webpage to automatically download some files. I've tried many variations of the syntax, but wget always downloads the HTML for the login page. A snippet of code I found in the login HTML page is below. Some of the characters are Japanese, because it's a Japanese website.
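When crafting --post-data by hand keeps failing, one workaround is to log in once in a browser, export its cookies in Netscape cookies.txt format (several browser extensions do this), and let wget reuse the session; a sketch with a placeholder URL:

Code:
wget --load-cookies cookies.txt http://example.co.jp/download/file.zip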
I am attempting to "export" the progress bar from the wget display using sed. Basically, we have an app that starts wget to download a large file, and we want to show a progress bar. Our application has a D-Bus interface to receive the download progress.
So we were thinking of a command like: wget [] | sed [] | dbus-send [] The problem at the moment is: how do you get the matched string out of sed and into dbus-send? I can get the progress string with: sed -u 's/[0-9]*%/&/'
This populates '&' with the correct percentage, but I cannot seem to get it out of sed.
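A sketch of one way to wire this up: wget writes its progress to stderr, so redirect that into sed, print only the percentage, and feed each value to dbus-send in a read loop. The D-Bus destination, object path, and interface below are placeholders for the app's own:

Code:
wget --progress=dot "$URL" 2>&1 | \
  sed -un 's/.* \([0-9]\{1,3\}\)% .*/\1/p' | \
  while read -r pct; do
    dbus-send --session --dest=org.example.App \
      /org/example/App org.example.App.Progress int32:"$pct"
  done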
If a wget download is interrupted (like if I have to shut down prematurely), I get a wget.log with the partial download. How can I later resume the download using the data in wget.log? I have searched high and low (including the wget manual) and cannot find how to do this. Is it so obvious that I did not see it? The wget -c option with the wget.log as the argument to -c does not work. What I do instead is open the wget.log, copy the URL, paste it into the command line, and do another wget. This works, but the download is started from the beginning, which means nothing in the wget.log is used.
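For what it's worth, -c takes no argument: wget -c resumes from the partially downloaded file itself (which must still be in the current directory), not from the log; the log only supplies the URL. A sketch that pulls the first URL out of the log and resumes (it assumes the first URL in wget.log is the download URL):

Code:
wget -c "$(grep -o 'https\?://[^ ]*' wget.log | head -n 1)"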
Does unistd.h declare functions in the kernel or in the standard C library? Or in some other C file? I need to know from the inside how some of the functions work.
My hardware is the FriendlyArm mini2440 Samsung board. I run our main application from init.d/rcS. It uses printfs to display continuous program debug information, which can be seen on the serial ttySAC2 console. I would like to be able to remotely view this information over a telnet connection when needed. What is the best way to accomplish this? The telnet user could possibly activate some script or program. I saw there is a ttysnoop program, but I could not get it to compile.
There must be a simpler way; in fact there must be several ways. Instead of using printf I could do something else. I do not want to use a normal log file on the flash file system. I had thought of using a RAM file in the /var directory, but that would complicate matters, as I would need to limit its size, and the telnet user needs to view the program's printfs in near real time.
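A sketch of one approach, assuming a busybox userland (typical on the mini2440): busybox syslogd can keep an in-memory circular buffer, which addresses both the size limit and the flash-wear concern, and logread -f gives the telnet user a live view:

Code:
# start syslogd with a 64 KB ring buffer in RAM (no flash writes)
syslogd -C64

# in rcS: pipe the app's stdout/stderr into syslog
./main_app 2>&1 | logger -t main_app &

# a telnet user can then follow the messages in real time
logread -f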
I am again struggling to make a script work, but hey, it is fun; I am learning new things. I discovered the set -x option, which was, for me, like the second coming. Still, what I am not able to do is redirect ALL output to a (log) file, including what is produced by the -x setting. Let's assume a very simple script:
Code:
#!/bin/bash
set -x
source="/home/atelier/Bureau/"
ls -la $source
and I am running it as . test.sh >> /var/log/test.rmcb.log
The result of ls does indeed go into the log file, but the rest still shows on the console where I am running the script:
Code:
++ source=/home/atelier/Bureau/
++ ls --color=auto -la /home/atelier/Bureau/
Is there a way to redirect EVERYTHING to the log file?
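The trace lines from set -x go to stderr, which the >> redirection does not capture; appending 2>&1 does. Either invoke the script as

Code:
. test.sh >> /var/log/test.rmcb.log 2>&1

or redirect inside the script itself, so every run is captured:

Code:
#!/bin/bash
# send stdout AND stderr (where the -x trace is written) to the log
exec >> /var/log/test.rmcb.log 2>&1
set -x
source="/home/atelier/Bureau/"
ls -la $source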
I feel kind of embarrassed posting here, but this is technically a scripting sub-forum. Here is the problem: I have a folder with various files, which include .txt files as well.
How can I redirect the same content to each of the .txt files in the folder?
I have tried
Code:
$ echo "hello" > *.txt
-bash: !": event not found
and also
Code:
cat ~/otherdir/test.txt > *.txt
-bash: *.txt: ambiguous redirect
Can anyone help me with this? OK, I solved it:
Code:
#!/bin/bash
for file in *.txt
do
    echo "Text that needs to be written" > "$file"
done
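An alternative one-liner: tee writes its input to every file named on its command line, and the shell expands *.txt to all of them:

Code:
echo "Text that needs to be written" | tee *.txt > /dev/null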
I have a somewhat complex Makefile system: a parent Makefile calls dozens of Makefiles in subdirectories, and the subdirectory Makefiles call shell scripts to do the real building. I want to capture all the output this Makefile system generates, so I employ "make 2>&1 > make.log", but not all output messages end up in make.log. The messages generated by the shell scripts called from the sub-makefiles are not recorded in make.log. Another curious thing is that if I launch "make 2>&1 > make.log" from a Perl script, all output does get sent into make.log.
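The likely culprit is the order of the redirections: they are processed left to right, so make 2>&1 > make.log first points stderr at the current stdout (the terminal) and only then sends stdout to the file, leaving everything on stderr (where much sub-make and script output goes) on the console. Reversing the order ties stderr to the file too:

Code:
make > make.log 2>&1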
I would really like to capture the output of scp and my file's progress. scp updates the transfer rate every second, and I would like to save the transfer rate at every update. So, for example, if the file transfer takes 30 seconds, I would like 30 reports of the transfer rate.
The output looks like: Code: file.dat 1% 3664KB 938.5KB/s 05:48
Whenever I try a simple redirect like: Code: scp file.dat 192.168.1.100:~/ &> output ... it does not save the rate at every update; it only shows the final rate.
If I try using typescript by starting "script" ... it's the same deal.
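Two things work against this: scp disables its progress meter when its output is not a terminal, and the meter redraws itself with carriage returns, so even a captured file looks like a single final line. A sketch that runs scp under a pseudo-terminal via script and then splits the updates onto separate lines:

Code:
script -q -c "scp file.dat 192.168.1.100:~/" /tmp/scp.raw
tr '\r' '\n' < /tmp/scp.raw > scp_progress.log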