General :: Wget Works But Elinks Won't Go Online?
Apr 11, 2011
I've got a bit of a tricky situation with a Red Hat box. Our network uses a proxy server to connect to the internet. I exported its address with "export http_proxy=http://username:password@server_ip:port", and to test it I ran wget against www.google.co.za and got a response, but using elinks to connect to Google gives a "proxy requires authentication" error. It can't possibly be the credentials, since wget can connect. I've confirmed that the proxy variable is exported by running "env | grep -i proxy", and I also tried appending that export line to "/etc/profile" and/or "/root/.bash_profile" with no luck. One other thing I noticed is that passing the proxy settings directly to elinks works; not only that, but "ssh -X server_ip_address", then running "firefox &" and giving it the proxy address, also works. Is there any other way to make this setting global, i.e. to tell all applications that need an internet connection to use the proxy? The server is at a remote location and I only have shell access.
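One way to make the proxy settings global for every login shell, sketched below on the assumption that this Red Hat release sources /etc/profile.d/*.sh at login; the username, password, host and port are placeholders.
Code:
# /etc/profile.d/proxy.sh -- sourced by login shells on most Red Hat systems.
# Export both lower- and upper-case variants, since different tools check
# different spellings. All values below are placeholders.
export http_proxy="http://username:password@proxy.example.com:3128"
export https_proxy="$http_proxy"
export ftp_proxy="$http_proxy"
export HTTP_PROXY="$http_proxy" HTTPS_PROXY="$http_proxy" FTP_PROXY="$http_proxy"
Note that elinks can also read proxy settings from its own configuration, which may be why passing them to it directly works, so it is worth checking ~/.elinks/elinks.conf as well.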
View 2 Replies
Dec 11, 2010
Would it be possible to use wget to order something on, e.g., [URL]?
View 4 Replies
View Related
Apr 8, 2011
I am interested in making wget do a slightly different job for me. I have downloaded it, built it (1.12), and it works perfectly right out of the box. I would like to have it log in to my creditcards.citi.com HTTPS website, supply my user id and my password, "select NEXT-SCREEN label=Account Activity", and then capture the account activity that comes back.
I got these three values from my Firefox Selenium script, which runs perfectly time after time. The big-picture goal is to be able, from a crontab, to dump my account activity every night at midnight. I am not married to this idea if anyone has a better or different route.
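A hedged sketch of the wget side, in case it helps: keep the session cookie from the login POST, then fetch the activity page. The URLs and form field names below are placeholders, not Citi's real parameters, which would have to be read out of the login page (or out of the Selenium script).
Code:
# Log in, saving the session cookie; field names and URLs are placeholders.
wget --save-cookies /tmp/cookies.txt --keep-session-cookies \
     --post-data 'userId=myuser&password=mypass' \
     -O /dev/null 'https://example.com/login'
# Reuse the cookie to pull the account-activity page.
wget --load-cookies /tmp/cookies.txt \
     -O "$HOME/activity-$(date +%F).html" 'https://example.com/account/activity'
A crontab entry such as "0 0 * * * /home/user/bin/dump-activity.sh" would then run the script every night at midnight.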
View 1 Replies
View Related
Jul 6, 2010
I just installed 10.04 on my laptop, a Compaq 6510b that was running XP. Ubuntu can't seem to connect to the internet unless I have it hooked up via an ethernet cable, even though the wifi recognizes my network. In Windows I used to log in to my broadband connection via the wireless network.
View 3 Replies
View Related
Feb 19, 2010
I tried to set up my internet connection using the network manager rather than the classic ifup method. At first it just didn't work, but after deactivating IPv6 I got it going: Firefox can connect to the internet. What troubles me is the online update with the Update Applet. The following message pops up: "Error: PackageKit error repo-not-available: Failed to download /media from ..." Also, when I go to YaST -> Software -> Software Repositories I get the message: "System administration is blocked by PID 2459 (/usr/sbin/packagekitd). Close the application and try again." Running "sudo zypper refresh" produces the same message. I have already spent hours trying to figure out the problem, without luck.
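A hedged sketch of the usual workaround for the lock message, assuming it is acceptable to stop the PackageKit daemon on this box:
Code:
# Stop the packagekitd instance that is holding the package-management lock,
# then retry the refresh.
sudo killall packagekitd
sudo zypper refresh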
View 3 Replies
View Related
Oct 1, 2010
At some point, the install process asks whether to test internet access. I replied Yes, and the test failed. Retries also failed, leading me to conclude, mistakenly, that the ethernet card was not supported by SuSE 11.2. Finally I elected to skip the test, resigned to the loss of internet access. The install program then proceeded to access online repositories without difficulty, and I had no further difficulty accessing the internet. So why the misleading test result?
View 3 Replies
View Related
Oct 3, 2010
I'm creating a website to sell some materials, but I don't know how to handle online payment. How does online payment work?
View 2 Replies
View Related
Jun 19, 2011
If a wget download is interrupted (for example, if I have to shut down prematurely), I get a wget.log with a record of the partial download. How can I later resume the download using the data in wget.log? I have searched high and low (including the wget manual) and cannot find how to do this. Is it so obvious that I did not see it? The wget -c option with wget.log as its argument does not work. What I do now is open wget.log, copy the URL, paste it into the command line and run another wget. This works, but the download starts from the beginning, which means nothing in wget.log is used.
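For what it's worth, wget -c resumes from the partially downloaded file itself, not from the log, so the trick is to re-run the same URL in the directory where the partial file still sits; a minimal sketch with a placeholder URL:
Code:
# -c continues the partial file of the same name; -o merely chooses the log
# file (use -a instead to append to an existing log).
wget -c -o wget.log 'http://example.com/big-file.iso'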
View 2 Replies
View Related
Mar 9, 2011
I installed elinks from a Slackware (Slack 12.2) package, and nothing JavaScript-related works. Is that something I can only enable at compile time, or is there something in the setup menu I can do to turn it on?
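A quick check, assuming the binary reports its build features: if ECMAScript (elinks' JavaScript engine) is not listed, it was left out at compile time and no runtime option can switch it on.
Code:
# Look for ECMAScript in the feature list of this elinks build.
elinks --version | grep -i ecmascript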
View 1 Replies
View Related
Jan 29, 2010
How can I get elinks to open PDFs using xpdf? I know that there are other browsers/PDF programs, but I'm using this across the internet and it seems to be the lowest-bandwidth option!
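One common approach, assuming this elinks build honours ~/.mailcap for external viewers: map application/pdf to xpdf and elinks should offer it when a PDF link is followed.
Code:
# Hand PDFs to xpdf; %s is replaced with the path of the downloaded file.
echo 'application/pdf; xpdf %s' >> ~/.mailcap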
View 1 Replies
View Related
Feb 9, 2011
I installed elinks on my computer, but I didn't realize until afterward that it was going to be just black and white by default. I did some googling and saw ways to edit a file before compiling so that colour is enabled. My question: is it possible to turn the colors on AFTER compiling it?
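If the binary itself was built with colour support, it can usually be switched on at runtime via Setup -> Terminal options, or with a config line like the hedged sketch below; the terminal name "xterm" is an assumption and should match whatever $TERM elinks sees.
Code:
# ~/.elinks/elinks.conf -- 0 = mono, 1 = 16 colours (higher values need
# 88/256-colour support compiled in).
set terminal.xterm.colors = 1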
View 1 Replies
View Related
Jan 13, 2010
After an afternoon of googling, I'm beginning to wonder if I'm the only rxvt/elinks user who also wants to copy text from the browser! My problem is this: rxvt uses Shift-Left-Click to paste, and elinks uses Shift-Left-Click to select. I need a way to change either elinks (preferably without disabling mouse support, which does solve the problem, BTW) or rxvt. rxvt is on Cygwin, and elinks is on Arch Linux.
View 2 Replies
View Related
Aug 5, 2010
I installed a Nautilus plug-in which in turn installed the elinks browser, and now all of my .html files open in elinks instead of Firefox. How do I get .htm/.html and other web-type files to open in Firefox instead of elinks?
I also tried removing elinks from my system, but then the files just open in gedit, so again: how do I associate .html (and other web files) with Firefox?
BTW, Firefox is already set as my default browser in System > Prefs > Preferred Applications > Web Browser.
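One hedged option on a GNOME desktop of that era, assuming a firefox.desktop entry is installed: set the MIME defaults from the command line rather than through the preferences dialog.
Code:
# Point the HTML MIME types back at Firefox.
xdg-mime default firefox.desktop text/html
xdg-mime default firefox.desktop application/xhtml+xml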
View 3 Replies
View Related
Jul 19, 2010
Before I configured the proxy server, elinks opened web pages fine. I configured a squid proxy server, but now elinks displays the error "Unable to retrieve web page". Squid is running successfully, but the configuration is not working. My configuration is:
Code:
acl mynet 192.168.1.9/255.255.255.0
http_access deny mynet
This simple configuration is not working and I don't know what I put wrong. Note: CentOS 4 is running as a guest operating system inside Windows XP (Microsoft Virtual PC 2007).
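For comparison, a minimal working sketch (the network range is a placeholder): the acl needs a type such as src, and the local network normally gets an allow rule before the catch-all deny rather than a deny of its own.
Code:
# Define the local network and allow it before denying everything else.
acl mynet src 192.168.1.0/255.255.255.0
http_access allow mynet
http_access deny all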
View 1 Replies
View Related
Sep 6, 2011
I need to mirror a website. However, each of the links on the site's pages is actually a 'submit' to a CGI script that brings up the resulting page. AFAIK wget should fail on this, since it needs static links.
View 1 Replies
View Related
Mar 5, 2010
I have a computer running Linux with several network cards, for example eth0, eth1, eth2 and eth3. Is there some way to run a downloader, like aria2 or wget, through only one interface, for example eth0?
The main problem: for some reason I can't use iptables.
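A hedged sketch for wget: it cannot name an interface directly, but it can bind to the address assigned to that interface, which has much the same effect; the address and URL are placeholders.
Code:
# Bind outgoing connections to eth0's address instead of letting the kernel
# choose the source interface.
wget --bind-address=192.168.0.10 'http://example.com/file.iso'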
View 2 Replies
View Related
Oct 6, 2010
I'm writing a wget script called wget-images, which should download images from a website. It looks like this now:
wget -e robots=off -r -l1 --no-parent -A.jpg
The thing is, when I run ./wget-images www.randomwebsite.com in the terminal, it says:
wget: missing URL
I know it works if I put the URL in the file itself and then run it, but how can I make it work without adding any URLs to the file? I want to put the link on the command line and have the script understand that I want the pictures from the link I just passed as a parameter.
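A minimal sketch of the script with the URL taken from the command line; "$1" is the first argument passed to the script, so ./wget-images www.randomwebsite.com works without editing the file.
Code:
#!/bin/sh
# wget-images: download .jpg images from the site given as the first argument.
wget -e robots=off -r -l1 --no-parent -A.jpg "$1"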
View 1 Replies
View Related
Mar 14, 2011
I use this command to download: wget -m -k -H URL... But if some file can't be downloaded, wget retries it again and again. How do I skip such a file and download the other files instead?
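A hedged sketch: capping the retry count and the timeout makes wget give up on a broken file quickly and move on to the rest; the URL is a placeholder.
Code:
# --tries limits retries per file, --timeout caps how long each attempt waits.
wget -m -k -H --tries=2 --timeout=30 'http://example.com/'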
View 1 Replies
View Related
Apr 2, 2011
I need to download about 100 packages, so I'm using wget-list to make it easier. My question, however, is: once I've made the list (I assume it's in .txt format), is there a way I can insert comments into it that wget will ignore? Something like this:
#This is a comment
http://someurl.com
http://anotherurl.com
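I'm not certain wget itself skips comment lines in an input file, so a safe sketch is to strip them before wget reads the list; wget -i - takes the URL list from standard input (the file name wget-list is just the one from the question).
Code:
# Drop lines starting with '#' and feed the remaining URLs to wget.
grep -v '^#' wget-list | wget -i -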
View 2 Replies
View Related
Aug 9, 2011
I have a crontab entry that wgets a PHP page every five minutes (just to run some PHP code), and I want to send the output to /dev/null. I couldn't find how to do this in the wget manual.
I'm looking for something like:
wget -o stout http://whatever.com/page.php > /dev/null
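A hedged sketch covering both kinds of output: -O /dev/null discards the downloaded page itself, while -q (or -o /dev/null) silences wget's own log, so cron has nothing to mail.
Code:
# Fetch the page, then throw away both the document and the log output.
wget -q -O /dev/null 'http://whatever.com/page.php'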
View 4 Replies
View Related
Jan 15, 2010
I had the unpleasant surprise that wget doesn't re-download when a file of the same name already exists.
Is there an option to force it to re-download without deleting the file first on Linux?
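A hedged sketch: -O makes wget write, and overwrite, the named output file regardless of what already exists; alternatively -N re-downloads only when the remote copy is newer. The file name and URL are placeholders.
Code:
# Overwrite the local copy unconditionally.
wget -O file.tar.gz 'http://example.com/file.tar.gz'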
View 3 Replies
View Related
Jan 5, 2010
I have a website that I need to visit often to disable a monitor. To disable it I need to log in to the website, click on the monitor, and then uncheck a box.
I am told that I can do this from a script using the wget command. I got the parameterized query and then tried to execute it through a *.sh script.
The script generates a PHP file in the location from which it is executed, but when I go to the site and check, the monitor is not disabled.
View 2 Replies
View Related
Oct 1, 2010
I'm trying to access a site through a Perl script for a project of mine, and I use a system call for a wget.
The login form is this
Code:
I mean, should I add all the hidden fields to --post-data? Should I try using Perl's md5 function for the last two fields? Does anyone have an idea of which elements I should be sending along with --post-data?
Is there a way to --load-cookies from Mozilla or something similar, instead of creating new cookies with wget?
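A hedged sketch of the wget side, assuming the browser's cookies have been exported in the classic Netscape cookies.txt format; the field names are placeholders for whatever the form actually contains, hidden fields included.
Code:
# Reuse existing browser cookies and post the form fields, visible and hidden.
wget --load-cookies cookies.txt --keep-session-cookies \
     --post-data 'user=myuser&pass=mypass&hidden1=value1&hidden2=value2' \
     -O result.html 'http://example.com/login.cgi'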
View 1 Replies
View Related
Mar 6, 2011
I would like to use wget to download a file from a Red Hat Linux server to my Windows desktop. I tried some parameters but it still does not work. Can anyone advise whether wget can download a file from a Linux server to a Windows desktop, and if so, how to do it?
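For what it's worth, wget always pulls files toward the machine it runs on, so it would have to run on the Windows side against a service (HTTP, FTP) that the Red Hat box exposes; a hedged sketch with a placeholder host and path.
Code:
# Run on the Windows machine (e.g. a Windows build of wget), pulling a file
# that the Linux server publishes over HTTP.
wget http://linux-server.example.com/pub/file.tar.gz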
View 14 Replies
View Related
Oct 5, 2010
I am using Ubuntu 10.04. I have downloaded things using wget, like wget [URL], where that page gets downloaded. Secondly, with sudo apt-get install perl-doc I installed the documentation for Perl, and I have the same for PostgreSQL. How do I use this Perl documentation to learn Perl?
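Once the perl-doc package is installed, perldoc is the usual reader; a few example invocations:
Code:
# Browse the installed Perl documentation from the terminal.
perldoc perlintro     # beginner-level introduction
perldoc perltoc       # table of contents of the standard documentation
perldoc -f print      # documentation for a single built-in function
perldoc List::Util    # documentation for an installed module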
View 1 Replies
View Related
Oct 7, 2010
I want the wget command to work on my Linux machine. This is the output of the uname command on my machine: Linux kalpana
Quote:
I get the error "-ksh: wget: command not found". So can anyone tell me how to install the wget utility on my machine?
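Without knowing the distribution, a hedged sketch of the usual package-manager commands; the package is simply called wget on the major distributions.
Code:
# Pick the line matching the distribution (run as root or via sudo).
yum install wget         # Red Hat / CentOS / Fedora
apt-get install wget     # Debian / Ubuntu
zypper install wget      # openSUSE / SLES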
View 5 Replies
View Related
Jun 25, 2010
I am trying to download data/a file from a web server where htpasswd protection has been set up. I have tried with a browser and it works fine, but when trying the same with wget it does not work. How do I download the file? Below is the command I am using: [URL]... with admin[:]password (the colon is bracketed so the smiley filter does not override it).
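A hedged sketch that avoids embedding the credentials in the URL at all, which also sidesteps the forum mangling; the host and path are placeholders, and the user/password are whatever the htpasswd file expects.
Code:
# HTTP basic authentication via explicit options instead of user:pass@host.
wget --http-user=admin --http-password=secret 'http://example.com/protected/data.csv'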
View 4 Replies
View Related
Mar 20, 2011
When I want to use wget to download some file over HTTP, which conditions must be fulfilled on the server for that to succeed? I mean things such as the httpd service running, and so on.
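Essentially the web server (httpd or equivalent) must be running, reachable on its port through any firewall, and the file must be readable under the document root; a hedged client-side check with a placeholder URL:
Code:
# --spider asks the server about the file (headers only) without downloading
# it, which quickly shows whether the server side is set up correctly.
wget --spider http://server.example.com/path/file.iso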
View 1 Replies
View Related
Apr 12, 2010
I am trying to use wget to access a RESTful interface, but I cannot figure out how to do an HTTP PUT with wget. How can I do it? Or isn't it possible?
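As far as I know, older wget releases have no PUT support at all and curl is the usual workaround, while wget 1.15 and later gained a --method option; both are shown below as a hedged sketch with a placeholder URL and file.
Code:
# curl: upload data.xml with an HTTP PUT request.
curl -T data.xml http://example.com/resource
# wget 1.15 or newer only:
wget --method=PUT --body-file=data.xml http://example.com/resource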
View 2 Replies
View Related
Jun 29, 2010
I'm trying to download two sites for inclusion on a CD: URL... The problem I'm having is that these are both wikis. So when downloading with e.g.: wget -r -k -np -nv -R jpg,jpeg,gif,png,tif URL... Does somebody know a way to get around this?
View 2 Replies
View Related