Fedora :: Cannot Test Proxies, Or Remote File Availability Without Both HTTP::Request::Common And LWP::UserAgent
Mar 30, 2010
I am trying to install LTIB for the P2020DS. I got the following error:
[hwtesting@HWLSRV1 ~]$ cd /home/hwtesting/ltib-p2020ds-20091119
[hwtesting@HWLSRV1 ltib-p2020ds-20091119]$ ./ltib
Don't have HTTP::Request::Common
Don't have LWP::UserAgent
Cannot test proxies, or remote file availability without both
HTTP::Request::Common and LWP::UserAgent
Add the following line to the User Privilege section:
hwtesting ALL = NOPASSWD: /bin/rpm, /opt/freescale/ltib/usr/bin/rpm
I edited the sudoers file with the visudo command and inserted this line just under the following line:
root ALL=(ALL) ALL
But I am still getting the following:
Don't have HTTP::Request::Common
Don't have LWP::UserAgent
Cannot test proxies, or remote file availability without both
HTTP::Request::Common and LWP::UserAgent
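Note that the sudoers change only covers rpm privileges; it does not by itself provide the missing Perl modules. A quick way to check for and install them, assuming a Fedora/RHEL-style host (package names may differ on other distributions):

Code:
# Check whether the modules load (exit status 0 means they are present):
perl -MHTTP::Request::Common -e 1
perl -MLWP::UserAgent -e 1
# On Fedora/RHEL both are normally provided by perl-libwww-perl:
sudo yum install perl-libwww-perl
# Or straight from CPAN:
sudo cpan LWP::UserAgent HTTP::Request::Common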
Question 1: Using netcat, nc(1), craft a valid HTTP/1.1 request for getting the HTTP headers (not the HTML file itself!) for the main index page of www.aalto.fi. What request method did you use? Which headers did you need to send to the server? What was the status code for the request? Which headers did the server return? Explain the purpose of each header.
I tried:

nc -v www.aalto.fi 8080
HEAD / HTML/1.1
host: www.aalto.fi

and it returned:

200 OK
Content-Length: 858
Content-Type: text/html
Last-Modified: Thu, 02 Sep 2010 12:46:01 GMT
...
I really don't know what this means. Question 2: Using netcat, nc(1), start a bogus web server listening on the loopback interface, port 8080. Verify with netstat(8) that the server really is listening where it should be. Direct your browser to the bogus server and capture the User-Agent: header. I don't understand the part "direct your browser to the bogus server and capture the User-Agent: header".
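For reference, a minimal sketch of both exercises, assuming a standard web server on port 80 for the first, and that your nc build accepts these flags for the second (some netcat variants use -l -p 8080 instead):

Code:
# Question 1: a HEAD request fetches only the headers. Note it is "HTTP/1.1",
# not "HTML/1.1", port 80 rather than 8080, and HTTP/1.1 requires a Host:
# header plus a blank line to terminate the request:
printf 'HEAD / HTTP/1.1\r\nHost: www.aalto.fi\r\nConnection: close\r\n\r\n' | nc www.aalto.fi 80

# Question 2: a "bogus" server is just nc listening; whatever the browser
# sends, including its User-Agent: header, is printed to the terminal.
nc -l 127.0.0.1 8080           # some versions: nc -l -p 8080
netstat -tln | grep 8080       # in another terminal: verify the listener
# Then browse to http://127.0.0.1:8080/ and read the captured request.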
Recently my ISP blocked any kind of HTTP proxy used in browsers. When I put my proxy settings into the browser, it keeps loading with no response. I have a Squid proxy running on my own server, and it worked fine before that change.
I am trying to set up a High-Availability HTTP Load Balancer with HAProxy & Heartbeat using the links below.
I have all RHEL 5.4 servers hosted on VMWare.
[url] [url]
This is the scenario, as given in the links, as well as in my setup:
Load Balancer 1
Load Balancer 2
Web Server 1
Web Server 2
I have followed all the steps mentioned in the links religiously except step 2.2, which asks to configure the vhosts. I could not really understand what is to be placed in the /etc/httpd/conf.d/vhosts.conf file, and on which web server.
Due to this step only, I think, I am failing the failover test given in point 4.1. I am able to open the webpage via [url], which gives the content of Web Server 1 (http1.example.com). But when I shut down the httpd service (to check failover), it does not show the contents of Web Server 2 (http2.example.com).
I do succeed in failover test 4.2, though, in which the shared IP 192.168.0.120 switches over when I start/stop either of the Load Balancers.
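In case it helps, a minimal sketch of what /etc/httpd/conf.d/vhosts.conf could contain. It goes on each web server, with the ServerName that web server owns in the guide (http1.example.com on Web Server 1, http2.example.com on Web Server 2); the DocumentRoot here is an assumption:

Code:
NameVirtualHost *:80
<VirtualHost *:80>
    ServerName http1.example.com
    DocumentRoot /var/www/html
</VirtualHost>

The failover in test 4.1, though, depends on HAProxy's health checks noticing that httpd is down on Web Server 1, so it is also worth reviewing the backend "server ... check" lines in haproxy.cfg.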
I can ssh to my server, which is on a LAN and accesses the 'Net through a Linksys modem/router. I want to be able to configure the router using its web interface, but the server only has a command-line interface and I can only run text browsers like Lynx, which means that, although I can log onto the router, the JavaScript routines prevent me from configuring it. I can't access the router's web interface from the 'Net because the router is set up to pass any requests on port 80 to the server. Is there any way I can communicate with the router by sending HTTP requests from my browser outside the LAN, having these relayed to the router by the server, and then the server relaying the responses back to my browser?
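One possibility, assuming the router's web interface is at 192.168.1.1 on the LAN (substitute your own address, and your own user/host), is an SSH local port forward through the server:

Code:
# Run on the outside machine; the server relays the connection to the router:
ssh -L 8080:192.168.1.1:80 user@server.example.com
# Then browse to http://localhost:8080/ -- the JavaScript runs in the full
# local browser while the HTTP traffic is tunnelled through the server.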
My application has to listen for HTTP requests, read the HTTP headers, and then forward the request through a proxy. All of this must be done in C/C++. Please help me; awaiting your reply.
Hi, in Squid I have blocked some sites like Facebook and ... I want to know: is there any way that when a user types www.facebook.com into his browser, instead of showing something like the following, it automatically redirects to www.google.com?
Error: The requested URL could not be retrieved. The following error was encountered: Access Denied.
Basically I want to redirect the HTTP request so that the user does not see the error page; instead, the www.google.com page should open automatically.
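Squid's deny_info directive does roughly this: when a request is denied via the named ACL, the client is redirected to a URL instead of receiving the error page. A minimal sketch, assuming an ACL called "blocked" (the ACL name is a placeholder):

Code:
# In squid.conf:
acl blocked dstdomain .facebook.com
http_access deny blocked
# Redirect denied requests instead of serving the Access Denied page:
deny_info http://www.google.com/ blocked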
I want to redirect the packets to the proxy server. Can you help me?

Present network:
My internal network ==> switch ==> proxy server ==> router ==> internet. (For internet access I connect via the proxy: in the web browser ==> LAN settings ==> proxy server IP address.)

What I want is:
My internal network ==> gateway or firewall ==> switch ==> proxy server ==> router ==> internet. (This gateway or firewall is where I would configure forwarding of HTTP requests to the proxy server; see the sketch below.)

This way I can separate my internal network from the intranet but still be able to access the internet.
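A rough sketch of what the gateway box could do with iptables, assuming the proxy is at 192.168.1.10 listening on port 3128 and the LAN side is eth0 (all addresses and interface names are placeholders). Note the proxy must run in transparent/intercept mode for redirected traffic to be handled correctly:

Code:
# On the gateway/firewall:
echo 1 > /proc/sys/net/ipv4/ip_forward
# Send all outbound web traffic from the LAN to the proxy:
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j DNAT --to-destination 192.168.1.10:3128
iptables -t nat -A POSTROUTING -j MASQUERADE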
I cannot get VMware Server to work properly running on Ubuntu Server 9.04.
Trying to access the web interface, I have to highlight the URL and keep hitting Enter several times to get to the login, and after logging in it is really slow and nothing works; I cannot create virtual machines.
I'm trying to see regular HTTP responses from my wireless iPad (victim) on my wired PC (attacker). Everything's working great, but I can only see the HTTP requests, not the responses.
I've done much reading and googling and tried registering on more relevant forums, but some of those forums were shut down, so I've come here.
Code:
# Set up IP forwarding:
echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward
# Use ettercap to do the MITM, using ARP poisoning only:
sudo ettercap --iface eth0 --text --plugin autoadd --only-mitm --mitm arp:remote /192.168.0.1/ /192.168.0.155/
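Before blaming ettercap's display, it may help to confirm whether the responses actually traverse the attacker at all, e.g. with tcpdump (assuming the iPad is 192.168.0.155 as in the command above):

Code:
# Both directions should appear here if the ARP poisoning is working:
sudo tcpdump -i eth0 -n host 192.168.0.155 and tcp port 80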
I have to retrieve an HTTP request from a particular port using libcurl; I'm using localhost. I am done with retrieving the HTTP request using socket programming. How do I start integrating libcurl into simple socket-programming code?
I am forwarding HTTP requests to an internal server. It is quite successful, but the access logs do not show the IP of the external machine; rather, they show the IP of the machine on which I have enabled port forwarding.
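If the forwarding is done with iptables, the usual cause is an SNAT/MASQUERADE rule matching the forwarded traffic; plain DNAT preserves the client address. A sketch with a placeholder internal server address (192.168.1.20):

Code:
# DNAT alone keeps the original source IP in the internal server's logs:
iptables -t nat -A PREROUTING -p tcp --dport 80 -j DNAT --to-destination 192.168.1.20
# A POSTROUTING MASQUERADE/SNAT rule covering this traffic rewrites the source
# to the forwarder's IP; it is only needed when the internal server does not
# route its replies back through the forwarding machine.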
I installed WordPress 3.x on my localhost/Apache server, but I can neither install plugins nor update anything. This happens with both the stable WP 3.0 version and the 3.1 beta. When I try to search the Plugin Directory from the WP dashboard, I get this message: "An Unexpected HTTP Error occurred during the API request." When I run an update, I get a page asking for the login credentials of the ftp user ("To perform the requested action, WordPress needs to access your web server. Please enter your FTP credentials to proceed. If you do not remember your credentials, you should contact your web host."). Since I'm part of the 'ftp' group on the system, I enter my system login information, click Proceed -- and get a blank page that does nothing.
I've gone to YaST, and I see that the system ftp user has a 6-character password (which may or may not be mine). I'm afraid to change it and risk screwing up other FTP-related functions. I'm running openSUSE 11.3 and am obsessive about updating. I will note that I have an old 2WIRE router that often requires me to enter IP addresses instead of DNS-based URLs (including for Zypper repos) to successfully download stuff. Not sure if this is related, but just in case...
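WordPress falls back to asking for FTP credentials when the web-server user cannot write to the WordPress tree, so the fix is usually file ownership, not the ftp account. A sketch assuming the default openSUSE Apache user/group (wwwrun/www) and an install under /srv/www/htdocs/wordpress (both the path and the names are assumptions):

Code:
sudo chown -R wwwrun:www /srv/www/htdocs/wordpress
# Alternatively, force direct filesystem access in wp-config.php:
#   define('FS_METHOD', 'direct');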
I am trying to install MySQL 5.1.44, so I downloaded the binary package, extracted it, and then followed the instructions in the manual, but I keep getting this error when running the install command:
Installing MySQL system tables...
100315 20:07:27 [Warning] Can't create test file /var/lib/mysql/mosty.lower-test
100315 20:07:27 [Warning] Can't create test file /var/lib/mysql/mosty.lower-test
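"Can't create test file" at this stage almost always means the mysql user cannot write to the data directory. A sketch assuming the default /var/lib/mysql datadir and that the install script is run from the unpacked binary directory:

Code:
sudo chown -R mysql:mysql /var/lib/mysql
sudo scripts/mysql_install_db --user=mysql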
Usually we use VNC to take remote sessions. There was another one; I think it was called xrdp. I am asking this out of curiosity: is there any way to take remote sessions using HTTP? Like in web conferencing, where we invite users to join the conference and are then able to share the desktop. Is there any way to do this on a one-to-one basis? Does such a technology exist for Linux (for any distro)?
I'd like to remotely administer my Linux machine at home while I'm at work. Only ports 80 and 443 are available, through an HTTP proxy. I don't want to install tunnelling software. What I really need is something that will run on my server and display a console inside a web browser.
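One tool in this spirit is shellinabox, which serves a login shell over HTTP/HTTPS so only a browser is needed on the work side (the package name is assumed from the Debian/Ubuntu repositories; it listens on port 4200 by default):

Code:
sudo apt-get install shellinabox
# Run it on 443 if that is the only port the proxy allows (needs root):
sudo shellinaboxd --port=443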
Newbie on the block with CentOS 5.3 on a test server and Ubuntu on a test workstation. I note the wiki on VNC for a Windows workstation, but presume I could do this more easily from "Jaunty" (Ubuntu 9.04).
I have a problem: how do I redirect the local HTTP port to a remote IP address (192.168.10.64) using iptables? My distro is CentOS 5.3. My rule is this:

iptables -t nat -A PREROUTING -s 0/0 -d <my local ip> -p tcp --dport 80 -j DNAT --to-destination 192.168.10.64
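The DNAT rule by itself is often not enough; forwarding has to be enabled and the rewritten packets allowed through the FORWARD chain. A sketch using the same addresses as above:

Code:
echo 1 > /proc/sys/net/ipv4/ip_forward
iptables -A FORWARD -p tcp -d 192.168.10.64 --dport 80 -j ACCEPT
# If replies from 192.168.10.64 do not route back through this box, also add:
# iptables -t nat -A POSTROUTING -p tcp -d 192.168.10.64 --dport 80 -j MASQUERADE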
I think I had this problem before, but I don't remember what was done to correct it. I just tried to do an update and got this error message:
Code:
Test Transaction Errors:
file /usr/lib/gstreamer-0.10/libgstfbdevsink.so from install of gstreamer-plugins-bad-free-extras-0.10.18-1.fc12.i686 conflicts with file from package gstreamer-plugins-bad-0.10.17-2.fc12.i686
file /usr/lib/gstreamer-0.10/libgstoss4audio.so from install of gstreamer-plugins-bad-free-extras-0.10.18-1.fc12.i686 conflicts with file from package gstreamer-plugins-bad-0.10.17-2.fc12.i686
file /usr/lib/gstreamer-0.10/libgstsdl.so from install of gstreamer-plugins-bad-free-extras-0.10.18-1.fc12.i686 conflicts with file from package gstreamer-plugins-bad-0.10.17-2.fc12.i686
file /usr/lib/gstreamer-0.10/libgstshapewipe.so from install of gstreamer-plugins-good-0.10.21-1.fc12.i686 conflicts with file from package gstreamer-plugins-bad-0.10.17-2.fc12.i686
file /usr/bin/gst-camera from install of gstreamer-plugins-bad-free-0.10.18-1.fc12.i686 conflicts with file from package gstreamer-plugins-bad-0.10.17-2.fc12.i686
file /usr/bin/gst-camera-perf from install of gstreamer-plugins-bad-f...
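The conflicting files are all owned by the older gstreamer-plugins-bad package, which the new -free/-free-extras split replaces, so one common way out (a suggestion, not a guaranteed fix) is to remove the old package first and then update:

Code:
sudo yum remove gstreamer-plugins-bad
sudo yum update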
When I try to access any page, even small HTML pages, it stays about 3 seconds in the "HTTP request sent; waiting for response." state, even when I use Lynx locally on the server, bypassing any possible network issues. The logs don't show a thing. The server itself is a high-end server with nothing running on it apart from Apache, which is not serving any clients right now; the firewall is disabled and HostnameLookups is set to Off.
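A consistent 3-second stall even via Lynx on localhost looks more like a timeout (DNS or similar) than load; curl's timing variables can show where the delay sits (the URL is a placeholder):

Code:
curl -s -o /dev/null -w 'dns:%{time_namelookup} connect:%{time_connect} first-byte:%{time_starttransfer} total:%{time_total}\n' http://localhost/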
I need to redirect all HTTP/HTTPS/FTP traffic through a remote proxy, but when I change the connection settings in the browser or in System->Preferences->Network Proxy it doesn't work well: instead of getting the page content, the browser asks to save some short (8-byte) file with the same content for all requested pages. It happens in Chrome, Opera and Firefox. This proxy requires authorization and works on a computer with Windows XP. It worked well when I was using Windows 7 and Proxifier; now I have Ubuntu 9.10 with all available updates.
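Since Proxifier handled the proxy's authentication on Windows, one guess is that the upstream proxy wants NTLM authentication, which browser proxy settings alone handle poorly; on Linux, cntlm can play the Proxifier role. A sketch where every value in the config is a placeholder:

Code:
# ~/cntlm.conf:
#   Username  myuser
#   Domain    MYDOMAIN
#   Proxy     proxy.example.com:8080
#   Listen    3128
cntlm -c ~/cntlm.conf
# Then point the browser at localhost:3128.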