I have a Debian Lenny box running Apache2 and PHP 5.2.6.
When a request is made via HTTPS, PHP displays the content fine. If the request is made over HTTP, the file is offered for download rather than being executed and displayed.
I know it's probably something trivial, but I've never seen this issue.
The plot thickens: I can display PHP over HTTP in some directories, but others still offer the file for download.
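Since HTTPS works, PHP itself is clearly installed; the usual culprit is that the PHP handler isn't active (or is overridden) for the directories served over plain HTTP. A few things worth checking on a Debian box, with the stock paths assumed:
Code:
# is the PHP module enabled for Apache at all?
ls /etc/apache2/mods-enabled/ | grep php
# enable it if missing, then reload
a2enmod php5
/etc/init.d/apache2 reload
# also look for per-directory overrides such as "php_flag engine off" or "RemoveHandler .php"
grep -r --include=.htaccess -i "php" /var/www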
For one project I use a web hosting service. I wanted the entire site to be HTTPS, so I bought a service from them in which they automatically install a trusted cert so people can access the site over the https protocol. Since HTTP is still available, though, I need an automatic rewrite or something to turn HTTP requests into HTTPS requests. (I don't have access to their Apache server configuration files or anything like that.) I found this code on the net to add to my .htaccess file:
Code:
# make sure the rewrite engine is actually switched on in the .htaccess as well
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
How do I best manage both HTTP and HTTPS pages on the same Apache server without conflicts? For example, if I have both 000-default.conf and 000-default-ssl.conf pointing to mydomain.com, and don't want users who visit mydomain.com without specifically typing the https prefix to be redirected to the HTTPS page - how do I handle users with browser plugins such as HTTPS Everywhere?
Another option would be to create a subdomain ssl.mydomain.com and have users who want to reach the SSL site type the ssl prefix. I have tested several things with HTTPS Everywhere enabled in my own browser, and it seems really hard to make this work the way I want; one way or another I always end up getting redirected to the SSL site automatically.
The reason I need this to work is that I run one site where I don't care much about SSL - that is the "official" part of the site - and I also host some things for friends and family on the SSL part. This would not have been a problem if I didn't use self-signed certificates for my SSL site, but most users become afraid when a certificate warning pops up in their browser and therefore leave the site.
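For what it's worth, the subdomain idea maps onto two completely separate vhosts with different ServerNames, so nothing on the server ever redirects plain visitors; a minimal sketch, with placeholder document roots and self-signed cert paths:
Code:
# 000-default.conf - the plain HTTP "official" site, no Redirect/RewriteRule to HTTPS anywhere
<VirtualHost *:80>
    ServerName mydomain.com
    DocumentRoot /var/www/official
</VirtualHost>

# 000-default-ssl.conf - only answers for the ssl. name
<VirtualHost *:443>
    ServerName ssl.mydomain.com
    DocumentRoot /var/www/family
    SSLEngine on
    SSLCertificateFile /etc/ssl/certs/selfsigned.crt
    SSLCertificateKeyFile /etc/ssl/private/selfsigned.key
</VirtualHost>
Note that plugins like HTTPS Everywhere rewrite the URL inside the visitor's browser before the request is ever sent, so no vhost setting can prevent that rewrite itself.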
My setup:
* a router/gateway - the external interface has the public IP, another one the DMZ, a third the internal network
* a DMZ with the web server
* an internal network (the public internet room)
I redirect HTTP port 80 to the web server, so you should be able to see it there. But I can't see this web site from the internal network. From the public IP/URL I get some sort of "non-existent" message (sorry, forgot to copy it). If I call the private IP, I get the home page (but not the CSS files). The gateway NATs the networks. What is the trick to see the web site from the internal network?
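What is usually missing in this topology is NAT reflection (hairpin NAT): connections from the internal network to the public IP never leave the gateway, so a port-80 DNAT rule written for the external interface never matches them. A minimal sketch, where the interface name and addresses are assumptions:
Code:
# eth2 = internal LAN, 203.0.113.10 = the public IP, 192.168.10.80 = the web server in the DMZ
iptables -t nat -A PREROUTING -i eth2 -p tcp -d 203.0.113.10 --dport 80 -j DNAT --to-destination 192.168.10.80
iptables -A FORWARD -i eth2 -p tcp -d 192.168.10.80 --dport 80 -j ACCEPT
The missing CSS when using the private IP is usually a separate issue: absolute URLs in the pages pointing back at the public name.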
I had set up an SSL secure server a while back, such that [url] works but [url] does not (note the difference: in the first I use HTTPS, whereas in the second I use HTTP). How can I get both to co-exist?
How can I set my server to listen on a different port for HTTP access? I would like to use port 8080 (to get around ISP blocks). Also, can I do the same thing for SFTP connections?
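A minimal sketch, assuming Apache and OpenSSH with their stock config files; SFTP rides on SSH, so changing the SSH port is what moves SFTP:
Code:
# Apache: change (or add) the Listen directive, e.g. in ports.conf or httpd.conf
Listen 8080
# SFTP is a subsystem of SSH, so moving SSH moves SFTP too: /etc/ssh/sshd_config
Port 2222
After restarting both services, clients use http://host:8080/ and sftp -P 2222 user@host, and the new ports need to be open in any firewall or router in front of the box.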
I have my Linux laptop running Karmic Koala at the moment. It is connected via CAT5 to a switch. The switch then connects to my router. All five of my computers are connected to the switch, actually. The only one that won't talk to any sites other than https secure sites is the Linux box. I am not well-versed in the inner workings of Linux and need some help with what I need to do so that regular http sites work. You guys always have the right answers, so I will wait humbly for your replies.
I have a server (Fedora 12) set up at a client's datacenter, and the network is set up to allow me SSH access into the server but prevents me from opening any connections from the server. However, I need to make HTTP and HTTPS requests from the server. What I'd like to do is forward all HTTP/HTTPS traffic through another machine outside the network.
I've been looking at the documentation for ssh and the various options there, and have gotten as far as enabling an ssh connection initiated from the client network back to my machine, but I'm not sure where to take it from there.
Here are some of the commands I've used so far:
Code:
I'm attempting to have port 80 traffic forwarded through my local machine. I assume I use "ssh -R" to create a reverse tunnel to forward requests, but I must be missing something.
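One way to wire this up, sketched under the assumption that a proxy (squid, privoxy, or similar) is already listening on port 3128 on the outside machine, and using the direction that is known to work (ssh in to the server); the -R forward publishes that proxy on the restricted server's loopback:
Code:
# run from the outside machine (the one that is allowed to connect in)
ssh -R 3128:localhost:3128 user@restricted-server
# then, on the restricted server, send web traffic through the tunnelled proxy
export http_proxy=http://127.0.0.1:3128
export https_proxy=http://127.0.0.1:3128
A plain -R forward carries a single TCP port, which is why the proxy on the far end is needed to fan out to arbitrary HTTP/HTTPS destinations; wget and curl both honour those environment variables.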
I am trying to configure iptables for only HTTP and HTTPS traffic. I start by blocking all traffic, which works, via:
Code:
iptables -F
iptables -P INPUT DROP
iptables -P FORWARD DROP
iptables -P OUTPUT DROP
I then try to allow HTTP and HTTPS on eth0 with these commands, which does not work:
Code:
iptables -A INPUT -i eth0 -p tcp --dport 80 -m state --state NEW,ESTABLISHED -j ACCEPT
iptables -A OUTPUT -o eth0 -p tcp --sport 80 -m state --state ESTABLISHED -j ACCEPT
Code:
iptables -A INPUT -i eth0 -p tcp --dport 443 -m state --state NEW,ESTABLISHED -j ACCEPT
iptables -A OUTPUT -o eth0 -p tcp --sport 443 -m state --state ESTABLISHED -j ACCEPT
After these commands I should be able to access the internet. Does anyone know why this is not working?
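Those rules open ports 80/443 for connections coming in to the box (i.e. for serving a website); for browsing out, the directions are reversed, and DNS is needed as well. A minimal sketch, keeping the same eth0 assumption:
Code:
# outgoing web requests and their replies
iptables -A OUTPUT -o eth0 -p tcp -m multiport --dports 80,443 -m state --state NEW,ESTABLISHED -j ACCEPT
iptables -A INPUT -i eth0 -p tcp -m multiport --sports 80,443 -m state --state ESTABLISHED -j ACCEPT
# DNS, so hostnames can be resolved at all
iptables -A OUTPUT -o eth0 -p udp --dport 53 -j ACCEPT
iptables -A INPUT -i eth0 -p udp --sport 53 -j ACCEPT
# loopback traffic is usually needed too
iptables -A INPUT -i lo -j ACCEPT
iptables -A OUTPUT -o lo -j ACCEPT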
I am working on a project to create a video conferencing environment. For this I use a default installation of BigBlueButton on Ubuntu 10.04. One of the main problems here is that it's not safe enough to share classified documents through this software. It's a simple web server that uses nginx. What I want to do is make this connection secure.
One of the problems is that I don't only have a connection through port 80; it uses the following ports: 80 (HTTP), 1935 (RTMP) and 9123 (desktop sharing). I would like to use a proxy instead of tunneling or a VPN to do this. Would anyone happen to know anything about squid or another equivalent to do this?
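Squid is an HTTP proxy, so it could only cover the port-80 part. One alternative worth naming, sketched here with placeholder ports and a placeholder certificate path, is stunnel, which wraps each plain TCP listener in TLS regardless of the protocol running inside:
Code:
# /etc/stunnel/stunnel.conf - hypothetical example
cert = /etc/stunnel/bbb.pem

[https]
accept  = 443
connect = 127.0.0.1:80

[rtmps]
accept  = 1936
connect = 127.0.0.1:1935

[desktop-sharing]
accept  = 9124
connect = 127.0.0.1:9123
Clients would then have to connect to the wrapped ports (443, 1936, 9124), which in BigBlueButton's case also means telling the client side to use the secured ports, so this only sketches the transport half of the job.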
When I click on an http link in Thunderbird, nothing at all happens. There's no error message on the console, there's no new browser starting, and there's no new tab opening with the browser already running.
I've tried: sudo update-alternatives --config x-www-browser, choosing firefox. I've also tried adding a new string value to Thunderbird's warranty-voiding config: network.protocol-handler.app.http, with a value of /usr/bin/firefox. This was recommended in various threads. But no luck. There's an entry for https - firefox on the Attachment tab of the preferences, and https links are indeed opening in Firefox. But not http. KDE 4.4.2 itself has the default mail client and browser set to Thunderbird 3.0.4 and Firefox 3.6.3 (both from the repositories, no website downloads).
What is the best way to go about setting up multiple virtual hosts on the same box, one using http and one using https/ssl? I'd like to serve them from the same ip address if possible; I know it's possible in apache 1.3.
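A common layout for this, sketched with Apache 2.2-style directives and placeholder names, paths and certificate files; without SNI a single IP can only present one SSL certificate, which is why the :443 block stands alone:
Code:
# name-based hosts on port 80 (Apache 2.2 syntax; 2.4 no longer needs NameVirtualHost)
NameVirtualHost *:80

<VirtualHost *:80>
    ServerName www.example.com
    DocumentRoot /var/www/plain
</VirtualHost>

# the SSL host on the same IP; Listen 443 usually comes from mod_ssl's own conf
<VirtualHost *:443>
    ServerName secure.example.com
    DocumentRoot /var/www/secure
    SSLEngine on
    SSLCertificateFile /etc/ssl/certs/secure.crt
    SSLCertificateKeyFile /etc/ssl/private/secure.key
</VirtualHost>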
Is there a driver or utility to mount an ISO image over HTTP or HTTPS? There is an httpDisk driver for Windows, but I can't find anything similar for Linux.
There may be a way of doing it with curl/wget, but I'm not sure if this is possible.
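If streaming access isn't required, the simplest curl/wget route is to fetch the image and loop-mount it locally; the URL and paths below are placeholders:
Code:
wget -O /tmp/image.iso http://example.com/path/to/image.iso
sudo mkdir -p /mnt/iso
sudo mount -o loop /tmp/image.iso /mnt/iso
For mounting without downloading the whole file first, a FUSE filesystem that exposes a remote file over HTTP (e.g. httpfs) is the closer equivalent of the Windows driver, but availability varies by distribution.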
I have 2 web servers in my office: HTTP and HTTPS. You will find the httpd.conf and ssl.conf attached. I can access the HTTPS server from home, but not the HTTP one.
What I did: configured the router to forward port 80 to my Fedora 11 machine, opened port 80 with system-config-network, and created a virtualhost.
The same exact steps have been done for port 443
I can access both servers locally, but only the HTTPS server remotely.
Here are my iptables :
Code:
You can try to access my servers using [url].
I made httpd listen on port 8080 and did all the port forwarding/opening stuff, and it works. So is it a bug?
Finally found my error: it seems like setting UseCanonicalName to Off did the trick.
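For reference, that is a single directive in httpd.conf; with it Off, Apache builds self-referential URLs from the hostname and port the client supplied rather than from ServerName:
Code:
UseCanonicalName Off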
I really think it's a bug now. It was definitely working last week; I just added content to the main host of my website, and now I can't access it from port 80. If anyone thinks it's not a bug, or finds something missing/wrong in my conf file, please let me know.
I need to redirect all HTTP/HTTPS/FTP traffic through a remote proxy, but when I change the connection settings in the browser or in System->Preferences->Network Proxy it doesn't work well: instead of getting the page content, the browser asks to save a short (8-byte) file with the same content for every requested page. It happens in Chrome/Opera/Firefox. The proxy requires authorization and runs on a computer with Windows XP. It worked fine when I was using Windows 7 and Proxifier; now I have Ubuntu 9.10 with all available updates.
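Proxifier speaks SOCKS, so one hedged guess is that the remote proxy is actually a SOCKS proxy, and entering it as an HTTP proxy in the browsers is what produces those junk responses. A quick way to test that theory from Ubuntu is proxychains; the address, port and credentials below are placeholders:
Code:
# add to the existing [ProxyList] section of /etc/proxychains.conf
[ProxyList]
socks5 203.0.113.5 1080 username password
Then run e.g. proxychains firefox and see whether pages load through it.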
I've noticed that when I open Firefox I get really strange HTTP and HTTPS connections showing up in Firestarter (which as I understand it is just a GUI for iptables). They connect to various parts of a site listed as 1e100.net (when you use "lookup hostnames"), such as wy-in-f18.1e100.net, and they stay connected the whole time as far as I can see unless I close Firefox. I've heard people say they are connections to Google, but I can close all tabs after logging out of Google and still see them... it's very odd.
I am doing this as a test for a bigger deployment. I have Apache running on CentOS 5 in a clean VM (just a few tools installed, PHP and such). If mod_ssl is set to listen on 443 in /etc/httpd/conf.d/ssl.conf, the site at https://192.168.1.137 loads just fine. If I change the listen port to e.g. 9443 in ssl.conf and reload httpd, the page won't load at https://192.168.1.137:9443. I set eth0 to be trusted and disabled SELinux in case that was interfering, but still no luck.
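A couple of things worth checking, assuming the stock CentOS ssl.conf: the VirtualHost has to move to the new port along with the Listen line, and if SELinux goes back to enforcing, the non-standard port needs the http label:
Code:
# /etc/httpd/conf.d/ssl.conf - both lines must name the new port
Listen 9443
<VirtualHost _default_:9443>
    SSLEngine on
    ...
</VirtualHost>
# with SELinux enforcing, label the non-standard port as well
semanage port -a -t http_port_t -p tcp 9443
# and confirm httpd is actually bound there
netstat -tlnp | grep 9443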
This is where it starts: I have 2 networks. The first, 192.168.1.0/24, is composed of the router (which has access to the internet, with the IP 192.168.1.1) and the server (which is a gateway) with the IP 192.168.1.42. The other network, 192.168.2.0/24, is composed of the gateway with the IP 192.168.2.1 and the clients (on the 192.168.2.0/24 subnet). To sum up, the gateway has 2 IPs (192.168.1.4 (eth0) and 192.168.2.1 (eth1)). On this gateway I have squid installed (listening on port 3128). I also made a redirection to send some computers that want to access the web (port 80) to squid (port 3128) with this command: /sbin/iptables -t nat -A PREROUTING -m mac --mac-source CLIENT_MAC -p tcp -m tcp --dport 80 -j REDIRECT --to-port 3128
At this stage, everything works fine. The clients can access the web through the proxy without "knowing" about it. What I wanted to do is also redirect port 443 (HTTPS). Currently, when a client wants to access, for example, [URL], he cannot. So I want to be able to route those requests out directly (without passing through any proxy), like NAT. But the problem is that I can't. The idea would be for the gateway to take all packets with destination port 443 and hand them to the router 192.168.1.1; then, when the router sends the packets back, the gateway passes them on to the client. I tried setting ip_forward to 1, but the problem is that all IPs and ALL PORTS get forwarded, and I only want port 443 to be forwarded.
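One way to sketch that: leave ip_forward on, NAT only port 443, and use the FORWARD chain to restrict what is allowed through (interface roles as described above, eth0 towards the router, eth1 towards the clients):
Code:
echo 1 > /proc/sys/net/ipv4/ip_forward
# NAT outgoing HTTPS so replies come back via the gateway
iptables -t nat -A POSTROUTING -o eth0 -p tcp --dport 443 -j MASQUERADE
# allow only HTTPS to be forwarded, plus its return traffic
iptables -A FORWARD -i eth1 -o eth0 -p tcp --dport 443 -j ACCEPT
iptables -A FORWARD -i eth0 -o eth1 -m state --state ESTABLISHED,RELATED -j ACCEPT
# everything else from the client side is not forwarded (port 80 still hits the REDIRECT to squid first)
iptables -A FORWARD -i eth1 -o eth0 -j DROP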
I've been trying to make myself anonymous, but I can't find Tor anywhere; I tried yum and KPackageKit and neither has it. I did find Privoxy, installed it, and set the proxy for HTTP and HTTPS in Firefox, but it says 'unknown proxy' when I try to use it! I've been to the Privoxy web site and read through the user manual, but most of it is 'geek' to me!
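Privoxy on its own is only an ad/privacy filter, not an anonymiser; it normally listens on 127.0.0.1 port 8118, which is the address and port Firefox has to point at for both HTTP and HTTPS. The relevant config lines look like this (the second one only applies once Tor is actually installed, and assumes Tor's usual SOCKS port):
Code:
# /etc/privoxy/config
listen-address 127.0.0.1:8118
# chain Privoxy to Tor's SOCKS port once Tor is installed
forward-socks4a / 127.0.0.1:9050 .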
I installed Nagios on my Ubuntu 10.04 server using apt-get, and when I accessed the web console everything was OK. I made some changes to Apache (creating some new virtual sites) and since then Nagios gives me a warning for HTTP with the message HTTP WARNING: HTTP/1.1 404 Not Found. The sites that I created are working perfectly. I noticed that the attempts are at 4/4. Does this need to be reset, or does Nagios automatically reset it once it detects the issue is resolved?
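A likely explanation is that after the virtual-host changes, the default check_http now lands on a vhost that returns 404 for "/"; pointing the check at a specific site usually clears it. A sketch, where the command name and hostname are placeholders:
Code:
define command{
    command_name    check_http_vhost
    command_line    $USER1$/check_http -I $HOSTADDRESS$ -H www.example.com -u /
    }
As for the 4/4 attempts, that takes care of itself: once the check returns OK, the service goes back to an OK state without any manual reset.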
I have Ubuntu Server (x64) installed on my box with Apache2 and Squid. For a while port 80 (HTTP) was fine; I could update packages and use wget. Then one random day port 80 traffic became blocked: I couldn't use apt-get and had to switch to an FTP mirror to update. wget is not working either.
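A couple of quick, non-destructive checks to narrow down whether the box itself (iptables, or squid acting as a transparent proxy) or the ISP is dropping the traffic; the mirror hostname is just an example:
Code:
# does a plain HTTP request leave the box at all?
wget -O /dev/null http://archive.ubuntu.com/ubuntu/
# is anything in the local firewall matching port 80?
sudo iptables -L -n -v | grep -E ':80|http'
# is a leftover transparent-proxy redirect to squid in place?
sudo iptables -t nat -L -n -v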
It appears that my ISP is blocking port 80, so I can't set up a proper website on my home computer. I'd like to choose a different port to use (they block 443 also), and I'm not sure how to do this with Fedora (or any Linux flavor, for that matter).
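A minimal sketch for Fedora, assuming Apache and the older iptables service; the port number is just an example of one the ISP isn't filtering:
Code:
# /etc/httpd/conf/httpd.conf
Listen 8080
# open the port in the firewall and keep it across reboots
iptables -I INPUT -p tcp --dport 8080 -j ACCEPT
service iptables save
service httpd restart
Visitors then need the port in the URL, e.g. http://yourhost.example.com:8080/.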