Networking :: Setting Up To View One Website Through Another?
Mar 16, 2010
I have several web servers, say myserver1, myserver2, and myserver3, behind a firewall that the higher-ups run. They recently changed the firewall to block all ports except for the ones they want open. I have www.myserver1.com viewable to the outside world, but www.myserver2.com is not viewable to the outside world. I was wondering if there is a way to set things up so that people could go to [URL] and view the [URL] and [URL] website content.
myserver1 is running Solaris 7, which will (hopefully) be upgraded to Ubuntu 10.04. myserver2 is running Ubuntu 8.04. myserver3 is running Windows XP. I was wondering if this is possible with any OS, not just mine.
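One common way to do this, assuming the firewall still lets port 80 (or 443) through to myserver1 and that myserver1 runs Apache with mod_proxy, is a reverse proxy: myserver1 fetches pages from the internal servers and serves them under paths of its own. The internal hostnames and paths below are made up for illustration.
Code:
# sketch of a reverse proxy on myserver1 (on Ubuntu: sudo a2enmod proxy proxy_http first)
ProxyRequests Off
ProxyPass        /server2/ http://myserver2.internal.lan/
ProxyPassReverse /server2/ http://myserver2.internal.lan/
ProxyPass        /server3/ http://myserver3.internal.lan/
ProxyPassReverse /server3/ http://myserver3.internal.lan/
Visitors would then browse to http://www.myserver1.com/server2/ and Apache would pull the content from myserver2 over the internal network.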
I run a gaming clan forum on IPBFree. By all accounts the servers are down, permanently, taking with them two years of my clan's forum posts. I was wondering if Firefox or Chrome keep some sort of local cache on the HD from which I could try to salvage as much data as possible.
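Both browsers do keep a disk cache, though only of pages you actually visited recently, and the files are stored in an internal format; the paths below are the usual Linux locations but vary by browser version and distro.
Code:
ls ~/.mozilla/firefox/*/Cache 2>/dev/null          # Firefox profile disk cache
ls ~/.cache/chromium/Default/Cache 2>/dev/null     # Chromium
ls ~/.cache/google-chrome/Default/Cache 2>/dev/null # Google Chrome
In Firefox, about:cache lists the cached entries and lets you pull individual pages back out.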
When I type localhost I can see the page, when I type 127.0.0.1 I can see the page, and when I type 192.168.0.30 I can view the page, but when I type websitename.dyndns.info or 96.24.144.216 I can't see the page.
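When the LAN address works but the public name and IP don't, the usual suspects are the router's port forwarding and NAT loopback (many home routers won't route a request from inside the LAN back in via the public IP, even when outside visitors get through fine). A quick way to separate the two, assuming you can reach some machine outside your network, is to test from there rather than from your own LAN:
Code:
# run these from a host OUTSIDE your network
curl -I http://websitename.dyndns.info/
curl -I http://96.24.144.216/
If those work, only loopback from inside is failing; if they don't, the router's port forward (or the ISP blocking port 80) is the place to look.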
I am running openSUSE with LAMP, and this is my first time setting up this type of server (I'm usually a Windows junkie). The problem I am having is that I am unable to view my website from outside the local network. I have set up my router for dynamic DNS and forwarded all the ports through the router and the local firewall. I ran the Apache setup through YaST2 and everything seems fine locally, but when I attempt to access it from elsewhere it doesn't connect. computechsolutions.dyndns.biz is the dynamic DNS address I have set up through my router.
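Two things worth checking on the openSUSE side, sketched here on the assumption that SuSEfirewall2 is the local firewall in use: that Apache is listening on all interfaces rather than just 127.0.0.1, and that port 80 is in the firewall's external service list.
Code:
netstat -tlnp | grep :80                                # should show 0.0.0.0:80, not 127.0.0.1:80
grep FW_SERVICES_EXT_TCP /etc/sysconfig/SuSEfirewall2   # e.g. FW_SERVICES_EXT_TCP="80"
sudo rcSuSEfirewall2 restart                            # reload after editing (command name varies by release)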
to allow the services I need, am I missing anything? I assume allowing ssh will also allow scp? (Heck, I will allow sftp as well anyway.) However, my problem is that I am connecting remotely, so the only way I can do what I want is to actually do a
Code:
sudo ufw default allow
then use a list of the services provided by
Code:
less /etc/services
and then deny each service individually? This seems a pain, and if I turn on the firewall with default deny, won't it boot me out of my ssh connection?
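The usual pattern is the other way around: keep the default at deny, but add the allow rules (ssh first) before enabling the firewall, so the rule protecting your remote session is already in place when ufw comes up. A minimal sketch:
Code:
sudo ufw default deny
sudo ufw allow ssh     # also covers scp and sftp, which run over the ssh port
sudo ufw allow http    # plus whatever else you actually serve
sudo ufw enable
ufw's built-in rules also accept already-established connections, so with the ssh rule added first you should not be booted out.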
I am trying to set up a router on my CentOS 5 box. While I have properly configured the IP address, subnet, default GW and DNS, I am now at the point of setting up the proxy. I have added the line "proxy=http:ip_address:8080/" to /etc/yum.conf. The proxy server here does not require any username or password; however, I still cannot ping any website, and when I tried to ping 74.125.45.100 (google.com) it returned the following message:
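Two notes on that, hedged since the real proxy address isn't shown: the yum syntax needs the double slash, and ping uses ICMP, which never goes through an HTTP proxy, so failed pings say nothing about whether the proxy line works. Test with an HTTP client instead.
Code:
# /etc/yum.conf -- "ip_address" left as a placeholder, as in the post
proxy=http://ip_address:8080

# a quick test that does go through the proxy
http_proxy=http://ip_address:8080 curl -I http://mirror.centos.org/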
The original thread was closed because "Sounds as if you are trying to steal a service which you have not paid for. We do not support that kind of activity here on Ubuntu Forums." However, it's not stealing, since I am only going to use this with accounts that I have legitimately paid for. This might not be the right place to post this... if that's the case, I apologize - please move it to the correct location. I know a guy who has a website set up where he can download files from Megaupload with his premium account without signing in. He takes the MU link's ID, e.g. http://www.megaupload.com/?d=xxxxxxxx, and adds it to the end of the URL (http://192.168.1.199/mu/?d=xxxxxxxx), and it downloads using his premium account, which is logged in on the computer he has his site hosted on. We don't get along well and I would rather not ask him how he does it.
How would I set this up on my own computer to use my premium account? I can see this being extremely useful if I need to download some of my artwork or projects from MU but don't want to sign in because I'm on a public computer, or because the computer has MU blocked. I want this to be a private site that only I have access to, since it's my premium account and my money. I am not asking how to circumvent Megaupload's download limit at all (I've already paid for it... no need to circumvent it).
I just need a nudge in the right direction. Thanks in advance for any help you can provide. I already have everything installed on my computer to host a site. I have a simple "Hello World" page running on my webserver right now. I don't need help getting that part set up, just the rest of it. I assume this has something to do with setting up a proxy server - I just don't know how to do that and make it work like I need it to.
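One way this kind of relay is usually built, sketched here with made-up paths and on the assumption that you can export your own logged-in session cookies from the browser into a file on the server, is a small CGI script that fetches the link with those cookies and streams the result back. Keep it behind Apache authentication (htpasswd) so it really is private to you.
Code:
#!/bin/sh
# hypothetical relay CGI, e.g. /usr/lib/cgi-bin/mu (needs +x and CGI enabled in Apache)
# /var/www/private/mu-cookies.txt holds the exported premium-session cookies (an assumption)
echo "Content-Type: application/octet-stream"
echo ""
# called as http://yourserver/cgi-bin/mu?d=xxxxxxxx -- QUERY_STRING carries "d=xxxxxxxx"
curl -s -L -b /var/www/private/mu-cookies.txt "http://www.megaupload.com/?${QUERY_STRING}"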
I just installed AWStats on a new server and can't get Perl to work properly as a CGI in Apache. I've done this before and got it to work, and I've tried a couple of suggestions found via Google with no luck. When I try to access [URL] from my browser, Apache doesn't give any errors; it just outputs the Perl file in plain text, which implies that it's not handling the .pl file as a Perl CGI, but I just don't know why.
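Plain-text output almost always means the directory has no CGI handler attached. A sketch of the Apache 2.2-style config that usually fixes it (the directory path is just an example, and the .pl file still needs a correct shebang plus execute permission):
Code:
ScriptAlias /awstats/ "/usr/share/awstats/wwwroot/cgi-bin/"
<Directory "/usr/share/awstats/wwwroot/cgi-bin/">
    Options +ExecCGI
    AddHandler cgi-script .pl
    Order allow,deny
    Allow from all
</Directory>
Restart Apache after adding it.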
I have installed Fedora 12 x86_64 and vsftpd. I would like to set up a user for FTP so that he/she can only view/edit files in one particular folder (the one that I set up). How would I go about doing that?
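The usual approach is to make that folder the user's home directory and let vsftpd chroot local users into their homes; a minimal sketch, with the path and username invented for the example:
Code:
# /etc/vsftpd/vsftpd.conf
local_enable=YES
write_enable=YES
chroot_local_user=YES        # jail each local user inside their home directory

# create the account with the target folder as its home
useradd -d /srv/ftp/projectdocs -s /sbin/nologin ftpuser
passwd ftpuser
If the login is rejected, the PAM setup may insist the shell be listed in /etc/shells; adjusting that is the usual workaround.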
I am running Fedora 9 in VirtualBox on a laptop. I want to set the Fedora display size so that I will not have to scroll up, down or sideways to view the screen contents. I have tried a couple of options such as the resize box and changing from full size, but these do not help reduce the size of the Fedora display. I am wondering if it has something to do with the rectangular shape of the laptop monitor?
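Inside a VirtualBox guest the clean fix is usually installing the Guest Additions so the guest can follow the window size; failing that, a smaller mode can be forced by hand with xrandr (the resolution below is only an example):
Code:
xrandr               # list the modes the virtual display offers
xrandr -s 1024x600   # pick one that fits the laptop screen without scrolling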
I have been working in Macromedia Dreamweaver for editing HTML and PHP files. I have just moved to a Linux system by installing XAMPP, and my question is that I need the best HTML and PHP editor that supports both design view and code view, like Dreamweaver does.
I have run into the following problem: there is a website I cannot access. When I try to connect to this site I get an error message. I tried all the browsers installed on my system - SeaMonkey, Firefox, Konqueror - and all failed. I don't understand this completely, as I have no problem connecting to the internet otherwise. Now I am thinking that I somehow unintentionally blocked access to this particular site myself. Is that possible? Under Windows I have no problems at all.
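It is possible, and the two places such a self-inflicted block usually hides are /etc/hosts and the firewall; a quick check, with "thesite.example" standing in for the real domain:
Code:
grep -i thesite /etc/hosts        # a 127.0.0.1 or 0.0.0.0 entry here would blackhole it
sudo iptables -L -n | less        # look for DROP/REJECT rules mentioning its IP
host thesite.example              # confirm DNS still resolves it at all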
How would I set up a website that is only accessible locally? There's a router machine (server) that provides internet access for a number of client machines. I need to set up a learning platform (Moodle) locally: the server machine runs the Moodle server (Apache), and students should have access to their accounts locally (no need for it to be accessible outside the LAN). First of all, what would be the best network configuration for this? Sorry for a dumb question, but could I just come up with any domain name if everything stays local within the LAN?
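Yes - if the name only ever has to resolve inside the LAN you can invent one (something that can't collide with a real public domain is safest). A sketch of one way to wire it up, with the name and addresses made up: a name-based vhost on the server plus dnsmasq on the router handing the name out to every client.
Code:
# Apache vhost on the Moodle server
<VirtualHost *:80>
    ServerName moodle.lan
    DocumentRoot /var/www/moodle
</VirtualHost>

# /etc/dnsmasq.conf on the router box
address=/moodle.lan/192.168.0.10    # LAN IP of the Moodle server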
I have an Ubuntu 10.10 server with one web site on it, and I have messed up the NIC configuration. I have two internet connections with static IPs and two network cards, as follows: eth0 119.155.152.140 (1st internet connection, static IP, no firewall) and eth1 203.135.30.240 (2nd internet connection, static IP, no firewall). When I restart networking it gives me the following error
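The error itself isn't shown, so this is only a guess, but with two static NICs the most common mistake is giving both of them a "gateway" line in /etc/network/interfaces - only one default gateway is allowed. A sketch (netmasks and the gateway address are assumptions):
Code:
auto eth0
iface eth0 inet static
    address 119.155.152.140
    netmask 255.255.255.0
    gateway 119.155.152.1    # only one interface should carry the default gateway

auto eth1
iface eth1 inet static
    address 203.135.30.240
    netmask 255.255.255.0
    # no gateway line here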
I am trying to create a shell script (CGI) which will create a website with an embedded flash mp3 player. At the same time I want this script to overwrite the file currently in the playlist with another file which will allow me to change the song currently playing on the flash player. The reason I am doing this is so that I can copy the song I want to play to this file at a low bitrate reducing its size. The code I am using in my script is:
My hosting server is running Linux / Apache. It would be very nice to be able to link some files (preferably hard links, but symbolic links would also help), but I haven't a clue how to do so. I would be willing to write a server-side PHP script if that would do the trick.
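With shell access it's just the ln command; without it, PHP's link() and symlink() functions do the same thing from a script. The paths below are invented for the example:
Code:
# symbolic link (works across filesystems; most hosts allow it)
ln -s /home/me/public_html/files/big-file.zip /home/me/public_html/downloads/big-file.zip

# hard link (same filesystem only)
ln /home/me/public_html/files/big-file.zip /home/me/public_html/mirror/big-file.zip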
I have an AP set up and would like to have all requests for a website sent to a specific IP address, and am trying to get this to work in iptables:
user-->AP-->google.com
No matter what site they try to go to, it takes them to [URL]. I want to use this to require people to log in to my server when they connect to my AP before they can go any further.
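A NAT PREROUTING rule is the usual mechanism for this kind of captive-portal redirect; a sketch, assuming the clients come in on wlan0 and the login server sits at 192.168.1.1 (both are assumptions):
Code:
iptables -t nat -A PREROUTING -i wlan0 -p tcp --dport 80 -j DNAT --to-destination 192.168.1.1:80
iptables -t nat -A POSTROUTING -j MASQUERADE
Once a client has logged in, you would insert an exception for its IP or MAC address above the DNAT rule so its traffic passes through normally.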
I have 10 systems working on a LAN, with fixed IPs given to all 10, and I have Ubuntu installed. "ifconfig" shows my IP on the eth0 interface. Query: I have my blog at /var/www with Apache installed, and it works fine when I give the IP address of eth0 (not 127.0.0.1); it also works from any system when I give that IP address. How can I change the IP address to a simple URL, i.e. use a name instead of the eth0 address? I've tried: 1) adding an entry to hosts and configuring httpd.conf in Apache - did not work; 2) I don't want any BIND, as it should be accessible only within the LAN.
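A hosts entry does work for this, but it has to exist on every one of the 10 client machines, not just on the server - that is the usual reason such an attempt "did not work". With "myblog.lan" and 192.168.1.50 standing in for your chosen name and the eth0 address:
Code:
# run on each client (or use a shared dnsmasq box, as in the Moodle sketch above)
echo "192.168.1.50   myblog.lan" | sudo tee -a /etc/hosts
No BIND is needed for a purely internal name.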
I've actually gotten a webserver (Hardy Heron, server edition) working before and had a Joomla site up that I could find on the net. But I did a fresh install of Lucid Lynx (desktop edition, to which I added some packages to use it as a server). Between the router (wrong IP address?), port forwarding (set right?), my domain nameservers (GoDaddy) and the DynDNS service I use (for my dynamic IP), I must have a wrong setting somewhere, because my Joomla website shows up at localhost but not 'out there', so to speak. I would like to get this site out there and working soon, as it's one I'm doing for free for a community garden.
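A quick way to narrow down which link in that chain is broken, with the hostname below standing in for your real DynDNS name:
Code:
dig +short mysite.dyndns.org            # what the DynDNS name currently points at
curl -s http://checkip.dyndns.org/      # what the outside world sees as your public IP
If the two don't match, the DynDNS updater is the problem; if they match, test port 80 from outside the LAN (a friend's machine, say), since many routers can't loop back to their own public address from inside.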
How do I turn an IP address into a website name that I paid for, using Apache 2? Think stupid: which file and which lines do I edit, and what code do I use? I use Notepad++ to edit system files, with CoreFTP and PuTTY by my side.
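Two pieces are involved, sketched here with example.com standing in for the domain you bought: an A record at your registrar pointing the name at your server's public IP (done in the registrar's control panel, not in a file), and a name-based virtual host in the Apache configuration, e.g. in httpd.conf or an included vhost file.
Code:
NameVirtualHost *:80

<VirtualHost *:80>
    ServerName example.com
    ServerAlias www.example.com
    DocumentRoot /var/www/html
</VirtualHost>
Restart Apache after editing; there is no fixed "line number", just this block anywhere in the config.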
I am having a problem with internet browsing. Whenever I type the address of any website like Google or URL... or any other, Firefox shows "Looking up google.com" or "Looking up yahoo.com" in the status bar for about 20-25 seconds for every website, and then loads the site successfully. Why is this happening and how can I correct it? I also tried this in Opera and Google Chrome and the same problem occurs in them. Moreover, if I ping any domain in a terminal it instantly starts pinging (with TTL 51) and receiving data from that domain, which suggests the problem is with the browsers.
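A long "Looking up..." pause with instant pings often comes down to name resolution order rather than the browsers themselves - typically an IPv6 (AAAA) lookup timing out, or a dead nameserver listed first in /etc/resolv.conf. A couple of things to check, with the Firefox preference being the common workaround of that era:
Code:
cat /etc/resolv.conf      # a slow or unreachable first nameserver causes exactly this delay
cat /etc/nsswitch.conf    # the hosts line is normally "hosts: files dns"
# in Firefox, about:config -> network.dns.disableIPv6 -> true, then retest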
I set up an Apache webserver on a Red Hat Enterprise Linux 6 server last week. It works fine on localhost; however, the webpage can't be accessed from other computers. I didn't modify anything related to 'allow,deny' in httpd.conf. The only thing I've done is add a rule in iptables to allow access from a computer with a specific IP address. Since I am quite new to iptables, I don't know if there is anything wrong with my setup.
Even if I stop iptables, the problem is still there. I don't know if my iptables setup is correct, or whether there is something else I should be doing.
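For the firewall side, the standard RHEL 6 rule and save step look like this (a sketch; if the page is still unreachable with iptables stopped, the firewall isn't the culprit, and it's worth checking that httpd's Listen directive isn't bound to 127.0.0.1 only):
Code:
iptables -I INPUT -p tcp --dport 80 -j ACCEPT
service iptables save
service iptables restart

netstat -tlnp | grep httpd    # should show httpd on 0.0.0.0:80 or *:80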
I don't know if the title is a correct description of the problem I have. I registered a domain name at GoDaddy, created a subdomain and pointed it to some name servers. After some time I moved my website, so I changed the name servers for this subdomain. The problem now is that it is still pointing to the old name servers. Every time I visit the website from my computer I get errors, but when I try it from another computer it works. So I suspect there is something left on my computer that still tells my browser to go to the old name server. I've cleared the browser's cache and temporary files but it still doesn't change.
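Browser cache isn't where stale DNS lives; it's usually the local resolver cache, an nscd/dnsmasq daemon, or an old /etc/hosts entry. A few checks, with the hostnames below as placeholders for your real subdomain and new nameserver:
Code:
dig sub.mydomain.example                          # what your machine resolves right now
dig @ns1.newhost.example sub.mydomain.example     # what the new nameservers answer directly
grep mydomain /etc/hosts                          # an old manual entry would override DNS
sudo /etc/init.d/nscd restart                     # only if the nscd caching daemon is installed
DNS changes can also simply take up to the old records' TTL to expire, so some of this may fix itself with time.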
My new line provider/ISP won't give me a static IP address - it's DHCP or nothing. Is there some trick I can use to allow myself to host a web page? I was thinking of getting a static address elsewhere, like at my sometime place of employment, and redirecting, but that just moves the problem downstream. Realistically, assuming I leave my computer on and hardly ever reboot (CentOS is pretty stable), will I typically keep the same address for weeks or months at a time? If so, I could live with logging in to my registrar a few times a year to change the address, as bizarre as that would be.
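The leases often do stick around for months, but the usual trick is a dynamic DNS hostname kept up to date automatically (by the router, or by ddclient on the CentOS box), with the paid domain CNAMEd to it at the registrar if you want your own name on it. A ddclient sketch with placeholder account details:
Code:
# /etc/ddclient.conf
protocol=dyndns2
use=web, web=checkip.dyndns.org/, web-skip='IP Address'
server=members.dyndns.org
login=my-dyndns-user
password=my-dyndns-pass
myhost.dyndns.org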
I am using Squid 2.6 (as a proxy server) on my CentOS 5 box. The client computers are fetching web pages successfully, and the firewall (iptables) is already disabled. The problem is that we have an internal web-based application in which the users enter data. When a user types its address in the browser, i.e. http://10.1.7.21:81/mis, Squid shows
Code:
ERROR
The requested URL could not be retrieved
While trying to retrieve the URL: http://10.1.7.21:81/mis/
The following error was encountered: Access Denied.
Access control configuration prevents your request from being allowed at this time. Please contact your service provider if you feel this is incorrect.
Your cache administrator is root.
We have another proxy server, MS ISA 2006, and by changing the proxy from the Linux Squid to MS ISA we can access the page.
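Squid's default ACLs only treat a fixed list of ports (plus 1025-65535) as "Safe_ports", so requests to port 81 are refused before any other rule is consulted. Adding the port and reloading is normally enough; alternatively, add 10.1.7.21 to the browsers' proxy-exception list so internal traffic skips Squid entirely. A squid.conf sketch:
Code:
# add next to the existing Safe_ports lines, before "http_access deny !Safe_ports"
acl Safe_ports port 81

# then reload the configuration
squid -k reconfigure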