In the office there is a local network with a Samba + OpenLDAP PDC. The local domain name is company.net. The company decided to create a corporate website on remote hosting, and decided that the site's domain should be company.net, the same as the local network's domain name. So now it is not possible to reach that corporate website from within the company's local network because, as I guess, the BIND9 server installed on the above-mentioned PDC resolves company.net to a local web server. Is there a way to let people on this local network browse the remote site?
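One common fix, assuming the local BIND9 is authoritative for company.net on the LAN, is simply to add a record for the website's name to the internal zone so it resolves to the remote hosting address. A sketch, where 203.0.113.10 is a hypothetical stand-in for the real hosting IP:

```
; internal zone file for company.net (sketch; 203.0.113.10 is a placeholder)
www     IN  A   203.0.113.10
```

After editing the zone, bump the SOA serial and reload (`rndc reload company.net`). If the site must also answer on the bare name company.net, an apex A record with the same address would be needed, which may conflict with local services using that name.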
I am running a CentOS 5.6 server with a website and forum. The website has a contact form, and users can email various people in the organization. I have set up sendmail to deliver any mail like this to my email address on a Windows machine. The problem is that these emails don't carry the sender's original headers; they arrive as coming from localhost on my sendmail. How can I get any mail generated from the Apache site to ALSO be sent to root on the CentOS server? That way I should be able to see the headers and report the spammer or block them.
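One way to keep a local copy of everything the web site sends (a sketch, assuming the messages come from PHP's mail() function) is to point php.ini's sendmail_path at a small wrapper that tees each message, headers included, to a file before handing it to the real sendmail. The wrapper path and log file name below are hypothetical:

```shell
#!/bin/sh
# /usr/local/bin/sendmail-wrap  (hypothetical path)
# in php.ini:  sendmail_path = /usr/local/bin/sendmail-wrap -t -i
# tee appends a full copy of each outgoing message, then passes it
# through unchanged to the real sendmail binary
tee -a /var/log/apache-outmail.log | /usr/sbin/sendmail "$@"
```

If copies should land in root's mailbox rather than a log file, an alternative is adding a Bcc in the contact-form code itself.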
I have two Ubuntu machines (9.10 and 10.04) with an OpenVPN tunnel between them. This is the situation:
Code: NetworkA 192.168.0.0/24 | UbuntuA br0: 192.168.0.3 (OpenVPN bridge between eth0 and tap0) [...]
UbuntuA has only one interface, eth0, and runs two OpenVPN instances: one bridged instance with br0 and another instance with tun0. UbuntuA is not the gateway for NetworkA. UbuntuB is the gateway for NetworkB. I need PCs on NetworkB to communicate with those on NetworkA. This is the "ping situation" (none of the PCs tested has an active firewall):
ubuntuA to ubuntuB: OK. ubuntuB to ubuntuA: OK. PCs on NetworkA to ubuntuA and ubuntuB: OK [...]
I've been on a quest to enable full routing through my OpenVPN tunnel between my office and the colo. Masquerading will work, but it throws off anything key-based and makes a lot of things more difficult and vague in general. Is there an easy way to do this via iptables? I tried using Quagga, hoping it would magically solve my problems, but it does not seem to do my routing for me. I just did a basic static route within zebra...
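For routing without NAT, the usual ingredients (a sketch; all subnets below are hypothetical stand-ins for the office LAN, the colo LAN, and the tunnel network) are plain FORWARD rules instead of MASQUERADE, plus routes on both ends so replies know the way back:

```shell
# on the office gateway: allow forwarding both ways, no address rewriting
iptables -A FORWARD -s 192.168.1.0/24 -d 10.0.0.0/24 -j ACCEPT
iptables -A FORWARD -s 10.0.0.0/24 -d 192.168.1.0/24 -j ACCEPT
# send traffic for the colo subnet into the tunnel
ip route add 10.0.0.0/24 via 10.8.0.1 dev tun0
```

The colo side needs the mirror route back to the office subnet; with OpenVPN itself that is normally done with a `route` statement plus an `iroute` entry in a client-config-dir file, no dynamic routing daemon required.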
I have installed a proxy server on Ubuntu. I have done every step needed to set up the proxy server, and the Internet works fine through it, but the sites that should be blocked are not being blocked; they still open. I have listed the sites I need blocked in the block_dstdomain file in the proxy.
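Assuming this is Squid, a common cause is rule ordering: the deny rule must come before the first allow rule that matches the client, because http_access rules are evaluated top-down and the first match wins. A sketch (the file name comes from the post; a typical path and ACL name are assumed, and entries in the file should be written like `.example.com` so subdomains match too):

```
acl blocked_sites dstdomain "/etc/squid/block_dstdomain"
http_access deny blocked_sites
# ...only then the allow rules, e.g.:
http_access allow localnet
```

After editing, reload with `squid -k reconfigure`.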
I am trying to set up a web-based FTP site on my home server. I got FTP up and running and I can log into it using an FTP client, but I want to set it up so I can get to it from the web. I put the directory in the www/html folder, but that does not seem to help.
If anyone could point me in the right direction that would be great. I also need to let anonymous users get access to it.
I have three locations: a central office connected to two remote locations. At the central office I run two site-to-site VPNs on a Cisco ASA 5505. The remote end of the first site is a Checkpoint firewall, and the remote end of the second site is racoon on Debian. Both sites are up and working. However, where at the first site traffic goes both ways, at the second site it only works from the central office to the remote office.
For example, I can ssh from a host in the central office to a host in the first remote site (through the Checkpoint firewall), then ssh back from that host at the remote office to any host in the central office. In contrast, after I ssh from a host in the central office to a host in the second remote office (through racoon), I cannot reach the central office hosts (pinging the IP address of a central office host, ssh, etc. all fail). The VPN settings at the central office (the Cisco ASA 5505) are identical for both. So it seems to me that some routing magic is missing on the host running racoon at the second remote office. Where would such a setting reside? racoon config files? iptables?
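With racoon, the tunnel selectors live in the kernel's Security Policy Database, loaded with setkey (usually from /etc/ipsec-tools.conf), not in racoon.conf itself; a missing or one-directional policy produces exactly this asymmetric behavior. A sketch of what the policy pair might look like — every subnet and gateway address here is a hypothetical placeholder:

```
# /etc/ipsec-tools.conf (sketch)
spdadd 10.2.0.0/24 10.1.0.0/24 any -P out ipsec
    esp/tunnel/203.0.113.2-203.0.113.1/require;
spdadd 10.1.0.0/24 10.2.0.0/24 any -P in ipsec
    esp/tunnel/203.0.113.1-203.0.113.2/require;
```

`setkey -DP` dumps the policies currently installed, which is a quick way to check whether the inbound half is missing.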
I have started Apache and can see my site on 127.0.0.1, but I am behind a router and don't know how to check my site from outside (from the Internet). I have no domain name registered; I just want to check the web server.
That should be easy. (or not?)
The router has the static IP xxx.xxx.xxx.239 from my Internet provider and assigns 192.168.1.100 to my computer.
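Assuming the router is configured to forward TCP port 80 to 192.168.1.100 (usually called port forwarding or a virtual server in the router's web UI), the site can then be checked from any host outside the LAN; testing from inside the LAN often fails unless the router supports NAT loopback. A quick check from an external machine, or a phone on mobile data:

```shell
# use the router's real static address in place of xxx.xxx.xxx.239
curl -I http://xxx.xxx.xxx.239/
```

No domain name is needed for this; the raw IP is enough.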
It works only from the internal LAN. I have a secondary IP address added for the SSL site, and we are NATing that address from the outside. I can hit it if I change my hosts file to the local IP on the server itself, but when using the external NAT address it never hits the server. Yet if you telnet to that external address on port 80 or 443, it connects.
Here is a sample of my config with servername and ip addresses changed:
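Since the TCP ports answer externally but the site never responds, one thing worth checking is whether the request reaches the wrong virtual host or internal address rather than not arriving at all. A hedged diagnostic, with a placeholder name and address, that forces curl to send the expected Host/SNI name while connecting to the external NAT address:

```shell
# www.example.com and 203.0.113.10 are placeholders for your SSL
# site name and the external NAT address
curl -vk --resolve www.example.com:443:203.0.113.10 https://www.example.com/
```

If this works while a plain browser request doesn't, the problem is name resolution or the vhost's ServerName, not the NAT itself.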
I have an httpd server installed and I need to redirect all requests from clients that start with http://abc.*.* on to [url]. I know this has to be done with rewrite conditions, but I'm at a loss coming up with the right condition. This site doesn't have any virtual hosts configured; currently I can only access the Apache test page.
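A sketch of the rewrite pair, with a hypothetical example.org standing in for the [url] target from the post; mod_rewrite must be loaded, and since no virtual hosts exist this can go in the main server config:

```apache
RewriteEngine On
# match any Host header beginning with "abc."
RewriteCond %{HTTP_HOST} ^abc\. [NC]
RewriteRule ^ http://example.org%{REQUEST_URI} [R=301,L]
```

The `%{REQUEST_URI}` suffix preserves the requested path on the target; drop it to send everything to the target's front page.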
I installed, for the sheer pleasure of it, a web server: Apache2, the Postfix mail server, MySQL, the MyDNS nameserver, PureFTPd, SpamAssassin, ClamAV, etc. (I have tried to use the config panels ISPConfig and Webmin.)
The situation: there are 3 sites on the server, all 3 on different virtual hosts (obviously). HTTP access goes through port 20080 since my ISP blocks everything below 1024 (the bastards). I use a Linksys router (192.168.1.1) with DMZ pointing to 192.168.1.100 (the server). The sites on the server:
inphone.be, fraksken.be, and fraksken.is-a-geek.org (but apparently not geek enough). I also gave them their own IPs: 192.168.1.95, 192.168.1.96, 192.168.1.100.
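Separate IPs aren't actually needed here: name-based virtual hosting lets all three sites share one address and port, with Apache choosing the site from the Host header. A sketch for one of the sites on the non-standard port (the DocumentRoot path is hypothetical):

```apache
NameVirtualHost *:20080
Listen 20080

<VirtualHost *:20080>
    ServerName fraksken.be
    DocumentRoot /var/www/fraksken.be
</VirtualHost>
# ...one <VirtualHost> block per site, same pattern
```

Note that visitors will need the port in the URL, e.g. http://fraksken.be:20080/, unless something in front forwards port 80 to 20080.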
I've moved a web site from one server to another, and I'm also moving the domain name to the new server. In Apache I've got the web site configured, up and running with no problems. What I'd like to know is whether this is possible — I'm wanting to avoid a complete web site rewrite. I've looked into mod_rewrite, but I don't think that's right. I only want the web site to look aesthetically pleasing, so if it's not possible, so be it. (I'm not a programmer, but I do have access to the code.)
My requirement is to install Ubuntu SERVER 8.04.3 LTS on a Dell 2550 machine.
Problem: a bug in Ubuntu (desktop and server) stops the CD install on this hardware platform. It hangs halfway through, and it is something to do with the SCSI CD drive. There is lots of material on the net about it, but no fix as far as I can see, at least nothing that works for me.
- So, I tried booting off the CD and then used F6 to add a cdrom-detect/try-usb=true line, together with putting a USB stick with the ISO burnt onto it in the USB port. It still hangs during install (can't see the USB, perhaps? USB not correctly burnt? I tried many, many times though, and eventually gave up at 2am).
- Next I tried PXE boot. Now I'm getting somewhere. I installed tftpd32 on my Windows machine and copied the netboot files from the Ubuntu site to the tftpd directory. (I don't think this will help me install the server edition, just the desktop??) Anyway, it booted from the Windows tftp machine (wahey!) and went through the install, BUT when it tried to find the mirror site it couldn't. In fact it couldn't find ANY mirror site. (Internet access from Windows and all other PCs on the network works fine.)
So, to ensure I am installing the SERVER version, and to use PXE boot, and to NOT use the internet to download the files (i.e. load them up locally somewhere), what do I do?
I'm new to Linux and I recently decided to try out Ubuntu. I used the Wubi installer so it's on Windows (just to see if I liked Ubuntu) and it works fine, but with one exception: the internet (or maybe just Firefox) isn't working. The internet connection worked just fine on XP. On Ubuntu, it says that it was able to connect to the wireless network, but when I try to go to any page on Firefox it just keeps trying to load perpetually and never connects to the site, eventually just showing the "Server not found" page.
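A quick way to tell whether this is a connectivity problem or a name-resolution problem (a common split with wireless setups that associate fine but can't resolve names) is to test each layer separately from a terminal:

```shell
ping -c 3 8.8.8.8        # raw IP connectivity, no DNS involved
nslookup ubuntu.com      # name resolution
```

If the ping succeeds but the lookup fails, the wireless link is fine and the DNS servers handed out by the router (or listed in /etc/resolv.conf) are the place to look.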
A while back, I put a site up under a LAMP setup, and followed a guide from ubuntuforums that I googled to set up SSL encryption for the site.
That site works great, but since then I've added some other sites to the same LAMP server. They load fine as well, but if I type https:// before going to one of the latter sites, the browser is sent to the first site instead and warns that the certificate is fraudulent and that I'm at risk by going to the site.
Obviously, it isn't an attack site, the certificate is just set up for only one domain. How do I prevent my non-SSL sites from redirecting to the SSL-encrypted site?
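For context, this isn't really a redirect: only one virtual host listens on port 443, so every https request lands on it and is answered with that site's certificate. If the Apache and OpenSSL builds support SNI, a sketch of the fix is one SSL vhost (with its own certificate) per name; all names and paths below are hypothetical:

```apache
# one of these per https-capable site, each with its own certificate
<VirtualHost *:443>
    ServerName site2.example
    DocumentRoot /var/www/site2
    SSLEngine on
    SSLCertificateFile    /etc/ssl/certs/site2.pem
    SSLCertificateKeyFile /etc/ssl/private/site2.key
</VirtualHost>
```

Without SNI, the honest options are one IP address per SSL site, or a catch-all default 443 vhost that refuses requests for names the certificate doesn't cover.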
I am setting up the mail server for my site. The situation is such that access to the mail server must be allowed through the site (built on LAMP). Is the Postfix + Dovecot combination a good option? Or does anyone have a similar mail server config to share?
I'm going away for the weekend and will only be able to bring my laptop, which has the latest release of Ubuntu on it. My question is: will I be able to access my company's Terminal Server site [URL] from a browser in Linux, and be able to launch the applications (Outlook, etc.)? It seems to connect to Windows Server 2008. If this works in Linux, it would make my life a lot easier.
If a site reaches a point where one machine simply isn't enough to handle it, what exactly is done? I've heard of "round-robin DNS load balancing", but that's obviously not going to work if the site has CGI scripts or a substantial backend of any kind, as that data would have to be copied over to every server, and I'd guess that's not feasible. If the site uses a MySQL server, and a second web server is put up to take over the extra traffic from the first (how would traffic be redirected to it?), then when it makes a database query it still has to connect to the first server, so we're not really solving the problem at all. So, how is it done for large, high-traffic sites (possibly like LQ itself)?
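The usual shape of the answer is to split the tiers: a load balancer or reverse proxy in front spreads HTTP requests across identical web servers, which all talk to a shared database tier, so nothing has to be "copied over" per request. A minimal HAProxy sketch of the front tier (all addresses hypothetical):

```
frontend www
    bind *:80
    default_backend webfarm

backend webfarm
    balance roundrobin
    server web1 10.0.0.11:80 check
    server web2 10.0.0.12:80 check
```

The database then becomes its own scaling problem (replication, read/write splitting, caching), but that is separate from adding web servers.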
I have a simple issue that I think can be solved with several different methods. I basically want to create a personal server solution that allows me to do two things:
1.) I want to be able to remotely backup data to my server.
2.) I want to be able to pass traffic through it and use it as a proxy.
I am off to college next year and I want to leave a computer/server back home to do the two things stated above. I was thinking of using an Asus Eee Box PC like this: url
I want a low power reliable machine that will only be used as a remote solution. I won't be hooking up a monitor to it (that is, after I set it up).
It will be on 24/7 for easy access.
I will be accessing this server from a Windows 7 based machine.
I do not mind at all installing Linux on my server, but I am not an experienced coder so I will need software with a GUI that can help me set this all up.
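Both goals can be met over a single SSH service on the home box, which keeps the setup small. A sketch — the hostname and paths are placeholders, and from Windows 7 the same can be done with PuTTY's tunnel settings plus a GUI sync tool instead of these commands:

```shell
# 1) remote backup: push a folder to the home server over ssh
rsync -avz ~/Documents/ user@home.example.net:/srv/backup/documents/

# 2) proxy: open a local SOCKS proxy that tunnels traffic through home
ssh -D 1080 -N user@home.example.net
# then point the browser at SOCKS5 proxy localhost:1080
```

On the server side nothing beyond OpenSSH is required, and a dynamic-DNS name solves the changing home IP.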
I have a web server (Apache) with 3 sites hosted on it, named www.web1.com, www.web2.com and www.web3.com. I need to block www.web2.com from Internet users and allow it only on the local network. At the same time I need to allow www.web1.com and www.web3.com for both Internet and LAN users.
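A per-vhost access rule handles this. A sketch in Apache 2.2 syntax (matching the era of the other posts here); the docroot and LAN subnet are hypothetical:

```apache
<VirtualHost *:80>
    ServerName www.web2.com
    DocumentRoot /var/www/web2
    <Directory /var/www/web2>
        Order deny,allow
        Deny from all
        Allow from 192.168.1.0/24
    </Directory>
</VirtualHost>
```

www.web1.com and www.web3.com keep their own vhosts without any Deny rules, so they stay reachable from everywhere.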
Slow access to a web site using Squid and Internet Explorer. I am trying to troubleshoot an issue I am stuck on. We have a website that loads .htm documents extremely slowly when using Internet Explorer 8 behind Squid. When we bypass the proxy and go directly out to the Internet, all is fast and pages load fine. But when the proxy is on, documents will sometimes take up to 6 minutes to load. This issue only appears with Internet Explorer 8; I do not see it when using Firefox with Squid. I have tried the no_cache directive, thinking it might have been the cache, but that didn't work either. I am attaching our access.log, store.log and squid.conf.
Normally, on my website, files are either handled by WordPress or by me doing FTP. I'd like to copy my entire site to a new folder on the server. I don't want to copy it down to my local drive (with wget) and then just upload it again. How close is this to the line I'd need (except near midnight)?
mv -r fromfolder \%todaysdate%
I've played with Unix and Linux for a few days over the years, but I'm a Windows (and DOS-prompt) guy, so I don't know how to get to the server's command prompt on my 1and1-hosted site.
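If the host allows SSH (shared plans sometimes only offer it on higher tiers), the server-side copy is one cp; mv would move rather than copy, and mv has no -r flag anyway. A sketch that creates a dated copy — the first two lines just build a stand-in docroot so the example is self-contained:

```shell
today=$(date +%F)                       # e.g. 2011-06-01
mkdir -p public_html                    # stand-in docroot for this demo
echo '<html>hi</html>' > public_html/index.html
cp -a public_html "public_html-$today"  # -a copies recursively, keeping times and permissions
```

On the real site you would skip the two stand-in lines and run the cp against the existing folder. The near-midnight caveat from the post still applies: capture the date once into a variable rather than calling date twice.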
I have no idea why, but my website times out as soon as you try to use the cart (which invokes SSL). It seems to have happened after I went live from testing, but I checked my configuration and I can't see anything awry in my apache2 confs. Has anyone run into this before? I don't know how or why this happened, or how to even begin fixing it.