Server :: Access The Log Files For Multiple Websites?
Apr 1, 2010
I have managed to configure AWStats on one server to read the log files for multiple websites on multiple servers, and I even wrote a little PHP front end to access all the different conf files. I want to know if there is a way of combining two conf files for the same site that sit on two different servers in a load-balanced situation. My conf files look something like: [URL].. I have to monitor bandwidth for our websites, and it gets tedious switching between the various conf files for the same site and totalling it up, so I was hoping I could combine the output of both conf files on screen using some funky command in the awstats.pl script.
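AWStats ships a helper, logresolvemerge.pl, that can merge several access logs on the fly, so a single conf file can cover both load-balanced servers. A rough sketch of the idea (the conf name and log paths below are made up, assuming both servers' logs are reachable from the AWStats host):
# in one combined conf, e.g. awstats.www.mysite.com.conf (hypothetical name)
LogFile="/usr/share/awstats/tools/logresolvemerge.pl /logs/server1/access_log /logs/server2/access_log |"
awstats.pl then reports the merged totals, so the bandwidth figure only has to be read once per site.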
View 1 Replies
Aug 1, 2011
I currently have a single mysql instance for a couple of websites. Both websites require that the user provide a username.
The thing I just noticed is that two people might choose the same username in the two applications. Both of them would then have grants on each other's tables, although neither would ever be able to access them directly, since all access goes through the PHP scripts.
But I'm now guessing this is not best practice. What does one do? Run multiple instances of MySQL for several websites?
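A common pattern, rather than running several MySQL instances, is one database and one application account per website, with grants scoped to that database only. A minimal sketch (database names, usernames and passwords are invented):
mysql -u root -p <<'SQL'
CREATE DATABASE site1_db;
CREATE DATABASE site2_db;
-- one application account per site, limited to its own database
CREATE USER 'site1_app'@'localhost' IDENTIFIED BY 'change-me';
CREATE USER 'site2_app'@'localhost' IDENTIFIED BY 'change-me-too';
GRANT ALL PRIVILEGES ON site1_db.* TO 'site1_app'@'localhost';
GRANT ALL PRIVILEGES ON site2_db.* TO 'site2_app'@'localhost';
FLUSH PRIVILEGES;
SQL
End users of the two websites then live in each application's own user table, so duplicate usernames across the sites never touch MySQL's grant system.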
View 4 Replies
View Related
Sep 25, 2010
I've currently got a centos server setup in my home. It has 1 website running on it and I am using DynDNS.com's service to access the server. DynDNS works by redirecting a free url to the IP address of your server. I would like to add a second website under a different url. However, I'm not sure how to add a second IP address (which I would then use with DynDNS to create a new url).
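A second IP usually isn't needed for this: Apache can serve several hostnames on one address with name-based virtual hosts, so a second DynDNS hostname pointing at the same IP is enough. A rough sketch (hostnames and paths are placeholders):
NameVirtualHost *:80
<VirtualHost *:80>
    ServerName site-one.dyndns.org
    DocumentRoot /var/www/site1
</VirtualHost>
<VirtualHost *:80>
    ServerName site-two.dyndns.org
    DocumentRoot /var/www/site2
</VirtualHost>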
View 3 Replies
View Related
Aug 25, 2010
I am currently running 10.04.1 and have successfully set up my home web server to run a single website. My current settings are:
-I have registered the domain name annarrankings.com through godaddy
-A record is - host = @ and points to = 71.114.220.3
-CName is - host = www and points to = @
-on my server I have the site running in /var/www
I've done some research and found that to run multiple websites I need to setup VirtualHost.
-So I created a folder /var/www/annarrankings.com and moved my site to that folder
-Edited Apache2.conf to add the following line
-I then went to /etc/apache2/sites-enabled and copied the default file to a new file called annarrankings.com. Here's the annarrankings.com file after I edited it
-I then created a link in /etc/apache2/sites-enabled to the annarrankings.com file in /etc/apache2/sites-available
-Next I edited /etc/hosts
-When I went to enable the site using a2ensite annarrankings.com I got the following
I figured this was OK since I had already created a symbolic link earlier (a result of trying to follow multiple tutorials and ..... videos at once), so I reloaded apache2. I created an index.html file in /var/www/ just for testing purposes, and when I load www.annarrankings.com I get the file located in /var/www/ instead of the website located in /var/www/annarrankings.com. Do I need to change my A record or CName in godaddy, or did I just do this completely wrong?
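The DNS records look fine; what usually causes this is that the default site is still answering first, or the new vhost doesn't point at the new folder. A hedged sketch of the minimum the annarrankings.com vhost needs, plus the commands to make it win (assuming the stock Ubuntu 10.04 Apache layout):
# /etc/apache2/sites-available/annarrankings.com (sketch)
<VirtualHost *:80>
    ServerName annarrankings.com
    ServerAlias www.annarrankings.com
    DocumentRoot /var/www/annarrankings.com
</VirtualHost>

sudo a2dissite default        # stop the stock /var/www site from catching requests
sudo a2ensite annarrankings.com
sudo service apache2 reload
NameVirtualHost *:80 also has to be in effect (ports.conf normally carries it on Ubuntu).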
View 8 Replies
View Related
May 22, 2011
I was tasked to set up a proxy server to block access to some websites. I'm using CentOS 5 and Squid 7:2.6.STABLE21-6.el5. I appended the following and tested the configuration on the server I am using, and it does seem to work, but now I'm wondering how I can test it with a client computer. I have 2 LAN cards and I just connected the other one to a PC (can a direct connection work, or does it need to pass through a switch or hub?). I just can't figure out how it should be: how do I configure the 2nd LAN card so the client uses this computer as its proxy server?
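A direct cable between the second NIC and the test PC is fine on most modern cards (no switch needed). One hedged way to wire it up, with made-up addresses and assuming Squid's default port 3128:
# on the CentOS box: give the second NIC a private address
ifconfig eth1 192.168.10.1 netmask 255.255.255.0 up
# squid.conf should already have something like:  http_port 3128
# on the client PC: set IP 192.168.10.2, netmask 255.255.255.0,
# then point the browser's proxy setting at 192.168.10.1 port 3128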
View 8 Replies
View Related
Mar 4, 2011
I would like to have a script which I can use to test whether files on certain sites are available and whether their size is bigger than 0 bytes. I do not want to download them, only check them. If I used wget, they would get downloaded, I think.
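A header-only request avoids the download; wget --spider or curl -I both work. A small sketch using curl (the URLs are placeholders):
#!/bin/bash
# report OK only if the URL answers and advertises a Content-Length > 0
for url in http://example.com/file1.iso http://example.com/file2.iso; do
    size=$(curl -sI "$url" | awk 'tolower($1)=="content-length:" {print $2}' | tr -d '\r')
    if [ -n "$size" ] && [ "$size" -gt 0 ]; then
        echo "OK   $url ($size bytes)"
    else
        echo "MISSING/EMPTY $url"
    fi
done
Note that servers which omit Content-Length on HEAD requests will show up as missing; wget --spider "$url" is an equivalent check if wget is preferred.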
View 1 Replies
View Related
Feb 5, 2010
I am setting up a Samba server to operate in a Windows AD domain. I want to set permissions for multiple groups to have different levels of access to one group of files, and it looks to me like plain Unix permissions will not do that? I always hear about how robust Linux is, yet it seems to me that its file permissions model is WEAK compared to Microsoft's?
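Plain owner/group/other bits are indeed limited to a single group, but POSIX ACLs remove that limit and Samba honours them. A rough sketch (group names and path are invented, and the filesystem must be mounted with the acl option on ext3/ext4):
# give two groups different access to the same tree
setfacl -R -m g:editors:rwx /srv/samba/projectfiles
setfacl -R -m g:reviewers:rx /srv/samba/projectfiles
# default ACLs so newly created files inherit the same rules
setfacl -R -d -m g:editors:rwx,g:reviewers:rx /srv/samba/projectfiles
getfacl /srv/samba/projectfiles      # verify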
View 2 Replies
View Related
May 8, 2009
I would like to know if I need multiple IPs to set up two SSL URLs on the same Apache server. Two SSL certificates, one IP - is it possible?
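Classically each SSL certificate wanted its own IP, but with SNI (roughly Apache 2.2.12+ built against a TLS-extension-capable OpenSSL, plus reasonably modern browsers) two name-based SSL vhosts can share one address. A hedged sketch, with hostnames and certificate paths invented:
NameVirtualHost *:443
<VirtualHost *:443>
    ServerName shop.example.com
    SSLEngine on
    SSLCertificateFile    /etc/ssl/certs/shop.crt
    SSLCertificateKeyFile /etc/ssl/private/shop.key
</VirtualHost>
<VirtualHost *:443>
    ServerName mail.example.com
    SSLEngine on
    SSLCertificateFile    /etc/ssl/certs/mail.crt
    SSLCertificateKeyFile /etc/ssl/private/mail.key
</VirtualHost>
Without SNI support, the fallback is one IP per certificate, or a single wildcard/UCC certificate covering both names.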
View 4 Replies
View Related
Jul 15, 2010
I'm setting up a server on Ubuntu 10.04 for development. It all seems to work nicely; I just have one thing that's bugging me. I have a project in /var/www/portfolio, which is, as you may guess, my portfolio. Instead of the link http://localhost/portfolio I'd like to use http://portfolio.nl. So I set up a file 'portfolio.nl' under sites-available and a symlink in sites-enabled containing this: To get this to work on my local machine I set up this in /etc/hosts:
[Code]...
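The elided files usually boil down to something like the sketch below (ServerName and DocumentRoot as described in the post, everything else assumed):
# /etc/hosts on the development machine
127.0.0.1   portfolio.nl

# /etc/apache2/sites-available/portfolio.nl
<VirtualHost *:80>
    ServerName portfolio.nl
    DocumentRoot /var/www/portfolio
</VirtualHost>
Then sudo a2ensite portfolio.nl && sudo service apache2 reload, and http://portfolio.nl resolves locally without touching real DNS.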
View 2 Replies
View Related
Dec 15, 2010
I am trying to add three name-based virtual hosts on a local Apache2 web server (OS: Ubuntu 10.10). The three sites are: www.site1.eka, www.site2.eka, www.site3.eka
First I created a file called virtual.conf in the conf.d directory; its content is:
# we're running multiple virtual hosts.
# NameVirtualHost *:80
Next I created the following files in the sites-available directory. [URL] is as follows:
#site1.eka (/etc/apache2/sites-available/www.site1.eka)
<VirtualHost *:80>
ServerAdmin webmaster@site1.eka
ServerName www.site1.eka
ServerAlias site1.eka .....
When I visit [URL] in the browser, it says "server not found".
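"Server not found" is a name-resolution error rather than an Apache one: made-up test domains like .eka are not in any DNS, so the machine running the browser needs /etc/hosts entries for them, the NameVirtualHost *:80 line probably needs to be uncommented, and the sites need enabling. A sketch:
# /etc/hosts on the machine running the browser
127.0.0.1   www.site1.eka site1.eka
127.0.0.1   www.site2.eka site2.eka
127.0.0.1   www.site3.eka site3.eka

sudo a2ensite www.site1.eka
sudo a2ensite www.site2.eka
sudo a2ensite www.site3.eka
sudo service apache2 reload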
View 3 Replies
View Related
May 30, 2011
I am trying to create multiple FTP sites on one Linux server (using multiple vsftpd-sitename.conf files) with default port '21'.
Below is the sample vsftpd site configuration,
# cat vsftpd-site1.conf
listen=YES
write_enable=YES
[code]....
I am unable to bring up the above vsftpd site on port '21'. Below is the problem:
# vsftpd vsftpd-site1.conf &
[1] 14448
500 OOPS: could not bind listening IPv4 socket
Oddly, I am able to bring up the above FTP site on another port (for example, listen_port=60001). In Linux (vsftpd), can I use the default FTP port '21' for multiple FTP sites?
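Only one process can bind port 21 on a given address, so several vsftpd instances can share port 21 only if each one binds a different IP address with listen_address (one IP or alias per FTP site); otherwise vsftpd's virtual-users setup on a single instance is the usual route. A sketch with invented addresses:
# vsftpd-site1.conf
listen=YES
listen_address=192.168.1.10
listen_port=21
write_enable=YES

# vsftpd-site2.conf
listen=YES
listen_address=192.168.1.11
listen_port=21
write_enable=YES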
View 2 Replies
View Related
Jan 12, 2010
I need to find a solution where I can monitor multiple sites at once and know when they go down. They are fraudulent sites (not mine; I am in the infosec industry).
I have been playing with Nagios and Zabbix all day with not much luck. I have managed to get both installed and running and have researched for hours. I just need to figure out how to add domain names to be monitored, and I am stuck. I know you have to do it in the conf files, but I cannot get a domain to show up.
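In Nagios a domain is added as a host object plus an HTTP service check. A rough sketch, assuming the stock generic-host/generic-service templates and the check_http command from the sample configs (file location varies, e.g. /etc/nagios3/conf.d/ on Ubuntu):
define host {
    use         generic-host
    host_name   suspect-site1
    alias       Suspect fraudulent site
    address     www.example-fraud-site.com
}

define service {
    use                  generic-service
    host_name            suspect-site1
    service_description  HTTP
    check_command        check_http
}
Reload Nagios after saving and the domain shows up with up/down state and notifications.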
View 3 Replies
View Related
Jun 1, 2010
I have this intranet server project going on and now I have moved to 10.04, however there are still some things that I would like to see clarified and get instructions on. I am interested in setting up multiple parallel websites on my Apache server, but I am not sure how to do this exactly. I have a fixed address, rivera.wippies.net, with port 80 redirecting to my server. What I would like to get done is to have multiple websites, independent of each other, on my server. I was thinking of making websites like this:
/var/www/site1 (which would be as rivera.wippies.net)
/var/www/site2 (which would be as rivera.wippies.net/othersite)
/var/www/site3 (which would be rivera.wippies.net/secondothersite)
etc., so that I have multiple "individual" websites on my server. A requirement is that each of these websites could also have SSL encryption available as needed, since some of the websites could hold confidential information.
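Since all three live under the same hostname, they do not actually need separate virtual hosts; subdirectories (or Alias directives) inside one vhost are enough, and SSL then becomes a matching *:443 vhost for the same name. A hedged sketch:
<VirtualHost *:80>
    ServerName rivera.wippies.net
    DocumentRoot /var/www/site1
    Alias /othersite       /var/www/site2
    Alias /secondothersite /var/www/site3
</VirtualHost>
For the confidential areas, a <VirtualHost *:443> block with SSLEngine on and the same Alias lines serves the encrypted versions.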
View 9 Replies
View Related
Jul 24, 2011
I find FTP server software confusing in Linux. Using ServU for Windows as an example, all I need to do is create users via the ServU interface, choose a folder I want that user to have access to along with their permissions, and voila: they can connect to that directory, and that directory only.
But in the land of Linux, it apparently can't be managed this easily. I have a web server with multiple domains, and therefore multiple users need access to their own web root. So with that in mind, what FTP server software should I use (there are plenty out there), and how would I go about creating a user per domain, so that they can log in using FTP to manage their site, and only have access to their own web root and nothing else?
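vsftpd can reproduce the ServU behaviour: one system user per domain whose home directory is that domain's web root, with local users chrooted so they cannot leave it. A sketch (usernames and paths invented):
# /etc/vsftpd.conf (or /etc/vsftpd/vsftpd.conf, depending on distro)
local_enable=YES
write_enable=YES
chroot_local_user=YES

# one account per domain, homed in its own web root
useradd -d /var/www/example.com -s /usr/sbin/nologin example_ftp
passwd example_ftp
chown -R example_ftp /var/www/example.com
Depending on the PAM setup, the nologin shell may need to be listed in /etc/shells for FTP logins to succeed.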
View 2 Replies
View Related
Nov 18, 2010
I had Fedora 12 installed on my laptop and everything was working okay.
After making a fresh install of Fedora 14 I can now only access some websites, others cannot be found.
I have spent hours trying to resolve this without success.
View 3 Replies
View Related
Aug 4, 2011
I just installed Fedora 15 and everything is working OK, except I can't connect to some websites such as Facebook or Hotmail. I tried doing the same on Windows and everything is OK there. I've been searching through the forum about this problem but nothing seems to help. Has anyone else had this kind of problem and managed to fix it?
View 10 Replies
View Related
Apr 26, 2011
I have been studying LAMP for a while now and all my websites are on the localhost. But if I wanted to access my websites from a different computer, how do I do that? LAMP server (192.168.0.2), client (192.168.0.3). Do I need to set up DNS servers as well in my home network for me to access my intranet (LAMP)?
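No DNS server is needed for a two-machine LAN: browsing to the server's IP works as long as Apache listens on the LAN interface, and a hosts-file entry on the client adds a friendly name. A sketch (the name is invented):
# from the client (192.168.0.3):
curl -I http://192.168.0.2/        # or simply browse to http://192.168.0.2/

# optional, in the client's /etc/hosts, for a friendly name:
192.168.0.2   mylamp.home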
View 2 Replies
View Related
Jun 21, 2010
I recently bought a new wireless router and now I have problems connecting to some (secure) websites. I can't log in to Gmail and Facebook, while I am still able to connect to these sites using my old router. I have also tried to connect to these sites using my new router on Windows and it worked with no problem. It is really strange to me, although I don't know that much about networking. I am using Fedora 13 and a TP-Link (TD-W8901G) wireless router, and I have these problems using both wired and wireless connections with this router.
View 5 Replies
View Related
May 4, 2010
I'm using Ubuntu x64 (I don't know which version, but I don't think it matters) and I'm concerned about security with PHP. I remember using lighttpd with some mystic configuration, and the security was perfect for me: if one website got hacked, the others were still safe (kind of). Now with apache2, if I enable safe mode I'm still able to go outside the web directory, and actually I can go really far until the user/group matches. I tested the system with r57shell and I was able to mess up other websites. Is there a way to disallow access to other websites?
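PHP's safe mode is not a real boundary (and is deprecated); with mod_php the usual containment step is a per-vhost open_basedir, optionally combined with running each vhost as its own user (suPHP, mod_fcgid or mpm-itk). A hedged sketch, with names and paths invented:
<VirtualHost *:80>
    ServerName site1.example.com
    DocumentRoot /var/www/site1
    # confine PHP in this vhost to its own tree plus a scratch dir
    php_admin_value open_basedir "/var/www/site1:/tmp"
</VirtualHost>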
View 5 Replies
View Related
Aug 28, 2010
I am in the process of setting up my own Linux gateway/firewall using two NICs, eth0 (external network) and eth1 (internal network). The Linux gateway hands out IP addresses using dhcp3-server and uses iptables to route the traffic correctly. Clients are able to connect and access the internet... everything is working great, HOWEVER I can't access my Apache virtual host websites from the internal network. They work just fine if I access them from the outside world.
I can type the IP of the web server, 192.168.0.201, and it shows the first virtual host listed in my /sites-enabled/000-default file, but I can't use any DNS entries. I don't have any internal DNS servers running. This doesn't make sense, because if I replace the Linux firewall/router with my normal Linksys WRT54G router it works just fine.
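The Linksys works because it does NAT loopback, which a hand-built iptables gateway does not do by default. The simplest fix is to make the names resolve to the internal address for LAN clients, for example with dnsmasq on the gateway (or hosts entries on each client). A sketch, with a made-up domain:
# on the gateway, in /etc/dnsmasq.conf
address=/www.example-vhost.com/192.168.0.201
# then hand out the gateway's own IP as the DNS server in the dhcp3-server config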
View 4 Replies
View Related
Jul 3, 2010
How do I find and replace in multiple files over SSH?
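A find/sed one-liner run over SSH usually covers it; the host, path and strings below are placeholders:
ssh user@remotehost "find /var/www/mysite -type f -name '*.html' \
    -exec sed -i 's/oldtext/newtext/g' {} +"
Drop the -name filter to touch every file, and run with -i.bak first if backups of the originals are wanted.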
View 1 Replies
View Related
May 18, 2009
I am working with a DM355 target board. Here we record the video coming from IP cameras. Now I have to write a C program for copying the recorded AVI files, with date and time, to a NAS server using scp. I wrote a script to copy a single file to the NAS server:
#!/bin/bash
DATE=$(date +%Y%m%d_%H_%M_%S)
mv Camera1.avi Camera1_$DATE.avi
scp Camera1_$DATE.avi root@192.168.1.4:/root/test/
mv Camera1_$DATE.avi Camera1.avi
But I have to write a C program for copying multiple AVI files, with date and time, to the NAS server.
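Before moving to C, the same job can be sketched in shell as a loop over every AVI file, reusing the rename/copy/rename-back cycle of the script above (destination as in the script; the recording directory is assumed to be the current one):
#!/bin/bash
DATE=$(date +%Y%m%d_%H_%M_%S)
for f in *.avi; do
    stamped="${f%.avi}_${DATE}.avi"
    mv "$f" "$stamped"
    scp "$stamped" root@192.168.1.4:/root/test/
    mv "$stamped" "$f"
done
A C version typically wraps the same scp call with system() or popen(), or uses a library such as libssh or libcurl for SFTP/SCP transfers.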
View 5 Replies
View Related
Mar 7, 2010
I just installed openSUSE 11.2 dual-booting with Windows 7 on a laptop. I have the wireless set up and connected, but I can't seem to be able to connect to any websites or ping Google, etc. I'm kind of a Linux newbie, so I'd like to get some advice as to what I can check to get it working. (Windows 7 on the same machine is able to connect - I'm using it now.)
- I've tried both automatic DHCP configuration and also setting the ip/dns server manually, no luck
View 4 Replies
View Related
Nov 21, 2010
My internet connection used to be a direct LAN connection to my provider. Back then, everything would load fine on both Windows and Ubuntu (dual boot). However, a while ago they started needing me to dial using a username and password (over a PPPoE connection). But since then, I haven't been able to browse certain websites on Ubuntu, even though there have been no such issues on Windows. Some example websites are - Ovi's sign in page (although share.ovi.com loads fine, and nokia.com loads fine), Live Mail (works on Chrome(ium) and Opera but not on Firefox (both 3.6 and 4)) and other random websites.
Some of the websites that don't load show timeout messages on Chrome and for some websites, the browser will keep trying to load without an end (I've left it like that even for hours but not noticed anything different happen).
I have tried changing the DNS servers to the ones suggested in the comment. I have even tried booting from a Fedora LiveCD and then changing the DNS to those (and even to the OpenDNS ones), but the exact same thing happens. Here's the output of ipconfig on Windows: Opera's error messages seem to be a little more informative, and they show the following errors in turn:
Secure connection: fatal error (552)
Secure connection: fatal error (40)
Followed by: Opera was not able to connect to the server. The server may be using the unsupported SSL 2 protocol, which is not considered safe enough for secure communication. The site owner should upgrade to TLS 1.0 or newer. Does anyone know why this is happening and how it can be fixed?
Update: I just saw here [URL].. that someone else was having a similar problem and solved it by putting a NetworkManager.conf file in /etc/NetworkManager. What needs to be in that file?
View 1 Replies
View Related
Jun 11, 2010
I'd like to be able to limit access to a particular website based on the time of day. I would also like to be able to password protect this if possible. So for instance, from 7am until 10pm daily I can access URL... but after 10pm it redirects to 127.0.0.1 or something. And can this configuration be protected by only allowing a certain user (other than root) to change the config?
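Without a proxy in the picture, one crude but workable approach is a pair of root cron jobs that add and remove an /etc/hosts override; only root (or whoever can edit root's crontab and /etc/hosts) can change the schedule. A sketch with a made-up site:
# /etc/crontab (sketch)
0 22 * * * root echo "127.0.0.1 www.example.com" >> /etc/hosts
0 7  * * * root sed -i '/www\.example\.com/d' /etc/hosts
Browsers cache DNS for a while, so the cut-off is not instant; a Squid time ACL is the cleaner version of the same idea.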
View 7 Replies
View Related
Dec 14, 2010
I have been using lynx for some time, especially when on a slow connection. I was wondering if there's a way of accessing flash-based websites (completely flash-based) using lynx?
View 9 Replies
View Related
Sep 23, 2010
I have configured my Squid to allow only limited access to websites, but some websites were still accessible via HTTPS, so I removed transparent from Squid. Now what changes do I have to make in iptables?
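A transparent REDIRECT only works for plain HTTP, so HTTPS has to be handled separately: once the proxy is non-transparent, direct web traffic from the LAN can be blocked so clients must go through the proxy, where the domain ACLs also apply to CONNECT requests. A sketch, assuming the LAN side is eth1 and Squid listens on 3128:
# block direct browsing from the LAN; clients must point their browsers at the proxy
iptables -A FORWARD -i eth1 -p tcp -m multiport --dports 80,443 -j REJECT
# make sure the proxy port itself stays reachable from the LAN, e.g.
iptables -A INPUT -i eth1 -p tcp --dport 3128 -j ACCEPT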
View 1 Replies
View Related
Sep 17, 2010
I don't know if this is the right place to ask, but I must ask some questions. Here's my problem: I'm a student in high school, and here we use the Linux (Ubuntu) OS. Every classroom has about 30 PCs connected to the main computer (the teacher's one). So... 3 days ago we were forbidden access to some websites; it says "This domain is Blocked". By the way, the Linux version installed is 7.04 (Feisty Fawn). I tried disabling cookies, which did not work, and I also tried to whitelist some websites, but that didn't work out either.
View 7 Replies
View Related
Jan 21, 2011
I have been trying to get Squid to work so that I can restrict access to a particular web site during certain hours every night. I can't seem to get it working, however. I am still able to access the site. The following are the relevant lines from my squid.conf file:
acl restricted-domain dstdomain "/etc/squid/denied_domains.acl"
acl test time 19:00-20:00
acl bedtime time 22:00-23:59
[code]...
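With ACLs like those, the usual culprit is the ordering of the http_access rules: the deny has to pair the domain ACL with the time ACL and sit above the general allow. A sketch of the lines that typically follow the ACL definitions (the allow rule name depends on the existing config), applied with squid -k reconfigure:
http_access deny restricted-domain bedtime
http_access deny restricted-domain test
http_access allow localnet        # or whatever the existing allow rule is called
http_access deny all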
View 2 Replies
View Related
Jun 16, 2011
I have 16 Linux servers that use /etc/hosts files to see and talk with each other, and I'm adding servers to this pool. It is required to do host resolution via the /etc/hosts files; DNS or NIS are not alternatives. Aside from manually editing each of the 16+ /etc/hosts files every time I add a server, or editing one /etc/hosts file on one server and then scp'ing it to all the other servers, is there any way to edit the /etc/hosts on one server and "push" it onto the other servers that need the new /etc/hosts file?
Everywhere I've looked on the Net, there hasn't been any suggestion except for the options I mention here.
Or am I just whistling in the wind?
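Short of a configuration-management tool, a small loop over the server list does the push; a sketch that assumes SSH keys are set up so scp does not prompt, with invented hostnames:
#!/bin/bash
# push the master /etc/hosts from this machine to the rest of the pool
SERVERS="server01 server02 server03 server04"    # ...and the rest of the 16+
for host in $SERVERS; do
    scp /etc/hosts root@"$host":/etc/hosts && echo "updated $host"
done
Tools like rsync over ssh, pdsh/pssh, or Puppet/cfengine do the same job with less typing, but the loop answers the immediate question.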
View 6 Replies
View Related