Ubuntu Servers :: How Many Connections Can XAMPP Handle
Jul 23, 2010
I would like to know how many connections per second/minute/hour XAMPP can handle. I'm going to run an SMF forum on my box, because I'm not ready yet with a real server solution. System specs: G31M-S2C; 2x2GB Kingston @ 920MHz; E5200 @ 3500MHz; 500GB Seagate @ 7200rpm, all powered by Lucid x64. In time I will migrate to a quad-core, a mobo with RAID support, etc., but at this point I want to know how many connections XAMPP can handle. Because of the forum's nature, I'm thinking of auto-deleting topics with no new replies in 5 days to save space (there will be a minimum of 1 and maybe a maximum of 5 photos per thread, maximum size 2MB each).
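There is no single answer here; the ceiling is set mostly by Apache's MPM limits and available RAM rather than by XAMPP itself. As a rough illustration (these values are typical Apache 2.2 prefork defaults, not recommendations for this box), the relevant section of XAMPP's httpd.conf caps simultaneous connections like this:

```
<IfModule mpm_prefork_module>
    StartServers          5
    MinSpareServers       5
    MaxSpareServers      10
    MaxClients          150      # hard cap on simultaneous connections
    MaxRequestsPerChild 5000
</IfModule>
```

Each prefork child holds one connection and uses real memory, so the practical limit on 4GB of RAM is usually MaxClients times per-child memory, not raw requests per second.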
I want to share my WiFi internet connection over LAN, so I'm trying to set up IP forwarding. An old tutorial tells me to go to Network Settings in YaST to do this, but that applet says that NetworkManager is now handling all of that. How do I enable IP forwarding while NetworkManager handles my internet connections?
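NetworkManager manages the connections themselves, but IP forwarding is a kernel sysctl that can be toggled independently of it. A hedged sketch (the WiFi interface name wlan0 is an assumption):

```shell
# Check whether the kernel currently forwards IPv4 packets (0 = off, 1 = on):
FWD=$(cat /proc/sys/net/ipv4/ip_forward)
echo "ip_forward=$FWD"
# To enable it (as root), independently of NetworkManager:
#   sysctl -w net.ipv4.ip_forward=1
# To make it permanent, add to /etc/sysctl.conf:
#   net.ipv4.ip_forward=1
# LAN clients also need NAT out of the WiFi interface:
#   iptables -t nat -A POSTROUTING -o wlan0 -j MASQUERADE
```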
I have heard that wpa_supplicant in roaming mode, although absent from the various Debian (and Debian-related) forums and wikis, can handle automatic connections to a variety of encrypted and open networks as well as network-manager does... and better when using the MadWifi drivers!
I can't get to my phpMyAdmin on XAMPP. I tried starting it from the terminal, but I get this weird message:
XAMPP: Another web server daemon is already running.
XAMPP: Another MySQL daemon is already running.
XAMPP: Starting ProFTPD...
XAMPP for Linux started.
But I still can't get into phpMyAdmin. What can I do, Linux pros?
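That message means Ubuntu's own apache2 and mysql packages are already holding ports 80 and 3306, so XAMPP never actually starts its bundled Apache, and /phpmyadmin is served (or not) by the system Apache instead. A sketch of the usual fix (paths assume the stock Ubuntu packages and the default /opt/lampp install):

```
# Stop the distribution daemons, then start XAMPP's bundled ones:
#   sudo /etc/init.d/apache2 stop
#   sudo /etc/init.d/mysql stop
#   sudo /opt/lampp/lampp start
# To see which process currently owns port 80:
#   sudo netstat -tlnp | grep ':80 '
```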
I'm using Ubuntu 9.10 64-bit and I want to configure PHP, MySQL and LAMPP. The mysql-server package is already installed, so if I install the MySQL GUI Tools (Administrator, Query Browser, ...) from the Synaptic package manager, will they detect the MySQL that is already installed? I also tried configuring LAMPP by extracting it into my /opt directory and started the server; which MySQL will it detect, the one that was previously installed or the one used by the GUI tools?
I also think the current Ubuntu repository doesn't have the latest build of the MySQL GUI Tools, so how would the above change if I try to download the individual packages?
Overall, what I want is to configure the MySQL GUI Tools, MySQL Workbench, XAMPP and PHP (for this I'm using Bluefish and KompoZer). Please help me out so that I can leave Windows for my web projects.
I'm trying to run XAMPP so that I can test out Drupal. I'm running Lucid Lynx Beta and having a lot of trouble getting XAMPP to run. I downloaded XAMPP version 1.7.3a and installed it, but when I type http://localhost in Firefox to see if XAMPP is working, it just says:
Quote:
This is the default web page for this server. The web server software is running but no content has been added, yet.
I want XAMPP to run and start every time I start the PC. I don't know the correct command to use in crontab; when I want to use it, must I delete the '# m h dom mon dow command' line or not?
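For reference, lines beginning with '#' in a crontab are comments, so the '# m h dom mon dow command' line can stay; it is just a column legend. A sketch of a root crontab entry (assuming the default /opt/lampp install path) that starts XAMPP at boot:

```
# edit with: sudo crontab -e
@reboot /opt/lampp/lampp start
```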
I have recently set up my web server using the LAMPP package for Ubuntu. I'm running Ubuntu Server 10, so my machine boots into the console and not GDM. I also have my machine auto-login as root so I can simply turn my server on and not mess around with logins. I created a simple shell script for starting LAMPP, put it into /etc/init.d/, ran chmod +x file.sh to give it executable permissions, then ran update-rc.d file.sh defaults, but nothing happens on startup. It simply takes me to the console and nothing more happens. I have to start LAMPP manually on every boot.
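A minimal sketch of such an init script (assuming the default /opt/lampp install path, and registered with "update-rc.d lampp defaults"), written as a function here so the dispatch logic can run even without XAMPP present:

```shell
#!/bin/sh
# Sketch of /etc/init.d/lampp -- /opt/lampp/lampp is the assumed install path.
LAMPP=/opt/lampp/lampp

lampp_ctl() {
  case "$1" in
    start|stop|restart) "$LAMPP" "$1" ;;   # delegate to XAMPP's own control script
    *) echo "Usage: $0 {start|stop|restart}"; return 1 ;;
  esac
}

# A real init script would end with:  lampp_ctl "$1"
```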
I'm new to Ubuntu, and I really need to use XAMPP/LAMPP for testing my websites. The problem is that I can't save files to the htdocs directory of my XAMPP.
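One common cause is that /opt/lampp/htdocs is owned by root, so a desktop user cannot write there. A minimal sketch (the path is an assumption based on the default XAMPP install; demonstrated on a scratch directory so it can run unprivileged):

```shell
# On a real system, as root, hand the docroot to your own user:
#   chown -R youruser:youruser /opt/lampp/htdocs
# Demonstration of the permission change on a scratch directory:
DOCROOT=$(mktemp -d)
touch "$DOCROOT/test.php"
chmod -R u+rwX "$DOCROOT"   # owner may read, write, and traverse directories
ls -l "$DOCROOT"
```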
I got my hands on a couple of old servers, an HP tc2021 and a ProLiant ML110. Sure, they're ancient, but I thought they would make a couple of great Ubuntu servers for a new "start-up" business I'm trying out. Now I've got to decide the best way to distribute the load between the two. I'm going to make an internal domain and will probably be running the following: Kerberos, Samba, Apache, Postfix, MySQL, BIND, DHCP, SVN, GCC, and Nagios. So in summary, I'll have the following roles: domain controller, web server, file server, and network monitoring. How would you handle splitting these services between the two servers?
I have been using wamp on an xp box and now have set up ubuntu with the localhost server seeming to be going okay.
As this is just a desktop graphical test server with no real public hosting, I was hoping to find a control panel like in wamp where one can stop/start php, apache, mysql, phpmyadmin and see logs etc.
I currently have a Windows server running with XAMPP installed. I want to try out Ubuntu Server; I am a complete Linux newbie and was wondering if there is a similar package to XAMPP out there with Apache, PHP, MySQL, and some form of FTP server.
I'm trying to set up a virtual web server using Virtual PC and a Nettuts how-to. So I went ahead and downloaded Ubuntu 9.10 Server Edition, but it only comes in 64-bit, and Virtual PC doesn't handle 64-bit. How can I get around this? I have a machine that I could set up as a server, but that is also only 32-bit.
I have been plagued by this for some time: how many times do you need to run chown -R user:www-data (or similar) on your webroot directory? I have been searching via Google and this forum, and I have yet to find a definitive answer on handling uploads and newly created files so that they are usable by apache2. Scenarios vary: some folks put their webroot inside a /home directory, some leave the default location at /var/www. I have a two-part question: why do I often read "Apache runs as user www-data, therefore files need to be readable by it", yet the default install in Ubuntu includes an index.html with the following?
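One pattern that answers the "how many times do I chown" question for good (paths, user and group names are assumptions): group-own the webroot by www-data, set the setgid bit on directories so files created later inherit the group automatically, and grant group write. Demonstrated on a scratch directory so it runs unprivileged:

```shell
WEBROOT=$(mktemp -d)
chmod 2775 "$WEBROOT"             # rwxrwsr-x: setgid makes new files inherit the dir's group
touch "$WEBROOT/upload.txt"
chmod 664 "$WEBROOT/upload.txt"   # owner+group read/write, world read
# On a real system (as root):
#   chown -R youruser:www-data /var/www
#   find /var/www -type d -exec chmod 2775 {} +
#   find /var/www -type f -exec chmod 664 {} +
stat -c '%a' "$WEBROOT"
```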
I have a problem with my network-manager in Ubuntu 10.10. When I dial one of my VPN connections, my other VPN connections become disabled and I can't use them! I tried restarting network-manager and gnome-panel, but that doesn't seem to solve the problem.
How do clients handle offline syslog servers? Will log messages be buffered locally and sent to the syslog server when it comes back online, or will any log data generated during the downtime be lost in cyberspace?
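The behaviour depends on the client. Classic BSD syslog over UDP simply drops messages while the server is down, but rsyslog can spool to disk and retry. A sketch of a disk-assisted forwarding queue in rsyslog.conf (the target hostname is an assumption):

```
# rsyslog.conf: forward everything over TCP, buffering to disk while the
# server is unreachable and retrying until it comes back.
$ActionQueueType LinkedList
$ActionQueueFileName fwdq        # enables disk assistance for this queue
$ActionQueueMaxDiskSpace 1g
$ActionQueueSaveOnShutdown on
$ActionResumeRetryCount -1       # retry indefinitely
*.* @@logserver.example.com:514
```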
[Note: I'm typing this entire thing out. It probably isn't 100% verbatim. And I am using an older version of the kernel because 2.6.30.9.96 was not behaving either.]
Code:
BUG: unable to handle kernel paging request at efd86f5c
*pde = 00000000
I am noticing really odd behaviour after upgrading from 8.10 to 9.10 (via 9.04). My server frequently becomes unreachable. I am using it as an application server, running Apache, JBoss and MySQL. Once the server goes idle, all web connections time out. SSH also times out. Usually the server wakes up on the second SSH attempt, and then everything (web, SSH, etc.) seems to run fine.
This is a server machine with no GUI. Can anyone point me to power management or other such settings I can tune from commandline? I have disabled power management by adding kernel parameter acpi=off. I still have the problem. The first network connection after the machine has gone idle takes a long time. All later connections run pretty smoothly.
I have a small business that I run Squid and Dansguardian on Ubuntu for network proxy filtering, among other things. This works great, but does not block SSL connections on Port 443, such as https proxies. I understand that this is because this type of configuration is a "transparent proxy".
Is there a way to set one up "non-transparent" and, would that filter https?
I cannot blanket block 443, because some sites need it.
I have read that one can re-compile Squid to work with SSL, but not being a super guru, not sure of the implications of doing that.
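In an explicit (non-transparent) setup, browsers are configured to use the proxy directly, so HTTPS arrives as a CONNECT request whose destination hostname Squid can see and filter, without recompiling Squid for SSL interception. A squid.conf sketch (ACL names, subnet, and the blocked domain are placeholders):

```
# squid.conf: explicit proxy; clients must point their browsers at port 3128
http_port 3128
acl localnet src 192.168.0.0/24
acl blocked_ssl dstdomain .blockedsite.example
acl CONNECT method CONNECT
http_access deny CONNECT blocked_ssl   # filter HTTPS by destination domain
http_access allow localnet
http_access deny all
```

This filters by domain only; the encrypted payload stays opaque, which is usually enough for site blocking without touching certificates.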
I want to set up my laptop to allow connections from certain users with passwords, from anywhere over the internet via SSH, but I'm unsure how to go about it. I thought it would just be a case of setting up the OpenSSH server on the laptop and then, using my external IP and PuTTY on another PC outside the network, connecting to the external IP on port 22. I tried this, waited 3 or 4 minutes, and it says the connection timed out.
I have also configured my router to use port forwarding, but this doesn't seem to help much either, and I have LAMP set up to allow connections to the external IP on port 80. The only thing I am able to do is access the laptop through the local network using its internal IP, like 127.198.0.1:22 or something. I was wondering if anybody knows how I would do this, as I really want to be able to access my home computer/laptop from work sometimes, especially if I have work at home which is not with me at the office.
I know this is possible, but I'm quite new to this type of thing and don't know what is going wrong. Have I missed something, or do I need to change any .conf files?
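A few unprivileged checks narrow down where external SSH fails; the commands below are a sketch (addresses are placeholders, not values from the post):

```
# 1. Confirm sshd listens on all interfaces (0.0.0.0:22), not just loopback:
#      sudo netstat -tlnp | grep ':22 '
# 2. From another machine on the LAN, test the laptop's LAN address first:
#      ssh user@<lan-ip-of-laptop>
# 3. Only then test the router's port-forward from OUTSIDE the network;
#    many home routers cannot "hairpin" a connection to their own external
#    IP from inside the LAN, which looks exactly like a timeout.
```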
My setup: Ubuntu Karmic Server Edition 64-bit; Dell PowerEdge T610; 4 internal RJ45 NICs; 1 add-on 1Gb fibre NIC; Samba 3.4.0; two shares intended for WIN clients.
Connections: eth0 (the first internal NIC) is part of a private network of 4 servers connected to a Gb switch. This connection serves as a fast link among those servers for regularly transferring (backing up) large quantities of data. Only I can utilise it, from within the server room (among those servers, obviously).
eth0: 192.168.0.AA1. eth4 is the Intel add-on NIC with a 1Gb fibre connection to the public network of our institution. This is the link/IP my WIN clients have to use to access their shares. eth4: 134.XXX.YYY.ZZ1
My problem: despite having included both interfaces in my smb.conf, only the internal connection via eth0 lives up to my expectations and delivers up to 50 MB/s. All clients trying to connect via eth4 can see and access the shares, but file transfers run at speeds severely below 0.5 MB/s, with lots of aborts and warnings from the WIN file explorer.
So I experimented with the "interfaces" setting in the global section of smb.conf, to no avail. I even bound Samba only to eth4: same problem. The only way to get flawless and fast transfers is through eth0. The Samba log files for the clients I tried do show some errors, but I fear I am unable to interpret them properly.
/etc/network/interfaces
Code:
# This file describes the network interfaces available on your system
# and how to activate them. For more information, see interfaces(5).

# The loopback network interface
auto lo
iface lo inet loopback

# The primary network interface: Intel Gb link via fibre
auto eth4
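For completeness, a sketch of the smb.conf [global] lines that bind Samba to both links explicitly (the subnet and address are taken from the post; whether this cures the slow eth4 transfers is a separate question, since severe slowness with aborts on one NIC also suggests an MTU or NIC-offload problem worth checking with ethtool):

```
[global]
   interfaces = lo 192.168.0.0/24 134.XXX.YYY.ZZ1
   bind interfaces only = yes
```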
I'm curious how I can use a Windows client with two separate accounts to connect, at the same time, to a SMB server hosting two shares. (Provided permissions and accounts are all in order)
Scenario: User1 is always logged onto a Windows client mapped to a Public share on a Linux SMB server. I need a way to keep User1 connected to the Public share and then, when needed, allow User1 to provide User2's credentials to connect to a Restricted share. The only way I've been able to do this is to disconnect from the Public share and then reconnect to the Restricted share using User2's credentials. (This is the issue, because I need to keep User1 connected to the Public share.) Is this a limitation of SMB, or am I missing a configuration? Please point me in the right direction.
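This is a Windows-side restriction rather than an SMB protocol limit: Windows refuses two different sets of credentials to the same server *name* from one session, but it treats a different name or the raw IP as a different server. A sketch from the Windows command line (server name, IP, drive letters and share names are assumptions):

```
net use P: \\server\Public
rem The second credential works because the server is addressed by IP, not name:
net use R: \\192.168.0.10\Restricted /user:User2 *
```

A DNS or hosts-file alias for the server works the same way as the IP.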
OK, so, basically, not so long ago I had a modem + LAN cable kind of internet setup, and my friends and colleagues had no problem connecting to my Apache, ircd, etc.
But a few months ago my ISP changed its policy, and now I have a single cable, plugged directly into the 'eth0' port, which connects to the WAN (static IP) and, through PPPoE, to the net (dynamic IP). (Sorry, my knowledge of networking is close to nonexistent.)
So, now there is a problem. My friends CAN still connect to my FTP and httpd on Windows XP, through both the external dynamic IP and the static WAN IP, but my Slackware box (WAN IP set up with DHCP, PPPoE through pppoe-setup, with the firewall at '0') is refusing access. No signs of a connection show up in /var/log/access_log.
Also, the VoiceChatter server DOES log the connection attempt, but it refuses the connection, sending an 'Auth challenge' and then cutting the connection. (The 'challenge' bit was never there before the new net setup.)
All connections are made through the WAN static IP (though tests with the netwide dynamic IP yield the same results).
I run Slackware 13.1, didn't touch the firewall settings at all, and, as mentioned, the pppoe firewall is set to the '0' value.
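A few hedged checks for the "LAN works, PPPoE doesn't" symptom (interface names are assumptions; with PPPoE the public traffic arrives on ppp0, not directly on eth0):

```
# Are any firewall rules actually matching (watch the packet counters)?
#   iptables -L -n -v
# Is httpd bound to 0.0.0.0, or only to the LAN address?
#   netstat -tlnp
# Do connection attempts even reach the PPPoE interface?
#   tcpdump -i ppp0 'tcp port 80'
```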
I have a question about connections to an Ubuntu server. Is there a way to know whether the terminal computer (which is connected to my Linux server) is using WinSCP to connect (on a Windows platform) or a Linux system? It seems they use the same port (22) and, I think, the exact same protocol (SSH/SFTP); is there a way to differentiate between the two, though? And going further, is it possible to limit connections to Linux terminal computers only and reject requests coming from Windows computers?
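One hedged observation: every SSH client sends a plaintext identification banner before key exchange, and WinSCP's begins with "SSH-2.0-WinSCP". You can observe it per connection, but since the banner is chosen by the client it is trivially spoofed, so OpenSSH offers no secure way to reject Windows clients based on it. A sketch (interface name is an assumption):

```
# Watch client banners arriving on port 22 (run on the server, as root):
#   tcpdump -A -i eth0 'tcp port 22' | grep 'SSH-2.0-'
# A WinSCP session announces itself roughly as:  SSH-2.0-WinSCP_release_...
# An OpenSSH client announces itself as:         SSH-2.0-OpenSSH_...
```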
I have slapd running, but it seems to refuse connections in a very odd way. Wireshark shows that every time the JavaEE client tries to connect, only 2 packets are exchanged. As I understand TCP/IP, the first is just "hello, who's there". The last is a message with ACK and RST set; I think RST means "we're done". At this point I don't think any credentials are checked, so I don't know what could be wrong.
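For what it's worth, a SYN answered immediately by RST+ACK is a TCP-level "connection refused": no process is accepting on that address and port (or a firewall is sending resets), so LDAP credentials never enter the picture. Two quick checks (host and port are assumptions based on LDAP defaults):

```
# Is slapd actually listening, and on which address?
#   sudo netstat -tlnp | grep slapd
# Does an anonymous query work locally?
#   ldapsearch -x -H ldap://localhost:389 -b "" -s base
```

If the local query works but the remote one is refused, slapd is likely bound only to localhost or a firewall sits in between.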
I know Ubuntu can do amazing things, but I was wondering if it can use different Internet connections for different websites.
The setup: we have one unshaped ADSL connection at 4Mbps (the fastest available) that's used for office-related things, Skype, general browsing, etc. We have another ADSL connection, this time shaped and also running at 4Mbps; I want to send all requests to Facebook, Twitter, and download sites like FileServe, FileSonic, Hotfile, etc. to the shaped connection. Can iptables be used to do this? The unshaped ADSL router is connected to eth0 and has an IP of 192.168.0.1; the shaped ADSL router is connected to eth2 and has an IP of 10.0.2.1. The local LAN is connected to eth1 and has a range of 192.168.1.0/24. Can iptables send requests for a certain site (*.facebook.*) to eth2 and other sites (*.google.*) to eth0?
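A hedged sketch of how this is usually done: iptables matches IP addresses, not URLs, so hostname patterns like *.facebook.* cannot be routed directly; they must first be resolved to address ranges (or the split handled by a proxy such as Squid). With known ranges, the standard approach is policy routing: mark the packets, then route marked traffic out eth2 (<site-range> is a placeholder for a resolved network; addresses are from the post):

```
# Mark LAN traffic destined for the shaped sites:
#   iptables -t mangle -A PREROUTING -s 192.168.1.0/24 -d <site-range> -j MARK --set-mark 2
# Send marked traffic to a second routing table that defaults via the shaped router:
#   ip rule add fwmark 2 table 102
#   ip route add default via 10.0.2.1 dev eth2 table 102
# NAT as appropriate on each outgoing interface.
```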