Ubuntu Servers :: Minimal Hardware - Business Creating Webpages
Sep 8, 2010
I am thinking of starting a business creating web pages and supplying 'enterprise server' type solutions while remaining highly cost effective - e.g. recommending that the client use an 'old' PC for the majority of their server needs (LDAP, mail, firewall, web server, etc.). I plan on doing all of this on a Linux platform, so as to pass the cost savings relative to 'Microsoft enterprise systems' on to my clients, thereby making my proposition more attractive. However, I'm not sure the 'old' PC idea will cut the mustard in terms of serving web pages. So my real question is: at what point does the speed of the internet connection become a bottleneck relative to the speed of my CPU? An example, in case I'm not being very clear (which I'm not sure I am):
My old PC has an AMD Athlon chip in it and equally old 30GB and 40GB hard drives (SCSI - did I mention that they are old!). My intention is to set up a system to the above specification at home, to see how long it takes me and to give me an idea of what I should charge clients. I understand all the technology, but initially (to save on personal startup costs) I was intending to use my old PC as my personal gateway to the world; $40 for the domain registration for one year is a considerable saving compared to $20 per month for a hosted service, although as soon as things are running and profitable I would most likely either upgrade my server or move to a hosted service. Are there any tools I can install on the server to measure its power consumption over a given period?
After a long time I tried Ubuntu (9.10) again on my fileserver, and I have some remarks: why does a minimal server installation include X/OpenOffice? I don't need document conversion on a fileserver, and I bet a lot of people don't. Wouldn't it be better to create a new server package and leave minimal truly minimal? Low-memory installs (64MB) don't work unless you configure swap by hand partway through, and 64MB of RAM is a lot in my eyes. I mean, not to be rude, but if I wanted all this I could have installed Solaris instead.
That said, it's stable and running fine. Since it's my home fileserver, I tried to convert my previously created RAID10 mirror on an Adaptec 1200 card to a software RAID5 solution. This is what I did:
I am certainly a Linux newbie. I have a business with a network of PCs whose client files must not get corrupted by viruses and other bad things from the internet, yet I would like to let my employees (who are even bigger newbies than me, on Windows even, and stand a good chance of messing up computers) use the internet at times if they wish. My first thought was simply separate side-by-side PCs: one for the client/business network, and the other to get trashed by the internet.
What I am thinking is a better alternative (and I need to know from you all, who I'm sure are way beyond this newbie, whether this thinking is correct): put something like Puppy Linux 5.0 on small USB flash drives and let each employee have one to use for internet browsing, simple applications, etc. Can I safely assume that browsing the internet with Puppy Linux booted from the flash drive as the OS is NOT going to potentially infect my Windows XP business PCs with viruses?
I just downloaded an Ubuntu minimal CD and installed it. My goal is to create a super-lightweight distro based on Ubuntu with Openbox, SLiM, and Conky, almost like CrunchBang Linux. So how would I go about creating my own distro from my minimal install? Do I need to install the Synaptic package manager? (Ubuntu 10.04 is the best! Just a little too heavy for my netbook!)
I always used to wonder how web hosting companies host their websites. I have a degree in Computer Science, and I saw someone who had their own hosted website. What I am not clear on is how these web hosting companies give their users root logins. Meaning: how can a hosting provider give root login to, say, 1000 users, each with a different IP address?
I was looking at replacing an old Microsoft Small Business Server 2000 with an Ubuntu system, and was considering the following Dell system: [URL]. I was concerned about the "embedded software RAID" and whether Ubuntu will work with it. The requirements are fairly basic: we want a server with 2 x 500GB SATA hard drives mirrored, and we need to connect 8 Windows computers logged into a business domain, accessing mail and a Microsoft Access database.
I'm trying to create an archive of a website's images because the site tends to go offline now and then. The problem is that when you open an image in full view, it is served through a PHP page. I've tried using 'wget -m -A.jpg', but it only saves the thumbnails from the menu page instead of the actual images.
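One thing to try (a hedged sketch, not a definitive fix; the URL is a stand-in for the real site): if the full-size images are linked from intermediate .php viewer pages, let wget follow those pages as well, and clean them up afterwards.

```shell
# Recurse a few levels, accepting both the images and the .php viewer
# pages so wget will parse the viewer pages for links to the full images.
wget -r -l 3 -np -A jpg,jpeg,php http://example.com/gallery/

# Afterwards, remove the leftover viewer pages, keeping only the images.
find example.com -name '*.php' -delete
```

Depth (`-l 3`) is a guess; increase it if the full-size images sit more levels deep.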
I'm looking to implement a website where business partners can download/upload documents. The files and the "partner areas" should be password protected. Are there open source projects / Ubuntu packages readily available for implementing this type of web-based file sharing service?
I'm really desperate, as I have spent the better part of the last 10 hours trying to sort this out before my boss finds out.
When I try to browse to one of our websites, the browser wants to download the file because the server will not process the PHP file; what I get is of type application/x-httpd-php.
What's really odd is that from inside the network where the server is located everything works fine; it's only from the outside that this happens.
Everything was fine until I ran a system update from Webmin that updated a ton of things, including Apache2 and PHP5.
It's a self-hosted server that was running Ubuntu Server 9.10, but I have since upgraded to 10.04.2 LTS, with no luck.
Apache 2.2.14, PHP 5.3.2-1ubuntu4.9, Joomla 1.5 (latest).
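A common cause of raw PHP being served after an update like this is that the Apache PHP module was removed or left disabled. A hedged first step, assuming the stock Ubuntu 10.04 package names:

```shell
# Sketch: reinstall and re-enable mod_php, then restart Apache.
sudo apt-get install --reinstall libapache2-mod-php5
sudo a2enmod php5
sudo service apache2 restart
```

The inside/outside difference may simply be caching (the browser outside re-downloading a stale response), so retest from outside with a cleared cache after the restart.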
I'm using Ubuntu 9.04 and I want to design web pages in PHP, so I installed Apache 2 on my machine. I use my mobile phone to connect to the internet. Whenever the mobile is connected and the wired connection is established, the server runs properly; but when the wired connection is disconnected, my browser no longer shows the localhost home page. How can I configure Apache on my machine?
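If Apache itself keeps running, the usual culprit is name resolution or a proxy picked up from the mobile connection rather than Apache. A hedged check, assuming the stock Ubuntu layout:

```apache
# /etc/apache2/ports.conf (stock layout assumed).
# "Listen 80" with no address binds every interface, so the site stays
# reachable at http://127.0.0.1/ even with no external link up.
Listen 80
```

Then browse to http://127.0.0.1/ rather than a hostname, and make sure the browser has no proxy setting left over from the mobile connection.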
I was thinking of creating an extremely minimal version of Xubuntu using Xfce. I have a Dell Mini 9, a netbook with a wireless-G card that requires bcmwl-kernel-source to work. What I would like to do is use either the alternate CD or the mini.iso minimal install file to perform a command-line-style installation of the system. So far, what I am thinking (from reading this [url] article)
is to start off with these packages:
xorg
slim (if it's still available in 9.10; in short, I want a lightweight display manager)
xfce4
xfce4-goodies
xubuntu-default-settings
bcmwl-kernel-source
aptitude
My opening questions are: Should I go with the mini.iso or the Xubuntu Alternate Install CD (or the Ubuntu one)? What additional packages will I need to make the hardware accessible and fully functional? All I can think of so far would be sound (I'd like to stay away from PulseAudio if possible, as it wreaks havoc with my computer), my webcam, and the memory card slot, if additional packages are needed for it. What other "core" packages should I include in this list? Should I include Synaptic, or other packages, and why? What do I need to take into consideration, since this is both a mains- and battery-powered computer?
See also this post regarding an "Ubuntu-Desktop-Minimal"-type system: http://ubuntuforums.org/showthread.php?t=1155961
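Starting from the mini.iso, the package list above can be pulled in with a single apt-get run after the base install. A sketch (it assumes these package names are available in the 9.10/10.04 repositories):

```shell
# Sketch: install a minimal Xfce stack on top of a mini.iso base install.
# --no-install-recommends keeps the footprint small, which is the point here.
sudo apt-get update
sudo apt-get install --no-install-recommends \
    xorg slim xfce4 xfce4-goodies xubuntu-default-settings \
    bcmwl-kernel-source aptitude
```

Anything missed (sound, webcam, card reader) can then be added piecemeal, which keeps the install closer to minimal than starting from the full desktop and removing things.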
I am trying to set up a simple home file server for media and backups, using an old Atom board I had lying around with 1GB of memory, so I don't want a full desktop. All goes well installing Server 10.10, using LVM for my data disk. However, I wanted some GUI tools since I am not familiar with the CLI, so I installed gdm, xorg, and gnome-core as suggested in some threads and forums. So far so good; it boots into the GNOME desktop, but I can't get sudo access with anything (Synaptic, gedit, etc.): always "incorrect password". I am fine from the console. I reset my user password, no luck; I set up another admin user, and that also works in the console but not on the desktop. I have no idea where to go next and can't find anything that works in the forum.
As I do not want to entrust my Firefox data to Mozilla's servers, and I have a server running anyway, I wanted to set up a server that provides the Weave service. I found this, and know from different sources on the web that it does indeed work: [URL] It is a stripped-down Weave server intended for personal use. All in all, it does not sound too complicated. The developer says:
Quote:
You'll need a relatively recent PHP with SQLite, mbstring and JSON support, and Apache, preferably running SSL. Give it a shot and let me know how it worked for you in the comments.
I have Fedora 14. The only problem is that Fedora does not provide a php-sqlite package. For that I found a repo [URL], so I installed php-sqlite (as well as php-pdo, php-cli and php-common) with
Code:
yum --enablerepo=remi install php-sqlite
As a consequence, all PHP-related packages were replaced by the Remi-provided ones. In addition, I installed the JSON extension to get JSON support. The author describes the setup as quite easy, as follows:
Quote:
SERVER SETUP
Add the following line to your apache config:
Alias /weave /<path to this folder>/index.php
Restart your Apache server. Point your browser at [URL]. Enter "blah" for the username and garbage for the password. Auth will fail, but it will create the db (you can cancel the subsequent request for auth). You should now see a file called weave_db in the directory.
You can create and delete users by running the create_user script.
CLIENT SETUP
In about:config, set extensions.weave.serverURL to [URL]. You can run it under HTTP, but this is insecure and not recommended. If I disable SELinux or set it to permissive mode, I can point my browser at [URL]. Here the trouble starts: after attempting to log in, no weave_db is created. I even tried 777 permissions on that directory. I can run the create_user script with PHP and add a user, but the database does not seem to be recognized; I am unable to log in to the Weave service.
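Since weave_db has to be created by the web server process, a plausible first check (hedged; /var/www/html/weave is an assumed path standing in for wherever the Weave files were unpacked) is that the directory is owned by the apache user and, when SELinux is enforcing, carries a writable httpd context:

```shell
# Sketch: let the apache user own the Weave directory, and label it so
# httpd may write there even with SELinux enforcing.
sudo chown -R apache:apache /var/www/html/weave
sudo chcon -R -t httpd_sys_rw_content_t /var/www/html/weave
```

If the file still does not appear, the Apache error_log usually names the exact path and permission that failed, which narrows it down faster than guessing.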
I'm trying to set up a very minimal X on my 10.04 64-bit version of Ubuntu, which runs nothing but OpenSSH and acts as a firewall/router making a PPP connection. The reason is that I want to set up KVM/QEMU and run a virtual machine. The virtual machine will have a graphical environment and will be connected to my TV so that I can watch movies, stream TV, etc. So far, I have done the following:
I have Webmin installed on an Ubuntu server. I currently have a working Apache server on port 80, but I want to create a virtual host on port 81. I go to Servers -> Apache Webserver -> Create Virtual Host, change the port to 81 and the document root to /var/port81www, then click Create. However, when I go to 192.168.1.5:81 (the local IP; I know I have to port forward, but it's not even working locally), it does not work.
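Creating the virtual host is not enough by itself: Apache also has to be told to listen on the new port, which Webmin may or may not have added. A hedged sketch, assuming the stock Ubuntu layout:

```apache
# /etc/apache2/ports.conf — make Apache accept connections on 81 too.
Listen 80
Listen 81

# The virtual host itself (Webmin writes something equivalent):
<VirtualHost *:81>
    DocumentRoot /var/port81www
</VirtualHost>
```

After editing, restart Apache and confirm with `netstat -tlnp | grep :81` that something is actually listening on the port before blaming the forwarding.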
I installed 10.04 as a LAMP server. I want to be able to create a new intranet site for testing purposes.
When creating a new site within Apache, what is the recommendation for the folder? Within /var/www? Everything appears to want a domain address, and since it's local only, what do I use as a domain?
I have Webmin installed, and I would think creating a virtual server would be my first step, but I am getting hung up on the domain address.
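For a local-only site, any made-up name works as long as the clients can resolve it; a common trick is to invent a hostname and map it in /etc/hosts. A sketch (the name intranet.test and the docroot are illustrative assumptions, not values from the thread):

```apache
# /etc/apache2/sites-available/intranet — a name-based virtual host.
<VirtualHost *:80>
    ServerName intranet.test
    DocumentRoot /var/www/intranet
</VirtualHost>
```

Then add a line like `192.168.1.10 intranet.test` (the server's LAN IP) to /etc/hosts on each client, enable the site with `a2ensite intranet`, and reload Apache.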
I am interested in creating a photoblog with WordPress. Before I jump in, I thought it would be wise to ask a few questions here first rather than getting into trouble and then firing absurd questions left and right. I am not very experienced regarding servers, but not afraid either. I was reading how-tos online regarding installing WordPress on Linux [url], and a few questions came to mind:
1) Like any other server, does the computer that will run WordPress have to be up and running 24/7?
2) Since I will install WordPress on a desktop, should I be concerned about my machine being compromised? I am not an expert on internet security, so this is a big concern of mine.
3) Is it a good idea to install WordPress on a personal desktop at all? Does running WordPress from a different partition of the hard drive (if that is possible) help at all?
I'm trying to use a technique suggested by a fellow at this website:
[URL]
He suggests adding an echo line to the actionban line in order to create or append to a file containing a list of all the IPs that fail2ban has banned, but it doesn't seem to generate any output. Here is the command:
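For reference, the kind of change being described looks like this in a fail2ban action file (a hedged reconstruction; the action file, chain name, and log path are assumptions, and every extra line of a multi-line action must be indented):

```
# /etc/fail2ban/action.d/iptables.conf (excerpt, sketch)
actionban = iptables -I fail2ban-<name> 1 -s <ip> -j DROP
            echo "$(date) banned <ip>" >> /var/log/fail2ban-banned.log
```

Two common reasons the echo produces nothing: the continuation line is not indented, so fail2ban never parses it as part of actionban, or fail2ban was not restarted after the edit.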
What is the recommended filesystem to use when creating a home server/NAS?
I'd be sharing files using Samba, a DLNA server, or some sort of streaming. I'll have two Windows 7 laptops, two Ubuntu desktops, and a PS3 accessing the files. Most of the time the load will be about 75% reads to 25% writes.
I have a little problem. I have a shared folder on an Ubuntu server, Dump. That folder is shared with Samba, and everyone can put files in it. My problem is the following: when someone creates a folder, its ownership is automatically set to (let's take my username, Yann):
Owner: Yann
Group: Yann
Clearly that's wrong. I want the group to be automatically set to "users" so everyone can access the folders on that share. Does anyone know how to change this? chmod and chown are getting a bit boring.
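Samba can force the group and permissions on everything created through a share. A hedged smb.conf sketch (the share path /srv/dump is an assumption):

```ini
# /etc/samba/smb.conf (excerpt) — force everything created via the share
# to belong to group "users", with group-writable permissions.
[Dump]
    path = /srv/dump
    writeable = yes
    force group = users
    create mask = 0664
    directory mask = 2775
```

Alternatively, the setgid bit on the directory itself (`chgrp users /srv/dump && chmod g+s /srv/dump`) makes new subdirectories inherit the group at the filesystem level, which also covers files created outside Samba.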
I recently created a web server to host my website, using an Ubuntu 8.10 based system (with some help from my experienced brother, of course). I now want to create a mail server to go along with my website. In setting up Postfix to work with Gmail's SMTP servers, I ran into a lot of permission errors.
I have an Ubuntu server running Samba, and I would like to share the CD-ROM drive out to the network. I made a share of the /media directory, and it seems to work fine when I insert USB drives; I am able to browse and work with files. However, when I insert a CD-ROM, it automatically mounts at /media/<volume name> and I get a permission denied error when I attempt to access it over the network. I am assuming this happens because the permissions do not include the execute bit and, being a read-only filesystem, I cannot change this. I made the directory /media/cdrom and manually mounted the CD-ROM to it, and I can successfully access it over the network just like the USB drives. So my question is: is there a way to make the CD-ROM automatically mount and unmount at /media/cdrom when I insert and eject discs, instead of at /media/<volume name>? Or maybe just have the permissions automatically set so Samba users can open it instead of just seeing it?
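One hedged approach is to take the optical drive away from the automounter and pin it to /media/cdrom via /etc/fstab (the device name /dev/sr0 is an assumption; check dmesg for the real one):

```
# /etc/fstab (excerpt) — mount the CD-ROM at a fixed, predictable point.
# "user" lets non-root users mount and eject; the media is read-only anyway.
/dev/sr0  /media/cdrom  udf,iso9660  ro,user,noauto  0  0
```

With this entry in place, `mount /media/cdrom` after inserting a disc gives the same fixed path that already worked over the network, rather than a per-volume directory.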
I am looking into creating a web caching server for myself using Fedora 10. I believe I need to use Squid for this, but it seems to have a lot of features. Basically, all I want for now is to cache the web pages that my network users and I use the most, increasing access speed and lowering the load on my internet connection. Can Squid do this, and can someone point me to an article on how to configure such a thing?
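Squid does exactly this out of the box; most of its features can be ignored at first. A minimal hedged squid.conf sketch (the LAN range and cache size are assumptions):

```
# /etc/squid/squid.conf (minimal sketch) — a caching proxy for one LAN.
http_port 3128
cache_dir ufs /var/spool/squid 1024 16 256   # 1 GB on-disk cache

acl localnet src 192.168.1.0/24              # assumed LAN range
http_access allow localnet
http_access deny all
```

Clients then point their browser's proxy setting at the server on port 3128; repeat visits to the same pages are served from the local cache.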
I'm putting a server together and have run into a boot-up problem. (I thought about putting this in the server forum, but it might be a more generic problem that others have seen and know how to rectify.) The install seems to have gone just fine. I have the /boot partition on an internal IDE drive. The rest of that drive and another are mirrored in a RAID1 configuration (using Linux software RAID) for data storage. The swap partition is part of the RAID5 SCSI array that also holds the / (root) partition.
After installation it would not finish the booting process. I suspected that GRUB didn't like all the RAID arrays and such, but it seems to be fine. I can say that because the machine will boot into rescue mode with the GUI splash screen, and I have access to the whole directory tree. I have already searched online and, following prudent advice, ran yum update while in the chroot /mnt/sysimage mode. That took overnight to download and most of this morning to complete. Still no dice. I used vim to delete the rhgb quiet options in grub.conf so I could see where the kernel seems to be hanging.
So right after "Creating initial device nodes" there is a line about my generic PS/2 wheel mouse. I tried a USB mouse: got more output. Then I swapped in a USB keyboard: got a little further, with more information about input devices, but it still stops. I also tried a PCI video card just to make sure the onboard video wasn't the problem; no change. So, if someone in the Fedora community knows what loads or is configured right after the mouse and keyboard, I might be able to figure out what's causing the computer to hang during boot.
I need to create a lot of users locally on my server. I have this info: username:GID:UID. How can I write a for loop to run multiple useradd commands? (useradd -u UID -g GID -m -d /home/USERNAME -s /bin/bash USERNAME) I tried to do this:
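A hedged sketch of such a loop, reading username:GID:UID records from a file (the file name users.txt and its sample contents are illustrative). It echoes each useradd command first so you can verify the result; remove the `echo` (and run as root) to actually create the accounts:

```shell
# Hypothetical input file: one "username:GID:UID" record per line.
cat > users.txt <<'EOF'
alice:1001:2001
bob:1002:2002
EOF

# Split each record on ":" and build the useradd command.
# Dry run: prints the commands instead of executing them.
while IFS=: read -r name gid uid; do
    echo useradd -u "$uid" -g "$gid" -m -d "/home/$name" -s /bin/bash "$name"
done < users.txt
```

Note that the groups (GIDs) must already exist before useradd will accept `-g`; a similar loop over groupadd can create them first.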
I'm using FC10 and I want to create a symlink to the movies directory in my home folder.
This is what I did: in /var/www/html I created the symlink with `ln -s /home/username/movies movies`.
Then in /etc/httpd/conf/httpd.conf:

DocumentRoot "/var/www/html"

<Directory />
    Options FollowSymLinks
    AllowOverride None
</Directory>

<Directory "/var/www/html">
    Options Indexes FollowSymLinks
    AllowOverride None
    Order allow,deny
    Allow from all
</Directory>

<Directory "/home/username/movies">
    Options Indexes FollowSymLinks
    Order allow,deny
    Allow from all
</Directory>
I restarted Apache, and the test page works.
The directory /home/username/movies has the following permissions: drwxrwxrwx 2 apache apache 4096 2009-03-05 23:43 movies. When trying to access my web page at localhost/movies, I get a 403 Forbidden error. OK then: `sudo -u apache ls /var/www/html` works, but `sudo -u apache ls /var/www/html/movies` returns permission denied, as does `sudo -u apache ls /home/username/movies`. Is the apache user chrooted by default? SELinux is in permissive mode. What can I do?
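With SELinux permissive, the usual remaining cause is the search (execute) bit on a parent directory: to follow the symlink, apache must be able to traverse every directory on the path, including /home/username itself, which is often mode 700. A hedged sketch of the fix (the path is the one from the post):

```shell
# Grant "other" users search permission on the parent directory so the
# apache user can traverse into the symlink target. The movies directory
# itself is already drwxrwxrwx, so nothing more is needed there.
chmod o+x /home/username

# Verify traversal as the apache user:
sudo -u apache ls /home/username/movies
```

This grants traversal only, not the ability to list or read /home/username itself, so the rest of the home directory stays private.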