Ubuntu Servers :: Setup A Secure And Reliable Server?
Dec 27, 2010
How do I set up a secure and reliable server? I have three Ubuntu 10.10 servers: a Dell PowerEdge 850, 1850 and 2850, the last of which has a Dell PowerVault 220s attached. The Dell PE850 server consists of:
Intel Pentium D 3.0GHz
4 GB RAM
Eventually 2x250GB SATA hard drives
I would like to set up a reliable web server, mail server, DNS and Dynamic DNS, DHCP, SQL, FTP, Samba (with roaming profiles) and a PXE boot server. I know how to set up most of the server modules; I would just like to know the best way to do it. I also want to know how to set up the security of the system correctly, and how to partition my hard disks for the best reliability, even when a server crashes. In a sense, I would like to know how to set these servers up from start to finish.
I am going to set up a file server on Ubuntu. I have searched for a while, but can't seem to find a guide to what I want. The requirements are the following:
1. File server: possible to upload, change and download files.
2. Linux (Ubuntu) clients, and Windows clients if possible.
3. Access restricted to registered users only.
4. Only the owning user should be able to read the contents of the files. Ideally root should not even be able to see the individual files, but in the worst case it is OK for root to see the files; root should not be able to open them.
Points 1-3 are easy to find out how to set up, but I can't seem to find a way to deny root the ability to view the files. The only solution I can think of is to encrypt the files or a whole folder, but I don't know how to set that up.
The setup is for a home network, but the server used as a file server will have a web server as well. If someone manages to get access to the server I don't want them to be able to read the files.
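The usual answers here are eCryptfs (Ubuntu's encrypted Private directory) or an encrypted container, but the simplest illustration of the idea is to encrypt each file with a passphrase that never lives on the server, so root only ever sees ciphertext. A minimal sketch using standard OpenSSL (the file names and passphrase are just examples, and `-pbkdf2` needs OpenSSL 1.1.1 or newer):

```shell
# Create an example file and encrypt it with a passphrase.
# AES-256-CBC with PBKDF2 key derivation; "secret.txt" is a placeholder name.
echo "private data" > secret.txt
openssl enc -aes-256-cbc -pbkdf2 -salt \
    -in secret.txt -out secret.txt.enc -pass pass:correct-horse

# Anyone (including root) who lacks the passphrase sees only ciphertext.
# With the passphrase, decrypt to recover the original:
openssl enc -d -aes-256-cbc -pbkdf2 \
    -in secret.txt.enc -out recovered.txt -pass pass:correct-horse
```

For whole-folder encryption on the server itself, `ecryptfs-utils` (the `ecryptfs-setup-private` tool) mounts an encrypted directory per user; note that root can still read files while the directory is mounted and the user is logged in, so "root can never read anything" is only achievable if decryption happens on the client side.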
I am running an Ubuntu Server in a VirtualBox VM on my Windows machine. I've created a self-signed certificate using the following tutorial: [URL]
From this tutorial I'm left with 3 files: server.key server.csr server.crt
Then I found this very similar tutorial that has an extra bit on installing the certificates in Apache: [URL] So I followed its instructions, which boil down to this:
[Code]...
So I'm thinking this should work now. However, in Chrome I get: "SSL connection error. Unable to make a secure connection to the server. This may be a problem with the server, or it may be requiring a client authentication certificate that you don't have. Error 107 (net::ERR_SSL_PROTOCOL_ERROR): SSL protocol error." IE8 gives me a typical "Internet Explorer cannot display the webpage". Note that [URL] fails while [URL] works fine, so it's definitely something in my SSL setup, I'm thinking.
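For reference, the certificate steps these tutorials describe typically boil down to something like the following; the subject fields and 365-day validity are placeholder values, and the Apache commands are shown as comments because they need root:

```shell
# Generate a 2048-bit private key, a signing request, and a self-signed cert.
openssl genrsa -out server.key 2048
openssl req -new -key server.key -out server.csr \
    -subj "/C=US/ST=State/L=City/O=Example/CN=localhost"
openssl x509 -req -days 365 -in server.csr \
    -signkey server.key -out server.crt

# On the Apache side (as root), roughly:
#   a2enmod ssl
#   a2ensite default-ssl     # the SSL vhost must reference server.key/server.crt
#   service apache2 restart
```

If Chrome reports an SSL protocol error even though the certificate files look fine, it is worth checking that the port-443 virtual host actually has `SSLEngine on` and that `Listen 443` is present; otherwise Apache answers plain HTTP on 443, which browsers report as a protocol error.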
Since all I can find on Google is about benchmarks, I'd like to ask you whether a 64-bit kernel would be more stable, secure and reliable than a 32-bit one.
I ask this question because apparently the 64-bit instruction set offers more advanced security features (I'm saying "apparently" because I'm not able to give details, since it was a really fast read) which would be used by a 64-bit operating system (Apple also stated that 64-bit applications are less likely to be "attacked").
I have to assume that a 32-bit one does not use them, right? Should I stick with 64-bit? (To be honest, that "not for everyday use" note on the Ubuntu download pages made me wonder, lol, because since Intrepid I have ALWAYS used the 64-bit version.)
My "lowest" computer has a Pentium processor (1.6 GHz dual core); according to lshw I have NX enabled, and my RAM is 2GB (a 64-bit kernel might seem pointless on 2GB, but I'm more concerned about security now).
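Whether the CPU actually advertises 64-bit support ("lm", long mode) and the NX bit can be checked directly from /proc/cpuinfo rather than relying on lshw; a quick sketch:

```shell
# Read the CPU feature flags from the first processor entry (Linux only).
flags_line=$(grep -m1 '^flags' /proc/cpuinfo || true)

# "lm" = long mode, i.e. the CPU can run a 64-bit kernel.
if echo "$flags_line" | grep -qw lm; then
    echo "CPU is 64-bit capable (lm flag present)"
else
    echo "No lm flag reported"
fi

# "nx" = no-execute bit, used for data-execution prevention.
if echo "$flags_line" | grep -qw nx; then
    echo "NX bit supported"
else
    echo "No nx flag reported"
fi
```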
I am seeking a secure and reliable MP3 and video player. For security reasons, I'm reticent about the Universe repository. Is there any way to get an MP3 player and a video player working without having to resort to the Universe repository? Preferably one that can play RMVB files.
I would like to use my Ubuntu server machine as a proxy so I can browse a little more securely/privately while I am traveling. I connect to a lot of open Wi-Fi networks. I have Squid set up on an old laptop running Ubuntu Server 10.10 at home, and the main machine I will be using to connect to the proxy is a computer running Windows Vista. I am able to connect and use the Ubuntu Server machine as a proxy while traveling with the Squid config file modified with http_access set to 'allow all'.
Obviously this isn't the ideal setting. After lots of reading and Googling, I can't figure out how to allow only my Vista laptop to use the proxy. I'm a little lost with the ACL settings required.
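Restricting Squid to a single client is normally done with a `src` ACL in squid.conf; a sketch, where the 192.168.1.50 address is only an example, and the blanket `http_access allow all` line is removed:

```
# /etc/squid/squid.conf (fragment)
# Define an ACL matching only the one allowed client address.
acl vista_laptop src 192.168.1.50

# Allow that client, deny everyone else (order matters).
http_access allow vista_laptop
http_access deny all
```

The catch for a traveling laptop is that its source address changes with every network, so a `src` ACL only works from a fixed location. For road-warrior use, the usual alternatives are proxy authentication (an `auth_param basic` setup with an htpasswd file and a `proxy_auth` ACL) or tunneling to the home box over SSH and pointing the browser at the tunnel.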
I'm learning to secure my server in the best way I can think of: by learning to attack it. Here's what I would like to accomplish. I have SSH set up on a Linux box in an offline lab environment. Username: root Password: ajack2343d Now, I know I can simply brute-force this, as I know the password, but there have to be other ways, and I wish to learn them.
I've set up a server for the first time today and I'm reading up on how to secure it. But I was wondering if anyone here would give me some tips from personal experience on what to do before going online with my website for the whole world to see. I'm running Ubuntu Server edition and Apache. Am I good to go with the default settings, or is there anything recommended that I should do first?
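Beyond keeping packages updated, the usual first steps before going public are a firewall that only admits the ports you serve (e.g. `sudo ufw allow 22/tcp`, `sudo ufw allow 80/tcp`, `sudo ufw enable`) and tightening SSH. A sketch of the common /etc/ssh/sshd_config changes (the user name is an example, and these are conventions, not requirements):

```
# /etc/ssh/sshd_config (fragment)
PermitRootLogin no          # log in as a normal user, escalate with sudo
PasswordAuthentication no   # key-based authentication only
AllowUsers myadmin          # example user name; restricts who may log in
```

Reload with `sudo service ssh reload` after editing, and keep an existing session open while testing so a typo cannot lock you out. Tools like fail2ban add rate-limiting of brute-force attempts on top of this.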
I'm trying to set up an open-source project. I have a couple of developers on the team, but nobody has experience with Apache. I would like to set up a simple home server for Bugzilla on Ubuntu 10.04, so my question is: is there a server that comes secure out of the box, so that simply adding files to /htdocs would suffice?
As far as I know, servers are stable and don't go down easily, but every single server will eventually go down some day, either from hardware/software failure or from hacking.
But as sysadmins, our job is to keep servers running healthy as long as possible.
So I'm conducting another short survey (I might start more survey threads, and thanks to everyone for kindly replying to my previous post):
1. Have you encountered server failures? What's the most common cause of server failure?
2. What is your most important trick for keeping your servers from going down?
3. What security rules do you follow to protect your servers?
I currently run Windows 7 and want to upgrade my computer to a server to accomplish the following: I have a VPN service, and I want the server machine to connect to the VPN, providing a secure connection. Then I want all the machines in the house (Windows-based) to connect through the server onto the VPN connection. Hopefully this makes sense. Would it be better to stick with Windows Server 2008 or switch to Ubuntu?
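On Ubuntu this is achievable by making the server a NAT gateway that forwards the LAN out through the VPN interface; the Windows machines then use the server as their default gateway. A sketch, assuming the VPN comes up as tun0 and the LAN is 192.168.1.0/24 (both are assumptions to adjust):

```
# /etc/sysctl.conf (fragment) - allow the server to forward packets
net.ipv4.ip_forward = 1

# iptables rules (run as root, e.g. from a boot script):
#   iptables -t nat -A POSTROUTING -s 192.168.1.0/24 -o tun0 -j MASQUERADE
#   iptables -A FORWARD -s 192.168.1.0/24 -o tun0 -j ACCEPT
#   iptables -A FORWARD -d 192.168.1.0/24 \
#       -m state --state ESTABLISHED,RELATED -j ACCEPT
```

Apply the sysctl change with `sudo sysctl -p`. One design note: without a "kill switch" rule dropping forwarded traffic when tun0 is down, the clients silently fall back to the unprotected uplink if the VPN disconnects.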
Does anyone know how to go about setting up a secure IMAP email server that can be accessed from outside the network? Similar to how you can access your Google email account from your computer using Thunderbird.
Using Thunderbird as my mail client, I notice an option in the mail account's Server Settings which reads "Use secure authentication", which allows secure transmission of your username and password. I also have my own mail server. So, how do I enable this functionality on my mail server (I'm using Postfix & Dovecot)?
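A point worth knowing: in Thunderbird of that era, "Use secure authentication" meant a challenge-response mechanism like CRAM-MD5, which is separate from SSL/TLS encryption of the whole connection. Supporting it in Dovecot means advertising that mechanism; many setups instead keep plain logins but require TLS. A sketch of both approaches for a Dovecot 1.x-style config (file paths and the exact syntax are assumptions to check against your Dovecot version):

```
# /etc/dovecot/dovecot.conf (fragment)
# Option A: offer CRAM-MD5 so "secure authentication" works.
# CRAM-MD5 needs its own password file, since system passwords
# are stored in an incompatible hash.
auth default {
  mechanisms = plain cram-md5
  passdb passwd-file {
    args = scheme=cram-md5 /etc/dovecot/cram-md5.pwd
  }
}

# Option B (more common): require TLS and disallow unencrypted logins.
ssl_cert_file = /etc/ssl/certs/mail.pem
ssl_key_file = /etc/ssl/private/mail.key
disable_plaintext_auth = yes
```

Entries for the CRAM-MD5 password file can be generated with `dovecotpw` (older releases) or `doveadm pw` (newer ones).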
I have been running a Fedora server without issues for many years and recently updated to FC12. It turned out that there is a bug in OpenSSL in FC12 that caused me a lot of problems, and I had to build Apache, mod_ssl and OpenSSL from source to get it working. I'm now looking at migrating to another distribution, because Fedora is not appropriate for a production server environment in a growing company.
What I want to know is: what is the most reliable distribution with regard to automatic updates (bug-free, that is)? I'm comparing Red Hat, Ubuntu LTS, Debian, etc. at the moment. Is there any report that looks at this topic? I'd expect Red Hat to have the best, but I also have to take cost into account, as it is the only platform where you have to pay for these updates. I read this, but it seems to quickly move away from the "bug-free" question initially asked.
Can someone recommend parts to construct a reliable green server?
I am thinking: -a case with hot-swappable drives and an energy-efficient power supply -an SSD or a 5900 RPM hard drive -a low-power CPU, but one with a decent amount of power, i.e. 2.0GHz dual core or better.
I followed the tutorial found here [URL], but when I try to access [URL] I get the following: Code: Secure Connection Failed. An error occurred during a connection to www.mydomain.com. SSL received a record that exceeded the maximum permissible length. (Error code: ssl_error_rx_record_too_long) Not sure what I might have done wrong... I have retraced all of my steps and I don't believe I missed anything.
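For what it's worth, ssl_error_rx_record_too_long almost always means the browser spoke TLS to port 443 but got a plain-HTTP response back: typically the SSL vhost never got `SSLEngine on`, mod_ssl isn't enabled, or nothing is listening on 443. A sketch of the pieces to verify (file paths and certificate locations are examples):

```
# /etc/apache2/ports.conf (fragment)
Listen 443

# /etc/apache2/sites-available/default-ssl (fragment)
<VirtualHost *:443>
    SSLEngine on
    SSLCertificateFile    /etc/ssl/certs/server.crt
    SSLCertificateKeyFile /etc/ssl/private/server.key
</VirtualHost>
```

Then `sudo a2enmod ssl`, `sudo a2ensite default-ssl`, and restart Apache. Browsing to http://host:443/ is a quick diagnostic: if a plain web page comes back, Apache is serving HTTP on the SSL port.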
I got an old server with a 64MB flash disk in it; this seems to be the boot disk. Would it be possible to get Ubuntu Server onto this? There is also an IDE disk in it. In my opinion it would be good to boot off the flash disk and put /home or /srv on the IDE disk. Does anyone have ideas about this?
I am completely new to everything Linux, but I decided that it was time to do something with an old PC I had, and I was thinking of setting it up as a home server. I looked around for an easy guide on how to set up the server from the beginning, but I can't seem to find one that's good for a complete beginner like me. What I would like to do is use the server as a place where I keep all my movies, music etc. and be able to access it from my home network and also from outside of it. Also, I would like to connect my two printers to this server and share them on the network.
Is there a place where I can find simple information about this, and is it possible for a complete beginner like me to do what I want to do?
Do we really need the lines if I am allowing any machine to connect? And I have only one network card. This is acting as a dedicated TFTP server. I ask because I get an error on the get and put commands, as below:
tftp> get xxx.txt tftp: error received from server <File not found> tftp: aborting
However, although it gives an error, it DOES GET the file. The put command, however, does NOT put the file. I have nobody/nogroup set for the directory permissions as well.
tftp> put .vimrc tftp: error received from server <File not found> tftp: aborting
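A likely cause for the failing put: with tftpd-hpa, uploads of new files are refused unless the daemon is started with the `--create` (`-c`) option and the serving directory is writable by the TFTP user; by default it will only overwrite files that already exist. A sketch using the Ubuntu defaults (paths and the `tftp` user name are worth double-checking on your system):

```
# /etc/default/tftpd-hpa (fragment)
TFTP_USERNAME="tftp"
TFTP_DIRECTORY="/var/lib/tftpboot"
TFTP_ADDRESS="0.0.0.0:69"
TFTP_OPTIONS="--secure --create"

# then, as root:
#   chown -R tftp:tftp /var/lib/tftpboot
#   service tftpd-hpa restart
```

The `--secure` option chroots the daemon into the serving directory, so client paths are relative to it.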
I need to set up a server for my company. There are only 4 people working here, and we want a server that will allow us to run a cloud-like system. We want to be able to log in, use programs, and have our files available from within the office and at home. I've used Citrix in the past and would like something similar, but free/open source if possible. I want an Ubuntu-based server that gives the user a Windows-compatible environment; what I mean is, it needs to be able to run Windows software. What kind of hardware do I need? We want to run fairly resource-hungry software like GIS and mapping software. I want to run Ubuntu Server, but what other software will I need?
I've tried on multiple occasions to set up a DHCP server. The Google searches all return seemingly legitimate results, and I set everything up, but the last few instructions say to configure /etc/dhcp3/dhcpd.conf and /etc/default/dhcp3-server, and Ubuntu says they're not there. I'm running Ubuntu Server 11.04 and using nano to edit the files.
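The likely explanation: on Ubuntu 11.04 the package is `isc-dhcp-server` rather than the older `dhcp3-server`, so the files those guides mention moved. The config now lives in /etc/dhcp/dhcpd.conf and the interface setting in /etc/default/isc-dhcp-server. A minimal sketch (the subnet values are examples for a typical home network):

```
# sudo apt-get install isc-dhcp-server

# /etc/dhcp/dhcpd.conf (fragment)
subnet 192.168.1.0 netmask 255.255.255.0 {
  range 192.168.1.100 192.168.1.200;       # addresses to hand out
  option routers 192.168.1.1;              # default gateway for clients
  option domain-name-servers 192.168.1.1;  # DNS for clients
}

# /etc/default/isc-dhcp-server (fragment)
INTERFACES="eth0"
```

Restart with `sudo service isc-dhcp-server restart`; syntax errors are reported in /var/log/syslog.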
I need to setup a squid 3 proxy with https bumping. Unfortunately I'm not very familiar with squid and https in general.
I already performed the following steps:
1.) compile from source
Code:
./configure --with-openssl --enable-ssl-crtd
make
make install
2.) configuration (http) I used this guide: [URL]
3.) configuration (https) [URL]
The server is now working for http and https, but is the server secure, too? Is the default config already secure, or do I need to configure additional security features (e.g. things like certificate validation, certificate pinning, [I don't know what's important], ...)?
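The defaults are not automatically safe for a bumping proxy: since the proxy now terminates TLS on the clients' behalf, it must itself validate upstream certificates and fail closed on errors, or every client silently loses certificate checking. A sketch of the relevant squid.conf pieces for a 3.x build like the one above (directive names changed between 3.1/3.3/3.5, so verify against squid.conf.documented for your version):

```
# /etc/squid/squid.conf (fragment)
# Refuse connections with upstream certificate errors
# instead of bumping them anyway.
sslproxy_cert_error deny all

# Do NOT disable peer verification; avoid lines like:
#   sslproxy_flags DONT_VERIFY_PEER

# Keep the signing CA key and the generated-cert database root-only:
#   chmod 700 /var/lib/ssl_db
#   chmod 600 /etc/squid/bump-ca.pem   (example CA file name)
```

The private key of the CA certificate the clients trust is the crown jewel of this setup; anyone who obtains it can intercept all of your users' HTTPS traffic, so it should never leave the proxy host.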
I am trying to setup a web-based secure ftp client that can handle not only file transfers to and from one of my company's servers, but also allow new clients of ours to visit our site, create an account of their own and use it to log in and begin transferring files. This way, the users can manage their own accounts.
I don't know a lot about exactly what is running on our server, though I am almost positive it is Debian-based. I really only have access via SSH and FTP. I may be able to do more in the server room, but haven't tried. I thought about using net2ftp, but that doesn't seem to work with SFTP, and also doesn't allow the creation of new users on the server.
Is there anything out there for me? You will undoubtedly require more information from me, so let me know what it is and where I can find it, and I'll get back to you as quickly as I can.
I recently set up a server running Subversion. It is working well, but I am curious how to make it send out email alerts to my team when a commit occurs. Also, are there any good GUI tools to manage a Subversion server on Ubuntu? The command line is fine, since I made some scripts, but a nice GUI would be great! Maybe that could be a fun project to work on.
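Subversion runs an executable named `hooks/post-commit` inside the repository after every commit, passing it the repository path and revision number. A common approach is a small script that collects the commit details with `svnlook` and pipes them to `mail`. A sketch that writes such a hook (the recipient address and the use of `mail` are assumptions; ready-made alternatives like `commit-email.pl` ship in Subversion's tools directory):

```shell
# Write a post-commit hook; Subversion invokes it as: post-commit REPO REV
cat > post-commit <<'EOF'
#!/bin/sh
REPOS="$1"
REV="$2"
# Gather author/date/log message, then the changed paths, into one body.
svnlook info "$REPOS" -r "$REV" > /tmp/commit-msg.$$
svnlook changed "$REPOS" -r "$REV" >> /tmp/commit-msg.$$
# Mail it to the team (example address; requires a working MTA).
mail -s "SVN commit r$REV" team@example.com < /tmp/commit-msg.$$
rm -f /tmp/commit-msg.$$
EOF
chmod +x post-commit

# Install it as REPO_PATH/hooks/post-commit on the server.
```

Note that hooks run with an empty environment, so any non-standard binaries need absolute paths inside the script.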
I'm running Ubuntu 10.04 server as a guest on a Windows 7 host using VirtualBox. I've set the VirtualBox configuration to use a bridged network connection so that I can access the internet through the Ubuntu guest and access the Ubuntu Apache server through my Windows host. This is all running on my laptop, which connects to various routers using DHCP (some IP addresses start with 198. while others start with 10.). What I need is a single static IP address (or hostname/URL) to set up my CMS (Drupal). How can I accomplish this given the varying routers the laptop connects to?
I am trying to set up a DNS server on my local network. When I set Linux clients to use it, it works as expected. However, when I set Windows clients to it, the root name doesn't resolve. For example, I have a zone called daniel. On Linux, "anything.daniel" resolves to the correct IP, as does "daniel", which is the behavior I want. However, on Windows 7, "anything.daniel" resolves correctly, but "daniel" doesn't. I am new to BIND9, so my config is mostly copied and pasted. Here is my zone file for daniel (where #.#.#.# is the IP I want daniel to resolve to):
@ IN SOA ns1.daniel. admin.daniel. ( 2007031001 28800 3600
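For comparison, a complete minimal BIND9 zone file for a zone like this usually looks something like the sketch below (the serial and timer values are examples; the SOA fragment above is still missing its retry/expire/TTL values and an NS record, without which BIND refuses to load the zone). Separately, the Windows behavior may not be a zone problem at all: Windows typically does not send single-label names like "daniel" to DNS unless a matching DNS suffix or search list is configured on the client, whereas Linux resolvers with an appropriate search domain will.

```
; zone file sketch for "daniel"
$TTL 86400
@   IN  SOA ns1.daniel. admin.daniel. (
        2007031001 ; serial
        28800      ; refresh
        3600       ; retry
        604800     ; expire
        86400 )    ; negative-cache TTL
@   IN  NS  ns1.daniel.
@   IN  A   #.#.#.#
ns1 IN  A   #.#.#.#
*   IN  A   #.#.#.#
```

The wildcard record is what makes "anything.daniel" resolve; the bare `@` A record covers "daniel" itself.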
Been messing around with Ubuntu 9.10 for the last few weeks and am loving it so far. Been trying to get into the terminal and learn a little something, to no avail, LOL. I have been Googling and searching the site today for info on networking. My Linux box is a desktop, with my main HDD mounted with music, movies and some other stuff. My intent is to network the two laptops in the house (Windows XP and Windows 7) to the Linux box so I can listen to my music and watch movies when not in the office. I have found some info, mostly involving Samba, and plan to install Samba tonight and fiddle with it. My issue is with security. I have read a few posts saying that if you share files in this manner, the setup is not secure at all. Is this something I should really be concerned about if the folders I share only have my music and videos in them?
I have been searching for ways to set up an FTP server, but it seems everything I tried either does not work or is not what I wanted. How do I set up an FTP server allowing only permitted users? I'm using the desktop edition live CD.
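One common route on Ubuntu is vsftpd, which can be locked down to an explicit allow-list of users. A sketch (the user names in the list are examples):

```
# sudo apt-get install vsftpd

# /etc/vsftpd.conf (fragment)
anonymous_enable=NO        # no anonymous logins
local_enable=YES           # allow local system users to log in
write_enable=YES           # permit uploads/changes
chroot_local_user=YES      # jail each user to their home directory
userlist_enable=YES        # consult the user list below
userlist_file=/etc/vsftpd.userlist
userlist_deny=NO           # treat the list as allow-only, not a deny-list

# /etc/vsftpd.userlist - one permitted user name per line, e.g.:
#   alice
#   bob
```

With `userlist_deny=NO`, only the accounts named in the list can log in at all; everyone else is rejected before a password is even asked for. Restart with `sudo service vsftpd restart` after editing.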