I administer several web hosting production servers (combined with mail relays and other services) under Debian GNU/Linux. I began offering these public services two years ago with three boxes: the first is a gateway attached to a DSL modem which controls traffic via iptables between a public subnet (the DMZ) and a local network connecting several workstations. In the DMZ subnet I maintain two Pentium III-era boxes, and they've grown in services since I set them up. I probably should buy new ones, but, you know, I want to save money and lengthen their lives.
So they've grown in data hosted, but I've never implemented a resilient backup system. I've set up some rsync tasks scheduled via cron jobs to copy the entire UNIX file system of each of the DMZ boxes, but I'd like to be prepared for an unexpected "real" crash of some HDD, I mean, a problem that renders a disk unusable.
AFAIK, sysadmins keep synced backups of entire hard drives which can recover a system by swapping the unusable unit for the backup unit. Maybe the best approach is to implement RAID mirroring of the unit, am I right? So, keeping my systems as they are, i.e. capable of using 4 parallel ATA units, what would you do? Use dump, rsync or some other method to keep an operational second unit with an exact copy on a bootable second drive, so I can quickly swap it in if the main unit fails?
It comes to mind to partition a second unit (making it bootable) and back up daily via rsync only those parts of the UNIX file system hierarchy which are necessary to boot a system properly. What do you think about this workaround?
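The sort of thing I have in mind is roughly this sketch (it assumes the spare disk is already partitioned, formatted and mounted at /mnt/backupdisk, which is a path I made up, and the spare disk would still need its own boot loader and a matching /etc/fstab):

Code:
# copy the live system to the spare disk, skipping pseudo-filesystems and the
# backup mount point itself
rsync -aH --delete \
    --exclude="/proc/*" --exclude="/sys/*" --exclude="/tmp/*" \
    --exclude="/mnt/*" --exclude="/media/*" \
    / /mnt/backupdisk/
# something like: grub-install --root-directory=/mnt/backupdisk /dev/hdb
# (device name is a guess) to make the spare disk bootable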
Does anyone know of any decent enterprise-level backup solutions for Linux? I need to back up a few servers and a bunch of desktops onto one backup server. Using rsync/tar.gz won't cut it. I need something like bi-monthly full HDD backups and so on, with a nice GUI to add/remove systems from the backup list. Basically I need something similar to CommVault or Veritas. Veritas I've used before, but it has its issues, such as leaving 30 GB cache files. CommVault, I have no idea how much it costs, or whether it supports backing up to a hard drive rather than tape.
We have several production machines serving our static content. I want an automatic operation where I would only need to upload to one of them and it would get replicated/mirrored to all machines. Either that, or distribute automatically from a local source to all machines. The first option is obviously better, since all machines reside on the same LAN and are remote to our office.
Another feature I am looking for is a full report on what got transferred and whether any problems occurred, available immediately (since these deployments can mean downtime for us). Looking around, I saw rsync. Using CentOS 5.4-5.5, by the way.
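What I'm picturing is something along these lines (just a sketch; host names, paths and the mail address are placeholders):

Code:
#!/bin/bash
# push a local content tree to every mirror over SSH and mail a per-run report
SRC=/var/www/static/
HOSTS="web1 web2 web3"
LOG=/tmp/deploy-$(date +%F-%H%M).log
for h in $HOSTS; do
    echo "== $h ==" >> "$LOG"
    rsync -avz --delete --itemize-changes "$SRC" "$h:$SRC" >> "$LOG" 2>&1 \
        || echo "ERROR syncing $h" >> "$LOG"
done
mail -s "static content deploy report" ops@example.com < "$LOG"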
I've got a web server running great, but I need to back it up. For some reason it doesn't detect either of my USB external HDDs. How can I make a tar archive or something, burn it to DVD, and then be able to restore using that disc?
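Is it as simple as something like this? (A sketch only; the paths are examples, and the pieces would still have to be burned to DVD with whatever burning tool is available.)

Code:
# create a compressed archive and split it into DVD-sized pieces
tar czf /tmp/webserver-backup.tar.gz /var/www /etc/apache2
split -b 4300M /tmp/webserver-backup.tar.gz /tmp/webserver-backup.part-
# burn the /tmp/webserver-backup.part-* files to DVD; then, to restore:
cat webserver-backup.part-* > webserver-backup.tar.gz
tar xzf webserver-backup.tar.gz -C /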
I am currently managing multiple Linux servers in remote locations, particularly servers used for web hosting. I need to back up data to a backup server, but rsync, which I'm currently using, doesn't help. Is there any tool to back up every server without modifying it? There are hundreds of servers, so installing a tool on every server is a time-consuming process.
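Ideally I'd like something driven entirely from the backup server, roughly in this pull-based style (the host list file and paths are made up; it relies only on sshd and the rsync binary already present on the remote servers):

Code:
#!/bin/bash
# run on the backup server: pull each remote host over SSH, nothing installed remotely
DEST=/backup
for host in $(cat /etc/backup-hosts.list); do
    rsync -az -e ssh "root@$host:/var/www/" "$DEST/$host/" \
        || echo "backup of $host failed" >&2
done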
I want to run a Postfix server as a backup MX, but does anybody know how I can collect the first server's mail with this one? This is a multi-POP action, but how can I do it with Dovecot or any other POP collector?
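The kind of thing I'm imagining is a POP3 collector such as fetchmail, roughly along these lines (hostname and user are placeholders), though I'm not sure it's the right approach:

Code:
# one-shot pull of the primary's mailbox over POP3; the password is prompted
# for, or read from ~/.fetchmailrc
fetchmail -p pop3 -u backupmx --ssl --keep mail.primary.example.com
# run from cron every few minutes, or use "set daemon 300" in ~/.fetchmailrc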
This is my first time setting up a production web server and I have a few questions about our migration:
1. Our website at the web hosting company already gets 5,000,000 hits/month and 35,000 unique visitors/month. The problem is we only have a 2x4 Mb dedicated line here in the office and one IBM x3650 M3 for our LAMP stack. Do you guys think that's enough to handle that kind of traffic if we start moving our web server here into the office? (My own rough arithmetic is below, after question 3.)
2. If I register www.example.com with GoDaddy, for example, do I still need to set up a DNS (BIND) server on our side?
3. This is my current Apache config: Apache/2.2.3 (CentOS) DAV/2 mod_fcgid/2.3.6 mod_auth_kerb/5.1 PHP/5.1.6 mod_python/3.2.8 Python/2.4.3 mod_ssl/2.2.3 OpenSSL/0.9.8e-fips-rhel5 mod_perl/2.0.4 Perl/v5.8.8 with PHP eAccelerator.
Anything to share to increase the performance of the web server?
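Regarding question 1, my own back-of-the-envelope estimate, assuming an average of roughly 50 KB transferred per hit (just a guess):

Code:
# 5,000,000 hits/month / (30*24*3600 s) ~= 1.9 hits/sec on average
# 1.9 hits/sec * 50 KB/hit ~= 95 KB/s ~= 0.8 Mbit/s average outbound
# peak hours are commonly 5-10x the average, so 4-8 Mbit/s at peak --
# tight on a 2x4 Mb line once overhead and office traffic are included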
I have a production server running RHEL 4.0 with 2x146 GB in a RAID 1 for the OS, and another array with 2x300 GB in a RAID 1 for the application and database.
No LVM was installed or configured before, and now the second 300 GB mirrored array is running out of disk space.
1. I have 2 new HDDs to build another 2x300 GB mirror.
How can I create an LVM setup so that I can use vgextend any time I need it?
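The rough sequence I have in mind is below (device and volume names are placeholders; since the existing 300 GB mirror was built without LVM, its data would have to be copied onto the new LVM volume, or LVM used only for the new array):

Code:
# assumes the new 2x300 GB mirror shows up as /dev/md2 -- check /proc/mdstat
# or the controller tool for the real device name
pvcreate /dev/md2                  # mark the new array as an LVM physical volume
vgcreate datavg /dev/md2           # create a volume group on it ("datavg" is a made-up name)
lvcreate -L 250G -n applv datavg   # carve out a logical volume, leaving room to grow
mkfs.ext3 /dev/datavg/applv
# later, when another array is added:
pvcreate /dev/md3
vgextend datavg /dev/md3           # the volume group grows; then lvextend + resize2fs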
I already have an Ubuntu backup server in my location and need this one server to be backed up remotely in another state. The other location is a helpdesk, so there's a danger that they could gain access to confidential data. I'll be setting up this new server as an FTP server, but I need the FTP folder to only allow access to the backup server and me. Because it's remote on the helpdesk side, they'll need some access to the file system but need to be completely blocked off from the FTP folder where all the data is. How can I make sure I keep them away from my data and am still able to retrieve or copy files over without permission issues between both servers?
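On the permissions side, is something as simple as this enough? (Names and paths are placeholders.)

Code:
# lock the backup drop directory down to a dedicated group so local
# helpdesk accounts can't read it
groupadd backupops
usermod -aG backupops backupuser        # the account the backup server logs in as
chown -R backupuser:backupops /srv/ftp/backups
chmod -R 770 /srv/ftp/backups           # no access at all for "other" users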
I'm trying to setup a server at home, it has some practical implications, but largely it is just to take a stab at it. But I need the help of someone with more experience than I in defining exactly what I'm looking to do.
Here's what I have: an old PC running Gutsy server connected to the router, and several laptops at home connected to the router via wifi, all running either Windows or Ubuntu. Here's what I'm looking for: the server centralizes file storage for all clients. I would likely incorporate a RAID and some synchronised imaging of the files. I also want the server to create disk images of the clients' HDDs, regardless of client OS. There would also be some shares that would be publicly accessible (so friends across the country and I would be able to access the same drive).
So I was thinking something like a corporate environment would be nice: you log into a profile that exists on the server, like a dumb client, with all data stored on the server. But I'm thinking that's more like a network boot and wouldn't work via wifi (or would it?). Also, that wouldn't lend itself well to laptops used on the road in areas without net access. Now I'm thinking each client would have its own locally installed OS and would just access networked shares. I could store sensitive files on the shares, but that wouldn't provide a complete backup solution for each client.
Without rambling on any more, does anyone care to throw out some ideas? I'm really just looking to see if I can do what I want. The focus is on centralizing files, securely backing up data and client OSes, and the ability to restore said images quickly.
I have a scheduled backup that runs on our server at work, and since 7/12/09 it has been producing 592 KB files instead of 10 MB files. In mysql-admin (the GUI tool) I have a stored connection for the user 'backup'; the user has SELECT and LOCK rights on the databases being backed up. I have a backup profile called 'backup_regular', and in the third tab it's scheduled to back up at 2 in the morning every week day. If I look at one of the small backup files generated I see the following:
Code:
-- MySQL Administrator dump 1.4
-- ------------------------------------------------------
-- Server version
....
It seems that MySQL can open and write to the file fine; it just can't dump.
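To narrow it down, I was thinking of testing the same credentials from the command line, something like this (the database name is a placeholder); if this also produces a tiny file, the problem is the grants rather than the scheduler:

Code:
# run as the same 'backup' user the GUI uses
mysqldump -u backup -p --single-transaction mydatabase > /tmp/test-dump.sql
ls -lh /tmp/test-dump.sql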
My J2EE application is deployed on WebLogic 9.2 MP3 on a Linux box. The problem is that API response time in the production system is higher than in the test system (the test system has more data and load compared to production). We see a large number of page faults in the production garbage collection log, whereas in the test system (with the same load) page faults are zero, and these page faults are making the production system slow. In terms of the JVM (JRockit), both systems' memory settings (-Xms1024m -Xmx1024m -Xgcprio:throughput) are the same.
Is there any system/kernel parameter missing in production that might cause this large number of page faults? If any more information is required, let me know.
Linux version in prod: Linux version 2.6.18-53.el5 (brewbuilder@hs20-bc1-7.build.redhat.com) (gcc version 4.1.2 20070626 (Red Hat 4.1.2-14)) #1 SMP Wed Oct 10 16:34:19 EDT 2007
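Some read-only checks I can run on the production box to see where the paging comes from (nothing here changes state):

Code:
free -m          # is the box dipping into swap?
vmstat 5 5       # si/so columns show pages swapped in/out per interval
sar -B 5 5       # page fault rates, if sysstat is installed
ulimit -l        # max locked memory for the user running the JVM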
I made the terrible mistake of upgrading my live Debian Lenny web server with the dist-upgrade option in apt-get. I didn't realize this was actually an upgrade to unstable, and now I have had to make all sorts of choices about which configuration files to keep or upgrade, etc. The Apache conf files were actually bad after the upgrade and I had to replace them with the backups (phew), and the system is currently still up and running. However, my Virtualmin installation is no longer working due to an issue with Perl (but that's another question, I guess, to ask somewhere else). Anyway, I'm very scared to restart, because my server is co-located somewhere else and I'm the only one who has ever worked on it, so I would need to go there and fix it myself if it doesn't come back up. Basically I have two questions: is there an easy way to move back to stable packages, and if so, is this recommended?
I'm also currently trying to fix some broken dependencies in the package manager, but when I run "sudo aptitude -f install" it keeps telling me it is going to remove all of these packages (listed below), some of which I know are very important to the system, and I cannot figure out why it keeps trying to do this. I get an error on "phonon-backend-xine" whenever it tries to upgrade, just saying this:
"(gtk-update-icon-cache:12343): GdkPixbuf-WARNING **: Cannot open pixbuf loader module file '/usr/lib/gdk-pixbuf-2.0/2.10.0/loaders.cache': No such file or directory"
I installed some desktop-related packages a while back, like gnome-desktop, and I know the package is related to this, but all I really care about is making sure the server stays online, not the desktop packages. I tried just removing kdebase-runtime and anything else that depends on it, but it won't let me do anything at all without fixing this broken package.
I really would just like to go back to lenny stable again, but I know it's probably too late, since I already had it install a new kernel and GRUB 2 (auto-configuring my new grub.list).
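For what it's worth, the first thing I plan to check is what my sources actually point at; from what I understand a downgrade back to lenny is not officially supported, but pinning can at least stop things drifting further (the package name in the example is just an example):

Code:
grep -v '^#' /etc/apt/sources.list   # check whether sid/unstable entries crept in
apt-cache policy apache2             # shows which release each candidate version comes from
# A pin like the following in /etc/apt/preferences, with a priority above 1000,
# allows apt to downgrade back to stable, but it is risky on a live box:
#   Package: *
#   Pin: release a=stable
#   Pin-Priority: 1001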
I just finished getting a semi-old desktop computer up and running with ubuntu server edition. It's running subversion and samba for my office of 5 people. I was wondering if there was a simple online/cloud based backup service to automatically back up my subversion repository and samba shares in case of any hardware problems?
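Whatever service it ends up being, the local side I'm picturing is a nightly cron job roughly like this (repository path and destination are placeholders; the destination could be any host or storage reachable with rsync/ssh):

Code:
#!/bin/bash
# dump the repository, then push it and the samba shares off-site
svnadmin dump /srv/svn/myrepo | gzip > /var/backups/myrepo-$(date +%F).svn.gz
rsync -az /var/backups /srv/samba/shares backup@offsite.example.com:/backups/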
I have two servers on a VLAN at my datacentre/colocation, and previously both servers had public IPs on their eth0 interfaces. The servers are HP ProLiant DL360s - one is a G4 and one is a G5. The newer G5 is now the LAMP server and the G4 has been retired; I want to repurpose it as an iSCSI target using Openfiler, FreeNAS or similar.
My G5 has public/static IPs attached to the eth0 physical interface, and eth1 is not configured to do anything yet. The G4 will have both interfaces available - perhaps one for SSH access from one of my static public IPs and the other as a private IP on the local VLAN. Here is what I am trying to get my head around...
The G5:
eth0 - public IP - full LAMP services on two or three virtual interfaces
eth1 - private IP 192.168.0.1
The G4:
eth0 - public IP for SSH
eth1 - private IP 192.168.0.2
Because the traffic between eth1 on these boxes travels via private IPs on the local private VLAN, it doesn't add to my bandwidth quota. How do I go about configuring the routing, gateways and other aspects of this so that I can run a private IP space network between the eth1s and still serve the outside world from the eth0s?
I am afraid that if I assign the private IPs to the eth1 interfaces the routing may either not work or interfere with the access to the production internet facing interfaces (eth0s).
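What I think it boils down to (a sketch only; the exact persistent config file depends on the distro) is giving eth1 an address but no gateway, so the default route, and therefore all internet-facing traffic, stays on eth0:

Code:
# on the G5:
ip addr add 192.168.0.1/24 dev eth1
ip link set eth1 up
ip route show    # should still list "default via <public gw> dev eth0"
# the G4 gets 192.168.0.2/24 the same way; make it persistent in the distro's
# interface config (e.g. /etc/network/interfaces or ifcfg-eth1), again with
# no gateway entry for eth1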
I have a small home network. I have laptops and workstations that my family (public) uses and an ESXi box (private) that I use to test new apps for work. I need to keep the public network separate from the private one. I have tried using two Linksys routers but was unable to get the private network to access the internet. I was thinking I could use iptables with a couple of NICs, but I am not sure it would work. I know this could be a lot of work for someone who has never used iptables before, but it will give me a reason to learn it. I am sure setting up a public and private network has been done before; I just don't want to buy a bunch of hardware. I have an extra workstation and a bunch of NICs, so I would like to go that route. I am open to suggestions.
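The very rough sketch I have in mind for the spare workstation, assuming three NICs (eth0 to the internet/modem, eth1 = public/family LAN, eth2 = private/ESXi LAN; interface names and roles are placeholders):

Code:
#!/bin/bash
echo 1 > /proc/sys/net/ipv4/ip_forward
iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE
# both LANs may reach the internet:
iptables -A FORWARD -i eth1 -o eth0 -j ACCEPT
iptables -A FORWARD -i eth2 -o eth0 -j ACCEPT
iptables -A FORWARD -m state --state ESTABLISHED,RELATED -j ACCEPT
# but the public LAN may not initiate connections into the private LAN:
iptables -A FORWARD -i eth1 -o eth2 -j DROP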
I currently have a group of 3 servers connected to a local network. One is a web server, one is a MySQL server, and the other is used for a specific function on my site (calculation of soccer matches!).
Anyway, I have been working on the site a lot lately but it is tedious connecting my USB hard drive to each computer and copying the files. This means I am not backing up as often as I should...
I have a laptop connected to this same network that I use for development, so I can SSH into the computers. Is there any software for Ubuntu that can take backups of files that I choose on multiple computers? I know I could rsync, but is there something with more of a GUI?
Then every 2 days I can just move the most recent backup from my laptop to the USB drive. That way I will have the backup stored in 2 places if things go kaboom somewhere.
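Even if it ends up being driven from a GUI, under the hood I'm picturing something like this pull script with dated snapshots (host names and paths are just examples):

Code:
#!/bin/bash
# pull chosen directories from each box over SSH into dated, hard-linked snapshots
DEST=~/backups
TODAY=$(date +%F)
for host in webserver mysqlserver calcserver; do
    mkdir -p "$DEST/$host"
    rsync -az --link-dest="$DEST/$host/latest" \
        "$host:/etc" "$host:/var/www" "$DEST/$host/$TODAY/"
    ln -sfn "$TODAY" "$DEST/$host/latest"
done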
Brand new to Linux. Sort of got thrown in front of the bus, if you know what I mean. The company I work for has a Linux server running CentOS 5.4; the company uses Linux for its email, FTP and web server. I have been here a few years dabbling in and out of Linux, and now that the old admin has left the company I need to learn it ASAP. The server has run pretty solid until today.
The email server runs Sendmail and SpamAssassin. We received lots of complaints today about extra spam, and I noticed that SpamAssassin was not running. I tried to restart it through the Webmin tools and got the following error: Starting spamd: child process [3956] exited or timed out without signaling production of a PID file: exit 255 at /usr/bin/spamd line 2588.
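Things I can try from the shell to see why spamd dies, as far as I know without changing any config:

Code:
spamassassin --lint            # syntax-check the SpamAssassin configuration and rules
spamd -D                       # run spamd in the foreground with debug output (Ctrl-C to stop)
tail -n 100 /var/log/maillog   # look for the real error around the "exit 255"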
I have installed a Linux server in my office to serve 16 machines. Its main use will be as an internal mail server, but it will also be running websites.
I have installed Ubuntu 9.10 server x64 and have got apache running.
I am looking for the simplest, most robust solution for SMTP, POP3 and IMAP. I have only ever used qmail before and found it a pain to configure, and it's getting old, so I thought I should probably try something new. I don't have much experience with running POP3 or IMAP on Linux, so I would love a suggestion on that.
I am trying to sync file server data to a backup server machine with the command: rsync -avu path/of/data ipaddress-of-backup-server:/path/where/to/save. After running it, it asks for the root password, and done manually it is successful, but I want to make it automatic. For that I also tried a cron job and generated an authentication key, but I am not successful in logging in automatically. Does anybody know how to let root authenticate automatically for storing data on the backup server?
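My understanding of the usual setup is roughly this (it assumes passwordless root SSH to the backup server is acceptable; a dedicated, less privileged user would be safer):

Code:
# on the file server, as the user the cron job will run as:
ssh-keygen -t rsa                       # accept the defaults, leave the passphrase empty
ssh-copy-id root@ipaddress-of-backup-server
# test -- this should no longer prompt for a password:
rsync -avu path/of/data root@ipaddress-of-backup-server:/path/where/to/save
# then the crontab entry (e.g. every night at 02:00):
# 0 2 * * * rsync -avu path/of/data root@ipaddress-of-backup-server:/path/where/to/save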
I'm going to be launching my website soon, and I found a company to host it on one of their dedicated servers. I think I'm going to go with Fedora as the OS, and my problem is I'm having trouble finding a company to back up my files that both supports Fedora and is reasonably priced.
I have several machines with various folders that I'd like to back up through the Linux box onto DVD-RW media. I want to keep log files on one machine of what was written to DVD and when, and have it automagically assign a unique serial number that I can print on the DVD in case I need to recover. I'd like a user-friendly UI where I can point and click to schedule the backup and its type. Is there a good Fedora backup application (read: easy to use/understand and configure) I can use to back up machines across a network and across multiple DVDs (if needed)? The host machine is an F11 box. The clients are a mix of Windows Server 2008, Windows 7, Windows XP and several Fedora boxes. Speaking of DVD media, is this a good idea, and how many erase/write cycles are they good for?
I need to install a backup server in my work environment. We have a Windows 2008 server and an old Dell PowerEdge 1750 server that has no OS on it yet. I would like to install Fedora on it and then back up the Windows Server data on the Fedora server using rsync or something else. Do you think it's a good idea? If not, what would you use to back up the Windows server data, preferably on a Linux system?
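One option I'm considering, roughly sketched (share name, credentials and paths are placeholders), is to mount the Windows share over CIFS on the Fedora box and rsync from the mount:

Code:
#!/bin/bash
mount -t cifs //winserver/data /mnt/winbackup -o ro,username=backup,password=secret
rsync -a --delete /mnt/winbackup/ /backup/winserver/current/
umount /mnt/winbackup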
I have been hassling with this for several days now. I have 64-bit Ubuntu Server 10.04 running on an Acer Aspire EasyStore H340. I have windows 7 running on a 64-bit desktop pc and on a laptop. I mainly wanted to use the Ubuntu server for a file server, so I installed Samba and created three shares. These do show up in Windows explorer, and I can read and write to them. My windows applications seem to be able to see the shares and open & save files.
My next step was to try and set up a backup of the Windows 7 PC to the Ubuntu server. Windows integrated backup sees the shares and sub-directories within the shares, and the initial part of the backup seems to run OK, but when it tries to save an image of the "C:" drive it works for a long time and then ends with an error (cannot complete backup).
So, I looked for some free backup programs to try, but these do not allow me to select the shares as a destination (invalid destination). The dialogue sees the drives I have mapped the shares to in Windows, but does not show any sub-directories, and selecting the mapped drive letter does not take as a destination. If I try to browse down through "Network" in the destination dialogue, it selects "Network," but does not expand it or accept it as a destination.
So I partitioned my 2nd 1 TB SATA drive on the server, formatted it as ext3, and mounted it as "storage." I set this up as a share in Samba and gave everyone read-write access, but still no luck selecting it as a backup destination. After some Googling, I downloaded and installed Ext2Fsd 0.48 (a Windows 'driver' for ext2/ext3). This installed correctly, but when I open the program, neither "Network," the shares, nor the mapped drives show up anywhere.
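For reference, the share for the new disk was set up roughly along these lines (the exact path and mask values are examples):

Code:
# share stanza appended to /etc/samba/smb.conf, then syntax check and restart
cat >> /etc/samba/smb.conf <<'EOF'
[storage]
   path = /mnt/storage
   read only = no
   browseable = yes
   guest ok = yes
   create mask = 0664
EOF
testparm
service smbd restart 2>/dev/null || service smb restart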