Debian :: Backup Running Server Using Rsnapshot?
Jan 31, 2010 - How do I make incremental backups of a running Debian server using rsnapshot?
I have installed rsnapshot from SlackBuilds. How do I run it to back up my home directory? I'm also reading the official docs.
I have a machine on my network that is a mass storage server, which I will eventually use as a media server (to stream movies, video clips and music to my home theater system). I use Slackware 13 on ALL of my machines.
I am trying to automate the backup of the "/home" folder of my laptop onto the mass storage server. I currently use rsnapshot and it works great, but I would like to automate the whole process, even if I am not home or in front of my machines...
Here's what I imagined (in pseudo code):
1) Poll if server is active (up);
1.1) If not:
1.1.1) Wake up the server (WOL);
1.1.2) Wait for the server to boot;
1.1.3) Confirm the server has made it to the login prompt (normal boot);
1.1.3.1) If not, send an alarm via email;
[Code].....
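Something along those lines can be scripted with ping, wakeonlan and the mail command. The sketch below is only a rough illustration of the pseudocode above; the server hostname, MAC address, alert address and rsnapshot interval are all placeholders to replace with your own.
Code:
#!/bin/bash
SERVER=storage.local          # backup server hostname or IP (placeholder)
MAC=00:11:22:33:44:55         # its NIC's MAC address, for Wake-on-LAN (placeholder)
ADMIN=me@example.com          # where to send alarms (placeholder)

# 1) Poll whether the server is up
if ! ping -c 1 -W 2 "$SERVER" >/dev/null 2>&1; then
    wakeonlan "$MAC"                          # 1.1.1) wake it up
    for i in $(seq 1 30); do                  # 1.1.2) wait up to 5 minutes for it to boot
        sleep 10
        ping -c 1 -W 2 "$SERVER" >/dev/null 2>&1 && break
    done
    if ! ping -c 1 -W 2 "$SERVER" >/dev/null 2>&1; then
        # 1.1.3.1) still down: send an alarm via email and give up
        echo "Backup server $SERVER did not come up" | mail -s "backup alarm" "$ADMIN"
        exit 1
    fi
fi

rsnapshot daily    # server is up: run the backup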
Rsnapshot is a utility written in Perl for backing up local and remote filesystems. The well-proven rsync does the work behind it. rsnapshot does not need root-user intervention to restore a normal user's data, it does not take much space on your backup server, and it can easily be automated (scheduled) to make life easier: set it up once and forget it. Basically it takes snapshots of a filesystem (or part of one) at regular intervals such as hourly, daily, weekly and monthly.
This can be configured easily through a simple text-based configuration file, and the task above can be set up in a few easy steps in a few minutes. The two major tasks are configuring rsnapshot and setting up automatic OpenSSH login. To run the backup automatically, we need to automate the remote login in a secure way, which can be done with the OpenSSH tools. This scenario depicts backing up desktop data (assuming the desktop's IP address is 192.168.0.100) to a backup server. My desktop runs Ubuntu 10.04 and the backup server runs Debian Squeeze. [URL]
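As a rough sketch of those two steps, key-based SSH login plus a minimal rsnapshot.conf, assuming the 192.168.0.100 address above and a snapshot root of /backup (both placeholders); note that rsnapshot.conf fields must be separated by tabs, not spaces.
Code:
# On the backup server: create a key and push it to the desktop
ssh-keygen -t rsa               # accept the defaults; empty passphrase for unattended runs
ssh-copy-id user@192.168.0.100  # afterwards "ssh user@192.168.0.100" needs no password

# Minimal /etc/rsnapshot.conf excerpt (TAB-separated fields):
#   snapshot_root   /backup/
#   retain  daily   7            # older versions use "interval" instead of "retain"
#   retain  weekly  4
#   backup  user@192.168.0.100:/home/user/   desktop/

rsnapshot configtest   # check the syntax
rsnapshot daily        # run (or schedule) the backup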
I currently have CentOS 4.4, I believe, running on a 250GB hard drive. I want to make an image of that hard drive. I tried removing the drive and connecting it to my Windows PC using an adapter that lets the Windows machine treat it as a regular external hard drive; of course Windows doesn't recognize the drive since it is Linux-partitioned. I am thinking that I need to keep the hard drive in the box I want to copy, put a blank drive in the box I want to copy to, boot from a live CD, and use cat or dd to copy it. I have seen the commands before, but I am thinking this is the only way. Basically I want a duplicate of the drive so I can build a whole new server that is already set up; I will just change the hostname and assign it another public-facing IP. Is this correct? Oh, and the new server will have different hardware; it might even be AMD instead of Intel, different from the source or destination.
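If you go the live-CD route, a straight disk-to-disk copy with dd is a minimal sketch like the one below; the device names are assumptions, so double-check them with fdisk -l first, because swapping if= and of= destroys the source. A cloned install usually boots on different hardware, though things like udev's persistent network device names may need adjusting afterwards.
Code:
# Booted from a live CD, source disk as /dev/sda, blank destination disk as /dev/sdb:
dd if=/dev/sda of=/dev/sdb bs=64K conv=noerror,sync

# Or stream a compressed image over the network to another machine instead:
dd if=/dev/sda bs=64K | gzip -c | ssh user@otherhost 'cat > /tmp/server.img.gz'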
I am backing up my Debian server with rsnapshot, which actually uses rsync to perform the backup. The backups are located on external storage of size 1.4T.
[code]....
I tried to understand what this error message means and found that error code 12 is "Error in rsync protocol data stream". I understand that when rsync finds that a file on the target has changed, it sends only the block(s) that contain the changes, and on the destination rsync creates a new file rather than updating the old one (new inode...). I want to know whether the error I get is due to a full disk or some other factor.
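Error 12 frequently does turn out to be a full destination (or memory pressure / a dropped connection). A quick way to check, assuming the 1.4T external storage is mounted at /mnt/backup (a placeholder path):
Code:
df -h /mnt/backup    # is the filesystem (nearly) full?
df -i /mnt/backup    # or out of inodes?
rsnapshot du         # how much space each snapshot actually uses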
A question that I've asked myself (and Google) many times but never got an answer to. Scenario:
Server A) postfix, primary MX
Server B) postfix, backup MX
According to the RFC, a source mail server can decide where to send email, e.g. send an email to the backup even if the primary is reachable, up and running. So my questions are: 1) How do I replicate messages that arrive on either server to the other, live? 2) How do I synchronise sent messages, message deletion, and folder creation/deletion? Imapsync goes halfway towards the solution, but it needs to be run manually as a one-off and you must specify source and destination. All I really want is for a user to be able to use either server without having to worry about primary/secondary or anything else.
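For what it's worth, imapsync can at least be scripted and run from cron instead of by hand; a hedged one-mailbox example (hostnames, users and passwords are placeholders) looks like the lines below, but it is still a one-way copy per run, not true live replication.
Code:
imapsync --host1 mx1.example.com --user1 alice --password1 'secret1' \
         --host2 mx2.example.com --user2 alice --password2 'secret2'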
I need to back up my active production servers (yeah, it's too late now) with an image-cloning application; they are running RHEL3-5. The problem is I need to run it remotely from my office. Most of the software I found either needs a bootable CD or needs to unmount my partitions, which I am not allowed to do since they are production servers. I also tried dd, but it consumes too much time: it clones sector by sector and includes empty disk space, so the file it creates is also big.
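One compromise people use for remote imaging without unmounting is dd piped through gzip over ssh, after zero-filling the free space so it compresses to almost nothing; this is only a sketch, the hostnames are placeholders, and an image taken from a mounted, changing filesystem is not guaranteed to be perfectly consistent.
Code:
# Optional: fill free space with zeros so it compresses well, then delete the filler
dd if=/dev/zero of=/zerofile bs=1M; rm -f /zerofile
# Stream a compressed image of the whole disk to the office machine over ssh
dd if=/dev/sda bs=1M | gzip -c | ssh backup@office.example.com 'cat > rhel-server.img.gz'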
I switched last summer from Windows (had used it since Windows 95) to Debian. I've been using Debian Jessie for a couple of months now and I'm getting used to it a little.
There are problems here and there, but I can solve them with some reading on the web. Not really a big problem... till now.
I run Debian 8.2 on my PC (PC1). I bought an older PC (PC2) that I want to use as a backup server.
I'm using PC2 only for making backups, after the backup I switch it off again.
So I installed Debian 8.2 (net-install without DE and with SSH) on PC2 and tried to configure it to work as my backup location. I made a public SSH key and exported it to the root account (no problem) and to the user account (sensdeb), but there I got an "Access Denied" error.
I gave the user (sensdeb) sudo rights via the visudo file:
# User privilege specification
root ALL=(ALL:ALL) ALL
sensdeb ALL=(ALL:ALL) ALL
I installed rsync.
The problem is that Rsync only works when I use the root account.
Code:
rsync -r -n -t -v --progress --delete -u -l -H -s /media/Data/Mp3/Anastacia root@192.168.1.102:/test/Mp3
When I try to sync with the normal user account sensdeb
Code:
rsync -r -n -t -v --progress --delete -u -l -H -s /media/Data/Mp3/Anastacia sensdeb@192.168.1.102:/test/Mp3
I get errors: Access Denied.
I don't know how to give the user sensdeb the rights so that I can use that account for my backup tasks. Right now it is possible to sync with the root account, but I have read many times that that is not the way to do it.
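An "Access Denied" from rsync as a normal user usually means either the key was never installed for that account or the destination directory on PC2 is not writable by sensdeb. A minimal sketch of both fixes, using the /test/Mp3 path from the command above:
Code:
# On PC1: install the public key for the sensdeb account on PC2
ssh-copy-id sensdeb@192.168.1.102

# On PC2, once, as root: make the backup destination owned by sensdeb
chown -R sensdeb:sensdeb /test

# After that, the same rsync command run as sensdeb should work without root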
I have a Linux server running Debian 5. I need to get it backed up. I have worked with tar in my past Unix days. I've read that there are some limitations to tar though since it is intended for tape.
My server would need to back up to a Windows server share. From there it will be backed up to an offsite location. Can anyone make a recommendation for what to use? I would like to take a full backup weekly and then an incremental backup daily.
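One low-tech approach that matches "full weekly, incremental daily" is GNU tar's --listed-incremental snapshots written to a CIFS-mounted share; the server name, share, credentials and paths below are placeholders.
Code:
# Mount the Windows share (cifs-utils package)
mount -t cifs //winserver/backups /mnt/winbackup -o username=backupuser,password=secret

# Weekly full backup: remove the snapshot file so tar starts from scratch
rm -f /var/backups/srv.snar
tar --listed-incremental=/var/backups/srv.snar -czf /mnt/winbackup/srv-full.tar.gz /home /etc /var/www

# Daily incremental: same command, keeping the snapshot file, with a dated archive name
tar --listed-incremental=/var/backups/srv.snar -czf /mnt/winbackup/srv-inc-$(date +%F).tar.gz /home /etc /var/www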
This is not a regular backup. I only want to back up selected directories, so personal files (photographs, documents, source code) will be kept safe in case of a total system meltdown. This'll be 15GB max: basically the digital variant of a fire-resistant safe. I looked into duplicity, but that requires me to install gpg keys on the target machine, which I cannot do. I'd rather have a solution that relies on just a working shell account and disk space on the target server.
I thought of writing a simple script to do the following:
1. Mount remote server with sshfs
2. Mount encrypted container at remote server (LUKS, TrueCrypt?)
3. Loop over predefined directories on local machine and copy to encrypted container (rdiff-backup?)
Based on these requirements:
- Target server is "dumb": only ssh access + diskspace (i.e. no installing of gpg keys)
- Encrypted container should grow/shrink to fit contents
- Encrypted container should be easily decryptable on any OS if you have the password
- Once data leaves client server it should be encrypted: sysadmin on target server should never be able to see unencrypted data.
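A hedged sketch of steps 1-3 using a LUKS container file over sshfs with rdiff-backup inside; all paths, the size and the host name are assumptions, and note that a plain LUKS file is fixed-size, so it only approximately meets the grow/shrink requirement (you create it big enough up front or resize it by hand).
Code:
# 1. Mount the remote shell account
sshfs user@remotehost:backup /mnt/remote

# One-time setup: a 20G container file, formatted as LUKS + ext4
dd if=/dev/zero of=/mnt/remote/vault.img bs=1M count=20480
losetup /dev/loop0 /mnt/remote/vault.img
cryptsetup luksFormat /dev/loop0
cryptsetup luksOpen /dev/loop0 vault
mkfs.ext4 /dev/mapper/vault

# 2. Mount the encrypted container (every run)
losetup /dev/loop0 /mnt/remote/vault.img
cryptsetup luksOpen /dev/loop0 vault
mount /dev/mapper/vault /mnt/vault

# 3. Copy the selected directories into it
rdiff-backup /home/me/Documents /mnt/vault/Documents
rdiff-backup /home/me/Photos    /mnt/vault/Photos

# Tear down
umount /mnt/vault; cryptsetup luksClose vault; losetup -d /dev/loop0; fusermount -u /mnt/remote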
I have a live production server with different partitions that I would like to back up/ghost so that if the server crashes I have an easy way to restore it. My server is running Debian Lenny; it has 2 x 2TB hard disks in RAID1.
Filesystem Size Used Avail Use% Mounted on
/dev/sda5 4.6G 1002M 3.4G 23% /
tmpfs 1.7G 0 1.7G 0% /lib/init/rw
udev 10M 744K 9.3M 8% /dev
tmpfs 1.7G 0 1.7G 0% /dev/shm
/dev/sda1 122M 15M 102M 13% /boot
/dev/sda6 4.6G 289M 4.1G 7% /home
/dev/sda8 942M 18M 877M 2% /tmp
/dev/sda9 1.8T 3.9G 1.7T 1% /var
As you can see, total disk usage is about 5GB. I can transfer that to another server on the network. What would be the best way to perform a complete backup without stopping the server, and how could I restore that backup in case of emergency? After browsing the entire web, I couldn't find an answer that would make my life easy, so I'm giving it a try here.
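One common way to take a "good enough" copy of a running system without stopping it is rsync over SSH with the pseudo-filesystems excluded; it is not a crash-consistent image, but for a mostly idle server it restores fine onto freshly prepared disks (boot a live CD, recreate the partitions/RAID, rsync back, reinstall GRUB, adjust fstab). The destination host and path are placeholders.
Code:
rsync -aAXHv --numeric-ids \
  --exclude="/proc/*" --exclude="/sys/*" --exclude="/dev/*" --exclude="/tmp/*" \
  --exclude="/mnt/*" --exclude="/media/*" --exclude="/lost+found" \
  / root@backuphost:/backups/myserver/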
I installed and tested Restore EE Backup Server on a test PC with a basic configuration and it's working fine.
[URL]
The issue I have is: where are these backup snapshots or files being saved? I want to add separate storage to hold the backups.
This script simply deletes files older than a certain age (in this case 7 days) from a certain location; I use it to purge old backups nightly, and it works as expected:
# delete backups older than 7 days
find /mnt/backup/* -mtime +7 -exec rm -Rf {} \;
The problem is, every morning I get an email with an error message something like this:
find: `/mnt/backup/subfolder': No such file or directory
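The error most likely happens because find removes a directory through -exec and then still tries to descend into it. If the intent is simply to delete whole top-level backup directories older than 7 days, restricting find to the starting points avoids the message; a sketch with the same path:
Code:
# Only test the top-level entries matched by the glob, never descend into them
find /mnt/backup/* -maxdepth 0 -mtime +7 -exec rm -Rf {} \;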
Does anyone know of any decent enterprise-level backup solutions for Linux? I need to back up a few servers and a bunch of desktops onto one backup server. Using rsync/tar.gz won't cut it. I need something like bi-monthly full HDD backups and so on, with a nice GUI interface to add/remove systems from the backup list. Basically I need something similar to CommVault or Veritas. Veritas I've used before, but it has its issues, such as leaving 30GB cache files. CommVault, I have no idea how much it costs, or whether it supports backing up to a hard drive rather than tape.
View 7 Replies View RelatedI'm running the current release of Debian with the 2.6.26-2 kernel. This is an upgrade from an older (2.4 kernel) series redhat release. One of the things I had working in the older system was a dns server with accompanying monthly update of the root hints file. I tried working through a dns how-to to set this up again, but it seems much has moved around since I last played with this. All of the files listed in the how-to are not where it says they should be. I am looking for a better reference on keeping the dns server running with current server information.
I am not seeing what I am doing wrong here, but here goes:
From my server I need to run a backup command on 25 remote servers (through a script). I have pushed the public keys for remote SSH connectivity to all of them and it works (I can push files using rsync without needing to enter passwords on the remote servers); however, I need to run the following command:
ssh odsadmin@10.139.111.1 'cp -a /var/www/life /var/www/life-v4'
When I run this command, I keep getting asked to enter the password. I even tried putting sudo in front of the cp, but I still get asked for the password.
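Two things worth checking: whether the key is actually offered for that exact connection (ssh -v shows it), and whether the password prompt is really coming from sudo on the remote side rather than from ssh, since sudo asks for the remote user's password unless told not to. Sketches, using the same user, host and paths as above:
Code:
# Watch which keys ssh offers and why it falls back to a password prompt
ssh -v odsadmin@10.139.111.1 'cp -a /var/www/life /var/www/life-v4'

# If the prompt is from sudo on the remote host, allow that one command without a password
# (added on the remote server with visudo):
#   odsadmin ALL=(root) NOPASSWD: /bin/cp -a /var/www/life /var/www/life-v4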
I am now preparing to upgrade Lenny to Squeeze and decided to make a backup of my system. I used backup-manager to do the job and it worked fine. How do you restore said backup data?
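backup-manager has no separate restore tool: it just writes archives into its repository (by default /var/archives, assuming the stock configuration), so restoring means extracting the right tarball. A sketch; the archive name below is a made-up placeholder.
Code:
ls /var/archives                                          # find the archive for the host/date you need
tar -tzf /var/archives/myhost-etc.20100131.tar.gz         # list its contents first
tar -xzf /var/archives/myhost-etc.20100131.tar.gz -C /    # extract back into place (careful!)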
Right, just a quick question about rsnapshot over sshfs and encfs. I've set up an encfs filesystem, and when it is mounted directly on the remote machine:
Code:
touch foo.bar
Code:
cp -al foo.bar foo.car
Works as one would expect it to.
The same is true on the local machine (the EncFS has external IV chaining disabled). However, when the remote dir is sshfs-mounted on my computer here and then encfs'd to a decrypted mount on my computer, I can move files to it and they go over the network and get encrypted, but:
Code:
cp -al <file> <file>
No longer works, I get 'not implemented' errors...
I thought since I don't have External IV chaining this shouldn't be an issue - I've tried without any of the file chaining options, again to no effect. All work remotely, or with both locally, but not over sshfs. Is this a quirk of sshfs?
I just did my first rsnapshot backup of my /home to an external hard disk. When I am not at my computer for a couple of hours, I always shut it down. Therefore, there are no predictable hours of the day when I know my computer will be running. So, how should I schedule/crontab my rotating rsnapshot backups?
Is anyone using rsnapshot in combination with a schedule which is not based on exact times but rather on the time the computer is running?
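For a machine that is not on around the clock, anacron fits better than fixed cron times: it runs each job once per period and catches up after the machine is switched back on. A sketch of /etc/anacrontab entries for rotating rsnapshot backups (fields: period in days, delay in minutes, job id, command):
Code:
# /etc/anacrontab (excerpt)
1   15  rsnapshot.daily    /usr/bin/rsnapshot daily
7   30  rsnapshot.weekly   /usr/bin/rsnapshot weekly
30  45  rsnapshot.monthly  /usr/bin/rsnapshot monthly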
I have had a VPS for a couple of days. I discovered that no nameserver is running.
# host google.com
Nameserver not running
google.com A record not found, try again
Also there's no resolv.conf in /etc. I re-installed the OS several times without any change. I asked my host about this, but he has not answered my questions.
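As a stop-gap until the host answers, creating /etc/resolv.conf by hand with public resolvers usually gets name resolution working again; the addresses below are Google's public DNS, and the provider's own resolvers would be the better long-term choice.
Code:
cat > /etc/resolv.conf <<'EOF'
nameserver 8.8.8.8
nameserver 8.8.4.4
EOF
host google.com   # should now resolve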
I have a server with a postgres database, apache and a custom java application.
I am trying to run rsnapshot to back up the /home, /etc and /var folders.
But I am running into issues with rsnapshot and permissions, more specifically these kinds of errors:
Code:
When I look at the permissions on these files with ls -la, I get
Code:
The files are owned by the root and postgres users. I am using passwordless login to connect to the server as user XYZ. XYZ has root access to the server and to the database.
These files are all over the place, some in /etc and some in /var/lib for instance. How can I best copy these remaining files?
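One common way to let a non-root login back up root- and postgres-owned files is to have rsnapshot run rsync under sudo on the remote side: the SSH login stays XYZ, but the remote rsync runs as root. A sketch (the sudoers line is added on the server with visudo; the config lines go in rsnapshot.conf and must be TAB-separated). Note that copying PostgreSQL's live data files is not a consistent database backup; a pg_dump taken separately is safer for the database itself.
Code:
# On the server, allow XYZ to run rsync as root without a password (visudo):
#   XYZ ALL=(root) NOPASSWD: /usr/bin/rsync

# In rsnapshot.conf on the backup machine (TABs between fields):
#   rsync_long_args   --delete --numeric-ids --relative --delete-excluded --rsync-path="sudo rsync"
#   backup   XYZ@server:/etc/        server/
#   backup   XYZ@server:/var/lib/    server/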
The server I'm running has Debian Etch, Squid and Shorewall. Every 24 hours the server gets a new internet IP, so I need to use DynDNS to keep the DNS pointing to the correct PC.
I have a web server running behind the Debian server and am having trouble with it. When I enter the web address, I get a timeout.
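A timeout on an inbound connection through Shorewall usually means the port is not being forwarded to the internal machine. Assuming the web server sits in the loc zone at 192.168.1.10 (a placeholder address), a DNAT rule in /etc/shorewall/rules along these lines forwards incoming port 80; reload with "shorewall restart" afterwards.
Code:
# /etc/shorewall/rules (excerpt)
#ACTION   SOURCE   DEST               PROTO   DEST PORT(S)
DNAT      net      loc:192.168.1.10   tcp     80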
I am running VirtualBox on a Mac Mini Server. I would like to install Debian in it and give it an internal static IP address, e.g. 192.168.1.xxx.
I am looking at all the downloads. The Mac Mini Server Snow Leopard has no CD/DVD but plenty of disk space.
It will be used as a LAMP web server with Java, Ghostscript and pdftk, but I will also install ISPConfig on it.
I was looking at this ISO: [URL] Is this ok?
The Mac is a 64-bit Intel-based machine. If I wanted 64-bit in VirtualBox, which 64-bit ISO should I get?
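For the install image, the amd64 netinst ISO matches a 64-bit Intel Mac (VirtualBox just needs a 64-bit guest type selected). For the internal static address, assuming the VM's adapter is bridged and the interface is eth0, /etc/network/interfaces would look something like this; the addresses are placeholders for your LAN.
Code:
# /etc/network/interfaces (excerpt)
auto eth0
iface eth0 inet static
    address 192.168.1.50
    netmask 255.255.255.0
    gateway 192.168.1.1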
What would be necessary to run an ftp server (or a web server) on my local PC so that other people I know could access it and download stuff from it? The idea is to share photos, videos etc with friends/family where the files are a bit too big for email. (All 100% legal, own-content, no copyright issues, needless to say). Security isn't that vital, I'd just put files in the ftp directory, email the link and let them download the files, then remove them again. No passwords are required, and no uploads.
Obviously there's the problem that both computers have to be on at the same time, and I assume I'd have to change my computer's firewall settings and my router's settings to allow the traffic through, but my question is more basic than that - is it even possible? My internet connection is through a router, and as I understand it, my router has the IP address, not my computer. So I can connect through my router using my computer's IP address, but only my router knows my computer's IP address, and all the rest of the internet just sees my router and its IP address. Which means (I think) that I can't just send my IP address for my family to connect to, because that only gets them as far as my router, and the router would have no idea what to do with such requests. Am I right so far?
So is there any way for my family's computers to contact an FTP server or a web server running on my computer? Or does it require some kind of intermediary server to act as a traffic-forwarder? Is there such a thing? I'm assuming that setting up little private torrents would be fiddly and inefficient. Or would it be better/simpler to use one of the free filesharing services and put up with the (sometimes not too family-friendly) adverts associated with them?
Kernel version: 2.6.26-2-686
Java version: 1.6.0_0
I'm going to run a full system update, but I doubt that'll work.
What happens is I turn the computer on, log in, start the application with:
java -Xmx1024M -Xms1024M -jar minecraft_server.jar nogui
and it will work fine for a while, but then, while we're playing, it'll become unresponsive and crash. I then go over to the physical computer, wake the monitor up by pressing the down arrow, delete the "[[B" that appears, and try to issue the application's stop command to save and close it. This doesn't work. The computer registers what I type and registers the return, but nothing happens. I switch over to my second console, log in, find the PID of java, and attempt to kill the java process as root. It accepts the command and gives me another prompt line, but when I switch back to the first console, nothing has happened.
I then switch back and try to issue a shutdown for right now, and it'll get to "Sending processes the SIGTERM signal", and then it waits for them to be killed, but will not progress further than that. I then have to either force the shutdown with the 3-second hold for the power button, or I have to turn off the power bar it's attached to.
Also, if I'm at the prompt and press backspace, which normally gives a system beep, the beep doesn't stop: instead of one short beep, I get a never-ending tone.
The computer does run a little hot; It's in a closet and it's kind of crammed, but it's never so hot that you can't handle the computer.
I use a virtualization environment with VirtualBox and QEMU. Many times I have to format my laptop, and in those situations reinstalling the previous programs and making the same configuration changes is very painful. Is there some way I can make a backup that I just need to apply to a fresh system to get everything reinstalled immediately, without having to install and configure a lot of applications and settings by hand?
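A lightweight way to capture "what was installed and how it was configured" on a Debian/Ubuntu-style system is to save the dpkg package selections plus a tarball of /etc (and whatever else holds your settings), then replay them on the fresh install; the paths below are placeholders.
Code:
# Before wiping: save the package list and the configuration
dpkg --get-selections > packages.list
tar -czf config-backup.tar.gz /etc /home/me/.config

# On the fresh install: replay the selections and restore the configs
dpkg --set-selections < packages.list
apt-get dselect-upgrade
tar -xzf config-backup.tar.gz -C /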
I am trying to run rsnapshot from cron via root's crontab file (crontab -e). If I run rsnapshot from the command line with sudo it works perfectly; however, if I run it from cron:
Code:
* * * * * /usr/bin/rsnapshot hourly >/tmp/crontab.out 2>/tmp/crontab.err
This does not work. The crontab.err file shows:
[Code]....
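The usual culprit is cron's minimal environment: cron jobs get a very short PATH and none of the shell setup an interactive sudo session has, so tools that rsnapshot or its config rely on may simply not be found. Setting PATH in the crontab and calling rsnapshot by full path is a minimal sketch; crontab.err should then show whatever error remains.
Code:
# root's crontab
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
0 * * * * /usr/bin/rsnapshot hourly >/tmp/crontab.out 2>/tmp/crontab.err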
I have downloaded and installed mini-httpd on my Debian 2.6.32 system and it's running on port 80. I have some webpages stored in the /usr/share/mini-httpd/html directory. I am able to access those pages from my machine (i.e. localhost), but I am unable to access them from another machine within the network. I have tried editing the iptables rules, but in vain. The problem is, in my company we use a proxy server to browse the net, and when I try to access the webpage on my machine from another system (by giving myipaddress/webpage.html), it shows an error message saying that the proxy server is refusing connections to the server.
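Three quick checks usually narrow this down: whether mini-httpd is listening on all interfaces rather than only 127.0.0.1, whether the other machine can reach it when the proxy is bypassed, and whether iptables is dropping the traffic. Sketches; replace myipaddress with the machine's LAN address.
Code:
netstat -tlnp | grep :80                              # should show 0.0.0.0:80, not 127.0.0.1:80
curl --noproxy '*' http://myipaddress/webpage.html    # run on the other machine, bypassing the proxy
iptables -L INPUT -n -v                               # look for rules blocking port 80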
I have installed a Linux server in my office to serve 16 machines. Its main use will be as an internal mail server, but it will also be running websites.
I have installed Ubuntu 9.10 server x64 and have got apache running.
I am looking for the simplest, most robust solution for SMTP, POP3 and IMAP. I have only ever used qmail before and found it a pain to configure, and it's getting old, so I thought I should probably try something new. I don't have much experience with running POP3 or IMAP on Linux, so I would love a suggestion on that.