General :: How To Take OS Backup
Apr 12, 2011

How do I take a Linux OS backup? Can I use the tar command for this?
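A minimal sketch of a whole-system tar backup, run as root; the destination path /mnt/backup and the exclude list are assumptions, adjust them for your own layout:
Code:
# Archive / while skipping pseudo-filesystems and the mount point holding the archive.
tar -cvpzf /mnt/backup/os-backup.tar.gz \
    --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/tmp \
    --exclude=/mnt --exclude=/media /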
I am trying to create a backup script that will back up a single folder for a class I am in, and I was wondering if I could get some help. If possible, I would also like to know how to write a script that can encrypt that same file. I will be putting the backup in my /home/usr/Backup directory. I am not trying to back up my whole system, just a single folder. I am using Fedora 11.
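Purely as a sketch (the folder name and passphrase handling are assumptions, not the course's required solution), a script that archives one folder into ~/Backup and encrypts the result with gpg:
Code:
#!/bin/bash
# Back up a single folder and encrypt the archive (assumed paths).
SRC="$HOME/classwork"            # folder to back up (assumption)
DEST="$HOME/Backup"
STAMP=$(date +%Y%m%d)
mkdir -p "$DEST"
tar -czf "$DEST/classwork-$STAMP.tar.gz" "$SRC"
# Symmetric encryption; gpg prompts for a passphrase.
gpg --symmetric --cipher-algo AES256 "$DEST/classwork-$STAMP.tar.gz"
rm "$DEST/classwork-$STAMP.tar.gz"   # keep only the encrypted .gpg file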
I'm just setting up a partition on a separate HDD in my system. I plan to use the partition to back up the important files on my main HDD (to guard against a HD crash).
The question I have is about the typical location to auto-mount this partition. Which of these would it be normal to go for:
1. /backup/
2. /media/backup/
3. /mnt/backup/
4. /home/chris/backup/
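Whichever location is chosen, a minimal /etc/fstab line will auto-mount the partition at boot; here is a sketch using /mnt/backup, with the device name and filesystem type as assumptions:
Code:
# device      mount point    type   options    dump  pass
/dev/sdb1     /mnt/backup    ext4   defaults   0     2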
Can someone give me a sample of a crontab for backing up a directory, please? The system is Ubuntu 9.04.
Quote:
#!/bin/bash
# this file is an automated backup script, backup.sh.
# this backs up my domain site.
[code]....
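Since the quoted backup.sh is truncated here, this is only an illustrative sketch: a one-directory backup script plus the crontab entry that runs it nightly. All paths are assumptions.
Code:
## /home/user/bin/backup.sh (sketch)
#!/bin/bash
# Archive one directory with a date stamp (paths are assumptions).
tar -czf /mnt/backup/site-$(date +%F).tar.gz /var/www/mydomain

## crontab entry (edit with: crontab -e): run the script every night at 02:30
30 2 * * * /home/user/bin/backup.sh >> /home/user/backup.log 2>&1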
I am using Back In Time to back up my home directory to a second HDD that is mounted at /media/backup. The trouble is, I can do this using Back In Time (root), but not using Back In Time without the root option. This is definitely a permissions issue: it can't write to the folder. But when I checked by right-clicking on the backup directory and looking at the permissions tab, it said I was the owner.
I installed and tested Restore EE Backup Server on a test PC with a basic configuration, and it is working fine.
[URL]
The issue I have is: where are these backup snapshots or files being saved? I want to add separate storage to hold the backups.
I have been looking for a while for a complete backup solution for Slackware, something like "Acronis True Image Backup and Recovery" on Windows.
After I spent some time discovering The BIG BANG of the Universe and The Meaning of Life:
I somehow managed to create a script that backs up files on the server, TARs them, FTPs the archive to another FTP server, and then emails the result.
It also measures the time needed to complete, deletes archives older than XX days (set with find -mtime +20), and makes an incremental backup every weekday and a FULL backup on Sundays (which suits me because there is no heavy load then).
The files for TAR to include and exclude are listed in text files, one name per line:
file: including.txt:
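For reference, a hedged sketch of how tar can consume include and exclude lists kept in plain text files like these; the file names and destination are assumptions:
Code:
# Archive only the paths listed in including.txt, skipping anything in excluding.txt.
tar -czpf /mnt/backup/full-$(date +%F).tar.gz \
    --files-from=/root/including.txt \
    --exclude-from=/root/excluding.txt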
This script simply deletes files older than a certain age (in this case 7 days) from a certain location; I use it to purge old backups nightly, and it works as expected:
# delete backups older than 7 days
find /mnt/backup/* -mtime +7 -exec rm -Rf {} \;
The problem is, every morning I get an email with an error message something like this:
find: `/mnt/backup/subfolder': No such file or directory
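A likely cause, offered as a guess: find builds its to-do list while walking the tree, so when rm -Rf removes a subdirectory that find still intends to descend into, this warning appears. Keeping find at the top level avoids it:
Code:
# Delete week-old backups without descending into the directories rm removes.
# -mindepth 1 protects /mnt/backup itself; -maxdepth 1 stops find from recursing.
find /mnt/backup -mindepth 1 -maxdepth 1 -mtime +7 -exec rm -Rf {} \;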
I have a scheduled backup that runs on our server at work, and since 7/12/09 it has been making 592 KB files instead of 10 MB files. In mysql-admin (the GUI tool) I have a stored connection for the user 'backup'; the user has SELECT and LOCK rights on the databases being backed up. I have a backup profile called 'backup_regular', and in the third tab along it is scheduled to back up at 2 in the morning every weekday. If I look at one of the small backup files generated, I see the following:
Code:
-- MySQL Administrator dump 1.4
--
-- ------------------------------------------------------
-- Server version
[code]....
It seems that MySQL can open and write to the file fine; it just can't dump the data.
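To check whether the scheduler or the account is at fault, the same dump can be attempted by hand with mysqldump; a sketch only, with the host and database name as assumptions:
Code:
# Dump one database as the restricted 'backup' user; mysqldump prompts for the password.
mysqldump -h localhost -u backup -p --databases mydatabase > /tmp/mydatabase-test.sql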
I've tried to Google but haven't had much luck. What I would like to do is have a number of folders on my desktop, and their contents, replicated/duplicated into another folder on the same PC in real time. So, for example, if I were to change an OpenOffice document in a specific folder on my Desktop, it would be replicated/duplicated in real time. If I had three folders on my Desktop, A, B and C, they would also appear/be backed up (in real time) in a folder called /home/backup. Can this be done?
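One way this is commonly done is with the inotify-tools package watching for changes and rsync mirroring them; a rough sketch, with the folder names as assumptions:
Code:
#!/bin/bash
# Mirror three Desktop folders into /home/backup whenever anything changes.
# Requires the inotify-tools and rsync packages.
SRC="$HOME/Desktop"
DEST="/home/backup"
while inotifywait -r -e modify,create,delete,move "$SRC/A" "$SRC/B" "$SRC/C"; do
    rsync -a --delete "$SRC/A" "$SRC/B" "$SRC/C" "$DEST/"
done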
I want to make a backup of my email and my Favorites from Mozilla.
But which folders do I have to back up?
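If this means Firefox bookmarks and Thunderbird mail, the profile directories usually hold everything; a hedged sketch (profile locations vary between versions):
Code:
# Firefox bookmarks, history and settings live under ~/.mozilla;
# Thunderbird mail and account settings live under ~/.thunderbird
# (older builds used ~/.mozilla-thunderbird). Archive both:
tar -czf ~/mozilla-backup-$(date +%F).tar.gz ~/.mozilla ~/.thunderbird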
Does anyone know of good backup software for Ubuntu 10.04 that will let me select which folders to back up, rather than doing a complete backup? My install and settings etc. can be replaced, but my photos and memories cannot!
I have installed Ubuntu 11.04 onto an HP EliteBook 8540w notebook and would like to back up the entire disk using some popular backup tool.
I have searched the internet, and the closest tool I found is PartImage. But the bad news is that it does not support the ext4 filesystem!
Does anyone know of any decent enterprise-level backup solutions for Linux? I need to back up a few servers and a bunch of desktops onto one backup server. Using rsync/tar.gz won't cut it. I need bi-monthly full HDD backups and things like that, with a nice GUI to add/remove systems from the backup list. Basically, I need something similar to CommVault or Veritas. Veritas I've used before, but it has its issues, such as leaving 30 GB cache files. CommVault, I have no idea how much it costs, or whether it supports backing up to a hard drive rather than tape.
I am now preparing to upgrade lenny to squeeze and decided to do a backup of my system. I used backup-manager to do the job and it worked fine. How do you restore said backup data?
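If backup-manager was left on its default tarball method, the backups are ordinary archives and restoring is just extracting them; a sketch, with the repository path and archive name as assumptions:
Code:
# See what one of the archives contains (the default repository is often /var/archives).
tar -tzf /var/archives/myhost-etc.20110412.tar.gz
# Extract it back into place, preserving permissions (run as root).
cd / && tar -xzpf /var/archives/myhost-etc.20110412.tar.gz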
When I back up with Deja Dup, does it back up only the folders, or the OS as well?
In case of a real crash, can I bring the system back to its pre-crash state, including the OS, using the files Deja Dup backed up?
I have used the backup-manager tool very often, but I recently ran into some problems and went to its official site, backup-manager.org, to search for answers. But the site has not been reachable for more than a week!
url
url
url
Right now the DNS records don't have an A record:
Code:
$host backup-manager.org
backup-manager.org mail is handled by 10 private.sukria.net.
backup-manager.org mail is handled by 15 private.nxr.fr.
backup-manager.org mail is handled by 5 mx.sukria.net.
backup-manager.org mail is handled by 10 jupiter.unix-scripts.info.
Has this project moved, been renamed, or died?
Maybe it changed its domain address?
Or is this only a temporary problem with the hosting or the domain?
I have installed an application manager (a monitoring application) on my Linux server. Now I need a backup schedule for my application. The application itself has an executable file to back up the database. But when I put this file in my crontab to schedule the backup program, it won't run!
50 09 * * * root /opt/ME/AppManager9/bin/BackupMysqlDB.sh
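One likely reason, offered as a guess: the user-name field ('root') is only valid in the system-wide /etc/crontab or files under /etc/cron.d, not in a personal crontab edited with crontab -e. A sketch of both valid forms:
Code:
# In root's own crontab (crontab -e as root): no user field.
50 09 * * * /opt/ME/AppManager9/bin/BackupMysqlDB.sh >> /var/log/appbackup.log 2>&1

# In /etc/crontab or a file under /etc/cron.d/: keep the user field.
50 09 * * * root /opt/ME/AppManager9/bin/BackupMysqlDB.sh >> /var/log/appbackup.log 2>&1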
I know how to back up a filesystem with tar and restore it later, but I'd like to know how to do that with a remote system. I'm really terrible at all the command-piping stuff, and I can't simply adapt the tutorials to my needs. What I'd like to do is log in as root@remotemachine and issue a tar command on the entire / directory that saves the resulting archive to my local machine. And then I'd like to do it backwards (restore). I've seen some commands on the net... something like:
ssh -something root@remotemachine "cd /; tar -cpf - ." | tar -xf -
I don't remember it exactly, but I know that this command copied the filesystem over to my computer, and I don't know how to change it to create an archive instead.
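A hedged sketch of the archive variant: rather than piping into a local tar -x, redirect the stream straight into a file, and reverse the direction to restore. The exclude list and file names are assumptions:
Code:
# Back up: tar the remote / and write the compressed stream to a local archive file.
ssh root@remotemachine "cd / && tar -cpzf - --exclude=/proc --exclude=/sys ." > remote-root.tar.gz

# Restore: stream the local archive back and unpack it on the remote side.
ssh root@remotemachine "cd / && tar -xpzf -" < remote-root.tar.gz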
I am running Ubuntu 9.10 GNOME. I have twin hard drives in my system. I use rsync to automatically back up my home directory on the first drive to the second drive every day. This has been working well for almost a year. My first drive is 85% full, but suddenly my second drive is 100% full... in fact, rsync errored out after filling the second drive.
I erased the second drive, and using Nautilus I copied the section of my first drive that I back up over to the second drive (minus a few large files). When it was finished, the second drive was 99% full. As near as I can tell, my backup has about 30 GB of extra information, but I have been unable to determine what the extra data is! I am no longer able to do backups until this gets resolved. Does anyone have any idea what this could be, and how to discover where it is coming from, so that it can be rectified?
I thought maybe some link had somehow gotten created and my backups are following the link and copying the same info twice, or something like that, but I haven't been able to figure out how to list all the links in my file structure with some kind of search.
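Two quick checks that might narrow this down (a sketch; the mount points are assumptions): compare per-directory usage on the two drives, and list any symbolic links inside the tree being copied:
Code:
# Compare where the space is going, one directory level at a time.
du -h --max-depth=1 /home/user
du -h --max-depth=1 /media/backup/user

# List every symbolic link under the source tree (links a copy may be following).
find /home/user -type l -ls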
I was working in a shell and got some weird exceptions in my program. Just as a reference, I want to save everything that is there on my shell to a text file. I don't just want the command history, but also all the results that those commands produced at the shell. Is there some built-in utility to do this? I have kept the shell open for now so that I can take the backup. Also, I am using xterm and it does not allow selecting all the way up to the top of the shell, so the last resort is to take the backup one screen at a time.
I've tested some backup software, but I probably didn't remove one of them, and I now have some backups here which I don't want:
duplicity-full.20100728T171159Z.manifest
duplicity-full.20100728T171159Z.vol1.difftar.gz
duplicity-full.20100809T221549Z.manifest
Which software may have made them?
Machine A holds important source data that needs to be backed up. Machine B is where the data will be backed up to. They're both Ubuntu. I want an automated process that allows machine A to create an encrypted tarball and copy it to machine B, without human intervention, but that doesn't allow anything except copying the file over to a target directory. I thought about using a chroot-jailed account, but this seems to be a pain to set up and overly complex. I really just want machine A to copy files to machine B via an automatic cron job, but I also want to prevent the mechanism that allows the copy from allowing any other actions (copying to any other directory, logging in, executing any other commands non-interactively, etc.). Also, the transfer must be encrypted (e.g. using ssh somehow).
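One common approach, sketched here with every name and path as an assumption: give machine A a dedicated SSH key, and on machine B restrict that key in authorized_keys to a single forced command that can only write into the target directory.
Code:
# On machine A: a passwordless key used only for backups.
ssh-keygen -t rsa -f ~/.ssh/backup_key -N ""

# On machine B, in the backup user's ~/.ssh/authorized_keys, pin the key to one command
# (all on one line, options before the key material):
# command="cat > /srv/backups/incoming/machineA.tar.gz.gpg",no-pty,no-port-forwarding,no-X11-forwarding ssh-rsa AAAA... backup@machineA

# On machine A (cron job): build the tarball, encrypt with gpg, stream it over ssh.
# With GnuPG 2.1+ the passphrase file may also need --pinentry-mode loopback.
tar -czf - /srv/important \
  | gpg --symmetric --batch --passphrase-file /root/.backup_pass -o - \
  | ssh -i ~/.ssh/backup_key backupuser@machineB
The forced command means that even if the key leaks, it can only overwrite that one file on machine B; anything fancier (dated file names, rotation) is a refinement beyond this sketch.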
I just wanted confirmation that what I've done should work. Here's the deal: I want to make a backup of my entire installed Linux system using tar and gzip; tar to keep it all nice and ordered and to preserve the permissions, and gzip to compress the resulting file.
Now down to business: I have booted into an Arch Linux install CD so as to have a command line separate from that of my standard installed system. My HDD layout (as mounted in my installed system) is like this:
Code:
/dev/sda1 /boot
/dev/sda2 swap
/dev/sda3 /
/dev/sda4 /home
/dev/sdb1 (unknown partition on external HDD left over from fiddling around)
/dev/sdb2 main storage on external HDD
[Code]...
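For comparison, a hedged sketch of how this kind of backup is often done from a live CD: mount the installed system's partitions under one directory, then tar them from outside, writing the archive onto the external drive. The mount points are assumptions; the device names follow the layout quoted above.
Code:
# Mount the installed system and the external drive from the live environment.
mkdir -p /mnt/system /mnt/storage
mount /dev/sda3 /mnt/system          # root filesystem
mount /dev/sda1 /mnt/system/boot     # /boot
mount /dev/sda4 /mnt/system/home     # /home
mount /dev/sdb2 /mnt/storage         # external HDD that will hold the archive

# One compressed, permission-preserving archive of the whole installed system.
cd /mnt/system
tar -cpzf /mnt/storage/system-backup-$(date +%F).tar.gz .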
HDA1 is going to die; SMART says it's well beyond dead, and I can tell the read/write times have gone up drastically.
I know I can use dd to copy it from a live CD.
1.) If I dd it to something.img, is there a way for me to "work" on the backup copy? Basically, I need to back up ASAP, but I won't have a replacement drive for a while, so I'll need to be able to put changes into the backup.
2.) When it comes time to restore, if I dd it to a larger drive, is there a way to take advantage of all the extra space? I assume I can dd it over and the MBR and partition table will go with it, so I can use something like GParted to grow the partition to the drive size?
Or is there a better way to make a backup?
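On point 1, a rough sketch (device and partition names are assumptions): a whole-disk dd image can later be attached as a loop device and the partitions inside it mounted read-write, so the copy can be kept current until the replacement drive arrives.
Code:
# Image the failing disk from a live CD; keep going over read errors.
dd if=/dev/hda of=/mnt/storage/hda.img bs=64k conv=noerror,sync

# Later: expose the partitions inside the image and mount one of them read-write.
losetup -f --show /mnt/storage/hda.img    # prints the loop device, e.g. /dev/loop0
kpartx -av /dev/loop0                     # creates /dev/mapper/loop0p1, loop0p2, ...
mount /dev/mapper/loop0p1 /mnt/image
On point 2, after dd-ing the image onto a larger drive the partition table does come along with it, so a tool like GParted can grow the last partition into the extra space.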
I was running grsync (an rsync GUI) to do a backup of my root and home partitions to a local external HDD. Home is currently 21.2 GB and root is about 60 GB, but the backup ran for nearly 24 hours before I canceled it, without finishing. How long should an 80-ish GB backup take?
I made sure to disconnect other externals so they wouldn't be backed up as well. It was just my root and home being backed up.
What are the options to back up data to CD/DVD from the command-line interface? I don't want to use K3b or Nero Linux; I want to back up using a "tar"-like utility. Please suggest good articles or options. If we can use tar itself to back up data to CD, then how? For example, I want to back up my /home directory to a DVD. I don't want to use GUI tools for this.
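One common command-line route, sketched with the device name and sizes as assumptions (the archive has to fit on a single disc): tar the directory, wrap the tarball in an ISO image with genisoimage, and burn it with growisofs from the dvd+rw-tools package.
Code:
# 1. Archive /home into a compressed tarball.
tar -czf /tmp/home-backup.tar.gz /home

# 2. Wrap the tarball in an ISO 9660 image.
genisoimage -r -J -o /tmp/home-backup.iso /tmp/home-backup.tar.gz

# 3. Burn the image to the DVD writer (device name is an assumption).
growisofs -dvd-compat -Z /dev/dvd=/tmp/home-backup.iso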
My laptop has a 320 GB HDD which is dual-booted with Windows 7 and Linux Mint. If I remember correctly, I don't think I am using the entire 320 GB of the HDD, maybe around 200 GB. If I use a dd command and back it up to an external 500 GB HDD, will it back up the entire 320 GB or only the data, which is about 200 GB?
I want a full backup of my HDD, so that in case the HDD in the laptop fails I can mirror it to a new HDD from the external one.
I'm looking for a GUI-based tool to back up and restore GRUB to/from another hard drive, a CD, or a USB stick. I have a few Linux distros on one hard drive.