Software :: Ubuntu 9.04 Server - Finding Utility To Backup Entire HDD?
Dec 26, 2009
I'm running ubuntu 9.04 64-bit server and am looking to backup my entire OS drive. I've got a 200GB main drive, and a 1TB storage drive mounted at /storage. I'm already good as far as setting up backups of my data - but redoing all of my settings and software would be a nightmare in the event of a HD failure.
So what I'm looking for is a command-line utility to make an image of the main 200GB drive to an external USB drive. The software needs to work similarly to the Windows Vista/7 System Image utility or DriveImage XML and be able to make the image without shutting down. The best I've found so far is [URL], but it uses a GUI and doesn't support large files.
I've searched some older posts and they said to use partimage, but that program doesn't support ext4 file systems. Here's the original post: [URL]. So how would someone back up their entire ext4 partition so the owner can mess around with some graphics drivers?
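For a raw image of the whole OS disk, dd is the usual command-line fallback and does not care about the filesystem type. A minimal sketch, assuming the 200GB OS disk is /dev/sda and the external USB drive is mounted at /media/usb (both names are assumptions):

# raw, compressed image of the whole disk
sudo dd if=/dev/sda bs=4M conv=sync,noerror | gzip -c > /media/usb/os-disk.img.gz
# restore later with:
# gunzip -c /media/usb/os-disk.img.gz | sudo dd of=/dev/sda bs=4M

Keep in mind that imaging a mounted, running root filesystem can produce an inconsistent image; it is safest done from a live CD, or with services quiesced, if truly online imaging is required.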
Custdistro means it will back up everything except /home, and customback means it will back up everything in /. It can create an ISO if your backup is less than 4 GB. Well, I've made a lot of changes to Ubuntu 10.04 and now I love it! It does everything I want from a computer. This took me a lot of time, following several tutorials and destroying the entire system a couple of times. That last part is a big problem, because restoring my system to the state it was in before I broke everything takes a long time again. Do any of you guys know how to back up all my system settings, programs and files, so that if I corrupt my system again I can restore it to exactly its current state?
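One commonly suggested command-line approach for this is a full-system tar archive that skips the virtual and removable filesystems. A rough sketch, assuming the archive is written to an external drive mounted at /media/backup (the path is an assumption):

# archive the whole root filesystem, preserving permissions
sudo tar -cvpzf /media/backup/full-system.tar.gz \
    --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/tmp \
    --exclude=/mnt --exclude=/media --exclude=/lost+found /

The excluded directories have to be recreated (empty) when the archive is unpacked on a fresh disk.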
I have spent considerable time installing and getting my Ubuntu 10.04 LTS (Lucid Lynx) to where I want it. I am looking for something that would back up my entire hard drive to a CD/DVD (preferably bootable), so that if I crash or want to clone to another hard drive, I could simply restore from the CD/DVD onto an old or new hard drive and be back in business without a lot of hassle.
I want to back up my entire hard drive, and I assume the easiest way to do it is using Clonezilla.
Clonezilla makes an image file, but how do you get the image onto multiple DVDs? When burning the image file, does K3b let you "change the full DVD" and add another disc?
In other words, any hard drive I have (already filled with 79 GB) is going to make an image bigger than anything that can fit onto a single DVD.
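If you end up with one image file larger than a DVD, the generic split utility can cut it into DVD-sized pieces that K3b can burn as ordinary data discs; the pieces are joined again with cat before restoring. A sketch with assumed file names:

split -b 4300M sda-image.img sda-image.img.part-
# later, to reassemble:
cat sda-image.img.part-* > sda-image.img

If I recall Clonezilla's options correctly, it can also split its images into fixed-size volumes at creation time, which avoids this extra step.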
How many of you guys use Back In Time as your backup utility? I tried using it, and it doesn't copy all of the folder contents to the backup drive in one pass. For example, it will copy 26 out of roughly 80 GB of data. To finish the backup, I have to hit the "Take a snapshot" button to do another pass and add more data to the snapshots. I have to do this a couple of times to get all the data. Does anyone else have this issue?
[UPDATE] It appears to copy all of the files at once as long as you only select one backup location at a time. I was backing up an entire multimedia drive, my home directory, and my USB drive. When I set it to do only the multimedia drive, it copied all of the files, whereas it wouldn't when I had it set up to back up all three locations at the same time. I guess the lesson here is to back up one location, then add another, take another snapshot, and repeat.
I have a computer running Ubuntu 9.10 as a server (standard Ubuntu, not Ubuntu Server Edition). I have four 1TB hard drives, three of which I want to back up on certain days of the week. I have tried using luckyBackup and rsync, but neither seems able to handle the amount of data (currently about 400GB). Does anyone know of a program that can run scheduled backups of this size?
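In case the earlier attempts were through a GUI front end, a plain rsync call scheduled with cron is the minimal setup worth trying for data of this size. A sketch added with crontab -e, assuming the source drives are mounted at /mnt/data1, /mnt/data2 and the target at /mnt/backup (all paths are assumptions):

# run at 02:00 on Monday and Thursday
0 2 * * 1,4  rsync -a --delete /mnt/data1/ /mnt/backup/data1/
# run at 02:00 on Tuesday and Friday
0 2 * * 2,5  rsync -a --delete /mnt/data2/ /mnt/backup/data2/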
Can anyone recommend a good native backup utility for 10.04? I would like compression, the ability to image partitions and/or drives, and a simple way to restore in the event of total drive failure. A nice incremental backup facility would be good too. I would be backing up to an external USB hard drive that is smaller than the source drive, so compression and the ability to choose what to back up and what to leave out are needed.
I have read that I can back up the entire system, including the home folder, with commands or with programs such as Clonezilla, but that doesn't work for me, so I'm trying to back it up with commands now, and I can't find a good tutorial explaining which commands to use.
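For the restore side of a tar-based full backup (an archive like the one sketched earlier), the usual procedure is to boot a live CD, mount the target root partition, unpack the archive into it, and recreate the directories that were excluded. A rough sketch, assuming the target partition is /dev/sda1 and the archive sits on an external drive at /media/backup (both assumptions):

sudo mount /dev/sda1 /mnt
sudo tar -xvpzf /media/backup/full-system.tar.gz -C /mnt --numeric-owner
sudo mkdir -p /mnt/proc /mnt/sys /mnt/dev /mnt/tmp /mnt/mnt /mnt/media

If the disk itself was replaced, the bootloader also needs to be reinstalled (grub-install from a chroot into /mnt).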
I have Ubuntu 9.10 on my system and I have a lot of apps on it. Is there a way that, in case of a full re-installation or hard disk replacement, I could have all my software and settings installed on the new Ubuntu installation?
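For the installed software, one common trick is to save the package selection list and replay it on the new installation, then copy the home directory over for personal settings. A sketch (file names are arbitrary):

dpkg --get-selections > ~/package-selections.txt
# on the fresh install:
sudo dpkg --set-selections < ~/package-selections.txt
sudo apt-get dselect-upgrade

This does not capture configuration outside /home or /etc, so it complements rather than replaces a full disk image.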
I need to transfer a 4GB file from my Linux netbook to a friend's WinXP desktop, and I'd like to do it with a USB flash drive, but it can't handle a file larger than 2GB, a limitation due to the underlying FAT32 filesystem. I don't wish to reformat my USB drive as ext3 either.
So I need to split my 4GByte file into smaller chunks. And the 'split' utility needs to be available on both Linux and the WinXP operating systems.
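split only has to run on the Linux side; the WinXP machine just needs to reassemble the pieces, which the built-in copy command can do. A sketch with an assumed file name:

split -b 1900M bigfile.dat bigfile.part.
# reassemble on Linux:
cat bigfile.part.* > bigfile.dat
# reassemble on Windows XP (cmd.exe):
copy /b bigfile.part.aa + bigfile.part.ab + bigfile.part.ac bigfile.dat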
I started using Ubuntu after installing it without any problems. Then I downloaded the latest updates, etc. But after installing a display driver, my computer freezes, refusing to go beyond the welcome screen. I tried several things but could not solve the problem, so I decided to reinstall Ubuntu, download the updates again, reconfigure my settings, and so on. Therefore, I'd like to know whether there is any application to back up or create a full image of a hard disk, so as to avoid the long hours of re-installation.
Sometimes I need to copy a huge directory to another directory (local filesystem), and usually I will use the cp or rsync commands. These commands are good, but depending on the size of the data being copied, the copy is painfully slow. I realize we are limited by the hardware and its limitations, i.e. I/O speed, and by the filesystem (which is usually ext3). Are there any other utilities, perhaps not well known, that can handle copying large amounts of data (mostly in the TB range)?
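One alternative that occasionally gets suggested for large local trees is a tar pipe, which avoids some per-file overhead, though in practice throughput is usually bound by the disks rather than by the copy tool. A sketch with assumed paths:

tar -cpf - -C /source/dir . | tar -xpf - -C /dest/dir

For terabyte-range copies between identical filesystems, block-level tools (dd of an unmounted partition) can also beat any file-level copy.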
I just got a 2TB drive with the intention of backing up multiple Ubuntu machines to it. What would be the best way to do this, keeping ease of restoration in mind? Should I just copy each drive image to the BU drive, or use a utility like Back in Time?
I got myself a Dell laptop from the local computer store. It's a used machine with Windows Vista Home Basic on it. I want to load Ubuntu Desktop 10.10, though, so I can do Perl development. But I want to keep a copy of the entire hard drive, with the Dell utility partition and Windows Vista, in case I want to go back. I was thinking I could image the drive, but I'm not sure what to use; I don't have Ghost or anything. Someone told me about Clonezilla. Would that work for me? Is it hard to use? I also want to burn the data to a DVD or something more storable than a hard disk.
Does the dump command back up entire file-systems or is it capable of backing up subsets of a file-system? And is tar capable of taking device names (for file systems) as input to be archived?
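As I understand the Linux dump man page, dump is filesystem-oriented: you normally point it at a device or mount point for a whole filesystem, although it can also be given a list of files and directories as a subset (with some restrictions, e.g. -u no longer applies). tar, by contrast, archives paths within a mounted filesystem; giving it a device name archives the device node itself, not the filesystem contents. A sketch with assumed names:

# level-0 dump of the whole filesystem on /dev/sda3, updating /etc/dumpdates
sudo dump -0u -f /backup/home.dump /dev/sda3
# tar of the same data via its mount point
sudo tar -cpzf /backup/home.tar.gz -C / home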
I want to back up data and upload it to online hosting services. Since I'm uploading stuff online, I only want to upload encrypted data (so that even the hosting service admins cannot look at it). Thus, I first want to encrypt locally the data that I want to back up. Since I will be making changes to the data locally, I want some sort of incremental imaging system where the incremental changes are stored in separate files, so that I only have to upload the incremental encrypted changes.
Duplicity is an option, but it uses GPG, which makes it a bit complicated; and I was wondering if there was any alternative which was simpler as I am only doing the encryption and backup locally.
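If the only objection to Duplicity is GPG key management, note that when no --encrypt-key is given it uses symmetric encryption and only needs a passphrase, which keeps things simple for a purely local target. A sketch with assumed paths:

export PASSPHRASE='your-backup-passphrase'   # assumption: passphrase supplied via the environment
duplicity /home/me/data file:///mnt/backup/data-encrypted
# running the same command again produces incremental volumes only

The encrypted volume files it writes under /mnt/backup/data-encrypted are what you would then upload.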
I'm trying to use wget to retrieve some data from our tape backup utility (HP Command View 1/8 G2 Autoloader). The URL requires two parameters for the info I want to retrieve. I have searched for a few hours and have tried numerous combinations, but the parameters aren't being passed. I have escaped the URL as well.
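One common cause of "lost" parameters is an unquoted & in the URL: the shell treats everything after it as a separate background command, so only the first parameter ever reaches wget. Quoting the whole URL usually fixes it (the host and parameter names below are made up):

wget "http://autoloader.example.com/inventory.cgi?param1=value1&param2=value2"
# if the page actually expects a POST:
wget --post-data "param1=value1&param2=value2" "http://autoloader.example.com/inventory.cgi"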
I am working from a laptop where all my work is stored on an 80GB drive. I am now also the owner of an external 250GB USB hard drive, formatted with FAT32. I want to keep it FAT32 so that I can offer some of my files to people who run Mac OS or Windows, without making them install an ext3 driver for Windows and whatnot. I need a strategy that will let me keep a mirror of my laptop drive on the new external drive, i.e. no history or versioning required. However, I do care about file permissions. The files don't have to be stored as-is; they can be stored inside a large (80GB?) tar file, which is fine. It would be easier to persuade people to open a .tar file than to install an ext3 driver for their OS, I suppose. I don't think I can keep file permissions otherwise, can I?
I have previously used a self-written sh script that used rsync to keep an up-to-date copy of my laptop filesystem on a USB flash drive, but in that case I had the flash drive formatted with ext3, so no problem with file permissions there. This time, it's trickier.
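Since FAT32 tops out at roughly 4 GB per file, one way to keep a permission-preserving copy on that drive is to stream a tar archive through split so no single piece hits the limit. A sketch, assuming the laptop home directory is /home/me and the external drive is mounted at /media/usb250 (both assumptions):

tar -cpzf - -C /home me | split -b 3900M - /media/usb250/laptop-home.tar.gz.part.
# restore:
cat /media/usb250/laptop-home.tar.gz.part.* | tar -xpzf - -C /restore/target

Ownership and permissions survive inside the tar stream even though FAT32 itself cannot store them; the trade-off is that this is a full dump each time rather than an rsync-style incremental mirror.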
I want to back up data and upload it to online hosting services. I first want to encrypt locally the data that I want to back up. Since I will be making changes to the data locally, I want some sort of incremental imaging system where the incremental changes are stored in separate files, so that I only have to upload the incremental encrypted changes. Duplicity is an option, but it uses GPG, which makes it a bit complicated, and I was wondering whether there is a simpler alternative, as I am only doing the encryption and backup locally.
EDIT: I have only ONE computer, on which the data resides and on which the backup image is made. That is, I have a directory foo on my computer, and its backup will be made to back-foo on the same computer. I want back-foo to be in an encrypted form. Then back-foo will be uploaded as-is to Microsoft Live storage, SpiderOak storage, etc. Since back-foo is encrypted, my upload is secure. And since I'm uploading, I want incremental backup support; that is, the backup utility should create new files containing the incremental changes so that I can upload only those new files.
I have a situation where a directory has about 1.5 million files in it. On an hourly basis, I want to be able to find any files that have changed in the last hour, compress them, encrypt them and then copy them to both a local backup machine and an off site backup.
Is there any kind of utility or kernel module that creates some type of log of modified files? I know I can use find, but the search for -mtime in this directory takes quite a while and will not suffice for an hourly backup.
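The kernel-level answer to this is inotify; the inotifywait tool from the inotify-tools package can log changed paths as they happen instead of scanning 1.5 million files every hour. A sketch with assumed paths:

# log every file written, created or moved into the tree
inotifywait -m -r -e close_write,create,moved_to --format '%w%f' /data/bigdir >> /var/log/changed-files.log &

Caveat: recursive watches on a tree this size need fs.inotify.max_user_watches raised well above its default, and establishing the watches takes noticeable time and memory at startup. A lighter alternative is to touch a timestamp file after each run and use find /data/bigdir -newer /path/to/stampfile on the next one.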
What is the best Linux Mint backup tool that is most like Time Machine (that ships on Macs)?
The one thing that I want it to have similar to Time Machine is that it only backs up files that have been changed, therefore making for faster backups.
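The mechanism behind Time Machine style tools (and behind Back In Time and rsnapshot on Linux) is hard-linked snapshots: unchanged files are hard links back to the previous snapshot, so only changed files are actually copied. A bare-bones sketch with assumed paths and dates:

rsync -a --delete --link-dest=/mnt/backup/latest /home/ /mnt/backup/2011-06-01/
ln -snf /mnt/backup/2011-06-01 /mnt/backup/latest

Each dated directory looks like a full copy but only consumes space for what changed since the previous run.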
I am running Live 12 from the CD-ROM drive of my dying laptop. I have a major Windows registry error on that system and am working to recover my files. As a test, I have successfully moved a couple of folders from the laptop to my Seagate FreeAgent drive. What I would like to know is: is there a way to copy my files and folders without literally dragging and dropping each one? We're talking 140 GB of folders... sigh.
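From the live session, a single rsync invocation can copy the whole tree in one go and can simply be re-run to resume if the dying drive stalls. A sketch, assuming the old Windows disk is mounted at /media/old-disk and the Seagate drive at /media/FreeAgent (both assumptions):

rsync -avh --progress /media/old-disk/ /media/FreeAgent/laptop-rescue/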
The cpuid utility is not compiled for Ubuntu 9.04, and it is not available as a package in Synaptic; other distributions have it available as an RPM. [URL]
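If no Ubuntu package exists, one workaround worth trying is converting another distribution's RPM with alien (the RPM file name below is made up):

sudo apt-get install alien
sudo alien -i cpuid-20090708-1.i386.rpm

Building cpuid from its upstream source tarball is the other usual route.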