As a matter of routine, I use dd to collect my hard drives' master boot records, and I save the resulting file someplace where I can always get to it if I have to. It has occasionally saved me when some kind of disaster has struck and damaged a partition table or rendered a drive unbootable. Since I have switched to a fully encrypted system, I have been wondering exactly how to save the dm-crypt information. dm-crypt works with a standard filesystem, and the filesystem itself isn't encrypted, only the contents. Thus dm-crypt must write a header someplace that includes all the information needed to decrypt the contents of the partition, including the key and the type of encryption employed.
Should something happen that causes that header to become corrupted, the entire partition is inaccessible and no recovery tool will work. So it would seem like a really good idea to use dd to copy that header someplace safe. Now, I haven't taken the time to sit down, read the dm-crypt code, and try to figure out how to do it from that. The command cryptsetup luksDump doesn't do what I want: it dumps that information, but not in a format that would be immediately useful for copying the information back to a damaged partition. And every howto I've found tells me how to set up dm-crypt, encrypt partitions, and so forth. None tells me how to recover this information, other than mentioning that it is stored in the first few sectors of the partition.
Looking at some of this through a hex editor, I *think* that saving the first 512 bytes of the partition gives me what I want, but I'm not positive of that, and I hesitate to depend on any putative backup scheme of this sort that I haven't vetted. I want to save no more than I need, and I certainly don't want to save any less than what I need.

Edit: As I think about it, the first 512 bytes can't be enough, because LUKS/dm-crypt permits multiple keys. So how much information do I need to save?
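For what it's worth, here is the kind of thing I have in mind, demonstrated on a scratch file rather than a real partition. The 2 MiB figure is the typical LUKS1 header-plus-keyslots area (cryptsetup luksDump reports the exact "Payload offset" for a given partition, and newer cryptsetup versions have a luksHeaderBackup command that does this properly):

```shell
# Scratch file standing in for the encrypted partition; on a real system the
# input would be the partition device (e.g. /dev/sda2) and you would run as root.
dd if=/dev/zero of=fake_partition.img bs=1M count=4 2>/dev/null
# LUKS1 typically reserves the header plus all key slots in the first 2 MiB
# (a payload offset of 4096 512-byte sectors), so save at least that much:
dd if=fake_partition.img of=luks-header.img bs=512 count=4096 2>/dev/null
ls -l luks-header.img
```

Restoring would be the same dd with if= and of= swapped. Check your own partition's payload offset before trusting the 2 MiB figure.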
I have a 1 TB USB external disk that I encrypted with cryptsetup. Now I no longer want it encrypted. How can I remove dm-crypt from my external disk without losing data?
I've been looking for a tweak that would allow me to store temp and log files in RAM. I've found a few that involved editing the fstab file, but they were either Ubuntu articles or they were over a year old and perhaps didn't apply to Fedora 14. I hear there is also a 'noatime' mount option that can help speed things up by telling the kernel not to record when files are accessed. What are the advantages and disadvantages of noatime, and of writing log/tmp files to RAM?
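For reference, the fstab approach I've seen looks roughly like this (the device name, size, and the ext4 line are illustrative, not from any particular Fedora guide):

```
# Keep /tmp in RAM; "size" caps how much RAM the tmpfs may grow to.
tmpfs      /tmp   tmpfs  defaults,noatime,mode=1777,size=512m  0 0
# noatime on an on-disk filesystem stops the access-time write that
# otherwise accompanies every file read.
/dev/sda1  /      ext4   defaults,noatime                      1 1
```

The obvious downside of a RAM-backed /tmp or log directory is that its contents vanish on reboot or crash, which is exactly when you may want the logs.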
Why can't Ubuntu store configuration information in a way that wouldn't need to be clobbered in order to apply security updates? For example, this morning's updates told me I had to choose between using a new version of smb.conf that's part of the security patches, sticking with the old version, or letting the update installer merge them. Of course, the first two choices have obvious drawbacks: why should I have to choose between losing all my SMB settings and refusing security fixes? So I chose the merge, which came back with "Conflicts found during three-way merge! edit `/etc/samba/smb.conf' and sort them out manually."
I searched using my user name and did not find the post made about this problem. The search by user name still does not return the first post, or this one.
We've had a site broken into, and several of the desktop computers were physically stolen. The Ubuntu 9.10 router/gateway/firewall/web-filter box, however, was NOT stolen. I'm wondering if there is any information we can get from it that would help the police.
NAT and firewalling are handled by firehol. It runs a DHCP server to provide the desktops with IP addresses. It runs a Samba server with some file shares. It runs Squid and Dansguardian in an intercepting-proxy configuration. Of particular interest might be whether the MAC addresses of the stolen desktops can be obtained, which might help with tracking them down. Also anything to narrow down the time of the break-in.
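For what it's worth, a DHCP server's lease file usually records each client's MAC address along with lease timestamps, which touches on both questions. A sketch (the lease block below is made up for illustration; on the router the live file is typically /var/lib/dhcp3/dhcpd.leases, though the path varies by release):

```shell
# Sample lease block of the kind dhcpd writes; the real file would be read
# straight off the router rather than constructed like this.
cat > sample.leases <<'EOF'
lease 192.168.1.120 {
  starts 4 2011/01/13 09:00:00;
  hardware ethernet 00:1a:2b:3c:4d:5e;
  client-hostname "office-desktop-3";
}
EOF
# Pull out the IP, MAC and hostname lines for every lease on record:
grep -E 'lease |hardware ethernet|client-hostname' sample.leases
```

Squid and Dansguardian access logs are timestamped too, so the last logged request from each desktop may help narrow down the time of the break-in.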
I want to set up a computer lab running Fedora 11, with one system acting as a server to store the students' login information. When students write programs (e.g., C++ programs), the files should be saved on the local Fedora system, but logins should be validated by the server.
This may be a stupid (?) question, but does anyone know of a patch for sudo that allows the sudoers information to be pulled from MySQL? I run multiple servers with multiple people working on them and would like a one-stop update of permissions. Yes, I could use rsync or the like, but I'm just wondering if this has been done, or could be done.
(Sorry if this is the wrong forum, I'm kinda new around here, posting wise and this seemed to fit. Feel free to move it if it's not)
I'm trying to write a program which would get information from a webpage and display it on my desktop, sort of like a widget. I kind of remember there being something like this already made, but for the life of me I can't remember what it's called. Does anyone know?
How would you make NIS user information override local user information on client systems? This is what I think is right: add nis to the passwd entry in the name-service configuration file, ahead of the local source. Is this correct?
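If I've understood the name service switch correctly, the file in question is /etc/nsswitch.conf, and the order of sources on each line is the lookup order (the first source that answers wins), so listing nis before files should make NIS entries take precedence. This is my reading of nsswitch.conf(5), not something I've tested:

```
passwd: nis files
shadow: nis files
group:  nis files
```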
I've been putting my DVDs onto my home network for a while without any problems until now. When I put in Doubt and try to rip it, the title tree (in both K9 and DVD::Rip) shows several titles of a large file size and the same duration (that of the film). About three of the multiple titles are exactly the same file size and duration, and then there are about three more that differ slightly in both.
I think this must be some kind of anti-piracy measure. Obviously six files of 5 GB each wouldn't fit on a DVD, so it must be some kind of error/trick. When I try to back up the DVD it doesn't work. Has anyone else experienced this problem?
I've got a lot of photos, home movies, documents, etc. on my machine (currently running ubuntu-10.04-desktop-amd64), and I've had issues in the past with hard drives failing. Fortunately I've not lost anything substantial, as all the important stuff was recovered from backups I had made on blank DVDs.

My last desktop rebuild was about 2 years ago, and I decided that using blank DVDs to back things up was no longer practical: they're too small, and the amount of data I need to back up is too large. Hard drive storage used to be quite expensive, but drives are getting cheaper and cheaper all the time, so I bought 2 pretty large, identical drives for $70 each.

My machine has 3 hard drives in it. The first is a small solid-state drive with the operating system on it, the second has all my data on it (mount point /data), and the third backs up the stuff I don't want to lose (mount point /backup). I use the following script to automatically make another copy of all the stuff I want to back up onto a second drive.
This is automatically run at 9pm every night as a root cron job. The 2 lines that begin with "rsync" do the actual work of backing up my data. The 2 lines that begin with "chown" are just for convenience: they change the owner of the backup logs to myself, so I can easily delete them without being hindered by ownership issues (the script is run as root, so the logs are created, and therefore owned, by root).

I originally used this article to help set it up in the first place. It explains the process with a good degree of clarity, and I can recommend it to anyone who would like more info on this technique. I'm not too worried about losing my system, as all the configuration files in the home directory are backed up. It doesn't take too much effort to rebuild the system; it's the personal data that is much harder to replace.
If the data drive fails, I will be able to replace it and restore the data from the backup drive. If the backup drive fails, I can just throw it away and make another backup. This whole strategy is designed to guard against a total drive failure. It offers no protection against accidental deletion of files, except for the small window between the deletion and when the script is run (anything deleted from the data drive is also deleted from the backup the next time the script runs).

I've only really experienced hard drive failure 3 or 4 times. The last time, the computer started doing all sorts of weird stuff, and I didn't understand the issues involved.
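A minimal sketch of the kind of script described (the paths, log name, and the "myuser" account are placeholders, and the demo runs on scratch directories rather than real mount points):

```shell
#!/bin/sh
# Minimal sketch of a nightly mirror script; the real thing would use the real
# mount points (/data and /backup) and run from root's crontab at 21:00.
SRC=./demo_data
DST=./demo_backup
LOG=./backup.log
mkdir -p "$SRC" "$DST"
echo "important file" > "$SRC/photo.txt"
# -a preserves permissions, ownership and timestamps; --delete keeps the
# mirror an exact copy (so deletions propagate on the next run, as noted above)
rsync -a --delete "$SRC/" "$DST/" > "$LOG" 2>&1
# The chown step from the post, to make the log deletable by an ordinary user;
# commented out here because it only matters when run as root:
# chown myuser:myuser "$LOG"
cat "$DST/photo.txt"
```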
I have a couple of important directories that I want securely backed up to my FTP server. Is there backup software that is open source and that works on both Linux and Windows?
I was updating my laptop from 10.04 to 10.10 when the fan stopped; the laptop overheated and shut down in the middle of the update. So now I cannot use Ubuntu till I reinstall, but I was wondering if someone knew how to back up the home folder through the live CD with permission to copy every file that is on there, because the important files are the ones I need permission for.
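A sketch of the usual approach: from the live CD, mount the internal drive and copy with sudo so every file is readable regardless of owner. The device and destination paths in the comment are examples, and the demo below runs on scratch directories instead of a real mount:

```shell
# From the live CD the idea would be (device/paths are examples):
#   sudo mount /dev/sda1 /mnt
#   sudo cp -a /mnt/home/USERNAME /media/external/backup/
mkdir -p demo_home
echo "essay" > demo_home/essay.txt
chmod 600 demo_home/essay.txt
# -a preserves ownership, permissions and timestamps, so restrictive
# permissions like the 600 above survive the copy.
cp -a demo_home demo_copy
ls -l demo_copy/essay.txt
```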
I like music a lot! Every CD I buy, I like to rip to my HD as WAVs (I'd use FLAC, but I have a Windows system too, and can't for the life of me find a FLAC codec for Media Player); but I'd also like to create images in case I lose / scratch / otherwise damage my CDs. I usually use dd, but the disc has to be unmounted first. The problem is I don't know where audio CDs are mounted, so I can't unmount one.
I have a Seagate USB drive that I'd like to use as a backup drive for my home system with two drives. One drive contains /home and /root, the other contains /var. I've read about a lot of different software for backups but I'm not really sure which one would be the best for this. I want to be able to use this backup to restore the system just in case something happens. What would be the best software to use for this? I'd like something that will basically clone the system I assume since I'd like it to not only copy the system structure but also symlinks.
I used to make a copy of a configuration file and end the file name with '.conf.bkp'. Later I realized that was a mistake: the name should end in .bak, not .bkp.
As per the apt.conf(5) manpage, section DIRECTORIES, last paragraph:
The Ignore-Files-Silently list can be used to specify which files APT should silently ignore while parsing the files in the fragment directories. By default, a file which ends with .disabled, ~, .bak or .dpkg-[a-z]+ is silently ignored. As seen in the last default value, these patterns can use regular expression syntax.
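Given that default list, a snippet like the following in apt's configuration (the .bkp pattern is my own example, following the list-append syntax described in apt.conf(5)) should make APT ignore stray .bkp copies as well:

```
// Append to the default Ignore-Files-Silently list; patterns are regexes.
Dir::Ignore-Files-Silently:: "\.bkp$";
```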
We are in the process of backing up our hard drives to Blu-rays. I am creating tar.gz files and burning them to Blu-ray. Is it possible to use a simple (preferably Python-based) solution for creating images of those tar.gz files of a predetermined size (to fit on the Blu-ray), and simply burn these images to the disc? Do you have any other approach for creating physical backups of your hard drives?
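Splitting an archive into fixed-size pieces is something split(1) already does, and it would be trivial to drive from Python via subprocess if a Python wrapper is wanted. A scratch-file sketch (sizes shrunk for the demo; for 25 GB discs it would be something like -b 23G, leaving headroom for filesystem overhead):

```shell
# Stand-in archive made of zeros; in practice this would be the real tar.gz.
dd if=/dev/zero of=archive.tar.gz bs=1k count=10 2>/dev/null
# Cut into fixed-size, numbered pieces: archive.part.00, .01, .02 ...
split -b 4k -d archive.tar.gz archive.part.
ls archive.part.*
# Reassembly is a plain concatenation in suffix order:
cat archive.part.* > restored.tar.gz
cmp archive.tar.gz restored.tar.gz && echo "parts reassemble cleanly"
```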
In a few hours I'll have a new 500GB Sony laptop, filled with the usual Sony rubbish which I'll promptly be replacing with Ubuntu or Crunchbang or something. However, first I want to make a full clone of the drive (including recovery partitions), should I wish to return it to Sony or sell it on in its factory state.
The problem is that the only backup drives I have are less than 500GB - the biggest I have is 250GB or so! So I need to backup and compress on-the-fly.
What's the best way to do this? Presumably dd piped into gzip would do the trick, or does anyone have any other suggestions to accomplish this?
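A scratch-file sketch of the dd-into-gzip idea (in practice if= would be the whole disk, e.g. /dev/sda, run as root from a live environment so nothing on the disk is changing mid-read):

```shell
# Stand-in "disk" made of zeros; a real run would read the block device.
dd if=/dev/zero of=disk.img bs=1M count=8 2>/dev/null
# Image and compress on the fly, so the output never needs 500 GB of space:
dd if=disk.img bs=1M 2>/dev/null | gzip -c > disk.img.gz
# Restoring is the reverse pipe:
gunzip -c disk.img.gz | dd of=restored.img bs=1M 2>/dev/null
cmp disk.img restored.img && echo "round trip OK"
```

One caveat: gzip only helps to the extent the disk compresses, and a mostly full 500 GB drive may not shrink below 250 GB, so it's worth checking the compressed size before relying on this.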
I want to do a clean re-install of Windows 7, but there are files and installed programs which need to be backed up and restored. I tried using the Windows 7 backup utility to do a full backup to an external USB drive. The problem is that whenever the backup gets close to finishing, it always crashes. The system I am backing up has been compromised by viruses, which might be causing this. I already used several utilities to get rid of the viruses, but some of the damage they did cannot be undone. I tried doing the backup in safe mode, but Windows 7 does not allow this. What other methods can I use to back up and restore important programs and files on Windows 7? Perhaps there is a way to do it from outside Windows 7, say, using a Linux live CD? One of the main problems I see is restoring installed programs, since those make use of the registry, so simply copying the files probably won't work.
I want to securely back up my 80 GB HD, but doing a complete backup takes forever and slows down my machine, so I want to back up just 1 GB per day. Details:

% First hurdle: on the first day, I want to back up the "first" 1 GB of the hard drive. Of course, there really is no "first" 1 GB on a hard drive.
% After 80 days, I'll have my whole HD backed up... assuming none of my files ever change, which of course they do. So the backup plan/program must also catch file creations/changes as they come along.
% The backups must be consistent, in that I can restore my system by restoring the backups sequentially. In other words, "dd if=/harddrive" probably won't work.
% The backups should encrypt file contents AND names, but I don't see this as a major hurdle.
% Once the backup has backed up everything (even changed files), it can re-back-up the first 1 GB on my hard drive. Even though this backup is redundant, that's OK, because I always want to be backing up something (e.g., if I'm backing up to optical media, the older media might start going corrupt).

Is there a magic backup plan/program that does this? In reality, I want to do this for multiple machines (multiple drives each), but I think that solving the above will solve the general case.
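Not magic, but GNU tar's --listed-incremental covers part of the list above: restores are applied sequentially, and each run only archives what changed since the snapshot file was last updated. A sketch on scratch data (the encryption requirement, e.g. piping each archive through gpg, is left out):

```shell
# Level-0 (full) backup, recording state in the snapshot file:
mkdir -p demo_src
echo "day one" > demo_src/a.txt
tar --listed-incremental=snapshot.snar -czf level0.tar.gz demo_src
# Later, after a change, the next run picks up only what's new:
echo "day two" > demo_src/b.txt
tar --listed-incremental=snapshot.snar -czf level1.tar.gz demo_src
# Restoring level0 then level1 in order reproduces the current state.
tar -tzf level1.tar.gz
```

This doesn't by itself cap each run at 1 GB; chunking the output (e.g. with split) would be a separate step.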
How would one go about backing up a hard drive? I want my Windows hard drive to be backed up, so that I can restore it onto the drive if it is ever blanked.
If this hard drive is formatted and linux is installed on it, I want to be able to restore my current Windows partitions (Win 7 and Win XP) and probably the boot loader. I am sure this is possible, but what would be the best way to go about it?
I did a clean install of 10.04 over the weekend and copied all of my backed-up files from my external drive back to my internal drive. However, I've noticed that when I moved all my files back, they're all now marked as executable. I've since fixed this, but I was wondering why this happened to begin with?
I use rsync to backup my files (grsync to be exact), but when I do so I copy files from my internal drive, which is formatted as ext4, to my external drive, which is formatted as NTFS (I keep my external drive as NTFS in case I need to hook it up to a Windows machine). Does the file system discrepancy have to do with why my permissions change when I backup/restore my files? Is there a way to prevent this? Or should I be backing up my files a different way?
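If the cause is NTFS's lack of POSIX permission bits (my understanding of how ntfs-3g mounts behave, so treat this as an assumption), then permissions are synthesized from the mount options on the way back, which is why everything reappears executable. A scratch-directory sketch of one cleanup after a restore:

```shell
# Simulate files that came back from an NTFS-mounted drive with the execute
# bit set on everything:
mkdir -p restored
printf 'hi' > restored/doc.txt
chmod 777 restored/doc.txt
# Strip the execute bit from regular files only; directories keep theirs,
# since directories need x to be traversable.
find restored -type f -exec chmod a-x {} +
ls -l restored/doc.txt
```

An alternative worth considering is backing up into a tar archive stored on the NTFS drive, since tar records ownership and permissions inside the archive regardless of the host filesystem.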
Have you ever set up the perfect Ubuntu (all the software, games, libraries, customizations...) and wanted to back it up, so that if anything happens in the future you could just go back to the "perfect one"?
I have been struggling to get BackupPC running, and finally today figured out what I was doing wrong. BackupPC is running on my Ubuntu 10.04 server. I have it backing up my Mac, but I can't get it to work on my Ubuntu 10.04 desktop. I followed the same steps I used in getting it to work on my Mac, so I can't really see why there is a discrepancy. This is the error log I'm getting now:
Code:
full backup started for directory /
Running: /usr/bin/ssh -q -x -l root 192.168.1.120 /usr/bin/rsync --server --sender --numeric-ids --perms --owner --group -D --links --hard-links --times --block-size=2048 --recursive --ignore-times . /
Xfer PIDs are now 14404
Read EOF: Connection reset by peer
Tried again: got 0 bytes
Done: 0 files, 0 bytes
Got fatal error during xfer (Unable to read 4 bytes)
Backup aborted (Unable to read 4 bytes)
Not saving this as a partial backup since it has fewer files than the prior one (got 0 and 0 files versus 0)

The dreaded 4-bytes error. I was getting that problem before on my Mac, but from what I've managed to find, this error usually shows up when you don't have SSH keys set up properly. Once I got that figured out, it started working on my Mac. I can confirm that part is set up, because when I execute the following as the backuppc user on the server: