We are in the process of pruning our directories to reclaim some disk space.
The 'algorithm' for the pruning/backup process consists of a list of directories and, for each of them, a set of rules, e.g. 'compress *.bin', 'move *.blah', 'delete *.crap', 'leave *.important'; these rules change from directory to directory but are well known. The compressed and moved files are stored in a temporary file system, burned onto a Blu-ray, verified on the Blu-ray, and finally deleted from their original locations.
I am doing this in Python (basically a walk over each directory, with a dictionary holding the rule for each extension in each folder).
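To make that concrete, here is a stripped-down shell sketch of the rule-table idea (the real thing is the Python walk just described; the directories, extensions and staging path below are made up for illustration):
Code:
#!/bin/sh
# Staging area that later gets burned to Blu-ray, verified, and only then
# are the originals deleted.
STAGING=/mnt/staging    # illustrative path

# prune DIR EXT ACTION -- apply one rule to one directory (non-recursive).
prune() {
    case "$3" in
        compress) find "$1" -maxdepth 1 -name "*.$2" -exec gzip {} \; ;;
        move)     find "$1" -maxdepth 1 -name "*.$2" -exec mv {} "$STAGING/" \; ;;
        delete)   find "$1" -maxdepth 1 -name "*.$2" -delete ;;
        leave)    : ;;  # explicitly do nothing
    esac
}

# One rule set per directory; each directory gets its own list.
prune /data/results bin       compress
prune /data/results blah      move
prune /data/results crap      delete
prune /data/results important leave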
Do you recommend a better methodology for pruning file systems? How do you do it?
I've recently started using Ubuntu as my main desktop operating system and I'm looking for a backup solution that can back up not only my documents and various other files on the system, but also create restore images of the operating system itself.
Ubuntu 9.10: I need a complete hard-drive image for backup and restore, not just saved files, because I am changing over hard drives. The idea is that the image can be restored onto any hard drive and bring the computer back to the state it was in before it was imaged.
I'm setting up a backup & media server, which will be running Debian. I will set up a small HD or SSD/CF card for the OS, and an MD RAID for the data drives. The total size of the RAID will be either 3 or 4TB, depending. Now, I need to figure out what filesystem to use on top of this RAID. My criteria are as follows:
1. Support for large files. I can't imagine anything larger than about 1.2TB, but the 4GB limit of, say, FAT32 just isn't enough.
2. Robust. I don't want it falling apart on me; nothing too unstable.
3. (and this is most important): Good undelete support. I got burned recently when a software glitch managed to rm -rf my ext4 drive. All the file data is still there, but all the metadata is gone. I *DO NOT* want that happening with this. I want to be able to do an "rm -rf /", immediately unmount it, and then recover *all* of the deleted data. Obviously, when data is overwritten it's overwritten, but I don't want to lose all my metadata if an "rm -rf" happens. FAT32 is the model I'm looking at: you can usually recover deleted files if anything happens to the drive.
So, what are my options? ext2 looks like a possibility. ext4 is clearly out, unless there's some nice utility/mode that keeps a backup of all deleted metadata, etc.
I have been researching the web for a program that will allow me to back up my entire hard drive so that I can restore my system if need be. I am, however, unsure which is the best one to use to achieve this: I want to back up the hard drive containing my Ubuntu system byte for byte, so that if the drive were to fail I could simply go to the store, get a new hard drive, restore my backup, and be up and running again without having to reinstall Ubuntu or any other programs.
What is the easiest program that does this? I would like it to support incremental backups. rsync with the "Back In Time" interface? Bacula?
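For the byte-for-byte part, what I picture is something along these lines (assuming the system drive is /dev/sda and the backup lands on a second disk mounted at /mnt/backup; both names are guesses, and it would have to be run from a live CD so the drive isn't in use):
Code:
# Image the whole drive, including partition table and bootloader.
sudo dd if=/dev/sda of=/mnt/backup/sda.img bs=4M conv=noerror,sync
# Restore onto a replacement drive (must be at least as large) with:
# sudo dd if=/mnt/backup/sda.img of=/dev/sda bs=4M
But that isn't incremental, which is why I'm wondering about rsync/Back In Time or Bacula.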
I just got a 1TB external USB hard drive to back up my comedy shows. On my smaller USB 'pen' drives I set the file system to ext2 (occasional reads/writes), but should I do the same for this bigger and more frequently accessed drive (daily reads/occasional writes), or should I go with ext3 for the journaling?
Also, regarding security, I was thinking about making the drive writable only by root, so that when I mount the drive as a normal user, which will be for a few hours daily, if someone does get onto my system they couldn't write to the drive from my user account. That should just be a simple case of setting the device to 755 (and owner=root), should it not?
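In other words, something like this after the drive is formatted (assuming it shows up as /dev/sdb1 and is mounted at /media/backup; both are guesses):
Code:
sudo mount /dev/sdb1 /media/backup
sudo chown root:root /media/backup   # owner root
sudo chmod 755 /media/backup         # everyone can read, only root can write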
I have got a Cowon iAudio 9 music player and want to display album art images for the albums on it. The player does support this feature, but only when there is a *.jpg file inside the album folder or the image is embedded in the music file itself (I haven't tried that).
At the moment I'm using Banshee as music management software and I like it. The album art is downloaded automatically for each album, and when the player is connected it is recognized and I can drag & drop albums or single songs.
The only problem is that the album art is not transferred along with the albums. The player supports MTP and MSC USB connections and is mounted as a removable USB drive on the system (Ubuntu 10.10). I would be glad if there's a solution to do that with Banshee, but other software solutions are welcome too.
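As a fallback I could probably script the copying myself, something along these lines (this assumes the player mounts at /media/IAUDIO, that my local library and the player share the same Artist/Album folder layout, and that the art file is named cover.jpg; all of that is guesswork):
Code:
#!/bin/sh
# Copy each album's cover image next to the music already on the player.
LIBRARY="$HOME/Music"
PLAYER=/media/IAUDIO/Music
find "$LIBRARY" -name cover.jpg | while read -r art; do
    rel=${art#"$LIBRARY"/}                  # e.g. Artist/Album/cover.jpg
    dest="$PLAYER/$(dirname "$rel")"
    [ -d "$dest" ] && cp "$art" "$dest/"    # only for albums already on the player
done
But it would be nicer if Banshee handled this during the drag & drop.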
I have literally just started using Linux (CentOS) in the last week or so. I am using a standalone PC that is not networked, and as I will be downloading and generating a lot of data on this machine, I would like to regularly back up onto an external hard drive. Ideally I would like this to happen automatically, as there will be other people using the machine. There seem to be many different ways of doing this, and I am getting a bit confused about the best method to use.
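The kind of thing I have in mind is a nightly cron job running rsync, roughly like this (the mount point /mnt/usbdrive and the source path are just examples, not my actual layout):
Code:
#!/bin/sh
# /etc/cron.daily/backup -- run once a day by cron as root.
# Mirror /home onto the external drive; --delete keeps it an exact mirror.
rsync -a --delete /home/ /mnt/usbdrive/home-backup/
But I don't know whether that is the sensible way to do it, or whether I should be using a dedicated backup tool instead.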
I know that ImageMagick's convert program can be used as follows to convert a collection of images -- say, in PNG format -- to a PDF file:
convert *.png output.pdf
The problem with this is that each image is then stretched to fit on one page, whereas I would like to keep the original dimensions of the images and put as many as possible on one page in the PDF file before moving on to another page.
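One thing I've stumbled across is ImageMagick's montage command, which seems closer to what I want; something like the following is what I've been trying (the 2x3 grid is arbitrary, and I'm not sure it's the right approach):
Code:
# Tile the images 2 across and 3 down per PDF page; -geometry +10+10 only
# adds spacing between tiles, it does not resize the images.
montage *.png -tile 2x3 -geometry +10+10 output.pdf
Is montage the right tool for this, or is there a better way?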
I keep a backup of a bunch of files on a flash drive, so that whenever I change distributions I can just restore all my Android stuff (saves on re-downloading everything). One of these is the Android SDK.
In my ~/.bashrc I add the paths to some executables in the SDK, only if the directory exists, and only if the path is not already in $PATH. For the Android NDK this works fine, but for the SDK I get this:
Code:
snfo@snfo:~$ adb devices
bash: /home/snfo/Android/sdk/platform-tools/adb: No such file or directory
snfo@snfo:~$ ls -F /home/snfo/Android/sdk/platform-tools/adb
/home/snfo/Android/sdk/platform-tools/adb*
Everything else is fine though, just that one path is causing trouble.
Now, I've seen something similar to this before when you move an executable from one place to another: if you don't re-source your bash config, it will keep looking wherever the executable used to be located. But I've never moved these files.
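For reference, the check I described is roughly this pattern (the variable name is made up and the exact snippet in my ~/.bashrc may differ slightly; the path matches the ls output above):
Code:
# Add the SDK platform-tools to PATH, but only if the directory exists
# and it is not already in PATH.
SDK_TOOLS="$HOME/Android/sdk/platform-tools"
if [ -d "$SDK_TOOLS" ] && [[ ":$PATH:" != *":$SDK_TOOLS:"* ]]; then
    PATH="$PATH:$SDK_TOOLS"
fi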
I have been using Mint 11 for the past few weeks with no problems, but it failed to boot correctly. The menu appeared giving options for booting, but it kept returning to this menu without going further, so I opted for the safe booting option. After loading a few files it asked for a password, but it kept giving me the message 'incorrect password', so I could get no further. Fortunately I had a copy of Clonezilla and was able to restore a backup from a second hard drive, but I would be grateful for any observations anyone would like to make about this (in case it happens again).
I have set up an Ubuntu box that is a proxy server. Everything works great, and I would like to somehow make a complete disk backup of everything on that hard drive, in case it fails. It took me quite a while to figure everything out and get it working. The box has an 80GB drive with Ubuntu 9.10 loaded, standard default setup. Could I just install a second hard drive and somehow give it a command to mirror everything to the second hard drive?
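What I'm imagining is something along these lines, run from a live CD so neither drive is in use (assuming the original drive is /dev/sda and the new one is /dev/sdb; those names are guesses):
Code:
# Copy the whole 80GB drive, partition table and all, onto the second drive.
sudo dd if=/dev/sda of=/dev/sdb bs=4M conv=noerror,sync
Is that all there is to it, or is there a smarter way to keep the mirror up to date?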
I've picked up an HP Simplesave external drive. It comes with some fancy software that is of no use to me because I don't use Windows. Like many current consumer-targeted backup drives, the backup software is actually contained on the drive itself. I'd like to save the drive's initial state so that I can restore it if I decide to sell it.
The backup box itself is somewhat customized: in addition to the hard drive device, it presents a CD-ROM-like device on /dev/sr0. I gather that the purpose of this CD-ROM device is to bootstrap, via Windows autoplay, the backup application which lives on the disk itself. I can't assume anything about how it does this, so it seems important to preserve the exact state of the disk.
The drive is formatted with a single 500GB NTFS partition. My initial thought was to use dd to dump the disk (/dev/sdb) itself, but this proved impractical, as the resulting file was not sparse. This seemed to be because the NTFS empty space is not filled with zeroes, but with a repeating series of 16 bytes.
I tried gzipping the output of dd. This reduced the file to a manageable size (the first 18GB compressed to 81MB, versus 47MB to tarball the contents of the mounted filesystem), but it was very slow on my admittedly somewhat derelict Pentium M processor. That first 18GB took about 30 minutes.
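The pipeline was along these lines (a rough reconstruction; /dev/sdb and the output filename are illustrative):
Code:
# Dump the raw disk and compress on the fly; gzip -1 trades ratio for speed.
sudo dd if=/dev/sdb bs=4M | gzip -1 > simplesave.img.gz
# Restore later with:
# zcat simplesave.img.gz | sudo dd of=/dev/sdb bs=4M
Is there a faster way to capture a compact image of the drive's initial state?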
I just spent half an hour hunting for something that should be readily available already: USB install images of Ubuntu, Knoppix and all the others. So far the only ways I've found are complicated tutorials where you extract the stuff from a CD image. Why? Hasn't everybody noticed that CDs/DVDs are vanishing big time, and that more and more systems don't have the readers anymore? Instead of following a 10-point instruction list, it would be nice to just be able to download a Ubuntu 8.10 (or whatever) USB image and beam it DIRECTLY to a USB stick with a dd command.
Or am I missing something here? Does this exist? It should by no means be marginal, considering how important USB sticks in particular, and flash memory in general, have become.
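In other words, ideally it would be as simple as the following (the image filename is hypothetical, which is the whole point of the question, and sdX is whatever the stick shows up as):
Code:
# Write the downloaded USB image straight onto the stick (this wipes the stick).
sudo dd if=ubuntu-8.10-usb.img of=/dev/sdX bs=4M
sync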
Issue: I want to backup my system and be able to restore if/when my system crashes.
Setup: Thinkpad T61 running Sidux KDE3.
I want to back up the system files more than personal data. What I have found is that when I apt-get dist-upgrade, the upgrade completes without error, but when I restart and log in I get a black screen with a mouse. From the 'black screen' I can open Yakuake with F12 and then launch applications and such from the command line. The point of this is more troubleshooting right now; later, it's so I won't need to do a complete reinstall.
What backup program have you used that will back up the complete system and is easy to restore?
OS: Fedora 12. I am a newbie in Linux. What I want to do is make a backup of my file system, because I am learning how to configure servers. So if I do something wrong, I want to be able to restore the default settings for my files instead of installing a new OS.
I am working from a laptop where all my work is stored on an 80GB drive. I am now also the owner of an external 250GB USB hard drive, formatted with FAT32. I want to keep it FAT32 so that I can offer some of my files to people running Mac OS or Windows, without making them install ext3 drivers and whatnot. I need a strategy that will allow me to keep a mirror of my laptop drive on the new external drive, i.e. no history/versioning required. However, I do care about file permissions. The files don't have to be stored as-is; they can be stored within a large (80GB?) tar file, that is fine. It would be easier for me to coerce people into opening a .tar file than into installing an ext3 driver for their OS, I suppose. I don't think I can keep file permissions otherwise, can I?
I have previously used a self-written sh script that used rsync to keep an up-to-date copy of my laptop filesystem on a USB flash drive, but in that case I had the flash drive formatted with ext3, so no problem with file permissions there. This time, it's trickier.
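What I have in mind this time is roughly the following (the mount point /media/usb250 and the directory list are made up, and the archive has to be split because FAT32 caps individual files at 4GB):
Code:
# Pack the laptop's files into a tar stream; permissions and ownership are
# preserved inside the archive even though FAT32 itself can't store them.
# split keeps each piece under FAT32's 4GB-per-file limit.
sudo tar -cpf - -C / home etc | split -b 3900M - /media/usb250/laptop-mirror.tar.
# Restore with:
# cat /media/usb250/laptop-mirror.tar.* | sudo tar -xpf - -C /
Does that sound sane, or is there a better approach for keeping permissions on a FAT32 drive?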
I have installed luckybackup on my Ubuntu 10.10 notebook edition, but I don't know how to use it to back up files to an external hard drive. The hard drive is a 1TB Seagate. I don't think the Destination drop-down menu in luckybackup even shows the external HD.
I dual-boot Windows and Ubuntu on a particular machine and I'm looking for a comprehensive backup solution. Basically I'm after a single tool to clone the entire drive and do incremental backups, with little to no concern for the underlying OS.
My first instinct is to set up rsync to do the backup from Ubuntu and just mount the Windows partition when it runs, so that gets backed up too. Does that sound reasonable, or am I missing something? At face value this seems like a reasonable answer, but I can't help feeling that something is "off" with that approach.
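Concretely, I was picturing something like this (the device name, mount points and destination are placeholders):
Code:
# Mount the Windows partition read-only so it gets picked up as well.
sudo mount -o ro /dev/sda1 /mnt/windows
# Mirror the Linux root, skipping pseudo-filesystems and mount points.
sudo rsync -aAXH --delete \
    --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/run \
    --exclude=/tmp --exclude=/mnt --exclude=/media \
    / /media/backup/linux/
# Mirror the Windows partition separately.
sudo rsync -a --delete /mnt/windows/ /media/backup/windows/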
My manager at work asked me to set up an FTP server running Linux. I had a post about some custom scripts and commands that I needed, and after I configured and set everything up, I got another assignment: I need to come up with a way of backing up the entire system, or create a script that will set up, install, and configure the FTP-related features I got working there. I am wondering what would be the most ideal and easiest way to back up the system. The goal is that if the HDD fails, or anything goes wrong, the system can be restored in no time. This is still up in the air; I offered to take some time on a weekend and set up the same system on, say, a new drive, and swap them when one goes bad... but what would the alternatives be?
I'm trying to create a backup/archive of my Ubuntu 10.04 system files (so I can restore them in case my system gets corrupted). More specifically, I'm trying to zip the important files in my root directory, not including my home directory (which holds my documents and which I back up separately/more frequently), to an external hard drive attached via USB (called 'My Book').
Since File Roller didn't give me quite the level of control I was looking for, I created a script that I could execute to back up and archive regularly. Here's a snippet:
Code:
cd /media/"My Book"/"Linux Backups"
NOW=$(date +"%b-%d-%y")
LOGFILE=Backup_Root_FileSystem-$NOW.log
sudo zip -r -T -v Backup_Root_FileSystem-$NOW / -x /media/'My Book'* /media* /proc* /sys* /mnt* /dev* /cdrom* /home* /'lost+found'* | tee -a $LOGFILE
I am trying to restore my system to Ubuntu 10.10, using a system backup made with REMASTERSYS. When I reboot, I get the message: GRUB error:15 I found many threads discussing this issue, most notably here: [URL]