I have two 500GB external HDDs and a 200GB internal HDD in my desktop. One of the 500s is solely for backups of my desktop, netbook, and laptop. The other 500 is pretty much my "everything" drive. Literally everything (music, movies/videos, pictures, documents, etc.) gets saved to this external so I can access it on all my machines. The internal 200GB on my desktop is pretty much only used for temporary downloads, the OS (obviously), and things like that.
Here's what I'd like to do:
I used Deja Dup to create a backup of my /home and my "everything" 500 onto the "backups only" 500, but once I let my paranoia simmer for a second I thought, "what if my backup HDD fails?!" So I want to have two redundant backups on separate drives. First I figured I'd just chuck it on the other "everything" 500, but then I realised that since I'm using barely any room on the internal 200, I could just store a backup there. So that's my plan. My problem is that Deja Dup only really allows one location for the scheduled backups.
I saw this post: [url]
and think I could just do that for the redundant backup? Or is there a program I can get from the Software Center that allows me to schedule more than one backup to different locations? I looked into Back In Time, but that seems to be snapshot-based(?), and I didn't see a way to do what I wanted... I think.
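Since Deja Dup itself only takes one destination, one common workaround is to let it keep writing to the "backups only" 500 and then mirror the finished backup to the second location with a small rsync cron job. A minimal sketch, with both mount points assumed:

Code:
#!/bin/bash
# Mirror the finished Deja Dup backup from the external drive to the
# internal 200GB drive. Both paths are assumptions; adjust to your mounts.
rsync -a --delete /media/backups500/deja-dup/ /media/internal200/deja-dup-mirror/

Scheduled with a crontab entry like 0 3 * * * /home/you/mirror-backup.sh, that gives a second redundant copy without needing a second backup program.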
I've been a DOS/Windows guy for 20 years, and recently became a SW test lab helper. My company uses CentOS for a lot, so I've become familiar with it, but obviously not as comfortable as I am with Windows.
Here's what I have planned:
machine: Core 2 Duo E8400, 8GB DDR2, 60GB SSD OS drive, ATI 4650 video card, other storage is flexible (I have 3 1TB drives and 4 750GB drives around that can be used in this machine.)
uses: HTPC, network storage, VMware server hosting SMTP, FTP, and web server virtual machines
I've figured out how to do much of this, but I haven't figured out how to do backups in Linux. I've been spoiled by Windows, where the built-in backup system is so simple to use. I find myself overwhelmed by the array of backup software, unable to determine which to use. None of them seem to do everything I need them to do, but some come close, I think. I'm hoping someone here can help me figure out which program to use and how to use it.
Here is what I need the backup software to do (a rough sketch of one approach follows the list):
1. Scheduled unattended backups, with alerts if the backups fail.
2. A weekly full backup with incrementals every 12 hours.
3. Removal of the old backups when the new full backup runs. I would prefer to keep 2 weeks of backups, but that's not necessary.
4. A GUI would be preferable, since my arthritic fingers don't always do as I want them to. I typo things a lot, and the worn-off label on my backspace key can attest to that.
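For points 1-3, a cron-driven sketch using duplicity (the command-line engine behind Deja Dup) might look like the following. The paths, schedule, and alert address are all assumptions, not a tested setup; and it won't satisfy point 4, since it has no GUI.

Code:
# Weekly full backup, Sunday midnight, with a mail alert on failure:
0 0 * * 0 duplicity full --no-encryption /home file:///mnt/backup || echo "full backup failed" | mail -s "backup alert" you@example.com
# Incrementals every 12 hours the rest of the week:
0 */12 * * 1-6 duplicity incremental --no-encryption /home file:///mnt/backup || echo "incremental backup failed" | mail -s "backup alert" you@example.com
# Keep only the two most recent full chains (roughly 2 weeks):
30 1 * * 0 duplicity remove-all-but-n-full 2 --force --no-encryption file:///mnt/backup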
How do you get rsync to do incremental backups rather than full backups? At the moment I have a script that will create a backup folder (if it doesn't already exist) then copy the source files into the backup directory with the command
Target is where the files will be backed up to; Sources is the dir(s) to be backed up; Exclude files is the list of files not to back up; log file is where the output will be saved. At the moment it only does full backups, but I would like to do incrementals only. How would this be achieved? Am I missing an option in rsync that is required?
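rsync has no single "incremental" switch, but the usual trick is --link-dest: point each run at the previous backup, and unchanged files are hard-linked instead of copied, so only the differences cost transfer time and disk space. A minimal sketch using the same option names described above (all paths hypothetical):

Code:
#!/bin/bash
TARGET=/mnt/backup                      # where the files are backed up to
SOURCES=/home/user                      # the dir(s) to be backed up
DATE=$(date +%Y-%m-%d_%H%M)
# Unchanged files are hard-linked against the previous snapshot.
rsync -a --delete \
      --exclude-from=/etc/backup-excludes.txt \
      --link-dest="$TARGET/latest" \
      "$SOURCES" "$TARGET/$DATE" >> /var/log/backup.log 2>&1
# Move the "latest" pointer so the next run links against this snapshot.
ln -sfn "$TARGET/$DATE" "$TARGET/latest"

On the first run --link-dest finds nothing and you get a full copy; every run after that is effectively incremental, while each dated directory still looks like a complete backup.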
As far as I can tell, the server guides only explain a bit about what dynamic routing is, but not how to implement it.
My situation is this:
We require a server with 3 interfaces. One local, one to a vsat link and the other to a fibre link. The fibre will be the default route for Internet traffic but we want dynamic routing to automatically switch to the vsat link when the fibre link goes down (which happens fairly often in Zimbabwe!) and then switch back to the fibre link when it comes back up again.
The first option would be to handle dynamic routing on a Cisco router, but at the prices of Cisco devices here, it's not the most affordable option.
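For a single default-route failover like this, a full routing daemon such as quagga may be overkill; a cron-driven health-check script that swaps the default route is a common zero-cost alternative. A rough sketch, with all addresses and interface names made up:

Code:
#!/bin/bash
# Hypothetical setup: fibre gateway on eth1, vsat gateway on eth2.
FIBRE_GW=192.0.2.1
VSAT_GW=198.51.100.1
# Probe a reliable host out the fibre interface; assumes a host route
# exists so the probe actually leaves via eth1.
if ping -c 3 -W 2 -I eth1 8.8.8.8 > /dev/null 2>&1; then
    ip route replace default via "$FIBRE_GW"
else
    ip route replace default via "$VSAT_GW"
fi

Run from cron every minute; ip route replace is idempotent, so re-asserting the same default route does no harm.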
Slackware is very stable and very geek-friendly. I happen to love it... Unfortunately, I've found it unsuitable for day-to-day stuff in recent years, because it doesn't have a whole lot of software in its repos - and installing stuff from source can quickly lead you into dependency hell.
But pkgsrc has a vast amount of software in it, and can run on Linux. So it could be a solution, right?
The problem with pkgsrc is that the dependency resolution doesn't recognize stuff installed through standard Slackware packages. If you try to compile Gnash with it for instance, it will drag in Firefox and waste a few hours compiling that, even if you already installed Firefox through pkgtools. So with a default setup, pkgsrc is suitable for building on a very minimal Slackware system, but not for extending a preexisting Slackware desktop with Xfce and Firefox and whatever.
Is there any way of changing this, so that pkgsrc registers preinstalled binaries as providing whatever dependency? Or is that not possible? If not, is there any other system that could provide dependency resolution for compiling stuff?
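pkgsrc does have knobs for preferring already-installed native software - PREFER_NATIVE and PREFER_PKGSRC in mk.conf - but, as far as I know, they only apply to packages pkgsrc has builtin.mk detection for (X11 libraries, openssl, zlib, and the like), not to arbitrary Slackware packages such as Firefox. A sketch of the relevant mk.conf fragment, for what it does cover:

Code:
# /usr/pkg/etc/mk.conf (hypothetical fragment)
# Prefer native, already-installed libraries wherever pkgsrc's
# builtin detection can recognize them.
PREFER_NATIVE= yes
PREFER_PKGSRC= # empty, or list packages that must come from pkgsrc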
I am kind of stuck providing a solution for the above problem. I have achieved the failover using keepalived, but I'm not sure how we can replicate the data from one server to the other seamlessly and keep them in sync with each other. My prime requirement for this project is that the end user should not notice the failover, and that a replicated copy of the data should be available on the secondary as well.
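For block-level, real-time replication the usual partner for keepalived is DRBD; if the data is plain files and a small replication lag is acceptable, a frequent rsync push is a much simpler sketch (host name and paths are assumptions):

Code:
# Cron entry on the primary: push the data to the secondary every
# 5 minutes over ssh (key-based auth assumed, no password prompt).
*/5 * * * * rsync -a --delete /var/www/ backup-server:/var/www/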
I notice that these two packages contain the same files, and uninstalling glibc-i18n doesn't uninstall the internationalization files, because they are also in the glibc package. Is this a mistake, is this normal, or is there a reason for it? I just made my own glibc package to fix this, but I was wondering why it is the way it is...
I backed up my photographs with K3b in SuSE, prior to installing Ubuntu. Now I can't access them. Well, I can, but not with Ubuntu or Windows 7. SuSE (KDE) on the family computer reads them with no problem?? I'm confused. Is there something that KDE has that Debian doesn't? There's nothing wrong with my system except for this; it reads CDs and DVDs fine otherwise.
What would be the best way to have automated system backups? I'm trying to get my Xubuntu box to automatically back up the entire system, including user settings, at regular intervals. What would be the best way to do it? I have two hard drives, and I'd like to back up to the one I don't use.
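One hedged sketch: a cron-scheduled rsync of the whole filesystem onto the spare drive, skipping the pseudo-filesystems that should never be copied. The mount point is an assumption:

Code:
#!/bin/bash
# Mirror the entire system, including user settings in /home and /etc,
# to the unused second drive (assumed mounted at /mnt/backupdrive).
rsync -aAXH --delete \
      --exclude={"/proc/*","/sys/*","/dev/*","/run/*","/tmp/*","/mnt/*","/media/*","/lost+found"} \
      / /mnt/backupdrive/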
I am currently backing up my data but find that it takes way too long to do an rsync; it takes forever just to find the differences and transfer them. Out of 3 separate rsyncs, the main one that is slow is my www.skins.be mirror directory, which is 41GB and has 392,200 files sorted into multiple directories, and which grows by around 100 files every couple of days. I think something that tracked changes via inotify or directory timestamps would speed it up, since Picasa sure finds changes fast when I open it, and it is tracking over 26,200 pictures. I just don't know of a backup solution that does that.
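lsyncd is one tool built around exactly that idea: it keeps inotify watches on the tree and triggers rsync only for what changed. A hand-rolled sketch of the same principle with inotify-tools (paths hypothetical; note that re-establishing watches over 392,000 files on each loop iteration is itself slow, which is why lsyncd keeps them persistent):

Code:
#!/bin/bash
SRC=/data/skins-mirror/
DEST=/backup/skins-mirror/
# Block until something changes anywhere under SRC, then sync.
while inotifywait -r -e modify,create,delete,move "$SRC"; do
    rsync -a --delete "$SRC" "$DEST"
done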
First Question: I have a very big volume (20+TB). When I try formatting it as ext4, I get the error message:
Code: mke2fs 1.41.12 (17-May-2010)
mkfs.ext4: Size of device /dev/sdc1 too big to be expressed in 32 bits using a blocksize of 4096.
I understand that ext4 has a limit of 1EB (about a million terabytes), but a 32-bit limitation in e2fsprogs prevents me from creating a partition > 16TB.
Until e2fsprogs is updated to use 48-bit block addressing, it appears my choices are: break up the volume into smaller volumes < 16TB, or use xfs or zfs (I have already created a test xfs partition, and it works fine).
Does anyone have any opinions about which option is preferable? I have never used xfs before. Is it as robust as ext4? Is it as well supported by Ubuntu? What about zfs? Is it worth downloading from the ppa?
Second Question: I now have a huge amount of data to back up. In the old days, I remember making a full backup of a "big" 10 MB hard drive by taking a stack of floppies and inserting them one at a time into my floppy drive while my backup program split the backup into 1.4 MB chunks small enough to fit on a floppy.
I now have the same problem, but at a different scale. I need to back up 20+TB onto a stack of external 2TB drives. Is there any software package that can fragment a backup in this way?
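The floppy-era trick still works at this scale by streaming tar through split. A sketch, with sizes and paths as assumptions:

Code:
# Stream the volume into 1800GiB chunks, small enough for a 2TB drive.
tar -cf - /bigvolume | split -b 1800G - /mnt/external/backup.tar.part-

# Restore by concatenating the parts back into one tar stream:
cat backup.tar.part-* | tar -xf -

One practical wrinkle: split writes all its parts to one destination, so chunks have to be moved off as each drive fills. dar handles this more gracefully; its -s option slices the archive into fixed-size pieces, and -p pauses between slices so media can be swapped.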
I just read in my Linux+ resources that it is not a wise idea to compress tar files with gzip if my tape drive supports compression. In addition, the resources mention that using gzip on tar files runs the risk of data loss if an error occurs with the compression.
Does anyone compress tar files for major backup/restore policies?
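The risk being described: gzip compresses the whole tar stream, so a single corrupt block can make everything after it unreadable, whereas in a plain tar only the damaged file is lost. Letting the tape drive compress in hardware keeps that isolation. The tape device name below is the usual first SCSI tape drive:

Code:
# Software compression: one bad block can take out the rest of the stream.
tar -czf /dev/st0 /data

# Plain tar to tape; the drive's hardware compression does the work.
tar -cf /dev/st0 /data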
I'm having some trouble finding a file system that allows me to back up my data to an external HD, and then access this data on other Linux (and sometimes Windows and Mac) machines.
The file system that I'm looking for must:
- have no user permissions: anyone can do anything with the data;
- support large files: I've used FAT for a while but it just sucks;
- ideally be accessible on Windows and/or Mac, with additional drivers if need be;
- have journaling (or something of the sort) to reduce the risk of data loss. (See the sketch after this list.)
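NTFS arguably ticks all four boxes: no enforced Unix permissions, large-file support, journaling, native on Windows, and usable on Linux and Mac via the ntfs-3g driver. A sketch, with the device name assumed:

Code:
sudo mkfs.ntfs -f -L backups /dev/sdb1    # -f = quick format; device is an assumption
sudo mount -t ntfs-3g /dev/sdb1 /mnt/backups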
I am looking for a SIMPLE point-and-click backup solution for Ubuntu. At work I manage a rather large NetBackup installation, so I am not a complete idiot. Just very graphically dependent. I knew it was bound to happen as soon as the mouse came out. I installed BackupPC, read the docs/how-tos/tutorials for 90 minutes, and have come to the conclusion that it simply is not worth the time, effort, or energy. I could not even figure out how to use BackupPC to back up my Documents directory to a USB-attached HDD. Now that is sad.
I currently use rsync over a network to back up my Ubuntu laptop to a share on Server 2003. That is, when it does not error out due to differences in time stamps. Basically, I get a point-in-time backup; incrementals don't work.
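The time-stamp errors against a Windows share are often just the share's coarser time-stamp resolution; rsync's --modify-window flag tells it to treat near-identical times as equal, which may be enough to make incrementals work. Paths here are hypothetical:

Code:
# Treat timestamps within 2 seconds of each other as unchanged, which
# works around the coarse timestamp granularity of FAT/SMB shares.
rsync -av --modify-window=2 /home/user/ /mnt/server2003-share/backup/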
The reason I'm here today is because Wine seems to have changed quite a bit. It used to be simple, and I only need it for a couple of programs called DVD Decrypter and DVD Shrink. I've tried some programs in Linux and haven't been able to get them to work, like DVD 5. It could be me, of course! So, I haven't been able to get Wine to read my CD-ROM hardware so I can back up my DVDs. Are there any solutions that anyone knows of in Wine, or Linux programs (which I'd prefer)? I did create an ISO from a DVD with DVD 5, I believe, and then set it up to burn with K3b. The first part of the DVD came up with the menus and music, but I could go no further than that. I figure there is some little something I'm doing wrong, or some bit of Linux software support for DVD 5 or K3b that I don't have. That's usually what it is.
I would like to create lossless backups of my DVDs, or more exactly the main movie including one or more audio tracks and a subtitle of my choice. I would like to have the subtitle burned into the movie so that I only have one file (container). No, I don't want a complete DVD folder (VIDEO_TS and other stuff), nor do I want to create an ISO file of the DVD. Is it possible? The way I see it there should be two options:
1. Extract the wanted audio track/tracks, one subtitle, and the main movie. Keep the audio track/tracks and the movie track in their original formats, and put it all together in one file/container (subtitle burnt in). In theory it should work! I know the video tracks are mostly MPEG-2 and the audio tracks mostly AC-3 or DTS.
2. Do almost exactly as above, but before putting it all together, compress the video and audio tracks even further to some lossless formats. In theory this should work too! In another forum a guy told me that since MPEG-2 and AC-3/DTS are already compressed formats, it probably isn't possible to compress the audio and video much further without losses, which is probably true.
Is it possible to do what I want to do? How? If this process is not easy to do, it would be nice if one of the skilled guys would create an application that does exactly this. I believe plenty of people besides me would find it extremely useful.
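One caveat first: truly burning a subtitle into the picture requires re-encoding the video, so that part cannot be lossless. What option 1 can do losslessly is remux the original MPEG-2 and AC-3/DTS streams plus a subtitle track into a single MKV, with the subtitle flagged to display by default. A sketch with mkvmerge (the track IDs are made up, and the DVD must already be decrypted, e.g. via libdvdcss/dvdbackup):

Code:
# Join the main-movie VOBs, list the tracks, then keep one audio (ID 1)
# and one subtitle (ID 4) alongside the untouched MPEG-2 video.
cat VTS_01_*.VOB > movie.vob
mkvmerge -i movie.vob
mkvmerge -o movie.mkv -a 1 -s 4 movie.vob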
I installed Ubuntu 10.04 on the laptop and it looks pretty good. I currently run 9.10 on the main desktop and would like to upgrade to 10.04, by pressing "upgrade" in the update manager, but I have some questions before I do, namely about data loss.
If I upgrade, will stuff like Thunderbird keep my emails, FF keep its profile (cookies, bookmarks, addons, etc.), and the Documents folder keep all the documents? I have an Apache server installed with a few websites - will they still be there after an upgrade? I also have a virtual machine with windoze on it; what about all the stuff in there, and VMware itself?
Or will I need to back everything up onto an external hard drive (not sure how to back up Thunderbird and FF), then reinstall everything and transfer all the documents, websites, etc. back over again??
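An in-place upgrade normally leaves /home and /var alone, so Thunderbird (~/.thunderbird), Firefox (~/.mozilla), and the documents should survive; but a copy beforehand costs nothing. A sketch, with the external drive's mount point assumed:

Code:
# Belt-and-braces copies before upgrading; adjust paths to taste.
rsync -a /home/user/      /media/external/home-backup/
rsync -a /var/www/        /media/external/www-backup/
rsync -a /etc/apache2/    /media/external/apache2-backup/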
I'd previously created a script to back up some ISO files to DVDs, but I had to do a format, so I lost it. Back then I used a command prior to growisofs that closed the DVD drive (if open) and waited until the disc was ready to write before starting growisofs. I can't find that command now; anyone know which one it is? I remember it was a one-liner that I think showed some basic info about the disc, and thus had to wait until it was ready...
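That description (a one-liner shipping alongside growisofs that prints basic media info) sounds like dvd+rw-mediainfo from the same dvd+rw-tools package, though that's a guess. A sketch of the close-and-wait sequence:

Code:
#!/bin/bash
eject -t /dev/dvd                     # close the tray if it is open
# Poll until the disc is readable before burning.
until dvd+rw-mediainfo /dev/dvd > /dev/null 2>&1; do
    sleep 2
done
growisofs -dvd-compat -Z /dev/dvd=image.iso   # image name is an assumption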
I have a crontab-related question which I am hoping someone can answer. I recently took over a Red Hat Enterprise 5 server, and I was told by the previous server admin that there is a cron job that does the backups. I ran the following command to get a list of all users:
Code: cat /etc/passwd | grep "/home" | cut -d: -f1
I then ran the following command for each of those users to see if they have any crontabs associated with them:
Code: crontab -u USER -l
It doesn't show any crontab entries for any users (including root). But I am positive that there is a scheduled job somewhere because the backups are still running every night.
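Per-user crontabs are only one place cron jobs can live; the backup is quite possibly defined system-wide. Worth checking:

Code:
cat /etc/crontab
ls /etc/cron.d/
ls /etc/cron.hourly/ /etc/cron.daily/ /etc/cron.weekly/ /etc/cron.monthly/
cat /etc/anacrontab 2>/dev/null       # anacron, if installed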
With so many filesystems available, which one should I use to make backups? All I care about is reliability and stability. I don't care at all about portability.
I would like to create an automatic sequence of OpenOffice backups of a spreadsheet file, each being a daughter of the previous version. I would like it to autosave every hour, so that at the end of the day I can manually make up a 'day' file for permanent record.
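A sketch of one way to get the hourly daughters with cron: copy the file to a timestamped name during working hours. The file path and hours are assumptions, and note that the % signs must be escaped in a crontab:

Code:
0 9-17 * * * cp /home/user/Documents/ledger.ods "/home/user/Documents/versions/ledger-$(date +\%Y\%m\%d-\%H00).ods"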
I decided to create a file server for my family. I have set up a RAID 6 (4-disk) array. My thought was to back up the array to a hard drive monthly, and store the drive in a WiebeTech DriveBox, off site, in a "fireproof" box (the kind for papers sold at Staples or Office Depot). After a year, I would have 12 backups, and I would then start overwriting the oldest drive (i.e. the HDD from March 2011 would be overwritten in March 2012).
Additionally, I was wondering if there was recommended maintenance to verify the array is working properly. Right now, I am moving data to the array so quickly that I am backing up every few days between three hard drives. (Back-up #4 was written to Drive #1 after Drive #1 was reformatted.) I am aware that I could use rsync. (Which I currently use for backing up my portable USB HDD to the RAID array.)
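If the array is Linux software RAID (mdadm), the usual maintenance is a periodic scrub plus a glance at the array state; the md device name below is an assumption:

Code:
cat /proc/mdstat                                      # quick health overview
sudo mdadm --detail /dev/md0                          # per-disk state
echo check | sudo tee /sys/block/md0/md/sync_action   # kick off a scrub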
So I am using rsync (3.0.7 on Mac OS X) to back up one hard drive to a folder on another one. This is USB drive to USB drive, and I have done the initial backup from one drive to a newly formatted other drive with the following command:
Code: rsync -avX --progress /Volumes/Source /Volumes/Destination

This all appears to be going smoothly as I type. I am going to write a script to do subsequent backups in the future.
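A sketch of what that follow-up script might contain: the same rsync plus --delete, so removals on the source propagate and the destination stays an exact mirror, with the output logged (the log path is an assumption):

Code:
#!/bin/bash
rsync -avX --delete --progress /Volumes/Source /Volumes/Destination \
    >> "$HOME/backup.log" 2>&1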
I've decided to do a full, totally fresh install of Natty, since I've had so many issues with 10.10. How do I back up all my terminal commands and my PPA source list as .txt files? Also, can I do a fresh install via ethernet cable, or should I use the live CD to overwrite my current Maverick Ubuntu?
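Both of those live in plain files already, so the backup is just a copy; a package list is worth grabbing at the same time:

Code:
cp ~/.bash_history ~/terminal-commands.txt
cat /etc/apt/sources.list /etc/apt/sources.list.d/*.list > ~/ppa-sources.txt
dpkg --get-selections > ~/packages.txt   # handy when reinstalling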
I created a cron job and a shell script, but I'm not sure how to automatically handle the MySQL password prompt. I'm sure it's simple, but I'm having trouble figuring it out.
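The standard trick is to put the credentials in ~/.my.cnf (permissions 600) so mysqldump never prompts; the user, password, and database names here are placeholders:

Code:
# ~/.my.cnf:
#   [client]
#   user=backupuser
#   password=secret
mysqldump mydatabase > /backup/mydatabase-$(date +%F).sql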
I'm looking for a standalone backup manager with the following properties:
1) Easy scheduling.
2) Automatic encryption of backups
3) Ability to remove old backups based on total size, not just backup age. (to avoid overfilling backup media)
4) Since this is going onto a business-critical machine used by a techno-peasant, it needs to have a snazzy, graphical interface for easy monitoring and configuration.
I am sure I *could* write this myself, but I find it hard to believe that there isn't one out there already, and I am lazy. Unfortunately, there are also a very large number of backup programs out there with less than complete descriptions, and I am getting tired of installing each in turn to see what it does. Has anyone stumbled across something like what I just described?
I have a problem with K9Copy's backup system: it cannot seem to shrink a DVD-9 down to fit on a standard 4.7GB DVD disc. How do I correct this? As it stands, it renders K9Copy totally pointless for me.
I'm trying to follow various guides to set up a Time Machine drive on my Ubuntu 'natty' server. I am at the stage now, though, where I can't seem to find the external USB HDD anywhere. I have it partitioned into two, and cannot find anything to do with it. When I check, it doesn't show up in the mounted devices either. Is there a way to mount it or find it?
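A few commands that usually track down a USB disk the system isn't auto-mounting (the device name in the mount step is an assumption):

Code:
dmesg | tail -n 20        # did the kernel register the drive?
sudo fdisk -l             # list all disks and partitions
sudo blkid                # show filesystems and UUIDs
sudo mkdir -p /mnt/usbdrive
sudo mount /dev/sdb1 /mnt/usbdrive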