Software :: Incremental Encrypted Local Backup Utility Alternative To Duplicity?

Aug 10, 2010

I want to back up data and upload it to an online hosting service. I first want to encrypt, locally, the data that I want to back up. Since I will be making changes to the data locally, I want some sort of incremental imaging system in which the incremental changes are stored in separate files, so that I only have to upload the encrypted increments. Duplicity is an option, but it uses GPG, which makes it a bit complicated; I was wondering whether there is a simpler alternative, since I am only doing the encryption and backup locally.

EDIT: I have only ONE computer, on which the data resides and on which the backup image is made. That is, I have a directory foo on my computer, which will be backed up to back-foo on the same computer. I want back-foo to be in an encrypted form. back-foo will then be uploaded, as-is and with no further encryption, to Microsoft Live storage or to SpiderOak storage etc. Since back-foo is encrypted, my upload is secure. And since I'm uploading, I want incremental backup support: the backup utility should create new files containing only the incremental changes, so that I can upload just those new files.
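
One simpler route, if GPG feels like overkill, is GNU tar's incremental mode piped through openssl for symmetric encryption. A minimal sketch, assuming hypothetical paths and a passphrase kept in a local file:

Code:

#!/bin/bash
# Sketch only: the paths and the passphrase file are assumptions.
SRC=/home/user/foo
DEST=/home/user/back-foo
mkdir -p "$DEST"
# Each run emits an archive holding only the changes since the last run,
# encrypted symmetrically (no GPG keyring involved).
tar --listed-incremental="$DEST/snapshot.file" -czf - \
    -C "$(dirname "$SRC")" "$(basename "$SRC")" \
  | openssl enc -aes-256-cbc -salt -pass file:/home/user/.backup-pass \
  > "$DEST/foo-$(date +%Y%m%d-%H%M%S).tar.gz.enc"

Each .enc file contains only that run's changes, so only the newest file needs uploading; decrypting is openssl enc -d with the same passphrase file.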

View 2 Replies


Debian :: Incremental Encrypted Local Backup Utility?

Sep 17, 2010

I want to back up data and upload it to online hosting services. Since I'm uploading the data, I only want to upload encrypted data (so that even the hosting service admins cannot look at it). Thus, I first want to encrypt, locally, the data that I want to back up. Since I will be making changes to the data locally, I want some sort of incremental imaging system in which the incremental changes are stored in separate files, so that I only have to upload the encrypted increments.

Duplicity is an option, but it uses GPG, which makes it a bit complicated; I was wondering whether there is a simpler alternative, since I am only doing the encryption and backup locally.

View 1 Replies View Related

Ubuntu Security :: Encrypted Backup With Duplicity?

Feb 20, 2010

I managed to make an encrypted backup of my Ubuntu box onto my server and was also able to restore it. I mainly followed this tutorial here. Although everything worked fine, I have two questions. What is this part for? Quote: export PASSPHRASE=your_passphrase

Just for the fun of it, and to see how it would handle incremental backups, I ran the backup command a second time and was, to my surprise, asked to provide my GPG password. Why is that? And how can I "auto-login", since I would like to run this command in a cron job?
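
That export is what enables non-interactive runs: duplicity reads the passphrase from the PASSPHRASE environment variable instead of prompting. It asks again on incremental runs most likely because it has to decrypt the existing signature files to compute the deltas. A cron-friendly wrapper might look like this (the source path and target URL are placeholders):

Code:

#!/bin/bash
# Hypothetical cron wrapper; adjust the source path and target URL.
export PASSPHRASE=your_passphrase
duplicity /home/user scp://backupuser@server/backups/home
unset PASSPHRASE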

View 5 Replies View Related

CentOS 5 :: Backup Script With TAR - Incremental Backup With Simple FTP To Another Location And Email Status

Jan 15, 2010

After I spent some time discovering The BIG BANG of the Universe and The Meaning of Life:

I somehow managed to create a script that backs up files on the server, tars them, FTPs the archive to another FTP server at a different location, and then emails the result.

It also measures the time needed to complete, deletes archives older than XX days (set via find -mtime +20), and makes an incremental backup every weekday and a FULL backup on Sundays (which suits me, because there is no heavy load then).

The files for tar to include and exclude are listed in text files, one name per line:

file: including.txt:
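
For anyone reconstructing the core of such a script, a hedged sketch using tar's include/exclude lists and incremental mode (the names, paths, and schedule logic here are assumptions, not the poster's actual script):

Code:

#!/bin/bash
# Sunday: delete the snapshot state so the next tar run is a FULL backup.
[ "$(date +%u)" -eq 7 ] && rm -f /backup/snapshot.file
# Other days: the surviving snapshot file makes this an incremental backup.
tar --listed-incremental=/backup/snapshot.file \
    -T including.txt -X excluding.txt \
    -czf /backup/backup-$(date +%Y%m%d).tar.gz
# Expire old archives, as in the original script.
find /backup -name 'backup-*.tar.gz' -mtime +20 -delete
# ... the FTP upload, timing, and status email would follow here.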

View 7 Replies View Related

General :: Incremental Searching With The `less` Utility?

Feb 26, 2010

Does less have an incremental search?

I'm on Xubuntu.

View 2 Replies View Related

Fedora :: Incremental Automatic Backup

Jul 18, 2011

I am trying to find a backup program to incrementally back up some files to an external disk, every week for example. I would prefer not to have to write a script, as I am not really used to that.
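
If a short script ever becomes acceptable, rsync's --link-dest option delivers weekly incremental snapshots with almost no code: unchanged files are hard-linked to the previous snapshot, so each run stores only what changed. A sketch with assumed paths:

Code:

#!/bin/bash
# Hypothetical weekly snapshot to an external disk.
DEST=/mnt/external/snapshots
TODAY="$DEST/$(date +%Y-%m-%d)"
rsync -a --delete --link-dest="$DEST/latest" /home/user/ "$TODAY/"
ln -nsf "$TODAY" "$DEST/latest"   # remember the newest snapshot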

View 4 Replies View Related

General :: Administrator - How To Take Incremental Backup

Mar 16, 2011

I'm trying to back up my data on RHEL, but I am not able to get a complete backup. Could anybody show me how to take incremental and full backups? What is the process?
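
One common recipe on RHEL is GNU tar's incremental mode: deleting the snapshot file forces a full backup, while keeping it produces incrementals. A sketch with assumed paths:

Code:

# Full backup: start from a fresh snapshot file.
rm -f /backup/data.snar
tar --listed-incremental=/backup/data.snar -czf /backup/full.tar.gz /data

# Subsequent incremental backups: reuse the same snapshot file.
tar --listed-incremental=/backup/data.snar \
    -czf /backup/incr-$(date +%F).tar.gz /data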

View 3 Replies View Related

Ubuntu :: Rsync Like Incremental Backup

Oct 20, 2010

When rsync has finished the update, or even in the meantime, I need to move the updated files to a different location, like a directory named with date +%Y%m%d or something. The reason is that, because of development, I need all of the modified files, not just the last version, so I have to store them daily. But I don't want to store the whole directory, just the few files that were updated. Does that make sense?
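
One way to get exactly that with rsync is --compare-dest: a first pass copies only the files that differ from the destination into a dated directory, and a second pass updates the destination as usual. A sketch with assumed paths:

Code:

# Pass 1: capture only changed files into today's archive directory
# (files identical to /dest/ are skipped).
rsync -a --compare-dest=/dest/ /src/ /archive/$(date +%Y%m%d)/
# Pass 2: bring the destination up to date as usual.
rsync -a /src/ /dest/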

View 5 Replies View Related

Ubuntu :: How To Incremental Backup Of Folder

Mar 8, 2011

I want an incremental backup of a folder. The problem with, e.g., find & tar is that I want to back up not only files with a modification time after x:y, but also older files that have been copied into the folder since the last backup.
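
Copying or moving a file preserves its mtime (with cp -p, mv, rsync -a and the like) but gives it a fresh ctime, so selecting by change time against a stamp file catches both cases. A sketch with assumed paths (create the stamp once with touch before the first run):

Code:

# Pick up everything whose inode changed since the last run, including
# files copied in with old modification times.
find /data -type f -cnewer /var/backups/laststamp -print0 \
  | tar --null -T - -czf /backup/incr-$(date +%F).tar.gz
touch /var/backups/laststamp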

View 6 Replies View Related

Server :: Incremental Backup In Linux

Sep 18, 2010

A complete backup using tar consumes a lot of time, so is there any way to take incremental backups using tar? I also want to take incremental backups of my database dumps. Any suggestions and links would be very helpful; I keep googling for this but could not find anything exact.
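
GNU tar supports incrementals directly via a snapshot file, and for the databases the usual MySQL pattern is a full dump plus binary logs as the increments. A sketch with assumed paths:

Code:

# Files: incremental tar; the .snar file tracks state between runs.
tar --listed-incremental=/backup/files.snar \
    -czf /backup/files-$(date +%F).tar.gz /srv/data

# Database: a full dump that also rotates the binary log, so the binlogs
# written afterwards are the incremental changes.
mysqldump --single-transaction --flush-logs --all-databases \
  | gzip > /backup/db-full-$(date +%F).sql.gz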

View 14 Replies View Related

General :: Binary - Use Logs To Have Incremental Backup

May 7, 2009

I had a full backup of MySQL. Now I have added some tables, and I have new binary logs. How can I use these logs to make an incremental backup?
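
A hedged sketch of the usual binlog cycle: flush to close the active log, archive the closed ones as the incremental backup, and replay them after the full dump when restoring (the file names are examples):

Code:

# Take the incremental: close the active binary log, archive the closed ones.
mysqladmin flush-logs
cp /var/lib/mysql/mysql-bin.000001 /var/lib/mysql/mysql-bin.000002 /backup/binlogs/

# Restore: load the full backup, then replay the logs in order.
mysql < full_backup.sql
mysqlbinlog /backup/binlogs/mysql-bin.000001 /backup/binlogs/mysql-bin.000002 | mysql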

View 3 Replies View Related

Ubuntu :: Cron - Backup Runs To Have Incremental Number

Apr 3, 2010

I am trying to write a bash script so that backups can be run by cron on the web hosting server. I want all backup runs to have an incremental number in front of them. I came up with the idea of storing the incremental number of a backup in a text file on the server, so that the next run can check the number in the file. However, I am having terrible issues. My script:

[code]....
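
The script itself is not shown, but a minimal working version of the counter-file idea (the file name and backup command are assumptions) could be:

Code:

#!/bin/bash
COUNTER_FILE=/var/backups/counter.txt
# Fall back to 0 if the counter file does not exist yet.
n=$(cat "$COUNTER_FILE" 2>/dev/null || echo 0)
n=$((n + 1))
echo "$n" > "$COUNTER_FILE"
tar -czf "/var/backups/${n}-backup-$(date +%F).tar.gz" /var/www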

View 7 Replies View Related

Ubuntu Servers :: MySQL Incremental Backup - Binary Log File Names

Mar 15, 2011

I am currently using a script to back up my Ubuntu 10.04.1 system. The mySQL databases are backed up separately from the system / data.

My problem is with the mySQL incremental / binary log backups.

The problem is that the binary log file(s) are always named xxxx-bin.1.

Up to about a month ago the binary logs were named xxxx-bin.000001, xxxx-bin.000002, etc.

I did make some changes at about the time this change in file naming occurred, but I cannot identify what setting, if any, I may have changed that causes all of the binary log files to always have the same name.

My backup script uses both mysqldump and mysqladmin flush-logs to create the binary logs.

All of the settings for mysqldump and mysqladmin are contained in the my.cnf file.

The relevant contents of the my.cnf file are as follows:

Code:

The statements in the backup script that do the backup are:

mysqladmin flush-logs

or

mysqldump | gzip > $DB_BACKUP_DIR/$ARCHIVE_FILE #Note: delete-master-logs in my.cnf
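
For comparison, a my.cnf binary-log section of roughly the following shape normally yields the numbered series xxxx-bin.000001, xxxx-bin.000002, and so on; this is a generic example, not the poster's actual file:

Code:

[mysqld]
log-bin          = /var/log/mysql/xxxx-bin
max_binlog_size  = 100M
expire_logs_days = 20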

View 3 Replies View Related

Ubuntu Multimedia :: Alternative To Canon EOS Utility?

Nov 15, 2010

I need to know whether there are native Linux utilities that work like Canon's EOS Utility, i.e. that import each photo to the laptop immediately after the shot.
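
gphoto2 supports tethered shooting for many Canon EOS bodies, which covers this use case from the command line; a minimal sketch:

Code:

# Wait for shutter releases and download each new photo as it is taken.
gphoto2 --capture-tethered

GUI front-ends built on the same libgphoto2 stack exist as well.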

View 3 Replies View Related

Ubuntu :: Alternative For Promise WebPAM Utility?

May 2, 2011

I need a utility to rebuild RAID setups after an HDD failure. I currently have two HDDs hooked up to a Promise FastTrak TX2650/TX4650 controller; under Windows I could use Promise's WebPAM utility to rebuild the RAID. Well, I no longer use Windows (and never plan to again), so is there a native Linux program that can rebuild RAID configurations in the event of an HDD failure?

Promise finally released WebPAM for Linux [URL], but I believe you must install Promise's proprietary Linux driver in order to get WebPAM to work. I would really like to avoid installing their proprietary driver if at all possible.

View 5 Replies View Related

Ubuntu :: Back In Time Backup Utility?

Jan 6, 2010

How many of you use Back In Time as your backup utility? I tried using it, and it doesn't copy all of the folder contents to the backup drive in one pass. For example, it will copy 26 out of roughly 80 gigs of data. To complete the backup, I have to hit the "Take a snapshot" button to do another pass and add more data to the snapshot. I have to do this a couple of times to get all the data. Does anyone else have this issue?

[UPDATE] It appears to copy all of the files at once, as long as you only select one backup location at a time. I was backing up an entire multimedia drive, my home directory, and my USB drive. When I set it to do only the multimedia drive, it copied all of the files, whereas it wouldn't when it was set up to back up all three locations at the same time. I guess the lesson here is to back up one location, then add another, take another snapshot, and repeat.

View 2 Replies View Related

Ubuntu :: Choosing A Backup Utility - Scheduling?

Apr 30, 2010

I have a computer running Ubuntu 9.10 as a server (it is standard Ubuntu, not the Ubuntu Server edition). I have four 1TB hard drives, three of which I want to back up on certain days of the week. I have tried using Lucky-Backup and rsync, but neither seems to be able to handle the amount of data (there is currently about 400GB). Does anyone know of a program that can run scheduled backups of this size?
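
If the failures came from the GUI layers rather than rsync itself, a plain rsync in cron is worth a try: 400GB is well within its reach. A sketch, with paths and schedule as assumptions:

Code:

# /etc/crontab entries: mirror the data drives to the backup drive
# at 02:00 on Monday, Wednesday and Friday.
0 2 * * 1,3,5  root  rsync -a --delete /mnt/data1/ /mnt/backup/data1/
0 2 * * 1,3,5  root  rsync -a --delete /mnt/data2/ /mnt/backup/data2/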

View 3 Replies View Related

Ubuntu :: Good Native Backup Utility For 10.04

Jul 27, 2010

Can anyone recommend a good native backup utility for 10.04? I would like compression, the ability to image partitions and/or drives, and a simple way to restore in the event of total drive failure. A nice incremental backup facility would be good too. I would be backing up to an external USB hard drive that is smaller than the source drive, so compression and the ability to choose what to back up and what to leave out are needed.

View 1 Replies View Related

Server :: Red Hat 5.5 - Amanda (or Alternative) Backup

Jun 6, 2010

I'm fairly competent with Linux at this stage, and as we have just installed a new Red Hat 5.5 server to host our DB at work, it has basically fallen to me to sort everything out. The only thing I'm not 100% on is a backup strategy. Basically, the server / DB is not that big, and as such we will be doing a full backup of the whole file system every night.

As it is, there are several LVM partitions across several disks (these are all virtual, both on the SAN and as a virtual machine). My question is: what would be the best way to create a full backup of the whole server overnight, including all the partitions (or a backup of each partition, which I'm guessing is more feasible)? This doesn't seem too difficult to do, but my main problem is restoring the backup.

Is it a simple matter of basically tarring the whole / filesystem? And if so, to restore, can I just use a live CD to restore the partition that is needed? How does LVM affect the restoration process? I have looked at Amanda as a backup solution, though we won't be backing up directly to tape (we back up to a share location and then copy to the tape drive).
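
If the tar route is taken, a hedged sketch looks like this (excluding pseudo-filesystems; paths are examples). Restoring is then a matter of recreating the volume group, logical volumes, and filesystems from the live CD, mounting them, and untarring into the target; the archive itself is indifferent to LVM:

Code:

# Archive the root filesystem, skipping virtual and removable mounts.
tar -czpf /mnt/backup/full-$(date +%F).tar.gz \
    --exclude=/proc --exclude=/sys --exclude=/dev \
    --exclude=/mnt --exclude=/tmp /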

View 7 Replies View Related

Ubuntu :: Best Backup Method - Copy Drive Image Or Use Utility

Jul 8, 2011

I just got a 2TB drive with the intention of backing up multiple Ubuntu machines to it. What would be the best way to do this, keeping ease of restoration in mind? Should I just copy each drive image to the BU drive, or use a utility like Back in Time?

View 8 Replies View Related

Software :: Ubuntu 9.04 Server - Finding Utility To Backup Entire HDD?

Dec 26, 2009

I'm running Ubuntu 9.04 64-bit Server and am looking to back up my entire OS drive. I've got a 200GB main drive and a 1TB storage drive mounted at /storage. I'm already covered as far as backups of my data go, but redoing all of my settings and software would be a nightmare in the event of an HD failure.

So what I'm looking for is a command-line utility to make an image of the main 200GB drive onto an external USB drive. The software needs to function similarly to the Windows Vista/7 System Image utility or DriveImage XML, and it must be able to make the images without shutting down. The best I've found so far was [URL], but it uses a GUI and doesn't support large files.
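
The lowest-tech option is dd, sketched below with assumed device names, though with a real caveat: imaging a mounted filesystem while it changes can produce an inconsistent copy, so quiescing services (or snapshotting first, e.g. with LVM) is safer:

Code:

# Stream a compressed image of the OS drive to the external USB disk.
dd if=/dev/sda bs=4M conv=sync,noerror | gzip > /mnt/usb/sda-$(date +%F).img.gz
# Restore (from a live CD):
#   gunzip -c /mnt/usb/sda-YYYY-MM-DD.img.gz | dd of=/dev/sda bs=4M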

View 1 Replies View Related

Software :: Use Wget To Retrieve Some Data From Tape Backup Utility?

Sep 30, 2010

I'm trying to use wget to retrieve some data from our tape backup utility (an HP Command View 1/8 G2 Autoloader). The URL requires two parameters for the info I want to retrieve. I have searched for a few hours and tried numerous combinations, but the parameters aren't being honored. I have escaped the URL as well.

Code:

wget --user=x --password=x --recursive --no-clobber --page-requisites --html-extension --convert-links --no-parent -O ssi.html "10.0.x.x/inventory_status.ssi?mag_0=1&mag_1=1"

returns:

Code:

<HTML><HEAD><TITLE></TITLE></HEAD><BODY>
<SCRIPT LANGUAGE="javascript">
top.location.href='logout.ssi';

[code]....
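
That redirect to logout.ssi usually means the device sees an unauthenticated session rather than a parameter problem. One hedged approach is to log in once, keep the session cookie, and reuse it; the login URL and form field names below are guesses that would need checking against the device's actual login page:

Code:

# Log in and store the session cookie.
wget --save-cookies cookies.txt --keep-session-cookies \
     --post-data 'username=x&password=x' \
     -O /dev/null "http://10.0.x.x/login.ssi"

# Reuse the session cookie for the actual request.
wget --load-cookies cookies.txt \
     -O ssi.html "http://10.0.x.x/inventory_status.ssi?mag_0=1&mag_1=1"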

View 3 Replies View Related

Networking :: Alternative Routing For Local Process?

Nov 26, 2010

I have a multihomed server connected to two different ISPs. All default traffic goes to ISP1 via wan1. There are special local processes on my system that must go out through ISP2 via wan2. These processes make connections to TCP port 80.

What I did:

[root@localhost ~]# ifconfig wan1 10.44.8.252 netmask 255.255.255.0 broadcast 10.44.8.255 up
[root@localhost ~]# ip r r default via 10.44.8.1

[code]....

I see that the frames go out with the SRC address of wan1... I tried this:

[root@localhost ~]# iptables -t nat -I POSTROUTING -o wan2 -p tcp --dport 80 -j SNAT --to-source 192.168.86.2

and saw:

[root@localhost ~]# telnet 194.87.0.50 80
[root@localhost ~]# tcpdump -i wan2 -nnt port 80
tcpdump: verbose output suppressed, use -v or -vv for full protocol decode

[code]....

The connection was not established... conntrack does not see it!
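
A hedged sketch of the usual fix: mark the traffic in the OUTPUT chain, route marked packets via a dedicated table, and only then SNAT them to wan2's address (the gateway address and table number are examples):

Code:

# Route packets carrying mark 2 through a dedicated routing table.
ip rule add fwmark 2 table 2
ip route add default via 192.168.86.1 dev wan2 table 2

# Mark locally generated web traffic, then rewrite its source address.
iptables -t mangle -A OUTPUT -p tcp --dport 80 -j MARK --set-mark 2
iptables -t nat -A POSTROUTING -o wan2 -p tcp --dport 80 -j SNAT --to-source 192.168.86.2

# Reverse-path filtering may also need relaxing on wan2.
sysctl -w net.ipv4.conf.wan2.rp_filter=0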

View 2 Replies View Related

General :: Good Backup Utility / Script / App / Got External FAT32 Drive

Mar 10, 2011

I am working from a laptop where all my work is stored on an 80GB drive. I now also own an external 250GB USB hard drive, formatted with FAT32. I want to keep it FAT32 so that I can offer some of my files to people who run Mac OS or Windows, without making them install ext3 drivers and whatnot. I need a strategy that will let me keep a mirror of my laptop drive on the new external drive, i.e. no history / versioning required. However, I do care about file permissions. The files don't have to be stored as-is; they can be stored within a large (80GB?) tar file, that is fine. It would be easier to coerce people into opening a .tar file than into installing an ext3 driver for their OS, I suppose. I don't think I can keep file permissions otherwise, can I?

I have previously used a self-written sh script that used rsync to keep an up-to-date copy of my laptop filesystem on a USB flash drive, but in that case the flash drive was formatted with ext3, so file permissions were no problem there. This time, it's trickier.
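
tar does preserve ownership and permissions inside the archive even when the archive itself sits on FAT32; the remaining wrinkle is FAT32's 4GB file-size cap, which split works around. A sketch with assumed paths:

Code:

# Archive home with permissions intact, split into FAT32-safe 2GB pieces.
tar -cpzf - /home/myuser | split -b 2000m - /mnt/usb/home-backup.tar.gz.
# Restore later with:
#   cat /mnt/usb/home-backup.tar.gz.* | tar -xpzf - -C /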

View 9 Replies View Related

Programming :: Incremental Backup Using DUMP Command - Error "DUMP: Only Level 0 Dumps Are Allowed On A Subdirectory"

Sep 6, 2010

I have used the dump command to dump the application files. For a full backup, level 0 works fine. For an incremental backup I used level 1 or 2, and I get this error:

DUMP: Only level 0 dumps are allowed on a subdirectory
DUMP: The ENTIRE dump is aborted.

The code I used
===============================
#!/bin/bash
#Full Day Backup Script
#application folders backup
#test is the username
now=$(date +"%d-%m-%Y")
[Code]...
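
The error is by design: dump works on whole filesystems, and only level 0 is permitted when the argument is a subdirectory. Incremental levels therefore need either a filesystem (device) as the argument or a different tool for the subdirectory case. A hedged sketch of both routes, with example paths:

Code:

# Level-1 incremental dump of an entire filesystem (levels > 0 allowed here):
dump -1 -u -f /backup/app-incr.dump /dev/sda2

# For a mere subdirectory, GNU tar's incremental mode is the usual substitute:
tar --listed-incremental=/backup/app.snar \
    -czf /backup/app-incr-$(date +%F).tar.gz /home/test/app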

View 2 Replies View Related

General :: How To Do Rsync-like Encrypted Backup

Feb 27, 2011

I want to save a backup of my data on a remote server, but I never want the backup server to see the data unencrypted. Editing a single file and backing up should not result in everything being re-encrypted and sent again. The remote server should preferably not even know the directory structure (and especially not the directory names).
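
One hedged option is encfs --reverse, which presents an on-the-fly encrypted view of the plaintext tree, encrypted file and directory names included, that plain rsync can push; only files whose ciphertext changed get re-sent. The server still sees the shape of the tree (file counts and approximate sizes), just not names or contents. A sketch with assumed paths:

Code:

# Present an encrypted view of ~/data at ~/.data-enc (nothing duplicated on disk).
encfs --reverse /home/user/data /home/user/.data-enc
# Push the encrypted view; only changed files are transferred.
rsync -a --delete /home/user/.data-enc/ backup@server:/backups/data-enc/
fusermount -u /home/user/.data-enc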

View 2 Replies View Related

Ubuntu :: Backup PC And Encrypted Home Folders?

Jan 2, 2010

I recently switched my work laptop from running WinXP to running Karmic. I'm still at the stage of getting my various bits and bobs working correctly. One thing I (may) have a problem with is backups. I've run BackupPC on an Ubuntu 9.04 box in the attic for the last year or so, and I've been backing up my laptop to it. But since the switch, because I have an encrypted home directory, what is being backed up is the encrypted files. First, can I recover these if needed (I kept a copy of my passphrase)? Or can I get BackupPC to ssh in as me with my home directory mounted correctly?

BackupPC is using rsync over ssh. I've been using Linux on and off since about Red Hat 5.0, so I'm not afraid of the command line or vi.
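
On the recovery question: the backed-up .Private data can be remounted from any rescue system with the mount passphrase; ecryptfs-utils ships a helper for exactly this. A hedged sketch (the restore path is an example):

Code:

# After restoring the encrypted files somewhere, recover them interactively
# with the mount passphrase.
sudo ecryptfs-recover-private /mnt/restore/home/.ecryptfs/user/.Private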

View 5 Replies View Related

Debian :: Backup Plan For Encrypted Hard Drive

Jun 25, 2010

My laptop has only Debian on it. Except for /boot, the entire hard drive is one giant encrypted LVM partition. It takes Clonezilla 13 hours to back it up to a USB hard drive without verification, which is long enough to ensure backups aren't done much. Is there some way to make an encrypted bare-metal backup of only what is used (except swap), instead of every sector? Backing up across the LAN would be OK.
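
A hedged file-level alternative that copies only used data and keeps it encrypted both in transit and at rest (the host name and paths are examples); note this is not a sector image, so the partition table, /boot, and the LUKS header would need separate handling for true bare-metal recovery:

Code:

# Archive the mounted filesystems, encrypt symmetrically, stream over the LAN.
tar -cpzf - --exclude=/proc --exclude=/sys --exclude=/dev / \
  | gpg --symmetric --cipher-algo AES256 \
  | ssh user@nas 'cat > /backups/laptop-$(date +%F).tar.gz.gpg'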

View 6 Replies View Related

Debian :: Encrypted Backup To Remote Dumb Server

Mar 23, 2011

This is not a regular backup. I only want to back up selected directories, so that personal files (photographs, documents, source code) are kept safe in case of a total system meltdown. This will be 15GB max; basically the digital variant of a fire-resistant safe. I looked into duplicity, but that requires me to install GPG keys on the target machine, which I cannot do. I would rather have a solution that relies on nothing more than a working shell account and disk space on the target server.

I thought of writing a simple script to do the following (a sketch follows the requirements list below):
1. Mount remote server with sshfs
2. Mount encrypted container at remote server (LUKS, TrueCrypt?)
3. Loop over predefined directories on local machine and copy to encrypted container (rdiff-backup?)

Based on these requirements:
- Target server is "dumb": only ssh access + disk space (i.e. no installing of GPG keys)
- Encrypted container should grow/shrink to fit contents
- Encrypted container should be easily decryptable on any OS if you have the password
- Once data leaves the client it should be encrypted: the sysadmin on the target server should never be able to see unencrypted data.
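
A hedged sketch of that plan with a LUKS container file; the names, sizes, and mount points are examples, and note that a plain LUKS container is fixed-size rather than growing and shrinking with its contents:

Code:

#!/bin/bash
# One-time setup (run once):
#   dd if=/dev/zero of=/mnt/remote/container.img bs=1M count=16000
#   losetup /dev/loop0 /mnt/remote/container.img
#   cryptsetup luksFormat /dev/loop0
#   cryptsetup luksOpen /dev/loop0 vault && mkfs.ext3 /dev/mapper/vault

sshfs user@server:backup /mnt/remote                    # 1. mount remote server
losetup /dev/loop0 /mnt/remote/container.img
cryptsetup luksOpen /dev/loop0 vault                    # 2. open container (asks password)
mount /dev/mapper/vault /mnt/vault
rdiff-backup /home/user/documents /mnt/vault/documents  # 3. copy selected directories
umount /mnt/vault
cryptsetup luksClose vault
losetup -d /dev/loop0
fusermount -u /mnt/remote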

View 3 Replies View Related

General :: Encrypted Backup Solution To USB HD (Open Source)

Mar 26, 2011

There are a lot of backup solutions, many of them scripts based on rsync. The problem is that not many of them encrypt your data before syncing it. I have a USB hard drive, and I want to back up my user folder, /home/myuser/, to the external drive. What software will allow me to create incremental backups which are encrypted, with relative ease?
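
duplicity may actually be the path of least resistance here: pointed at a local file:// target with just a passphrase, it produces encrypted incremental backups without any GPG key setup. A hedged sketch with assumed paths:

Code:

# Symmetric encryption: a passphrase only, no key pair required.
export PASSPHRASE=your_passphrase
duplicity /home/myuser file:///mnt/usbdrive/backups
# Restore with: duplicity file:///mnt/usbdrive/backups /home/myuser
unset PASSPHRASE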

View 2 Replies View Related






