Debian :: Incremental Encrypted Local Backup Utility?

Sep 17, 2010

I want to back up data and upload it to online hosting services. Since I'm uploading the data online, I only want to upload encrypted data (so that even the hosting service admins cannot look at it). Thus, I first want to encrypt locally the data that I want to back up. Since I will be making changes to the data locally, I want some sort of incremental imaging system where the incremental changes are stored in separate files, so that I only have to upload the incremental encrypted changes.

Duplicity is an option, but it uses GPG, which makes it a bit complicated; I was wondering if there is a simpler alternative, since I am only doing the encryption and backup locally.
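
For what it's worth, duplicity can be used without managing GPG keys at all: if no encryption key is given and the PASSPHRASE environment variable is set, it falls back to symmetric encryption. A minimal sketch, with hypothetical source and target paths:

Code:

export PASSPHRASE='my-backup-secret'   # symmetric passphrase, no GPG keyring involved
# the first run creates a full backup; later runs are incremental automatically
duplicity /home/user/foo file:///home/user/back-foo
# restore into a separate directory
duplicity restore file:///home/user/back-foo /home/user/foo-restored
unset PASSPHRASE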

View 1 Replies


Software :: Incremental Encrypted Local Backup Utility Alternative To Duplicity?

Aug 10, 2010

I want to back up data and upload it to online hosting services. I first want to encrypt locally the data that I want to back up. Since I will be making changes to the data locally, I want some sort of incremental imaging system where the incremental changes are stored in separate files, so that I only have to upload the incremental encrypted changes. Duplicity is an option, but it uses GPG, which makes it a bit complicated; I was wondering if there is a simpler alternative, since I am only doing the encryption and backup locally.

EDIT: I have only ONE computer, on which the data resides and on which the backup image is made. That is, I have a directory foo on my computer, a backup of which will be made to back-foo on the same computer. I want back-foo to be in encrypted form. Then back-foo will be uploaded as-is (with no further encryption) to Microsoft Live storage, SpiderOak storage, etc. Since back-foo is encrypted, my upload is secure. And since I'm uploading, I want incremental backup support; that is, the backup utility should create new files containing the incremental changes, so that I can upload only the new files that contain the changes.
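
A simple alternative along these lines is GNU tar's snapshot-based incrementals combined with gpg's symmetric mode (still GPG underneath, but driven by a plain passphrase rather than a keyring). A rough sketch, with hypothetical paths:

Code:

# full run; foo.snar records what has been archived so far
tar --listed-incremental=back-foo/foo.snar -czf back-foo/foo-full.tar.gz foo
gpg --symmetric --cipher-algo AES256 back-foo/foo-full.tar.gz   # prompts for a passphrase, writes a .gpg file

# later runs against the same .snar file only contain what changed since the previous run
tar --listed-incremental=back-foo/foo.snar -czf back-foo/foo-inc-$(date +%Y%m%d).tar.gz foo
gpg --symmetric --cipher-algo AES256 back-foo/foo-inc-$(date +%Y%m%d).tar.gz

Only the resulting .gpg files would then need to be uploaded.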

View 2 Replies View Related

CentOS 5 :: Backup Script With TAR - Incremental Backup With Simple FTP To Another Location And Email Status

Jan 15, 2010

After I spent some time discovering The BIG BANG of the Universe and The Meaning of Life:

I somehow managed to create a script that backs up files on the server into a TAR archive, FTPs the archive to an FTP server at another location, and then emails the result.

It also measures the time needed to complete, deletes archives older than XX days (set with find -mtime +20), and makes an incremental backup every weekday and a FULL backup on Sundays (which suits me because there is no heavy load then).

The files for TAR to include and exclude are listed in text files, one name per line (a rough sketch of this kind of job is shown after the listing below):

file: including.txt:
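
For reference, a stripped-down sketch of that kind of job (hypothetical paths, host names and addresses; the include/exclude lists are assumed to hold one path per line) could look like this:

Code:

#!/bin/bash
# FULL backup on Sunday, incremental (files changed in the last day) on weekdays
DOW=$(date +%u)                      # 1..7, 7 = Sunday
STAMP=$(date +%Y%m%d)
ARCHIVE=/backup/backup-$STAMP.tar.gz
START=$(date +%s)

if [ "$DOW" -eq 7 ]; then
    tar -czf "$ARCHIVE" -T including.txt -X excluding.txt
else
    find $(cat including.txt) -type f -mtime -1 -print0 | \
        tar -czf "$ARCHIVE" --null -T - -X excluding.txt
fi

# upload the archive, then prune local archives older than 20 days
ftp -n ftp.example.com <<EOF
user backupuser backuppass
binary
put $ARCHIVE backup-$STAMP.tar.gz
bye
EOF
find /backup -name 'backup-*.tar.gz' -mtime +20 -delete

END=$(date +%s)
echo "Backup $ARCHIVE finished in $((END - START)) seconds" | mail -s "backup status" admin@example.com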

View 7 Replies View Related

General :: Incremental Searching With The `less` Utility?

Feb 26, 2010

Does less have an incremental search?

I'm on Xubuntu.

View 2 Replies View Related

Fedora :: Incremental Automatic Backup

Jul 18, 2011

I am trying to find a backup program to incrementally back up some files to an external disk every week, for example. I would prefer not to have to write a script, as I am not really used to that.

View 4 Replies View Related

General :: Administrator - How To Take Incremental Backup

Mar 16, 2011

I'm trying to back up my data on RHEL, but I am not able to back up everything. Could anybody show me how to take incremental and full backups? What is the process?
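
On RHEL, one traditional approach for ext2/ext3 filesystems is the dump utility: level 0 is a full backup and each higher level captures what changed since the last lower-level run. A minimal sketch, assuming a hypothetical data filesystem on /dev/sda2:

Code:

# full (level 0) backup; -u records the run in /etc/dumpdates
dump -0uf /backup/data-full.dump /dev/sda2

# later: level 1 picks up everything changed since the last level 0
dump -1uf /backup/data-inc1.dump /dev/sda2

# restore interactively from an archive
restore -i -f /backup/data-full.dump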

View 3 Replies View Related

Ubuntu :: Rsync Like Incremental Backup

Oct 20, 2010

When rsync has finished the update (or while it runs), I need to copy the updated files to a different location, something like a directory named with date +%Y%m%d. The reason is that, because of development, I need the modified files, all of them, not just the latest version, so I have to store them daily. But I don't want to store the whole directory, just the few files that were updated. Does that make sense?
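
rsync's --compare-dest option is close to this: it copies into the destination only the files that differ from a reference copy, so each day's changed files can land in their own dated directory while a separate mirror holds the current state. A hedged sketch with hypothetical paths:

Code:

# 1) copy only the files that differ from the current mirror into today's directory
rsync -a --compare-dest=/srv/mirror/ /home/dev/project/ /srv/changed/$(date +%Y%m%d)/

# 2) then bring the mirror itself up to date
rsync -a --delete /home/dev/project/ /srv/mirror/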

View 5 Replies View Related

Ubuntu :: How To Incremental Backup Of Folder

Mar 8, 2011

I want an incremental backup of a folder. The problem with, e.g., find & tar is that I want to back up not only files with a modification time after x:y, but also older files that have been copied into this folder since the last backup.
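
GNU tar's --listed-incremental mode sidesteps the mtime problem: it keeps a snapshot file of what has already been archived, so a file that merely appears in the folder (even with an old modification time) is picked up on the next run. A minimal sketch with hypothetical paths:

Code:

# first run: full archive, creates the snapshot file
tar --listed-incremental=/backup/folder.snar -czf /backup/folder-full.tar.gz /home/user/folder

# later runs: anything new or changed since the snapshot, including old files
# copied in after the last backup, goes into the incremental archive
tar --listed-incremental=/backup/folder.snar -czf /backup/folder-inc-$(date +%Y%m%d).tar.gz /home/user/folder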

View 6 Replies View Related

Server :: Incremental Backup In Linux

Sep 18, 2010

A complete backup using tar consumes a lot of time, so is there any way to take incremental backups using tar? I also want to take incremental backups of my database dumps. Any suggestions and links would be very helpful; I keep googling for this but could not find anything exact.
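
One way to keep the tar runs small is to archive only files newer than a reference timestamp file, and to dump the databases alongside. A hedged sketch (hypothetical paths and credentials):

Code:

STAMP=/backup/.last-run              # create once with: touch /backup/.last-run
NOW=$(date +%Y%m%d-%H%M)

# archive only files modified since the previous run
find /srv/data -type f -newer "$STAMP" -print0 | \
    tar -czf /backup/data-inc-$NOW.tar.gz --null -T -
touch "$STAMP"

# databases: a full logical dump each run; true MySQL incrementals are
# normally done with the binary log rather than with tar
mysqldump -u backup -pSECRET --single-transaction --all-databases | \
    gzip > /backup/mysql-$NOW.sql.gz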

View 14 Replies View Related

General :: Binary - Use Logs To Have Incremental Backup

May 7, 2009

I had a full backup of MySQL. Now I have added some tables and have new binary logs. How can I use these logs to have an incremental backup?
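
The usual pattern is to treat the binary logs written since the full dump as the incremental part and to replay them with mysqlbinlog at restore time. A minimal sketch, with hypothetical log and dump names:

Code:

# close the current binary log so the finished ones can be copied safely
mysqladmin -u root -p flush-logs
cp /var/lib/mysql/mysql-bin.000012 /var/lib/mysql/mysql-bin.000013 /backup/binlogs/

# restore: load the full dump first, then replay the logs on top of it
mysql -u root -p < /backup/full-dump.sql
mysqlbinlog /backup/binlogs/mysql-bin.000012 /backup/binlogs/mysql-bin.000013 | mysql -u root -p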

View 3 Replies View Related

Ubuntu :: Cron - Backup Runs To Have Incremental Number

Apr 3, 2010

I am trying to write a bash script in order to have backups done by cron on the web hosting server. I want all backup runs to have an incremental number in front of them. I came up with the idea of storing the incremental number of the backup in a text file on the server, so that when the next one runs it can check the number in the file. However, I am having terrible issues. My script:

[code]....
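
A minimal version of that counter-in-a-file idea (hypothetical paths; this is not the elided script above) might be:

Code:

#!/bin/bash
COUNTER_FILE=/home/user/backups/.counter

# read the previous run number, defaulting to 0 if the file does not exist yet
COUNT=$(cat "$COUNTER_FILE" 2>/dev/null || echo 0)
COUNT=$((COUNT + 1))

tar -czf /home/user/backups/${COUNT}-site-$(date +%Y%m%d).tar.gz /var/www/site

# remember the new number for the next cron run
echo "$COUNT" > "$COUNTER_FILE"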

View 7 Replies View Related

Ubuntu Servers :: MySQL Incremental Backup - Binary Log File Names

Mar 15, 2011

I am currently using a script to back up my Ubuntu 10.04.1 system. The MySQL databases are backed up separately from the system / data.

My problem is with the MySQL incremental / binary log backups.

The problem is that the binary log file(s) are always named xxxx-bin.1.

Up to about a month ago the binary logs were named xxxx-bin.000001, xxxx-bin.000002, etc.

I did make some changes at about the time this change in file naming occurred, but I cannot identify what setting, if any, I may have changed that has caused all of the binary log files to always have the same name.

My backup script uses both mysqldump and mysqladmin flush-logs to create the binary logs.

All of the settings for mysqldump and mysqladmin are contained in the my.cnf file.

The relevant contents of the my.cnf file are as follows:

Code:

The statements in the backup script that do the backup are:

mysqladmin flush-logs

or

mysqldump | gzip > $DB_BACKUP_DIR/$ARCHIVE_FILE #Note: delete-master-logs in my.cnf
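
For comparison, a typical flush-and-copy cycle (assuming the conventional xxxx-bin.NNNNNN naming and hypothetical paths) looks roughly like this; if the server really does keep reusing a single xxxx-bin.1 name, the log-bin line in my.cnf is worth re-checking against this pattern:

Code:

# rotate: close the current binary log and start a new one
mysqladmin -u root -p flush-logs

# copy every closed log (all but the newest, which is still being written)
cd /var/lib/mysql
ls xxxx-bin.[0-9]* | head -n -1 | xargs -I{} cp {} /backup/binlogs/

# optionally let the server discard logs that are now safely backed up
mysql -u root -p -e "PURGE BINARY LOGS TO 'xxxx-bin.000042';"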

View 3 Replies View Related

Ubuntu :: Back In Time Backup Utility?

Jan 6, 2010

How many of you use Back In Time as your backup utility? I tried using it, and it doesn't copy all of the folder contents to the backup drive in one pass. For example, it will copy 26 out of roughly 80 GB of data. To complete the backup, I have to hit the "Take a snapshot" button to do another pass and add more data to the snapshot. I have to do this a couple of times to get all the data. Does anyone else have this issue?

[UPDATE] It appears to copy all of the files at once, as long as you only select one backup location at a time. I was backing up an entire multimedia drive, my home directory, and my USB drive. When I set it to do only the multimedia drive, it copied all of the files, whereas it wouldn't when it was set up to back up all three locations at the same time. I guess the lesson here is to back up one location, then add another, take another snapshot, and repeat.

View 2 Replies View Related

Ubuntu :: Choosing A Backup Utility - Scheduling?

Apr 30, 2010

I have a computer running Ubuntu 9.10 as a server (it is standard Ubuntu, not the Ubuntu Server edition). I have four 1 TB hard drives, three of which I want to back up to on certain days of the week. I have tried using Lucky-Backup and rsync, but neither seems to be able to handle the amount of data (currently about 400 GB). Does anyone know of a program that can run scheduled backups of this size?
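
Plain rsync driven by cron generally copes with this volume; only the first full copy is slow, and later runs transfer just the changes. A hedged sketch of per-drive jobs on different weekdays (hypothetical mount points; swap source and destination to match the actual layout):

Code:

# /etc/crontab entries: minute hour day month weekday user command
30 2 * * 1 root rsync -a --delete /mnt/disk1/ /mnt/backup/disk1/
30 2 * * 3 root rsync -a --delete /mnt/disk2/ /mnt/backup/disk2/
30 2 * * 5 root rsync -a --delete /mnt/disk3/ /mnt/backup/disk3/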

View 3 Replies View Related

Ubuntu :: Good Native Backup Utility For 10.04

Jul 27, 2010

Can anyone recommend a good native backup utility for 10.04? I would like compression, the ability to image partitions and/or drives, and a simple way to restore in the event of total drive failure. A nice incremental backup facility would be good too. I would be backing up to an external USB hard drive of a smaller size than the source drive, so compression and the ability to choose what to back up (and what not to) are needed.

View 1 Replies View Related

Ubuntu :: Best Backup Method - Copy Drive Image Or Use Utility

Jul 8, 2011

I just got a 2 TB drive with the intention of backing up multiple Ubuntu machines to it. What would be the best way to do this, keeping ease of restoration in mind? Should I just copy each drive image to the backup drive, or use a utility like Back In Time?

View 8 Replies View Related

Software :: Ubuntu 9.04 Server - Finding Utility To Backup Entire HDD?

Dec 26, 2009

I'm running Ubuntu 9.04 64-bit server and am looking to back up my entire OS drive. I've got a 200 GB main drive and a 1 TB storage drive mounted at /storage. I'm already set as far as backing up my data goes, but redoing all of my settings and software would be a nightmare in the event of a hard drive failure.

So what I'm looking for is a command-line utility to make an image of the main 200 GB drive to an external USB drive. The software needs to work like the Windows Vista/7 System Image utility or DriveImage XML and be able to make the images without shutting down. The best I've found so far was [URL], but it uses a GUI and doesn't support large files.

View 1 Replies View Related

Software :: Use Wget To Retrieve Some Data From Tape Backup Utility?

Sep 30, 2010

I'm trying to use wget to retrieve some data from our tape backup utility (HP Command View 1/8 G2 Autoloader). The URL requires two parameters for the info I want to retrieve. I have searched for a few hours and tried numerous combinations to get the data, but the parameters aren't being honored. I have escaped the URL as well.

Code:

wget --user=x --password=x --recursive --no-clobber --page-requisites --html-extension --convert-links --no-parent -O ssi.html "10.0.x.x/inventory_status.ssi?mag_0=1&mag_1=1"

returns:

Code:

<HTML><HEAD><TITLE></TITLE></HEAD><BODY>
<SCRIPT LANGUAGE="javascript">
top.location.href='logout.ssi';

[code]....
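
The returned page is just a JavaScript redirect to logout.ssi, which suggests the device wants an authenticated session (a login cookie) rather than the credentials on the request itself. A hedged sketch of logging in first and reusing the cookie; the login URL and form field names here are assumptions and would need to match the device's actual login form:

Code:

# 1) log in and keep the session cookie (login.ssi and the field names are hypothetical)
wget --save-cookies /tmp/cv.cookies --keep-session-cookies \
     --post-data 'username=x&password=x' \
     -O /dev/null "http://10.0.x.x/login.ssi"

# 2) reuse the cookie for the real request; keep the URL quoted so & survives the shell
wget --load-cookies /tmp/cv.cookies \
     -O ssi.html "http://10.0.x.x/inventory_status.ssi?mag_0=1&mag_1=1"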

View 3 Replies View Related

General :: Good Backup Utility / Script / App / Got External FAT32 Drive

Mar 10, 2011

I am working from a laptop where all my work is stored on an 80 GB drive. I am now also the owner of an external 250 GB USB hard drive, formatted with FAT32. I want to keep it FAT32 so that I can offer some of my files to people who run Mac OS or Windows, and I don't want to make them install ext3 for Windows and whatnot. I need a strategy that will allow me to keep a mirror of my laptop drive on the new external drive, i.e. no history / versioning required. However, I do care about file permissions. The files don't have to be stored as-is; they can be stored within a large (80 GB?) tar file, that is fine. It would be easier to get people to open a .tar file than to install an ext3 driver for their OS, I suppose. I don't think I can keep file permissions otherwise, can I?

I have previously used a self-written shell script that used rsync to keep an up-to-date copy of my laptop filesystem on a USB flash drive, but in that case the flash drive was formatted with ext3, so there was no problem with file permissions. This time it's trickier.
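
A tar archive does preserve ownership and permissions even when the archive itself sits on FAT32; the main catch is FAT32's 4 GB file-size limit, which split can work around. A hedged sketch with hypothetical paths:

Code:

# -p keeps permissions; pipe through split because FAT32 caps single files at 4 GB
tar -cpzf - /home/user | split -b 3900m - /media/usb/laptop-backup.tar.gz.part-

# restore later by concatenating the parts back into tar
cat /media/usb/laptop-backup.tar.gz.part-* | tar -xpzf - -C /restore/target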

View 9 Replies View Related

Programming :: Incremental Backup Using DUMP Command - Error "DUMP: Only Level 0 Dumps Are Allowed On A Subdirectory"

Sep 6, 2010

I have used the dump command to dump the application files. For a full backup, level 0 works fine. For an incremental backup I used level 1 or 2, and it gives this error:

DUMP: Only level 0 dumps are allowed on a subdirectory
DUMP: The ENTIRE dump is aborted.

The code I used
===============================
#!/bin/bash
#Full Day Backup Script
#application folders backup
#test is the username
now=$(date +"%d-%m-%Y")
[Code]...
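
That message is dump's own restriction: when the target is a subdirectory rather than a whole filesystem, only level 0 is supported, so incrementals need either the filesystem's mount point or a different tool (for example GNU tar with --listed-incremental). A hedged sketch, assuming the application files live on their own filesystem mounted at /app:

Code:

# incremental levels work against the whole filesystem, not a subdirectory inside it
dump -0uf /backup/app-full-$now.dump /app    # full backup (level 0)
dump -1uf /backup/app-inc-$now.dump /app     # changes since the last level 0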

View 2 Replies View Related

Server :: Backup Goes To Local Disk?

Jun 7, 2011

I have an external drive that I want to back up to. Most times it goes great; other times the server gets really sluggish, and I do a 'df' and see I'm at 96% disk usage. What has occurred is that the drive apparently failed to mount, so the backup goes to my local disk at /media/backups/.

I have /media/backups in my /etc/fstab pointing to /dev/sdc1, but I think the external disk goes to sleep when not in use for long periods.

How do I make sure /media/backups is REALLY going to the external drive and not my local drive? Is there any way to test it BEFORE I write umpteen gigs to my local hard drive?
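
One simple guard is to check, at the top of the backup script, whether /media/backups is actually a mount point, and bail out (or try to mount it) if it is not. A minimal sketch:

Code:

#!/bin/bash
# abort the backup unless the external drive is really mounted there
if ! mountpoint -q /media/backups; then
    mount /media/backups || { echo "external backup drive not mounted" >&2; exit 1; }
fi
# ... run the actual backup here ...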

View 6 Replies View Related

OpenSUSE :: Backup And Restore Local Folders In K Mail?

Mar 4, 2011

How do I back up and restore local folders in K Mail? I understand that this is a rudimentary question; under K Mail in 11.2 there is no backup and restore option for local folders (that I have been able to find, anyway). I use 11.2 on this machine, where I do all my e-mail, browsing and pretty much everything internet-related. I am going to upgrade to 11.4 in another week and want to make sure that I understand backup and restore procedures for K Mail, in case something untoward happens during the upgrade to 11.4.
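
With K Mail there is usually no separate backup tool needed: the local folders are plain files under the KDE data directory, so copying them with K Mail closed is enough. A heavily hedged sketch; the paths below are typical for KDE4 on openSUSE but are assumptions and should be verified on the actual system:

Code:

# close K Mail first so nothing is being written while it is copied
cd ~
tar -czf kmail-backup-$(date +%Y%m%d).tar.gz \
    .kde4/share/apps/kmail \
    .kde4/share/config/kmailrc \
    .kde4/share/config/emailidentities

# restore: extract back into the home directory, again with K Mail closed
tar -xzf kmail-backup-YYYYMMDD.tar.gz -C ~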

View 2 Replies View Related

Ubuntu :: Using Rsync To Backup The Local Home Directory?

Oct 14, 2010

I went ahead and created this directory

mkdir /tmp/rsync-backup

and ran this:

rsync av /home /tmp/rsync-backup

This is the result:

rsync: link_stat "/av" failed: No such file or directory (2)
skipping directory home
rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1060) [sender=3.0.7]
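
The error comes from the missing dash: rsync treats av as a path to copy. With the options written as -av the command works as intended. A minimal corrected form:

Code:

rsync -av /home /tmp/rsync-backup

Note that /home without a trailing slash creates /tmp/rsync-backup/home, while /home/ would copy only its contents directly into /tmp/rsync-backup.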

View 5 Replies View Related

Debian :: Backup Purge Script - `/mnt/backup/subfolder': No Such File Or Directory

Nov 10, 2010

This script simply deletes files older than a certain age (in this case 7 days) from a certain location; I use it to purge old backups nightly, and it works as expected:

# delete backups older than 7 days
find /mnt/backup/* -mtime +7 -exec rm -Rf {} \;

The problem is, every morning I get an email with an error message something like this:

find: `/mnt/backup/subfolder': No such file or directory
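
The complaint comes from find itself: after rm -Rf removes a subfolder, find still tries to descend into it and reports that it no longer exists. Restricting find to the top level of /mnt/backup (instead of letting the shell glob and find recurse) avoids that. A hedged alternative:

Code:

# act only on entries directly inside /mnt/backup; no descent into
# directories that have already been deleted
find /mnt/backup/ -mindepth 1 -maxdepth 1 -mtime +7 -exec rm -Rf {} \;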

View 2 Replies View Related

Debian :: Used Backup-manager - Restore The Backup Data?

Feb 4, 2011

I am now preparing to upgrade lenny to squeeze and decided to do a backup of my system. I used backup-manager to do the job and it worked fine. How do you restore said backup data?
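
backup-manager writes ordinary tarballs (by default under /var/archives, controlled by BM_REPOSITORY_ROOT), so restoring is just extracting the relevant archive. A hedged sketch with a hypothetical archive name:

Code:

# inspect the archive contents first
tar -tzf /var/archives/myhost-etc.20110204.master.tar.gz | less

# then extract it where it belongs (here: back onto the root filesystem)
tar -xzf /var/archives/myhost-etc.20110204.master.tar.gz -C /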

View 4 Replies View Related

General :: Setup Rsync To Backup A Remote Directory To Local Drive?

May 24, 2010

I'm trying to set up rsync to back up a remote directory to my local drive.

I cd to the directory that I want to pull the files to, then I enter:

rsync -vrtW account@remote.com:~/public_html

I enter the password, then it starts running. I get all the files listed, but none of them actually transfer. What am I missing?
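
With only a source argument, rsync just lists the remote files instead of copying them; adding a destination (the current directory, for instance) makes it transfer. A minimal corrected form:

Code:

rsync -vrtW account@remote.com:~/public_html .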

View 1 Replies View Related

Ubuntu Installation :: Cpuid Utility Is Not Compiled With U9.04 And Utility Is Not Available As Package With Synaptic?

Feb 5, 2010

The cpuid utility is not compiled for Ubuntu 9.04, and it is not available as a package in Synaptic.
Other distributions have it available as an RPM. [URL]

Is there any way to run this utility in the Debian world?
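
One Debian-world route for an RPM-only tool is alien, which converts the package format; whether the resulting .deb works depends on the package, so this is only a hedged sketch with a hypothetical file name:

Code:

sudo apt-get install alien
# convert the downloaded RPM and install the generated .deb in one step
sudo alien --to-deb --install cpuid-*.rpm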

View 2 Replies View Related

Ubuntu Networking :: Combine Rsync And SSH Pass - Backup Files From Each Host With General Password In Local Network?

May 13, 2011

I made a script to back up files from each host with a common password on the local network. The script uses sshpass and rsync with this

syntax:
rsync --rsh="sshpass -p password ssh -l root" hostpath destinationpath
Everything was okay under 9.10 until I migrated to Ubuntu 11.04; now it always gives an error:
rsync error: received SIGINT, SIGTERM, or SIGHUP (code 20) at rsync.c(541) [Receiver=3.0.7]

I am using bash version GNU bash, version 4.2.8(1)-release (i686-pc-linux-gnu) and a 2.6.38-8-generic kernel.
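
One frequently reported cause of sshpass dying this way after a fresh install or upgrade is ssh stopping at an interactive host-key prompt that sshpass cannot answer (the remote hosts' keys are unknown to the new system). A hedged variant that accepts unknown host keys automatically, using the same placeholders as above:

Code:

rsync --rsh="sshpass -p password ssh -o StrictHostKeyChecking=no -l root" hostpath destinationpath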

View 3 Replies View Related

Debian :: 'Install To USB' Utility?

Jan 24, 2011

According to this manual [URL], there are two installation methods:

* Burned: boot from a LiveCD and then use the 'Install to USB' utility. (NOTE: Which LiveCD in particular ... and is it going to install a Debian distro? We're in the Debian wiki, so it's confusing I think -- OlivierBerger)
* Unburned: download the ISO file and copy its contents onto the USB pendrive.

View 2 Replies View Related

Debian :: Add Shut Down Utility Of Ubuntu?

Jun 2, 2010

In recent versions of Ubuntu, it is possible to click on the top right corner and choose to shut down, log out, etc., as opposed to having buttons that need to be clicked followed by other buttons. Ubuntu's current utility works by clicking on the top right corner, which shows a list without opening a new mini-window or anything, and lets me select how I want to change the state of my computer (in case it wasn't obvious what I am referring to).

View 4 Replies View Related






