I wonder if anyone knows of a cross platform sync/backup utility that can do the following:
Incremental backups
Revisioned backups (the possibility to restore not just the latest state but to go back to a specific point in time)
Two-way sync, e.g. two computers sync to the same server and the server also pushes changes to the clients (well, I suppose it might be possible to solve that by restoring the latest backup too)
Cross platform: Linux, Windows & Mac (the server side may be Linux only)
Basically I want a revisioned version of Unison, or a local version of Dropbox with support for custom folders, depending on how you see it.
I'm not afraid to script things together if no pre-built solution exists but there are tools that, when combined, might do this. However, it must be possible to automate, since I want to schedule it to run unattended. (If there is a way to push changes to the computers, like Dropbox does, that's a plus but not a requirement.) This also means a command-line utility might be preferable to a GUI-only one.
Linux/Ubuntu noob here, so please be gentle. I own an Ubuntu server (7.10, "gutsy") which was previously used for my small business. All setup and maintenance of this server was done by an admin who has since moved on and whom I can't get in touch with. As part of the setup, this admin somehow arranged things so that whenever I plug in an external HDD (USB), a backup script runs automatically and copies a whole bunch of stuff to the drive. I want to cancel/delete this script, as it is no longer necessary. Can anyone give me any pointers as to how I could track down where this script is and how to remove it?
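Plug-in triggers like this are usually wired up through a udev rule (or, on a release as old as gutsy, possibly a HAL callout), so grepping the usual trigger locations for the word "backup" is a reasonable first hunt. A hedged sketch; the paths below are standard locations, not known specifics of this particular setup:

```shell
# Custom udev rules often live here; look for a RUN+="/path/to/script" line
grep -ril backup /etc/udev/rules.d/ 2>/dev/null || true

# Gutsy-era systems may hook removable media through HAL instead
grep -ril backup /etc/hal/ /usr/share/hal/ 2>/dev/null || true

# Rule out a polling cron job while you are at it
grep -ril backup /etc/cron* /var/spool/cron 2>/dev/null || true
```

Once the rule file turns up, commenting out its RUN line (or moving the file out of /etc/udev/rules.d/) disables the trigger without deleting the script itself.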
Need a backup/sync software recommendation, hopefully something with a GUI (if not, some dd if= command might do). I need something that will:
+ Copy from a source dir to a destination dir (1st -> 2nd dir)
+ Delete files and folders not found in the source
+ Skip copying files that are already in the destination (if a file is the same size, skip it; if the size differs, overwrite it)
+ Remote folder sync, such as FTP, would be nice
In other words, something that will synchronize the destination dir with the source. I tried a program called luckyBackup. Was not impressed.
As the title says, I want to create a weekly backup of some folders to another partition (or an external USB drive), compressed or not (the folders are also 20/30 GB), with root-only permissions on the files (or folder) created. The system, where I have installed Debian Jessie, is always on and used like a NAS.
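A minimal sketch of one way to do it: a script dropped into /etc/cron.weekly runs once a week as root on Debian, no crontab editing needed. The source and destination paths are assumptions:

```shell
#!/bin/sh
# /etc/cron.weekly/folder-backup  (make it root-owned and executable)
SRC=/srv/share          # folder to back up -- an assumed path
DEST=/mnt/usbdisk       # backup partition or external USB mount
umask 077               # archive comes out readable by root only
tar -czpf "$DEST/share-$(date +%Y%m%d).tar.gz" \
    -C "$(dirname "$SRC")" "$(basename "$SRC")"
```

Drop the -z if you don't want compression; -p preserves the permissions inside the archive.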
I am trying to find a backup program to incrementally back up some files to an external disk every week, for example. I would prefer not to have to write a script, as I am not really used to it.
I am not familiar with Linux. I recently installed a TACACS+ server under CentOS 5.5 and it is working well, but my boss asked me to make a weekly backup of the tacacs.log file so we can check who was connected at any time. The problem is that I don't know how to make this backup or how to make it automatic. He also asked me to change the default TACACS+ port from 49 to another port. Does someone know how to do this?
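For the weekly copy, a root cron entry is enough. A sketch, assuming the log sits at /var/log/tac_plus/tacacs.log (check where yours actually is):

```shell
# Add with `crontab -e` as root: every Monday at 00:30, copy the log
# with a date stamp. Note: % must be escaped as \% inside crontab lines.
30 0 * * 1  cp /var/log/tac_plus/tacacs.log /root/tacacs-backups/tacacs-$(date +\%Y\%m\%d).log
```

For the port, tac_plus is typically started with a -p <port> option; look at the daemon's startup line in /etc/init.d (or its config file), change 49 there, and point your network devices at the new port.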
I have been looking around online and I am seeing that there are several solutions for doing a nightly automated backup on Linux. I was wondering what people here actually use for doing such and why they use one particular backup method over another.
What I am looking to do is this: every night (at, say, 3am) I want my system to back up my 200 GB Documents folder to my external hard drive. Does Ubuntu have a tool built in by default to do this, or do I need to add something from the repos/online?
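Nothing graphical is preinstalled for this, but cron plus rsync (both on a default install) cover it. A sketch, with the external drive's mount point as an assumption:

```shell
# Add with `crontab -e`: every night at 03:00 mirror Documents to the
# external drive. The first run copies all 200 GB; later runs only
# transfer files that changed since the previous night.
0 3 * * *  rsync -a --delete /home/you/Documents/ /media/external/Documents-backup/
```

Leave off --delete if you want files you removed locally to stay in the backup.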
I created a profile for Yast Backup and successfully executed it manually. However, when I set up the Automatic Backup with the same profile it doesn't work. It will start at the scheduled time and do the "search" part, but it hangs before it creates the archive. The two processes (backup_cron & y2base) remain running and never terminate.
Here are the last few lines in the Yast log before it hangs:
Currently, I would like to set up an automatic backup system where the data on my computer is copied to an external hard disk connected to the router (Siemens SE551). This device theoretically allows setting up a file server, but unfortunately I can't access the share from Linux. I have not yet tried to access this share from Windows.
The rough plan is to use rsync and ssh, issued by a cronjob on both my Ubuntu and my WinXP notebooks. But before considering this, I have to get my system connected to this drive.
BTW: I am using Ubuntu 10.4.
According to the SE551 manual [URL], NAS is not provided.
I followed this manual and set security OFF. According to the router's web interface, the device was recognized correctly.
I installed samba:
Code:
gedit /etc/samba/smb.conf
# Sample configuration file for the Samba suite for Debian GNU/Linux.
#
# This is the main Samba configuration file. You should read the
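Note that samba (the server) isn't needed just to reach the router's share; the client tools are enough. A hedged sketch, assuming the router answers at 192.168.1.1 and exports some SMB share (smbclient will list the real name):

```shell
# List the shares the router offers (no password)
smbclient -L //192.168.1.1 -N

# Mount one of them -- "usbdisk" is a guessed share name
sudo mkdir -p /mnt/routerdisk
sudo mount -t cifs //192.168.1.1/usbdisk /mnt/routerdisk -o guest
```

Once mounted, rsync can treat /mnt/routerdisk like any local directory. rsync over ssh to the router itself will not work unless the router actually runs an ssh/rsync server.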
I am trying to sync file-server data to a backup server machine with the command: rsync -avu path/of/data ipaddress-of-backup-server:/path/where/to/save. After running, it asks for the root password, and done manually it is successful, but I want to make it automatic. For that I also tried a cronjob and generated an authentication key, but I have not succeeded in logging in automatically. Does anybody know how to authenticate root to log in for storing data on the backup server?
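The usual recipe is key-based ssh authentication with an unencrypted key, so cron can use it without a prompt. A sketch, assuming the cron job runs as root on the file server:

```shell
# Generate a key with an empty passphrase (accept the default file path)
ssh-keygen -t rsa -N ""

# Install the public key on the backup server (asks for the password once)
ssh-copy-id root@ipaddress-of-backup-server

# From now on this needs no password, so it can go straight into a crontab
rsync -avu path/of/data ipaddress-of-backup-server:/path/where/to/save
```

If it still prompts, check that the backup server's sshd allows root key logins (PermitRootLogin in sshd_config).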
I've got Ubuntu One syncing a single 25MB folder on 4 computers. On one of these computers, the ubuntuone-syncdaemon process constantly pegs the CPU, using from 50-80% long after any sync-able files have been modified and successfully synced. The process is only using 8.9MB of RAM.
Specs: Ubuntu 10.04 (lucid) Kernel 2.6.32-24-generic 1000.8 MB RAM Pentium 4 2.53GHz Free disk space: 280.9 GB System monitor shows 56.8% total RAM usage, 15.4% swap file usage.
Audio sync method. "Stretches/squeezes" the audio stream to match the timestamps; the parameter is the maximum samples per second by which the audio is changed. -async 1 is a special case where only the start of the audio stream is corrected, without any later correction. Searching the net makes one believe this option is just some sort of magic: people put it on the line and it just works. Isn't that nice?
It says nothing about how to change the TIME the audio starts syncing. What if I want it to start 5 seconds delayed? Or 5 seconds sooner? What if the audio gets more out of sync as the video goes on; can I slip it a little at a time? What, no magic? No one mentions a file that already has badly synced audio. So what -async 1 really does is simply start the audio at the beginning of the file. AS IF THAT ISN'T STANDARD PROCEDURE? So what is the exact solution for syncing a messed-up video, and why can't it just do the proper "timestamp" sync in the first place? No docs, no info, and you are left out in the cold.
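-async only touches resampling, so it cannot express "start the audio N seconds later". For a fixed offset the relevant option is -itsoffset, applied per input. A hedged sketch with made-up file names: the file is opened twice, the second copy's timestamps are shifted, and -map takes video from copy 0 and audio from copy 1:

```shell
# Delay the audio by 5 seconds relative to the video (stream copy, no re-encode)
ffmpeg -i input.avi -itsoffset 5 -i input.avi \
       -map 0:v -map 1:a -c copy output.avi
```

For audio that should start sooner, use a negative offset (-itsoffset -5) on the same input. Gradual drift is a different problem: that is where -async-style stretching applies, not a constant offset.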
I use jpilot on openSUSE 11.3 64bit to sync PIM data with my Palm Treo 680 via bluetooth. This worked fine until today. Now I get the following error message when I try to sync:
Syncing on device bt:
Press the HotSync button now
dlp_ReadSysInfo error
Exiting with status SYNC_ERROR_PI_CONNECT
Finished.
The last successful sync was on the 20th of October and today is the 24th. I did not change any settings in jpilot or on my Palm device, so I guess an openSUSE update must have caused this error. But I do not know how to look up the updates installed during this period or how to undo them. Was there an update between the 20th and the 24th of October which might affect either jpilot or bluetooth functionality?
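openSUSE keeps a dated record of every package transaction, so the window can be checked directly. A sketch (the grep pattern assumes the year was 2010):

```shell
# Most recently installed/updated packages, newest first
rpm -qa --last | head -n 40

# zypper's transaction history, filtered to the suspect dates
grep '2010-10-2' /var/log/zypp/history
```

If a suspect shows up (pilot-link or the bluetooth stack would be the candidates here), reinstalling the older version from the repos is one way to test the theory.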
I've discovered Firefox Sync a while ago, and it's absolutely awesome. Now of course I'd like most of my software to work this way! So is there a way to get the same behavior with Thunderbird?
It seems that SELinux has stopped Weave from syncing the bookmarks. I followed the fix code SELinux suggested, but it doesn't work. Does anyone know how to solve it?
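If the suggested fix didn't take, regenerating a local policy module from the actual denial records sometimes does. A sketch ("weavefix" is just a module name made up for this):

```shell
# Show the recent denials, then build and load a policy module from them
ausearch -m avc -ts recent
ausearch -m avc -ts recent | audit2allow -M weavefix
semodule -i weavefix.pp
```

Re-run the sync afterwards and check whether new AVC denials still appear.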
I am using Back In Time to back up my home directory to a second HDD that is mounted at /media/backup. The trouble is, I can do this using Back In Time (Root), but not using Back In Time without the root option. This is definitely a permissions issue: it can't write to the folder. But when I checked by right-clicking on the backup directory and looking at the permissions tab, it said I was the owner.
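One thing worth checking: what matters is the mount point itself, not just the snapshot subfolder that was right-clicked; freshly formatted drives are usually root-owned at the top level. A sketch (substitute your own username):

```shell
# If this shows root as the owner, non-root Back In Time cannot write here
ls -ld /media/backup

# Hand the mount point to your user ("you" is a placeholder)
sudo chown you:you /media/backup
```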
I have a scheduled backup that runs on our server at work, and since 7/12/09 it has been making 592 KB files instead of 10 MB files. In mysql-admin (the GUI tool) I have a stored connection for the user 'backup'; the user has select and lock rights on the databases being backed up. I have a backup profile called 'backup_regular', and in the third tab it is scheduled to back up at 2 in the morning every weekday. If I look at one of the small backup files generated, I see the following:
Code:
-- MySQL Administrator dump 1.4
--
-- ------------------------------------------------------
-- Server version
It seems that MySQL can open and write to the file fine; it just never gets as far as dumping the actual table data.
I've tried to google, but not much luck. What I would like to do is have a number of folders on my desktop, and their contents, replicated/duplicated into another folder on the same PC in real time. So for example, if I were to change an OpenOffice document in a specific folder on my Desktop, it would be replicated/duplicated in real time. If I had three folders on my Desktop, A, B and C, they would also appear/be backed up (in real time) in a folder called /home/backup. Can this be done?
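It can, by pairing rsync with the kernel's inotify change notifications. A sketch, assuming the inotify-tools package is installed and using the folder names from the example:

```shell
# Block until something changes in A, B or C, then mirror all three;
# repeat forever. Run it in the background or from a startup script.
while inotifywait -r -e modify,create,delete,move \
      ~/Desktop/A ~/Desktop/B ~/Desktop/C; do
    rsync -a ~/Desktop/A ~/Desktop/B ~/Desktop/C /home/backup/
done
```

Tools like lsyncd wrap this same pattern up in a daemon, if a pre-built option is preferred.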
Does anyone know of good backup software for Ubuntu 10.4 that will let me select which folders to back up, rather than doing a complete backup? My install and settings etc. can be replaced, but my photos and memories cannot!
After I spent some time discovering The BIG BANG of Universe and The Meaning of Life :
I managed somehow to create a script that backs up files on the server, TARs them up, FTPs the archive to another FTP server, and then emails the result.
It also measures the time needed to complete, deletes archives older than XX days (set in find -mtime +20), and makes an incremental backup every weekday and a FULL one on Sundays (which suits me because there is no heavy load then).
Files for TAR to include and exclude are listed in txt files, one name per line:
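For anyone wanting the same weekday/Sunday switch, a minimal sketch of that logic using GNU tar's snapshot-based incrementals; the paths and list-file names are assumptions:

```shell
#!/bin/sh
DAY=$(date +%u)                    # 1 = Monday ... 7 = Sunday
SNAP=/var/backups/files.snar       # tar's incremental state file
[ "$DAY" -eq 7 ] && rm -f "$SNAP"  # no snapshot => the Sunday run is FULL
tar --create --gzip --listed-incremental="$SNAP" \
    --files-from=include.txt --exclude-from=exclude.txt \
    --file="/var/backups/files-$(date +%Y%m%d).tar.gz"
# Purge archives older than 20 days, as in the original script
find /var/backups -name 'files-*.tar.gz' -mtime +20 -delete
```

The weekday runs then only contain files changed since the last archive recorded in the snapshot file.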
This script simply deletes files older than a certain age (in this case 7 days) from a certain location; I use it to purge old backups nightly, and it works as expected:
# delete backups older than 7 days
find /mnt/backup/* -mtime +7 -exec rm -Rf {} \;
The problem is, every morning I get an email with an error message something like this:
find: `/mnt/backup/subfolder': No such file or directory
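The likely cause: the shell expands /mnt/backup/* before find starts, and find then tries to descend into directories that rm -Rf has already deleted (the glob also breaks when the directory is empty). Letting find do the traversal itself avoids both. A demo on a throwaway directory; point it at /mnt/backup for real use:

```shell
BACKUP_DIR=/tmp/purge-demo
mkdir -p "$BACKUP_DIR/old_backup" "$BACKUP_DIR/new_backup"
touch -t 200001010000 "$BACKUP_DIR/old_backup"   # fake an old mtime
# -mindepth 1 spares the top directory itself; -maxdepth 1 stops find
# from descending into directories it is about to delete
find "$BACKUP_DIR" -mindepth 1 -maxdepth 1 -mtime +7 -exec rm -rf {} +
ls "$BACKUP_DIR"                                 # only new_backup remains
```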
I am trying to create a backup script that will back up a single folder for a class I am in, and I was wondering if I could get some help. If possible, I would also like to know how to write a script that can encrypt that same file. I will be putting the backup in my /home/usr/Backup directory. I am not trying to back up my whole system, just a single folder. I am using Fedora 11.
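A minimal sketch of both steps: tar the folder, then encrypt the archive with gpg's symmetric mode (it prompts for a passphrase, so no key setup is needed). The folder name is a placeholder:

```shell
#!/bin/sh
SRC="$HOME/classwork"     # folder to back up -- a placeholder name
OUT="/home/usr/Backup/$(basename "$SRC")-$(date +%Y%m%d).tar.gz"
mkdir -p /home/usr/Backup
tar -czf "$OUT" -C "$(dirname "$SRC")" "$(basename "$SRC")"
# Encrypt, then drop the plaintext archive; decrypt later with: gpg "$OUT.gpg"
gpg --symmetric "$OUT" && rm "$OUT"
```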
I'm just setting up a partition on a separate HDD in my system. I plan to use the partition to back up the important files on my main HDD (to guard against a disk crash).
The question I have is about where would be the typical location to auto mount this partition? Which would it be normal to go for:
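Either /mnt/backup or /media/backup is a common choice; by convention /mnt is for permanent fstab-managed mounts and /media for removable ones. An fstab sketch, where the device name and filesystem are assumptions:

```
# /etc/fstab -- mounts the backup partition at boot
/dev/sdb1   /mnt/backup   ext4   defaults   0   2
```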
Does anyone know of any decent enterprise level backup solutions for Linux? I need to backup a few servers and a bunch of desktops onto one backup server. Using rsync/tar.gz won't cut it. I need like bi-monthly full HDD backups, and things such as that, with a nice GUI interface to add/remove systems from the backup list. I need basically something similar to CommVault or Veritas. Veritas I've used before but it has its issues, such as leaving 30GB cache files. CommVault, I have no idea how much it is, and if it supports backing up to a hard drive rather than tape.