I have a shell script that needs to create some files:
1) backup copies of the file the user passes in (which this script will write to).
2) temp files that the script will create and later remove.
This shell script will be used from my local dir (I am not a superuser or a sysadmin). Users of this script will call it to run on their local files in their respective directories. When my script runs, it errors with the following:
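Setting the actual error aside, the pattern I'm trying to follow for those backup and temp files is roughly this (the names are placeholders, not my actual script):
Code:
#!/bin/sh
# back up the user-supplied file before this script writes to it
infile="$1"
cp -p -- "$infile" "$infile.bak"
# create a temp file in the user's own temp area and remove it on exit
tmpfile=$(mktemp "${TMPDIR:-/tmp}/myscript.XXXXXX") || exit 1
trap 'rm -f "$tmpfile"' EXIT
# ... work with "$tmpfile", then write the result back to "$infile" ...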
I have a Samba share to a Windows 7 computer. I do not know if I will be able to use Back In Time or not, so I want to know how to have rsync do my backup. I read the man page but I'm not sure I understand it. The backup is on the same computer, on a different hard drive, and should run every hour from a script. /media/leanne is the Windows 7 share and /media/backup is the other hard drive in the computer: rsync -arvRzEP /media/leanne /media/backup.
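The cron entry I'm picturing is just that same command run on the hour, assuming the share stays mounted at /media/leanne (the log path is only an example):
Code:
# run the backup every hour, on the hour
0 * * * * rsync -arvRzEP /media/leanne /media/backup >> "$HOME/leanne-backup.log" 2>&1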
I want to synchronize sets of files (e.g. from or to flash memory). rsync is powerful, but --delete option is dangerous. Anyone know whether there's a way to do --delete interactively, i.e. get rsync or some near equivalent to ask (y/n, in a console window) before deleting?
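The closest I've come up with myself is previewing the deletions with a dry run and only running the real sync if the list looks sane (paths are placeholders):
Code:
# show what --delete would remove, without touching anything
rsync -av --delete --dry-run /source/ /destination/ | grep '^deleting'
# if that list is acceptable, run it for real
rsync -av --delete /source/ /destination/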
I would like to use rsync to keep the hard drive in my media server synced with my video collection on my Linux server. The media server, I believe, is running some version of Linux with Samba. I mount the media server's share to a folder on my Linux server and use the following command: rsync -a -vv --delete /home/shared/Videos/* /mnt/WDLive/Shearer Files/Movies/
However, it does not delete files on the media server that I delete on the source. I also created a new folder on the source and moved some of the files into it. When I ran rsync again, it created the new folder on the media server, but it recopied all the files from the source again, instead of moving the files which were already on the media server into the new folder, so now I have 2 copies.
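For what it's worth, the variant I've been meaning to try next drops the shell glob and quotes the space in the destination, on the theory that those are what keep --delete from seeing the removed files:
Code:
rsync -a -vv --delete /home/shared/Videos/ "/mnt/WDLive/Shearer Files/Movies/"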
I switched last summer from Windows (used it since Windows 95) to Debian. I've been using Debian Jessie for a couple of months now and I'm getting used to it a little.
There are problems here and there, but I could solve them with some reading on the web. Not really a big problem... till now.
I run Debian 8.2 on my PC (PC1). I bought an older PC (PC2) that I want to use as a backup server.
I'm using PC2 only for making backups, after the backup I switch it off again.
So I installed Debian 8.2 (net-install, without a DE and with SSH) on PC2 and tried to configure it to work as my backup location. I made a public SSH key and copied it to the root account (no problem) and to the user account (sensdeb), but there I got an "Access denied" error.
I gave the user (sensdeb) sudo rights via the visudo file:
# User privilege specification
root    ALL=(ALL:ALL) ALL
sensdeb ALL=(ALL:ALL) ALL
I installed rsync.
The problem is that rsync only works when I use the root account.
I don't know how to give the user sensdeb the rights so that I can use that account for my backup tasks. Right now it's possible to sync with the root account, but I've read many times that that is not the way to do it.
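In case it matters, this is the sort of setup I was expecting to end up with, assuming the backup target on PC2 is something like /srv/backup (that path is just an example, not what I actually have):
Code:
# on PC2: give sensdeb a target directory it owns
sudo mkdir -p /srv/backup
sudo chown sensdeb:sensdeb /srv/backup
# on PC1: push the backup as the normal user, no root involved
# ("PC2" stands for the backup machine's address)
rsync -av -e ssh /home/ sensdeb@PC2:/srv/backup/home/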
I want to save a backup of my data on a remote server, but never want the backup server to see the data unencrypted. Editing a single file and backing up should not result in everything being encrypted and sent again. The remote server should preferably not even know the directory structure (and especially not the directory names).
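One direction I've been looking at (I'm not sure it's the right tool) is mounting an encrypted, read-only view of the data with gocryptfs in reverse mode and rsyncing that view, so the server only ever sees per-file ciphertext with encrypted names; roughly (paths are placeholders):
Code:
# one-time setup: prepare reverse-mode encryption for ~/data
gocryptfs -init -reverse ~/data
# mount an encrypted view of ~/data at ~/data.enc
mkdir -p ~/data.enc
gocryptfs -reverse ~/data ~/data.enc
# push only the ciphertext; editing one file only re-sends that file's ciphertext
rsync -av ~/data.enc/ user@backupserver:backup/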
I'm trying to learn how rsync works to back up my system. I tried: Code: rsync -azvv /home /media/Elements
I get a folder called home on my external hard drive, but when I use ls -l to see the permissions they are all wrong. On my /home folder the permissions for /nathan are drwxr-xr-x 48 nathan nathan; the permissions on the backup /nathan folder are drwx------ 1 nathan nathan.
I also tried using the long version of -a, which is -rlptgoD, and that didn't work either. What do the 48 and the 1 mean in the ls -l output? When I look inside the /nathan folder the permissions are wrong too: a lot of the files are backed up as executable, and the ownership is messed up as well. I also ran it with sudo, and that didn't help either.
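Before fighting the flags any further, I want to check which filesystem the Elements drive actually uses, since FAT/NTFS simply can't store Unix owners and permission bits:
Code:
# what filesystem is the backup drive mounted as?
mount | grep Elements
If it turns out to be vfat or ntfs, the mangled permissions would be expected rather than something rsync is doing wrong.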
This should be a quick one. I'm trying to back up a single directory and its subdirectories on my Lucid Server to a FreeNAS box across my network. This is what I'm using to do that: rsync -r -a -v -z * --delete freenas:DSIBackups. It almost works perfectly except for one problem: when a file is deleted at the source, this command doesn't seem to delete it on the receiving end. I assumed that --delete would do that, but apparently not.
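In case the * glob is the culprit (files deleted at the source never make it into the argument list, so --delete has nothing to compare against), the variant I plan to test syncs the directory itself instead:
Code:
rsync -avz --delete ./ freenas:DSIBackups/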
When rsync has finished the update (or even in the meantime), I need to move the updated files to a different location, something like a date +%Y%m%d directory. The reason is that, because of development, I need the modified files (all of them, not just the latest one), so I have to store them daily, but I don't want to store the whole dir, just the few files which were updated. Does that make sense?
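From the man page, the closest I've found is a two-pass approach: first copy just the files that differ from the current destination into a dated directory with --compare-dest, then do the normal sync. A rough sketch (all paths are placeholders):
Code:
today=$(date +%Y%m%d)
# 1. store only the new/changed files in a per-day directory
rsync -av --compare-dest=/destination/ /source/ "/archive/$today/"
# 2. then bring the main copy up to date as usual
rsync -av /source/ /destination/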
I've been trying to make a three stage backup with stage 0 being a full monthly back up, stage 1 being a weekly backup, and stage 2 being a daily backup. I've been trying very hard to use rsync for this but sorting files by date is proving to be problematic. Sometimes it seems to work from the command line directly, but the same command causes errors and warnings from a script while entirely failing to sort the correct files.
The common example I see for this involves commands like this:
Code: rsync -Rav `find /home/ -ctime -7 -print` /path/to/home_backup
The problem seems to be that, since the user directories in /home contain files that have been altered within the specified time frame, the whole directory is matched first, which means the whole directory is recursively archived as opposed to just the changed files.
I've also seen examples using the --files-from option with the same find parameters, and this one seems to ALMOST work, but it gives me strange warnings and fails to run at all when launched from a script.
Many of the things I've googled about using rsync to back up files by date modified involve a rather snarky "You're missing the point of rsync!", to which I respond by yelling at my computer monitor, followed by "JUST TELL ME WHAT I NEED TO KNOW!" I understand that rsync is meant to take care of incremental backups on its own; that's exactly why I want to use it for a traditional 3-stage backup scheme.
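For what it's worth, the next variant I'm going to test restricts find to regular files and feeds the list to rsync null-separated, since I suspect the matched directories and filenames with spaces are what break the scripted version:
Code:
cd /home || exit 1
# only regular files changed in the last 7 days, null-separated to survive spaces
find . -type f -ctime -7 -print0 > /tmp/weekly-files.lst
rsync -av --from0 --files-from=/tmp/weekly-files.lst /home/ /path/to/home_backup/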
To copy from production to standby over the internet I use a cron job doing rsync -avze 'ssh -p 8022' --exclude-from= ....
My question is: should the cron job run on the production or the standby system? Root access to the remote system is given by a passphrase-less SSH key. Currently I run rsync on the production system. I guess that is more secure because the standby needs no SSH login to production. Running rsync on the standby would use fewer resources on production, but I am concerned that in that case there would be passphrase-less access from standby to production.
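For completeness, the crontab line on production currently looks roughly like this (the schedule, paths, and exclude file here are stand-ins, not my real ones):
Code:
# nightly push from production to standby over the non-standard ssh port
# (the exclude file path is only an example)
30 2 * * * rsync -avze 'ssh -p 8022' --exclude-from=/etc/backup.exclude /data/ standby:/data/ >> /var/log/standby-sync.log 2>&1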
I'm doing an rsync backup to an external drive in order to take a shot at setting up partition encryption. My rsync command is, as root: Code: rsync -av / /external1/backup. Once I've finished my cryptsetup and done a fresh Linux install, what command should I use to properly restore my backup (without messing up the encryption setup)?
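What I have pencilled in for the restore, on the assumption that it's safer to bring back /home and selected pieces of /etc than to overwrite the whole freshly installed root (and its new fstab/crypttab), is roughly:
Code:
# restore user data onto the new system
rsync -av /external1/backup/home/ /home/
# restore configs, but keep the new install's fstab and crypttab
rsync -av --exclude='fstab' --exclude='crypttab' /external1/backup/etc/ /etc/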
I built a script that downloads my podcasts using gPodder into the directory /HOME/SHARED/PODCASTS/ (with a subdirectory for each podcast). The script then selects the latest episode and copies it over to a target directory (it empties the target directory first and copies over everything). I want to use rsync to make sure the 'not so fresh' episodes get deleted and the 'fresh' episodes get copied over. Then Dropbox can sync the 'new' files over to the cloud, where I can access them via my iPad/iPhone (whole other story). The thing is: I've replaced the cp command with the rsync command and now the script is acting strangely.
It selects and syncs over the 'newest' podcasts to the destination directory. Then it suddenly DELETES all the episodes in the destination directory and copies over the three last files.
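The behaviour I'm actually after is a single rsync per target, with --delete doing the cleanup instead of the script emptying the directory first, something like (the paths are placeholders for my real ones):
Code:
# copy the freshly selected episodes and let --delete remove the stale ones
rsync -av --delete /home/shared/podcasts/latest/ /home/shared/dropbox-staging/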
I'd like to back up my whole system to a 2nd disk using rsync (other tools are not possible). Which paths should I exclude from the backup? I was thinking about /proc, /dev, the lost+found directories... What other paths am I forgetting?
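The exclude list I've collected so far is below; I'd be glad to hear what's missing (the destination path is just an example):
Code:
rsync -aAXHv \
  --exclude='/proc/*' --exclude='/sys/*' --exclude='/dev/*' --exclude='/run/*' \
  --exclude='/tmp/*' --exclude='/mnt/*' --exclude='/media/*' --exclude='/lost+found' \
  / /mnt/backupdisk/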
I'm syncing a server over the internet with rsync, but it only works for a few hours before the backup fails with a "No route to host". I can restart the job and it'll pick up where it left off, but is there an automated way to do this, or to protect against a connection failure? I have about 170GB to copy over initially, but I can only get through about 4-5GB before the connection drops; manually restarting the sync every time it drops will make the initial backup take days...
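The workaround I've sketched so far is a plain retry loop around rsync, relying on -P (--partial --progress) so interrupted files resume instead of starting over (paths and host are placeholders):
Code:
#!/bin/sh
# keep retrying until rsync exits cleanly
until rsync -avzP --timeout=300 -e ssh /source/ user@remote:/backup/; do
    echo "rsync failed, retrying in 60 seconds..." >&2
    sleep 60
done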
I'm hoping somebody can find something here that I haven't. I'm trying to use rsync to back up home directories to a NAS. First, I NFS-mounted the NAS, ran an rsync, and everything worked out fine: the transfer completed after a few hours and everything was transferred (lots of stuff!). I then decided that I don't want to leave the NAS mounted all the time, and I didn't want to automate mounting and unmounting it, as I didn't think I could produce a script that would work reliably enough. So I decided to start an rsync daemon on the NAS and update via that. I run the following command (results are included; the ^C is me killing it after it hangs).
Well, I know there are issues when using rsync to copy files to an NTFS partition, like file permissions and so on. The thing is, I need to back up my music files periodically onto an NTFS partition from ext4. I really don't care about file permissions or any other such thing. When I use rsync, it should update the mp3 files on my NTFS (external) disc with the new ones. Can I go ahead with this operation? I have a lot more important files on the external disc and I don't want this rsync to corrupt or delete those files, because they are highly important.
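What I had in mind, to avoid touching anything else on the external disc, is pointing rsync at a dedicated music directory and skipping the ownership/permission options entirely (paths are placeholders):
Code:
# -rtv: recurse and keep timestamps, but don't try to copy owners or permissions
# no --delete anywhere, so nothing outside the Music target is ever removed
rsync -rtv /home/me/Music/ /media/external/Music/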
I have a Linux host acting as an iSCSI server for a Windows box. I want to keep an off-site backup, so I figure rsync will keep the iSCSI server synced with an off-site Linux host. I understand that rsync does block-level incremental transfer to conserve bandwidth; ok, awesome. The trick is that I also want an archival copy kept. Say I want to go back to a revision of a file from 10 days ago; I need to be able to do that.
I was planning on using Backup Exec, since we currently have a licensed copy: throw the archives from Backup Exec onto the iSCSI server as well, and have it keep a rotating 30-day backup, or something like that. The issue I see here is that this will be creating and deleting files as it does its daily backup rotation. I'm guessing rsync will see these as new files and likely retransmit everything on a daily basis. The question then becomes: is this assumption correct, or will it still know to do a block-level incremental transfer even when file names and such are changing?
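If it helps frame the question, the kind of invocation I had in mind for the off-site sync is below; my understanding (which may be wrong) is that rsync only does a delta transfer against a file of the same name, so renamed or rotated archive files would normally go over in full unless something like --fuzzy finds a basis file (paths and host are placeholders):
Code:
# --inplace + --no-whole-file: update large files in place using delta transfer
# --fuzzy: look for a similarly named file on the receiver to use as a basis
rsync -av --inplace --no-whole-file --fuzzy /iscsi/store/ offsite:/backup/store/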
Our backup script was working fine (ssh to the server, back up /home to a second hard drive on my computer). Then right after an Ubuntu update, it quit working. I investigated and found that "something" had changed the label on the backup hdd to what looked like gibberish to me. But the script identifies the backup hdd by its UUID, which didn't change. Yet here is the error I get when the backup fails:
receiving file list ... done  [took about 5 seconds]
rsync: mkdir "/media/14D9-3B1F/server-backup" failed: No such file or directory (2)
rsync error: error in file IO (code 11) at main.c(594) [receiver=3.0.6]
Note that the backup hdd IS mounted, the UUID is correct, and the folder 'server-backup' DOES exist. Does anyone have a clue for me? I'm moderately experienced in Linux and Ubuntu. Our server runs CentOS 5. And as stated, the backup ran fine for several weeks. I think there was a new Linux kernel in that update, but at this point, a while later, I don't know which one. The current kernel is 2.6.31-22-generic.
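Next time it fails I plan to check what is actually mounted where before digging further, since the error smells like the mount-point path moving out from under the script:
Code:
# is the backup drive really mounted at the path the script expects?
mount | grep 14D9-3B1F
ls -ld /media/14D9-3B1F/server-backup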
I support a small business which has an Ubuntu server running as a file server. The server is running Ubuntu 10.4. There is one hard drive which is mounted as /media/hdd. Each night this is backed up to an external USB hard drive mounted as /media/backup. The backup is carried out using the command:
Code: rsync -av /media/hdd/ /media/backup/
Is there a way to encrypt this back-up so that if the USB hard drive is plugged into another machine it cannot be read?
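One approach I'm considering (assuming the USB drive can be wiped and re-created, and that it shows up as /dev/sdb1, which is only an example) is putting LUKS under the existing backup so the drive is unreadable elsewhere without the passphrase, while the nightly rsync command stays exactly the same:
Code:
# one-time setup: encrypt the USB drive (THIS DESTROYS ITS CURRENT CONTENTS)
# /dev/sdb1 is only an example device name
cryptsetup luksFormat /dev/sdb1
cryptsetup luksOpen /dev/sdb1 backup
mkfs.ext4 /dev/mapper/backup
# before each nightly run: unlock and mount, then back up as before
cryptsetup luksOpen /dev/sdb1 backup
mount /dev/mapper/backup /media/backup
rsync -av /media/hdd/ /media/backup/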