I am trying to run an rsync backup script to back up my data from my HD to my NAS drive (currently mounted as a Samba share, but I can change to NFS if that would improve things, if I can work out how!), but when I run the script I get the following error on every file it tries to copy:
Code:
As I mentioned, I am mounting my share from my NAS via Samba, and below is the fstab entry:
Code:
The rsync command with its options is as follows:
Code:
The error seems to be with the -a option, but I have no idea why.
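A likely culprit: -a implies -p, -o and -g (permissions, owner, group), which a CIFS mount generally can't honor, so every file throws the same error. A minimal sketch, assuming the NAS is mounted at /mnt/nas (hypothetical path), that keeps archive behavior minus those flags:
Code:
# -rltD is -a without -p/-o/-g, so rsync stops trying to chown/chmod on the CIFS mount
rsync -rltDv /home/user/data/ /mnt/nas/backup/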
Thought I'd post it here because it's more server-related than desktop... I have a script that does:
[Code]....
This is used to sync my local development snapshot with the live web server. There has to be a more compact way of doing this. Can I combine some of the rsyncs? Can I make rsync set or keep the user and group affiliations? Can I exclude .* yet include .htaccess?
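For what it's worth, a hedged sketch of how this might collapse into one command (paths are placeholders): -o and -g keep owner and group (setting arbitrary owners needs root on the receiving end), and since the first matching rule wins, listing .htaccess before the .* exclude keeps it:
Code:
rsync -azv -og --delete --include='.htaccess' --exclude='.*' ./ user@liveserver:/var/www/site/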
When I run rsync --recursive --times --perms --links --delete --exclude-from='Documents/exclude.txt' ./ /media/myusb/
where Documents/exclude.txt is
- /Downloads/
- /Desktop/books/
the files in those directories are still copied onto my USB.
And...
I used fetchmail to download all my gmail emails. When I run rsync -ar --exclude-from='/home/xtheunknown0/Documents/exclude.txt' ./ /media/myusb/ I get the first image at url.
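One thing to check: --exclude-from expects one pattern per line, and a leading / anchors the pattern to the top of the transfer (here, ./). A sketch of how the exclude file would normally look:
Code:
# one pattern per line; a leading / anchors to the transfer root
- /Downloads/
- /Desktop/books/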
I have a tiny shell script to rsync files between two servers and remove the source files.
This script works fine, when it has been initiated manually or even when the rsync command is executed on the command line.
But the same script doesn't work when I try to automate it through crontab.
I am using the 'abc' user to execute this rsync, instead of root, as direct root logins are restricted on all of our servers.
As I mentioned earlier, manual execution works like a charm!
When this rsync.sh is initiated through crontab, it runs the first command (chown abc.abc ...) perfectly, without any issues. But the second line is not executed at all, and there is no log entry I can find at /mnt/xyz/folder/rsync.log.
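Cron runs with a minimal environment (a bare PATH, no ssh agent, HOME may differ), so scripts that work interactively can die silently. A hedged first step, with assumed paths, is to use absolute paths everywhere and let cron capture its own stderr:
Code:
# crontab entry for user abc: absolute paths, and redirect cron's own errors
0 2 * * * /bin/bash /mnt/xyz/folder/rsync.sh >> /mnt/xyz/folder/cron.log 2>&1
Inside the script, call /usr/bin/rsync by its full path as well.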
I'm using Ubuntu 10.04 LTS server and Postgresql 8.4. I have a .sh script that is run by cron every other hour. That works fine. The .sh script includes an rsync command that copies a postgresql dump .tar file to a remote archive location via ssh. That fails when run by cron; I think because it is (quietly) asking for the remote user's password (and not getting it). I set up the public/private ssh key arrangement. The script succeeds when run manually as the same user that the cron job uses, and does not ask for the password. I am able to ssh to the remote server from the source server (using the same username) and not get the password prompt (both directions), so why doesn't rsync work? I even put a .pgpass file in the root of that user's directory with that user's password, and the user/password are identical on both servers.
I think the problem is rsync is not able to use the ssh key correctly. I tried adding this to my script but it didn't help.
Code:
Here is the rsync command embedded in the .sh script.
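Since cron's environment may not point ssh at the right key, handing rsync the identity file explicitly is a common fix; the key path, filenames and hosts below are assumptions:
Code:
rsync -av -e "ssh -i /home/backupuser/.ssh/id_rsa" /var/backups/pgdump.tar user@archivehost:/archive/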
I have a cheap-arse employer who doesn't provide any data backup for us, so I've set up my own very simple server with Ubuntu 10.04 (no RAID). From my Linux desktop I want to back up to my file server. On my file server I configured Samba to allow me to access my data from the many Windows PCs in the place. I also have a USB drive for when I go home or to another building.
Most of my work happens on my Linux desktop, so I use rsync to 'backup' from there to the file server. Can I use rsync to keep my USB drive synced as well? In the same command?
When I access files on my file server from a Windows PC via Samba and change something, will rsync pick up that change on the server and migrate the changes back to my Linux desktop?
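rsync writes to exactly one destination per invocation and syncs one way per run, so the usual pattern is one command per target in the same script (paths are placeholders). Changes made on the server via Samba won't migrate back unless you also run a sync in the opposite direction, and that direction risks clobbering newer local files:
Code:
#!/bin/bash
# one rsync per destination; rsync has no multi-destination mode
rsync -av --delete /home/me/work/ fileserver:/srv/backup/work/
rsync -av --delete /home/me/work/ /media/usb/work/   # only when the drive is plugged in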
I'm trying to use rsync to back up some files, about half a TB. It's now in a state where it keeps sending the same files every time it runs. For example:
rsync -av /data/source/* user@host:/data/dest
sending incremental file list
source/file1.txt
source/file2.txt
I then verify those files are copied over. Then the next time it runs it does the same thing:
rsync -av /data/source/* user@host:/data/dest
sending incremental file list
source/file1.txt
source/file2.txt
Any idea why it's getting stuck on these files? I've tried wiping the whole dest directory out and starting over, but no luck.
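A dry run with itemized output shows why rsync considers each file changed; a 't' in the flags column, for example, means the timestamps differ, which often points at a destination filesystem that can't store the source's mtimes. A diagnostic sketch reusing the same paths:
Code:
# -n = dry run, -i = itemize the reason each file would be transferred
rsync -avin /data/source/* user@host:/data/dest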
I have a file in a directory mysite/a.php on machine A and machine B. The files will not have any difference initially. On machine A, if a.php is changed and a sync is then done from machine B to A, will a.php keep the change or will it be overwritten? What about directories?
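By default rsync transfers whenever size or mtime differ, regardless of which side is newer, so a B-to-A sync would overwrite A's edited a.php with B's older copy. The --update option skips files that are newer on the receiver; a sketch with assumed paths:
Code:
# --update: leave files alone when the receiver's copy is newer
rsync -av --update mysite/ userA@machineA:mysite/
Directories themselves are merged rather than replaced, so only the files inside them are at risk.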
I want to save a backup of my data on a remote server, but never want the backup server to see the data unencrypted. Editing a single file and backing up should not result in everything being encrypted and sent again. The remote server should preferably not even know the directory structure (and especially not the directory names).
I have two directories: dirA, which contains N GB of data, and dirB, which is supposed to contain only the newest M GB of data from dirA. When files are added to dirA, they should also be added to dirB, while the oldest files in dirB should be deleted. Is that possible with rsync, or with any other software?
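rsync has no built-in size-window notion, but a small wrapper can sync and then prune; a rough sketch assuming flat directories and a hypothetical 20 GB cap. Note the caveat that a plain re-run would copy pruned files back, so in practice you would also filter the source by age (e.g. with find):
Code:
#!/bin/bash
# sync dirA into dirB, then delete the oldest files until under MAX_KB
MAX_KB=$((20 * 1024 * 1024))    # 20 GB cap (hypothetical)
rsync -av /path/to/dirA/ /path/to/dirB/
cd /path/to/dirB || exit 1
while [ "$(du -sk . | cut -f1)" -gt "$MAX_KB" ]; do
    oldest=$(ls -tr | head -n 1)    # oldest by mtime; assumes a flat directory
    rm -f -- "$oldest"
done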
I am using rsync to back up dirs on my Ubuntu server onto a NAS (which is mounted onto the filesystem), but the problem is that it constantly does full backups rather than incrementals, and I am not really sure why. After a bit of experimenting with the script I noticed that if I just backed up a home dir (/home/user) the incremental backups work fine. If however I back up a dir like /home/domain/user, it always does full backups. I have tried various different scripts but still the same end result. The latest script is a variation on a script found on the Samba rsync examples webpage; see below...
#!/bin/bash
# rsyncbu.sh -- backup to NAS using rsync
# This script backs up files listed in BDIR to the BSERVER. The verbose
# output along with the date is listed in the LOG_FILE specified
# verbose output
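One thing worth ruling out: if the NAS mount can't store the source's timestamps faithfully, every file fails the size-plus-mtime quick check and is re-sent in full. A hedged workaround (paths assumed) is a size-only comparison:
Code:
# compare by size alone in case the mount mangles mtimes
rsync -av --size-only /home/domain/user/ /mnt/nas/backup/user/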
I have a 1TB USB external drive, currently formatted as FAT32. What I need to do is copy two folders and all their subfolders, totaling about 500GB, to that external drive. The USB drive will have to transfer back and forth between RHEL, Windows XP, and Mac OS X computers freely. What format should I go with on the USB drive, FAT32 or NTFS? What rsync switches should I use? I know I don't want to use -a because I don't want any permissions restored. I'm guessing I'll have to run rsync a couple of times to fully get all the files, so I need to be able to cancel an rsync, then have it pick back up where it left off, not start over and recopy every file again.
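FAT32 stores mtimes at 2-second resolution and has no Unix permissions, so a hedged switch set (mount point assumed) is to recurse and keep times while skipping permissions, tolerate FAT's timestamp rounding, and keep partial files; a rerun then resumes by skipping whatever already matches on size and time:
Code:
rsync -rtv --modify-window=2 --partial /path/to/folder1/ /media/usbdrive/folder1/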
I often have to transfer huge amounts of data over our LAN from one computer to another. The size of the files varies and can be anywhere from 2 GB to 50 GB or more! The only accepted connection protocol between machines is ssh, and rsync and scp are the only options available for copying over the network (unison is not installed). I usually use rsync with the "-z" option to copy over the network. Is "rsync -z" faster than "scp" for data transfer?
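There is no universal answer: -z helps on slow links with compressible data but costs CPU, and it can be slower than scp for already-compressed files (tarballs, media). The quickest way to settle it for your data and your LAN is to time both; the file and host below are placeholders:
Code:
time scp /data/bigfile user@host:/tmp/
time rsync -z /data/bigfile user@host:/tmp/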
I've been trying to make a three-stage backup, with stage 0 being a full monthly backup, stage 1 being a weekly backup, and stage 2 being a daily backup. I've been trying very hard to use rsync for this, but sorting files by date is proving to be problematic. Sometimes it seems to work from the command line directly, but the same command causes errors and warnings from a script while entirely failing to pick out the correct files.
The common example I see for this involves commands like this:
Code:
rsync -Rav `find /home/ -ctime -7 -print` /path/to/home_backup
The problem seems to be that since the user directories in /home contain files that have been altered within the time frame specified, the whole directory is matched first, which means the whole directory is recursively archived as opposed to just the changed files.
I've also seen examples using the --files-from option with the same find parameters, and this one seems to ALMOST work, but it gives me strange warnings and fails to run at all when launched from a script.
Many of the things I've googled about using rsync to back up files by modification date involve a rather snarky "You're missing the point of rsync!", to which I respond by yelling at my computer monitor, followed by "JUST TELL ME WHAT I NEED TO KNOW!" I understand that rsync is meant to take care of incremental backups on its own; that's exactly why I want to use it for a traditional three-stage backup scheme.
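One hedged way around the directory-matching problem is to let find select only regular files and feed the list to --files-from, which preserves each listed path relative to the given source (here /); the paths are assumptions:
Code:
# -type f matches files only, so whole directories never land in the list
find /home -type f -ctime -7 > /tmp/changed.list
rsync -av --files-from=/tmp/changed.list / /path/to/home_backup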
19:25 into this video there is a question about backups: [URL]... It is basically asking how to save files from different sources (Mac hard drive, Windows hard drive, SD card, CDs), and they want to ensure pictures from a digital camera don't get overwritten (there are multiple 001.jpg, 002.jpg, ... files; maybe md5sum will help here).
I assume rsync would be good in this case, but I am not sure what the syntax would be. I also assume that the rsync command is available for Mac (I am more of an scp guy), but I have been proven wrong on assuming bash compatibility between Linux/OS X before.
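rsync does ship with OS X. A hedged way to sidestep the 001.jpg collisions is to give each source its own destination subdirectory so identical names can never overwrite one another; the volume names here are made up:
Code:
rsync -av /Volumes/CAMERA_SD/DCIM/ ~/backup/camera-sd/
rsync -av /Volumes/MacHD/Pictures/ ~/backup/mac-pictures/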
I'm trying to copy files from my local directory to a remote site using rsync. I want to include in the copy all the .java files and exclude all the .svn directories, but I can't get it to work.
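A sketch of the usual filter chain, with placeholder paths. Rules apply in order and the first match wins: .svn/ is excluded before the directory include can pull it in, '*/' lets rsync descend everywhere else, '*.java' keeps the sources, the final '*' drops everything else, and -m prunes directories left empty:
Code:
rsync -avm --exclude='.svn/' --include='*/' --include='*.java' --exclude='*' ./src/ user@remote:/path/dest/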
I need to combine 6 different filesystems into one filesystem using rsync. I am so confused as to which parameters I need to use. The 6 filesystems are:
I've figured out about half of the problem this time. I've duplicated the directory structure of the Windows machine on the Linux server.
I then proceed with mounting the Windows share on the Linux server. I type the mount command while I'm in Folder1 on the Linux server:
Code:
So far so good. Now I need to rsync, and the command for this is
Code:
But what do I fill in here? I'm guessing path/to/dst is just plain Folder1 on the Linux box. And path/to/src refers to the share on the Windows machine, but since I've mounted that share, is the path to the source the same as the path to the destination?
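Not quite: once the share is mounted, the source is simply a local path under the mount point, which is distinct from the destination path. A sketch with an assumed mount point and destination:
Code:
# share mounted at /mnt/winshare (assumed); both arguments are now local paths
rsync -av /mnt/winshare/Folder1/ /home/user/Folder1/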
I'm trying to set up my trusty Fedora box with rsync over ssh to back up my Windows machines at home, and I need help configuring the rsync server (I'm using DeltaCopy as the rsync client on the Windows side). I tried a few dry runs, but it seems Windows can't see the Linux box; the rsync job just hangs forever and never does anything. I should mention that ssh works fine.
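DeltaCopy normally talks to an rsync daemon on TCP 873 rather than tunneling over ssh, so an eternal hang is often a missing daemon or a blocked port. A minimal daemon config sketch; the module name and path are assumptions:
Code:
# /etc/rsyncd.conf
uid = nobody
gid = nobody
use chroot = yes
[backup]
    path = /srv/backup
    read only = false
Then start the daemon with rsync --daemon and open TCP 873 in the Fedora firewall.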
I run
rsync -r -v -e ssh root@nn.nn.nn.nn:/usr/local/websites/* /usr/local/websites
and each time I run it, it copies everything, all files. I thought rsync was only supposed to copy files that had been added or modified.
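-r alone doesn't preserve modification times, so every copied file lands with a fresh mtime and the next run's size-plus-time quick check sees everything as changed. Adding -t (or using -a) lets unchanged files be skipped:
Code:
# -t preserves mtimes so subsequent runs can skip unchanged files
rsync -rtv -e ssh root@nn.nn.nn.nn:/usr/local/websites/ /usr/local/websites/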
How do I use rsync to set up a server from an existing one? I have one server where I have installed and tuned everything, and I want everything replicated onto the new server, excluding the network config (IP) and /proc.
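A hedged sketch of a whole-system clone over ssh: the pseudo-filesystems are excluded, and the network-config path is a Red Hat-style assumption to adjust for your distro:
Code:
rsync -aAXv \
  --exclude='/proc/*' --exclude='/sys/*' --exclude='/dev/*' \
  --exclude='/tmp/*' --exclude='/run/*' --exclude='/mnt/*' \
  --exclude='/etc/sysconfig/network-scripts/*' \
  / root@newserver:/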
I want to exclude all log files from being transferred from the src to the dest and delete all existing ones on the destination, so I'm using --exclude=*.log with --delete-excluded, which works great...
But I want to keep a certain log file intact on the destination. I want something like an --exclude-from-delete option.
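rsync's filter rules can express something close to that: a 'P' (protect) rule shields matching files on the receiver from the deletion pass while the exclude still keeps them out of the transfer. A sketch with a hypothetical filename:
Code:
rsync -av --filter='P keep-me.log' --exclude='*.log' --delete-excluded src/ dest/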
I am trying to synchronize the content of the directory my_dir/ from /home to /backup. This directory contains a file whose name has a double quote in it, such as to"to. Here is my rsync command:
rsync -Cazh /home/my_dir/ /backup/my_dir/
And I get the following message:
rsync: mkstemp "/backup/my_dir/.to"to.d93PZr" failed: Invalid argument (22)
For info, rsync works fine when the synchronized filenames contain single quotes, parentheses, and spaces. So why does it choke on a double quote?
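mkstemp failing with EINVAL usually means the destination filesystem itself rejects the character (NTFS and CIFS forbid double quotes in names) rather than rsync mishandling it. A quick hedged test outside rsync, assuming /backup is the mount in question:
Code:
# if this fails too, the filesystem, not rsync, is refusing the '"' character
touch '/backup/my_dir/test"quote'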
I'd like to know if it's possible to automatically mount a USB drive and fire up rsync to sync it with a directory. Specifically, I'd like to copy as much data as the drive can hold and only delete the oldest files if space is needed. I assume I'd do something like this with a script, but my problem is where to start.
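A place to start is a small script keyed to the drive's filesystem label (label and paths are assumptions), which can be run by hand or triggered from a udev rule; the oldest-file pruning could then be bolted on with find or ls -tr:
Code:
#!/bin/bash
# mount by label, sync, unmount: a minimal starting point
mount LABEL=BACKUP /mnt/usb || exit 1
rsync -av /home/user/data/ /mnt/usb/data/
umount /mnt/usb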