Ubuntu :: Use Rsync & SSH To Move A Backup Folder From Some Computers To A Server?
Mar 13, 2010
I am trying to use rsync and ssh to move a backup folder from some computers to a server. I found a command that is supposed to do this, but I am having trouble getting it to work.
I am trying to sync file server data to a backup server machine with the command rsync -avu path/of/data ipaddress-of-backup-server:/path/where/to/save. After running it, it asks for the root password, and run manually it succeeds. But I want to make it automatic. For that I also tried a cron job and generated an authentication key, but I have not been successful at logging in automatically. Does anybody know how to authenticate root to log in for storing data on the backup server?
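The usual fix is passwordless SSH key authentication for whichever account the cron job runs as. A minimal sketch, assuming the backup server is reachable as backup-server (a placeholder) and the key has no passphrase:

Code:
# run once, as the user the cron job will run as
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
# install the public key on the backup server
ssh-copy-id root@backup-server
# test: this should no longer prompt for a password
rsync -avu path/of/data root@backup-server:/path/where/to/save
# crontab entry for a nightly run at 02:00:
# 0 2 * * * rsync -avu path/of/data root@backup-server:/path/where/to/save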
This should be a quick one. I'm trying to back up a single directory and its subdirectories on my Lucid server to a FreeNAS box across my network. This is what I'm using to do that: rsync -r -a -v -z * --delete freenas:DSIBackups. It almost works perfectly, except for one problem: when a file is deleted at the source, this command doesn't seem to delete it on the receiving end. I assumed that --delete would do that, but apparently not.
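A likely cause (an assumption, since the full setup isn't shown): the shell expands * to the files that currently exist, so names deleted at the source never reach rsync and --delete has nothing to act on. Passing the directory itself, with a trailing slash, lets --delete compare whole trees:

Code:
# sync the directory's contents rather than a shell glob;
# --delete can then remove files that vanished at the source
rsync -avz --delete /path/to/dir/ freenas:DSIBackups/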
I switched last summer from Windows (which I had used since Windows 95) to Debian. I've been using Debian Jessie for a couple of months now and I'm getting used to it a little.
There are problems here and there, but I can solve them with some reading on the web. Not really a big problem... till now.
I run Debian 8.2 on my PC (PC1). I bought an older PC (PC2) that I want to use as a backup server.
I'm using PC2 only for making backups; after the backup I switch it off again.
So I installed Debian 8.2 (net-install, without a DE and with SSH) on PC2 and tried to configure it to work as my backup location. I made a public SSH key and exported it to the root account (no problem) and to the user account (sensdeb), but there I got an "Access Denied" error.
I gave the user (sensdeb) sudo rights via the visudo file:
# User privilege specification
root    ALL=(ALL:ALL) ALL
sensdeb ALL=(ALL:ALL) ALL
I installed rsync.
The problem is that rsync only works when I use the root account.
I don't know how to give the user sensdeb the rights so that I can use that account for my backup tasks. Right now it's possible to sync with the root account, but that should not be the way to do it, as I have read many times.
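The fix is usually on the receiving side: the destination directory must be writable by the non-root user. A minimal sketch, assuming the backups land in /srv/backup on PC2 (a hypothetical path):

Code:
# on PC2: give sensdeb ownership of the backup destination
sudo mkdir -p /srv/backup
sudo chown -R sensdeb:sensdeb /srv/backup
# on PC1: rsync as the unprivileged user
# (note: reading every file on PC1 may still require running as root locally)
rsync -av /home/ sensdeb@PC2:/srv/backup/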
When I use the rsync command to back up my image files, it shows the following error message:
bash: line 1: /usr/bin/rsync: Argument list too long
rsync: connection unexpectedly closed (0 bytes received so far) [receiver]
rsync error: remote command could not be run (code 126) at io.c(463) [receiver=2.6.8]
The command I used is: rsync -avrl -e ssh cms@server:/data/cms/data/images/* /mnt/Backup/Intranet_cms_backup/images
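"Argument list too long" typically means the remote shell expanded the * glob into more arguments than it can pass to a command. Letting rsync walk the directory itself avoids the glob entirely; a sketch using the paths from the post:

Code:
# no glob: rsync enumerates the directory contents itself, so the
# argument list stays tiny no matter how many image files there are
rsync -av -e ssh cms@server:/data/cms/data/images/ /mnt/Backup/Intranet_cms_backup/images/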
I have an Ubuntu server and 4 Windows clients; I use PuTTY or Webmin. I would like some folders, for example "My houses", to be backed up every night to the Ubuntu server. Can somebody give me an easy way of doing this with rsync and SMB or CIFS?
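One simple pattern (a sketch, with hypothetical host name, mount point, and credentials) is to mount the Windows share on the server with CIFS and rsync from the mount via a nightly cron job:

Code:
# mount the Windows share (as root, or permanently via /etc/fstab)
mount -t cifs "//winclient1/My houses" /mnt/winclient1 -o username=backupuser,password=secret
# pull the files into the server's backup area
rsync -av /mnt/winclient1/ /srv/backups/winclient1/
# nightly crontab entry, e.g.:
# 0 1 * * * rsync -av /mnt/winclient1/ /srv/backups/winclient1/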
Is there a simple way to move the Sendmail queue folder? Presently it's at the default location, /var/spool/mqueue/, but when / recently ran out of space (my fault, storing backups there) it was unable to receive any more mail. There is plenty of space on another partition: my /var/opt/scalix location lives on another set of discs with lots of room. I created a folder called /var/opt/scalix/sendmail/mailqueue/ but am uncertain how to move the existing queue to it.
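The usual sequence (a sketch; QueueDirectory is the standard sendmail.cf option, but check your distribution's config layout) is to stop sendmail, move the queue files, and point the queue option at the new path:

Code:
# stop sendmail so no queue files are in flight
service sendmail stop
# move the existing queue to the new location
mv /var/spool/mqueue/* /var/opt/scalix/sendmail/mailqueue/
# in /etc/mail/sendmail.cf, change the queue directory option:
#   O QueueDirectory=/var/opt/scalix/sendmail/mailqueue
service sendmail start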
I have a backup folder which I need to prune, and I've been trying to do a find-and-destroy action on files in this folder which have not been modified for more than 30 days. I figure that if no users have complained about their files being missing for more than 30 days, it's safe to delete them from this folder.
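find can do exactly this; a sketch, assuming the folder is /srv/backup (a hypothetical path):

Code:
# dry run: list files not modified in the last 30 days
find /srv/backup -type f -mtime +30 -print
# once the list looks right, delete them
find /srv/backup -type f -mtime +30 -delete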
I'm looking for a way to sync directories between two computers, in a "two-way" fashion. Basically, I have a laptop and I have a desktop, and I want to keep a particular directory synced between the two machines. The easiest thing would be to have some kind of networked filesystem, but obviously this won't work because the laptop may or may not be connected to the internet at any moment. At any time I might be editing files on the desktop or on the laptop, and when the laptop is connected to the internet, I'd like all files on both machines to be synced to their most recent versions.
I thought I could do this with rsync, but now that I've looked into it a bit more, it seems like it works only for "one-way" syncing. In other words, files are synced from a server to a client or vice versa, but not both at once. First of all, am I right about that? And second, is there a program that will do what I want to do? OK, I guess you could do it with some SVN kind of thing, but that seems like overkill. I guess if there's nothing out there, it shouldn't be too hard to write a script myself to do it.
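rsync is indeed one-directional per run. Unison is the usual tool for true two-way sync over SSH; a minimal sketch, with hypothetical path and host name:

Code:
# reconcile the desktop's copy with the laptop's over SSH;
# changes propagate in both directions and conflicts are flagged
unison /home/user/work ssh://laptop//home/user/work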
I've tried to Google, but without much luck. What I would like to do is have a number of folders on my desktop, and their contents, replicated/duplicated into another folder on the same PC in real time. So, for example, if I were to change an OpenOffice document in a specific folder on my Desktop, it would be replicated/duplicated in real time. If I had three folders on my Desktop, A, B and C, they would also appear/be backed up (in real time) in a folder called /home/backup. Can this be done?
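One way to approximate real time on Linux is inotify: watch the folders and re-run rsync on every change. A sketch, assuming the inotify-tools package is installed and using the folder names from the post:

Code:
#!/bin/bash
# re-sync whenever anything under the watched folders changes
while inotifywait -r -e modify,create,delete,move ~/Desktop/A ~/Desktop/B ~/Desktop/C; do
    rsync -av --delete ~/Desktop/A ~/Desktop/B ~/Desktop/C /home/backup/
done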
When I position icons on the desktop in specific places, then choose to move a file or folder into another folder, all the icons arrange themselves back on the left side. This happened in an earlier version of KDE 4.x, disappeared in the next version, and has now reappeared. How do I keep this from happening? It makes using the desktop a pain in the you-know-what.
I am trying to create a backup script that will back up a single folder for a class I am in. I was wondering if I could get some help. If possible, I would also like to know how to write a script that can encrypt that same file. I will be putting the backup in my /home/usr/Backup directory. I am not trying to back up my whole system, just a single folder. I am using Fedora 11.
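A minimal sketch of both steps, assuming the folder to protect is ~/class-files (a hypothetical name) and symmetric gpg encryption is acceptable:

Code:
#!/bin/bash
# archive the folder, then encrypt the archive with a passphrase
tar czf - ~/class-files | gpg -c -o ~/Backup/class-$(date +%Y%m%d).tar.gz.gpg
# to restore later:
# gpg -d ~/Backup/class-YYYYMMDD.tar.gz.gpg | tar xzf -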
I have a folderA that contains a folderB, which contains a lot of files. I would like to get rid of folderB, but not its contents: I want those contents to be inside folderA. How can I accomplish this on the command line?
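A sketch using the names from the post (in bash, the dotglob option makes the glob pick up hidden files as well):

Code:
cd folderA
shopt -s dotglob   # make * match hidden files too (bash)
mv folderB/* .     # move everything up one level
rmdir folderB      # folderB is now empty, so rmdir succeeds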
I saw in a magazine a reference to using rsync to keep identical copies of folders. This looks like something I could find useful, as I have a large number of items in need of safe backup.
I have the folders on an old system on a home network and would like to copy these over to a USB Hard Drive.
Currently the folders reside on an SFTP server at xxx.xxx.xxx.xxx, and I wish to sync them to a USB drive on my laptop.
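rsync runs over SSH, the same service that provides SFTP, so pulling straight from the old system onto the USB mount usually works. A sketch with hypothetical user name, remote path, and mount point:

Code:
# pull the folders from the old system over SSH onto the USB drive
rsync -av user@xxx.xxx.xxx.xxx:/home/user/folders/ /media/usbdrive/folders/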
I have a Samba share to a Windows 7 computer. I do not know if I will be able to use Back In Time or not, so I want to know how to have rsync do my backup. I read the man page, but I'm not sure I understand it. This is on the same computer, to a different hard drive, to run every hour in a script. Leanne is the Windows 7 share and backup is the other hard drive in the computer: rsync -arvRzEP /media/leanne /media/backup.
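Wrapping that command in an hourly cron job is enough for a simple script; a sketch using the paths from the post (the log file name is hypothetical):

Code:
# crontab -e entry: run at the top of every hour and keep a log
0 * * * * rsync -arvRzEP /media/leanne /media/backup >> $HOME/leanne-backup.log 2>&1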
Is this possible? I am trying to do this: "Create folder from a filename and move the file into the folder". I have 500000+ files I need to do this with. Is there an easy way? I really don't want to download them all and make/move them with FileMonkey just to re-upload them.
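A sketch of the usual shell loop, assuming each folder should be named after the file minus its extension (that naming rule is an assumption; adjust the ${f%.*} expansion as needed):

Code:
#!/bin/bash
# for each file, create a folder named after it and move the file inside
for f in *; do
    [ -f "$f" ] || continue   # skip anything that isn't a regular file
    d="${f%.*}"               # folder name = filename minus extension
    mkdir -p -- "$d" && mv -- "$f" "$d/"
done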
I'm trying to learn how rsync works so I can back up my system. I tried:
Code:
rsync -azvv /home /media/Elements
I get a folder called home on my external hard drive, but when I use ls -l to see the permissions, they are all wrong. In my /home folder the permissions for /nathan are drwxr-xr-x 48 nathan nathan. The permissions on the backup /nathan folder are drwx------ 1 nathan nathan.
I also tried using the long version of -a, which is -rlptgoD, and that didn't work either. What do the 48 and the 1 mean when I use ls -l? When I look inside the /nathan folder the permissions are all screwed up too: a lot of the files are backed up as executable, and ownership is messed up as well. I also ran it with sudo, and that didn't help either; the permissions and ownership were still wrong.
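Those symptoms (everything executable, permission flags ignored, a link count of 1) are typical of an external drive formatted FAT or NTFS, which cannot store Unix permissions; the 48 and 1 in ls -l output are hard-link counts, not permissions. A sketch of how to check, and one workaround:

Code:
# see what filesystem the external drive uses
df -T /media/Elements
# if it is vfat or ntfs, either reformat it as ext4 (destroys existing data!)
# or back up into an archive, which preserves ownership and modes:
tar czf /media/Elements/home-backup.tar.gz /home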
When rsync has finished the update, or in the meantime, I need to move the updated files to a different location, like a date +%Y%m%d folder or something. The reason is that, because of development, I need the modified files, all of them, not just the last one. So I have to store them daily, but I don't want to store the whole directory, just the few files which were updated. Does that make sense?
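rsync can collect just the changed files with --compare-dest: files identical to the main mirror are skipped, so only the day's updates land in the dated folder. A sketch with hypothetical paths:

Code:
# copy only files that differ from the main mirror into a dated folder
rsync -a --compare-dest=/dest/dir/ /src/dir/ /daily/$(date +%Y%m%d)/
# then bring the main mirror itself up to date
rsync -a /src/dir/ /dest/dir/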
I found a script that runs any commands from a Dropbox folder. It seems to take the scripts I have from the remote folder, to the output folder, to the old folder, but it never seems to actually run the scripts; it just moves them from folder to folder. Here is the page of what I'm talking about: [URL]
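Without seeing the linked page it's hard to say what that particular script intends, but the core of such a runner is usually a loop that executes each script before archiving it; a purely hypothetical sketch:

Code:
#!/bin/bash
# run each pending script, capture its output, then archive it
for s in ~/Dropbox/remote/*.sh; do
    [ -e "$s" ] || continue
    bash "$s" > ~/Dropbox/output/"$(basename "$s")".out 2>&1
    mv "$s" ~/Dropbox/old/
done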
I am thinking of using rsync to sync my Music folder to another folder called Music on an external USB drive. I will be using the Scheduled Tasks front end to schedule the syncs. What should the syntax look like when I put it in Scheduled Tasks? I want this to be as simple as possible.
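A one-line mirror is about as simple as it gets; a sketch with hypothetical paths (the trailing slashes matter: they sync contents to contents):

Code:
rsync -av --delete /home/user/Music/ /media/usbdrive/Music/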
I'm hoping somebody can find something here that I haven't. I'm trying to use rsync to back up home directories to a NAS. First, I NFS-mounted the NAS and ran an rsync, and everything worked out fine; the transfer completed after a few hours and everything was transferred (lots of stuff!). I then decided that I don't want to leave the NAS mounted all the time, and I didn't want to automate mounting and unmounting of the NAS, as I didn't think I could produce a script that would work reliably enough. So I decided to start an rsync daemon on the NAS and update via that. I run the following command (results are included; the ^C is me killing it after it hangs).
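For reference, daemon-mode transfers use the double-colon (or rsync://) syntax against a module defined in rsyncd.conf on the NAS; a hypothetical sketch of both ends:

Code:
# /etc/rsyncd.conf on the NAS (hypothetical module definition):
#   [homes]
#       path = /volume1/backups
#       read only = false
# client side: push home directories to that module
rsync -av /home/ nas::homes/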
Well, I know there are issues when using rsync to copy files to an NTFS partition, like file permissions and so on. The thing is, I need to back up my music files periodically onto an NTFS partition from ext4. I really don't care about file permissions or any of that other stuff; when I use rsync, it should just update the mp3 files on my NTFS (external) disc with the new ones. Can I give this operation a go? I have a lot of far more important files on the external disc, and I don't want this rsync to corrupt or delete those files, because they are highly important.
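Dropping the permission-related flags avoids most NTFS complaints, and rsync only writes inside the destination directory you name, so the rest of the disc is not touched. A sketch with hypothetical paths:

Code:
# -rtv: recurse, preserve times, verbose; no ownership/permission flags
# only /media/external/Music is written to; other folders stay untouched
rsync -rtv /home/user/Music/ /media/external/Music/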
I have a Linux host acting as an iSCSI server for a Windows box. I want to keep an off-site backup, so I figure rsync will keep the iSCSI server synced with an off-site Linux host. I understand that rsync does block-level incremental transfer to conserve bandwidth. OK, awesome. The trick is that I also want an archival copy kept: say I want to go back to a revision of a file from 10 days ago, I need to be able to do that.
I was planning on using Backup Exec, since we currently have a licensed copy: throw the archives from Backup Exec onto the iSCSI server as well and have it keep a rotating 30-day backup, or something like that. The issue I see here is that this will be creating and deleting files as it does its daily backup rotation. I'm guessing rsync will see these as new files and likely retransmit everything on a daily basis. The question then becomes: is this assumption correct, or will it still know to do a block-level incremental transfer even when file names and such are changing?
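On the retransmission question: rsync matches its delta transfer per file name, so renamed or rotated archive files are retransmitted in full unless the --fuzzy option finds a similar basis file. For the archival side, the classic rsync pattern is --link-dest: each day's snapshot hard-links unchanged files against the previous day's, so 30 dated directories cost little more than one full copy plus the daily changes. A sketch with hypothetical paths:

Code:
# daily snapshot: unchanged files become hard links to yesterday's copy
TODAY=$(date +%Y%m%d)
YESTERDAY=$(date -d yesterday +%Y%m%d)
rsync -a --delete --link-dest=/backups/$YESTERDAY/ \
    /srv/iscsi/ offsite:/backups/$TODAY/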
Our backup script was working fine (ssh to the server, back up /home to a second hard drive on my computer). Then, right after an Ubuntu update, it quit working. I investigated and found that "something" had changed the label on the backup hdd to what looked like gibberish to me. But the script identifies the backup hdd by its UUID, which didn't change. Yet here is the error I get when the backup fails:
receiving file list ... done [took about 5 seconds]
rsync: mkdir "/media/14D9-3B1F/server-backup" failed: No such file or directory (2)
rsync error: error in file IO (code 11) at main.c(594) [receiver=3.0.6]
Note that the backup hdd IS mounted, the UUID is correct, and the folder 'server-backup' DOES exist. Does anyone have a clue for me? I'm moderately experienced in Linux and Ubuntu. Our server runs CentOS 5, and as stated, the backup ran fine for several weeks. I think there was a new Linux kernel in that update, but at this point, a while later, I don't know which one. The current kernel is 2.6.31-22-generic.
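One thing worth checking (an assumption based on the path in the error): updates sometimes change where removable drives are mounted under /media, so the drive may be mounted, just not at the path the receiving side expects. A quick sanity check to run before the rsync:

Code:
# confirm the drive is actually mounted at the path the script uses
mountpoint -q /media/14D9-3B1F || { echo "backup drive not at expected path"; exit 1; }
ls -ld /media/14D9-3B1F/server-backup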
I support a small business which has an Ubuntu server running as a file server. The server is running Ubuntu 10.04. There is one hard drive, which is mounted as /media/hdd. Each night this is backed up to an external USB hard drive mounted as /media/backup. The backup is carried out using the command:
Code:
rsync -av /media/hdd/ /media/backup/
Is there a way to encrypt this backup so that if the USB hard drive is plugged into another machine it cannot be read?
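One common approach is to encrypt the whole USB drive with LUKS: the rsync command stays exactly the same, but the disc is unreadable elsewhere without the passphrase. A sketch, assuming the drive's partition is /dev/sdb1 (check with lsblk first; formatting destroys the existing data):

Code:
# one-time setup: encrypt the partition and create a filesystem
cryptsetup luksFormat /dev/sdb1
cryptsetup luksOpen /dev/sdb1 backup
mkfs.ext4 /dev/mapper/backup
# before each backup: unlock and mount, then rsync as before
cryptsetup luksOpen /dev/sdb1 backup
mount /dev/mapper/backup /media/backup
rsync -av /media/hdd/ /media/backup/
umount /media/backup && cryptsetup luksClose backup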
I am looking for an automated backup system, and I like Bacula. I have 3 notebooks and a desktop computer that need regular backups. Now, I don't want to let them run all night just to do the backups, so I was thinking I could use wake-on-LAN to have Bacula wake up the machines, do the backups, and shut them down afterwards. While this may work with devices on the Ethernet, it won't work with the notebooks on the wifi. So is it possible to schedule the notebooks to automatically wake up from suspend or shutdown? Or is it possible to intercept a shutdown command, and if it is after a certain hour, call the Bacula director to start the backup first?
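A machine can schedule its own wake-up through the RTC alarm, which works no matter whether it is on wifi or Ethernet; a sketch using rtcwake from util-linux (times are hypothetical, and the command needs root):

Code:
# suspend to RAM now and wake in 6 hours (seconds)
rtcwake -m mem -s 21600
# or power off now and wake at an absolute time:
rtcwake -m off -t "$(date -d 'tomorrow 03:00' +%s)"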
I have a tiny shell script to rsync files between two servers and remove the source files.
This script works fine when it is initiated manually, or even when the rsync command is executed on the command line.
But the same script doesn't work when I try to automate it through crontab.
I am using the 'abc' user to execute this rsync, instead of root, as root logins are restricted on all of our servers.
As I mentioned earlier, manual execution works like charm!
When this rsync.sh is initiated through crontab, it runs the first command (chown abc.abc ...) perfectly, without any issues. But the second line is not executed at all, and there is no log entry I can find at /mnt/xyz/folder/rsync.log.
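The usual culprits for "works by hand, fails in cron" are cron's minimal environment (short PATH, no ssh-agent) and relative paths. A hedged sketch of the checklist in script form, with hypothetical key, host, and path names:

Code:
#!/bin/bash
# use absolute paths everywhere: cron's PATH is typically just /usr/bin:/bin
/usr/bin/rsync -av -e "/usr/bin/ssh -i /home/abc/.ssh/id_rsa" \
    /mnt/xyz/folder/ abc@remote-server:/dest/folder/ >> /mnt/xyz/folder/rsync.log 2>&1
# and in the crontab, capture the script's own errors too, e.g.:
# 0 * * * * /bin/bash /home/abc/rsync.sh >> /tmp/rsync-cron.log 2>&1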