Fedora :: Backing Up Using "rsync"?
Dec 5, 2010
I'm using rsync for the first time to back up stuff to an external hard disk. I used the "rsync -av myfile remote-file" command. It seems much faster than going through the KDE copy-and-paste method. Lots of files to copy. Right now it looks like it is hung up, but I'll let it do its thing. Is this a good method, or should I tweak it in some way that is more efficient?
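For a copy like this, a minimal sketch (assuming the source is a directory such as /home/user/myfiles and the external disk is mounted at /media/external - both placeholders) would be something along these lines:
Code:
# Trailing slash on the source copies the directory's contents rather than the directory itself
rsync -avh --progress /home/user/myfiles/ /media/external/backup/
Adding --progress makes long runs look less like a hang; on a first run rsync spends a while just building the file list, which can appear frozen.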
View 2 Replies
Feb 19, 2011
I am backing up my data using rsync via SSH. I am trying to find a solution that lets me push my backups from Windows and Ubuntu machines to an Ubuntu server. I don't like solutions that require my Windows machines to have shared drives - not viable for laptop users who may piggyback on free wireless networks. I would also like a solution with a GUI for the desktop user, though that's not strictly required. I will need a solution that allows me to restore files by date; I make a lot of changes to files daily and sometimes need an older copy of a file.
I love using rsync because it is a very simple solution and extremely quick. The only thing I don't know how to do is recover different versions of a file. It would be nice to see a visual representation of all the files and the different dates of each. I know I might be asking a little too much, but it is, after all, Linux.
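One common way to get per-date restore points with plain rsync is hard-linked snapshots via --link-dest. This is only a sketch, pushed over ssh from the client, with hypothetical host, user and paths:
Code:
#!/bin/sh
# Hypothetical destination - adjust host, user and paths to your setup
DEST=backupuser@server:/backups/$(hostname)
TODAY=$(date +%Y-%m-%d)
YESTERDAY=$(date -d yesterday +%Y-%m-%d)

# Files unchanged since yesterday become hard links, so each dated
# directory looks like a full copy but only changed files use new space
rsync -a --delete \
    --link-dest=../"$YESTERDAY" \
    /home/me/ "$DEST/$TODAY/"
Restoring a file "by date" is then just copying it out of the dated directory you want. Tools like rsnapshot and rdiff-backup wrap exactly this idea and add rotation for you.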
View 2 Replies
View Related
Apr 11, 2011
This is a quick one; I don't have any problems yet, I just want to check that this is going to do what I think it will. I added the line
Code:
00 01 * * * rsync -avz --delete /local-storage /mnt/usbackup
to my crontab file. Am I to understand that this will back up /local-storage to my external
[code]...
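Yes - that cron line runs the copy every night at 01:00. A low-risk way to double-check exactly what it will do before trusting it (a sketch using the same paths) is a dry run from a shell:
Code:
# -n/--dry-run lists what would be copied or deleted without touching anything
rsync -avzn --delete /local-storage /mnt/usbackup
Note that without a trailing slash on /local-storage the files end up under /mnt/usbackup/local-storage/ rather than directly in /mnt/usbackup/.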
View 4 Replies
View Related
Aug 23, 2010
On Fedora 12 - one drive, two partitions: a small partition for /boot and a big one for everything else - sda1 and sda2.
I would like to use tar to copy them to a different drive. That way I not only back them up but also reorganize them.
Approach 1 - the two-drive approach
On second drive (sdb) created 2 partitions with fdisk, used mkfs to put file systems into them.
sdb1 was easy: I created a directory /target, mounted sdb1 there, then used a tar | tar pipe to copy /boot to /target. That appears to be OK.
It is not so obvious how to do sdb2. I can mount it at /target, but:
(1) if I copy / to /target, is there recursion? I tried --exclude=/boot --exclude=/target, but it went ahead and copied /target anyway.
Also, there are a number of things mounted onto / - see the output of mount below.
Code:
/dev/mapper/vg_knox-lv_root on / type ext4 (rw)
proc on /proc type proc (rw)
sysfs on /sys type sysfs (rw)
[Code]....
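One way to sidestep both the recursion and the pseudo-filesystems is GNU tar's --one-file-system option, which skips anything mounted below the tree being archived (/proc, /sys, /boot on sda1, and /target itself). A sketch, assuming sdb2 is mounted at /target:
Code:
# Archive / but stay on the root filesystem only, preserving permissions,
# then unpack into the mounted sdb2
cd / && tar -cpf - --one-file-system . | (cd /target && tar -xpf -)
Depending on the tar version you may still need to recreate empty mount-point directories (boot, proc, sys, dev) on the destination before it will boot cleanly.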
View 3 Replies
View Related
Nov 17, 2009
I would like to set up a system for backing up files, and possibly also to keep music on it and listen to it over the network. I am wondering whether it would be better to use a separate FTP server or to buy one of the NAS enclosures and a couple of hard drives to put in it. I am assuming that the NAS would be accessed via NFS. I have never run an FTP server and I have never used a NAS. I am just looking for pros and cons of each - which service (FTP/NFS) would be better for this task?
View 4 Replies
View Related
Jan 27, 2011
I have a couple of backup scripts. I use a rather cumbersome method to back up Evolution. I finally realized I can back up Evolution easily from its File menu. Can I do this directly from the command line, so that I can use it in a script?
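I don't know of a documented command-line switch for the File-menu backup, but on the assumption that Evolution of that era keeps its data under ~/.evolution plus its GConf settings, a scriptable approximation might look like this sketch:
Code:
#!/bin/sh
# Assumes an Evolution 2.x-style layout under ~/.evolution; stop Evolution first
evolution --force-shutdown
# Dump the GConf settings and tar them up together with the mail/contacts data
gconftool-2 --dump /apps/evolution > "$HOME/evolution-gconf.xml"
tar -czf "$HOME/evolution-backup-$(date +%Y%m%d).tar.gz" \
    -C "$HOME" .evolution evolution-gconf.xml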
View 2 Replies
View Related
Jul 22, 2011
I just started playing with iscsi and have some issues.
This below works great (I can connect to this new device from my clients):
Code:
tgtadm --lld iscsi --op new --mode target --tid 1 -T iqn.2011-07.home.joma:jomamgmt01.data
tgtadm --lld iscsi --op new --mode logicalunit --tid 1 --lun 1 -b /dev/sdb
tgtadm --lld iscsi --op bind --mode target --tid 1 -I ALL
[Code].....
The backing-store does not get added.
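Since the part that fails is truncated above, only a generic check: to see whether the LUN actually attached, show the target and look for the backing store path under LUN 1.
Code:
# Lists each target, its LUNs and the backing store path for each
tgtadm --lld iscsi --op show --mode target
If the logicalunit command is the one that fails, a common cause is the block device being in use at the time (mounted, or held by LVM or multipath).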
View 3 Replies
View Related
Mar 27, 2011
Is there a way to share your Evolution settings from, say, a desktop to a laptop without backing up and importing every time? Kind of like having an Outlook PST file on a server that anyone can open? I work at my desktop a lot but sometimes at my laptop in the living room.
View 2 Replies
View Related
Nov 17, 2010
Thought I'd post it here because it's more server related than desktop... I have a script that does:
[Code]....
This is used to sync my local development snapshot with the live web server. There has to be a more compact way of doing this? Can I combine some of the rsyncs? Can I make the rsync set or keep the user and group affiliations? Can I exclude .* yet include .htaccess?
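Order matters for rsync filter rules: an include for .htaccess placed before the exclude for .* wins, because the first matching rule applies. A sketch of a single combined call (paths and host are placeholders; -a implies -rlptgoD, so owner and group are kept when the receiving side runs as root):
Code:
# One rsync instead of several; the include rule is checked before the
# later exclude, so .htaccess survives while other dotfiles are skipped
rsync -avz --delete \
    --include='.htaccess' \
    --exclude='.*' \
    /home/me/devsite/ user@liveserver:/var/www/site/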
View 6 Replies
View Related
Jan 7, 2011
When I run rsync --recursive --times --perms --links --delete --exclude-from='Documents/exclude.txt' ./ /media/myusb/
where Documents/exclude.txt is
- /Downloads/
- /Desktop/books/
the files in those directories are still copied onto my USB.
And...
I used fetchmail to download all my gmail emails. When I run rsync -ar --exclude-from='/home/xtheunknown0/Documents/exclude.txt' ./ /media/myusb/ I get the first image at url.
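One way to see what rsync is actually doing with those patterns (a sketch using the same command) is a verbose dry run; nothing is copied, it just lists what would transfer:
Code:
# -n = dry run; the file list shows whether Downloads/ and Desktop/books/
# are being excluded or still picked up
rsync -n -v --recursive --times --perms --links --delete \
    --exclude-from="$HOME/Documents/exclude.txt" ./ /media/myusb/
Also worth checking: a leading '/' in a pattern anchors it to the root of the transfer (the ./ directory you run rsync from), so the command has to be run from the directory that directly contains Downloads/ for "- /Downloads/" to match.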
View 9 Replies
View Related
Apr 12, 2011
I have a tiny shell script to rsync files between two servers and remove the source files.
This script works fine, when it has been initiated manually or even when the rsync command is executed on the command line.
But the same script doesn't work, when I try to automate it through crontab.
I am using the 'abc' user to execute this rsync instead of root, because root logins are restricted on all of our servers.
As I mentioned earlier, manual execution works like charm!
When this rsync.sh is initiated through crontab, it runs the first command (chown abc.abc ...) perfectly, without any issues. But the second line is not executed at all, and there is no log entry I can find at /mnt/xyz/folder/rsync.log.
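Cron jobs run with a minimal PATH, no terminal and no ssh agent, so a missing binary or a password prompt disappears silently. Two things that usually expose the problem are absolute paths and captured stderr; this is just a sketch, and the schedule, source directory, key path and remote host are placeholders:
Code:
# crontab entry: capture everything the script prints
30 1 * * * /home/abc/rsync.sh >> /mnt/xyz/folder/cron-debug.log 2>&1

# inside rsync.sh: absolute paths, and name the ssh key explicitly
/bin/chown abc.abc /mnt/xyz/folder/*
/usr/bin/rsync -av -e "/usr/bin/ssh -i /home/abc/.ssh/id_rsa" \
    /mnt/xyz/folder/ abc@remotehost:/backup/folder/ >> /mnt/xyz/folder/rsync.log 2>&1
Whatever lands in cron-debug.log (a "command not found", an ssh host-key or passphrase prompt) normally points straight at why the second line never runs.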
View 6 Replies
View Related
Sep 18, 2009
I just tried to sync files from one server to another. After the sync process, I found the files are bigger than the original ones.
I looked it up on the web and found someone mentioning the rsync daemon. Do I have to run the daemon on one server before I run rsync?
The command I used is rsync --partial --progress -r source destination
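No daemon is needed for a push or pull over ssh; the daemon is only for rsync:// module access. As for the growth, if the originals contain sparse files or hard links, a plain -r copy expands them, which would explain destination files being larger. A sketch that preserves both (same source and destination as above):
Code:
# -a keeps permissions/times/symlinks, -H preserves hard links,
# --sparse recreates holes in sparse files instead of writing zeros
rsync -aH --sparse --partial --progress source/ destination/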
View 1 Replies
View Related
Jul 21, 2010
I use rsync to copy the files and directories under the /var/www/html/mydir directory, but these two files (/dir4/1.html and /dir4/2.html) can't be rsynced to the destination machine.
My rsync configuration file is below...
View 2 Replies
View Related
Dec 8, 2010
I'm using Ubuntu 10.04 LTS server and Postgresql 8.4. I have a .sh script that is run by cron every other hour. That works fine. The .sh script includes an rsync command that copies a postgresql dump .tar file to a remote archive location via ssh. That fails when run by cron; I think because it is (quietly) asking for the remote user's password (and not getting it). I set up the public/private ssh key arrangement. The script succeeds when run manually as the same user that the cron job uses, and does not ask for the password. I am able to ssh to the remote server from the source server (using the same username) and not get the password prompt (both directions), so why doesn't rsync work? I even put a .pgpass file in the root of that user's directory with that user's password, and the user/password are identical on both servers.
I think the problem is rsync is not able to use the ssh key correctly. I tried adding this to my script but it didn't help.
Code:
Here is the rsync command embedded in the .sh script.
Code:
Here is the cron entry:
Code:
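Since the actual command and cron entry are elided above, only a generic sketch: one thing that often fixes exactly this symptom is pointing rsync's ssh at the key explicitly, because cron's environment has no agent and may resolve a different HOME. The key path and remote details here are placeholders:
Code:
# Spell out the private key so the cron environment doesn't matter;
# -o BatchMode=yes makes ssh fail immediately instead of silently
# waiting for a password prompt
rsync -av -e "ssh -i /home/backupuser/.ssh/id_rsa -o BatchMode=yes" \
    /var/backups/pgdump.tar backupuser@archivehost:/srv/archive/
The .pgpass file only affects PostgreSQL connections; it has no effect on ssh or rsync.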
View 6 Replies
View Related
May 5, 2011
I bought brand new hardware - an S-AM3 GIGABYTE GA-890GPA-UD3H - and installed Fedora 14 64-bit. I have an enterprise backup tool, rsnapshot, which backs up over ssh using rsync. The rsync version is 3.0.8 on the server and the workstations. The server is Ubuntu 8.04 32-bit. There were no problems with FC5, FC8, Ubuntu and so on as clients. When I do a cronned backup, sometimes the client hangs: no ping works and no ssh connection, and I don't know whether Ctrl+Alt+F1 and so on still respond. I created a 250 GB file and copied it with scp and with rsync over ssh, but I could not reproduce the failure.
View 13 Replies
View Related
Sep 9, 2011
My source folder contains 424.8 GB in 502,474 files. My destination folder was created fresh, and after the copy it contains 394.0 GB in 486,514 files. I am running grsync with root authority. The only options are to preserve time, permissions, owner and group, and to produce verbose output and transfer progress. There are no exceptions specified to skip any files.
The rsync command is this:
rsync -r -t -p -o -g -v --progress -c -l -H -D -s /mnt/Backups/monthly.3/ /mnt/EX-Fantom/monthly.3/
I have run it again to give it a chance to get it right. Same result. The source is in an rsnapshot folder, but this is the first backup, the original, containing only whole files, not links.
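A dry run with itemized output is a low-risk way to see which files rsync still thinks differ, and why; a sketch using the same options:
Code:
# -n: change nothing; -i: print one line per differing file with reason
# codes; redirect to a file so half a million entries stay greppable
rsync -n -i -r -t -p -o -g -c -l -H -D -s \
    /mnt/Backups/monthly.3/ /mnt/EX-Fantom/monthly.3/ > /tmp/rsync-diff.txt
If the external drive is formatted FAT or NTFS, anything that depends on ownership, permissions, symlinks or hard links can silently fail to apply, which would show up in that list run after run.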
View 3 Replies
View Related
Feb 26, 2010
I'm trying to set up my trusty Fedora box with rsync over ssh to back up my Windows machines at home, and I need help configuring the rsync server (I'm using DeltaCopy as the rsync client on the Windows side). I tried a few dry runs, but it seems Windows can't see the Linux box; the rsync job just hangs forever and never does anything. I should mention that ssh works fine.
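As far as I know, DeltaCopy normally speaks the rsync daemon protocol on TCP 873 rather than tunnelling over ssh, so a hang while ssh works often just means nothing is listening on 873 or the firewall blocks it. A minimal sketch of a daemon setup on the Fedora side (module name, path and uid/gid are placeholders):
Code:
# /etc/rsyncd.conf - one module exported for the Windows backups
[winbackup]
    path = /srv/backups/windows
    read only = no
    uid = backup
    gid = backup

# then start the daemon and open the port
rsync --daemon
iptables -I INPUT -p tcp --dport 873 -j ACCEPT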
View 3 Replies
View Related
Aug 24, 2009
I'm syncing a server over the internet with rsync, but it only works for a few hours before the backup fails with a "No route to host". I can restart the job and it will pick up where it left off, but is there an automated way to do this, or to protect against a connection failure? I have about 170 GB to copy over initially, but I can only get through about 4-5 GB before the connection drops - manually restarting the sync every time it drops will make the initial backup take days...
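A small retry loop around rsync handles the dropped connections automatically; with --partial, an interrupted file is resumed rather than restarted from scratch. A sketch, with placeholder paths and host:
Code:
#!/bin/sh
# Keep retrying until rsync exits cleanly (exit code 0)
until rsync -az --partial --timeout=60 \
        /data/ backupuser@remotehost:/backups/data/
do
    echo "rsync dropped, retrying in 60s..." >&2
    sleep 60
done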
View 2 Replies
View Related
Nov 23, 2010
I use Fedora 14 32-bit at home and I do not have internet access there, so I download all the packages Fedora 14 needs at my workplace and move them home on a USB flash drive. I use rsync to download the packages; at home I make a local repo and install everything I need from it. Right now I want to update my kernel, and I want to use rsync to download the kernel update for Fedora 14. How can I do this so that rsync downloads only the kernel-related updates and nothing else?
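rsync's include/exclude filters can restrict a mirror pull to kernel packages only. This is just a sketch: the mirror address and module path are placeholders you would replace with a real Fedora 14 updates mirror that offers rsync access.
Code:
# Keep directories (so rsync can descend), keep kernel*.rpm, drop the rest;
# --prune-empty-dirs avoids recreating the whole empty directory tree
rsync -av --prune-empty-dirs \
    --include='*/' \
    --include='kernel*.rpm' \
    --exclude='*' \
    rsync://mirror.example.org/fedora/updates/14/i386/ \
    /media/usbflash/f14-kernel-updates/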
View 3 Replies
View Related
Jan 28, 2011
I am trying to sync data from Server A to Server B. The destination on Server B is a CIFS share, and I need to preserve timestamps, permissions, etc. on all the data that I transfer. During the rsync process, I receive thousands of errors like the one below:
rsync: chown "/LBDCASAN001/JasonHarper/files/1259810304676/2010-12-22-01-00-03/0x22/0xc8/0x43/0x0a" failed: Permission denied (13)
I'm not sure if it's related at all, but my mount point on Server B has the permissions set as: drwxr-xr-x 2 root root when it is unmounted. When I mount the CIFS share, the mount point permissions change to: drwxrws---+ 3 root root
Also, here is the line from my /etc/fstab that mounts the share:
//X.X.X.X/LBXXXXX001 /LBXXXXX001 cifs username=LBXXXXX001,password=XXXXXXX!,uid=0,gid=0 0 0
When I perform the rsync, I'm authenticating to Server B from Server A as root.
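A CIFS mount with fixed uid=0,gid=0 cannot honour chown from the client side, so rsync's owner/group options (implied by -a) will always report Permission denied there. A sketch that simply stops asking for ownership changes while still keeping timestamps (source path shortened to a placeholder):
Code:
# Drop owner/group (and POSIX perms, which CIFS only emulates) but keep
# times and the directory structure
rsync -rtv --no-owner --no-group --no-perms \
    /data/source/ /LBDCASAN001/
If ownership genuinely has to survive, it needs to be preserved on the server side (or the share mounted with UNIX extensions), not pushed through chown on the CIFS client.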
View 6 Replies
View Related
Feb 19, 2009
I am using FC 10. I used rsync to copy the MATLAB software from the local LAN, but MATLAB does not work. The error is:
Quote:
License checkout failed.
License Manager Error -96
MATLAB is unable to connect to the license server.
[code]....
I have set SELinux to Permissive for the current session, while the default mode is Enforcing. I did the rsync in this way:
Code:
rsync mecsmrao@10.16.4.32:/home/pkg/lic/matlab-7.6/ /usr/local/pkgs/matlab_7.6_r2008a/ -avtpog -e ssh
What am I supposed to do?
View 3 Replies
View Related
Jan 16, 2011
I've been trying to get a cold backup of a 1 TB database this weekend; I started the whole process Friday and have yet to get a single device backed up. I'm using rsync to copy files from my /u17 through /u29 mounts, and the USB drive is formatted ext3. Each time, the rsync starts off fine, but after about 30 minutes it fails with any number of errors, the most prevalent being "Read-only file system" and "broken pipe". Here are samples:
rsync: writefd_unbuffered failed to write 4 bytes: phase "unknown" [sender]: Broken pipe (32)
rsync: write failed on "<path to one of my .dbf files>": Read-only file system (30)
rsync: chown "<path>" failed: Read-only file system (30)
rsync: rename "<path of .dbf> -> <rename attempt>": Read-only file system (30)
rsync error: error in file IO (code 11) at receiver.c(305)
rsync: connection unexpectedly closed (16787 bytes received so far) [generator]
rsync error: error in rsync protocol data stream (code 12) at io.c (359)
I've unmounted and remounted a number of times and kicked off the rsync again and it goes about 30 minutes and I get the same errors. This was all as user 'root', so I tried to do the rsync as user 'oracle' and I get the same thing. After looking into the device as it is recognized, it is being picked up by multipath. Would the fact that a usb device is being managed by multipath be a problem? Currently it is mpath15. How would I add usb devices to the mpath blacklist? The usb is being assigned /dev/sdbj but I'm worried that it would change at a reboot. I've searched the web for all of these errors and still no answer.
Note: I've also just tried to do a copy using 'cp' and got the same "Read only file system" errors. I can sometimes touch a file and sometimes I can't. I want to try and get this backup done this weekend.
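If multipath really has claimed the USB disk, dm-multipath can hold or remap the device, and the /dev/sdbj name is indeed not stable across reboots; blacklisting the device in /etc/multipath.conf is the usual fix. A sketch - the vendor/product strings are placeholders, check the output of multipath -ll for the real ones:
Code:
# /etc/multipath.conf
blacklist {
    device {
        vendor  "Seagate"
        product "FreeAgent*"
    }
}

# then, with the USB drive unmounted, flush the existing map and reload
multipath -F
service multipathd restart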
View 9 Replies
View Related
Feb 17, 2011
I am trying to back up my CentOS 5.5 webserver to our local Windows SBS 2003 server in the office. I have set up ssh and cwRsync on the Windows server and have confirmed that the Linux server can reach the Windows server via the command: ssh RemoteUser{AT} It asks for a password and connects fine. However, when I run this command to start the backup: rsync -avz -e ssh home/account/public_html/some/small/directory/ remote_user{AT}/cygdrive/c/backup/destination/directory/
I get this error after entering the password: protocol version mismatch -- is your shell clean? (PS: I had to use {AT} instead of the proper character because the forum thought I was posting a URL.)
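That error almost always means something in the remote account's startup files prints text on non-interactive logins, which corrupts the rsync protocol stream. The rsync FAQ test for it, as a sketch (host written the same {AT} way, with a placeholder name):
Code:
# If the remote shell is "clean", out.dat will be exactly 0 bytes
ssh RemoteUser{AT}windows-host true > out.dat
ls -l out.dat
Anything that shows up in out.dat (an echo, a banner, a fortune) needs to be guarded by an interactive-shell check in the remote .bashrc/.profile.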
View 9 Replies
View Related
Dec 1, 2009
Has the Fedora Project closed its official rsync server, or am I doing something the wrong way?
View 1 Replies
View Related
Aug 24, 2011
Whenever I transfer large files using cp, mv, rsync or Dolphin, the system slows down to the point that it's unusable. It will sometimes hang completely and not accept any input until the file is transferred. I have no clue what could be causing this problem, but I know it shouldn't be happening. I am using Fedora 15 (2.6.40.3-0.fc15.x86_64) with KDE 4.6. I have a Phenom II 955 processor, 6 GB of system RAM, and the OS and swap file are on an 80 GB SSD. Copying files within the SSD doesn't cause any problem, but moving files between my other two large HDDs causes the extreme slowdown. Using htop I can see that my system load jumps to between 3 and 4, but my RAM and CPU usage stay low during the transfer. Here are two commands that take about 10 minutes to run and make the system unusable while they run; they usually transfer around 2-20 GB of data:
cp -a /media/data.1.5/backup/Minecraft/backups/* /media/data.0.5/backup/Minecraft/backups/
rsync -a /media/data.1.5/backup/ /media/data.0.5/backup/
/media/data.1.5/ is the mount point for a 1.5 TB internal SATA drive, and /media/data.0.5/ is the mount point for a 500 GB internal SATA drive.
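One thing that usually keeps the desktop responsive is running the copy at idle I/O priority so it yields to interactive disk access; a sketch using the same transfer:
Code:
# ionice -c3 = "idle" I/O class, nice -n 19 = lowest CPU priority
ionice -c3 nice -n 19 rsync -a /media/data.1.5/backup/ /media/data.0.5/backup/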
View 6 Replies
View Related
Mar 6, 2010
I'm currently learning to use rsync to back up my music collection. I have a Firefox tab open to the rsync manual page(s) and have been reading man rsync and running experimental rsync operations. I've been doing this for the last 3-4 hours. I've used rsync for this purpose in the past with disastrous results: what was, and is once again (after a month and a half of file pruning), a 9,000-file music collection had mysteriously grown to over 25,000 music files and 80 GB of data! This was likely because I didn't really know what I was doing with rsync and had never spent much time learning about the parameters, what they do and how they relate to my goal. Here are the particulars:
* Source drive is a 500GB disk, /media/sata500/music/.
* Destination drive is a 250GB USB disk, /media/FreeAgent/music, connected to the same computer that houses the 500GB disk.
* I want to copy or backup files from /media/sata500/music to /media/FreeAgent/music.
* I do not want to create ANY duplicates of files that exist.
* I only want to add files to the destination drive if they are new on the source drive, like if I rip a CD and add the contents to the source. I want them copied over next time I run rsync.
Here's the rsync command in its most recently used form, probably very immature at this point.
Code:
rsync -t -r -vv --stats -i --log-file=/home/glenn/rsync.log /media/sata500/music/* /media/FreeAgent/music/
This appears to have copied all files and folders, and I'm satisfied that my goal has been met with some success. To convince myself of this, I ran the command and, once it was complete, added two new songs to their respective folders on the source drive, then ran the same command again. The resulting output was
[code]....
Two files transferred - exactly what I want. Both folders now house 20,931 files and use 40.6 GB; identical as far as I can tell. What I'm concerned about are timestamps, play-count data, etc. - anything that changes the original file. I don't want such a change to cause a file to be transferred, because I'm afraid the new file will be created alongside the old file of the same name, starting this whole music-collection expansion all over again. I've invested a lot of time and effort to prune it down to where there are virtually no duplicates and the albums are correct, in that they contain the proper songs in the proper order.
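For what it's worth, rsync never creates a second copy next to a file of the same name: a changed timestamp or tag just makes it overwrite that same path in place. The earlier explosion was more likely nesting from a missing trailing slash, or from never deleting files that had been removed at the source. A check-then-run sketch for this layout:
Code:
# Dry run first: see what would change without touching the destination
rsync -rtvn --delete /media/sata500/music/ /media/FreeAgent/music/

# Then the real run; --delete keeps the backup an exact mirror, so pruned
# files disappear from the destination instead of accumulating
rsync -rtv --delete /media/sata500/music/ /media/FreeAgent/music/
Using the trailing slash instead of /music/* also avoids the shell glob quietly skipping hidden files and directories.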
View 14 Replies
View Related
Nov 9, 2015
Debian Jessie XFCE. I use this script to periodically back up my home directory to an external drive:
Code:
cp -R -u /home/albert/* /media/albert/"Expansion Drive"/albert/
The configuration files, for example .icedove, are not copied. Can I modify the cp command to include them also?
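The * glob is the culprit: the shell expands it and skips dotfiles, so cp never even sees .icedove. Copying the directory's contents via "." (or switching to rsync -a) picks them up; a sketch with the same paths:
Code:
# Option 1: same cp, but "." includes hidden files and directories
cp -R -u /home/albert/. "/media/albert/Expansion Drive/albert/"

# Option 2: rsync equivalent; unchanged files are skipped by default
# (similar to -u) and hidden files are included
rsync -a /home/albert/ "/media/albert/Expansion Drive/albert/"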
View 14 Replies
View Related
May 27, 2011
After looking around a bit I started using backup2l to generate backups on my local disk.
It also brags about:
An integrated split-and-collect function allows to comfortably transfer all or selected archives to a set of CDs or other removable media.
View 1 Replies
View Related
Mar 24, 2010
I've been putting my DVDs onto my home network for a while without any problems until now. When I put in Doubt and try to rip it, the title tree (in both K9 and DVD::Rip) shows several titles with a large file size and the same duration (that of the film). About three of the titles are exactly the same file size and duration, and about three more differ slightly in both.
I think this must be some kind of anti-piracy measure. Obviously six files of 5 GB each wouldn't fit on a DVD, so it must be some kind of error/trick. When I try to back up the DVD it doesn't work. Has anyone else experienced this problem?
View 3 Replies
View Related
May 21, 2010
I've got a lot of photos, home movies, documents, etc. on my machine (currently running ubuntu-10.04-desktop-amd64), and I've had issues in the past with hard drives failing. Fortunately I've not lost anything substantial, as all the important stuff was recovered from backups I had made on blank DVDs. My last desktop rebuild was about two years ago, and I decided that using blank DVDs to back things up was no longer practical: they're too small and the amount of data I need to back up is too large. Hard drive storage used to be quite expensive, but drives keep getting cheaper; I bought two pretty large, identical drives for $70 each. My machine has three hard drives in it: a small solid-state drive with the operating system on it, a second with all my data on it (mount point /data), and a third to back up the stuff I don't want to lose (mount point /backup). I use the following script to automatically make another copy of everything I want to back up onto the second drive.
Code:
#!/bin/sh
rsync -av --progress --delete --log-file=/home/paul/.backup_logs/rsync.home.$(date
[code]....
This is automatically run at 9pm every night as a root cron job. The two lines that begin with "rsync" do the actual work of backing up my data. The two lines that begin with "chown" are just for convenience: they change the owner of the backup log to me so I can easily delete the logs without being hindered by ownership issues (the script runs as root, so the logs are created, and therefore owned, by root). I originally used this article to help set it up in the first place; it explains the process with a good degree of clarity and I can recommend it to anyone who would like more information on this technique. I'm not too worried about losing my system, as all the configuration files in the home directory are backed up. It doesn't take too much effort to rebuild the system, but the personal data is much harder to replace.
If the data drive fails, I will be able to replace it and restore the data from the backup drive. If the backup drive fails, I can just throw it away and make another backup. This whole strategy is designed to guard against a total drive failure. It offers no protection against accidental deletion of files, except for the small window between the deletion and when the script runs (anything deleted from the data drive is also deleted from the backup the next time the script runs). I've only really experienced hard drive failure three or four times; the last time, the computer started doing all sorts of weird stuff and I didn't understand the issues involved.
View 4 Replies
View Related