Ubuntu :: Rsync Backup To Remote Server Of Large Dataset
Jun 18, 2010
I have Cygwin on Windows XP running rsync to a remote Ubuntu server over ssh using ADSL. My data set is about 20GB! But rsync backs up incrementally, so after the first backup the process should be relatively quick. Over ADSL, though, the first backup would take far too long. I was thinking of doing the first backup by copying the files to an external hard drive, then attaching the hard drive to my remote server and copying the files there. The idea is that rsync will pick up the files as if it had transferred them in the first instance, and the incremental backups will then carry on from there.
Does anyone have any experience with this, or can anyone offer advice? The external HD is FAT32, which is fine with Windows and should be fine with Ubuntu? From XP, right-click copy and then paste keeps the file dates intact on the external HD - is this enough to get rsync going incrementally?
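For what it's worth, pre-seeding like this generally works because rsync's quick check only compares file size and modification time. A minimal sketch of the follow-up incremental run from Cygwin, with hypothetical paths; --modify-window=2 allows for FAT32's 2-second timestamp resolution, so the pre-seeded files aren't re-sent over ADSL just because their mtimes differ slightly:
Code:
# run from the cygwin shell; source and destination paths are illustrative
rsync -av --modify-window=2 /cygdrive/c/data/ user@remote-server:/backup/data/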
View 1 Replies
Aug 27, 2010
This is my first time on this forum. I am a statistician. I am trying to subset a large dataset by specifying the starting and ending line. The dataset is pretty large (more than 300 million lines), containing around 1.2 million lines per person, so I would like to split the dataset into one file per person, consecutively. I tried writing R code, but R seems to have to read from the top down to the lines I want, even though I told it to skip the lines that other tasks had already read, so memory use grows with the task ID. Eventually I got kicked off by the administrator.
I guess the shell can do this much more simply and elegantly. My first thought was the "split" command, but the file has a 10-line header, so I can't split it into even-sized chunks directly.
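One hedged approach: set the header aside first, then stream the rest into split. The filename and chunk size below are assumptions based on the figures above:
Code:
# keep the 10-line header separately, then split the remainder into
# 1.2-million-line chunks (one per person, if records are consecutive)
head -n 10 bigfile.txt > header.txt
tail -n +11 bigfile.txt | split -l 1200000 - person_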
View 5 Replies
Feb 3, 2011
We have two servers: one is the webserver and the other is the MySQL server. When transferring a 2GB file from the webserver to the MySQL server, the webserver's connection to the MySQL DB server dies completely, and we need to restart the MySQL process for it to come back online. During this connection downtime, phpMyAdmin running on the MySQL server itself shows no problem running queries etc.
View 2 Replies
Nov 20, 2010
I'm configuring an rsync between two machines, A_Server --> B_Server, using the following script:
#!/bin/bash
#
# Backup script via rsync from RMP-1 to RMP-2.
[code]....
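The rest of the script is elided above, so purely as a hypothetical sketch, a backup script of this shape often boils down to something like the following (hostnames, paths, and options are placeholders, not the original script's):
Code:
# illustrative only - push a directory from A_Server to B_Server over ssh
rsync -av --delete -e ssh /srv/data/ backup@B_Server:/srv/backup/ \
    >> /var/log/rsync-backup.log 2>&1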
View 6 Replies
Mar 1, 2010
I have Ubuntu on both my laptop and desktop machines, both are connected to the same network. I back up the laptop to the desktop by running the following on the laptop:
rsync -avv --stats /home/alisdt alisdt@xxx.xxx.xxx.xxx:/home/alisdt/laptop_backup
(with the IP address of the desktop instead of the many x, obviously). Whenever rsync hits a large file (greater than a few MB), the network use rapidly drops to ~60KB/s (that's kilobytes, not bits). When I copy the same file to the same place using scp, I get > 500KB/s throughout the transfer. Things I've tried:
* mounting the desktop home dir on the laptop using SSHFS -- a simple file copy is fast, rsync is still slow
* ditto with NFS
* the rsync --whole-file option, in case the delta-transfer algorithm was choking on large files
* the rsync --inplace option
* HPN-SSH (http://www.psc.edu/networking/projects/hpn-ssh/) to enable dynamic windows and unencrypted bulk transfer, just in case it was some ssh bottleneck
I think it's either an rsync application problem, or a network problem that only affects rsync. Any ideas, or other ideas of what I can try to debug? In case it's relevant, I'm using 9.04 on both machines. (A standing bug prevents me from upgrading the laptop, and I haven't bothered to upgrade the desktop.)
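One more hedged thing worth trying: rule the ssh transport in or out by timing the same transfer with a cheaper cipher and compression explicitly off. These are standard OpenSSH options, though which ciphers are available depends on the OpenSSH build:
Code:
# same rsync, lighter cipher, no ssh compression - compare the throughput
rsync -av --progress -e "ssh -c arcfour -o Compression=no" \
    /home/alisdt alisdt@xxx.xxx.xxx.xxx:/home/alisdt/laptop_backup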
View 3 Replies
Apr 12, 2011
I have a tiny shell script to rsync files between two servers and remove the source files.
This script works fine, when it has been initiated manually or even when the rsync command is executed on the command line.
But the same script doesn't work when I try to automate it through crontab.
I am using the 'abc' user to execute this rsync instead of root, as root logins are restricted on all of our servers.
As I mentioned earlier, manual execution works like a charm!
When rsync.sh is initiated through crontab, it runs the first command (chown abc.abc ...) perfectly, without any issues, but the second line is not executed at all, and there is no log entry I can find at /mnt/xyz/folder/rsync.log.
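A common cause here is cron's minimal environment: PATH usually contains only /usr/bin:/bin, and there is no ssh agent. A hedged debugging sketch; the schedule and log path are placeholders:
Code:
# use absolute paths inside the script (e.g. /usr/bin/rsync instead of rsync)
# and capture cron's own view of stdout/stderr to see why the second line dies
*/30 * * * * /bin/bash /mnt/xyz/folder/rsync.sh >> /mnt/xyz/folder/cron-debug.log 2>&1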
View 6 Replies
Sep 18, 2009
I just tried to sync files from one server to another. After the sync, I found the files were bigger than the originals.
I looked around the web and found someone mentioning the rsync daemon. Do I have to run the daemon on one server before I run rsync?
The command I used is: rsync --partial --progress -r source destination
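For what it's worth, no daemon is needed for the usual case: rsync over ssh (or locally) starts rsync on the far end for you; a daemon is only involved for rsync:// URLs or host::module paths. Hedged examples of the two modes, with placeholder names:
Code:
# over ssh - no daemon required on either side
rsync -av --partial --progress source/ user@host:/destination/
# via a daemon - only if the server exports an rsync "module"
rsync -av rsync://host/module/ /destination/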
View 1 Replies
Jul 21, 2010
I use rsync to copy the files and dirs under the /var/www/html/mydir directory, but these two files (/dir4/1.html and /dir4/2.html) can't be rsync'ed to the destination machine.
My rsync configuration file is below...
View 2 Replies
Dec 8, 2010
I'm using Ubuntu 10.04 LTS server and PostgreSQL 8.4. I have a .sh script that is run by cron every other hour; that works fine. The script includes an rsync command that copies a PostgreSQL dump .tar file to a remote archive location via ssh. That part fails when run by cron, I think because it is (quietly) asking for the remote user's password and not getting it. I set up the usual public/private ssh key arrangement. The script succeeds when run manually as the same user the cron job uses, and does not ask for the password. I am able to ssh to the remote server from the source server (using the same username) without getting a password prompt, in both directions, so why doesn't rsync work? I even put a .pgpass file in the root of that user's home directory with that user's password, and the user/password are identical on both servers.
I think the problem is that rsync is not able to use the ssh key correctly. I tried adding this to my script, but it didn't help:
Code:
Here is the rsync command embedded in the .sh script:
Code:
Here is the cron entry:
Code:
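The elided commands aside, the usual culprit is that cron's environment differs from an interactive shell (different HOME, no SSH_AUTH_SOCK), so ssh doesn't find the key. A hedged sketch that pins the key explicitly; the key path, user, and host are placeholders:
Code:
# point rsync's ssh at the exact private key instead of relying on defaults
rsync -av -e "ssh -i /home/backupuser/.ssh/id_rsa" \
    /var/backups/pgdump.tar archiveuser@remotehost:/archive/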
View 6 Replies
Jun 24, 2010
I found a strange problem while using rsync to back up my files. I use this script:
#!/bin/bash
SOURCEDIR="/"
TARGETDIR="root@DLINK-13F017:/mnt/HD_a2/administrator/RAIDBackup/"
[code]....
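The body of the script is elided above, so the following is purely a hypothetical sketch of what such a backup line often looks like; with SOURCEDIR set to "/", the pseudo-filesystems would normally need excluding:
Code:
# illustrative only - the original script's options are not shown
rsync -av --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/mnt \
    "$SOURCEDIR" "$TARGETDIR"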
View 4 Replies
Sep 22, 2010
I'm trying to rsync a folder (and all subfolders) down to a local directory; upon completion I'd like the remote folder to be deleted.
What I've come up with is
Code:
rsync -rvtW --remove-sent-files -e ssh user@example.com:/remote/folder /local/folder
What this is doing is simply wiping the remote files within the folders (not the folders themselves) and not actually syncing anything down to my local folder (as in, no files at all appear in /local/folder).
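Two hedged things to check: --remove-sent-files only deletes files that actually transferred, and in newer rsyncs the option is spelled --remove-source-files; also, a trailing slash on the source controls whether the folder itself or just its contents are copied. A sketch under those assumptions:
Code:
# pull the contents of the remote folder, then delete each file from the
# remote side only once it has transferred successfully
rsync -rvtW --remove-source-files -e ssh user@example.com:/remote/folder/ /local/folder/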
View 1 Replies
Feb 17, 2010
I know it is possible to do... but I am not sure how to go about the whole thing. Here's the scenario. I run a lab. Lots of PCs. As time goes by, the older ones dont have the memory or disk space to run more modern apps. But I want to put them to use...
What I am trying to do, and have started, is the following: 1. Install Linux on a bunch of them and make a share on each. I've already installed FreeNAS on four machines (let's call them ClientA, ClientB, ClientC, and ClientD) and made all of the available disk space on each into a share.
2. Install Linux on a fifth machine (call this Machine1), and on this machine combine, over the network, all the shares from ClientA, ClientB, ClientC, and ClientD into one large "virtual" directory on Machine1. I know this is do-able, but what I hope to have is the total disk space from all the machines in step 1 combined for the purposes of saving files. I'm not sure which file system to use. For example, if the other four machines have 2GB of space each, I want to be able to save a 7GB file.
3. And then allow sharing of this one large directory using Samba.
4. Then allow lab users (not on any of the above-mentioned machines) to access the Samba-enabled large shared directory on Machine1 to read and write files. The users will have no idea that the files are not on Machine1, or that they may be segmented in some way, nor should they care.
I understand the risks (if any one machine of ClientA, ClientB, ClientC, and ClientD goes down, lose probably everything). I am considering throwing mirroring into the mix (mirror Machine1's large directory), but that can wait.
So in the above scenario, what file system can I use on Machine1 to combine all the shares from ClientA, ClientB, ClientC, and ClientD to make one large "virtual" directory?
I've looked at UnionFS, but from my understanding, while it combines directories, the maximum file size is the size of the largest share. Is this true?
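That understanding is essentially right for union-style filesystems: each file lives wholly on one underlying branch, so pooling four 2GB shares never makes room for a single 7GB file - that needs striping across machines (e.g. a distributed filesystem such as GlusterFS). As a hedged sketch of the simpler pooled-directory setup, assuming the four shares are already mounted locally and the FUSE-based mhddfs tool is available:
Code:
# pool four mounted shares into one big virtual directory for Samba to export;
# each individual file still lands entirely on one of the four branches
mhddfs /mnt/clientA,/mnt/clientB,/mnt/clientC,/mnt/clientD /srv/pool -o allow_other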
View 3 Replies
Jan 14, 2011
This command would copy the files to the local directory (note the terminating \; that find's -exec requires):
find /mnt/nas -type f -ctime 1 -iname '*.avi' -exec rsync -av {} /mnt/Mythbuntu \;
View 1 Replies
May 6, 2010
Every once in a while on a computer I'm ssh'd into, I will accidentally type "cat largefile.txt" and my screen will start rushing with text for the next 10 minutes. I'm always working in a screen session, so my current solution is to log out and then log back in; since the output can go 100x faster while I'm logged out, it finishes in the short time it takes me to type my password again. Is there a better way, either using the fact that I'm in a screen session, or within ssh itself? What doesn't work: detaching from the screen session (it doesn't respond until the file is done outputting); switching to a different window in the screen session (also doesn't respond); typing Ctrl+C to kill the cat command (also no response, probably because the command has already finished and the buffers just have to catch up).
View 3 Replies
Jul 14, 2010
I have two Linux boxes that I would like to keep in sync. I see that the -avz options sync the remote with the local, but new local files are not pushed.
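rsync is inherently one-way per run, so a hedged two-way sketch is simply two runs, one in each direction; -u (--update) skips files that are newer on the receiving side, so each pass only brings in newer material. True bidirectional sync with deletion handling is usually left to a tool like unison. Paths here are placeholders:
Code:
# pull anything newer from the remote, then push anything newer from here
rsync -avzu user@remote:/data/ /data/
rsync -avzu /data/ user@remote:/data/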
View 2 Replies
May 24, 2011
I need to be able to use an rsync command in a script that will be run by cron, and it needs to be able to pass a password to rsync so that the remote server it connects to will authenticate.
I cannot set up ssh keys between the two servers, it's not an option. I cannot use any other language other than bash, it's my only option. I know this is highly insecure, I have no other option.
So far I have this:
rsync --rsh="/usr/bin/ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -o PreferredAuthentications=password" -raxv /source/dir/* user@remotehost:/target/directory/.
This allows the script to ignore host verification and go directly to the password prompt. I need the script to fill in this prompt with the password that is stored in a variable.
I tried using expect, but I honestly don't know the syntax; it just keeps failing. A lot of the expect examples I'm finding online start off with a "spawn", which I don't have installed, and I'm not sure if I have the ability to install it yet.
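For what it's worth, spawn is a built-in command of expect itself, not a separate program, so installing the expect package provides it. If installing anything is an option, sshpass keeps the whole thing in plain bash; a hedged sketch with a hypothetical password variable:
Code:
# sshpass feeds the stored password to ssh's prompt non-interactively
RSYNC_PW='secret'   # hypothetical; in practice read this from a protected file
sshpass -p "$RSYNC_PW" rsync --rsh="/usr/bin/ssh -o StrictHostKeyChecking=no \
    -o UserKnownHostsFile=/dev/null -o PreferredAuthentications=password" \
    -raxv /source/dir/* user@remotehost:/target/directory/.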
View 4 Replies
Jan 29, 2011
I would like to find and back up all *.mp4 files from /Pictures and its sub-directories and move them to a single directory on a remote host. I can find and move the files, but I don't want the directory structure - just the files, placed together in the remote directory.
To find my files I use
rsync -r -a -v -e "ssh -l user" --delete --include '*/' --include '*.mp4' --exclude '*' /home/drew/Pictures/ remoteserver:/Users/drew/mp4
but this recreates all the subdirectories.
I also tried
find ~/Pictures -name "*.mp4" -exec rsync -r -a -v -e "ssh -l user" --delete {} remote:/Users/drew/mp4 \;
This works, but takes forever.
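It takes forever because -exec starts a new rsync (and a new ssh connection) for every file. A hedged alternative collects all the matches into a single rsync call: when rsync is given a plain list of source files, each lands in the destination under its basename, which also flattens the structure. Assumes bash 4.4+ for mapfile -d:
Code:
# one ssh connection, one rsync, no subdirectories recreated on the remote
mapfile -d '' files < <(find /home/drew/Pictures -name '*.mp4' -print0)
rsync -av -e "ssh -l user" "${files[@]}" remoteserver:/Users/drew/mp4/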
View 3 Replies
Mar 13, 2010
I have a huge file of 450GB. Its format is as below:
x1 50020 A 1
x1 50021 B 8
x1 50022 C 9
[code]....
Now I want to extract a subset from this file: the rows where column 1 is x10 and column 2 is from 600000 to 30000000. I wrote the following Perl script, but it doesn't work:
#!/usr/bin/perl
$file1 = $ARGV[0]; # Input file
$file2 = $ARGV[1]; # Output file
[code]...
I guess the input file and output file are both so big that my script can't handle them.
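Size alone shouldn't break a line-by-line filter, so the elided Perl may be slurping the file into memory. As a hedged alternative, awk streams the file one line at a time regardless of its total size; the column test follows the description above, and the filenames are placeholders:
Code:
# print rows where field 1 is "x10" and field 2 lies in [600000, 30000000]
awk '$1 == "x10" && $2 >= 600000 && $2 <= 30000000' hugefile > subset.txt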
View 11 Replies
May 24, 2010
I'm trying to setup rsync to backup a remote directory to my local drive.
I cd to the directory that I want to pull the files to, then I enter:
rsync -vrtW account@remote.com:~/public_html
I enter the password and it starts running. I get all the files listed, but none of them actually transfer. What am I missing?
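Probably the destination: when rsync is given a remote source and no destination argument, it just lists the remote files rather than copying them. A hedged fix, assuming the current directory is the intended target:
Code:
# the trailing "." tells rsync where to put the files; without it, rsync
# only lists ~/public_html instead of pulling it down
rsync -vrtW account@remote.com:~/public_html .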
View 1 Replies
Jun 21, 2011
I am trying to sync file-server data to a backup server machine with the command: rsync -avu path/of/data ipaddress-of-backup-server:/path/where/to/save. After running it, it asks for the root password, and done manually it is successful, but I want to make it automatic. For that I also tried a cron job and generated an authentication key, but I have not succeeded in logging in automatically. Does anybody know how to authenticate root to log in for storing data on the backup server?
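A hedged sketch of the usual key setup; once the key is installed (and assuming the backup server permits root ssh logins), neither the manual run nor the cron job should prompt. Hostnames and paths are placeholders:
Code:
# one-time setup, run as the user the cron job runs as
ssh-keygen -t rsa                # accept the defaults, leave the passphrase empty
ssh-copy-id root@backup-server   # installs the public key on the backup server
# afterwards this runs without a password prompt, e.g. from cron
rsync -avu path/of/data root@backup-server:/path/where/to/save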
View 14 Replies
Jan 21, 2011
I'm new to setting up Linux Servers. I've setup a Ubuntu 10.10 Server along with CUPS and I'm using Webmin to talk to the server. I have a HP PSC 1315 Multifunction printer connected via usb to the server. Using the CUPS web interface I am able to get the server to detect the connected printer and it identified the HP PSC 1310 Series drivers.
When I print a test page from the server's screen, the print job goes through OK and its size is about 5k.
I then set up a Samba share to let my Windows 7 machine use the printer. Windows 7 is able to pick up the shared printer correctly, and I used the default HP 1310 Series drivers. When I sent a test page to the printer, that single page ended up being 3887KB, and a single-page Word document I printed ended up being over 7MB.
View 4 Replies
Nov 28, 2010
I'm trying to design an inexpensive large-scale DNS server but can't find any metrics or methods on which to base scalability. Can anyone offer information on building a stable, dedicated DNS server that might be able to scale well?
View 8 Replies
Jun 9, 2011
I'm trying to rsync files and directories from a RedHat Linux host (v4.5 & 4.7) to a Windows Server 2003 R2 Standard Edition machine with Cygwin running, executing the rsync command from the Cygwin shell. The transfer involves rsync'ing approximately 1 TB of data from the Linux server to the Windows server, and after about 280+GB of data the transfer just dies.
There seems to be no particular file or directory that the transfer stops at. I'm able to rsync GBs of data from other Linux hosts to this Cygwin server with no problem; files and directories rsync fine. The network infrastructure is essentially the same regardless of the server being rsync'ed, in that it is GB Ethernet running through Cisco GB switches, and there appear to be no glitches or hiccups across the network path.
I've asked the folks at rsync.samba.org if they know of any problems or issues. Their response has been neutral: if the version of rsync that Cygwin has ported is within standards, there is no rsync reason this should happen. I've asked the Cygwin support site if they know of any issues and they have yet to reply. So my question is whether the version of rsync ported to Cygwin is standard, and if so, whether there is any reason Cygwin and rsync keep failing like this.
I've asked the local rsync-on-Linux gurus and they can't see any reason this should fail from a Linux perspective. Apparently I am our company's Cygwin knowledge base by default.
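As a hedged workaround while the root cause is unknown, rsync's resumability means a retry loop loses little when a transfer dies part-way. The flags are standard rsync options; hosts and paths are placeholders:
Code:
# keep partially-transferred files and retry until a pass completes cleanly
until rsync -av --partial --timeout=300 user@linuxhost:/data/ /cygdrive/d/data/
do
    echo "rsync died, retrying in 60s" >&2
    sleep 60
done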
View 3 Replies
Jun 14, 2011
I want to run rsync on Server A to copy all files from Server B that are newer than 7 days (find . -mtime -7). I don't want to delete the files on Server B.
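rsync itself has no age filter, so a hedged sketch is to let find build the list on Server B and feed it to rsync via --files-from (the paths in the list are relative to the source directory). Hostnames and directories are placeholders:
Code:
# run on Server A: list the last-7-days files on B, then pull exactly those;
# nothing on Server B is deleted
ssh serverB 'cd /data && find . -type f -mtime -7' > /tmp/recent.list
rsync -av --files-from=/tmp/recent.list serverB:/data/ /backup/data/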
View 2 Replies
Nov 17, 2010
Thought I'd post it here because it's more server related than desktop... I have a script that does:
[Code]....
This is used to sync my local development snapshot with the live web server. There has to be a more compact way of doing this. Can I combine some of the rsyncs? Can I make rsync set or keep the user and group affiliations? Can I exclude .* yet include .htaccess?
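On the last two points, some hedged answers: -o and -g (both implied by -a, and effective when the receiving side runs as root) preserve owner and group, and include/exclude rules are applied in order with the first match winning, so naming .htaccess before excluding the other dotfiles does what's wanted. A sketch with placeholder paths:
Code:
# one rsync: keep owner/group, copy .htaccess, skip every other dotfile
rsync -av --include='.htaccess' --exclude='.*' \
    /local/dev/site/ deploy@liveserver:/var/www/site/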
View 6 Replies
Jan 7, 2011
When I run rsync --recursive --times --perms --links --delete --exclude-from='Documents/exclude.txt' ./ /media/myusb/
where Documents/exclude.txt is
- /Downloads/
- /Desktop/books/
the files in those directories are still copied onto my USB.
And...
I used fetchmail to download all my gmail emails. When I run rsync -ar --exclude-from='/home/xtheunknown0/Documents/exclude.txt' ./ /media/myusb/ I get the first image at url.
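A hedged debugging note: with a leading "/", the patterns in the exclude file are anchored to the top of the transfer (the source directory "./"), not the filesystem root, so they only match if rsync is launched from the directory that directly contains Downloads and Desktop. --dry-run makes it cheap to test rule changes:
Code:
# show what would transfer without copying anything; adjust the rules until
# the unwanted directories stop appearing in the listing
rsync -a --dry-run -v --exclude-from="$HOME/Documents/exclude.txt" ./ /media/myusb/ | head -n 40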
View 9 Replies
Jun 10, 2011
Has anyone had any experience using SUA (Services for UNIX Applications) rsync to "pull" files down to a Win2k3 R2 server from a Linux rsync host? I was trying to use Cygwin rsync until I found out from the Cygwin folks that their port of rsync is "flakey" and would fail intermittently for no apparent reason; Cygwin suggested I use SUA or SFU for rsync services.
I've looked for, and am looking for, any experience using SUA rsync to copy files down from a Linux rsync host to a Windows host via rsync on the Windows host. If you have done this successfully, do you have any pointers or caveats you can share on how you got it working? What I am basically looking to do is copy files and subdirectories from a Linux host using rsync to a static location on a Windows server on a scheduled basis, so that I can back up the Windows server to tape using Symantec's Backup Exec application.
I'm doing it this way to avoid deploying the Remote Agents for either Linux or Windows on the target hosts. As an alternative, I've seen reference to a product called DeltaCopy that uses a native Windows rsync port with the native Linux port of rsync to do what I need. I realize this is not a strictly Linux question but more of a hybrid, as I'm moving data to and from Windows and Linux hosts, so if this is too Windows-y a question, please say so and I'll withdraw it.
View 2 Replies
Feb 6, 2011
Every time I attempt to transfer a large file (4 GB) via any protocol, my server restarts. On the rare occasions that it doesn't restart, it spits out a few error messages saying "local_softirq_pending 08" and then promptly freezes. Small files transfer fine.
Relevant information:
Ubuntu server 10.10
Four hard drives in RAID 5 configuration
CPU/HD temperatures are within normal range
View 7 Replies
Mar 8, 2010
I have been an RPM-based distribution guy for a long time (Red Hat, CentOS, SUSE). We have a large shared and dedicated web environment that is starting to require more and more Linux. I am in a position to switch gears and move to Ubuntu if it makes sense. Things that are important to me are:
1. ease of deployment (both servers and websites themselves)
2. patch management
3. documentation
View 2 Replies
Feb 18, 2011
Notice how big and wide-spaced the fonts on the Clementine playlist are, and how they look on the appmenu (where my mouse pointer is). This is not because Clementine is Qt4; I've got the same problem with Chrome, Opera, etc. I had been messing with System Settings (the KDE settings tool) the day before the fonts became that wide-spaced, trying to make my KDE apps look more native on GNOME, but I hadn't touched the font settings there.
View 9 Replies