What this is doing is simply wiping the remote files inside the folders (not the folders themselves) and not actually syncing anything down to my local folder (as in, no files at all end up in /local/folder).
    rsync: link_stat "/av" failed: No such file or directory (2)
    skipping directory home
    rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1060) [sender=3.0.7]
I need a program to run on Ubuntu that can easily be set up to do a series of different cloning operations at specific times between the USB drives on a single Ubuntu PC, depending on the day of the week. So on Monday folder B is forced to match folder A, on Tuesday C is forced to equal D, and on Sunday a whole bunch of these clonings happen. This must all run unattended (at 2am) and be robust, with no "what do you want to do next" messages and no giving up entirely if there is a problem with one file. Though I do need a log of success or failure. Windows programs that do this sort of thing are FolderClone and GoodSync. I looked at Unison and rsync and one or two others, but none appeared to be set up to do what I need, or they were excessively complex/general. I don't need something that can sync two copies, or work over the internet...
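For what it's worth, cron plus rsync can express exactly this kind of day-of-week schedule. A minimal sketch, assuming the drives are mounted at the paths shown (all paths, log locations, and the Sunday script are hypothetical):

    # /etc/cron.d/usb-clone -- day-of-week cloning at 2am (0=Sun, 1=Mon, ...)
    # rsync -a --delete forces DEST to exactly match SRC; it keeps going past
    # per-file errors (reporting them at exit), and --log-file records the run.
    0 2 * * 1 root rsync -a --delete --log-file=/var/log/clone-mon.log /mnt/A/ /mnt/B/
    0 2 * * 2 root rsync -a --delete --log-file=/var/log/clone-tue.log /mnt/D/ /mnt/C/
    0 2 * * 0 root /usr/local/bin/sunday-clones.sh >> /var/log/clone-sun.log 2>&1

The trailing slashes matter: they make rsync compare directory contents rather than nesting one folder inside the other.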
Long story short, I have a failing RAID 3 array which is showing corrupted file blocks (and the RAID controller card is periodically failing to initialize). [URL].. I thought I had robust backups, but as it turns out my backup volumes seem to have been misplaced (don't ask), so I have no viable backups. I'm trying to back up as much as I can before the inevitable, impending catastrophic failure.
I must be doing something wrong, because running rsync on my FreeBSD/FreeNAS server (syncing to a local USB drive) is really slow. Below you will see an example: a 500MB file took almost 10 minutes to sync to a local USB drive! I can FTP this file in a few minutes over my LAN.
(Just ran a test: it took 20 seconds to FTP this file across my gigabit LAN, where rsync took 10 minutes to perform a local copy.)
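To narrow down whether rsync or the USB write path is the bottleneck, one simple check is to time a plain copy of the same file against rsync (filenames hypothetical):

    # If cp is just as slow, the USB mount, not rsync, is the limiting factor
    time cp bigfile.bin /mnt/usb/test-cp.bin
    time rsync --progress --stats bigfile.bin /mnt/usb/test-rsync.bin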
At home I do not have an internet connection, but at work I do. At home I installed Fedora 14 for my 6-year-old daughter, and she uses it to play games like SuperTux and OpenArena. Now I want to install openSUSE for her and test it. For Fedora, I downloaded all the packages with rsync at work, moved them home on a USB flash drive, made a local repo at home, and installed all the packages I needed. I want to do the same for openSUSE. As we all know, the DVD does not have all the packages I need, so I have to download all the packages and make a local repo at home. Can I do this for openSUSE or not? I want to download all the packages openSUSE needs via rsync and make a local repo. How can I do this for openSUSE?
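If the chosen mirror offers rsync access (many openSUSE mirrors do; the host, module path, and version below are placeholders, not verified), a sketch of the workflow might look like:

    # At work: mirror the repository tree onto the USB drive
    rsync -av --delete rsync://mirror.example.org/opensuse/distribution/11.4/repo/oss/ /media/usbdisk/opensuse-oss/

    # At home: copy the tree off the USB drive, then register it as a local repo
    zypper addrepo dir:///home/user/opensuse-oss local-oss
    zypper refresh

openSUSE repository trees already carry their repodata/ metadata, so mirroring the tree as-is should be enough for zypper to use it as a directory repo.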
I have a big ISO image which is currently being downloaded by a torrent client with space reservation turned on: that means the file size is not changing, while some chunks in it (4 MiB each) are constantly changing because of the download.
At 90% download I do the initial rsync to save time later:
    $ rsync -Ph DVD.iso /media/another-hdd/
    sending incremental file list
    DVD.iso
[Code]....
Then, when the file's fully downloaded, I rsync again:
    total size is 2.60G  speedup is 1.00
Speedup=1 says delta-transfer was not used, although 90% of the file has not changed; the target dir is on another FS, and copying takes several minutes. Why doesn't it try to speed up the transfer?! How can I force rsync to use delta-transfer?
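One relevant detail from the rsync man page: --whole-file is the default when both source and destination are local paths, which disables the delta algorithm entirely. It can be switched back on explicitly; a sketch using the paths from the post:

    # --no-whole-file re-enables delta-transfer for a local copy;
    # --inplace updates the existing target file instead of rebuilding a temp copy
    rsync -Ph --no-whole-file --inplace DVD.iso /media/another-hdd/

Note that even with delta-transfer both files are still fully read to compute checksums, so the several minutes may only shrink, not vanish.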
When I installed 13.37, I created a local copy of the entire stable tree (source/ and all the rest) just to have all that stuff around to browse offline.
Now, to instruct myself, I'm trying to use rsync to keep this stuff up to date. But I seem either to have misread the rsync man page or ... well, I don't know. I am issuing the following command and getting the results seen below:
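For comparison, a typical tree-mirroring invocation looks something like this (the mirror host is a placeholder, and module paths differ between mirrors):

    # -a preserves the tree, --delete drops files removed upstream; the trailing
    # slashes make the two directory contents line up exactly
    rsync -av --delete rsync://mirror.example.net/slackware/slackware-13.37/ /home/user/slackware-13.37/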
I have Cygwin on Windows XP running rsync to a remote Ubuntu server over SSH using ADSL. My data set is about 20GB! But Cygwin will back up incrementally, so after the first backup the process should be relatively quick. With ADSL the first backup will take too long, so I was thinking about doing it by copying the files to an external hard drive, then attaching that drive to my remote server and copying the files over there. The idea is that rsync will pick up the files as if it had created them in the first instance, and the incremental backups will then pick up from there.
Does anyone have any experience with this and/or can provide any advice? The external HD is FAT-32, which is okay with Windows and should be okay with Ubuntu? From XP, right-click copy and then paste keeps the file dates intact on the external HD - is this enough to get rsync going incrementally?
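One wrinkle: FAT-32 stores modification times with 2-second granularity, so timestamps that pass through the external drive can end up a second or two off, which would make rsync re-send everything. rsync has --modify-window for exactly this case; a sketch (hostnames and paths hypothetical):

    # Treat mtimes within 2 seconds of each other as equal, so files seeded
    # via the FAT-32 drive are not re-transferred over ADSL
    rsync -av --modify-window=2 /cygdrive/d/data/ user@remote.example.com:/backup/data/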
I need to be able to use an rsync command in a script that will be run by cron, and it needs to be able to pass a password to rsync so that the remote server it's connecting to will authenticate.
I cannot set up ssh keys between the two servers; it's not an option. I cannot use any language other than bash; it's my only option. I know this is highly insecure; I have no other option.
So far I have this:

    rsync --rsh="/usr/bin/ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -o PreferredAuthentications=password" -raxv /source/dir/* user@remotehost:/target/directory/.
This allows the script to ignore host verification and goes directly to the password prompt. I need the script to fill in this password prompt with the password that is stored in a variable.
I tried using expect, but I honestly don't know the syntax; it just keeps failing. A lot of the examples I'm finding online for expect start off with a "spawn", which I don't have installed, and I'm not sure if I have the ability to install it yet.
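(For what it's worth, spawn is a command inside expect scripts rather than a separate program.) If installing one small helper is possible, sshpass is the usual answer here; a sketch built on the command above, with the password taken from a variable as described:

    #!/bin/bash
    # PASSWORD would be set earlier in the script (placeholder value shown);
    # sshpass supplies it to ssh's password prompt non-interactively
    PASSWORD='secret'
    sshpass -p "$PASSWORD" rsync --rsh="/usr/bin/ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -o PreferredAuthentications=password" -raxv /source/dir/* user@remotehost:/target/directory/.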
I would like to find and back up all *.mp4 files from /Pictures and its sub-directories and move them to a single directory on a remote host. I can find and move the files, but I don't want the directory structure - just the files, placed flat in the remote directory.
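One way to flatten the tree is to let find hand each file to rsync individually, so only the basename survives on the other end (host and destination directory are placeholders):

    # Each matched file is transferred on its own, so the /Pictures layout is
    # not recreated remotely; --remove-source-files deletes each file once sent
    find /Pictures -type f -name '*.mp4' -exec rsync -av --remove-source-files {} user@remotehost:/backup/mp4/ \;

This opens one connection per file, which is fine for dozens of files but slow for thousands.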
I made a script to back up files from each host, using a common password, on the local network. The script uses sshpass and rsync with this
syntax:

    rsync --rsh="sshpass -p password ssh -l root" host:path destinationpath

Everything was okay under 9.10 until I migrated to Ubuntu 11.04; now it always gives an error:

    rsync error: received SIGINT, SIGTERM, or SIGHUP (code 20) at rsync.c(541) [Receiver=3.0.7]
I am using bash version GNU bash, version 4.2.8(1)-release (i686-pc-linux-gnu) and the 2.6.38-8-generic kernel.
Running Ubuntu 9.10. In the Remote Desktop config dialog I get: "Your desktop is only reachable over the local network. Others can access your computer using the address 127.0.0.1 or tabatha.local." I understand this to mean only the loopback IP address is available. All my other machines show their true local IP address (e.g., 192.168.1.104) in this dialog. Thus I cannot log on to this desktop from other machines.
When I try to do a remote logon from another Ubuntu 9.10 box (or from an XP box using a VNC viewer), I get: "Connection to 192.168.1.102 has been closed." What steps are needed to make this machine show its actual IP address? All file sharing between the various machines is working properly, and all Windows shares back and forth between XP and 'nix, and among the various XP and Linux boxes, are available as designed.
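One thing that may be worth checking (an assumption, not a confirmed diagnosis): Ubuntu's installer typically maps the hostname to 127.0.1.1 in /etc/hosts, and a service that resolves the hostname through that entry will advertise a loopback-style address. The file usually looks like:

    # /etc/hosts -- 'tabatha' matches the name shown in the dialog
    127.0.0.1   localhost
    127.0.1.1   tabatha.local tabatha

Comparing this file against one of the machines that reports its real 192.168.1.x address could show whether the entries differ.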
I need to SSH into a Slackware 12 box to provide remote support. I got this far, but it doesn't provide for real two-way communication.
    while : ; do read -p "Enter text to Local: " TXT ; DISPLAY=:0 kdialog --inputbox "$TXT" ; done
So this loops and all, but it doesn't keep a history, and I have to wait for a return from the local operator. If the operator has changed focus, I can be waiting all day for a response, and I would have to start another session to post a second comment.
What is nice is that it's small and I can create the .sh when I remote in.
-----Update since I started
I now have two scripts to take over from the first one, but I have to have 4 SSH sessions running to get this to work.
1st SSH: move (archive) the old chat.txt and create a new one; it also fires off a .sh that opens a console tailing chat.txt, so the operator can see the chat history.
2nd SSH: fire off a .sh that loops a local kdialog input box appending to chat.txt.
3rd SSH: tail -f the chat.txt file on the remote end, so I can see the chat history.
4th SSH: loop a read -p over the SSH session, so I can append to chat.txt.
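A sketch of collapsing the support side into one script over a single SSH session (file location and labels are assumptions, not the original scripts):

    #!/bin/bash
    # chat.sh -- append to the shared chat file while showing incoming lines
    CHAT=/tmp/chat.txt
    touch "$CHAT"
    tail -f "$CHAT" &            # running history in the background
    TAILPID=$!
    trap 'kill $TAILPID' EXIT    # stop the tail when the loop exits
    while read -r -p "> " TXT; do
        echo "support: $TXT" >> "$CHAT"
    done

The local operator's side could run the same loop with the kdialog --inputbox call feeding the echo instead of read.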
..and the "listeners" with: load-module module-rtp-recv
Then, playing on the sender, and using PulseAudio Volume Control's Playback tab to set "Null Output", my listeners all start working as expected. The outstanding problem is that the sender is silent - nothing from its speakers. Perhaps not surprising after the "Null" setting above.
Is it possible to stream like this and also listen on the sender at the same time?
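One approach that may work, assuming the RTP sender is reading from a null sink's monitor (the usual module-rtp-send arrangement): loop that monitor back into the sender's real output with module-loopback, so the audio reaches the network and the local speakers at once. The source and sink names below are placeholders; pactl list short sinks/sources shows the real ones:

    # On the sender: route the null sink's monitor to the local hardware sink too
    pactl load-module module-loopback source=null.monitor sink=alsa_output.local_hw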
I have Ubuntu 10.04 (fresh install) on my work computer/server, and I'd like access to this PC from my home. But Remote Desktop only says I can do local.
I've looked around and opened up port 5900 in my router, set this machine's IP to static, and forwarded the port to this computer.
I've read that unchecking "configure network automatically to accept connections" helps, as it apparently causes an issue with the ports. Still nothing.
Here is a screen cap of my settings
How can I allow access from my Windows-based PC to this computer over the internet?
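A commonly suggested alternative to exposing VNC directly (a sketch, assuming an SSH server runs on the Ubuntu box and port 22 is forwarded instead of 5900): tunnel VNC through SSH and point the viewer at the local end of the tunnel:

    # On the Windows PC (OpenSSH syntax; PuTTY has an equivalent tunnels page):
    # forward local port 5900 to the Ubuntu machine's VNC port over SSH
    ssh -L 5900:localhost:5900 user@your-home-public-ip
    # then connect the VNC viewer to localhost:5900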
I'm looking to convert my HTPC into a remote Android dev server for my girlfriend and myself, but I want to make sure it is possible to do what I have in mind. Is it possible to map local devices to the VNC server (such as an Android phone), so that we can work on development over VNC with phones attached to our client computers? I know it's a trivial matter to map local drives over VNC, but what about non-HD devices - can I still maintain full functionality as if the device itself were plugged directly into the server? I'll be installing Ubuntu again this weekend; I never got around to it after my last HD failure.
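VNC itself doesn't forward USB devices, but Linux's usbip tools can export a USB device over the network so the server sees it as locally attached. Whether adb debugging behaves well over this is untested here; the sketch below only shows the general mechanism (bus IDs and hostname are placeholders):

    # On the client machine with the phone plugged in:
    usbipd -D                   # start the USB/IP daemon
    usbip bind -b 1-1           # export the phone's bus ID (see `usbip list -l`)

    # On the dev server:
    usbip attach -r client-host -b 1-1   # attach the remote device as if local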
I'm running some commands on a remote server (using ssh to log on to the remote server; I did an ssh key swap). How do I redirect the output of a command back to the local server? The person who helps me out is my HERO - I'm really stuck on this, and it would bring me a lot further if I could get this to work.
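Two common patterns, depending on which end drives the command (hostnames and paths are placeholders):

    # Run the command from the local machine; ssh delivers stdout back by itself:
    ssh user@remotehost 'some_command' > /local/path/output.txt

    # Or, from a session already on the remote machine, push the output back,
    # which works here since the key swap allows ssh in the other direction too:
    some_command | ssh user@local-server 'cat > /local/path/output.txt'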
As many developers probably do, I have a Windows-based machine on which I run XAMPP locally to test my code, and a Linux machine with Fedora as my remote server. As I sometimes use .htaccess as a way to authenticate some parts of the website, I end up having two .htaccess files: one with the local path (something like D:\My_Webs) and one with my remote path (something like /var/www/html/) to the password file. I have searched high and low, but I cannot seem to find any trick so that I only have to maintain one version of the .htaccess file which works on both Linux and Windows machines.
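One possibility, assuming both machines run Apache 2.4 with its Define/variable expansion (a sketch, not a tested XAMPP configuration): set a per-machine variable in each server's main config and let the single shared .htaccess reference it:

    # In each machine's httpd.conf (different value per machine;
    # Apache on Windows accepts forward slashes):
    Define AUTH_DIR /var/www/html
    # Define AUTH_DIR D:/My_Webs

    # In the one shared .htaccess:
    AuthType Basic
    AuthName "Restricted"
    AuthUserFile ${AUTH_DIR}/.htpasswd
    Require valid-user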
I have a Eucalyptus instance (VM) running an older version of CentOS (5.3?). As a VM it has no graphical display. I'd like to run a graphical app there so that it displays back on my local machine.
I used xhost locally, and it shows the remote IP (the Eucalyptus instance) as enabled. On the remote side (the Eucalyptus instance) I set DISPLAY with:
    export DISPLAY=xx.xx.xx.xx:0.0
where xx.xx.xx.xx is my local IP address. Oh, I did install X in the VM instance (yum groupinstall "X Window System"). X is installed but not running there. (Does the point of origin of the X app need to have X running as well? And what does that mean in a VM, which is a non-graphical environment?)
Anyway, I try to run (from remote to local) xclock and get the typical
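On the side question: the X server only has to run where the window is displayed (the local machine); the VM just needs the client libraries, so a headless VM is fine. SSH X forwarding also sidesteps the xhost/DISPLAY setup entirely; a sketch (user and hostname are placeholders):

    # From the local machine, which runs the X server:
    ssh -X user@eucalyptus-instance xclock
    # ssh sets DISPLAY on the remote end and tunnels the X traffic back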
I have an Ubuntu server hosted on Amazon EC2. I need to create an automated backup scheme, so I created another Ubuntu instance on my local network, hosted in a virtual environment. I managed to transfer the necessary files between the two machines on the same network using the rsync command:
How can I do the same thing, but transferring files from my Amazon server to my local server? Is there a way I can achieve this with port forwarding, or by VPN, or anything else? It doesn't have to be rsync; if you know of a better method, kindly let me know.
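Since EC2 instances are reachable over SSH, the same rsync approach should carry over the internet; a sketch assuming the usual EC2 key-pair login (key path, hostname, and directories are placeholders):

    # Pull from the EC2 server down to the local backup machine
    rsync -avz -e "ssh -i /home/user/.ssh/ec2-key.pem" ubuntu@ec2-xx-xx-xx-xx.compute-1.amazonaws.com:/var/www/ /backups/www/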
I have a remote drive mounted on my system (Ubuntu 10.04 x64), and I have the contents of that drive backed up to Dropbox. The problem is, if I unmount the drive, the files disappear from Dropbox. Is there a way to mirror the contents of the network drive to a local folder (preferably such that all changes and file deletions are reflected in the local folder promptly, but unmounting doesn't delete it all)? It looks like rsync would work, but I'm not sure how to make it work.
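A sketch of one guard against the unmount problem (paths are placeholders): run the mirroring rsync from cron, but only when the drive is genuinely mounted, so an empty mountpoint never wipes the Dropbox copy:

    #!/bin/bash
    # mirror.sh -- sync the network drive into Dropbox only while it is mounted
    if mountpoint -q /mnt/netdrive; then
        rsync -av --delete /mnt/netdrive/ "$HOME/Dropbox/netdrive-mirror/"
    fi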