Ubuntu Servers :: Disc Mirroring Between Local And Remote File System
Nov 5, 2010
Is there any software that will do full mirroring between a local and a remote file system? I have a server (9.04) and a laptop (9.10). Each user has a shared directory on the server and on the laptop. Updates to files may be done on either system. I want to keep both copies synchronised. Currently I use a script based on rsync (scheduled by cron) to keep the local and remote copies in sync.
The problem with this approach is that rsync only seems to be able to handle deletion of files if one file system is the master, which is not the case in my set-up. If I move a file to a different directory, rsync will reinstate the old file as well as copying the new one. I was hoping there was some software that could do proper mirroring between the two systems, but so far I cannot find anything.
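For reference, Unison (mentioned further down this page) is one tool built for exactly this two-way case: it propagates deletions and moves in both directions instead of treating one side as the master. A minimal sketch, assuming hypothetical shared-directory paths and that the laptop can reach the server over SSH:
Code:
unison /home/user/shared ssh://server//home/user/shared
Run it interactively first so Unison can build its archives and ask about conflicting edits; afterwards it can be scheduled from cron with -batch, much like the existing rsync script.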
View 3 Replies
Apr 8, 2010
I have two web servers. One is active and one is in reserve. I keep the user data (web pages) in sync by running rsync every 10 minutes or so. This copies any changes from the active machine to the reserve machine. But it's slow, only gets changes every 10 minutes, bogs down the disk, does strange things to files that are changing during the rsync process, etc.
I want something that will automatically copy any changes from the active server to the reserve server as they are made. I.e. I hit 'save' on the active server and it copies the file to the reserve server. Simple!
I've been looking around and I see GFS which is really vastly more complicated than I need. I'm happy with read-only access on the reserve host, so I don't need distributed lock management.
I could theoretically implement this by setting inotify watchers on every file and running an SCP or rsync command when a file gets saved. So, it can't be that hard.
I do not need a true networked file system, as in something I mount over the network. I just want something to keep my disks in sync.
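That inotify idea can be prototyped with the inotify-tools package. A rough sketch, assuming a hypothetical document root of /var/www and a reserve host named "reserve" reachable with SSH keys:
Code:
#!/bin/sh
# Watch the tree and push changes as they happen (requires inotify-tools).
inotifywait -m -r -e close_write,create,delete,move --format '%w%f' /var/www |
while read changed; do
    # A per-event rsync of the whole tree is crude but simple; lsyncd automates this same pattern.
    rsync -az --delete /var/www/ reserve:/var/www/
done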
View 3 Replies
View Related
Apr 6, 2010
Blue sky thinking at the moment: I have a number of file servers, each at different sites. I would like to be able to make these appear as one, so that files on any server can be accessed from any site, and the user doesn't even see there are multiple servers. Obviously, the internet is slow, especially the upload speeds. So when a file is written the write ought to go to the server on the client's LAN - even if it was previously on another of the servers. However, for robustness, some sort of background mirroring is also wanted. If all the servers were left on and connected, they eventually end up all in sync. But this mirroring needs to be mindful of bandwidth usage; if someone writes a big file to their local server, copying that to the other servers can't interfere with normal internet usage. I think UnionFS or similar might be able to handle the unioning side, but not the mirroring stuff.
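On the bandwidth point, whatever ends up doing the background mirroring can usually be throttled. A hedged example with rsync (the host name and the roughly 100 KB/s cap are just placeholders):
Code:
# Push new files to a peer site, capped so the transfer does not saturate the uplink.
rsync -az --bwlimit=100 /srv/files/ othersite:/srv/files/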
View 2 Replies
View Related
Jun 26, 2010
Attempting to create a backup script to copy files from one file system to a remote file system.
When I try this I get:
Quote:
# tar -cf - /mnt/raid_md1 | gzip -c | ssh -i ~/.ssh/key -l user@192.168.1.1 "cat > /mnt/backup/fileserver.md1.tar.gz"
tar: Removing leading `/' from member names
Pseudo-terminal will not be allocated because stdin is not a terminal.
ssh: Could not resolve hostname cat > /mnt/backup/fileserver.md1.tar.gz: Name or service not known
[Code].....
I know that the remote file system dir is RW and the access is working fine. I am stumped...
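The error message gives the clue: with -l user@192.168.1.1, ssh takes the whole string after -l as the login name and then treats the quoted cat command as the hostname, hence "Could not resolve hostname cat > ...". Dropping -l and passing user@host as the destination should fix it; a sketch keeping the same paths:
Code:
tar -cf - /mnt/raid_md1 | gzip -c | ssh -i ~/.ssh/key user@192.168.1.1 "cat > /mnt/backup/fileserver.md1.tar.gz"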
View 3 Replies
View Related
Dec 22, 2010
On Linux I do:
rdesktop remotepc
How do I copy and paste between my local system and the remote system?
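If the rdesktop build has clipboard redirection compiled in, it can be requested explicitly; a hedged example (option name as documented in the rdesktop man page, remotepc as in the question):
Code:
rdesktop -r clipboard:PRIMARYCLIPBOARD remotepc
With that, text copied on either side should be available to paste on the other; older rdesktop releases may not support the option.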
View 1 Replies
View Related
May 5, 2009
The code listed below is an excerpt from a script that I am writing. The goal is to verify that a directory on a remote server is available to the local system. If that is not the case, a log file is written, and all filesystems that were previously unmounted are remounted on the local system.
Code:
# Unmount all NFS mounts prior to the archive process.
umount -a -t nfs
# Mount the remote directory (NFS) prior to running the make_net_recovery script.
# Make sure there is a <remote server> folder located in the /mnt directory. If it is
# not already there, create one.
mount <remote server>:/<local system> /mnt/<remote server>
# Verify the remote directory (NFS) is available. This directory is needed
# as it is the destination for the iso images. If it is not available, stop
# here, and write the results to a log file.
df | grep <remote server> > /dev/null
RC=$?
echo $RC
if [ ${RC} -eq 0 ]
then
echo successful
else
echo not successful >> /tmp/make_net_backup.log && mount -a
exit
fi
Is the syntax shown above correct?
View 1 Replies
View Related
Aug 14, 2010
I am having trouble getting my SSH server to work with a remote machine.
Code:
telnet 192.168.1.102 22
PHP Code:
[Code].....
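If the telnet test does not show an SSH banner, a few standard checks help narrow it down (run on the server unless noted; the username is a placeholder):
Code:
ps aux | grep [s]shd               # is the OpenSSH daemon actually running?
sudo netstat -tlnp | grep :22      # is anything listening on port 22?
ssh -v user@192.168.1.102          # from the client: verbose output shows where the connection stops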
View 9 Replies
View Related
Sep 27, 2010
I'm using Windows XP. I'm connecting to a UNIX box using PuTTY SSH (ksh). Now I want to copy a text file present on the remote host to my local system.
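The PuTTY package ships a command-line copy tool, pscp.exe, which can pull the file down from a Windows command prompt; a sketch with hypothetical paths:
Code:
pscp user@remotehost:/home/user/somefile.txt C:\temp\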
View 5 Replies
View Related
Sep 22, 2010
I have a fast computer in my office and I want the person using the slow computer in the same office to boot up, see the login window (gdm), log in from there into the fast computer, and be able to use their session on the fast computer at the same time as I am locally logged in to the fast computer as a different user and session. Is this best done through XDMCP? Where is a good tutorial on how to set this up?
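XDMCP is the traditional way to do this. A hedged sketch (the exact GDM config file varies by version; on recent GDM it is /etc/gdm/custom.conf, and the IP below is a placeholder):
Code:
# On the fast computer: enable XDMCP in GDM, then restart gdm.
[xdmcp]
Enable=true

# On the slow computer: start a second X server that queries the fast machine.
X :1 -query 192.168.1.50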
View 9 Replies
View Related
Jan 31, 2010
The situation is, say all I have is a Windows machine and I remotely connect via ssh to a Linux machine. Is there a way I can mount my local CD-ROM on the remote Linux machine?
View 5 Replies
View Related
Aug 12, 2010
How do you transfer a file from a local Mac to a remote Linux machine?
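Since Mac OS X ships OpenSSH, scp from Terminal is the simplest route (the path, username and host below are placeholders):
Code:
scp ~/Documents/report.txt user@remotehost:/home/user/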
View 5 Replies
View Related
May 13, 2010
How can I make users from a remote LDAP server available for authenticating on a local Linux server?
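On Debian/Ubuntu-style systems this is usually done with the libnss-ldap and libpam-ldap packages plus a small client config; a hedged sketch with a placeholder server and base DN:
Code:
sudo apt-get install libnss-ldap libpam-ldap

# /etc/ldap.conf (values are examples only)
uri ldap://ldap.example.com/
base dc=example,dc=com

# then add 'ldap' to the passwd, group and shadow lines in /etc/nsswitch.conf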
View 5 Replies
View Related
Oct 15, 2010
I am using Dolphin 1.5 in KDE 4.5.2. Whenever I try to access a movie file on a remote Samba server, Dolphin copies the movie file somewhere onto the local hard disk first, so I have to wait until a big file transfer completes. I know this happens when I open an .avi using MPlayer; if I open the same remote file with KMPlayer, it plays immediately instead of making a local copy first. However, KMPlayer is very slow and the sound and video keep breaking up. I suppose this is not related to the MPlayer configuration; it seems to be a Dolphin problem. Can I make Dolphin stop copying the Samba share to the local disk and play it instantly? There is a video comparing how Dolphin and Nautilus behave differently when playing movies from a remote Samba share.
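One workaround, independent of Dolphin, is to mount the share with CIFS so players see an ordinary local path and stream straight from the server; a hedged sketch with placeholder server and share names (use -o username=... instead of guest if the share needs credentials):
Code:
sudo mount -t cifs //server/videos /mnt/videos -o guest
mplayer /mnt/videos/movie.avi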
View 3 Replies
View Related
Dec 28, 2010
Assume I have plugged in an external USB hard disc.
How can I find out (from terminal cmdline) the file system (ext2, ext3, reiserfs,...) of this hard disc?
From Ubuntu I know the two commands:
sudo blkid -c /dev/null
or
sudo fdisk -l
but these are not known in CentOS.
What are the corresponding cmds in CentOS?
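Both tools normally exist on CentOS as well, they just live in /sbin, which is not in a regular user's PATH; and for mounted filesystems df can report the type directly. Device names below are placeholders:
Code:
df -T                      # filesystem type of everything currently mounted
mount                      # also shows the type of each mounted filesystem
sudo /sbin/blkid /dev/sdb1 # probe an individual partition
sudo /sbin/fdisk -l        # list partitions and their types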
View 5 Replies
View Related
Aug 17, 2015
There is a bug in the latest version of Ubuntu, which is also present in Debian Jessie:
Can't copy a file from SMB share to the local file system: Software caused connection abort
The problem, apparently, is that newer versions of Samba hit servers with multiple requests at the same time, and for some reason the Zyxel and Iomega boxes can't handle this. The best solution they've come up with is to modify the smb.conf file on your server to include this setting: "max mux = 1".
Here is the reference material on this bug: [URL] ....
The Samba developers have fixed it in the latest version, but neither Ubuntu nor Debian has released the fixed version of Nautilus as of yet. Here is the reference: [URL] ....
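For anyone applying the workaround mentioned above, the setting goes in smb.conf on the NAS/server (typically in the [global] section), followed by a Samba restart:
Code:
[global]
    max mux = 1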
View 0 Replies
View Related
Feb 12, 2009
Is it possible to log into a remote Windows computer from my local Linux computer, and vice versa, using the "telnet" command?
View 2 Replies
View Related
Mar 30, 2011
I am currently running a VPS with Ubuntu Server 10.10. I have been trying for a few days now to get some programs to run when the system starts, but with no luck. I am trying to use rc.local to do this. There is:
/etc/init.d/rc.local - http://paste.monsterprojects.org/mpbjhwhbjzhbjrr (Was already on the system, I have not edited this)
and:
/etc/rc.local - http://paste.monsterprojects.org/mpbjhwhbkkkhwez
If I run the /etc/rc.local script manually, all the programs start fine, and if I run /etc/init.d/rc.local start, all the programs start fine. But for some reason they just don't seem to be starting when the system boots.
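Two things worth checking, since they are the usual reasons rc.local is skipped at boot: the file must be executable, and anything it starts must not rely on an environment (PATH, $HOME) that only exists in an interactive shell. A quick check/fix sketch:
Code:
ls -l /etc/rc.local              # should show execute permission (e.g. -rwxr-xr-x)
sudo chmod +x /etc/rc.local      # make it executable if it is not
# inside /etc/rc.local, use absolute paths to the programs and keep the final 'exit 0' as the last line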
View 3 Replies
View Related
Oct 23, 2009
How would I find out what servers are running on my local system from the command line? I cannot find out how to accomplish this anywhere.
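One common way is to list the listening sockets together with the processes that own them:
Code:
sudo netstat -tulnp    # listening TCP/UDP ports and the owning daemons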
View 3 Replies
View Related
Sep 7, 2010
I'm trying to mirror an apt-cacher-ng cache between two computers. I have apt-cacher-ng installed on my laptop, and I have another machine running apt-cacher-ng. In order to keep them both up to date with each other and to make sure all the computers have all the updates, I've been trying to find effective ways to keep them matched.
- Unison looks like what I want, it would delete files that are deleted, and it would add files that are added. (the assumption is if they're deleted from one, they'd be deleted from others).
- Rsync seemed quite a bit easier, especially with the advanced permissions issues. Apt-cacher-ng uses a user called apt-cacher-ng.
Instead of giving root an ssh password, I wanted to just ssh as apt-cacher-ng. Then I can still get the files over the network, but without the root account being open.
So I ran:
passwd apt-cacher-ng
and when I sshed in, it looked like it was working until it logged me out (almost immediately). So that's not working. What am I missing? Maybe there's a better tool than rsync for this?
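The immediate logout is consistent with the apt-cacher-ng account having a non-interactive login shell (Debian/Ubuntu create service users with /bin/false or /usr/sbin/nologin), which is an assumption on my part. Checking and, if you accept the security trade-off, changing it:
Code:
getent passwd apt-cacher-ng                 # the last field is the login shell
sudo usermod -s /bin/bash apt-cacher-ng     # give the account a real shell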
View 2 Replies
View Related
Mar 30, 2011
I believe rsync was the tool. I have a box running CentOS 5. It has a 250GB HD in it. I have another drive of the exact same model. Currently it has, as I said, a 250GB IDE drive. I want to shut down the machine, install this other hard drive and set up a cron job that will back up my main drive at times that I set. This way if the main drive fails, I do not lose all the data and have to rebuild the server from scratch, as I have been custom configuring it for years. I can't remember if there was an issue with the main drive being mounted while backing it up or not. I have looked at some of the how-tos on rsync but they seem to only talk about using another server for this. If I shut down the box, install the new drive, and the box boots back up, is it going to ask about that drive, or what do I need to do to get rsync going, and does it partition the drive as such? And can I do it this way? This way if the main drive fails, I can just swap the drive and be on my merry way.
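rsync does not partition anything; the new drive has to be partitioned, formatted and mounted first (fdisk/mkfs), after which a local-to-local rsync in cron does the rest. A hedged sketch, assuming the second drive ends up mounted at /mnt/backup (a placeholder):
Code:
# one-off test run
rsync -a --delete --exclude=/mnt/backup --exclude=/proc --exclude=/sys --exclude=/dev / /mnt/backup/
# root's crontab (crontab -e): the same thing nightly at 02:00
0 2 * * * /usr/bin/rsync -a --delete --exclude=/mnt/backup --exclude=/proc --exclude=/sys --exclude=/dev / /mnt/backup/ >/dev/null 2>&1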
View 2 Replies
View Related
Aug 15, 2010
I'm trying to set up a NAS for my network; the only problem is that I can't figure out how to do it. On my network I have about 3 computers, however I only use 2 of them, so I thought that maybe I could use the third computer in such a way that I could access it 24/7 from the internet as a server for all of my files (school papers, music etc.). The only problem is that I don't know where to get started. Both of the computers I'm currently using are Windows, and the one that I, hopefully, can turn into a server is running Ubuntu.
View 4 Replies
View Related
Jan 21, 2011
I have downloaded several GUI clients to support working with Subversion; the most developed one is "kdesvn". I also tried some CVS front-ends. It is always the same problem: I cannot connect to a repository in the local file system. This is weird! The greatest use of versioning in my practice is to work locally, and it should not be difficult to support working with local files (WinCVS allowed me to do this).
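For a purely local Subversion repository the URL scheme is file:// with three slashes; a sketch with a hypothetical path, which should also work as the repository URL entered in kdesvn:
Code:
svnadmin create /home/user/repos/myproject          # create the local repository
svn checkout file:///home/user/repos/myproject wc   # check out a working copy from it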
View 5 Replies
View Related
Mar 15, 2011
I'm trying to set up a framework where people connected to the same wifi connection can access a local site for development purposes. I want them to be taken to a local copy (the development copy of the site) when they type www.development.loc in the browser. I don't want to connect the world, only people connected to my wifi. What do I need to edit to: 1. Allow me to access the local copy of the site that's on my computer (located at /var/www/developmentsitename/) using www.development.loc in the browser (see the sketch below), and 2a. Make the server accessible by other computers connected to the same wifi connection? 2b. If another computer can connect to this site, can we create a virtual desktop setting in which workers can work as if they have their own partition on the server to work on and upload work onto the development server?
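A hedged sketch for points 1 and 2a, assuming Apache on the development machine and a made-up LAN IP of 192.168.1.10: map the name to that IP on each client (or in the router's DNS), and add a matching virtual host.
Code:
# On each client, /etc/hosts (or the Windows hosts file):
192.168.1.10    www.development.loc

# On the server, /etc/apache2/sites-available/development.loc:
<VirtualHost *:80>
    ServerName www.development.loc
    DocumentRoot /var/www/developmentsitename
</VirtualHost>

# Enable the site and reload Apache:
sudo a2ensite development.loc
sudo service apache2 reload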
View 2 Replies
View Related
Sep 9, 2010
I have a computer at the university and I have root access to this PC. I am trying to install the CBLAS library on it, but it gives me a strange error: /usr/local/atlas is a read-only file system. I tried doing mount -l and it gave me appserver1:/export/d1/Linux/doe on /usr/local type nfs4 (ro,sec=.......). I think this means that a directory from the main server is mounted on /usr/local and it is read-only. So how can I fix this problem, separate the two, and make /usr/local local to my machine?
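One low-risk workaround, if the NFS mount is managed centrally and can't simply be removed, is to build the library into a writable prefix instead of /usr/local. The exact configure invocation depends on the library, but for a typical autotools-style build it looks roughly like this (the prefix path is a placeholder):
Code:
./configure --prefix=$HOME/local
make && make install
# then point later builds at it, e.g. -L$HOME/local/lib -I$HOME/local/include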
View 13 Replies
View Related
Feb 11, 2010
I'm going to be using LVM2 to mirror 2 250GB drives for redundancy and fault-tolerance. As per the best docs I have found on mirroring [URL], I am going to keep the log on a separate physical volume, however I can't seem to find a good reference anywhere for how much space I need to reserve for the log. Everything says "small," but "small" doesn't really help in the real world :-).
I have plenty of space on the 3rd pv but want to leave as much of it as possible for other stuff. Should I leave 32MB ? 500MB ? 1GB?
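For what it's worth, the LVM mirror log is only a small bitmap; a single physical extent (4 MB with the default extent size) is typically all it uses, so even 32MB of free space on the third PV is plenty. A hedged example of creating such a mirror (the volume group, LV name and devices are placeholders):
Code:
lvcreate -L 230G -m 1 --mirrorlog disk -n lv_mirror vg00 /dev/sdb1 /dev/sdc1 /dev/sdd1
# the third PV listed (/dev/sdd1) ends up holding the mirror log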
View 1 Replies
View Related
Feb 14, 2010
I am new to Xubuntu and I can't seem to browse the other hard drives connected to my system. It keeps saying: Connecting to "60 GB Filesystem" failed. Authentication is required.
How do I browse my other hard drives in Xubuntu?
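One way around the authentication prompt is to mount the partition permanently via /etc/fstab so it is available to every user at boot. The device name and filesystem type below are placeholders; blkid shows the real ones:
Code:
sudo blkid                       # identify the 60 GB partition, e.g. /dev/sda3
sudo mkdir /media/data
# /etc/fstab line (adjust the device and filesystem type to match):
/dev/sda3  /media/data  ext3  defaults  0  2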
View 1 Replies
View Related
Jan 23, 2011
I have an ubuntu fileserver and an ubuntu laptop both running 10.10. For some reason I can't connect to the server (file or remote terminal) from the laptop, even though I can access ssh through terminal on my mac and have been able to mount the filesystem on another computer running the ubuntu liveCD. I just get the error 'no route to host'. I've tried turning off the firewall on the laptop and re-installing ssh on both computers, but I don't have a clue what to do next!
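'No route to host' is usually a network or firewall issue rather than an SSH one, so a few checks from the laptop may help (the server IP and username are placeholders):
Code:
ping 192.168.1.20            # is the server reachable at all from the laptop?
ssh -v user@192.168.1.20     # verbose output shows exactly where the connection fails
sudo ufw status              # run on the server too: is its firewall rejecting the laptop?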
View 9 Replies
View Related
Aug 28, 2010
Rsnapshot is a utility written in Perl for making backups of local and remote file systems. The well proven rsync is behind this utility. rsnapshot does not need root user intervention to restore a normal user's data, it does not take much space on your backup server, and it can easily be automated (scheduled) to make life easier: set it up once and forget it. Basically it takes snapshots of a file system (or a part of one) at regular intervals, such as hourly, daily, weekly and monthly.
This can be configured easily through a simple text-based configuration file, and the above can be set up in a few easy steps in a few minutes. The two major tasks are configuring rsnapshot and OpenSSH automatic login. To make the backup run automatically, we need to automate the remote login in a secure way, which can be done with the OpenSSH tools. This scenario depicts backing up desktop data (assuming the desktop's IP address is 192.168.0.100) to a backup server. My desktop runs Ubuntu 10.04 and the backup server runs Debian Squeeze. [URL]
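For reference, the rsnapshot side of that setup boils down to a handful of lines in /etc/rsnapshot.conf (fields must be separated by tabs; the paths and retention counts below are examples only):
Code:
snapshot_root   /backup/snapshots/
interval        hourly  6
interval        daily   7
backup          root@192.168.0.100:/home/      desktop/
After that, cron entries run 'rsnapshot hourly' and 'rsnapshot daily', and the key-based SSH login lets them run unattended.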
View 2 Replies
View Related
Apr 7, 2010
I am trying to get the dd command to successfully copy a disk image to a remote system. Right now I am testing out the syntax by trying to copy the /dev/sda1 partition of the subject computer. The command syntax that I am using is the following:
Code:
dd if=/dev/sda1 ibs=4096 conv=notrunc,noerror | (ssh 132.183.12.128 of=/roarchive/test obs=4096)
The user account running this command is root, and the account does have key-based authentication between the source and destination computers. The command does not return any error messages, but when I check the directory on the destination system, the expected output is not there.
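The likely reason nothing arrives is that the remote side never runs dd: ssh is handed only "of=/roarchive/test obs=4096", which the remote shell treats as harmless variable assignments, so it exits silently. Adding the remote dd (and quoting it) should do it; a sketch keeping the same address and paths:
Code:
dd if=/dev/sda1 ibs=4096 conv=notrunc,noerror | ssh 132.183.12.128 "dd of=/roarchive/test obs=4096"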
View 7 Replies
View Related
Jul 26, 2011
I have a Postfix mail server on Ubuntu 10.04 LTS behind a router, and all local users fetch/send mail through MS Outlook using the local IP. Sometimes, when the internet goes down and a mail is sent, it is bounced back immediately saying the domain was not found. Can you please tell me how to configure Postfix to hold all mail in the server rather than bounce it when the internet fails, and to send it through once the connection is restored (usually within 15-30 minutes)?
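One knob worth looking at (an assumption on my part, not a guaranteed fix for this exact bounce) is Postfix's soft_bounce, which turns permanent rejections into temporary ones so mail is queued and retried instead of bounced:
Code:
sudo postconf -e 'soft_bounce = yes'
sudo postfix reload
The queued messages are then retried automatically and go out once DNS resolution works again; bear in mind soft_bounce also defers genuinely bad addresses until the queue lifetime expires.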
View 2 Replies
View Related