General :: Synchronize System Changes On Two Computers?
Aug 22, 2010
I have two computers with Ubuntu 10.10 on them, a desktop and a laptop. I have two options for handling them:
Try to keep them the same
Accept that they have different 'personalities' and let each branch off its own way
However, I've chosen the first option because I like to keep my files and settings the same on both computers. I am in the process of extracting a tar archive I made of my entire desktop's hard drive (9.7GiB) onto my laptop, so after this is done they will be exactly the same. However, upon turning the laptop on with my freshly-copied system, things will become different. If I take the laptop to school, for example, and do stuff there, then come back home, and do stuff on my desktop, how can I sync these computers (both ways!) with the new changes at the end of the day/week/whatever?
I've written a "C" program which transmits audio to a number of computers over a TCP LAN connection. I'm using ALSA, the preemptive kernel, and pthread. After running for 30 minutes or so, the slight variation in sampling rates (~±0.01%) among the computers accumulates and manifests as a noticeable differential delay in the sound from the speakers. I know how to detect the variation and would like to dynamically compensate for it by individually varying the sampling rate (ever so slightly) of each playback device to oppose the variation.
Does anybody out there in Linux Land know how to dynamically vary the playback sample rate? I've tried using snd_pcm_hw_params_set_rate() followed by snd_pcm_hw_params(), to no avail. They don't seem to work while playback is running.
I want to set up a web server, and I want to set up NTP so the clock always stays in sync. I have installed a very basic system (no GUI or X components) to keep it slim and thereby a little less prone to security problems. However, does anybody know what the "Synchronize system clock before starting" option in system-config-date actually does in terms of changing config files or permissions? I'd like to know so I can do it manually via the command line.
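As far as I can tell (this is the RHEL/Fedora layout, so treat the paths as an assumption), the checkbox simply records your servers in /etc/ntp/step-tickers, which makes the ntpd init script run ntpdate against them to step the clock once before the daemon starts. Roughly:

```shell
# Command-line equivalent of the checkbox, sketched against a scratch
# file so it can run unprivileged; the real file is /etc/ntp/step-tickers.
STEP_TICKERS=./step-tickers
printf '%s\n' 0.pool.ntp.org 1.pool.ntp.org > "$STEP_TICKERS"
cat "$STEP_TICKERS"
# With the real file in place, "service ntpd start" steps the clock
# first (via ntpdate against the listed servers) and then starts ntpd.
```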
I have made a backup of my files on a remote server and I'd like to maintain that backup using rsync. The problem is that the timestamps don't match perfectly between the source and the backup.
What can I do? I'd rather not replace all the files in my backup, because there is so much data it would take a very long time.
Is there perhaps a way to compare checksums and then update the timestamps? Both are low power boxes with no GUI and only BusyBox CLI access.
Scenario: an IDE is set up on a Linux desktop box, editing PHP files locally. Every time I save a file, I want the change to appear on the Linux server where Apache is running. The server has ssh (and Samba and NFS, for that matter). As a reference: when I edited files on Windows, I finally came across WinSCP as exactly the tool I needed. WinSCP has just this feature, with an initial sync and then continuous updates using the filesystem watch service: "Keep Remote Directory up to Date".
On Linux, one could argue that sshfs could be employed to sidestep the need for synchronization entirely; on Windows, a Samba share would do the same. However, I want the IDE to work with local files (on an SSD disk!), not having to go over the network to do PHP indexing and whatnot, which takes ages. But sshfs might be part of the solution nevertheless, so that the continuous synchronization would only need to happen between two local directories.
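On the local-to-remote leg, something like this sketch gets close to WinSCP's "Keep Remote Directory up to Date", assuming the inotify-tools package is installed and that the project lives in ~/project with the server reachable as "server" (both names are placeholders):

```shell
# Watch the tree for completed writes and push the project after each
# save; rsync only transfers what actually changed, so each round trip
# is cheap.
while inotifywait -r -e close_write,move,create,delete ~/project; do
    rsync -az --delete ~/project/ server:/var/www/project/
done
```

If sshfs is used as suggested, the same loop can instead sync between two local directories by replacing the remote target with the sshfs mount point.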
I am trying to synchronize the time of my VM server with ntpd. In /etc/ntp.conf I have the following lines:
restrict default ignore
restrict 127.0.0.1
server time1.server
server time2.server
Whenever I have these lines, the server is not able to synchronize its time. As far as I understand, the first line prevents other servers from using this machine as a time server, and the second line allows localhost to use it. But why do I need to allow the machine to use its own time server when I have specified time1.server and time2.server? (The firewall is open for TCP and UDP port 123.)
However, when I replace the first line of the configuration with the following, it works:
restrict default kod nomodify notrap noquery
But with this I am allowing other servers to use this server for NTP (which I would rather not). Why does this machine try to use an NTP server of its own (to sync time), and why does that not work even though I have the entry "restrict 127.0.0.1"?
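The catch is that "restrict default ignore" drops every NTP packet, including the replies coming back from time1.server and time2.server, while "restrict 127.0.0.1" only unblocks loopback traffic, not those replies. A middle ground that still refuses to serve outsiders is to keep the ignore default but explicitly open the two upstream servers (hostnames shown as in the question; older ntpd versions may want the literal IPs in the restrict lines):

```
restrict default ignore
restrict 127.0.0.1
restrict time1.server nomodify notrap noquery
restrict time2.server nomodify notrap noquery
server time1.server
server time2.server
```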
How do I uncheck the option to synchronize the clock with UTC? This option is asked about at installation time. I can't remember the exact wording, but the above should give you a hint. How can I uncheck that option from the command line?
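On Debian/Ubuntu-family systems of that era, the installer's choice lands in /etc/default/rcS; as far as I know, flipping the value there is the command-line equivalent of unchecking the box (Red Hat-style systems keep the same setting in /etc/sysconfig/clock):

```
# /etc/default/rcS -- tell the boot scripts the hardware clock is kept
# in local time rather than UTC
UTC=no
```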
I use Thunderbird on different computers, so I configured it so that the data is on an external drive. It works fine, except that the contacts still seem to be saved on the individual computers and not on the external drive. This is strange, since I have all the data for each profile on my external drive, and all the Mail folders are there. I use different versions of Thunderbird (3.1 and under). How could I also have my contact data on my external drive?
In a few months I am going to be starting a website that handles credit card information.
A requirement of my servers is that they be fully self-maintaining.
By this I mean that they will download their own updates, restart themselves, and switch themselves back on after a general house power failure (blackout).
Is there a way in Linux to make the computer automatically reboot itself after download and installing an update (if it needs to). My Linux uses the apt package system, with Synaptic Package Manager/Update Manager.
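On an apt-based system, the unattended-upgrades package can do both halves of this; as best I can tell, the relevant knobs are the following (file names per the Ubuntu package):

```
// /etc/apt/apt.conf.d/20auto-upgrades -- check and install daily
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";

// /etc/apt/apt.conf.d/50unattended-upgrades -- reboot when an
// installed update (e.g. a new kernel) requests it
Unattended-Upgrade::Automatic-Reboot "true";
```

Coming back on after a blackout is usually a BIOS/firmware setting (something like "Restore on AC Power Loss"), not something the OS controls.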
Now here is another problem I found. When a customer has logged into my site, will my server still register them as logged in after it has rebooted? Or is it going to issue them new cookies, so that a credit card transaction in progress has to be restarted?
What if I build two identical servers running in sync with each other, sort of like a RAID mirror, and they both download and install updates? Once they're both ready to reboot, computer 1 reboots. On a successful reboot, system 1 alerts system 2 that it rebooted successfully and that system 2 may now reboot. This way the customer's experience is not interrupted.
I'm trying to mirror an apt-cacher-ng cache between two computers. I have apt-cacher-ng installed on my laptop, and I have another machine running apt-cacher-ng. In order to keep them both up to date with each other and to make sure all the computers have all the updates, I've been trying to find effective ways to keep them matched.
- Unison looks like what I want: it would delete files that are deleted and add files that are added (the assumption being that if they're deleted from one, they should be deleted from the others).
- Rsync seemed quite a bit easier, especially regarding the advanced permission issues. Apt-cacher-ng uses a user called apt-cacher-ng.
Instead of giving root an ssh password, I wanted to just ssh as apt-cacher-ng. Then I could still get the files over the network, but without the root account being open. So I ran passwd apt-cacher-ng, and when I sshed in, it looked like it was working until it logged me out (almost immediately). So that's not working. What am I missing? Maybe there's a better tool than rsync for this?
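The instant logout is almost certainly the account's login shell: package-created system users like apt-cacher-ng normally get /bin/false or nologin, so sshd authenticates you and the "shell" exits immediately. A sketch of the check, run here against a sample passwd entry (on the real box, inspect the output of getent passwd apt-cacher-ng instead):

```shell
# Inspect the last passwd field, which is the account's login shell.
# This entry is a made-up sample; real UIDs/paths will differ.
entry='apt-cacher-ng:x:110:115::/var/cache/apt-cacher-ng:/bin/false'
shell=${entry##*:}
echo "login shell is: $shell"
# Fix on the real machine (as root), if you accept the trade-off:
#   usermod -s /bin/bash apt-cacher-ng
# An alternative that keeps the account shell-less is rsync over a
# dedicated ssh key with a forced command.
```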
I'd like to have a copy of a web site on my local drive. Then when I make changes to that copy, have those changes automatically updated on the site's server. Ideally I'd like to tell it to only do this for certain file types. Does anybody know of a way to do this with Linux?
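The "certain file types only" part maps naturally onto rsync filters. A local sketch (directory names are placeholders; point the destination at server:/path for the real site):

```shell
# Push only *.php and *.html: --include='*/' keeps directory recursion
# alive, and the final --exclude='*' drops everything else.
mkdir -p site/css mirror
echo "<?php ?>" > site/index.php
echo "notes"    > site/todo.txt
rsync -a --include='*/' --include='*.php' --include='*.html' \
      --exclude='*' site/ mirror/
ls mirror
```

For the "automatically when I make changes" part, wrapping a command like this in an inotifywait loop (from inotify-tools) is the usual trick.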
I just installed antiX. It asked for time zones and I set all of that up, but it is 3 hours off. My computer's clock is correct; why can't I just set up antiX to recognize my computer's clock? Or why doesn't it just use that as a default?
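A constant 3-hour offset usually means the hardware clock is kept in local time but the system is reading it as UTC (or the reverse). On a Debian-derived system like antiX, hwclock records that choice on the third line of /etc/adjtime (fragment shown; the numeric values are illustrative):

```
0.0 0 0.0
0
LOCAL
```

Running hwclock --systohc --localtime (as root) writes the RTC from the system clock and sets that line to LOCAL in one step.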
How would one design a secure, fault-tolerant network (routers, firewall, domain servers, etc.) for 300 wireless computers spread across multiple buildings and floors, with multiple users per station, mainly running DTP and Internet software?
I'm trying to connect two computers on a LAN. One computer runs VMware Workstation with OpenSUSE 11.3 installed in a VM; the other runs VMware Player, also with OpenSUSE 11.3 in a VM. Both computers are connected to a switch with cables. On both computers I followed this guide to set up the network: Depanati singuri calculatorul!: Opensuse 11.3 - configure local network. On one computer, if I go to Computer -> Network -> Network folder, I only see one machine, when in fact I should be able to see both of them, right?
I am looking for an automated backup system, and I like bacula. I have 3 notebooks and a desktop computer that need regular backups. I don't want to leave them running all night just to do the backups, so I was thinking I could use wake-on-lan to have bacula wake up the machines, do the backups, and shut them down afterwards. While this may work for devices on the Ethernet, it won't work for the notebooks on the wifi. So is it possible to have the notebooks scheduled to automatically wake up from suspend or shutdown? Or is it possible to intercept a shutdown command after a certain hour and call the bacula director to start the backup first?
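For waking at a scheduled time regardless of the network, rtcwake can arm the RTC alarm before the machine suspends or powers off. A sketch (the actual rtcwake call needs root and a working RTC, so it is left commented out):

```shell
# Compute the next 02:00 as an epoch timestamp and (on a real machine)
# program the RTC alarm with it. "-m no" only sets the alarm, leaving
# the following suspend/shutdown to happen normally.
wake_at=$(date -d 'tomorrow 02:00' +%s)
echo "would wake at epoch $wake_at"
# rtcwake -m no -t "$wake_at"    # requires root; then suspend/power off
```

bacula's RunBeforeJob/ClientRunBeforeJob hooks are a reasonable place to send the wake-on-lan packet beforehand and trigger the shutdown afterwards.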
I recently built my second general-purpose server and installed Fedora Core 10 on it. The first thing I attempted to set up after installation was the network, and that's where it went wrong. When editing a network device using the graphical system-config-network utility, I find that the subnet mask is automatically changed to match the default gateway address every time I attempt to modify any of its settings (or sometimes even when I cancel the changes). This also means that I cannot set the subnet mask, as it simply won't accept my setting. I seem to be able to get around this glitch by setting the subnet mask using the shell version of the same utility, but that doesn't solve my network issue.
Even when I use the shell utility to fix the subnet mask, I'm unable to ping other computers or routers on the network, even though ifconfig indicates that the desired IP address has been assigned, and other computers on the network are also unable to see the server. I'm using a wired connection and a static IP address on a network with no DHCP.
How can I set up my Linux server to automatically synchronize its clock? We need to ensure our server clocks are always set to the correct time.
In my home I have 2 local networks. On some of the MS machines I can't see printers shared from the Linux machines, and vice versa. There is some software I found before that I could set up to let me see shares across a WAN.
Before, I could not even get the iPaq identified on the USB port.
Now, after blacklisting the ipaq module, I got this:
Quote:
But I still can't see it on the computer or in the media folder; what comes next? I also installed ActiveSync through Wine, but still no success in actively detecting the device.
On Ubuntu 10.10 x64, when trying to sync via right-click "Ubuntu One/Synchroniser ce dossier" ("Synchronize this folder"), nothing happens. When, within the Documents folder, I select "Synchroniser ce dossier" ("Synchronize this folder") from the upper right section of the window, I get an error message: "Erreur lors de l'activation du dossier. Impossible d'activer la synchronisation du dossier /home/ao/Documents avec Ubuntu One" ("Error while enabling the folder. Unable to enable synchronization of the folder /home/ao/Documents with Ubuntu One").
ao@Ordi-AO-Ubuntu:~$ u1sdtool --list-folders
No folders
ao@Ordi-AO-Ubuntu:~$ u1sdtool -s
State: QUEUE_MANAGER
[Code]....
Within the Ubuntu One app, my computer is identified and connected.
I wanted to know if it is possible to synchronize two systems. I've got one system on an external hard drive, and one on a desktop computer. An example for better understanding: if I update the system on the external HD and then connect the external HD to the desktop, the desktop should be updated too. Or, when I create a new user on the desktop and then connect the external HD, the user should get created there too. Is this possible?
I want to transfer files (a music folder) between two Linux computers. After searching for the best way to do this, I've seen that there are lots of ways of doing it. I know this has been asked a lot, everywhere, all the time. The main problem is that there is no clear, recent consensus on one best way to do this task in 2011 for Linux beginners (even allowing that it may depend on some parameters).
So in the spirit of the Stack Exchange websites, I want this not to be related to my particular situation, but more of a guide to others as well on how to transfer files between two Linux computers over a local network. I think a wiki would be useful for many.
Here's what I found so far:
ssh sshfs scp sftp nfs samba giver
What is the easiest? Most flexible? Simplest? Best solution? What are the pros and cons of each? Are there other (better) options? What are the parameters for choosing the best method (the solution might depend on the number of files, file size, ease vs. flexibility, etc.)?
I'm looking to install Linux on two of my home computers. Here they are, with a brief description of what they will be used for. Rig #1: main desktop: Dell Dimension, P4 3.0GHz, 2GB Ram, 128MB PCIe Video Card Currently, I have WinXP Pro installed and it is my main workhorse computer.
I would like to have a fairly full featured distro that I can test drive as an alternative to WinXP (which I use mostly for web browsing and mp3s and games... I know I may be out of luck with getting many of my games working on linux, but I can live with that). The only other caveat with this machine is that it has to work using a USB wireless network adapter. The wireless router is nearly inaccessible and too far away to plug into. And there are no wired ports in the house.
Rig #2: old computer: Celeron 850, 512MB Ram, 30GB HD, 64MB AGP Video card My really old computer that has just been sitting around collecting dust. I would like to install a fairly lightweight distro (for obvious reasons) to play around with. Maybe get some experience using linux from an admin perspective, like installing/compiling packages, running servers, etc...
I have already tried to install Linux Mint and Xubuntu on my main desktop. While both installed without any errors, neither of them was able to boot into linux. Presumably because of this bug:
Which seems to be a problem with Grub/Ubuntu. So I'd like to stay away from Ubuntu. So what are some distros that you guys would recommend for these two rigs, given my potential uses/limitations?
I am currently running Ubuntu 9.10 on my laptop, and the 9.10 Netbook Remix on my HP Mini. I have set up a Samba connection so I can access my laptop's files from the netbook, but I am wondering about software to make synchronizing particular folders in my home folder easier, especially since, with my setup now, if I access my laptop's home folder from the netbook I can see all the folders, including the hidden ones!