I have an Atheros wifi chip and I got wifi working through madwifi. I was using F9 and my wifi connection was very stable. Then I tried F10, which already has support for Atheros in its kernel. It connects to the wireless network (wifi) but soon disconnects. I updated NetworkManager, but no fruitful result came of it. Finally I switched back to F9 and got a very stable connection again. But I am missing the features of F10. I hope that the connection will be stable in F11. I tried to install F11, but it says /dev/root not found, then it shows something about clock time during booting and stops booting. So finally I chose to use F9.
I'm using Back In Time to back up my home directory to a second hdd that is mounted at /media/backup. The trouble is, I can do this using Back In Time (Root), but not using Back In Time without the root option. This is definitely a permissions issue: it can't write to the folder. But when I checked by right-clicking on the backup directory and looking at the Permissions tab, it said I was the owner.
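In case the GUI is showing something different from reality, this is the terminal check I'd use to confirm the actual ownership and the user I'm running as (standard commands, nothing assumed beyond the /media/backup path above):
Code: ls -ld /media/backup   # real owner, group, and mode of the mount point
id                     # the uid/gid Back In Time runs as when started without root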
I'm hoping somebody can find something here that I haven't. I'm trying to use rsync to back up home directories to a NAS. First, I NFS-mounted the NAS and ran an rsync, and everything worked out fine: the transfer completed after a few hours and everything was transferred (lots of stuff!). I then decided that I don't want to leave the NAS mounted all the time, and I didn't want to automate mounting and unmounting of the NAS, as I didn't think I could produce a script that would work reliably enough. So I decided to start an rsync daemon on the NAS and update via that. I ran the following command (results are included; the ^C is me killing it after it hangs).
Code: rsync: link_stat "/av" failed: No such file or directory (2)
skipping directory home
rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1060) [sender=3.0.7]
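For reference, the daemon-mode syntax I believe I should be using is the double-colon form (the module name 'home' here is my assumption; the 'link_stat "/av"' message makes me suspect my options or path got parsed as a file name):
Code: rsync -av /home/ nas::home/
# host::module (or rsync://host/module) talks to the rsync daemon;
# host:/path would go over ssh instead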
I've tried to google, but not much luck. What I would like to do is have a number of folders on my desktop, and their contents, replicated/duplicated into another folder on the same PC in real time. So, for example, if I were to change an OpenOffice document in a specific folder on my Desktop, it would be replicated/duplicated in real time. If I had three folders on my Desktop, A, B and C, they would also appear/be backed up (in real time) in a folder called /home/backup. Can this be done?
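One approach I've come across (not sure it's the right tool) is watching the folders with inotifywait from the inotify-tools package and re-running rsync on every change; a minimal sketch, assuming the three folders sit directly on my Desktop:
Code: #!/bin/bash
# mirror Desktop folders A, B and C into /home/backup whenever anything changes
while inotifywait -r -e modify,create,delete,move ~/Desktop/A ~/Desktop/B ~/Desktop/C; do
    rsync -a --delete ~/Desktop/A ~/Desktop/B ~/Desktop/C /home/backup/
done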
I installed Amanda on a client that runs Red Hat Enterprise Linux 4. When I do an amcheck from the server, I get this:
Code: Amanda Backup Client Hosts Check
--------------------------------
WARNING: 192.168.3.23: selfcheck request failed: recv error: Connection reset by peer
Client check: 1 host checked in 0.022 seconds. 1 problem found.
I am using bsdtcp. I get an ACK error if I switch the client to bsd authentication. I read that I should enter the server's address into the hosts file in the /etc/ directory?
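If the hosts-file suggestion is the fix, I assume the entry would look like this (the address and names are placeholders for my real backup server):
Code: # /etc/hosts on the client
192.168.3.1   amandaserver.example.com   amandaserver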
I saw a reference in a magazine to using rsync to keep identical copies of folders. This looks like something I could find useful, as I have a large number of items in need of a safe backup.
I have the folders on an old system on a home network and would like to copy them over to a USB hard drive.
Currently the folders reside on an SFTP server at xxx.xxx.xxx.xxx, and I wish to sync them to a USB drive plugged into my laptop.
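From what I've gathered so far, something like this might do it (a sketch: the login, the remote path and the /media/usb mount point are all placeholders):
Code: rsync -av -e ssh user@xxx.xxx.xxx.xxx:/path/to/folders/ /media/usb/folders/
# -a keeps permissions and timestamps; -e ssh rides over the same login SFTP uses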
I have a Samba share to a Windows 7 computer. I don't know if I will be able to use Back In Time or not, so I want to know how to have rsync do my backup. I read the man page, but I'm not sure I understand it. This is on the same computer, to a different hard drive, to run every hour in a script. 'leanne' is the Windows 7 share and 'backup' is the other hard drive in the computer:
Code: rsync -arvRzEP /media/leanne /media/backup
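For the hourly part, I assume a crontab entry along these lines would do it (the log path is just my guess at something sensible):
Code: # crontab -e: run the backup at the top of every hour (dropped -vP, since there is no terminal)
0 * * * * rsync -arRzE /media/leanne /media/backup >> /var/log/leanne-backup.log 2>&1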
Recently I made a backup of my home directory in 10.10 before reinstalling 10.10 again. This time I chose to manually define the partitions (50GB root, 25GB swap, 325GB home). Now I wish to migrate the old home into the newly installed home, which is on a separate partition. I have found the following documentation: URL... Still, as a beginner I am not quite sure about the necessary steps to perform. As the new home is located on a separate partition, is it possible to simply delete all directories there and copy all directories from the old home to the new home with rsync?
Do I have to install all the software that corresponds to the old home first, followed by migrating home, or first migrate home, followed by installing the software such as Thunderbird, TeX Live 2010, etc.? I guess that the migration should take place at a later stage; otherwise my old profile files from Firefox and Thunderbird would be overwritten by new ones?
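If copying is the way to go, the command I have in mind is something like this (assuming the old backup is mounted at /mnt/oldhome; that path is a placeholder):
Code: # trailing slash on the source copies its contents, including dotfiles
rsync -av /mnt/oldhome/ /home/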
I'm trying to learn how rsync works to back up my system. I tried:
Code: rsync -azvv /home /media/Elements
I get a folder called home on my external hard drive, but when I use ls -l to see the permissions, they are all wrong. On my /home folder the permissions for /nathan are drwxr-xr-x 48 nathan nathan; the permissions on the backup /nathan folder are drwx------ 1 nathan nathan.
I also tried using the long version of -a, which is -rlptgoD, and that didn't work either. What do the 48 and the 1 mean in the ls -l output? When I look inside the /nathan folder the permissions are all screwed up too: a lot of the files are backed up as executable. I also ran it with sudo, and that didn't work either; the permissions were still screwed up, and the ownership is messed up too.
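One thing I've since thought to check (the link count of 1 also makes me suspicious): whether the external drive's filesystem can store Unix permissions at all:
Code: df -T /media/Elements   # vfat or ntfs/fuseblk here would explain the mangled modes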
This should be a quick one. I'm trying to back up a single directory and its subdirectories on my Lucid server to a FreeNAS box across my network. This is what I'm using to do that:
Code: rsync -r -a -v -z * --delete freenas:DSIBackups
It almost works perfectly, except for one problem: when a file is deleted at the source, this command doesn't seem to delete it on the receiving end. I assumed that --delete would do that, but apparently not.
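Since writing this, I've seen it suggested that the shell glob is the culprit: a deleted file no longer matches *, so rsync is never told it existed. The rewrite I'm going to try (same destination as above):
Code: # pass the directory itself, not a glob, so rsync can notice deletions
rsync -avz --delete ./ freenas:DSIBackups/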
When rsync has finished the update, or even in the meantime, I need to move the updated files to a different location, like a directory named with date +%Y%m%d or something similar. The reason is that, because of development, I need the modified files, and all of them, not just the last one, so I have to store them daily. But I don't want to store the whole directory, just the few files which were updated. Does that make sense?
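From the man page, --compare-dest looks like it might do exactly this; a sketch under my assumptions (source in /data/, the up-to-date mirror kept in /backup/current/, changed files collected per day):
Code: # copy only files that differ from the current mirror into a dated folder
rsync -a --compare-dest=/backup/current/ /data/ /backup/changed-$(date +%Y%m%d)/
# then bring the mirror itself up to date
rsync -a /data/ /backup/current/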
I am trying to back up a client called greetings.com, but when I do an amcheck on the server, I get:
Code: WARNING: greetings.com: selfcheck request failed: Connection refused
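My working assumption is that nothing is listening on the Amanda port on the client; the check I'd run there (10080 is the standard amanda service port, and with bsdtcp amandad is normally started from xinetd):
Code: netstat -ltn | grep 10080   # is amandad listening for TCP connections?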
Well, I know there are issues when using rsync to copy files to an NTFS partition, like file permissions and so on. The thing is, I need to back up my music files periodically onto an NTFS partition from ext4. I really don't care about file permissions or any other such metadata. When I use rsync, it should update the mp3 files on my NTFS (external) disc with the new ones. Can I give this operation a go? I have a lot more important files on the external disc, and I don't want this rsync to corrupt or delete those files, because they are highly important.
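To calm my own nerves first, I figure a dry run will show exactly what would be touched before anything is written (the paths are placeholders for my real ones):
Code: # -n (--dry-run) only lists what would happen; -rt skips the permission noise NTFS can't store
rsync -rtvn /home/me/music/ /media/external/music/
# note: no --delete anywhere, so existing files on the disc are never removed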
I have a Linux host acting as an iSCSI server for a Windows box. I want to keep an off-site backup, so I figure rsync will keep the iSCSI server synced with an off-site Linux host. I understand that rsync does block-level incremental transfers to conserve bandwidth: OK, awesome. The trick is that I also want an archival copy kept. Say I want to go back to a revision of a file from 10 days ago; I need to be able to do that.
I was planning on using Backup Exec, since we currently have a licensed copy: throw the archives from Backup Exec onto the iSCSI server as well, and have it keep a rotating 30-day backup, or something like that. The issue I see here is that this will be creating and deleting files as it does its daily backup rotation. I'm guessing rsync will see these as new files and will likely retransmit everything on a daily basis. The question then becomes: is this assumption correct, or will it still know to do a block-level incremental transfer even when file names and such are changing?
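For what it's worth, the only option I've found that might help with renamed files is --fuzzy, which lets rsync look for a similarly named file in the destination directory to use as a delta basis (a sketch; the paths are placeholders):
Code: # -y / --fuzzy: if the exact name is gone, try a similarly named file as the basis
rsync -avy /var/backupexec/ offsite:/archive/backupexec/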
Our backup script was working fine (ssh to the server, back up /home to a second hard drive on my computer). Then, right after an Ubuntu update, it quit working. I investigated and found that "something" had changed the label on the backup hdd to what looked like gibberish to me. But the script identifies the backup hdd by its UUID, which didn't change. Yet here is the error I get when the backup fails:
Code: receiving file list ... done [took about 5 seconds]
rsync: mkdir "/media/14D9-3B1F/server-backup" failed: No such file or directory (2)
rsync error: error in file IO (code 11) at main.c(594) [receiver=3.0.6]
Note that the backup hdd IS mounted, the UUID is correct, and the folder 'server-backup' DOES exist. Does anyone have a clue for me? I'm moderately experienced in Linux and Ubuntu. Our server runs CentOS 5. And as stated, the backup ran fine for several weeks. I think there was a new Linux kernel in that update, but at this point, a while later, I don't know which one. The current kernel is 2.6.31-22-generic.
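One thought, since the label changed: the script may find the disk by UUID, but the /media/14D9-3B1F mount point itself looks label/ID-derived, so a relabel could have moved the automatic mount elsewhere. What I'd compare:
Code: ls /media/              # does 14D9-3B1F still exist, or did the mount point move?
mount | grep media      # where is the backup disk actually mounted right now?
blkid                   # current LABEL and UUID of every partition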
I support a small business which has an Ubuntu server running as a file server. The server is running Ubuntu 10.04. There is one hard drive, which is mounted as /media/hdd. Each night this is backed up to an external USB hard drive mounted as /media/backup. The backup is carried out using the command:
Code: rsync -av /media/hdd/ /media/backup/
Is there a way to encrypt this backup so that if the USB hard drive is plugged into another machine it cannot be read?
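One route that looks plausible is LUKS via cryptsetup, leaving the nightly rsync untouched once the unlocked device is mounted at /media/backup. A sketch only: it assumes the USB disk is /dev/sdb1 (a placeholder), and the format step DESTROYS whatever is on it:
Code: # one-time setup (wipes /dev/sdb1!)
cryptsetup luksFormat /dev/sdb1
cryptsetup luksOpen /dev/sdb1 backup
mkfs.ext4 /dev/mapper/backup
# before each nightly run: unlock and mount, then rsync as before
cryptsetup luksOpen /dev/sdb1 backup
mount /dev/mapper/backup /media/backup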
I switched last summer from Windows (which I had used since Windows 95) to Debian. I've been using Debian Jessie for a couple of months now and I'm getting used to it a little.
There are problems here and there, but I can solve them with some reading on the web. Not really a big problem... till now.
I run Debian 8.2 on my PC (PC1). I bought an older PC (PC2) that I want to use as a backup server.
I'm using PC2 only for making backups; after the backup I switch it off again.
So I installed Debian 8.2 (net-install, without a DE and with SSH) on PC2 and tried to configure it to work as my backup location. I made a public SSH key and exported it to the root account (no problem) and to the user account (sensdeb), but there I got an "Access denied" error.
I gave the user (sensdeb) sudo rights via the visudo file:
Code: # User privilege specification
root    ALL=(ALL:ALL) ALL
sensdeb ALL=(ALL:ALL) ALL
I installed rsync.
The problem is that rsync only works when I use the root account.
I don't know how to give the user sensdeb the rights so that I can use that account for my backup tasks. Right now it's possible to sync with the root account, but I've read many times that that is not the way to do it.
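My current guess is that this is plain filesystem permissions on the destination rather than rsync itself. Assuming the backups land in /backup on PC2 (that path is my placeholder), this is what I intend to try once, as root, on PC2:
Code: # let the backup user own the destination so rsync can write there over ssh
mkdir -p /backup
chown -R sensdeb:sensdeb /backup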
I want to save a backup of my data on a remote server, but I never want the backup server to see the data unencrypted. Editing a single file and backing up should not result in everything being encrypted and sent again. The remote server should preferably not even know the directory structure (and especially not the directory names).
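For illustration of the kind of thing I mean (naming a different tool, not claiming it's the only one): duplicity is often suggested for exactly this profile, since it encrypts with GPG locally and uploads incremental volumes, so the server sees neither the contents nor the directory names. The host and paths below are placeholders:
Code: # encrypted incremental backup; the server only stores GPG-encrypted volume files
duplicity /home/me sftp://me@backuphost/backups/home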
I've been trying to make a three-stage backup, with stage 0 being a full monthly backup, stage 1 being a weekly backup, and stage 2 being a daily backup. I've been trying very hard to use rsync for this, but sorting files by date is proving to be problematic. Sometimes it seems to work from the command line directly, but the same command causes errors and warnings from a script while entirely failing to select the correct files.
The common example I see for this involves commands like this:
Code: rsync -Rav `find /home/ -ctime -7 -print` /path/to/home_backup
The problem seems to be that, since the user directories in /home contain files that have been altered within the specified time frame, the whole directory is matched first, which means that the whole directory is recursively archived as opposed to just the changed files.
I've also seen examples using the --files-from option with the same find parameters, and that one seems to ALMOST work, but it gives me strange warnings and fails to run at all when launched from a script.
Many of the things I've googled about using rsync to back up files by date modified involve a rather snarky "You're missing the point of rsync!", to which I respond by yelling at my computer monitor, followed by "JUST TELL ME WHAT I NEED TO KNOW!" I understand that rsync is meant to take care of incremental backups on its own; that's precisely why I want to use it for a traditional three-stage backup scheme.
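For the record, the variant I'm trying next, based on the idea that restricting find to plain files (-type f) stops whole directories from being swept in; with --files-from the listed paths are taken relative to the source argument, so absolute paths from find need / as the source:
Code: # build an explicit list of recently changed files, then copy exactly those
find /home/ -type f -ctime -7 -print > /tmp/recent-files.txt
rsync -av --files-from=/tmp/recent-files.txt / /path/to/home_backup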
To copy from production to standby over the internet I use a cron job running:
Code: rsync -avze 'ssh -p 8022' --exclude-from= ....
My question is: should the cron job run on the production system or on the standby system? Root access to the remote system is granted by a passphrase-less SSH key. Currently I run rsync on the production system. I guess that this is more secure, because then the standby needs no SSH login to production. Running rsync on the standby would use fewer resources on production, but I am concerned that in that case there would be passphrase-less access from standby to production.
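If pulling from the standby ever becomes necessary, one mitigation I've read about is pinning the passphrase-less key in authorized_keys on production so it can only run a restricted rsync, never a shell. A sketch; rrsync is a helper script shipped with rsync, and its install path varies by distro:
Code: # on production, in ~/.ssh/authorized_keys, in front of the standby's key:
command="/usr/bin/rrsync -ro /data",no-port-forwarding,no-pty,no-agent-forwarding ssh-rsa AAAA...standby-key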
I'm doing an rsync backup to an external drive in order to take a shot at setting up partition encryption. My rsync command, as root, is:
Code: rsync -av / /external1/backup
Once I've finished my cryptsetup and done a fresh Linux install, what command should I use to properly restore my backup (without messing up the encryption setup)?
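My own best guess at the restore, for what it's worth: boot a live system, unlock the encrypted root and mount it at /mnt (a placeholder), then reverse the copy while staying clear of the pieces the fresh install generated for the encryption setup:
Code: # restore the backed-up tree into the mounted encrypted root
rsync -av --exclude=/boot --exclude=/etc/fstab --exclude=/etc/crypttab /external1/backup/ /mnt/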
I'm going to make a nightly backup copy from one server to another, using rsync. If I have a sufficiently large file, say 4+ GB or so, I'm not interested in copying the whole file if only a small change has been made. Can rsync detect small changes at the block level and back up only those if needed?
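As far as I can tell from the man page, the delta-transfer algorithm is on by default over a network (and off for local copies, where --whole-file is the default), but the receiver still rebuilds the whole file in a temp copy unless told otherwise. The flags I'd experiment with (hosts and paths are placeholders):
Code: # force delta transfer and update big files in place
rsync -av --inplace --no-whole-file /srv/bigfiles/ otherserver:/backup/bigfiles/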
So I am using rsync (3.0.7 on Mac OS X) to back up one hard drive to a folder on another one. This is USB drive to USB drive, and I have done the initial backup from one drive to a newly formatted other drive with the following command:
Code: rsync -avX --progress /Volumes/Source /Volumes/Destination
This all appears to be going smoothly as I type. I am going to write a script to do the subsequent backups.
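The script I have in mind is just the same command in a file, plus --delete so removals propagate (a sketch, same volumes as above):
Code: #!/bin/bash
# incremental mirror: only changes are copied; files deleted from Source
# are also removed from the Destination copy
rsync -avX --delete --progress /Volumes/Source /Volumes/Destination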
I want to back up the Windows PCs on my network to my Ubuntu 11.04 PC, using rsync. rsync is working, but I have to mount the PCs. A few details: my server is named 'server'; the Windows PC is named \\PC_OF_MARTIJN; the folder where the mount goes is /home/bastiaan/backup/mounts; the credentials are in /home/bastiaan/backup/credentials, and the file is called 'martijn'.
So what I'm going to add to /etc/fstab is this:
Code: //server \PC_OF_USER /home/bastiaan/backup/mounts/user cifs credentials=/home/bastiaan/credentials/user,iocharset=utf8,file_mode=0777,dir_mode=0777 0 0
Will this work?
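In case my line above has the wrong shape: the pattern I believe cifs fstab entries follow puts the Windows machine (not the Linux server) in the UNC part, with forward slashes, and the share name after it ('share' below is a guess, and I've pointed credentials at the location I described above):
Code: //PC_OF_MARTIJN/share /home/bastiaan/backup/mounts/martijn cifs credentials=/home/bastiaan/backup/credentials/martijn,iocharset=utf8,file_mode=0777,dir_mode=0777 0 0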
I am trying to use rsync and ssh to move a backup folder from some computers to a server. I found a command that is supposed to do this, but I am having issues getting it to work.
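The general shape of the command I've been trying is roughly this (user, host and both paths are placeholders for mine):
Code: # push the local backup folder to the server over ssh
rsync -avz -e ssh /path/to/backup/ user@server:/srv/backups/machine1/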
I have set up rsync as a daemon on an Ubuntu 10.04 box and the setup was successful. Here are my configs:
Code: root@hurricane:~# nano /etc/default/rsync
# defaults file for rsync daemon mode
# start rsync in daemon mode from init.d script?
#  only allowed values are "true", "false", and "inetd"
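For completeness, alongside RSYNC_ENABLE=true in that defaults file, the actual modules live in /etc/rsyncd.conf; mine follows this general shape (the module name, path and user below are examples, not my real values):
Code: # /etc/rsyncd.conf
[backup]
    path = /srv/backup
    read only = false
    uid = backupuser
    gid = backupuser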