Machine A holds important source data that needs to be backed up. Machine B is where the data will be backed up to. They're both Ubuntu. I want an automated process that allows machine A to create an encrypted tarball and copy it to machine B, without human intervention, but that doesn't allow anything except copying the file into a target directory. I thought about using a chroot-jailed account, but this seems to be a pain to set up and overly complex. I really just want machine A to copy files to machine B from cron, while preventing the mechanism that allows the copy from allowing any other actions (copying to any other directory, logging in, executing other commands non-interactively, etc.). Also, the transfer must be encrypted (e.g. using ssh somehow).
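One common approach (a sketch, not a drop-in config; the usernames, key, and target directory below are assumptions) is a dedicated SSH key on machine A whose matching authorized_keys entry on machine B forces a single command and disables everything else, so the key can only deposit files into one directory:
Code:
# On machine B, in ~backupuser/.ssh/authorized_keys (all on one line per key):
# the forced command restricts this key to receiving files into /srv/backups/incoming/
command="scp -t /srv/backups/incoming/",no-pty,no-agent-forwarding,no-port-forwarding,no-X11-forwarding ssh-rsa AAAA... backup@machineA

# On machine A, the cron job builds the encrypted tarball and pushes it:
# (with GnuPG 2.1+ you may also need --pinentry-mode loopback)
tar czf - /important/data | gpg --batch --symmetric --passphrase-file /root/.backup-pass -o /tmp/data.tar.gz.gpg
scp -i /root/.ssh/backup_key /tmp/data.tar.gz.gpg backupuser@machineB:/srv/backups/incoming/
Note that the forced "scp -t" trick relies on the classic scp protocol; if your rsync package ships the rrsync helper script, it is a similar forced-command wrapper for rsync-based pushes.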
Request: a tutorial for a complete novice on how to back up an Ubuntu partition (recommended backup applications, commands) and transfer it to different hardware (from an Intel laptop to an ARM laptop). Specifically, copying Ubuntu from an "HP AMD Turion 64 X2 TL-58" to a "Netwalker i.MX515 Cortex-A8" [URL]. Which partition imaging tools or commands would one use? Does one have to switch kernels and modules, and if so, how?
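One caveat worth stating up front: a raw partition image (dd, partimage, Clonezilla) of the Intel install cannot simply run on the ARM Cortex-A8, because the kernel and every binary are compiled for x86, so only data and configuration carry over cleanly. A hedged sketch of a file-level backup you could restore selectively on the ARM machine (paths are placeholders):
Code:
# archive the whole root filesystem, skipping pseudo-filesystems and mounts
sudo tar czpf /media/external/ubuntu-backup.tar.gz \
    --exclude=/proc --exclude=/sys --exclude=/dev \
    --exclude=/media --exclude=/mnt --exclude=/tmp /

# on the Netwalker, restore only what makes sense across architectures, e.g.:
# sudo tar xzpf ubuntu-backup.tar.gz -C / home/ etc/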
I have a couple of Lenny LAMP servers, and a backup server. (virtual testing environment)
1. What is the best way to perform a backup (system state as well as individual files)? System state could also be captured through the hypervisor.
2. Between Windows computers, I access shared directories simply by \\hostname\sharedmap or \\host_IP\sharedmap. Between Windows and Linux I use Samba. But there must be a simple way to copy 2 files between Linux hosts?
3. I've searched a lot and only found people with the same question without a good answer: is there a Linux equivalent of robocopy? (A sketch covering questions 2 and 3 follows this list.)
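For questions 2 and 3, the usual answers are scp for quick copies over ssh and rsync as the closest Linux equivalent of robocopy. A minimal sketch (hostnames and paths are placeholders):
Code:
# question 2: copy two files to another Linux host over ssh
scp file1.txt file2.txt user@backupserver:/var/backups/

# question 3: robocopy-style mirroring with rsync (keeps permissions, deletes removed files)
rsync -avz --delete /var/www/ user@backupserver:/var/backups/www/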
I wrote a script to wake up my Windows machine and do an rsync backup of some of my files. I wanted to make this command accessible through the local bin, so I made it executable. The problem is that when it copies files, it copies them with root permissions and I can't edit or delete them. How can I make the files transfer with the proper permissions for my Ubuntu user?
Code:
#!/bin/bash
# Description: This script first wakes up the client machine and syncs the appropriate folders.
# Finally the script shuts down the client if it was off to begin with.
if [ "$(whoami)" != "root" ]; then
    echo "Permission Denied"
    exit 1
fi
.....
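Since the script has to run as root, rsync writes the files as root on the destination. Two hedged options (the user name, source and destination paths below are assumptions): hand ownership back afterwards with chown, or, if your rsync is 3.1.0 or newer, set it during the transfer with --chown:
Code:
# after the rsync step, give the copied tree back to your normal user
chown -R youruser:youruser /home/youruser/backups

# or, with rsync >= 3.1.0, set ownership while copying
rsync -av --chown=youruser:youruser user@winbox:/cygdrive/c/data/ /home/youruser/backups/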
I'm having a bit of an issue with Debian Squeeze and transferring files to a Sony PSP. I hook the PSP up to a USB port and Debian mounts it. I drag a 125 MB mp4 to the video folder; the copy window takes about 10 seconds to transfer it. I exit USB mode and there is no video there. Going back into USB mode and looking at the video folder on the PSP memory stick, there is no video; it vanished. Another time, after the copy progress window closed, I right-clicked the PSP and unmounted it.
It errored, saying the device was busy and could not be unmounted. Looking at the light on the PSP, I could see the memory stick was still being written to, so I waited for the light to stop flashing, about a minute or so. Then I was able to unmount it, and going to the PSP's video folder, there was the video, ready to be watched. Debian isn't accurately showing the copy progress: it shows complete when it isn't, and I have to watch the light on the PSP to know when it is truly finished.
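The progress window only tracks the copy into the kernel's write cache, not the write to the memory stick itself. Forcing a flush before unmounting avoids the guessing; a sketch (the mount point and folder name are assumptions):
Code:
cp video.mp4 /media/psp/VIDEO/
sync                # returns only once cached writes have reached the device
umount /media/psp   # now it is safe to exit USB mode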
I am trying to transfer a file from my live Linux machine to a remote Linux machine. It is a mail server, and a single .tar.gz file includes all the data, but during the transfer it stops working. How can I troubleshoot the matter? Is there a better way than this to transfer a huge 14 GB file over a network/VPN/WAN link? The speed is 1 Mbps, and part of the file does get copied before it stops.
[root@sa1 logs_os_backup]# less remote.log
Wed Mar 10 09:12:01 AST 2010
building file list ... done
bkup_1.tar.gz
deflate on token returned 0 (87164 bytes left)
rsync error: error in rsync protocol data stream (code 12) at token.c(274)
building file list ... done
...
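For a single 14 GB file over a slow or flaky link, rsync over ssh with --partial (implied by -P) is usually the most forgiving option, because a dropped transfer can be resumed instead of restarted; a sketch with a placeholder host and destination:
Code:
rsync -avP --timeout=60 /logs_os_backup/bkup_1.tar.gz user@remotehost:/srv/backups/
# if the link drops, rerun the same command; the partial file on the far side
# is kept, so the transfer does not start again from zero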
I am trying to create a backup script that will back up a single folder for a class I am in, and I was wondering if I could get some help. If possible, I would also like to know how to write a script that can encrypt that same file. I will be putting the backup in my /home/usr/Backup directory. I am not trying to back up my whole system, just a single folder. I am using Fedora 11.
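A minimal sketch of the kind of script you could start from (the folder name is a placeholder, and gpg will prompt for a passphrase when run interactively):
Code:
#!/bin/bash
# back up one folder into ~/Backup and encrypt it with a symmetric passphrase
SRC="$HOME/Documents/classwork"
DEST="$HOME/Backup"
STAMP=$(date +%Y%m%d)

mkdir -p "$DEST"
tar czf "$DEST/classwork-$STAMP.tar.gz" "$SRC"
gpg --symmetric "$DEST/classwork-$STAMP.tar.gz"   # writes classwork-$STAMP.tar.gz.gpg
rm "$DEST/classwork-$STAMP.tar.gz"                # keep only the encrypted copy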
I'm just setting up a partition on a separate HDD in my system. I plan to use the partition to back up the important files on my main HDD (to guard against a disk crash).
The question I have is about the typical location to auto-mount this partition. Which would be the normal choice:
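Either /mnt/backup or /media/backup is a perfectly normal choice; what matters more is an fstab entry so it mounts at boot. A sketch (the device name and filesystem are assumptions; check yours with blkid):
Code:
# /etc/fstab
/dev/sdb1   /mnt/backup   ext4   defaults   0   2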
I'm using Slackware 13.0, and whenever I transfer files to USB (copy, or cut and paste), it shows as if the file transferred in an instant: I click paste and, poof, the whole 1 GB file is "transferred" in one second, with no dialog box showing transfer progress. I then have to guess and wait some minutes until the transfer actually finishes (I usually have to watch my USB stick's activity light); if I unplug before that, or my guess is wrong, I end up with corrupted files. This isn't related to the window manager, as it's the same under KDE and Xfce; I've tried Thunar, Konqueror, and Midnight Commander, and all of them have the same problem.
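This is the same write-cache behaviour as with any removable drive: the file manager reports the copy into RAM as done long before the data reaches the flash. A hedged workaround from a terminal (the mount point is a placeholder):
Code:
cp bigfile.tar /media/usbstick/
sync                       # blocks until cached writes hit the stick
grep Dirty /proc/meminfo   # optional: watch unwritten cache shrink toward zero
umount /media/usbstick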
I have Puppy Linux on a 512 MB USB flash drive. I would like to put all of it onto a different USB flash drive with more capacity. I used Windows Vista and copied all of the files from my first USB stick to my second, but when I tried to boot it, it did not work.
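Copying the files alone doesn't make the new stick bootable; the boot loader and an active FAT partition have to be set up as well. A rough sketch using syslinux (the device name is an assumption, and the mbr.bin path varies by distro; double-check the device before writing to the wrong disk):
Code:
# after copying the Puppy files onto the new stick's FAT partition (here /dev/sdb1):
sudo syslinux /dev/sdb1                                          # install the boot loader
sudo dd if=/usr/lib/syslinux/mbr.bin of=/dev/sdb bs=440 count=1  # write boot code to the MBR
sudo parted /dev/sdb set 1 boot on                               # mark the partition active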
I'm using Back In Time to back up my home directory to a second HDD that is mounted at /media/backup. The trouble is, I can do this using Back In Time (Root), but not using Back In Time without the root option. This is definitely a permissions issue (it can't write to the folder), but when I checked by right-clicking on the backup directory and looking at the permissions tab, it said I was the owner.
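Owning the top-level directory isn't always enough: the mount point itself, or a subdirectory created by an earlier root-mode run, can still belong to root. A quick hedged check and fix from a terminal:
Code:
ls -ld /media/backup /media/backup/*      # look for entries owned by root
sudo chown -R $USER:$USER /media/backup   # take ownership of the whole tree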
After I spent some time discovering The BIG BANG of the Universe and The Meaning of Life:
I managed somehow to create a script that makes a backup of files on the server, tars it, FTPs the archive to another FTP server, and then emails the result.
It also measures the time needed to complete, deletes archives older than XX days (set with find -mtime +20), and makes an incremental backup every weekday and a FULL backup on Sundays (which suits me because there's no heavy load then).
The files for tar to include and exclude are listed in text files, one name per line:
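For reference, the core of such a script tends to look roughly like the sketch below (the paths, FTP host and credentials are placeholders; the include/exclude files are the per-line lists mentioned above):
Code:
#!/bin/bash
# sketch: tar with include/exclude lists, 20-day cleanup, FTP upload
DATE=$(date +%Y%m%d)
ARCHIVE=/backups/site-$DATE.tar.gz

tar czf "$ARCHIVE" -T /etc/backup/include.txt -X /etc/backup/exclude.txt
# (incremental runs could add --listed-incremental=/etc/backup/snapshot.snar)

find /backups -name 'site-*.tar.gz' -mtime +20 -delete
curl -T "$ARCHIVE" ftp://ftp.example.com/backups/ --user backup:secret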
This script simply deletes files older than a certain age (in this case 7 days) from a certain location; I use it to purge old backups nightly, and it works as expected:
# delete backups older than 7 days
find /mnt/backup/* -mtime +7 -exec rm -Rf {} \;
The problem is, every morning I get an email with an error message something like this:
find: `/mnt/backup/subfolder': No such file or directory
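The message appears because find is still walking the tree while rm -Rf is deleting directories underneath it: find lists a subfolder, the rm removes it, and the later visit fails. Limiting find to the top level avoids the race; a sketch:
Code:
# delete top-level entries older than 7 days without descending into them
find /mnt/backup -mindepth 1 -maxdepth 1 -mtime +7 -exec rm -Rf {} \;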
I have a scheduled backup that runs on our server at work, and since 7/12/09 it has been making 592 KB files instead of 10 MB files. In mysql-admin (the GUI tool) I have a stored connection for the user 'backup'; the user has SELECT and LOCK rights on the databases being backed up. I have a backup profile called 'backup_regular', and in the third tab along it is scheduled to back up at 2 in the morning every week day. If I look at one of the small backup files generated, I see the following:
Code:
-- MySQL Administrator dump 1.4
--
-- ------------------------------------------------------
-- Server version
[code]....
It seems that MySQL can open and write to the file fine, it just can't complete the dump.
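If the GUI scheduler keeps producing truncated files, a hedged fallback is to run mysqldump straight from cron with the same 'backup' user (credentials, the cron user, and paths below are placeholders):
Code:
# /etc/cron.d/mysql-backup -- 02:00 on weekdays; --single-transaction avoids long locks on InnoDB
0 2 * * 1-5 backup mysqldump -u backup -pSECRET --all-databases --single-transaction | gzip > /var/backups/mysql-$(date +\%F).sql.gz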
I've tried to Google but not had much luck. What I would like to do is have a number of folders on my desktop, and their contents, replicated/duplicated into another folder on the same PC in real time. So, for example, if I were to change an OpenOffice document in a specific folder on my Desktop, it would be replicated/duplicated in real time. If I had three folders on my Desktop, A, B and C, they would also be backed up (in real time) into a folder called /home/backup. Can this be done?
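Yes; the usual tools are lsyncd, or a small inotifywait loop driving rsync (from the inotify-tools package). A minimal sketch using the folders from your example (assumes the folder paths contain no spaces):
Code:
#!/bin/bash
# mirror three Desktop folders into /home/backup whenever something changes
SRC="$HOME/Desktop/A $HOME/Desktop/B $HOME/Desktop/C"
DEST=/home/backup

while inotifywait -r -e modify,create,delete,move $SRC; do
    rsync -a --delete $SRC "$DEST/"
done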
I have machine A with a public IP address (addr_a) and machine B within a LAN with a private IP address (addr_b); the router of the LAN has a public IP address (addr_r). If I log into machine A by ssh from machine B, how can I use scp to copy files from machine A to machine B?
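Because machine B sits behind NAT, machine A cannot open a connection back to it, but B can always reach A's public address, so the simplest option is to run scp on machine B and pull (paths are placeholders):
Code:
# run this on machine B
scp user@addr_a:/path/on/A/file.tar.gz /local/destination/

# rsync over ssh works the same way and can resume interrupted transfers:
# rsync -avP user@addr_a:/path/on/A/ /local/destination/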
I copied an XP disc to my Debian computer; now the CD is cracked and I want to put it on a virtual machine. I built the computer with:
AMD Phenom II 1090T X6 processor, ASUS motherboard, 10 GB RAM, 1 TB SATA HD
The OS is Debian Squeeze; I actually used a 40 GB PATA HD for testing purposes. Anyway, I want to install XP on a VM. Do I just use ssh or VNC to transfer it? Also, when I copied it to a file I'm not sure whether any components are missing. It is a registered copy and I have the key. I would have copied the XP hard drive, but it's corrupt.
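For the transfer itself, ssh is enough; VNC only gives you a remote desktop, not a file copy. A hedged sketch, assuming what you copied from the CD is (or can be made into) a single image file, with placeholder paths:
Code:
# copy the image to the new machine over ssh
scp /path/to/winxp.iso user@newmachine:/var/lib/libvirt/images/

# note: if you only copied the CD's files rather than a dd/ISO image, the disc's
# boot sector is missing and the copy will not boot the XP installer in a VM;
# you would need a real image of the original CD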
Does anyone know of good backup software for Ubuntu 10.04 that will let me select which folders to back up, rather than doing a complete backup? My install and settings etc. can be replaced, but my photos and memories cannot!
Does anyone know of any decent enterprise level backup solutions for Linux? I need to backup a few servers and a bunch of desktops onto one backup server. Using rsync/tar.gz won't cut it. I need like bi-monthly full HDD backups, and things such as that, with a nice GUI interface to add/remove systems from the backup list. I need basically something similar to CommVault or Veritas. Veritas I've used before but it has its issues, such as leaving 30GB cache files. CommVault, I have no idea how much it is, and if it supports backing up to a hard drive rather than tape.
I am now preparing to upgrade Lenny to Squeeze and decided to do a backup of my system. I used backup-manager to do the job and it worked fine. How do you restore said backup data?
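By default backup-manager just writes dated tarballs into its repository directory (normally /var/archives, unless you changed BM_REPOSITORY_ROOT), so restoring is plain tar extraction; a hedged sketch with a placeholder archive name:
Code:
ls /var/archives/                          # pick the archive you need
sudo mkdir -p /tmp/restore
sudo tar xzf /var/archives/myhost-etc.20100101.tar.gz -C /tmp/restore   # inspect, then copy back what you need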
I have an old laptop (PIII 800 MHz, with 256 MB RAM) that I wish to use as my home server: it'll have to serve just two people, so I think I'll be more than OK as far as RAM and CPU go. The issue is about data, because the internal hard disk is 12 GB, which is... ridiculous! I have more than 60 GB of mixed storage and counting (images, videos and music) on an external USB HD. I could put the HD in my desktop PC just to serve the big files over Ethernet, or leave it inside its USB enclosure attached to the laptop.
The question is: which of these solutions will be the fastest? USB 1.0 attached to the server (laptop), or the internal hard disk serving files via 10/100 Ethernet to the laptop on demand? What is your experience? Is the difference noticeable on a human scale?
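A rough back-of-the-envelope comparison, for what it's worth: USB 1.x full speed tops out at 12 Mbit/s, which in practice means roughly 1 MB/s, while 10/100 Ethernet reading from an internal disk typically manages around 8 to 11 MB/s. So serving the files from the desktop's internal disk over the network should be close to an order of magnitude faster than anything hanging off the laptop's USB 1.x port, and with 60 GB of media that difference is very noticeable.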
I want to transfer files (a music folder) between two Linux computers. After searching for the best way to do this, I've seen that there are lots of ways of doing this. I know this has been asked a lot, everywhere and all the time. The main problem with this is that there is no clear, recent consensus on one best way to do this task in 2011 for Linux beginners (even depending on some parameters).
So in the spirit of the Stack Exchange websites, I want this not to be related to my particular situation, but more of a guide to others as well on how to transfer files between two Linux computers over a local network. I think a wiki would be useful for many.
Here's what I found so far:
ssh, sshfs, scp, sftp, NFS, Samba, Giver
What is the easiest? Most flexible? Simplest? Best solution? What are the pros and cons of each? Are there other (better) options? What are the parameters in choosing the best method (the solution might depend on the number of files, file size, ease of use vs. flexibility, ...)?
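To seed the comparison, here are the two commands most answers converge on, as minimal hedged examples (hostnames and paths are placeholders):
Code:
# one-off copy of a folder to another machine over ssh
scp -r ~/Music user@otherhost:~/

# repeatable sync over ssh; only changed files are re-sent on later runs
rsync -avz ~/Music/ user@otherhost:~/Music/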