I have a machine running Karmic sitting in the corner of my office. I scp stuff from my Mac to it quite a lot over my local wireless network. At the moment, it holds a .sparsebundle backup of my Mac created by SuperDuper! (a Mac backup application) that I periodically transfer over. I'd like to make this system a little more sophisticated and use Apple's Time Machine to automatically back up to my Ubuntu box if possible.
Does anyone have any experience with this? I'm kinda hoping for something a bit more straightforward than this: http:[url]....
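For reference, the usual recipe on Ubuntu is netatalk (AFP) plus avahi, so the share advertises itself to Time Machine automatically. A minimal sketch, assuming the netatalk 2.x `AppleVolumes.default` syntax of that era (the path, share name, and user are placeholders; double-check the `options:tm` flag against your installed version). It is written to a demo path here so it is safe to run as-is:

```shell
# Sketch: mark a directory as a Time Machine-capable netatalk volume.
# Demo path only; the real file is /etc/netatalk/AppleVolumes.default.
conf=/tmp/AppleVolumes.default.demo
cat > "$conf" <<'EOF'
# volume path, display name, and the Time Machine flag (netatalk 2.x syntax)
/srv/timemachine "TimeMachine" options:tm allow:youruser
EOF
echo "wrote $conf"
```

After editing the real file you would restart netatalk and make sure avahi-daemon is running, so the volume shows up in Time Machine's destination list.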
I want to set up a file server for home. I'll probably use the following: Ebox, Firefly Media Server, and TorrentFlux. I'm still not sure how to go about redundancy for the store, possibly software RAID 5. ZFS would be cool, but the Linux support through FUSE and the large memory requirements are a bit of a turn-off. I'm also not sure how to go about supporting Time Machine for my new MacBook Pro. I've seen a couple of old guides on the web discussing Netatalk and Avahi, but nothing new.
I'm thinking I'll probably use a spare 2.5" 80GB SATA drive for the OS, and then probably 4x 2TB HDDs (for 6TB of usable storage in RAID 5). I'm not sure what mobo and CPU to use yet. I was originally thinking of going with an Atom-based Mini-ITX mobo, but getting one with 4x SATA ports is hard enough, let alone 5-6x (1 for the OS and 4 for data), so I'm thinking maybe I should get a low-power Intel or AMD Micro-ATX mobo with 6x SATA. I'm also thinking I'll set up CrashPlan for online backup from the server (plus directly from my MacBook Pro and my desktop).
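As a sanity check on the RAID 5 math (usable space is one disk's worth less than the total), and a sketch of the mdadm command that would build the array. The `/dev/sd[b-e]` names are assumptions for this build, so the destructive line is left commented out:

```shell
# RAID 5 usable capacity = (number of disks - 1) * disk size
disks=4
size_tb=2
usable=$(( (disks - 1) * size_tb ))
echo "RAID5 with ${disks}x ${size_tb}TB drives -> ${usable}TB usable"

# The array itself would be created roughly like this (COMMENTED OUT:
# destructive, and the device names are assumed):
# mdadm --create /dev/md0 --level=5 --raid-devices=4 \
#     /dev/sdb /dev/sdc /dev/sdd /dev/sde
```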
I have a situation where a directory has about 1.5 million files in it. On an hourly basis, I want to be able to find any files that have changed in the last hour, compress them, encrypt them and then copy them to both a local backup machine and an off site backup.
Is there any kind of utility or kernel module that keeps a log of modified files? I know I can use find, but a -mtime search over this directory takes quite a while and will not suffice for an hourly backup.
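The kernel-level answer is usually inotify (e.g. inotifywait from the inotify-tools package, run as a daemon appending to a log), which avoids scanning the directory at all. Short of that, a marker-file approach with `find -newer` at least replaces the -mtime arithmetic with a simple timestamp comparison against the last run. A sketch with placeholder paths:

```shell
# Find files changed since the last run, using a marker file instead of -mtime.
dir=/tmp/bigdir.demo       # stands in for the 1.5M-file directory
marker=/tmp/bigdir.last-run
mkdir -p "$dir"
touch "$marker"            # pretend the previous hourly run just finished
sleep 1
echo changed > "$dir/new-file"   # a file modified after the marker
# List everything newer than the marker, NUL-separated so names with spaces
# survive the pipe into tar/gpg/rsync later:
find "$dir" -type f -newer "$marker" -print0 | xargs -0 -r ls -l
```

The hourly job would run the find, feed the results to tar and gpg, and only then `touch` the marker, so nothing modified mid-run is missed.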
All of a sudden, my computer feels sluggish. The mouse moves, but windows take ages to open, etc. uptime says the load is 7.69 and rising. What is the fastest way to find out which process(es) are the cause of the load? "top" and similar tools aren't the answer, because they show either CPU or memory usage but not both at the same time. What I need is a single command which I might be able to type as it happens - something that will figure out any of: "System is trying to swap 8GB of RAM to disk because process X ..." or "process X seeks all over the disk" or "process X uses 400% CPU".
So what I'm looking for is iostat, htop/atop and similar tools rolled into one, with output like this:
1235 cp - disk thrashing
87 chrome - uses 2GB of RAM
137 nfs_bench - uses 95% of the network bandwidth
I don't want a tool that gives me some numbers which I can analyze but a tool that tells me exactly which process causes the current load. Assume that the user in front of the keyboard barely knows how to write "process" but is quickly overwhelmed when it comes to "resident size", "virtual memory" or "process life cycle".
My argument goes like this: User notices problem. There can be thousands of reasons ... well, almost. User wants to know source of problem. The current solutions give me lots of numbers and I need to know what these numbers mean. What I'm looking for is a meta tool. 99% of the data is irrelevant to the problem. So what the tool should do is look for processes which hog some resource and list only those along with "this process needs a lot of CPU, this produces many IRQs, this process allocates a lot of RAM (and it's still growing)".
This will be a relatively short list. It will be much simpler for a newbie to locate the culprit from this list than from the output of, say, htop, which gives me about 5000 numbers but requires me to fold multi-threaded processes myself (I have 50 lines which say VIRT 2750M but only 16GB of RAM - the machine ought to swap itself to death, but of course this is a misinterpretation of the data that is easy to make).
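Short of such a meta tool, the closest stock answer is a pair of sorted ps listings (which sum per process, not per thread, avoiding exactly the VIRT misreading above), plus iotop for disk if it is installed. A quick triage sketch:

```shell
# Top 5 CPU hogs and top 5 memory hogs, one line per process:
echo "== CPU =="
ps -eo pid,comm,%cpu --sort=-%cpu | head -n 6
echo "== RAM =="
ps -eo pid,comm,%mem,rss --sort=-rss | head -n 6
# Disk and swap pressure, if the tools are present on this box:
command -v iotop  >/dev/null && echo "for disk thrashing, try: sudo iotop -o"
command -v vmstat >/dev/null && vmstat 1 2 | tail -n 1
```

The `rss` column is resident memory in KB; in vmstat's last line, nonzero `si`/`so` columns mean the box is actively swapping.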
I'm using Back In Time to back up my home directory to a second HDD that is mounted at /media/backup. The trouble is, I can do this using Back In Time (Root), but not using Back In Time without the root option. This is definitely a permissions issue - it can't write to the folder - but when I checked by right-clicking on the backup directory and looking at the Permissions tab, it said I was the owner.
I've tried to Google but haven't had much luck. What I would like to do is have a number of folders on my desktop, and their contents, replicated/duplicated into another folder on the same PC in real time. So, for example, if I were to change an OpenOffice document in a specific folder on my Desktop, it would be replicated/duplicated in real time. If I had three folders on my Desktop - A, B and C - they would also appear/be backed up (in real time) in a folder called /home/backup. Can this be done?
I am looking for a web-based, real-time, iftop-like tool for Linux. I mean it should show the current active connections on a NIC for any client connected to it. I do not want offline data; I want real-time data for current connections, viewable on the web.
I have a network of 100 machines, all running Ubuntu Linux. Is there a limit to the number of machines that can connect to one single machine (at the same time)? For example, can I have 99 of my machines maintain continuous SSH connections to the 100th machine? Can I have every one of my machines (all 100) maintain a continuous SSH connection to each of the other 99 machines? How much memory does each such connection take?
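For scale: there is no hard protocol limit at 99. The practical ceilings are sshd's MaxStartups/MaxSessions settings, file descriptors, and memory - each sshd child process typically costs a few MB of resident memory. A sketch for measuring both on a given host (numbers will vary):

```shell
# Count established SSH connections on this host (ss is from iproute2):
if command -v ss >/dev/null; then
    conns=$(ss -tn state established '( sport = :22 or dport = :22 )' | tail -n +2 | wc -l)
else
    conns=0
fi
echo "established ssh connections: $conns"
# Resident memory (KB) of each sshd process - sum the rss column for the total:
ps -C sshd -o pid,rss,comm 2>/dev/null || echo "no sshd running here"
```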
I am running Live 12 from the CD-ROM drive of my dying laptop. I have a major Windows registry error on that system and am working to recover my files. I have successfully moved a couple of folders from the laptop to my Seagate Free Agent drive as a test. What I would like to know is: is there a way to copy my files and folders without literally dragging and dropping each one? We're talking 140 GB of folders... sigh.
I'm running Ubuntu 9.04 64-bit server and am looking to back up my entire OS drive. I've got a 200GB main drive, and a 1TB storage drive mounted at /storage. I'm already set as far as backups of my data go - but redoing all of my settings and software would be a nightmare in the event of an HD failure.
So what I'm looking for is a command-line utility to make an image of the main 200GB drive onto an external USB drive. The software needs to function similarly to the Windows Vista/7 System Image utility or DriveImage XML, and be able to make the images without shutting down. The best I've found so far was [URL], but it uses a GUI and doesn't support large files.
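The classic command-line tool here is dd (or partclone/fsarchiver if you want used-blocks-only images), with the caveat that imaging a mounted, live filesystem can capture an inconsistent state - an LVM snapshot or a quiet system helps. A dd sketch, demonstrated on an ordinary file so it is safe to run as-is; for the real thing the input would be /dev/sda and the output a file on the USB drive:

```shell
# Image a "disk" (a small file standing in for /dev/sda) to an image file:
src=/tmp/fake-disk.img
img=/tmp/backup.img
dd if=/dev/zero of="$src" bs=1M count=4 2>/dev/null   # fabricate a 4MB source
dd if="$src" of="$img" bs=4M conv=sync,noerror 2>/dev/null
# Verify the copy matches the source:
cmp "$src" "$img" && echo "image matches source"
# Real-world variant, compressed: dd if=/dev/sda bs=4M | gzip > /mnt/usb/sda.img.gz
```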
I want to know the MAC address of a particular IP. The problem is that I am unable to ping that IP, but I know from my proxy logs that it is being used by someone on my local network. How can I find the MAC address for that IP?
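Even when a host drops ICMP pings, ARP still works on the local segment, because the kernel must resolve the MAC before it can deliver any frame at all. A sketch (192.168.1.50 stands in for the address from the proxy logs; arping and arp-scan may need installing and root):

```shell
ip=192.168.1.50   # placeholder: the address seen in the proxy logs
# Force an ARP exchange even if ICMP is filtered (commented: needs root/eth name):
# sudo arping -c 3 -I eth0 "$ip"
# Then read the kernel's ARP cache for that address:
command -v ip >/dev/null && ip neigh show "$ip" || arp -n "$ip"
# Or sweep the whole segment at once: sudo arp-scan --localnet
```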
I am looking for software that would allow me to do an automatic installation on a machine with some predetermined RPMs and partitioning. To start, I looked at the package mkcd, but it is quite complicated to use and I cannot quite understand what to do. Looking further, I found Bcd, which creates an ISO from a predefined XML file; it looks easier to use, but I do not know if it can do exactly what I want: a bootable CD that installs Linux automatically with extra RPMs.
I found other software that might match my search, such as Kickstart, but it is made for Red Hat and rather poor in tutorials; there is nothing familiar in the field. There is also "Fully Automatic Installation" [URL], which at first glance might work with Fedora and others.
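For what it's worth, a Kickstart file is just a short text file; the intimidating part is mostly the documentation. A minimal sketch, written to a demo path (package names and the partitioning directives are illustrative; the real file would be placed on the CD and passed to the installer with `ks=cdrom:/ks.cfg`):

```shell
# Write a minimal Kickstart file (demo path only):
cat > /tmp/ks.cfg.demo <<'EOF'
install
lang en_US.UTF-8
rootpw --plaintext changeme
clearpart --all --initlabel
# automatic partitioning; replace with explicit "part" lines if preferred
autopart
%packages
@core
# example extra rpm
vim-enhanced
%end
EOF
echo "wrote /tmp/ks.cfg.demo"
```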
I'm trying to use this tutorial (URL...) to back up my Ubuntu 10.10 (ext3) operating system. I've successfully gotten it into a TAR file on my external hard drive, but inside the archive are 2 folders: sda5 and media (/media/sda5/), sda5 of course containing my operating system.
I run VMware as my virtual machine software, but I could easily run VirtualBox if the situation calls for it. On my virtual machine I created an extended partition, made a 2GB swap space, and the rest is ext3 (the ext3 space is mounted as sda1; at this point, swap is OFF). Here is what I've tried inside the virtual machine to restore:
Code:
sudo -s
mkdir /media/FROM/
mkdir /media/TO/
mount /dev/sdb1 /media/FROM/
But instead it just creates the directory sda5 under /media of the live Ubuntu CD. Do I need to chroot in these conditions? AFTER I get the files successfully into the virtual machine, how do I go about restoring the grub2 bootloader? Right now I haven't tried to restore grub on my hardware, but I would be interested in doing so. There is a vast amount of forum posts about this subject, but all with mixed results. Can anybody tell me the absolute, definite way to restore grub2 successfully? I don't want to try something if it's going to mess up my install, whether I've backed up or not.
For further reference, here is a link to the previous (failed) thread I made about this same subject:
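On the grub2 question: the widely used recovery procedure from a live CD is to mount the restored root, bind-mount the virtual filesystems, chroot in, and reinstall grub. A sketch wrapped in a function so nothing runs by accident when pasted - /dev/sda1 (restored root) and /dev/sda (boot disk) are assumptions for one particular layout and should be confirmed with blkid first:

```shell
# Reinstall grub2 into the MBR from a live CD, after restoring the tar backup.
# Defined only; call restore_grub manually once the device names are confirmed.
restore_grub() {
    mount /dev/sda1 /mnt
    for fs in dev proc sys; do mount --bind "/$fs" "/mnt/$fs"; done
    chroot /mnt grub-install /dev/sda   # write grub to the disk's MBR
    chroot /mnt update-grub             # regenerate grub.cfg
    for fs in sys proc dev; do umount "/mnt/$fs"; done
    umount /mnt
}
echo "restore_grub defined; check device names with blkid before calling it"
```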
I have an old PC currently running Ubuntu Server 9.10. It was configured during install to connect to the home wifi router through a PCI Ethernet card, which worked all well and good. However, at the moment I cannot connect to the router (I have moved the machine too far from it). I want to connect my desktop machine directly to the server so I can SSH into the box and back up some files. I need help creating a simple wired network connection between the two, as I have no clue where to start.
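For a direct machine-to-machine cable (modern NICs do auto-MDIX, so a plain patch cable usually works; older ones may need a crossover cable), you just give each end a static address in the same private subnet. A sketch wrapped in a function so it does not touch a live interface when pasted - eth0 and the 192.168.50.x addresses are assumptions:

```shell
# Bring up a point-to-point wired link; run one variant on each machine:
#   desktop: setup_link 192.168.50.1     server: setup_link 192.168.50.2
setup_link() {
    addr=$1
    sudo ip link set eth0 up
    sudo ip addr add "$addr/24" dev eth0
    # 9.10-era equivalent without iproute2:
    # sudo ifconfig eth0 "$addr" netmask 255.255.255.0 up
}
echo "after both ends are up: ssh user@192.168.50.2 from the desktop"
```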
I have two desktop machines, each with Ubuntu 10.10. I want to back up data on my main machine to a partition on the other machine. I am going to use luckybackup for that. The problem is, from the main machine I cannot access the backup partition on the other machine (but I can see it). I get prompts for passwords, and when I enter them nothing happens; it seems they're not valid. Bottom line, I can't access that partition.
How many of you guys use Back In Time as your backup utility? I tried using it, and it doesn't copy all of the folder contents to the backup drive in one pass. For example, it will copy 26 out of 80-ish gigs of data. To complete the backup, I have to hit the "Take a snapshot" button to do another pass and add more data to the snapshots. I have to do this a couple of times to get all the data. Does anyone else have this issue?
[UPDATE] It appears to copy all of the files at once, so long as you only select one backup location at a time. I was backing up an entire multimedia drive, my home directory, and my usb drive. When I had it set to only do the multimedia drive, it copied all of the files, whereas it wouldn't if I had set it up to back up all 3 locations at the same time. I guess the lesson here is to backup one location, then add another, get another snapshot, and repeat.
I have a question regarding RAID; I'm new to the whole RAID thing. Okay, my question: I have a machine with RAID 1, and I'd like to upgrade the two disks in this machine. BUT, in case my backup image does not work, is it possible to power the machine off, re-insert the original disks, and be rocking and rolling without a problem?
I am trying to create a backup script that will back up a single folder for a class I am in, and I was wondering if I could get some help. If possible, I would also like to know how to write a script that can encrypt that same file. I will be putting the backup in my /home/usr/Backup directory. I am not trying to back up my whole system, just a single folder. I am using Fedora 11.
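A sketch of such a script: tar the folder into a dated archive, then encrypt the archive with gpg's symmetric mode. The folder path and passphrase are placeholders (in real use, let gpg prompt for the passphrase rather than hard-coding it), and demo paths are used so the sketch is safe to run:

```shell
#!/bin/sh
# Back up one folder as a dated, compressed, encrypted archive.
src=/tmp/classwork.demo     # placeholder for the folder to back up
dest=/tmp/Backup.demo       # placeholder for the Backup directory
mkdir -p "$src" "$dest"
echo notes > "$src/notes.txt"
stamp=$(date +%Y-%m-%d)
archive="$dest/classwork-$stamp.tar.gz"
tar czf "$archive" -C "$(dirname "$src")" "$(basename "$src")"
# Encrypt if gpg is available (produces $archive.gpg; then delete the plain copy):
if command -v gpg >/dev/null; then
    gpg --batch --yes --symmetric --passphrase demo-only "$archive"
fi
ls -l "$dest"
```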
I have a CentOS 5 server with a 1TB hard drive. There is only 80GB of data on that huge drive, and now I want to make a bare-metal recovery backup using Acronis. My question is: how can I estimate the amount of time the backup will take and the size of the image file? Is it based on the size of my drive or on the amount of data on the drive?
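Generally, imaging tools that understand the filesystem copy only used blocks, so both the image size and the time scale with the 80GB of data, not the 1TB of disk (a raw sector-by-sector image would be the full 1TB). A back-of-the-envelope time estimate, with the throughput as an assumption you can measure yourself with a test copy:

```shell
# Estimate backup time from data size and sustained throughput (both assumed):
data_gb=80       # used data, not disk size
mb_per_sec=60    # measure yours with a large test copy
minutes=$(( data_gb * 1024 / mb_per_sec / 60 ))
echo "~${minutes} minutes for ${data_gb}GB at ${mb_per_sec}MB/s"
```

Compression shrinks the image further, by an amount that depends entirely on the data.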
I want to back up the data on my MacBook Air using Time Machine. I have a desktop running Debian with GNOME where I want to store my backup data, but Time Machine can't find a drive on it to back up to.
I have four hard drives installed in my Debian computer and I also want to share them over my home network. I am very new to Debian ...
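For a mixed home network, the standard answer is Samba: one `[share]` block per mount point in /etc/samba/smb.conf. A minimal sketch written to a demo path so it is safe to run; the share name, mount point, and user are placeholders:

```shell
# Minimal Samba share definition (demo path; the real file is
# /etc/samba/smb.conf, followed by:
#   sudo smbpasswd -a youruser && sudo systemctl restart smbd):
cat > /tmp/smb-share.demo.conf <<'EOF'
[media1]
   path = /mnt/disk1
   read only = no
   valid users = youruser
EOF
echo "wrote /tmp/smb-share.demo.conf"
```

Repeat the block with a different name and path for each of the four drives.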