Ubuntu :: Back Up A Large Directory (13 GB) To DVD?
Jan 21, 2010
If I wanted to back up a large directory (13 GB) to DVD, what would be the best way to do this? Basically, what is the easiest way to make an archive that is split into volumes small enough to burn to disc?
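The standard trick is to stream tar through split, so the archive is cut into DVD-sized pieces as it is written. A minimal sketch, with a placeholder path:
Code:
# write 4.3 GB pieces named backup.tar.gz.aa, .ab, ... ready for burning
tar czf - /path/to/directory | split -b 4300m - backup.tar.gz.
# to restore, join the pieces and untar:
cat backup.tar.gz.* | tar xzf -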
I'm running a RHEL 5 server. The ABC directory is 57 GB, but after backing it up on the same disk as ABC.bkp, the copy shows 56 GB. I used the command below to copy/back up: # cp -r ABC ABC.bkp (different sizes after copying). I checked both directory sizes with # du -sh ABC and # du -ks ABC.bkp, and in both GB and KB there is a big difference (about 200 MB). Why does this happen when copying? What is the solution, and what is the correct way to copy one directory to a new directory exactly?
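A difference like that usually comes from sparse files, hard links, or different block allocation, none of which plain cp -r preserves, while du counts allocated blocks rather than file contents. A sketch of a more faithful copy and a fairer comparison, using the directory names from the post:
Code:
# -a keeps permissions/owners/times, -H keeps hard links, -S re-creates sparse files
rsync -aHS ABC/ ABC.bkp/
# compare logical file sizes instead of allocated blocks
du -sh --apparent-size ABC ABC.bkp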
This thread was nearly titled "The volume Filesystem root has only 128 KB free space remaining", then I discovered the cause: my Encrypted Private Directory had grown to 20 GB, eating all the free space on my Ubuntu system partition. Here's what happened: all was well with my system last night; I left it downloading 2 GB of files from the internet to an NTFS drive, and returned to low-space errors this morning. I checked, and nothing had been downloaded to my Ubuntu partition; even if it had, it could have handled the 2 GB without issue. I did some reading on here, and the first step I tried found the problem:
Code:
mark@media:~$ df -h
Filesystem            Size  Used Avail Use% Mounted on
I'm having a bit of an issue with Lucid installed via Wubi. I stuck the OS on its own partition (30 GB in size), and don't store any large files in the Ubuntu file system (when I download something large I move it to another hard drive). I don't have anything wacky or esoteric installed on my system.
I've been consistently having a problem where, after a few hours or a few days of being booted up, Ubuntu warns me that my available HD space is dangerously small. The amount of available HD space Ubuntu sees then shrinks from a few GB to nothing within a few minutes, and the only way I can seem to solve this is to reboot. Taking a closer look at what's happening, my Home folder balloons in size until there's no more writable space recognized. But no files are being created or appended to, so it looks like there's a bug of some sort. This SEEMS to be correlated with watching videos (or maybe it's the pulling of large files from a mounted directory into RAM? My videos are all on another HD, as mentioned before). I can generally go a few days without getting the "low space" message, but I can't seem to make it through a full 2-hour movie without getting the error.
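One classic cause that matches these symptoms — space that only comes back after a reboot — is a process holding files open after they have been deleted; the kernel cannot reclaim the blocks until that process exits. A quick diagnostic sketch for the next time it happens:
Code:
# does the filesystem think more is used than du can actually find?
df -h /home
sudo du -sh /home
# list deleted files that running processes still hold open
sudo lsof +L1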
I have a directory containing around 2.8 lakh (280,000) files. I want to move them to another directory. If I use cp or mv, I get the error 'argument list too long'. If I write a script like
Code:
for file in $(ls); do cp "$file" /path/to/destination/; done
then, because of the ls command, its performance degrades. How can I do this?
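find sidesteps both problems: it streams directory entries instead of expanding them all into one argument list. A minimal sketch, with placeholder paths:
Code:
# move regular files in batches; find never builds one giant argument list
find /source -maxdepth 1 -type f -exec mv -t /destination {} +
# equivalent with xargs, safe for any filename:
find /source -maxdepth 1 -type f -print0 | xargs -0 mv -t /destination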
I'm trying to crawl a directory on a website and basically download everything in it. The structure is simple enough (though there are also multiple folders), but there is one thing that makes wget choke: both of the links work, but they are both the same thing, so wget will download the same file twice. How can I make wget ignore the first one? What I tried doesn't seem to actually do anything; it still downloads the duplicate URLs.
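Without seeing the two links it's hard to be specific, but the most common duplicate in a recursive fetch is the auto-generated listing appearing both as the directory itself and as index.html. A sketch of filtering one form out (the URL is a placeholder):
Code:
# -np stays below the start directory; --reject drops the generated index pages
wget -r -np --reject "index.html*" http://example.com/files/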
I am currently trying to copy a directory of roughly 400 GB to DVD, and have gotten myself stuck. I tried to tar and then split; however, I don't have enough room on my hard drive to make a compressed tar, split it up, and then burn to disc, so I need a way to tar and compress the directory, split it, and burn to disc every 4.3 GB.
I went ahead and installed DAR as an alternative, as I hear it is designed for this type of task, but I can't make heads or tails of it.
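dar does handle exactly this: it writes fixed-size slices and can pause after each one, so a slice can be burned and deleted before the next is written. A minimal sketch, assuming 4.3 GB DVD slices (the basename and path are placeholders):
Code:
# -c create, -R root dir, -s slice size, -z gzip compression, -p pause between slices
dar -c /tmp/backup -R /path/to/directory -s 4300M -z -p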
I know it is possible to do... but I am not sure how to go about the whole thing. Here's the scenario. I run a lab. Lots of PCs. As time goes by, the older ones don't have the memory or disk space to run more modern apps. But I want to put them to use...
What I am trying to do, and have started, is the following: 1. Install Linux on a bunch of them and make a share on each of these. I've already installed FreeNAS on four machines (let's call these machines ClientA, ClientB, ClientC, and ClientD), and have made all the available disk space on each into a share.
2. Install Linux on a fifth machine (call this Machine1), and on this machine combine over-the-network all the shares from ClientA, ClientB, ClientC, and ClientD into one large "virtual" directory on Machine1. I know this is doable, but what I hope to have is the total disk space from all the machines in step 1 combined for the purposes of saving files. Not sure which file system to use. For example, if the other four machines have 2 GB of space each, I want to be able to save a 7 GB file.
3. And then allow sharing of this one large directory using Samba.
4. Then allow lab users (not on any of the above-mentioned machines) to access the Samba-enabled large shared directory on Machine1 to read and write files. The users will have no idea that the files are not on Machine1, or that they may be segmented in some way, nor should they care.
I understand the risks (if any one machine of ClientA, ClientB, ClientC, and ClientD goes down, I probably lose everything). I am considering throwing mirroring into the mix (mirror Machine1's large directory), but that can wait.
So in the above scenario, what file system can I use on Machine1 to combine all the shares from ClientA, ClientB, ClientC, and ClientD to make one large "virtual" directory?
I've looked at UnionFS, but from my understanding, while it combines directories, the maximum file size is the size of the largest share. Is this true?
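It is true: union filesystems place each file whole on a single branch, so no single file can exceed the largest share. To let one 7 GB file span four 2 GB disks you need block-level aggregation, e.g. exporting each client's disk over the network and joining them with LVM on Machine1 (FreeNAS exports block devices via iSCSI; the sketch below uses nbd for brevity, and the hostnames, port, and devices are all assumptions):
Code:
# on Machine1: attach each client's disk exported with nbd-server
sudo nbd-client clienta 2000 /dev/nbd0
sudo nbd-client clientb 2000 /dev/nbd1
# ...repeat for clientc and clientd on /dev/nbd2, /dev/nbd3...
# pool all four into one big logical volume and put a filesystem on it
sudo pvcreate /dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3
sudo vgcreate labvg /dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3
sudo lvcreate -l 100%FREE -n labshare labvg
sudo mkfs.ext3 /dev/labvg/labshare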
I am running CentOS 5.5 with a 14 TB ext4 volume. We are sharing out a few sub-directories via NFS. Our customer was doing some penetration testing with our web app that writes to one of those NFS shares. We are not sure if they did something to cause the metadata to grow so large, or if it is corrupt. Here is the listing:
drwxrwxr-x 1 owner owner 470M Jun 24 18:15 temp.bad
I guess the metadata could actually be that large; however, we have been unable to perform any operations on that directory to determine whether it is just loaded with files or corrupted. We have not run an fsck on the volume because we would need to schedule downtime for our customers to do so. Has anyone come across this before?
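A directory entry that size usually just means an enormous number of entries, and plain ls stalls because it reads and sorts everything before printing anything. A sketch of inspecting it without sorting:
Code:
# ls -f disables sorting, so entries stream out immediately
ls -f temp.bad | head
# count entries without holding them all in memory
find temp.bad -maxdepth 1 | wc -l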
If I was in /home/user/directory1/directory2/directory3/directory4 and I changed directory to /home/user, how do I quickly go back to my previous directory (namely /home/user/directory1/directory2/directory3/directory4)?
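The shell keeps the previous working directory in $OLDPWD, and cd - jumps straight back to it; pushd/popd keep a whole stack. For example:
Code:
cd /home/user/directory1/directory2/directory3/directory4
cd /home/user
cd -                 # back to .../directory4 (uses $OLDPWD)
# or keep a directory stack:
pushd /home/user     # switch, remembering the current directory
popd                 # return to it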
Long-time reader, and this is the first time I have ever had to ask anything. I got a new HDD and installed it in my laptop. I cloned my partitions, but I ended up with a misaligned drive. I backed it up to a USB drive with the idea of just copying everything back to the new drive once fixed; that didn't work. I had made a backup of my home directory without encryption, but I was stupid and accidentally deleted it. Now I am out of ideas. I have tried mounting my home directory from a live CD using:
Recovery of an Encrypted Home Directory is possible from an Ubuntu 9.10 LiveCD. Mount the disk partition containing the Encrypted Home Directory:
Code:
ubuntu@ubuntu$ sudo mount /dev/sda1 /mnt
Establish a proper chroot environment:
Code:
ubuntu@ubuntu$ sudo mount -o bind /dev /mnt/dev
ubuntu@ubuntu$ sudo mount -o bind /dev/shm /mnt/dev/shm
ubuntu@ubuntu$ sudo mount -o bind /proc /mnt/proc
ubuntu@ubuntu$ sudo mount -o bind /sys /mnt/sys
ubuntu@ubuntu$ sudo chroot /mnt
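From inside that chroot, the usual next step (a sketch based on the standard eCryptfs recovery procedure; the username is a stand-in) is to become the affected user and mount the encrypted data with the login passphrase:
Code:
# inside the chroot; 'mark' is a placeholder for the real username
root@ubuntu# su - mark
mark@ubuntu$ ecryptfs-mount-private    # prompts for the login passphrase
# the decrypted files should then appear under the user's home directory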
So, I wanted to give myself a shortcut to not just log in to a remote server, but also change into a particular directory once I got there. This was harder than I expected, but it finally worked when I wrapped it up into a shortcut: ssh -t user@example.com 'cd /var/www/mydir; bash' And I just alter the directory path to make another shortcut to a different place on the same server. This does work; however, it seems that when I log in this way, some of my environment is lost, and my locale is set back to the default "POSIX". That's not good. I'm running Gentoo Linux (amd64).
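The bash started that way is not a login shell, so /etc/profile and ~/.bash_profile are never sourced, which is why the locale falls back to POSIX. Forcing a login shell should carry the environment over, e.g.:
Code:
# -l makes the remote bash a login shell, so the profile files (and locale) load
ssh -t user@example.com 'cd /var/www/mydir; exec bash -l'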
I need to back up my home folder (and a few other folders) on an organizational Linux NFS system, where my account will be expiring soon, onto a personal hard drive (which is not using a Linux filesystem). I access the account through SSH and SFTP. I want to back up all metadata for these files and directories and everything in them, including dates, owners, groups, UID/GID numbers, chmod permissions, etc. How can I go about doing this? Do I need to run the ls command recursively on the directory with certain settings for what information to display, and pipe the results to a file, so that the information survives regardless of which filesystem I move it to? Or is there a way to save all the metadata using something like tar/gzip? If it's with tar/gzip, how do I view this metadata on other filesystems that I move the archive to, and will the stored users/groups remain intact as long as it's not extracted?
In addition, do you know how to do this for SELinux metadata and AFS (Andrew File System) metadata too? (These will be for another filesystem later on, but if you don't know the answer to either of these, please still answer the above.)
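tar records owners, groups, numeric UIDs/GIDs, permissions, and timestamps inside the archive itself, so they survive on any filesystem and stay intact until extraction. A sketch of making the archive over SSH and inspecting its metadata later (the hostname is a placeholder; the SELinux flags depend on your tar build, and tar will not capture AFS ACLs, which live outside the POSIX permission bits):
Code:
# run tar on the remote side; -p keeps permissions, --numeric-owner keeps raw UIDs/GIDs
ssh user@nfs-host 'tar czpf - --numeric-owner ~/' > home-backup.tar.gz
# a tar built with xattr support can also keep SELinux labels (an assumption):
# ssh user@nfs-host 'tar czpf - --xattrs --selinux ~/' > home-backup.tar.gz
# view the stored metadata on any filesystem, without extracting:
tar tvzf home-backup.tar.gz | head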
Notice how big and wide-spaced the fonts on the Clementine playlist are, and how good they look on the appmenu (where my mouse pointer is). This is not because Clementine is Qt4; I've got the same problem with Chrome, Opera, etc. I had been messing with System Settings (the KDE settings tool) a day before the fonts became that wide-spaced, in order to make my KDE apps look more native on my GNOME desktop, but I haven't touched the font settings there.
I am using Back In Time to back up files from home and from another mounted directory on my system (NTFS). The backups are occurring automatically and appear to be complete, but I cannot delete old backup snapshots in the Back In Time GUI. Even with sudo nautilus, or as root in a terminal with rmdir, I cannot delete the snapshots. My drive is filling up, and rather than uninstalling Back In Time, I would like to simply delete the unneeded snapshots. How can I delete these files? Is there an rsync file that I should configure to delete these? My expectation of Back In Time was that it would back up at the requested frequency and not create complete duplicate copies of the files, but use symbolic links to unchanged files. How can I verify that this is the case? Does the cron file control this?
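Back In Time does avoid duplicates, though with hard links rather than symbolic links (it drives rsync underneath): an unchanged file in two snapshots is the same inode. Link counts make that visible, and since the snapshot tree is root-owned and rmdir only removes empty directories, deleting takes rm -rf as root. A sketch, with placeholder paths:
Code:
# a link count (second column) above 1 means snapshots share one copy of the file
ls -li /media/backup/snapshot-old/somefile /media/backup/snapshot-new/somefile
# remove an unneeded snapshot tree; rmdir will not work on non-empty directories
sudo rm -rf /media/backup/snapshot-old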
When I booted up this morning the contents of my Home directory are all showing up on my desktop, and there is no single Home folder. How did this change, and how can I change it back so that the Home folder is on my Desktop with the contents inside of *it*?
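A frequent cause of this (a guess, since the post doesn't say) is the desktop location getting reset to the home directory itself in ~/.config/user-dirs.dirs; GNOME then draws all of $HOME on the desktop. A sketch of checking and resetting it:
Code:
# see what the desktop currently points at
grep XDG_DESKTOP_DIR ~/.config/user-dirs.dirs
# point it back at a real Desktop folder, then log out and back in
mkdir -p ~/Desktop
xdg-user-dirs-update --set DESKTOP ~/Desktop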
Sometimes you get more than you ask for, and in this case I did: I had no idea (I've had the computer for a few years now) that I was running a dual-core 64-bit machine. The silly thing is that I have 32-bit Fedora 11 on it, 32-bit versions of all my installed software, etc. Am I able at this point to salvage anything, or is it best to just back up the home directory and then do a reinstall?
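For anyone wanting to check the same thing, the CPU capability and the installed architecture can be read directly (a quick sketch):
Code:
# 'lm' in the CPU flags means the processor is 64-bit capable
grep -qw lm /proc/cpuinfo && echo "CPU is 64-bit capable"
# what is actually installed: i686 = 32-bit, x86_64 = 64-bit
uname -m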
I am using Back In Time to back up my home directory to a second HDD that is mounted at /media/backup. The trouble is, I can do this using Back In Time (root), but not using Back In Time without the root option. This is definitely a permissions issue; it can't write to the folder. But when I checked by right-clicking on the backup directory and looking at the permissions tab, it said I was the owner.
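The GUI tab can be misleading when the ownership problem is on the mount point itself or on a subdirectory created by the root-mode run. A sketch of checking from a terminal and taking ownership (the username is a placeholder):
Code:
# who actually owns the mount point and the snapshot tree?
ls -ld /media/backup /media/backup/backintime
# if root owns it, hand it back to your user
sudo chown -R youruser:youruser /media/backup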
I am in need of Linux help. I am at college, and I need this backup/restore script to pass the final part of an assessment. I require a backup script that will not only back up but also restore files to the relevant directories, e.g. users are instructed to store all word-processor files in a directory named wp. So I need to create a backup directory, three directories within that, and some files within the three directories, and then back them up and restore them. I know I should do this myself; I have been trying to find and understand info for the last few days and came up with zero.
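A minimal sketch of the shape such a script could take; only wp comes from the post, while the other directory names, paths, and archive location are assumptions to adapt:
Code:
#!/bin/bash
# backup-restore.sh -- minimal sketch; only 'wp' is from the assignment,
# the other names are placeholders
BASE="$HOME/work"              # parent of the three data directories
ARCHIVE="$HOME/backup.tar.gz"

case "$1" in
  backup)
    tar czf "$ARCHIVE" -C "$BASE" wp letters misc
    echo "Backed up to $ARCHIVE" ;;
  restore)
    tar xzf "$ARCHIVE" -C "$BASE"
    echo "Restored into $BASE" ;;
  *)
    echo "Usage: $0 {backup|restore}" ;;
esac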
In 10.04 I managed to delete the volume applet that appears by default on the task bar. I'm sure it used to be listed in 'Add to Panel' in other versions of Ubuntu; however, I can't see it there in 10.04. Can someone explain to me how I get it back?
I have noticed that if Vista is not the active partition, hibernate does not work: it just goes black and then back to the user icon screen to log back in. Another "slight" problem was that I was not able to apply a service pack; after restoring Vista's dominance I was able to install the pack. Are there any other workarounds for hibernate, even though you might not be interested in cleaning up after Microsoft?
I was wondering whether it was possible to back up Time Machine backups from a Mac in Ubuntu.
I use a Mac at work and use Time Machine to back up to an external hard drive, which I take home each day. I wish to back up the Time Machine backups off the external hard drive each day to my computer at home, just to be safe. Is this possible?
I have managed to open the hard drive and have enabled "view hidden files" so I can see all the files, but I am unable to copy them due to permission errors.
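Time Machine writes its files with ownership and permissions a normal Ubuntu user can't read, so running the copy as root is usually enough (a sketch; both paths are placeholders, though Backups.backupdb is the standard Time Machine directory name):
Code:
# -a preserves what Linux can of the permissions and timestamps
sudo rsync -av /media/TimeMachineDrive/Backups.backupdb/ ~/tm-mirror/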
I'm using Ubuntu 10.10 with GIMP. I've got a lot of photos etc. and need to back these up. Can anyone suggest a good backup solution which does not require me to keep copying the same files? I.e., once the files are backed up, I only want to back up the files changed since the last backup.
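rsync does exactly this: after the first full copy, later runs transfer only new or changed files. Tools like Back In Time and rsnapshot wrap the same mechanism with snapshots. A one-line sketch, with placeholder paths:
Code:
# first run copies everything; subsequent runs copy only what changed
rsync -av ~/Pictures/ /media/backupdrive/Pictures/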
I've done a low-level format on them, so they're completely empty. When I use them with my Windows machines, they're absolutely fine. When I plug them into my Ubuntu machine, there is a hidden directory created called 'RECYCLER', which I'm assuming is for deleted files? However, it also creates a .exe file in this directory called 0x2D9FA278, which has an icon with an H in it and a comment of 'Facebook Photo'. This has the effect of making all the directories on the stick into shortcuts! I googled the file name and it seems to be some sort of Trojan, but I don't understand how it got onto my Ubuntu machine. I've scanned with ClamAV and it finds nothing.
I downloaded a mouse theme from GNOME-Look and installed it in the themes. But it has not appeared in the pointer themes section in customization, even though it said that it was installed correctly. When I drag the file to install it again, it says something along the lines of it cannot copy a directory over a directory. Where can I find where the mouse/pointer theme is located, so I can delete it? I have searched the filesystem, Google and these forums and not had any luck yet.
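Cursor themes normally land in ~/.icons for a single user or /usr/share/icons system-wide, so listing both should reveal the half-installed theme (its name below is a placeholder):
Code:
ls ~/.icons /usr/share/icons
# delete the per-user copy so it can be reinstalled cleanly
rm -r ~/.icons/the-theme-name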
Recently I mounted a larger partition into my home directory, since I was running out of space. Everything went smoothly, but it caused me to wonder about something I can't figure out, from playing with the mount and umount commands while I was copying everything over, before editing my fstab.
Is there a way to access the files that existed in a directory before you mount a partition onto that directory? After mounting, the original files are gone; unmount, and they are back. Where do they go?
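They never move: the files stay on the parent filesystem and are only hidden behind the new mount. A bind mount of the parent filesystem gives a second view in which nothing is shadowed (a sketch):
Code:
# look at the root filesystem through a second mount point
sudo mkdir /mnt/rootview
sudo mount --bind / /mnt/rootview
ls /mnt/rootview/home/user      # the "vanished" files are visible here
sudo umount /mnt/rootview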
I want to update all the machines in the network from a central repository which is on my master server and whose archive directory is shared through Samba. I searched the man page of sources.list and found that there is an option for this, but I haven't been able to implement it. Can anybody kindly tell me the way to do this?
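APT itself doesn't speak SMB, but if each client mounts the share, a file: URI in sources.list does the job. A sketch, where the share name, mount point, and distribution line are assumptions:
Code:
# on each client: mount the master's archive share
sudo mount -t cifs //master/archive /var/local/mirror -o guest
# then add a line like this to /etc/apt/sources.list:
# deb file:/var/local/mirror lucid main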