General :: Keep A Backup Of A Bunch Of Files On A Flash Drive?
Apr 29, 2011
I keep a backup of a bunch of files on a flash drive, so that whenever I change distributions I can just restore all my Android stuff (saves on re-downloading everything). One of these is the Android SDK.
In my ~/.bashrc I add the paths to some executables in the SDK, only if the directory exists, and only if the path is not already in $PATH. For the Android NDK this works fine, but for the SDK I get this:
snfo@snfo:~$ adb devices
bash: /home/snfo/Android/sdk/platform-tools/adb: No such file or directory
snfo@snfo:~$ ls -F /home/snfo/Android/sdk/platform-tools/adb
Everything else is fine though, just that one path is causing trouble.
Now, I've seen something similar to this before when you move an executable from one place to another: if you don't re-source your bash config, the shell keeps looking wherever the executable used to be. But I've never moved these files.
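For reference, a minimal sketch of the kind of ~/.bashrc guard described above (the SDK path is just an example):
Code:
# add the SDK platform-tools dir to PATH only if it exists
# and isn't already on the PATH
SDK_TOOLS="$HOME/Android/sdk/platform-tools"
if [ -d "$SDK_TOOLS" ] && [[ ":$PATH:" != *":$SDK_TOOLS:"* ]]; then
    export PATH="$PATH:$SDK_TOOLS"
fi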
Does anyone know of any decent enterprise-level backup solutions for Linux? I need to back up a few servers and a bunch of desktops onto one backup server. Using rsync/tar.gz won't cut it; I need things like bi-monthly full HDD backups, with a nice GUI to add/remove systems from the backup list. Basically I need something similar to CommVault or Veritas. Veritas I've used before, but it has its issues, such as leaving 30GB cache files. CommVault I have no idea about: how much it costs, or whether it supports backing up to a hard drive rather than tape.
I have installed the luckybackup software on my Ubuntu 10.10 notebook edition, but I don't know how to use it to back up files to an external hard drive. The hard drive is a 1TB Seagate. I don't think the Destination drop-down menu in luckybackup even shows the external HD.
I'm running Xubuntu 9.10 on an older machine. I've got a flash drive (called "TF_FLASH") plugged into a USB hub, and I am using simplebackup to back up my documents (I'm writing my thesis and that is really the only important thing on this computer).
The problem I am having is this: simplebackup will run and back up my files once or twice (I have it set up to go overnight). After that, though, the name of the flash drive changes from "TF_FLASH" to "TF_FLASH_" (note the extra underscore at the end), so simplebackup cannot find the drive. If I go into simplebackup's settings and change the backup destination to "TF_FLASH_", it will work once or twice, but the drive then changes to "TF_FLASH__" with yet another underscore.
I don't know if the name change is being caused by Xubuntu, simplebackup, or something else. The USB hub is a cheap one, but I don't think that's the problem (my mouse is plugged into it and continues to function, etc.).
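One thing worth checking, though it's only an assumption here: the automounter appends an underscore when a directory with the drive's label already exists under /media, for example one left behind by an unclean unmount. With the drive unplugged, stale leftovers can be inspected and removed:
Code:
# with the drive unplugged, look for leftover mount-point directories
ls -la /media/
# remove the stale, empty directory so the original label can be reused
sudo rmdir /media/TF_FLASH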
I'd like to make regular backups of my flash drives, but I don't plug them in regularly, so I can't just use a scheduled backup application. Does anyone know of an incremental backup application that will automatically run a backup as soon as the media is mounted?
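One approach, sketched under the assumption that a udev rule plus a script is acceptable: have udev fire whenever a known partition label appears. The label and script path below are hypothetical:
Code:
# /etc/udev/rules.d/99-usb-backup.rules (hypothetical file name)
# run a backup script whenever a partition labeled BACKUPSTICK is plugged in
ACTION=="add", SUBSYSTEM=="block", ENV{ID_FS_LABEL}=="BACKUPSTICK", RUN+="/usr/local/bin/backup-stick.sh"
Note that udev runs the script before the desktop automounter does its work, so the script would need to mount the drive itself and hand any long-running copy off to the background.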
Basically I need to rename a bunch of .doc files using a for loop and the mv command (with wildcards) in bash. I guess this would be a bit easier with the rename command, but since this is a school assignment of sorts I need to use for & mv. The .doc files are named "1filename.doc", "2filename.doc", etc., and I've got to rename them to "aaa_1filename.doc", "aaa_2filename.doc", "aaa_3filename.doc" and so on. I dabbled quite a bit with for and mv and basically just got a bunch of errors. Every damn time. For 2 hours. The most common error was "mv: missing destination file operand after ..."
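For what it's worth, the whole job reduces to one loop along these lines, run in the directory holding the .doc files:
Code:
# prefix every .doc file in the current directory with aaa_
for f in *.doc; do
    mv "$f" "aaa_$f"
done
The quoting around "$f" matters; an unquoted variable that expands to nothing is a common way to end up with mv seeing only one operand and complaining about a missing destination.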
I have an rsnapshot backup that I need to move off of a corrupt Linux file system, and I need to preserve the internal hard links. I've tried rsync -H, and a newer rsync, and neither preserves the hard links on OS X.
I've isolated the problem to the type of file system mounted: I can preserve hard links copying locally (HFS to HFS), but they aren't preserved when I rsync off of an SMB or AFP mount. Is there some mount option that gets OS X rsync to obey -H?
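A workaround that's often suggested (an assumption here, since it sidesteps the mount rather than fixing it) is to skip the SMB/AFP mount entirely and let rsync read the source natively over ssh, so inode information survives:
Code:
# pull the snapshot tree over ssh instead of through an SMB/AFP mount
# (-a preserves permissions/times, -H preserves hard links;
#  host and paths are hypothetical)
rsync -aH user@linuxbox:/path/to/rsnapshot/ /Volumes/Backup/rsnapshot/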
I've been working at this for the past 2 days now. My computer got some kind of virus or something that has caused it to loop at startup and continually reset. I run XP on a Gateway. I desperately need to back up my files, because the person who had my backup absent-mindedly deleted my stuff. I was able to boot up using an Ubuntu disc and I'm in it right now; I've found my files and I have an external hard drive. The problem: first, it won't let me paste onto the hard drive. If I drag, it says "Error while copying to "/media/FreeAgent Drive". You do not have permissions to write to this folder." I've mounted the external drive; nothing changes.
I've gone into Properties, and under Permissions it says the owner is root, folder access is "Access files", and at the bottom it says "You are not the owner, so you can't change these permissions." The drop-downs where I need to change permissions are grayed out, so I can't change them. So next I tried "gksu nautilus", went to the drive through there, and it let me use the drop-down selection under Permissions. I tried to change the folder access and got this message: "The permissions could not be changed. Couldn't change the permissions of "FreeAgent Drive" because it is on a read-only disk." So I tried changing the file access to Read and Write. It didn't give an error, so I thought perhaps it finally worked. I hit Apply and tried to put my files in; once again I got the message from before saying I didn't have permission. I tried to change the owner so it was no longer root and got "The owner could not be changed. Couldn't change the owner of "FreeAgent Drive" because it is on a read-only disk." I'm getting so frustrated right now. These files are VERY important to me! The hard drive I have is a Seagate FreeAgent Desk 500GB.
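If the FreeAgent is NTFS-formatted (an assumption, though they usually ship that way), the "read-only disk" errors often mean the volume was left marked dirty and Ubuntu mounted it read-only to be safe. A common recovery from a live session, with a hypothetical device name, looks like this:
Code:
sudo fdisk -l                  # identify the external drive's partition first
sudo umount /dev/sdb1          # device name is hypothetical; check fdisk output
sudo ntfsfix /dev/sdb1         # clear the NTFS dirty flag (ntfs-3g/ntfsprogs)
sudo mkdir -p /mnt/freeagent
sudo mount -t ntfs-3g /dev/sdb1 /mnt/freeagent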
I'm currently learning to use rsync to back up my music collection. I have a Firefox tab open to the rsync manual pages and have been reading man rsync and running experimental rsync operations for the last 3-4 hours. I've used rsync for this purpose in the past with disastrous results: what was, and is once again (after a month and a half of file pruning), a 9000-file music collection had mysteriously grown to over 25,000 music files and 80GB of data! This was likely because I didn't really know what I was doing with rsync and had never spent much time learning what all the parameters do and how they relate to my goal. Here are the particulars:
* Source drive is a 500GB disk, /media/sata500/music/.
* Destination drive is a 250GB USB disk, /media/FreeAgent/music, connected to the same computer that houses the 500GB disk.
* I want to copy or backup files from /media/sata500/music to /media/FreeAgent/music.
* I do not want to create ANY duplicates of files that exist.
* I only want to add files to the destination drive if they are new on the source drive, like if I rip a CD and add the contents to the source. I want them copied over next time I run rsync.
Here's the rsync command in its most recently used form, and probably very immature at this point.
This appears to have copied all files and folders, and I'm satisfied that my goal has been met with some success. To convince myself, I ran the command, and once it was complete I added 2 new songs to their respective folders on the source drive and ran the same command again. The resulting output showed exactly two files transferred.
Exactly what I want. Both folders now house 20,931 files and use 40.6GB; identical as far as I can tell. What I'm concerned about are timestamps, play-count data, etc.: anything that changes the original file. I don't want such a change to cause a file to be transferred, because I'm afraid the new file will be created alongside the old file of the same name, starting this whole music collection expansion all over again. I've invested a lot of time and effort to get it pruned down to where there are virtually no duplicates and albums contain the proper songs in the proper order.
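A minimal rsync invocation matching the goals listed above might look like this (a sketch using the paths given earlier, not necessarily the command actually used):
Code:
# -a preserves times and permissions, so unchanged files are skipped;
# a changed file is overwritten in place under the same name, never
# duplicated alongside the old copy
rsync -av /media/sata500/music/ /media/FreeAgent/music/
The trailing slash on the source matters: without it, rsync copies the music directory itself into the destination, nesting music/ inside music/, which is a classic way for a collection to appear to balloon into duplicates.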
When I delete files from my USB flash drive on my Karmic laptop (pressing the Del key), the properties of the drive remain unchanged (available space/used space). It looks like the files/folders have been deleted, but when I go to my Windows machines, the files and folders are there in a folder marked "Trash". Am I missing a step when deleting files while connected to my Ubuntu laptop?
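Most likely the Del key moves files to a hidden trash folder on the stick itself rather than removing them, which is why Windows later shows a "Trash" folder and the free space never changes. Shift+Del deletes immediately, or the hidden folder can be purged by hand (the label below is an example):
Code:
# show the hidden trash folders the desktop created on the stick
ls -a /media/USBSTICK/
# purge them to actually reclaim the space
rm -rf /media/USBSTICK/.Trash-*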
I recently decided to install Ubuntu (ubuntu-10.10-desktop-i386.iso) to my 16GB flash drive (it was FAT32 originally; I tried NTFS as well but had boot issues, so I went back to FAT32) to boot from it, using the Universal USB Installer method described on this page. The install worked great and Ubuntu works great; the problem is I can't see the rest of the files on my flash drive.
I would still like to see some actual LAB DATA, but the info here is satisfactory as a general rule of thumb. I was wondering: if I put files on a USB flash drive and left it sitting on the shelf, how long would it be before those files started to deteriorate? This has nothing to do with the read/write cycle, since during that "shelf time" the drive wouldn't be used.
I want to use my flash drive, but it has files on it that I put there using Ubuntu a few weeks back. Now I can only open them as read-only copies and can't remove them from the flash drive. I have also had some issues with file permissions on the hard drive. I was planning to reinstall after a backup, but now I don't know if that would be logical, because the files might all be locked. I wanted to reinstall because I have issues with USB as well as these file issues.
When I try to delete files from my flash drive, the file's icon goes away, but the actual data does not. Say I put a 900MB file on my 4GB flash drive, then delete it: the drive will still tell me that only 3100MB of free space is left, and if I try to add more than that it tells me the drive is full. I keep reformatting, trying all different types of file systems, but nothing works.
So I made a text file on a Windows machine and brought it home on a flash drive. When I opened the drive in the PCManFM file manager the file did not show, but running ls in a terminal shows the text file just fine. It is the only one that appears to be missing in PCManFM. I had a similar problem going the other direction (Linux to Windows, but with PDFs) many months ago. Here is the ls -l output:
Code:
/media/FE32-A2F6/Translation/Kevin$ ls -l
total 2344
-rw-r--r-- 1 feelactthink feelactthink 2374182 Feb 19 11:19 Artigo 3-SAGE V4.docx
-rw-r--r-- 1 feelactthink feelactthink    3686 Feb 19 15:21 HP Cable Recall
-rw-r--r-- 1 feelactthink feelactthink    3686 Feb 19 15:21 HP Cable Recall.txt
-rw-r--r-- 1 feelactthink feelactthink    4891 Feb 20 17:58 Translation.txt
The file is Translation.txt. What is different about this file that it would do such a thing? It doesn't look at all different from above.
I know it is possible to boot Ubuntu Live from a flash drive, but it just boots up and runs as if it were a CD: when you shut down the computer, the changes are all lost.
Is there any way to use the flash drive as a hard drive? That is, install Ubuntu on the flash drive so that when I boot with the flash drive in the computer, I can boot off of it and it acts like a regular hard drive install, keeping my changes.
Could I just run the Ubuntu installer and select the flash drive as the install destination? Would that accomplish this?
I have been playing with the JWM source and found a cool tutorial at the Debian Forums about how it's easier to generate a .deb than to install from source in the traditional manner (./configure, make, etc.) [URL]. My problem is that when running the command
dpkg-buildpackage -rfakeroot -us -uc
it starts over, destroys the previous jwm build (including my custom files), and generates a fresh .deb. So how do I stop it from "cleaning" when I run the above command?
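dpkg-buildpackage has a flag for exactly this: -nc (--no-pre-clean) skips the clean step, leaving the previous build tree and any custom files in it alone:
Code:
# build the .deb without running the clean target first
dpkg-buildpackage -rfakeroot -us -uc -nc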
I'm not asking for help here, just documenting something I just discovered. Yesterday I wanted to batch-convert a bunch of old wma files to ogg vorbis. Not wanting to go through intermediate wav files, I tried to use ffmpeg to do it in one go. I first tried using the following command (in a loop, which I won't print here).
Code:
ffmpeg -i $file -f ogg -acodec vorbis -ab 192k outputdir/$file
"vorbis" turns out to be the built-in libavcodec implementation of the codec. In the process I discovered that the -ab value is always ignored: no matter what value you put, the output is always the default 64k average (of course it's VBR). You can, however, use the poorly documented -aq option to set the audio quality. The values don't correspond to the oggenc values, though; they're numbers ranging from about 10 to 100 (or more, I don't know what the maximum is). It's not exactly clear which number corresponds to which average bitrate, so you have to experiment: ~30 seems to give you an average-rate file, while anything above 60 is probably overkill.
Switching to the external libvorbis gave me more flexibility, although at a cost of much longer encoding times (note that ffmpeg must have been compiled with libvorbis support first).
I could use both -ab and -aq (with the numbers corresponding to the oggenc values) with no problems, though ffmpeg does display some wrong values in its output text. In addition, there's one more difference: the vorbis (libavcodec) encoder provides an entry in the header of the Ogg container reporting the average bitrate, but it doesn't appear to provide a similar bitrate header in the Vorbis stream itself, so some programs may not report the bitrate value.
libvorbis provides both headers, avoiding that problem. So to summarize, libvorbis appears to be a better codec choice than vorbis.
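Putting that together, the libvorbis variant of the earlier command would look something like this (a sketch; per the above, -aq takes oggenc-style quality values here, and quality 6 is roughly a 192k-average VBR file):
Code:
# encode with the external libvorbis encoder instead of the built-in vorbis
ffmpeg -i "$file" -acodec libvorbis -aq 6 outputdir/"${file%.wma}.ogg"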
I just got a 1TB external USB hard drive to back up my comedy shows. On my smaller USB pen drives I set the file system to ext2 (occasional reads/writes), but for this bigger and more frequently accessed drive (daily reads/occasional writes), should I go with ext3 for the journaling?
Also, regarding security, I was thinking about making the drive writable only by root, so that when I mount the drive as a normal user (which will be for a few hours daily), anyone who got onto my system couldn't write to the drive from my user account. That should just be a simple case of setting the mount point to 755 (and owner=root), should it not?
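That scheme is a simple chown/chmod, sketched here with a hypothetical device name (double-check it before running mkfs):
Code:
sudo mkfs.ext3 /dev/sdb1           # DESTROYS existing data; verify the device first
sudo mkdir -p /mnt/backup
sudo mount /dev/sdb1 /mnt/backup
sudo chown root:root /mnt/backup   # permissions live on the mounted fs root
sudo chmod 755 /mnt/backup
Unlike FAT, an ext2/ext3 volume stores real ownership, so setting the mounted root to root:root with mode 755 does make it read-only for a normal user.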
I keep trying to convert a bunch of .jpg files into a PDF, but ImageMagick just keeps failing. Well, after three thousand fix-and-reinstall attempts (seriously, I've been trying to fix it for the last month or so), this is what I'm getting today:
Is there a simple shell script that would recurse through all /home/xxx/public_html directories and yank this line (it will always be exactly the same)? And better yet, for the future, is there any way I can REPLACE that line with another?
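Both halves of that are a job for find plus sed. A sketch, with the target line and its replacement as hypothetical placeholders (special regex characters in the real line would need escaping):
Code:
# delete the offending line from every file under each public_html
find /home/*/public_html -type f -name '*.html' \
    -exec sed -i '/the exact offending line/d' {} +
# or replace it in place with something else
find /home/*/public_html -type f -name '*.html' \
    -exec sed -i 's/the exact offending line/the replacement line/' {} +
The -name '*.html' filter is a guess at the file type; widen or drop it as appropriate.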