I back up using a small rsync script. I've tried other methods but just keep coming back to it.
I have enough space that I can back up the entire root file system, but for speed and economy there are certain folders I'm not backing up. I understand some are dynamically generated on boot, others are caches.
Anything else I should exclude? How about APT's cache? /var? Any other caches?
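For comparison, a minimal sketch of the exclude list I'd start from; the destination path is a placeholder, and APT's cache (/var/cache/apt/archives) plus per-user ~/.cache are safe to skip because they are just re-downloaded or rebuilt:

Code:
# dev/proc/sys/run/tmp are regenerated at boot; caches rebuild themselves
sudo rsync -aAXv / /mnt/backup/ \
    --exclude={"/dev/*","/proc/*","/sys/*","/tmp/*","/run/*","/mnt/*","/media/*","/lost+found","/var/cache/apt/archives/*","/home/*/.cache/*"}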
I am in need of Linux help. I am at college and I need this backup/restore script to pass the final part of an assessment. I require a backup script that will not only back up but also restore files to the relevant directories. E.g. users are instructed to store all word-processor files in a directory named wp. So I need to create a backup directory and 3 directories within that, plus some files within the 3 directories, and then back them up and restore them. I know I should/have to do this myself, but I've been trying to find and understand info for the last few days and came up with zero.
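As a starting point rather than a finished answer, a sketch that sets up three example directories (wp plus two placeholders, ss and db), backs them up with tar, and restores them; all the names here are assumptions to replace with whatever the brief specifies:

Code:
#!/bin/bash
# usage: ./backup.sh {setup|backup|restore}
BASE="$HOME/mydata"                  # parent directory holding the three dirs
ARCHIVE="$HOME/mydata-backup.tar.gz"

case "$1" in
  setup)                             # create the directories and sample files
    mkdir -p "$BASE"/{wp,ss,db}
    touch "$BASE/wp/letter.odt" "$BASE/ss/budget.ods" "$BASE/db/names.db"
    ;;
  backup)                            # archive the whole tree
    tar -czf "$ARCHIVE" -C "$HOME" mydata
    ;;
  restore)                           # put files back in the relevant directories
    tar -xzf "$ARCHIVE" -C "$HOME"
    ;;
  *)
    echo "usage: $0 {setup|backup|restore}" >&2
    exit 1
    ;;
esac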
Is it possible to just tar /home, /opt, and a couple other directories, and then save them on some medium, so that when doing a fresh install, I could just extract the folders from the tar file and keep my information?
I don't plan on doing daily/weekly backups, but I want a copy of some folders so that I can keep as many of my settings as possible in case I need to do a fresh install.
I also don't have the ability to create another partition on my hard drive to put /home on, but yes, I looked into that route as well.
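Yes, that works; a sketch of both directions, assuming the archive lives on a USB drive at /media/usb. The -C / makes the paths inside the archive relative, so they extract back to the same places, and -p on extract preserves permissions:

Code:
# before the reinstall
sudo tar -czf /media/usb/mydirs.tar.gz -C / home opt
# after the fresh install
sudo tar -xzpf /media/usb/mydirs.tar.gz -C /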
I want to move all files and directories that are more than 1 month old into a separate backup folder. There will be a lot of files and I want to make sure they copy properly. The problem I'm having is integrating md5sum into it to check integrity. md5sum is not recursive, so I figured it would work in a loop: as it copies each individual file, I'll run md5sum on it and discard that checksum once it has verified the copy is OK.
[Code]...
I also need some sort of error handling to output all MD5s that didn't pass the hash check.
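A minimal sketch of that loop, using find's -mtime +30 for the 1-month cutoff; SRC, DEST and FAILED are placeholder paths:

Code:
#!/bin/bash
SRC="$HOME/data"                     # placeholder source tree
DEST="$HOME/archive"                 # placeholder backup folder
FAILED="$HOME/md5-failures.log"      # files that failed the hash check

find "$SRC" -type f -mtime +30 | while IFS= read -r f; do
    rel="${f#$SRC/}"
    mkdir -p "$DEST/$(dirname "$rel")"
    cp -p "$f" "$DEST/$rel"
    src_sum=$(md5sum < "$f" | awk '{print $1}')
    dst_sum=$(md5sum < "$DEST/$rel" | awk '{print $1}')
    if [ "$src_sum" = "$dst_sum" ]; then
        rm "$f"                      # copy verified, so the move completes
    else
        echo "$f" >> "$FAILED"       # error handling: log failed hashes
    fi
done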
I want to make a webserver with multiple users allowed to log in through SFTP to a specific folder, www. Multiple users are added, let's say user1 and user2, both belonging to the www-data group. The www directory has owner www-data and group www-data.
I have used chmod -R 775 on the www folder, but after I create a folder test through my SFTP server (using FileZilla), the group of the created directory has only r and x permissions, and I am not able to log in as the second user, user2, and create a directory within www/test due to the lack of the w permission for the group.
I also tried using chmod 2775 on the www directory, but without luck. Can somebody explain how I can make a newly created directory inherit the group permissions of its parent directory?
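The setgid bit (the 2 in 2775) only preserves the group; the group write bit on new directories is stripped by the SFTP session's umask, which is why 2775 alone doesn't help. A sketch using default ACLs, assuming the tree is at /var/www/www and the filesystem has ACL support:

Code:
# keep www-data as the group on new subdirectories
sudo find /var/www/www -type d -exec chmod g+s {} +
# grant the group rwx on everything that exists now
sudo setfacl -R -m g:www-data:rwx /var/www/www
# default ACL so newly created entries get group rwx despite the umask
sudo find /var/www/www -type d -exec setfacl -d -m g:www-data:rwx {} +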
I am writing a script. The requirement is: all the file types are stored in one directory, and from that we need to separate them into different directories based on file type.
For example, a directory (anish) holds 5 items of different types: 1 directory, 2 .txt files and 2 .sh files.
The directory should be moved to a new directory (dir) given in the script, the 2 .txt files to another new directory (test) given in the script, and the 2 .sh files to a third new directory (bash) given in the script; finally the directory anish should be empty. How is this possible using a bash script?
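A minimal bash sketch of that sorting, using the names from the example above:

Code:
#!/bin/bash
# sort the contents of anish/ into dir/ (directories), test/ (.txt), bash/ (.sh)
SRC="anish"
mkdir -p dir test bash
for entry in "$SRC"/*; do
    if [ -d "$entry" ]; then
        mv "$entry" dir/
    elif [[ "$entry" == *.txt ]]; then
        mv "$entry" test/
    elif [[ "$entry" == *.sh ]]; then
        mv "$entry" bash/
    fi
done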
Is it possible to restrict users to their home directories and allow admins to have different home directories? Essentially I want users to have a folder in /var/www/html/$USER and admins to have either unrestricted access or have their root directory be ./ or /www or /etc. I have it set now so users have access to their home directory, but I need to upload web files as admin.
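A sketch of the usual OpenSSH approach, assuming a group named webusers for the restricted accounts; admins simply stay out of the group and keep normal access:

Code:
# /etc/ssh/sshd_config
Match Group webusers
    ChrootDirectory /var/www/html/%u
    ForceCommand internal-sftp
# sshd requires each chroot directory to be root-owned and not
# group-writable, so give each user a writable subdirectory inside it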
In 10.04 I managed to delete the volume applet that appears by default on the task bar. I'm sure it used to be listed in 'Add to Panel' in other versions of Ubuntu; however, I can't see it there in 10.04. Can someone explain how I get it back?
I have noticed that if Vista is not the active partition, hibernate does not work: it just goes black and then back to the user icon screen to log back in. Another "slight" problem was that I was not able to apply a service pack; after restoring Vista's dominance I was able to install the pack. Are there any other workarounds for hibernate, even if you might not be interested in cleaning up after Microsoft?
Is it possible to back up Time Machine backups from a Mac in Ubuntu?
I use a Mac at work and use Time Machine to back up to an external hard drive, which I take home each day. I wish to back up the Time Machine backups off the external hard drive each day to my computer at home, just to be safe. Is this possible?
I have managed to open the hard drive and have enabled "show hidden files" so I can see all the files, but I am unable to copy them due to permission errors.
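As a sketch, the permission errors usually go away if the copy runs as root, and -H preserves the hard links Time Machine relies on, which would otherwise balloon the copy; the paths here are assumptions:

Code:
sudo rsync -aH /media/tm-drive/Backups.backupdb/ ~/tm-mirror/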
I'm using Ubuntu 10.10 with GIMP. I've got a lot of photos etc. and need to back them up. Can anyone suggest a good backup solution which does not require me to keep copying the same files? I.e., once the files are backed up, I only want to back up the files changed since the last backup.
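rsync does exactly this: after the first run it only copies files that are new or changed since the last backup. A minimal sketch, with the destination path as an assumption:

Code:
rsync -av ~/Pictures/ /media/backup/Pictures/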
I recently installed language packs for Japanese and changed my system language to it, too. The problem is, now that I try to go back to English, the locale doesn't change back; only the menus are in English. "Apply System-Wide" in Language Support didn't do anything; Firefox is in Japanese too. Here is my locale output:
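One thing worth trying, as a sketch assuming en_US.UTF-8 is the locale you want back; log out and back in afterwards so the whole session picks it up:

Code:
sudo update-locale LANG=en_US.UTF-8 LANGUAGE=en_US:en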
Is there any way to ignore whole directories in Ubuntu One? I figured out that you can use /etc/xdg/ubuntuone/syncdaemon.conf to ignore files by adding a regex to ignore.default, but I assume this only applies to files. So, how can I avoid the syncing of some subdirectories of my synced folders? I guess I could use symlinks as a workaround, because I read somewhere that Ubuntu One wouldn't follow them.
How do I delete just directories and not files when running an "rm -r foo*" command? E.g. I have foobar.txt, foofoo.o and foorebar.jpg, plus foo/, foonuggets/ and footemp/ in a directory. In one fell swoop, how do I delete just the directories and preserve the files?
Seeing as how I only use the -r switch when removing directories, I accidentally ran this command and removed files that I wanted (luckily nothing vital). Lesson learned; now I want to prevent ever doing that to files that *are* vital.
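A sketch that matches only the directories; find's -type d is the safety net that rm -r lacks:

Code:
find . -maxdepth 1 -type d -name 'foo*' -exec rm -r {} +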
I have a string of files named recup_dir.1 through .66. I have the command sudo rm -r /home/"name"/Pictures/recup_dir.1 to remove them one at a time; how do I remove twenty or thirty with one command?
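Bash brace expansion covers a numbered range in one command, e.g. for the first thirty:

Code:
sudo rm -r /home/"name"/Pictures/recup_dir.{1..30}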
I'm using an external USB Iomega Rev 70GB HDD (have been for a few years). It uses the UDF format for higher transfer speeds on large files (this is not optional and not changeable), so it has been natively recognised since 2.6.20, I think (but with no official Linux support).
Occasionally, though, I run into problems, such as not being able to format a disk in Linux and needing to boot into Windows to do it. Also, like today: I have a bunch of comic PDFs and CBRs on it, a mix of .rar files and folders. They were there yesterday. Today I went to read something and couldn't see them. I thought maybe I'd deleted them by accident, but no. Looking at the drive in a terminal I can see the "missing" files and folders fine, but I can't see them in Nautilus to open in Comix.
-Tried setting all permissions on the whole drive to R/W/E through Nautilus.
-Files & Folders aren't hidden.
-Tried rebooting.
-Tried using Administrator Nautilus.
-Thought it might be a re-emergence of the recent Nautilus preview bug, so switched file preview off.
Just in case I have some kind of error (again), I am wondering what directories I could restore without causing a boot error or forcing me to play with config files from a live disk on the next reboot. A list of directories that I know I could restore without causing a boot error.
I have a large music directory and I'd like to somehow acquire, or generate, a list of each sub-folder within it, and then get that list into a spreadsheet format. Is there a way to do this?
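A sketch, assuming the music lives in ~/Music: one folder per line, saved to a file a spreadsheet can import:

Code:
find ~/Music -mindepth 1 -type d | sort > ~/music-folders.csv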
I go to Places > Connect to Server > SSH and connect to a remote server with Nautilus. All OK. But I prefer to use vifm as my main file manager: I try to find the SSH-mounted devices in /mnt or /media but cannot find them. Does anybody know where they are?
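On GNOME releases of this era, gvfs exposes Nautilus's network mounts through a FUSE bridge in a hidden directory under your home, rather than in /mnt or /media, so vifm should see them there (a sketch; the exact path can vary by release):

Code:
ls ~/.gvfs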
I just installed sudo (I have Slackware but always got better help here) and I tried doing a
Code: sudo du /
to see if I could get a general size estimate of all the directories, and despite running it under sudo I was still told "cannot read directory" on some of the directories on my PC. Shouldn't sudo have made it so I could read them all?
I'm a newbie who has just installed Maverick on my old HP box and I have to say I'm hugely impressed. I have 2x 160GB drives in the machine; is there a utility which will let me back up/mirror certain directories automatically, whilst still being able to utilise the remaining space on the second disk?
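For a simple automatic mirror that leaves the rest of the second disk free, a cron job running rsync is one common sketch; the paths and the 02:00 schedule here are assumptions:

Code:
# crontab -e entry: mirror the chosen directories to the second drive nightly
0 2 * * * rsync -a --delete /home/you/Documents /home/you/Photos /media/disk2/mirror/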
I would like to convert all my eps files recursively in a directory tree to pdf. I am currently using the following script to convert all files of a single directory:
for i in *.eps; do epstopdf "$i"; done
I need to convert the whole directory tree, but the script above didn't work with a -R option.
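epstopdf has no recursive flag, but find can walk the tree and run it per file; -execdir runs the conversion inside each file's own directory, so the PDFs land next to their sources:

Code:
find . -name '*.eps' -execdir epstopdf {} \;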
I have been working out how to do this for a while and am finally getting around to asking...
As a long-time Windows user I have used the trick of creating directories that will always be displayed first by giving them an _foldername name, for example:
_temp
A dir
B dir
temp
Assuming the four above are directories, that is how they are sorted by default in Windows, and I commonly use this technique for special folders.
My virtual machines folder in Windows, for example, looks like this:
_base
LX-Ubuntu9.1
LX-Ubuntu11.4
XP-TestVM
...
where I put base virtual machines in _base, which makes it easy to separate them from all the other folders, which are virtual machines... SO...
In Ubuntu, I can't figure out a character or a way to ensure this type of behaviour; for example, if I create a _base folder it shows up after the 'a' directories and around the 'b's, etc.
Does anyone know a way to create a directory with a special character that will ensure it stays on top?
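One angle, as a sketch: terminal sort order comes from the locale's collation rules, and switching to plain byte-value collation makes punctuation sort ahead of letters and digits. Nautilus does its own locale-aware sorting, though, so this may only help in a shell, and the '!' prefix is just an example character:

Code:
# with byte-value collation, '!' (0x21) sorts before every letter and digit
mv _base '!base'
LC_COLLATE=C ls
# make it the default for future shells:
echo 'export LC_COLLATE=C' >> ~/.bashrc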
I was looking to set up a dual boot and was inquiring to see whether I can download files from the Windows 7 side and direct them to the Ubuntu partition/home directory.
I'm trying to set up ncmpcpp correctly and I'm running into an issue with mpd not reading my directories correctly. I dual boot Ubuntu/W7 on this computer, and for the sake of laziness I just keep all of my music within the default Windows folder (since half my music was there first). Now I have a symlink within my home directory's music folder that points to this; however, according to mpd it isn't a directory.
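By default mpd refuses to follow symlinks that point outside its music directory, which matches this symptom. A sketch of the relevant mpd.conf options (present in mpd 0.14 and later):

Code:
# in ~/.mpd/mpd.conf or /etc/mpd.conf
follow_outside_symlinks "yes"
follow_inside_symlinks  "yes"
# restart mpd and rescan the database afterwards, e.g. with: mpc update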
I have an external FAT32 hard drive. It already has some files on it. Now, I mounted it and it shows the owner to be root. Fine. But when I change the permissions, it does not seem to change them. I am not able to access the directories, even as root! I was able to create directories, but luckybackup was not able to write to it due to permissions. I take it luckybackup is executing as root, but even otherwise, if I do chmod ugo+rwx, why don't the permissions change? (No error message from chmod.) Here is the stuff (ext drive mounted at /seagate):
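FAT32 has no Unix permission bits, so chmod has nothing to write to and is silently ignored; ownership and permissions for the whole drive are fixed at mount time instead. A sketch, with the device name and uid as assumptions:

Code:
sudo umount /seagate
# uid/gid make your user the owner; umask=002 gives group write access
sudo mount -t vfat -o uid=1000,gid=1000,umask=002 /dev/sdb1 /seagate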
I downloaded some backgrounds from GNOME Art and am having trouble moving them to the backgrounds folder. I've been trying this: sudo mv desktop <filename> usr/share/backgrounds. I moved them to the desktop to make it easier.
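The command above treats "desktop" and "<filename>" as two separate arguments and is missing the leading slashes, so mv looks for a relative usr/share/backgrounds. A corrected sketch, with the filename still a placeholder:

Code:
sudo mv ~/Desktop/<filename> /usr/share/backgrounds/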