General :: Back Up Script - Find / Cp / Md5sum / Rm - Move All Files And Directories
Oct 22, 2010
I want to move all files and directories that are one month old out into a separate backup folder. There will be a lot of files, and I want to make sure everything copies properly. The problem I'm having is integrating an md5sum check into it to verify integrity. md5sum is not recursive, so I figured it would work in a loop: as the script copies each individual file, I'll run md5sum on it and delete that checksum once it's verified the copy is OK.
[Code]...
I also need some sort of error handling to output all md5s that didn't pass the hash check.
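A minimal sketch of that loop, assuming placeholder paths (SRC, DEST, and the failure log are not from the original post) and treating "one month" as mtime +30; it copies each file, compares checksums, deletes the source only on a match, and logs mismatches:

Code:
#!/bin/bash
# Copy files older than 30 days, verify each copy with md5sum, and remove
# the source only when the checksums match. Paths are placeholders.
SRC=/data
DEST=/backup
FAILED=/tmp/md5_failures.log
: > "$FAILED"

find "$SRC" -type f -mtime +30 -print0 | while IFS= read -r -d '' f; do
    rel=${f#"$SRC"/}                     # path relative to SRC
    mkdir -p "$DEST/$(dirname "$rel")"   # recreate the directory structure
    cp -p "$f" "$DEST/$rel"
    src_sum=$(md5sum < "$f" | awk '{print $1}')
    dst_sum=$(md5sum < "$DEST/$rel" | awk '{print $1}')
    if [ "$src_sum" = "$dst_sum" ]; then
        rm -- "$f"                       # verified, safe to delete
    else
        echo "$f" >> "$FAILED"           # hash mismatch, keep the source
    fi
done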
I am in need of Linux help. I am at college and I need this backup/restore script to pass the final part of an assessment. I require a backup script that will not only back up but also restore files to the relevant directories, e.g. users are instructed to store all word-processor files in a directory named wp. So I need to create a backup directory with three directories inside it, put some files within those three directories, and then back them up and restore them. I know I should do this myself; I've been trying to find and understand the information for the last few days and have come up with zero.
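A hedged sketch of the kind of script the assessment seems to want; the directory names other than wp (db, docs) and the archive path are illustrative, not from the post:

Code:
#!/bin/bash
# setup: create the test tree; backup: archive it; restore: unpack it so
# files return to their original directories.
BASE=$HOME/work
ARCHIVE=$HOME/backup.tar.gz

case "$1" in
    setup)
        mkdir -p "$BASE"/{wp,db,docs}
        touch "$BASE"/wp/letter.odt "$BASE"/db/data.csv "$BASE"/docs/notes.txt
        ;;
    backup)
        tar -czf "$ARCHIVE" -C "$BASE" .    # archive everything under BASE
        ;;
    restore)
        mkdir -p "$BASE"
        tar -xzf "$ARCHIVE" -C "$BASE"      # paths inside the tar are relative
        ;;
    *)
        echo "usage: $0 {setup|backup|restore}" >&2
        ;;
esac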
I'd like to move a selection of files from all the sub-directories within an overall directory to a single destination. I don't want any of the directory structure, just the files themselves. This is what I tried so far:
mv /dir1/*/igs*.sp3.Z /dir2
There are other .sp3.Z files in the subdirectories of /dir1, but I just need the ones that start with igs.
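If the files can sit at any depth (not just one level down), a find-based variant flattens them into the destination; note that files with identical names would overwrite each other:

Code:
find /dir1 -type f -name 'igs*.sp3.Z' -exec mv -t /dir2/ {} +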
I have this script that I use to find log files in the /var/log directory that are 2 days old, move them to /var/log/tmp, rename them to the system date.filename, and move them back to /var/log. Everything seems to work as planned, except that the files don't get moved out of tmp, and they keep getting renamed. This leads to very long filenames such as:
What is it about this script that isn't moving the files back to /var/log? Also, is there a better way of doing this than what I'm doing? Basically, I'm just trying to set up an audit trail on some of the files in /var/log, so that at the end of the month I can tar them and have our syslog server pick up one giant monthly log.
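One simpler shape for this, sketched under assumptions (the '*.log' pattern and date format are guesses): rename in place, with no tmp directory, and skip files that already carry a date prefix so they are never renamed twice:

Code:
# Rename 2-day-old logs in place to YYYYMMDD.filename; the ! -name guard
# skips files already renamed on a previous run.
find /var/log -maxdepth 1 -type f -name '*.log' ! -name '[0-9]*' -mtime +2 |
while IFS= read -r f; do
    mv "$f" "/var/log/$(date +%Y%m%d).$(basename "$f")"
done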
I have a question which has in part been answered many times, but nothing I found relates completely to my situation. I am sure there will be people who will say RTFM, but believe me I did, and searched as well, to no avail. I have a situation where I want to copy files created within the last hour in one directory into another one. The problem is that the directories are on different levels in the directory tree, so the absolute path is different. But I want to keep the relative path the same.
I want to copy new files from /mnt/path_to_webdav/user to /home/user. So if there is a new file /mnt/path_to_webdav/user/doc/xy.txt, I want it copied to /home/user/doc/xy.txt. Also, if there is a new directory, say /mnt/path_to_webdav/user/newdir, I want a new directory created at /home/user/newdir with all the files in it, should there be any. I can do find with exec and copy all the files into one directory, but this is not what I want. How do I preserve the relative path and get the files copied into their corresponding directories?
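GNU cp's --parents flag does exactly this when fed paths relative to the source root; a sketch using the paths from the post ("within the last hour" becomes -mmin -60):

Code:
cd /mnt/path_to_webdav/user &&
find . -type f -mmin -60 -exec cp --parents {} /home/user/ \;

rsync -a with a file list would achieve the same and can also recreate new empty directories.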
I need to strip the executable flag from all files within a certain directory and its subdirectories. Right now I'm doing it with a two-step process:
find /dir/ -type f -exec chmod ugo-x {} \;
find /dir/ -type d -exec chmod ugo+rx {} \;
Is it possible to modify the first line so that I can strip the exec flag from all non-directory files? Since this needs to be done on a fairly regular basis across a lot of directories and files, I'd prefer not to use a bash script, which would slow it down.
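Two hedged options: "! -type d" matches every non-directory (covering symlinks and other special files, not just -type f), and terminating -exec with "+" batches arguments so chmod is invoked far fewer times:

Code:
find /dir/ ! -type d -exec chmod ugo-x {} +
find /dir/ -type d -exec chmod ugo+rx {} +

Or in a single pass with chmod alone: a-x strips execute from everything, then a+X restores it for directories only, because no plain file still has an execute bit by the time +X is evaluated:

Code:
chmod -R a-x,a+X /dir/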
I have an Ubuntu NAS set up with two 1.5TB drives in a mirrored array. We recently needed more storage and will constantly be adding to this machine, so we added two 2TB drives in a striped array. What I'd like to do is find all directories totaling 10GB+ on the mirrored array and move them over to the striped array, freeing the mirrored array for smaller, more important data. I've tried:
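The original attempt is elided above; one hedged way to do it, assuming placeholder mount points /mirror and /striped (and directory names without whitespace, which the awk field split cannot handle):

Code:
# List immediate subdirectories of 10GB or more, then move each one.
du -B1G --max-depth=1 /mirror | awk '$1 >= 10 {print $2}' |
while IFS= read -r d; do
    [ "$d" = /mirror ] && continue    # skip du's total for the top level
    mv "$d" /striped/
done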
I need a little help. I want to find all files with the extensions "*.tar", "*.gz", and "*.zip" and move all those files into the /opt/old directory. I've tried this command:
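The attempted command is elided above; a sketch of one way that works, with \( ... -o ... \) grouping the name patterns:

Code:
find . -type f \( -name '*.tar' -o -name '*.gz' -o -name '*.zip' \) \
    -exec mv -t /opt/old {} +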
I'm trying to find a proper command to move a certain set of files according to a date/time range. I'm thinking the command should be something like:
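The draft command is elided above; with GNU find, a pair of -newermt tests brackets a modification-time range (the two timestamps here are placeholders):

Code:
find . -type f -newermt '2010-10-01 00:00' ! -newermt '2010-10-22 00:00' \
    -exec mv -t /path/to/dest {} +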
As I'm going to transfer a large number of data folders from one hard drive to another, I want to make sure the transfer has not corrupted the data. How could I generate MD5SUMs of an entire directory, including subdirectories, into a single file, and later, how could I verify them against the data I've just transferred?
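A sketch, assuming placeholder mount points /src and /dst: hashing relative paths makes the manifest usable from either root, and md5sum -c replays it against the copy:

Code:
cd /src && find . -type f -exec md5sum {} + > /tmp/manifest.md5
cd /dst && md5sum -c /tmp/manifest.md5    # prints OK or FAILED per file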
What I am trying to do is check for file duplication in a folder and remove a file if it is a duplicate of another file, i.e. the contents are identical even though the names may differ.
Basically I am using md5sum to calculate the md5sum value of each file and redirecting the output to a file, and I am thinking of comparing the md5sum values. But I am finding it hard to decide how to complete the code after redirecting the md5sum output to a file.
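One sketch of the comparison stage: sort by checksum so duplicates become adjacent, keep the first file of each group, and delete the rest (assumes file names without newlines; try it with rm replaced by echo first). A dedicated tool such as fdupes does the same job:

Code:
find /folder -type f -exec md5sum {} + | sort |
while read -r sum name; do
    if [ "$sum" = "$prev" ]; then
        rm -- "$name"    # same checksum as the file kept just before it
    fi
    prev=$sum
done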
I noticed something a little odd that I'm hoping someone can enlighten me on. I noticed in a couple of cases that a package has the proper version but differs in two regards.
1. The package ends up with a .el4 on the end of the version for Red Hat 4.
2. The actual MD5Sum of the files the package provides differ.
An example below:
Code:
[root@RH4ES32-MCE bin]# for i in `rpm -ql GConf2`; do md5sum $i; done
md5sum: /etc/gconf/2: Is a directory
9f90335546f7c57ae6fb552cc2b919c5  /etc/gconf/2/path
md5sum: /etc/gconf/gconf.xml.defaults: Is a directory
[code].....
So my package changed slightly to now show .el4 versus just 2-2.8.1-1. I've indicated in the first output above that the first couple of lines differ. I stopped my comparison at that point, as they truly are different.
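Rather than hashing by hand, rpm can run this comparison itself: -V checks each installed file against the metadata recorded in the RPM database, and a "5" in the output flags an MD5 digest mismatch:

Code:
rpm -V GConf2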
When I am in Nautilus, I want to be able to select a directory, then right click (or some other action) to do a file find on that directory. The gnome-search-tool would be a good candidate for this, if it could be an action in Nautilus. I know I can do a file find through other means, but Nautilus seems to be where I am when I want to search directories.
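Nautilus's Scripts menu can hook this up: an executable file dropped into ~/.gnome2/nautilus-scripts (the GNOME 2 location) appears on the right-click menu and receives the selection in an environment variable. A sketch, assuming gnome-search-tool's --path option:

Code:
#!/bin/bash
# Open gnome-search-tool on the first selected item in Nautilus.
dir=$(printf '%s\n' "$NAUTILUS_SCRIPT_SELECTED_FILE_PATHS" | head -n 1)
gnome-search-tool --path="$dir" &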
What I wanted to do was find all the files with a specific name in a tree, sort them by modification time, and have their directory shown with them so that I knew where they were (because they all have the same name). I tried a whole bunch of different things and finally did this:
This did the trick pretty well, but as you can see it is far from elegant, and I think I'm doing some things wrong and kludgy.
The first thing I tried was "ls -lRt | grep world.sav", which worked except that I couldn't distinguish the files because there were no directories shown. It took a lot of looking before I accepted that I couldn't make ls print the directories as well and attach them to the files so that their relationship would be clear. I tried piping ls to find, doing it in reverse, passing them from grep, etc., until I read some more online that got me using gawk and sort. The questions:
1. Is there some other, more elegant and simple way to do this kind of detection and sorting?
2. Is there any way to use a pipe after -exec? The semicolon seems to prevent this entirely, forcing me to use an intermediate file as above. I could just remove it later, but I'd prefer straight piping.
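For (1), GNU find can print the timestamp and the full path itself, so a single sort finishes the job. For (2), -exec runs its command directly rather than through a shell, so a pipe only works if you wrap the pipeline in sh -c; both sketched below:

Code:
# 1. Timestamp plus full path, sorted by modification time.
find . -name world.sav -printf '%T@ %p\n' | sort -n

# 2. Piping inside -exec by spawning a shell per file.
find . -name world.sav -exec sh -c 'md5sum "$1" | cut -c1-8' _ {} \;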
I am writing a script, and my requirement is this: given all the file types stored in one directory, we need to separate them into different directories based on the file types.
For example, a directory (anish) contains five items of different types: 1 directory, 2 .txt files, and 2 .sh files.
The requirement is that the directory is moved to one new directory (dir) given in the script, the 2 .txt files are moved to another new directory (test) given in the script, and the 2 .sh files are moved to a third new directory (bash) given in the script; finally, the directory anish should be empty. How is this possible using a bash script?
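A minimal sketch under exactly those assumptions (the names anish, dir, test, and bash come from the post; everything is assumed to live in the current directory):

Code:
#!/bin/bash
# Sort the contents of anish/ into dir/, test/, and bash/ by type.
src=anish
mkdir -p dir test bash
for item in "$src"/*; do
    if [ -d "$item" ]; then
        mv "$item" dir/
    else
        case "$item" in
            *.txt) mv "$item" test/ ;;
            *.sh)  mv "$item" bash/ ;;
        esac
    fi
done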
I tried setting up SFTP for my users. Each of my users has their home directory at /var/www/public_html/$USER. When my users use SFTP, they should only see their own directories and be unable to move to other locations on the system. I followed the following tutorial: [URL]
The users are able to SFTP into the system successfully. However, they are able to see the whole system. Somehow, it appears that the users are not jailed in their home directories, although the tutorial states otherwise. The difference between my system and the tutorial is that I am using Dropbear as the SSH server while it uses OpenSSH. Although Dropbear does not support SFTP itself, I am able to log in over SFTP through the use of OpenSSH's sftp-server binary. As for the internal mechanics, I am not sure how it works.
Assuming that when I try to SFTP, sftp-server is run with the sshd_config settings, then everything should be working fine, right? Do I need to run a chroot command at all? The following is the procedure I used to attempt the objective:
1) Add a new user to the group SFTPonly. 2) chown user:SFTPonly on the user's home directory. 3) Modify sshd_config to match the tutorial, adjusting the paths.
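That assumption is the likely problem: Dropbear never reads sshd_config, so an OpenSSH-style chroot block like the tutorial's has no effect when Dropbear handles the login. For reference, the OpenSSH configuration the tutorial probably relies on looks like this:

Code:
Match Group SFTPonly
    ChrootDirectory %h
    ForceCommand internal-sftp
    AllowTcpForwarding no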
I am new to Linux and trying to find a file in subdirectories using the find command: find . -name '*.jpg' -type f. But I am unable to get the result, as the find command is not permitted by the server administrator. Is there any way to find files without using the find command?
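Two substitutes when find itself is blocked, both hedged: bash 4's globstar makes ** expand recursively, and locate queries the updatedb index if the server maintains one:

Code:
shopt -s globstar
printf '%s\n' **/*.jpg

locate '*.jpg'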
We recovered a large number of files from an HD I messed up. I am attempting to move large numbers of files of a given type (e.g. .txt, .jpg) into a folder by type, to more easily sort through them.
Here are the commands I have mainly been trying, with various edits:
Code:
Code:
So far the most common complaint I have gotten is "missing arguments to -execdir".
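That error usually means the -execdir command was never terminated; every -exec/-execdir needs a closing \; or +. A hedged reconstruction with hypothetical source and destination paths:

Code:
find /recovered -type f -name '*.jpg' -execdir mv {} /sorted/jpg/ \;
# or batched, which invokes mv far less often:
find /recovered -type f -name '*.txt' -exec mv -t /sorted/txt/ {} +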
I must be having a "senior moment". I just downloaded 'debian-sq-di-rc1-i386-netinst.iso' but I can't for the life of me find a list of Debian md5sums. I know I've done it before, but I'm stumped. Sorry to be a pain.
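On the Debian CD/DVD mirrors the checksum lists (MD5SUMS, SHA1SUMS) sit in the same directory as the images themselves. Once both are downloaded into one directory, a single line verifies the ISO; the grep just limits the check to the netinst image:

Code:
grep netinst MD5SUMS | md5sum -c -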
I want to copy all directories, files, hidden files, and hidden directories with one command. I want these items to replace any identical items in the target directory.
I have tried several things, such as:
cp -r *
cp -aR *
but I only seem to get visible files and directories. Obviously, I am missing something. (A brain, probably....)
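The glob is the culprit: * skips dotfiles by default. Copying the source directory's "." entry sidesteps globbing entirely, and rsync is an alternative; both hedged, with /src and /target as placeholder paths:

Code:
cp -a /src/. /target/
# or:
rsync -a /src/ /target/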
I'm looking for a fast way to verify a copy of a folder with 150GB of data in 33 files. Some of the files are a few KB, while a few are 20-30GB. I've done a file count, which is quick but doesn't verify that the files are intact. I tried running md5sum on them, which works but will probably take as long as copying the files in the first place. diff works too, but is also slow.
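Anything that proves byte-level identity has to read all 150GB, so md5sum or cmp is about as fast as it gets; a size-and-name sweep, though, is nearly instant and catches missing or truncated files. A sketch with placeholder paths /src and /copy:

Code:
find /src  -type f -printf '%s %P\n' | sort > /tmp/src.list
find /copy -type f -printf '%s %P\n' | sort > /tmp/copy.list
diff /tmp/src.list /tmp/copy.list && echo "names and sizes match"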