General :: Find Files With Specific Extension And Move To A Directory?
Apr 18, 2011
I need a little help. I want to find all files with the extensions "*.tar", "*.gz" and "*.zip" and move them all into the "/opt/old" directory. I've tried a command, but couldn't get it to work.
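One way to do this (a sketch, assuming GNU find and mv and that /opt/old already exists) is to match the three extensions in a single find and hand the results to mv:
Code:
find . -type f \( -name '*.tar' -o -name '*.gz' -o -name '*.zip' \) \
    -exec mv -t /opt/old {} +    # -t puts the destination first so {} + can batch files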
Many years ago, I converted a portion of my files to an arbitrary format with a specific extension. I no longer want them in this format and I would like to begin replacing them, because conversion is not an appropriate solution. Unfortunately, they are mixed, in separate folders under the same root folder, with files in my current format that have a different extension. I feel this process would be easier if I moved every folder that contains a file in the undesired format to a separate root folder. The files are stored on a Linux server and shared via Samba. How can I do this with a couple of commands or a script? I am open to other suggestions as well; I want to avoid time spent editing text files. Ultimately, a command that produced a list of full paths for those folders, sorted by the number of directory levels, would be a nice touch. A list of all of the files is clearly not what I'm looking for.
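A sketch of one way to get such a list, assuming GNU find and using .old as a placeholder for the unwanted extension:
Code:
# directories containing at least one *.old file, one per line,
# sorted by how many path components deep they sit
find /path/to/root -type f -name '*.old' -printf '%h\n' | sort -u |
    awk -F/ '{ print NF, $0 }' | sort -n | cut -d' ' -f2-
Each listed folder could then be moved with mv into the separate root.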
I want a shell script or single-line command to delete all files with an extension specified in the script; I have bash. For example: delete all files with the extension .obj.
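A minimal one-liner for the .obj example, assuming GNU find:
Code:
# delete every *.obj below the current directory;
# add -maxdepth 1 to stay in the current directory only
find . -type f -name '*.obj' -delete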
Terribly new to Linux and I find it mind-boggling. I work in brain imaging and unfortunately all of the analysis runs on Linux, and I do not understand computers well, coming as I do from a medical background. So my question: there are various folders of patient MRI scans (folders called P1, P2, P3, etc.) and within them are certain files that I am interested in (always with the same name in every folder, say image001). I would like a script that lets me copy or move this image001 from all these individual folders to another folder altogether.
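A minimal sketch, assuming the P1, P2, P3 folders all sit under one parent directory and using a placeholder destination; the patient folder name is prefixed so the copies do not overwrite each other:
Code:
# copy image001 out of every P* folder into one collection directory
for d in P*/; do
    cp "${d}image001" /path/to/collected/"${d%/}_image001"
done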
I want a list of all my mp3 files (or any other kind of file, actually) telling me HOW MANY OF THEM I have on my computer. I tried both the find and locate commands in the terminal, but they don't tell me how many files I have.
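Piping find into wc -l gives the count; a sketch for mp3 files under the home directory:
Code:
find ~ -type f -iname '*.mp3' | wc -l    # case-insensitive match, then count the lines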
I have a PHP script in the cron directory that generates 5 text files. After the files are generated, I want to create a script that will move the 5 text files to another folder named "web".
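If those five files are the only .txt files in that directory, a single mv is enough; both paths below are placeholders:
Code:
mv /path/to/cron/*.txt /path/to/web/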
I'm trying to find all zip files timestamped within the past 7 days, then unzip them into a different directory. I tried the following, but it only unzipped one of the three files that meet the 7-day criteria. What am I missing?
Code:
find /home/user/public_html/zip_files/ -iname "*.zip" -mtime -7 -print0 | xargs -n10 unzip -LL -o -d /home/user/public_html/another_directory/
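Two likely culprits: -print0 needs xargs -0, and unzip treats any extra arguments as member names inside the first archive, so it has to be run once per zip file (-n1). A sketch of a corrected pipeline:
Code:
find /home/user/public_html/zip_files/ -iname "*.zip" -mtime -7 -print0 |
    xargs -0 -n1 unzip -LL -o -d /home/user/public_html/another_directory/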
The system currently has a directory holding all the invalid files. How bad is it to move a single file into a directory that already contains 3 million files?
I would like to overwrite files in a directory tree, recursively. The ones I would like to overwrite match the filename "x_alpha*.png" and are exactly 456 bytes in size. Is there any way to search for these recursively in a directory tree and overwrite them with a reference file, for example "e:\mydir\good.png"?
I am using Windows 7, but I have UnxUtils, so I can use those too. What I am looking for is something like this, generated automatically:
Code:
copy /y e:\mydir\good.png e:\mydir\ac\x_alpha0023.png
copy /y e:\mydir\good.png e:\mydir\efg\x_alpha0045.png
copy /y e:\mydir\good.png e:\mydir\h\x_alpha0248.png
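Since UnxUtils ships a GNU-style find, something along these lines might work (the paths are guesses at the intended layout, and -size 456c means exactly 456 bytes); from cmd.exe the trailing ; normally does not need a backslash:
Code:
find e:/mydir -type f -name "x_alpha*.png" -size 456c -exec cp -f e:/mydir/good.png {} ;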
I am a member of a group which has written a program whose source code is held in a specific directory (~cs252/Assignments/basicAsst/project), and we want to go through and change the parameters of the function "sequentialInsert." My job is to find all occurrences of the call to "sequentialInsert" and also list the files the code came from. Also, I have to be in the commandsAsst directory when I do this. I have tried grep and find combined together, and I am at a loss.
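grep -r can do both jobs on its own; a sketch, assuming commandsAsst sits next to basicAsst under ~cs252/Assignments (an assumption):
Code:
cd ~cs252/Assignments/commandsAsst                                  # run it from commandsAsst, as required
grep -rn 'sequentialInsert' ~cs252/Assignments/basicAsst/project    # every call, with file and line number
grep -rl 'sequentialInsert' ~cs252/Assignments/basicAsst/project    # just the list of files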
I'm looking for a script that can be run regularly with Cron.
Check a folder for rar files every few minutes, unrar them if present, and delete the leftover files once done.
Be able to specify within the script which folder to watch.
Apply an extension whitelist (.avi, .mkv, .mp4) and a blacklist (rar files) to the files to be moved.
Specify within the script which folder to move found files to.
I've seen a few scripts online that do some of this, or much more than this, but I'm looking for something that just does this in a simple and efficient way; a rough sketch of one approach follows below. (Also, for the life of me, I just can't work out how to edit those to do what I'm looking for.)
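A rough, minimal sketch of that idea, written to be run from cron, with both paths as placeholders (it assumes the unrar command is installed):
Code:
#!/bin/bash
# Extract rar archives in WATCH_DIR, move whitelisted files to DEST_DIR,
# and delete the leftover rar parts. Both paths are placeholders.
WATCH_DIR=/path/to/watch
DEST_DIR=/path/to/dest

cd "$WATCH_DIR" || exit 1
for rar in *.rar; do
    [ -e "$rar" ] || exit 0                               # nothing to do
    unrar x -o+ "$rar" && rm -f "${rar%.rar}".r?? "$rar"  # extract, then drop the parts
done
# move only whitelisted extensions; rar parts never match, so they stay behind
find "$WATCH_DIR" -maxdepth 1 -type f \( -iname '*.avi' -o -iname '*.mkv' -o -iname '*.mp4' \) \
    -exec mv -t "$DEST_DIR" {} +
A crontab line such as */5 * * * * /usr/local/bin/unrar-watch.sh (path hypothetical) would run it every five minutes.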
I have an Ubuntu NAS set up with two 1.5TB drives in a mirrored array. We recently needed more storage and will constantly be adding to this machine, so we added two 2TB drives in a striped array. What I'd like to do is find all directories totaling 10GB+ on the mirrored array and move them over to the striped array, to free space on the mirrored array for smaller, more important data. I've tried a few things without success.
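A sketch for finding the big directories, assuming GNU du and that the mirrored array is mounted at /mnt/mirror (a placeholder):
Code:
# print every directory (up to two levels deep) whose total size is 10 GB or more
du --block-size=1G --max-depth=2 /mnt/mirror 2>/dev/null | awk '$1 + 0 >= 10'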
I'm quite new to Linux, but I have configured a simple FTP server and it's working great. I have an FTP-Shared folder with upload and download subfolders. Under upload and download I have identical category subfolders, like mp3s, movies, software etc., in both. As people upload, I would like a crontab line that moves everything under /FTP-Shared/upload/mp3/* older than 14 days to /FTP-Shared/download/mp3/, recursively (like the cp command), but the timestamp must be checked on the first-level directory and not on the sub-files, for example /mp3/Club Dance/CD1/Hallo world.mp3. This is how far I got:
Code:
[root@clients ~]# /usr/bin/find /FTP_Shared/upload/Mp3s/ -depth -mindepth 1 -mtime +14 -type d -exec mv -f {} /FTP_Shared/download/Mp3s/ \;
This command moves the directories and files, but it does not behave recursively the way I want.
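Two small changes may get the intended behaviour: -maxdepth 1 restricts the age test to the first-level directories (mv then carries everything underneath along in one go), and the trailing ; must be escaped from the shell. A sketch:
Code:
/usr/bin/find /FTP_Shared/upload/Mp3s/ -mindepth 1 -maxdepth 1 -type d -mtime +14 \
    -exec mv -f -t /FTP_Shared/download/Mp3s/ {} +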
I want to move all files and directories that are 1 month old out to a backup in a separate folder. There will be a lot of files and I want to make sure they copy properly. The problem I'm having is integrating an md5sum check to verify integrity. md5sum is not recursive, so I figured it could work in a loop: as each individual file is copied, I'll run md5sum on it and delete the checksum once the copy is verified.
[Code]...
I also need some sort of error handling to output all the md5s that didn't pass the hash check.
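A sketch of that loop, with the source, destination and log paths as placeholders; the source file is removed only when the checksums match, and mismatches are logged:
Code:
#!/bin/bash
# copy files older than 30 days into a backup tree, verify each copy with
# md5sum, remove the source only when the checksums match, log any mismatch
SRC=/path/to/source
DST=/path/to/backup
FAILED=/tmp/md5_failures.log

find "$SRC" -type f -mtime +30 -print0 |
while IFS= read -r -d '' f; do
    rel=${f#"$SRC"/}
    mkdir -p "$DST/$(dirname "$rel")"
    cp -p "$f" "$DST/$rel"
    src_sum=$(md5sum < "$f")
    dst_sum=$(md5sum < "$DST/$rel")
    if [ "$src_sum" = "$dst_sum" ]; then
        rm -f "$f"                              # copy verified, drop the original
    else
        echo "MD5 mismatch: $f" >> "$FAILED"    # keep the original, record the failure
    fi
done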
I have a script that checks a folder for zip files and then moves them to a different folder. I want to check every 5, maybe 10, seconds, and since cron runs at 1-minute increments at best, I'm not sure how to do that time check; probably as a loop within the script. One other thing: once the time check is in the script, how should the cron job be set up to run it? Once the script is running, cron doesn't need to start it again; is there a way to check whether it's already running and only start it if it's not?
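One common pattern is a script that loops with sleep and takes a lock, so only one copy can ever run; cron then just tries to start it and flock quietly exits if it is already running. A sketch with placeholder paths:
Code:
#!/bin/bash
# poll every 10 seconds; the flock makes a second copy exit immediately
exec 9>/tmp/zipmove.lock        # lock file path is a placeholder
flock -n 9 || exit 0
while true; do
    mv /path/to/incoming/*.zip /path/to/outgoing/ 2>/dev/null
    sleep 10
done
A matching crontab entry could simply try to start it every minute, e.g. * * * * * /usr/local/bin/zipmove.sh (path hypothetical); if an instance already holds the lock, the new one exits at once.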
I'm trying to find a proper command to move a certain set of files according to date/time range. I am thinking that the command should be something like:
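One possibility (a sketch with placeholder dates and paths) is GNU find's -newermt test, which compares modification times against a given timestamp:
Code:
# move files last modified between the two timestamps
find /path/to/src -type f -newermt '2011-04-01' ! -newermt '2011-04-18' \
    -exec mv -t /path/to/archive {} +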
How can I rename all files in a directory up to the first dot (thereby leaving the file extension alone) to the same thing? I'm trying to rename all my media files and associated files in a directory to (preferably) the name of the directory itself. If I have
Code:
A Clockwork Orange -
wzzyfg.cd1.avi
wzzyfg.cd2.avi
wzzyfg.nfo
ACO.fanart.jpg
orange.tbn
I'd like to automatically mass-rename them all to
Code:
A Clockwork Orange -
A Clockwork Orange.cd1.avi
A Clockwork Orange.cd2.avi
A Clockwork Orange.nfo
A Clockwork Orange.fanart.jpg
A Clockwork Orange.tbn
I have rename on my server, which I used to remove underscores from file names, but I don't know how I would use it to rename everything up to the first period. Bonus points for renaming things to the name of the parent folder!
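A sketch of one way to do the bonus version directly in bash, run from inside the movie directory; mv -n refuses to overwrite if two new names collide:
Code:
#!/bin/bash
# rename every file so the part before the first dot becomes the
# name of the directory the script is run in
dir=$(basename "$PWD")
for f in *.*; do
    [ -f "$f" ] || continue
    mv -n -- "$f" "$dir.${f#*.}"    # ${f#*.} is everything after the first dot
done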
Is there an easy way to do a recursive command-line search on a path for a particular file extension? I want to build a script that will check for the existence of any .xxx files in a recursive path and, if they exist, run the "mail" command to send me a message. I already have mail running on the server. My thoughts were to try ls -R | grep .ini or find . | grep .ini, but neither of those returns only the .ini files; they also return files with names such as .ini.bak, .ini.original, .ini.old, etc.
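Anchoring the pattern with find's -name avoids the .ini.bak matches, since -name '*.ini' only matches names that end in .ini. A sketch, with the path and address as placeholders:
Code:
found=$(find /path/to/check -type f -name '*.ini')
if [ -n "$found" ]; then
    printf '%s\n' "$found" | mail -s "ini files found" you@example.com
fi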
I have a directory with subdirectories in it, and inside is a bunch of pictures. I would like to find all the pictures and move them to one tmp directory. While moving, there might be files with the same name, and the command I use overwrites when two files share a name. Is there any simple way to copy all the files into one directory and not lose any? Appending something to the second file, even a random character, would do.
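GNU mv can keep clashing names instead of overwriting them, by renaming the existing file to name.~1~, name.~2~ and so on. A sketch, with the paths and extensions as placeholders:
Code:
find /path/to/pics -type f \( -iname '*.jpg' -o -iname '*.png' \) \
    -exec mv --backup=numbered -t /path/to/tmp {} +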
I need to know how to find the number of files in a directory. Are there any system calls for this in Fedora 12? And I need to know how to perform an operation when that count increases by one.
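There is no dedicated system call for a file count, but a quick count plus an inotify watch (from the inotify-tools package) covers both halves; the path is a placeholder:
Code:
find /path/to/dir -maxdepth 1 -type f | wc -l    # current number of files

# run something every time a new file is created in the directory
inotifywait -m -e create /path/to/dir |
while read -r watched event name; do
    echo "new file: $name"                       # react to the count going up
done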
I am trying to find files in a directory whose names contain numbers. I have tried ls /etc *[0-9]* but that doesn't work. If I cd to /etc and run ls *[0-9]* it almost works, but it also includes results from inside matching directories. My last thought was to try find /etc [0-9] -type f, but this does not work either. My second problem is that I am trying to get a list of files in a directory that were changed less than 10 hours ago, using grep, while leaving out directories. I am completely stuck on the second problem.
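Both can be done with find alone; -maxdepth 1 keeps it to /etc itself, -type f drops directories, and -mmin -600 means modified within the last 600 minutes (10 hours). A sketch:
Code:
find /etc -maxdepth 1 -type f -name '*[0-9]*'    # file names containing a digit
find /etc -maxdepth 1 -type f -mmin -600         # changed in the last 10 hours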
I am trying to write a script to move all the .txt files from multiple directories into one directory, adding the parent directories of each file to its file name. It's a little complicated to explain, but I hope the partial script I have makes what I'm trying to do clearer.
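A sketch of one way to do it, with SRC and DEST as placeholders; each file's relative path is flattened into its new name by turning slashes into underscores:
Code:
#!/bin/bash
SRC=/path/to/source
DEST=/path/to/collected
cd "$SRC" || exit 1
find . -type f -name '*.txt' | while IFS= read -r f; do
    new=$(printf '%s\n' "${f#./}" | tr '/' '_')   # dir1/dir2/file.txt -> dir1_dir2_file.txt
    mv -- "$f" "$DEST/$new"
done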
I am looking for the equivalent of Windows Search for file name patterns (not file contents, but file names).
I am aware of "globbing" and the wildcard functionality in ls, but I am still not able to find files recursively under directories.
For example: I want to find all files whose names start with the string lsnr* under the root directory / and any subdirectories.
That is, I want to look for files like lsnr*.* anywhere under / and under any subdirectories of /, such as /dir1/dir2/dir4 and /dir1/other/dir/someotherdir/sub-dir, etc.
So if I have /dir1/lsnrcontrol and also /dir1/dir/2/dir3/lsnr-tinit.dat, then I want those file names listed, etc.
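find handles this directly; redirecting stderr hides the permission-denied noise a whole-filesystem search produces:
Code:
find / -type f -name 'lsnr*' 2>/dev/null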