General :: Write A Script To Move Specific Files In Various Folders To One Folder?
Nov 16, 2010
Terribly new to Linux and find it mind-boggling. I work on brain imaging, and unfortunately all of the analysis runs on Linux; I do not understand computers well, coming as I do from a medical background. So my question: there are various folders of patient MRI scans (folders called P1, P2, P3, etc.) and within them are certain files that I am interested in (always given the same name in every folder, say image001). I would like a script that lets me copy or move this image001 from all of these individual folders to another folder altogether.
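A minimal sketch of how this could be done, assuming the patient folders all sit under one parent directory and the destination folder already exists (both paths here are placeholders). Each copy is renamed after its source folder so the files do not overwrite each other:

Code:
#!/bin/bash
# Copy image001 from every patient folder (P1, P2, ...) into one collection folder.
for dir in /path/to/patients/P*/; do
    patient=$(basename "$dir")
    if [ -f "$dir/image001" ]; then
        cp "$dir/image001" "/path/to/collected/${patient}_image001"
    fi
done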
Many years ago, I converted a portion of my files to an arbitrary format with a specific extension. I no longer want them in this format and would like to begin replacing them, because conversion is not an appropriate solution. Unfortunately, they are mixed into separate folders of the same root folder alongside files in my current format, which has a different extension. I feel this process would be easier if I moved every folder that contains a file in the undesired format to a separate root folder. The files are stored on a Linux server and shared via Samba. How can I do this with a couple of commands or a script? I am open to other suggestions as well; I want to avoid time spent editing text files. Ultimately, a command that produced a list of full paths for the matching folders, sorted by the number of levels, would be a nice touch. A list of all of the files is clearly not what I'm looking for.
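One possible approach, assuming the unwanted extension is .old (substitute your own) and that recreating the relative layout under the new root is acceptable; both root paths are placeholders:

Code:
# List the full path of every folder containing at least one *.old file,
# sorted by depth (number of path components).
find /current/root -type f -name '*.old' -printf '%h\n' | sort -u \
    | awk -F/ '{ print NF, $0 }' | sort -n | cut -d' ' -f2-

# Move each such folder under /new/root, recreating its parent path first.
# (sort -u buffers the whole list before the moves begin; if a matching
# folder is nested inside another matching folder, the parent move takes
# it along and the later mv for the child simply fails.)
find /current/root -type f -name '*.old' -printf '%h\n' | sort -u \
    | while read -r dir; do
          rel=${dir#/current/root/}
          mkdir -p "/new/root/$(dirname "$rel")"
          mv "$dir" "/new/root/$rel"
      done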
I am using secure delete to remove files from a Debian Linux PC. However, secure delete does not remove folders, which has led me to look at writing a script that moves files to a predetermined folder for deletion. My plan is as follows: I have a folder on my desktop called shredder, where I move the contents of the waste bin. The script needs to identify all files within the folders and subfolders of the shredder folder, move each file into the shredder folder itself, and then delete the emptied folders. At that point secure delete can be used on the shredder folder with a command like shred -v -u *.*. The problem I have is in writing the code to move the files out of the different folders and then delete those folders. Note that the names of the files, folders, and subfolders will not always be known.
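A sketch of the flattening step, assuming the folder is ~/Desktop/shredder. File names do not need to be known in advance, and name collisions are sidestepped with mv --backup:

Code:
#!/bin/bash
SHRED_DIR="$HOME/Desktop/shredder"

# Pull every file out of any subfolder, up into the shredder folder itself.
find "$SHRED_DIR" -mindepth 2 -type f -exec mv --backup=numbered -t "$SHRED_DIR" {} +

# Remove the now-empty subfolders, deepest first.
find "$SHRED_DIR" -mindepth 1 -type d -empty -delete

# Securely delete everything left in the shredder folder.
shred -v -u "$SHRED_DIR"/*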
I've got a folder called Foo. In Foo, there are 20 folders called bar1, bar2, bar3, ..., bar20. In each of those barXX folders there are 2 files. How can I move those 2 files up one level into Foo with one command?
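Assuming you are in the directory that contains Foo, shell globbing makes this a one-liner:

Code:
# Move every file from Foo/bar1 ... Foo/bar20 up into Foo itself.
mv Foo/bar*/* Foo/

One caveat: if the files carry the same names in every barXX folder they would overwrite each other, in which case rename them on the way in a small loop instead.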
I dual-boot a computer from separate hard drives in Windows XP and Ubuntu 10.04. Here is the deal: on my Windows drive I have a folder that is filled with folders inside folders, all packed full of files. There is a 100% possibility that I have multiple copies of some files in multiple locations. Is there any nice command or program to move all the files in all the folders to one central folder and somehow get rid of the duplicate copies? Also, how do I compare two files that may or may not have the same name, but might otherwise be identical, to see whether they really are identical?
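A rough sketch of flattening plus de-duplication, using MD5 checksums as the identity test; it keeps the first copy it sees of each distinct file and skips the rest (both paths are placeholders):

Code:
#!/bin/bash
SRC=/path/to/windows/folder      # placeholder source tree
DEST=/path/to/central/folder     # placeholder destination
mkdir -p "$DEST"

declare -A seen
while IFS= read -r -d '' file; do
    sum=$(md5sum "$file" | cut -d' ' -f1)
    if [ -z "${seen[$sum]}" ]; then
        seen[$sum]=1                          # first time we see this content
        mv --backup=numbered "$file" "$DEST/"
    fi
done < <(find "$SRC" -type f -print0)

# To compare just two files directly, byte for byte, regardless of name:
cmp -s fileA fileB && echo identical || echo different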
I have a script that checks a folder for zip files and then moves them to a different folder. I want to check every 5, maybe 10, seconds, and since cron runs at a minimum of 1-minute intervals, I'm not sure how to do that time check, probably as a loop within the script. One other thing: once the time check is in the script, how should the cron job be set up to run it? Once the script is running, cron doesn't need to start it again; is there a way to check whether it's already running and only start it if it's not?
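A common pattern is to loop inside the script with sleep and let flock guarantee only one instance runs; cron can then try to start it every minute, and every attempt after the first exits immediately. A sketch, with placeholder paths:

Code:
#!/bin/bash
# Exit at once if another copy already holds the lock.
exec 200>/var/lock/zipmover.lock
flock -n 200 || exit 0

while true; do
    # Move any zip files that have appeared (silence the no-match error).
    mv /path/to/incoming/*.zip /path/to/processed/ 2>/dev/null
    sleep 5
done

With this in place, a crontab line like "* * * * * /usr/local/bin/zipmover.sh" (path is a placeholder) simply restarts the script if it ever dies; the lock ensures duplicates never pile up.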
Allow a specific user permission to read/write my folder
I have a folder called /TAR/Sketch
I added a new user named Snoopy. I want to grant this user the ability to add files and directories to this folder, which is under the group Sketches; the owner is me.
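Two common ways, sketched with your names; the ACL route avoids changing Snoopy's group memberships, but assumes the filesystem is mounted with ACL support:

Code:
# Option 1: group permissions - add Snoopy to the Sketches group
# and make the folder group-writable.
sudo usermod -aG Sketches Snoopy
sudo chmod g+rwxs /TAR/Sketch      # setgid bit keeps new entries in the group

# Option 2: a per-user ACL, leaving ownership and groups untouched.
sudo setfacl -m u:Snoopy:rwx /TAR/Sketch
sudo setfacl -d -m u:Snoopy:rwx /TAR/Sketch   # default ACL for new entries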
In the Linux bash shell, for a given directory, how can I list: the creation date for that directory, the number of files in that directory, and the number of subdirectories in that directory?
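A sketch for each item; note that classic Linux filesystems do not reliably record a creation time, so the closest stat can usually offer is the change or modification time (the path is a placeholder):

Code:
dir=/path/to/directory

stat -c 'Change: %z  Modify: %y' "$dir"       # closest thing to a create date
find "$dir" -maxdepth 1 -type f | wc -l       # files directly in the directory
find "$dir" -maxdepth 1 -mindepth 1 -type d | wc -l   # immediate subdirectories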
I want to copy all folders and files created from 01.01.2011 until today to a new place, i.e.: cp -r /home/moviecar/public_html/wp-content/uploads/ /home/teaser/public_html/wp-content/uploads
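find can select by date and cp --parents can keep the directory layout; one caveat is that most Linux filesystems track modification time rather than a true creation date. A sketch using your paths:

Code:
SRC=/home/moviecar/public_html/wp-content/uploads
DEST=/home/teaser/public_html/wp-content/uploads

# Copy everything modified since 1 Jan 2011, preserving the folder layout.
cd "$SRC" && find . -type f -newermt '2011-01-01' \
    -exec cp --parents -p {} "$DEST" \;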
Our client has a website (Joomla 1.5 based) to promote participation in and report results of a 10k race for runners in Scandinavia...
The race is scheduled for late June and will be covered by at least 5 still photographers at various points along the race course.
There are to be about 5,000 runners, each entered under a shirt number like bib_1000 through bib_5999, and this means there will be a significant number of digital jpg files submitted by the photographers...
The client has in place some software which will "automagically" resize, watermark, and rename the individual files in the form:
And so on with a possibility of as many as 10 or 12 images in the aggregate with the same alpha-numeric "shirt number" at the beginning of the filename.
Our presentation software does a great job of handling thumbnail image generation and displaying a slideshow in a lightbox of all files within a given folder on the webserver.
And now finally the question...
Given a folder containing 25,000 or so *.jpg files, how can we write a script that:
1. parses the filenames to extract the unique "shirt numbers"
2. makes folders with each "shirt number" as the folder name
3. moves the files from the root of the original folder into the appropriate "shirt number" folder
Note that the order of my list above is not important, and if you know of a better way to organize the task we are fine with that.
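A sketch under the assumption that each filename begins with the shirt number followed by an underscore (e.g. bib_1234_017.jpg would bucket under bib_1234); adjust the prefix extraction to the real naming scheme your software produces:

Code:
#!/bin/bash
cd /path/to/photos || exit 1      # placeholder for the image folder

for f in *.jpg; do
    # Assume the shirt number is the first two underscore-separated fields,
    # e.g. "bib_1234"; change the cut fields if the scheme differs.
    shirt=$(echo "$f" | cut -d'_' -f1-2)
    mkdir -p "$shirt"
    mv "$f" "$shirt/"
done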
I need a little help. I want to find all files with the extensions "*.tar", "*.gz", and "*.zip" and move all of those files into the "/opt/old" directory. I've tried this command:
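Since the command that was tried did not survive above, here is one way this is commonly written; -o groups the alternative name tests and the parentheses must be escaped for the shell:

Code:
find /path/to/search \( -name '*.tar' -o -name '*.gz' -o -name '*.zip' \) \
    -type f -exec mv -t /opt/old {} +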
I am looking for a script, advice, or guidance on how to write a script so that when I use the 'del' command it removes/sends the files/folders to a folder I specify, for example 'dustbin'.
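A minimal sketch of such a 'del' replacement, written as a shell function you could put in ~/.bashrc; the dustbin location is a placeholder:

Code:
# Move arguments into a dustbin folder instead of deleting them.
del() {
    local bin="$HOME/dustbin"
    mkdir -p "$bin"
    mv --backup=numbered -t "$bin" "$@"
}

Usage would then be, for example: del oldfile.txt some_folder/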
In my bash script I need to move the files in a folder, but only those that are not in use.
Code:
for entry in `ls /root/shared_storage/input`; do
    echo $entry
    run=`lsof /root/shared_storage/input/$entry`
    ru=${run:0:5}
    echo $entry
    if [ "$ru" == "" ]; then
        ........

It worked fine sometimes, but sometimes it just gets stuck at lsof. Is there any other way I can check here whether some other process is using $entry?
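One alternative to parsing lsof output is fuser, whose exit status already answers "is anything using this file?"; wrapping it in timeout guards against the hangs described. A sketch (iterating over a glob also avoids parsing ls, which breaks on spaces; the processed folder is a placeholder):

Code:
#!/bin/bash
for entry in /root/shared_storage/input/*; do
    echo "$entry"
    # fuser exits 0 when some process holds the file open, 1 when none does;
    # timeout kills the check (exit 124) if it hangs for more than 5 seconds.
    timeout 5 fuser -s "$entry" 2>/dev/null
    status=$?
    if [ "$status" -eq 1 ]; then
        # Nothing is using the file - safe to move.
        mv "$entry" /root/shared_storage/processed/
    fi
done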
I need 2 Linux users to share a folder. Within this folder, users should always be able to create files and sub-folders and write into any sub-folder (whether they own it or not). However, they should only be able to edit the files they actually own.
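The usual recipe is a shared group plus the setgid and sticky bits: setgid keeps new entries in the shared group, the sticky bit stops users deleting or renaming each other's entries, and per-file edit rights then follow normal ownership. A sketch, assuming both users are already in a group called shared (names and path are placeholders):

Code:
sudo chgrp -R shared /srv/shared
sudo chmod 3775 /srv/shared   # 2=setgid (inherit group), 1=sticky (no deleting others' files)

# With the common umask of 022, new files come out 644, so only their owner
# can edit them - but new subfolders are then 755, so group write has to be
# re-granted on directories, e.g. periodically:
find /srv/shared -type d -exec chmod g+w {} +

This is a starting point rather than a complete solution; the exact per-file behaviour depends on each user's umask.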
I need to search a bunch of files in a specific folder for a specific number and add all the numbers together into a total sum. I use rsync every day, and every time I run rsync I get a log file (rsync output) which contains the text string "Total bytes sent: xxxxxx".
The "xxxxxx" can vary in length. I need to extract the "xxxxxx" from each file and add the numbers together into a total size over a week or a month. Is this possible? And I wish to only use bash. One thing at a time, my friends.
I have many files and folders in my source folder. I want to copy some files and folders from that source folder to a destination folder. What options do I need to give the cp command?
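A minimal example with placeholder names; -r is the key option when folders are among the things being copied:

Code:
# Copy selected files and folders (recursively) to the destination.
cp -r /path/to/source/file1.txt /path/to/source/dir1 /path/to/destination/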
sudo cp ../../../rootfs_maker_ramdisk drivers/filesystem/ -rf gives the error below:
Code:
cp: cannot create special file `...._rootfs/dev/hda4': No such device or address
I get this error only in some specific locations. If I don't use sudo, I get a permission denied message instead.
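"No such device or address" on entries under dev/ usually means the target filesystem cannot hold device nodes, or the copy is trying to open them rather than recreate them. If the device nodes matter, piping the tree through tar as root is one hedged workaround, since tar recreates special files instead of opening them:

Code:
sudo tar -C ../../../rootfs_maker_ramdisk -cf - . \
    | sudo tar -C drivers/filesystem -xpf -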
I am trying to use gzip to compress a directory or file list given as arguments, producing a file named copia101225 within a directory named ZIPFILES. I want to make sure that if an argument doesn't exist the script reports it, and that if the destination directory doesn't exist the script creates it. I keep failing at compressing the files into copia101225 within the ZIPFILES directory. This is what I have so far:
#!/bin/bash
# Title: Compress a file
# Author: Jose Miguel Colella
# Description: Compress a file
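A sketch of how the rest might be structured, keeping the header above. Since gzip alone cannot bundle a directory, this goes through tar with the -z (gzip) option; argument checking and directory creation happen first:

Code:
#!/bin/bash
# Title: Compress a file
# Author: Jose Miguel Colella
# Description: Compress the given files/directories into ZIPFILES/copia101225

DEST=ZIPFILES

if [ $# -eq 0 ]; then
    echo "usage: $0 file_or_directory..." >&2
    exit 1
fi

for arg in "$@"; do
    if [ ! -e "$arg" ]; then
        echo "error: $arg does not exist" >&2
        exit 1
    fi
done

mkdir -p "$DEST"                            # create the destination if missing
tar -czf "$DEST/copia101225.tar.gz" "$@"    # gzip-compressed archive of all arguments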
I'm totally new to Linux; in fact, I'm a Windows admin. Scenario: I need to run a script that will automatically move files that are 30 days old from one particular folder to another particular folder.
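find's age test covers this; a sketch with placeholder folders (-mtime +30 means "last modified more than 30 days ago"):

Code:
find /path/to/source -maxdepth 1 -type f -mtime +30 \
    -exec mv -t /path/to/archive {} +

To automate it, a daily cron entry such as "0 2 * * * /usr/local/bin/archive-old.sh" (path is a placeholder) would run the script every night at 02:00.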
I'm trying to move font files (.ttf and .otf) from the download folder to a folder Inkscape can find them in. I tried dragging and dropping them in Dolphin, but I don't have permission! So I tried in the terminal:
Code:
~$ mv ~/downloads/fonts/*.*tf /usr/share/fonts
mv: cannot stat `/home/bryan/downloads/fonts/*.*tf': No such file or directory
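Two separate things are likely going on here: the folder is probably ~/Downloads with a capital D (so the glob matches nothing, hence "cannot stat"), and /usr/share/fonts is root-owned, so the move would also need sudo. A hedged sketch; installing into your own ~/.fonts avoids root entirely:

Code:
# System-wide (needs root):
sudo mv ~/Downloads/fonts/*.?tf /usr/share/fonts/

# Or per-user, no root required:
mkdir -p ~/.fonts
mv ~/Downloads/fonts/*.?tf ~/.fonts/
fc-cache -f    # refresh the font cache so applications see the new fonts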
I need to write a short script that will compress a specific folder on the Desktop (and all its contents) and also encrypt it with a password stored inside the script, meaning it won't ask for a password and verification when compressing and encrypting.
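zip can take the password on the command line, which fits the no-prompt requirement (with the obvious caveat that a password inside a script is readable by anyone who can read the script, and zip's classic encryption is weak; gpg or 7z would be stronger alternatives). A sketch with placeholder names:

Code:
#!/bin/bash
PASSWORD='s3cret'                 # placeholder - embedded as requested
SRC="$HOME/Desktop/myfolder"      # placeholder folder name

# -r recurses into the folder; -P supplies the password non-interactively.
zip -r -P "$PASSWORD" "$HOME/Desktop/myfolder.zip" "$SRC"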
I've got a NAS running and I'd like to somehow make some of the folders and files invisible to certain users only. For example, if I 'ls' a directory, I want to see files 'a', 'b', and 'c', but if another user does 'ls' in the same directory, I only want them to be able to see 'a' listed. I know I can use 'chmod 700' to make certain files unreadable and unwritable, but the filename would still appear in an 'ls'. I know I can list certain files inside a '.hidden' file in the folder, but then they would be hidden from everyone. Edit: I'd also like to mention that the users that connect to the NAS could be coming from Windows or Mac operating systems, so hopefully the solution would work for users from those systems too.
I need to write a shell script that reads the contents of a folder and places the files on a remote FTP server. I need to make sure that a file already placed on the remote FTP server is not attempted a second time. The file names will be something like Records-2011-05-09. The files are generated by MySQL every hour.
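One simple way to make the uploads idempotent is to keep a local "sent list" and skip anything already on it; the sketch below uses curl for the FTP transfer (host, credentials, and paths are placeholders):

Code:
#!/bin/bash
SRC=/path/to/records
SENT=/var/lib/ftp-uploader/sent.list
mkdir -p "$(dirname "$SENT")"; touch "$SENT"

for f in "$SRC"/Records-*; do
    name=$(basename "$f")
    grep -qxF "$name" "$SENT" && continue      # already uploaded - skip
    if curl -sS -T "$f" "ftp://user:password@ftp.example.com/incoming/"; then
        echo "$name" >> "$SENT"                # record only confirmed successes
    fi
done

Run hourly from cron, a failed upload is simply retried on the next pass because it never made it onto the sent list.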
Ubuntu 10.10 on an ext4 partition. I deleted a folder that sat on an NTFS partition I use as data storage. I notice that when I delete folders or files on this NTFS partition there is no option to move them to the waste basket; they are just deleted. If the folder still exists on the hard drive (has not been overwritten) I may be able to retrieve it, but where could it be? On the NTFS partition?
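Yes, the deleted entry would still be on the NTFS partition itself; Linux's trash only covers its own locations, hence the immediate delete. If nothing has been written there since, ntfsundelete (from the ntfsprogs/ntfs-3g tools) may find it; the device name and inode below are placeholders:

Code:
sudo umount /dev/sdb1                 # stop all writes to the partition first
sudo ntfsundelete /dev/sdb1 --scan    # list recoverable files and their inodes
sudo ntfsundelete /dev/sdb1 --undelete --inodes 12345 --destination ~/recovered/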
I have a folderA that contains folderB that contains a lot of files. I would like to get rid of folderB, but not its contents. I want those contents to be inside of folderA. How can I accomplish this on the commandline?
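Assuming you are in folderA's parent directory, mv plus rmdir does it; the second mv covers hidden files, which a plain glob skips:

Code:
mv folderA/folderB/* folderA/                     # regular files and folders
mv folderA/folderB/.[!.]* folderA/ 2>/dev/null    # dot-files too, if any
rmdir folderA/folderB                             # fails safely if anything is left behind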
What would be the best way to automatically copy all of the data off of a library of CDs to a specific folder on the computer? I was thinking of running a bash script, but I've run into a few snags figuring out the correct way to do it, mainly because the CD drive is mounted in a different folder under /media each time I insert a new disc. Also, the mounting and unmounting process causes its own problems, but I think that could be covered by a loop that checks /etc/mtab every few seconds or so.
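Since the mount point varies with each disc's volume label, the script has to discover it each time. A sketch that polls /etc/mtab, copies, then ejects, assuming the drive is /dev/sr0 (the destination folder is a placeholder; mount points containing spaces appear escaped in mtab and would need extra handling):

Code:
#!/bin/bash
DEST="$HOME/cd_archive"
mkdir -p "$DEST"

while true; do
    # Find where the CD drive is currently mounted, if at all.
    mnt=$(awk '$1 == "/dev/sr0" { print $2 }' /etc/mtab)
    if [ -n "$mnt" ]; then
        disc="$DEST/$(basename "$mnt")_$(date +%s)"   # unique folder per disc
        mkdir -p "$disc"
        cp -r "$mnt"/. "$disc"/
        umount "$mnt" && eject /dev/sr0               # ready for the next disc
    fi
    sleep 5
done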