Debian :: Find And Copy Same Files To Single Directory?
Feb 17, 2011
I have a number of crash.log files scattered about my system and I would like to run a command to find all the crash.log files on the system and copy them to a single directory, each with a unique filename. For example, copy crash.log from ~/directory_1, ~/directory_2, ~/directory_3 and so on to ~/crash_logs/crash.log1, ~/crash_logs/crash.log2, ~/crash_logs/crash.log3 etc.
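A minimal sketch of one way to do this, assuming GNU find and that ~/crash_logs is the collection directory (the search root and paths are illustrative):
Code:
mkdir -p ~/crash_logs
i=1
find / -type f -name crash.log 2>/dev/null | while read -r f; do
    cp "$f" ~/crash_logs/crash.log$i   # crash.log1, crash.log2, ...
    i=$((i + 1))
done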
I'd like to copy a file, say widgets/water.txt, to all subfolders in the folder widgets using a single command. So if the folder widgets has 10 subfolders like widgets/blue, widgets/green, etc. I'd like to copy water.txt to all of them with one command.
I tried the commands
Code:
cp water.txt ./*/water.txt
cp water.txt ./*/
However, neither of these works; the latter gives 'cp: omitting directory' errors.
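That error is expected: when cp is given several directories, it treats only the last argument as the destination and refuses the earlier directories as sources. A simple loop, run from inside widgets, copies to one subfolder at a time (a sketch, not tied to any particular layout):
Code:
for d in ./*/; do
    cp water.txt "$d"
done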
I want to copy a few files from my Windows directory into the Wine directory. It's no big deal, just a few preference files so I don't have to set something up all over again. The trouble is, I had the files copied, but I can't find the Wine C: drive directory anywhere. Does anyone know where this can be found?
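For what it's worth, Wine's default prefix is a hidden directory in the home folder, so it is easy to miss in a file manager:
Code:
ls -a ~/.wine          # the prefix directory is hidden
ls ~/.wine/drive_c     # this is Wine's C: drive by default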
I am trying to find log files by date (day/month/year). I then want to copy those files to another directory whose name is that date (day/month/year).
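A sketch of one way to do this with GNU find (4.3.3 or later for -newermt); the date and paths here are hypothetical:
Code:
day=2011-02-17                      # hypothetical date to search for
next=2011-02-18                     # the following day
mkdir -p ~/logs_by_date/"$day"
find /var/log -type f -name '*.log' \
     -newermt "$day" ! -newermt "$next" \
     -exec cp -t ~/logs_by_date/"$day" {} +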
I installed a new music application. It reads covers.jpg as the cover of the album; however, my cover files were named album.jpg. I don't want to rename them; I want to make a copy of each album.jpg, named covers.jpg, in the same folder where it currently is. I have looked around to see how I can do this but have not been able to find anything.
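A sketch using GNU find's -execdir, which runs the command inside each matched file's own directory; ~/Music stands in for wherever the collection actually lives:
Code:
find ~/Music -type f -name album.jpg -execdir cp album.jpg covers.jpg \;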
Recently I have been checking rsync speeds for a single file (a 2.4GB ISO) versus a directory (900MB of files). When I run rsync on the single file, the average speed is 50MB/s; for the directory, the average is 10MB/s. Is there any reason behind this? I tried to Google it but could not grasp the concept.
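The usual explanation is per-file overhead: for every file rsync has to stat it, exchange metadata with the far side, and open and close it, so many small files spend far more time on bookkeeping than one large stream does. One way to see the effect is simply to time both transfers (hypothetical paths and host):
Code:
time rsync -av /data/big.iso      user@host:/backup/
time rsync -av /data/small_files/ user@host:/backup/small_files/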
I'm quite new to Linux, but I have configured a simple FTP server and it's working great. I have an FTP-Shared folder with upload and download subfolders. Under upload and download I have identical category subfolders, like mp3s, movies, software etc., in both. As people upload, I would like to create a crontab line to move all content under /FTP-Shared/upload/mp3/* older than 14 days to /FTP-Shared/downloads/mp3/ recursively (like the cp command), but the timestamp must be checked on the first-level directory, not the files below it, for example: /mp3/Club Dance/CD1/Hallo world.mp3. This is how far I got:
Code:
[root@clients ~]# /usr/bin/find /FTP_Shared/upload/Mp3s/ -depth -mindepth 1 -mtime +14 -type d -exec mv -f {} /FTP_Shared/download/Mp3s/ \;
This command moves the directories and files, but it does not behave recursively the way I want.
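A sketch of a crontab line under those assumptions: -maxdepth 1 restricts the -mtime test to the first-level directories (e.g. 'Club Dance'), and mv then moves each whole tree in one step. The 2 a.m. schedule is just an example:
Code:
0 2 * * * /usr/bin/find /FTP_Shared/upload/Mp3s/ -mindepth 1 -maxdepth 1 -type d -mtime +14 -exec mv -f {} /FTP_Shared/download/Mp3s/ \;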
How do I copy a read-only file and make the copy writable with a single cp command in Linux (Ubuntu 10.04)? The --no-preserve and --preserve options seemed like good candidates, except that they appear to "and" the mode flags, while what I am looking for is something that will "or" them (add the +w bit).
More details: I have to import a repository from Git to Perforce. I want all Perforce depot files to be read-only (that is how Perforce was designed), while all files derived/copied from depot files are writable. Currently, if a Makefile copies a read-only file, the derived file is also read-only. This leads to build errors when cp tries to overwrite the read-only derived file a second time. Of course --force is a workaround, but then the derived file is still read-only. I also do not want to mess with chmod after each cp command; I will do that only as a last resort.
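Two candidates, sketched with hypothetical paths: install copies and sets an explicit mode in one command, and GNU cp's --no-preserve=mode gives the copy default (umask-based) permissions instead of inheriting the read-only source mode:
Code:
install -m 644 depot/readonly.txt build/derived.txt
# or:
cp --no-preserve=mode depot/readonly.txt build/derived.txt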
I am using gPodder at the moment and would like to copy the 3 most recent episodes of every podcast to a second directory (in fact, the second directory resides on my Android phone, mounted via USB). The setup is as follows: gPodder downloads the episodes into one subdirectory per podcast.
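A sketch assuming that layout (one subdirectory per podcast) and a hypothetical mount point for the phone; ls -t sorts newest first, and filenames with embedded newlines would break the loop:
Code:
src=~/gpodder-downloads        # assumed download root
dst=/mnt/android/podcasts      # assumed phone mount point
for show in "$src"/*/; do
    ls -t "$show" | head -3 | while read -r ep; do
        cp "$show$ep" "$dst/"
    done
done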
I am trying to read multiple files and copy the files listed in them to a respective directory. I have a folder "Oracle". Under this I have multiple files, as mentioned below.
Each of the above files contains a list of files.
Now I have created new directories, with the same names as the above files, in another location.
The script I am looking for has to read the files under Oracle and back up the files they list into the appropriate new directories.
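A sketch under those assumptions, with hypothetical paths; each file in Oracle is treated as a plain list of absolute filenames, copied into the directory that shares the list's name:
Code:
lists=/path/to/Oracle          # assumed: one list file per group
dest=/path/to/new_location     # assumed: one directory per list file
for list in "$lists"/*; do
    name=$(basename "$list")
    while read -r f; do
        [ -f "$f" ] && cp "$f" "$dest/$name/"
    done < "$list"
done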
Shell scripting in Fedora 14. I want a script that searches the current folder for files and copies the first file it finds to a name given by the user; if that name already exists, it should echo an error message and finish. Command usage: bash scriptname copyASname
Something like:
Code:
#!/bin/bash
for files in /home/user/*
do
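Filling that in, a minimal sketch that takes the target name as its first argument and stops if the name already exists:
Code:
#!/bin/bash
# Usage: bash scriptname copyASname
target="$1"
if [ -e "$target" ]; then
    echo "Error: $target already exists" >&2
    exit 1
fi
for f in ./*; do
    if [ -f "$f" ]; then        # first regular file found
        cp "$f" "$target"
        exit 0
    fi
done
echo "Error: no file found" >&2
exit 1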
I am using my media server as my podcast collector. I am in the process of learning the ins and outs of NFS so I can mount an NFS directory and transfer my podcasts from server to player. For now I am using scp to transfer podcasts from the server to the desktop, then to the player. The problem is that the path to the directory of one of the podcasts is /home/user/gpodder-downloads/The BILL&TIMMY Show Podcast.
Whenever I try to run my scp command it fails, because it thinks that TIMMY is a command I want to run in the background. I have tried to backslash-escape the character, and I've tried single-quoting and double-quoting it, but I still get the same problem. As it sits now I have to move all podcasts to another directory and then transfer them to my desktop, but I would like to transfer the podcasts without unnecessary steps.
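The catch with scp is that the remote path is expanded a second time by the remote shell, so it has to be quoted twice: once for the local shell and once for the remote one. A sketch with a hypothetical host:
Code:
scp "user@server:'/home/user/gpodder-downloads/The BILL&TIMMY Show Podcast/'*.mp3" .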
I have directory a and directory b. They are big. b is almost identical to a; "almost" means that 4-5 files differ, and I don't know which ones. I want to copy b over a, but only the files that differ. I'm in bash. (No, I can't simply delete a and replace it with b, because 1) a is version-controlled and 2) a full copy (or a mv) would take too long. I want to copy only the files that differ.)
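rsync does exactly this; with --checksum it compares file contents rather than size and mtime, so only the genuinely differing files get copied. A dry run first shows which 4-5 files those are:
Code:
rsync -avn --checksum b/ a/    # dry run: list what would be copied
rsync -av  --checksum b/ a/    # copy only the differing files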
We have two folders: a source folder and a destination folder. In the source folder we have many subfolders and many files of different types. I need a script that copies or moves a defined number of files from the source folder to the destination folder. The files must be selected randomly, the subfolders within the source folder must also be selected randomly, and we must not take the defined number of files from just one subfolder of the source. In the destination folder, the subdirectory structure of the source should not be preserved. The solution should be robust and as simple as possible.
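A sketch using shuf from coreutils: find flattens the whole source tree into one list, shuf picks N entries at random (so the picks are spread across subfolders), and the copies land flat in the destination. Paths and N are hypothetical; identically named files would collide, hence the --backup option:
Code:
N=20                                    # assumed number of files
find /path/to/source -type f | shuf -n "$N" | while read -r f; do
    cp --backup=numbered "$f" /path/to/destination/
done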
I was trying to develop a script that checks the count of files on an hourly basis; if it finds any additions, it has to sftp them and send a status email with the filenames and the number of files copied via sftp. I will put it on cron to run every hour.
I'll use ls /abc | wc -l to count the number of files the first time; from then on, whenever a new file is inserted, the script will copy that file to another location. Alternatively, I'll take the dates of the files, and whichever has a newer date will be copied to another location.
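A rough sketch along those lines, using a timestamp file rather than a count (the host, address, and paths are all hypothetical, and it assumes a configured mail command):
Code:
#!/bin/bash
dir=/abc
stamp=/var/tmp/abc.stamp
[ -e "$stamp" ] || touch -d @0 "$stamp"      # first run: match everything
new=$(find "$dir" -maxdepth 1 -type f -newer "$stamp")
touch "$stamp"
if [ -n "$new" ]; then
    count=$(echo "$new" | wc -l)
    echo "$new" | sed 's/^/put /' > /tmp/sftp.batch
    sftp -b /tmp/sftp.batch user@remotehost:/incoming
    printf 'Copied %s file(s):\n%s\n' "$count" "$new" |
        mail -s "hourly sftp report" admin@example.com
fi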
Is there a simple command to copy files that have been created within the past 2 hours? I've been looking through the man pages for unison, rsync, find and cp, and I can't find anything I'm looking for. All I need is a simple command.
Code:
Copy folder a to b if created < 2 hours.
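One caveat: most Linux filesystems do not record a creation time, so modification time is the usual stand-in. With GNU find, -mmin -120 matches files modified within the last 120 minutes (a sketch, copying flat into b):
Code:
find a/ -type f -mmin -120 -exec cp -t b/ {} +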
I'm trying to copy a sample set of files/pictures to a directory on my desktop. For my sample from /home/user/pics containing 7,000+ pictures, I have a desired list of:
Code:
user@computer:/home/user/pics$ ls | tail
I use that to generate a list of a few files that I'd like to move to my desktop. I tried:
Code:
user@computer:/home/user/pics$ ls | tail | cp /home/user/Desktop
I thought that might feed the tail list of files to the cp command as arguments, but no luck. I then tried:
Code:
user@computer:/home/user/pics$ ls | tail | cp . /home/user/Desktop
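cp never reads filenames from its standard input, which is why both pipelines fail. xargs bridges the gap by turning the piped list into arguments (this version tolerates spaces, though not newlines, in names):
Code:
ls | tail | xargs -I{} cp {} /home/user/Desktop/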
I am trying to copy a CompactFlash card with a form of Linux on it (I found out it was a custom version based on Fedora Core 3). A flaky USB card reader seems to have hosed the flash card; it shows up with an unknown volume after ejecting the card and reinserting it. My troubleshooting so far: I have Ubuntu on a flash drive, which I used to start all this and to read the flash card.
- I tried Disk Utility to reformat the card with a Master Boot Record and the volume as ext3 with the Bootable flag set, and copied the files using cp on the command line.
- I tried ISO Master and mkisofs to make an ISO that the USB thumb-drive tools could use, but it wouldn't copy all the files. It looks like symbolic links were either ignored or, with -f, couldn't find their source files.
- I learned that I might need a boot partition with a boot image, which I think I have in initrd-2.6.14.7img, but I don't know how to do that. Do I also need a swap partition?
My updated goal: using the files from the flash card, make a bootable compact flash card with Fedora Core 3.
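Before experimenting further, it may be worth taking a raw image of the card with dd so there is a pristine copy to fall back on; /dev/sdX stands in for whatever device the card shows up as:
Code:
sudo dd if=/dev/sdX of=~/cf-card.img bs=4M conv=noerror,sync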
I am trying to copy all the files in a directory based on their modification date (i.e. created on Dec 29). I am not able to find the proper command for this. This is what I have tried.
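One approach is GNU find's -newermt test (find 4.3.3 and later), bracketing the wanted day; the year and the destination path here are assumptions:
Code:
find . -maxdepth 1 -type f -newermt 2010-12-29 ! -newermt 2010-12-30 \
     -exec cp -t /path/to/dest {} +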
I have a question which has in part been answered many times, but nothing I found relates completely to my situation. I am sure some people will say RTFM, but believe me, I did, and I searched as well, to no avail. I have a situation where I want to copy files created within the last hour in one directory into another one. The problem is that the directories are at different levels in the directory tree, so the absolute paths differ, but I want to keep the relative paths the same.
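GNU cp's --parents flag rebuilds the relative path under the destination; combined with find's -mmin it covers the last-hour requirement. A sketch with hypothetical directories, run from the source root so the paths passed to cp are relative:
Code:
cd /path/to/source &&
find . -type f -mmin -60 -exec cp --parents -t /path/to/dest {} +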
I want to copy new files from /mnt/path_to_webdav/user to /home/user. So if there is a new file /mnt/path_to_webdav/user/doc/xy.txt, I want it to be copied to /home/user/doc/xy.txt. Also, if there is a new directory, say /mnt/path_to_webdav/user/newdir, I want a new directory to be created at /home/user/newdir with all the files in it, should there be any. I can do find with -exec and copy all the files into one directory, but this is not what I want. How do I preserve the relative path and get the files copied into their corresponding directories?
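rsync does this in one step: it walks the source tree, recreates directories as needed, and copies only files that are new or changed, preserving the relative layout:
Code:
rsync -av /mnt/path_to_webdav/user/ /home/user/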
I need to copy all subdirectories and files from one directory to another every 5 minutes or so, with the old data automatically being overwritten by the new data. I'd also like this to run at startup. Is there any way this can be done? If so, what program do I need to schedule the automation, and what is the command line I need?
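cron can handle both the schedule and the startup run; a sketch for crontab -e, with rsync doing the mirroring. Drop --delete if files that vanish from the source should survive in the destination, and note that @reboot is a common cron extension rather than a POSIX guarantee:
Code:
@reboot     rsync -a --delete /path/to/source/ /path/to/dest/
*/5 * * * * rsync -a --delete /path/to/source/ /path/to/dest/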