Ubuntu :: Command To Copy Files In Directory Created Within 2 Hours?
Apr 22, 2010
Is there a simple command to copy files that have been created within the past 2 hours? I've been looking through the man pages for unison, rsync, find, and cp, and I can't find anything I'm looking for. All I need is a simple command:
Code:
Copy folder a to b if created < 2 hours.
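One hedged possibility, assuming GNU find and cp (Linux has no portable "creation time" test, so modification time is the usual stand-in; -mmin -120 means "changed less than 120 minutes ago"):
Code:
# copy files under a/ changed in the last 2 hours into b/,
# recreating the a/ prefix underneath b/
find a/ -type f -mmin -120 -exec cp --parents -t b/ {} +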
I need to copy all subdirectories and files from one directory to another every 5 minutes or so, with the old data automatically being overwritten by the new data. I'd also like this to run at startup. Is there any way this can be done? If so, what program would I need to schedule the automation, and what is the command line I would need?
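A sketch using rsync driven by cron, with /src and /dst as placeholder paths; the @reboot entry covers startup and the */5 entry runs every five minutes:
Code:
# added via crontab -e
@reboot     rsync -a /src/ /dst/
*/5 * * * * rsync -a /src/ /dst/
rsync -a overwrites changed files in place; add --delete if /dst should also drop files that disappear from /src.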
I created a directory somewhere with permissions rwxrwxr-x so that other users in my group can create files and directories in it.
I do need to be able to delete the contents of this "public" directory, but it seems that while I am able to remove any files in this directory, I cannot remove any subdirectories under it.
Is there a way to remove such subdirectories owned by others under a directory owned by me?
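For what it's worth, a short demonstration of why this happens: deleting an entry needs write permission on the directory that directly contains it, so an empty subdirectory owned by someone else can be removed (only your parent directory is modified), but a non-empty one typically cannot (its contents live in a directory you cannot write to):
Code:
rmdir theirs_empty     # works: only the parent directory (yours) changes
rm -r theirs_full      # usually fails with "Permission denied" on the contents
sudo rm -r theirs_full # root (or the subdirectory's owner) can do it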
I have a directory cookie_tmp which is owned by some:fella. Session cookies are being created under this directory. How can I set the directory so that files created in it are owned by some:fella?
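If it is the group that matters, the setgid bit on the directory does exactly this; there is no comparable way to force the owning user. A sketch, assuming the group is named fella:
Code:
chgrp fella cookie_tmp
chmod g+s cookie_tmp   # files created inside now inherit the group fella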
I want to copy a few files from my Windows directory into the Wine directory. It's no big deal, just a few preference files so I don't have to set something up all over again. The trouble is, I had the files copied, but I can't find the Wine C: drive directory anywhere. Does anyone know where this can be found?
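Wine keeps its C: drive inside a hidden directory under your home, which is why file managers don't show it by default:
Code:
ls ~/.wine/drive_c/
cp prefs.ini ~/.wine/drive_c/Program\ Files/SomeApp/   # destination is hypothetical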
I am trying to find log files by date/month/year. I then want to copy these files to another directory whose name is the date I searched for (date/month/year).
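A rough sketch with GNU find, assuming the logs live under /var/log and using 2010-04-22 as a stand-in for the date being searched; -newermt bounds the modification time and the target directory is named after that date:
Code:
d=2010-04-22
mkdir -p "/backup/$d"
find /var/log -type f -newermt "$d" ! -newermt "$d + 1 day" \
    -exec cp -t "/backup/$d" {} +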
I have backup_server and application_server. backup_server has a directory AAA. I need to check from application_server whether any new files were created today in the AAA directory, and if so, whether all the files were created today or only some of them.
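A sketch over ssh, assuming the directory is /AAA on backup_server and that "created today" can be approximated by modification time (-daystart makes -mtime 0 mean "since midnight"):
Code:
ssh backup_server 'find /AAA -type f -daystart -mtime 0' > today.txt
ssh backup_server 'find /AAA -type f' > all.txt
wc -l today.txt all.txt   # equal counts: everything was created today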
I would like to know how to use the grep command to filter out the log entries created between 3:00 PM and 4:30 PM from a bunch of logs covering the whole day under different headings. These files resemble the sar files in Linux.
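grep itself cannot compare times, but if each line starts with a timestamp, awk can do the range test; a sketch, with daylog.txt as a placeholder name:
Code:
# 24-hour HH:MM:SS in column 1 (string comparison works: fixed width)
awk '$1 >= "15:00:00" && $1 <= "16:30:00"' daylog.txt
# sar-style 12-hour output ("03:00:01 PM") needs the AM/PM column too
awk '$2 == "PM" && $1 >= "03:00:00" && $1 <= "04:30:00"' daylog.txt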
Is there a method at the command line to copy files from one location to another and retain the source files' group and user? I'm migrating some MySQL files from one machine to another, and I want to back up the original files presently in the directory. They have owner:group of mysql:mysql, some have owner:group root:mysql, and so on. When I copy them under the CLI or Nautilus everything changes to root, because I execute sudo cp or gksudo nautilus and copy via the GUI.
Since it is MySQL data I could simply do a dump of the databases and restore them on the other machine. But there are about 20 databases, and I want to do this via a copy because it will be faster - at least that is what I think.
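cp preserves ownership, mode, and timestamps with -p (or the -a archive flag) when run as root, so a hedged sketch of the backup, with the data directory path as an assumption:
Code:
sudo cp -a /var/lib/mysql /var/lib/mysql.bak
# across machines; the receiving side must run as root to keep owners
sudo rsync -a /var/lib/mysql/ root@othermachine:/var/lib/mysql/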
I am using Gpodder for the moment and would like to copy the 3 most recent podcast episodes of every podcast to a second directory (in fact the second directory resides on my Android phone, mounted via USB). The setup is as follows: Gpodder downloads the episodes in
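Assuming one subdirectory per podcast under the download directory, a rough sketch that copies the three newest files from each; both paths are placeholders, and it breaks on filenames containing newlines:
Code:
src=~/gpodder-downloads
dst=/media/phone/podcasts
for feed in "$src"/*/; do
    ls -t "$feed" | head -3 | while IFS= read -r f; do
        cp "$feed$f" "$dst/"       # ls -t lists newest first
    done
done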
I want to copy files from the server's CD drive and USB drive to the server's root directory, but I haven't found any command for listing the CD drive or USB drive.
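There is no single "list drives" command, but lsblk or fdisk -l shows the block devices, which you then mount; the device names below are typical guesses, not certainties:
Code:
lsblk                        # or: sudo fdisk -l
mkdir -p /mnt/cdrom /mnt/usb
mount /dev/sr0  /mnt/cdrom   # CD drive, often sr0
mount /dev/sdb1 /mnt/usb     # USB stick, often sdb1
cp /mnt/cdrom/somefile /     # then copy into the root directory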
I'd like to copy a file, say widgets/water.txt, to all subfolders in the folder widgets using a single command. So if the folder widgets has 10 subfolders like widgets/blue, widgets/green, etc. I'd like to copy water.txt to all of them with one command.
I tried the commands
Code:
cp water.txt ./*/water.txt
cp water.txt ./*/
However these don't seem to work. The latter gives 'cp: omitting directory' errors.
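That is expected: cp will not fan one source out to a glob of destinations. A loop (or find) issues one copy per subfolder:
Code:
for d in widgets/*/; do cp widgets/water.txt "$d"; done
# equivalent with find:
find widgets -mindepth 1 -maxdepth 1 -type d -exec cp widgets/water.txt {} \;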
We have two folders: a source folder and a destination folder. In the source folder we have many subfolders and many files of different types. I need a script that would copy or move a defined number of files from the source to the destination folder. The files must be selected randomly, the subfolders within the source folder must be selected randomly, and the defined number of files must not all come from just one subfolder of the source folder. In the destination folder the subdirectory structure of the source folder should not be preserved. The solution should be robust and as simple as possible.
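A sketch using GNU shuf, with N=10 as a placeholder count; picking uniformly from the full recursive file list also randomizes which subfolders contribute, and the copy flattens the structure as required (newline-containing filenames would break the pipe):
Code:
N=10
find /path/to/source -type f | shuf -n "$N" | while IFS= read -r f; do
    cp "$f" /path/to/dest/    # use mv instead to move
done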
I'm quite new to Linux, but I have configured a simple FTP server and it's working great. I have an FTP-Shared folder with upload and download subfolders. Under uploads and downloads I have identical category subfolders, like mp3s, movies, software, etc., in both. As the guys upload, I would like to create a crontab line to move all the content under /FTP-Shared/upload/mp3/* older than 14 days to /FTP-Shared/downloads/mp3/ recursively (like the cp command), but the timestamp must be checked on the first-level directory and not on the files below it, for example: /mp3/Club Dance/CD1/Hallo world.mp3. This is how far I got:
Code:
[root@clients ~]# /usr/bin/find /FTP_Shared/upload/Mp3s/ -depth -mindepth 1 -mtime +14 -type d -exec mv -f {} /FTP_Shared/download/Mp3s/ \;
This command moves the directories and files, but not recursively the way I want.
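Limiting the depth makes find match only the top-level directories, so each one moves intact with everything inside it, and the -mtime test applies to the directory itself rather than to the files below; a sketch against the paths above:
Code:
find /FTP_Shared/upload/Mp3s/ -mindepth 1 -maxdepth 1 -type d -mtime +14 \
    -exec mv -f {} /FTP_Shared/download/Mp3s/ \;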
I am trying to read multiple files and copy their contents (files) to respective directories. I have a folder "Oracle". Under this I have multiple files, as mentioned below.
Each of the above contains a list of files.
Now I have created new directories with the same names as the above files in another location, say
The script I am looking at has to read the files from Oracle and back up their contents to the appropriate new directories.
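A rough skeleton, assuming each file under Oracle is a plain-text list of absolute paths and that /newloc (a placeholder) already holds a directory named after each list file:
Code:
for list in Oracle/*; do
    name=$(basename "$list")
    while IFS= read -r path; do
        cp "$path" "/newloc/$name/"
    done < "$list"
done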
I'm trying to copy a sample set of files/pictures to a directory on my desktop. For my sample from /home/user/pics containing 7,000+ pictures, I have a desired list of:
Code:
user@computer:/home/user/pics$ ls | tail
I use that to generate a list of a few files that I'd like to move to my desktop. I tried:
Code:
user@computer:/home/user/pics$ ls | tail | cp /home/user/Desktop
I thought that might dump the tail list of files for an argument in the cp command, but no luck. I then tried:
Code:
user@computer:/home/user/pics$ ls | tail | cp . /home/user/Desktop
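Neither pipe can work because cp does not read file names from standard input; xargs (or command substitution) bridges the gap:
Code:
ls | tail | xargs -I{} cp {} /home/user/Desktop/
# or, simpler but unsafe with spaces in names:
cp $(ls | tail) /home/user/Desktop/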
I am using my media server as my podcast collector. I am in the process of learning the ins and outs of NFS so I can mount an NFS directory and transfer my podcasts from server to player. For now I am using scp to transfer podcasts from the server to the desktop and then to the player. The problem is that the path to the directory of one of the podcasts is /home/user/gpodder-downloads/The BILL&TIMMY Show Podcast.
Whenever I try to run my scp command it fails, because the shell thinks the & before TIMMY means I want to run a script in the background. I have tried to backslash-escape the character, and I've tried single-quoting and double-quoting it, and I still get the same problem. As it sits now I have to move all podcasts to another directory and then transfer them to my desktop, but I would like to transfer the podcasts without unnecessary steps.
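The catch with scp is that the remote path is expanded by the remote shell too, so it needs two layers of quoting: outer single quotes for the local shell, inner double quotes for the remote one, with the glob left outside the inner quotes so it still expands remotely. A sketch with the path above:
Code:
scp 'user@server:"/home/user/gpodder-downloads/The BILL&TIMMY Show Podcast/"*.mp3' .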
- the command to copy the file Practice.txt to a new name of Myfile.txt while in the home directory - found
- the command to create a directory in the home directory - found
- say I just created a new directory called "test"; what's the command to delete the test directory? - found
- the command to create a blank text file without using an editor
- the exact syntax in Linux you would need to rename a file to a new name - found
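For reference, the standard answers, with placeholder file names for the rename:
Code:
cp Practice.txt Myfile.txt    # copy a file to a new name
mkdir newdir                  # create a directory in the home directory
rmdir test                    # delete the (empty) test directory
touch blank.txt               # create a blank text file without an editor
mv oldname.txt newname.txt    # rename a file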
I have directory a and directory b. They are big. b is almost identical to a; "almost" means that 4-5 files differ, and I don't know which they are. I want to copy b over a, but only the files that differ. I'm in bash. (No, I can't simply delete a and replace it with b, because 1) a is version-controlled and 2) a full copy (or a mv) would take too long. I want to copy only the files that differ.)
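rsync with --checksum compares contents and transfers only the files that actually differ; a sketch, with a dry run first to see which 4-5 files it would touch:
Code:
rsync -anc --itemize-changes b/ a/   # dry run: list differing files only
rsync -ac b/ a/                      # copy just those files from b over a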
I have a number of crash.log files scattered about my system and I would like to run a command to find all the crash.log files on the system and copy them to a single directory; each with a unique filename. For example, copy crash.log from ~/directory_1 , ~/directory_2 , ~/directory_3 and so on to ~/crash_logs/crash.log1 , ~/crash_logs/crash.log2 , ~/crash_logs/crash.log3 etc.
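A sketch with find and a counter for the unique suffixes; searching from / is an assumption (narrow it to $HOME if they all live there), and the counter survives between iterations because the whole while body runs in one subshell:
Code:
mkdir -p ~/crash_logs
i=1
find / -name crash.log 2>/dev/null | while IFS= read -r f; do
    cp "$f" ~/crash_logs/crash.log$i
    i=$((i+1))
done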
I was trying to develop a script which checks the count of files on an hourly basis, and if it finds any additions it has to sftp them and send an email on the status, with the filenames and the number of files copied via sftp. I will put it in cron to run every hour.
I'll use ls /abc | wc -l to count the number of files the first time; from then on, whenever a new file is inserted it'll copy that file to another location, or I'll take the dates of the files and whichever has a new date will be copied to another location.
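A skeleton of the hourly job along those lines; the directory, remote host, and mail address are placeholders, and the sftp transfer assumes key-based (non-interactive) login:
Code:
#!/bin/bash
# cron: 0 * * * * /path/to/this_script.sh
dir=/abc
state=/var/tmp/filecount
old=$(cat "$state" 2>/dev/null || echo 0)
new=$(ls "$dir" | wc -l)
if [ "$new" -gt "$old" ]; then
    # send files changed in the last hour, then report them
    find "$dir" -type f -mmin -60 -printf 'put %p\n' | sftp -b - user@remotehost
    find "$dir" -type f -mmin -60 \
        | mail -s "$((new - old)) new files copied via sftp" admin@example.com
fi
echo "$new" > "$state"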
I'm looking for a way to copy files with a certain file extension over to another folder. For example:
Source Folder: /home/user/downloads
File Type: *.epub
Destination Folder: /home/user/epubs/
The downloads folder has several folders that may go as deep as 2 or 3 levels. I tried this, but it didn't seem to work (and I'm not really sure how to modify it to get it to work):
Code:
find . -maxdepth 1 -type f -exec grep -q "pattern" '{}' ';' -exec cp '{}' /path/to/destination
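The grep test is not needed for an extension match, and -maxdepth 1 is what stops the descent; -name on a recursive find covers the 2-3 levels (a sketch using the folders above, GNU cp assumed for -t):
Code:
find /home/user/downloads -type f -name '*.epub' \
    -exec cp -t /home/user/epubs/ {} +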
I was trying to copy a compact flash card with a form of Linux on it (I found out it was a custom version based on Fedora Core 3). The flaky USB card reader seems to have hosed the flash card; it shows up as an unknown volume after ejecting the card and reinserting it. My troubleshooting so far: I have Ubuntu on a flash drive that I used to boot and read the flash card.
- I tried Disk Utility to reformat the card as Master Boot Record and the volume as ext3 with the flag set to Bootable, and copied the files using cp on the command line.
- I tried ISO Master & mkisofs to make an ISO that the USB thumb drive tools can use, but it wouldn't copy all the files. It looks like symbolic links were either ignored, or their source files couldn't be found even with -f.
- I learned that I might need a boot partition with a boot image, which I think I have in initrd-2.6.14.7img, but I don't know how to do that. Do I also need a swap partition?
My updated goal: using the files from the flash card, make a bootable compact flash card with Fedora Core 3.
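For what it's worth, the broad strokes of rebuilding such a card by hand might look like the sketch below; the device name and the bootloader step are assumptions (FC3 used GRUB legacy, and the exact invocation varies), and a swap partition is optional, since the system will boot without one:
Code:
# assuming the card shows up as /dev/sdb -- verify with lsblk or fdisk -l first!
fdisk /dev/sdb                  # one primary ext3 partition, marked bootable
mkfs.ext3 /dev/sdb1
mount /dev/sdb1 /mnt/cf
cp -a /path/to/saved/files/. /mnt/cf/   # -a preserves symlinks and permissions
grub-install --root-directory=/mnt/cf /dev/sdb   # bootloader step; may differ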