Programming :: Keep Original Mod Date Of A File And 'date' It Back To Same File?
Jul 12, 2010
I would really like to preserve a file's original modification date and apply it back to the file after a script has worked on it. I get a lot of JPEG files from different places on the Net which I either turn around and upload or burn to disk, and keeping the "original" date of either the download or the last edit in a graphics app would, in the long run, be a lot more helpful when deciding, for instance, which files to "recycle" or to back up more than once. I've tried doing this on my own every now and then. Where I run into problems is that "stat" and "date" appear to use different formats for date information, and I can't puzzle out how to "translate" one to the other satisfactorily for the latter command.
Just to give an example:
stat foo.jpg |grep Modify gives me
Modify: 2010-07-12 06:28:56.890625000 -0400
Passing that string as-is to date for foo.jpg, I get the error "date: unknown option -- 0" and the usual semi-courteous suggestion to "Try 'date --help' for more information." Somehow my Texinfo database got messed up along the way, and "info date" gives me only the short article on date input formats, not the full documentation for the command.
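A minimal sketch of one way to do this, assuming GNU stat and touch: save the timestamp before working on the file, then hand it back to touch, which accepts the string that stat -c %y prints (touch -r against a saved reference copy would also work).
Code:
#!/bin/bash
# Sketch: preserve foo.jpg's mtime across a processing step (GNU coreutils assumed).
f=foo.jpg
orig=$(stat -c %y "$f")      # e.g. "2010-07-12 06:28:56.890625000 -0400"
# ... run whatever edits the file here ...
touch -d "$orig" "$f"        # touch -d understands stat's default %y format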
I am using CRON to create a new, blank file every minute in a specific location on my web server. After web searching and reading man pages, I get the impression that the following command is supposed to work: touch /home/mydomain/var/folder/attachments/`date +%H%M`.txt This should give me a new file whose name is the current hour and minute. However, when executed, the CRON mailer reports:
Code:
touch /home/mydomain/var/folder/attachments/`date +
/bin/sh: -c: line 0: unexpected EOF while looking for matching
/bin/sh: -c: line 1: syntax error: unexpected end of file
So it looks like the shell is seeing the plus (+) sign as an EOF. Obviously, nothing gets created. What would be the easiest single-line command to create an empty file, at a given location, with a time-based file name?
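The likely culprit is not the plus sign but the percent signs: in a crontab, cron treats an unescaped % as the end of the command (everything after it becomes stdin), so the command really is cut off mid-backtick. A sketch of the crontab entry with the % signs escaped, reusing the path from the question:
Code:
# In a crontab, every % must be written as \% or cron truncates the command there.
* * * * * touch /home/mydomain/var/folder/attachments/$(date +\%H\%M).txt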
I have text files whose filenames contain the date of creation (e.g. 2010.05.02.log). I would like to create a script that:
- asks for a start date
- asks for an end date
- concatenates all files in the requested period, in date order.
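A minimal sketch, assuming the YYYY.MM.DD.log naming from the question and that the dates are typed in the same YYYY.MM.DD form (with a zero-padded format, plain string comparison already matches date order):
Code:
#!/bin/bash
# Concatenate YYYY.MM.DD.log files between two dates (inclusive), in date order.
read -p "Start date (YYYY.MM.DD): " start
read -p "End date (YYYY.MM.DD): " end
for f in *.log; do                               # glob expansion is already sorted
    [ -e "$f" ] || continue
    d=${f%.log}                                  # strip .log, leaving the date
    [[ $d < $start || $d > $end ]] && continue   # outside the requested period
    cat "$f"
done > "merged_${start}_to_${end}.txt"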
I am new to scripting. In a script, I am trying to find out whether a particular file has been modified in the last hour, and if it has, I need to copy it to another directory. Can anyone please show me how to check whether the file was modified within the last hour?
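A minimal sketch using find's -mmin test (GNU find assumed); the file and destination paths here are just placeholders:
Code:
#!/bin/bash
# Copy the file only if its mtime is less than 60 minutes old.
file=/path/to/watched.file      # placeholder
dest=/path/to/destination/      # placeholder
if [ -n "$(find "$file" -mmin -60 2>/dev/null)" ]; then
    cp "$file" "$dest"
fi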
I just switched from a basic digital camera to a more advanced one that stores both JPEG and raw (.NEF; it's a Nikon) files for me. When importing files in Digikam, I rename them so that they start with the date and time, for example 20110121-223748.JPG for a photo taken on Jan 21st 2011 at 22:37:48. I was a bit surprised that, when importing both the JPEG and the raw version of the same photo, the filenames differ by a few seconds (there is no constant offset; sometimes they are the same):
20110121-223748.JPG 20110121-223750.NEF
I did some "research" by looking at the exif data of both files (using "exiftool 20110121-223748.JPG" from the command line). Here is what I got back
(amongst other data):
20110121-223748.JPG
File Modification Date/Time : 2011:01:21 22:37:48+01:00
Modify Date                 : 2011:01:21 22:37:48
Date/Time Original          : 2011:01:21 22:37:48
[code]....
So it seems that Digikam is using the "File Modification Date/Time" (which differs between the JPEG and raw files from my camera) rather than the "Create Date" (which is the same for both). (The few seconds' difference in "File Modification Date/Time" between the two versions of the same photo is probably the time my camera needs to write the data to the SD card, I guess.) Is there a way to have Digikam use the Create Date (or the Date/Time Original) instead?
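Independently of Digikam's settings, one workaround is to push the EXIF capture time back into the filesystem timestamp that Digikam is reading, so both versions of a shot carry the same time. A hedged sketch with exiftool (tag names assume files like the ones above):
Code:
# Set each file's modification time from its EXIF DateTimeOriginal,
# so the JPG and NEF of the same shot end up with identical timestamps.
exiftool '-FileModifyDate<DateTimeOriginal' *.JPG *.NEF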
I have log files that are downloaded from my webserver every day in the format:
Code: samplesite.com.xxxxxxxxxx.gz
where xxxxxxxxxx is a 10-digit epoch time. I am trying to figure out a way, in a batch script, to:
1. find all existing files matching the pattern (after the first run there will only be one a day)
2. isolate the epoch string
3. convert the epoch string to a human-readable date/time
4. rename the original file as samplesite.com.mmddYYYY.gz
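A minimal sketch of those four steps, assuming GNU date and the samplesite.com.<epoch>.gz naming from the question:
Code:
#!/bin/bash
# Rename samplesite.com.<10-digit-epoch>.gz to samplesite.com.mmddYYYY.gz
for f in samplesite.com.??????????.gz; do
    [ -e "$f" ] || continue                    # no matching files
    epoch=${f#samplesite.com.}                 # strip the prefix...
    epoch=${epoch%.gz}                         # ...and the suffix, leaving the epoch
    stamp=$(date -d "@$epoch" +%m%d%Y)         # epoch -> mmddYYYY (GNU date)
    mv -v "$f" "samplesite.com.${stamp}.gz"
done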
I need a script but I don't know how to write it. I have one folder with several folders inside, and each of those folders contains one MKV or AVI file. What I need is a script that changes the "modification date" of each folder to the "modification date" of the MKV or AVI inside it.
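A sketch using touch -r, which copies a reference file's timestamps onto its target; it assumes exactly one MKV/AVI per folder, as described, and the top-level path is a placeholder:
Code:
#!/bin/bash
# Give each sub-folder the mtime of the video file it contains.
for dir in /path/to/collection/*/; do            # placeholder top-level path
    video=$(find "$dir" -maxdepth 1 -type f \( -iname '*.mkv' -o -iname '*.avi' \) | head -n 1)
    [ -n "$video" ] && touch -r "$video" "$dir"  # -r: copy the video's timestamps
done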
I'm new to UNIX scripting; I'm stuck with the following. I have an Oracle SQL script that takes three parameters:
1- File Name
2- File Path
3- File creation date
Under UNIX I have a folder where files will be placed frequently, and I need to upload those files to Oracle. What I need is a UNIX script that does the following:
- loop through the directory "/home/applmgr/snktmp"
- pick only files
- pass the file name to parameter &1
[code]....
Is the above possible? I already know how to call the Oracle script from UNIX; I'm only stuck on the UNIX part that lists each file's attributes (name, path, date), stores them in parameters, and loops until the last file in the directory. If the above is not possible, then how can I create the below from the command line?
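A rough sketch of the shell side only; the sqlplus credentials and script name (load_file.sql) are placeholders for whatever actually calls the Oracle script, and since most UNIX filesystems do not record a creation time, the file's modification date is used for the third parameter:
Code:
#!/bin/bash
# Loop over regular files in the drop folder and hand name, path and date
# to the Oracle script as positional parameters &1, &2, &3.
dir=/home/applmgr/snktmp
for path in "$dir"/*; do
    [ -f "$path" ] || continue                  # skip anything that isn't a plain file
    name=$(basename "$path")
    fdate=$(date -r "$path" +%Y-%m-%d)          # modification date (GNU date -r)
    sqlplus -s user/password@DB @load_file.sql "$name" "$path" "$fdate"   # placeholders
done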
I need to get the modification date of a file in Linux to use in a script. I tried using 'ls -l' on the file, but this broke when the day of the month went from a single digit to a double digit, because I was splitting the output on spaces. How can I get the date a file was last modified in a form I can use in a script? For example, if a file was modified on 1/11/2010, I need the 11.
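Rather than parsing ls, GNU date and stat can print the modification time in whatever format you choose; a quick sketch (the filename is a placeholder):
Code:
# Day of month a file was last modified (GNU tools).
day=$(date -r myfile.txt +%d)                    # e.g. "11" for 1/11/2010
# or, equivalently, via stat:
day=$(date -d "$(stat -c %y myfile.txt)" +%d)
echo "$day"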
I would like to remove almost all of the privacy-evading data that a digital camera generates, such as EXIF data: camera brand, camera model, date taken, exposure time, flash fired, focal length, location (if you are using an iPhone with location tagging enabled), metering mode, etc.
Could you please write a script which does that job for multiple files? Exiv2 seems to shave off more weight than jhead, so I'll use the exiv2 command. Generally, that's what I want the script to do: retrieve the (modified) date of a file.
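A hedged sketch along those lines, assuming exiv2's rm action (which deletes image metadata) and GNU stat/touch: save each file's modification time, strip the metadata, then put the saved time back.
Code:
#!/bin/bash
# Strip EXIF metadata from all JPEGs in the current directory
# while preserving each file's original modification date.
for f in *.jpg *.JPG; do
    [ -e "$f" ] || continue
    mtime=$(stat -c %y "$f")     # retrieve the modified date first
    exiv2 rm "$f"                # delete the image metadata
    touch -d "$mtime" "$f"       # put the original date back
done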
I'm trying to put the date in the name of my file but can't seem to get it right; I just end up with the word "date" in the filename instead of the actual date. My script looks like this:
#!/bin/bash
x="/var/log/system/"
y=$(date)                    # command substitution; y=date only stores the literal word "date"
z=${y:11:8}                  # characters 12-19 of that string, i.e. the HH:MM:SS part
top -b -n 1 > "$x$z.txt"     # batch mode so top writes plain text to the time-stamped file
I'm looking for a method to set the last modification date of some JPG photo files to each file's creation timestamp (the time the photo was taken). The reason is that Shotwell sorts imported pictures into folders according to the last modification date, which is stupid in my opinion.
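If the JPEGs carry EXIF data, jhead can do exactly this in one pass; a sketch (jhead handles JPEG only, and -ft sets the file's modification time from the EXIF capture time):
Code:
# Set each JPEG's modification time to its EXIF (camera) timestamp.
jhead -ft *.jpg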
As a photographer I'm constantly taking photos and storing them in folders. Occasionally I use two cameras (either for different settings, or because an assistant is also shooting), which means that for one event I can have differently named images. Both cameras have the same time set (which always helps in Windows), but in Ubuntu I can't sort my folder by date taken. The options I'm given are to sort manually, by Name, by Size, by Type, by Modification Date and by Emblem, and none of those help me once I've done a few edits to the images. So, if anyone knows: how do you organise a folder of images taken on different cameras by Date Taken rather than by date edited?
How do I find the time and date of a file downloaded from the Net? Is it possible at all? I want to know when a downloaded file, such as a text file, was originally created, i.e. written by its author, if that is not mentioned anywhere in the document itself. The command I use locally to see a file's timestamp is given below.
Code:
ls -l filename.txt
-rw-r--r-- 1 root root 691 Dec 3 11:12 filename.txt
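The local listing only shows when your copy's modification time was set; whether the author's or server's original time survives depends on the download tool. As far as I know, wget keeps the server's Last-Modified time by default, and curl can be asked to with -R; stat then shows the full timestamps. A quick sketch with a placeholder URL:
Code:
# wget preserves the server's Last-Modified timestamp by default;
# curl needs -R (--remote-time) to do the same.
wget http://example.com/filename.txt
curl -R -O http://example.com/filename.txt
stat filename.txt        # shows Access / Modify / Change times in full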
I tried "slackpkg update", and it came back with Code: ERROR: Verification of the gpg signature on CHECKSUMS.md5 failed! This could mean that the file is out of date or has been tampered with. I googled this message and it pointed me to this thread: [URL]
However, when I type date I get this: Code: bash-3.1$ date Mon Sep 13 19:55:31 BST 2010 Which is correct for me. Not sure what to do now.
I have a script (below) which works OK, but I want to modify it so that the older files are kept in case a restore is needed. I have tried adding a date suffix to the newly created files (second lump of code), but it doesn't seem to work. I get the error:
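Without seeing the exact error it is hard to be certain, but the usual pitfall with date suffixes is the quoting around the command substitution; a minimal sketch of the pattern that normally works (the filenames are placeholders, not the script's real names):
Code:
# Append the current date to a new backup's name; $(...) avoids the
# nesting/quoting problems that backticks often cause inside scripts.
suffix=$(date +%Y%m%d)
cp dump.gz "dump_${suffix}.gz"      # e.g. dump_20100913.gz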
I have some basic experience creating simple scripts, making directories, changing permissions, etc., but I'm stumped on this one.
I have two linux boxes. I have a script set up on box 'A' to SCP into box 'B', grab a copy of a database backup and store it on box 'A'. It looks like this:
I have generated a public key on box 'A' and placed it into the authorized_keys file on box 'B', so a password is not required and the file copies over successfully when the script is run. On to my problem...
I need to know what date the 'dump.23.gz' file was originally created when I'm viewing it after it's been copied to box 'A'. If I ls -l on box 'A' it only shows me the date it was created on box 'A' when it was copied.
What would I need to add to my script to append the backup's original creation date on box 'B' to the filename, so that after it is copied to box 'A' I know when the backup was created on box 'B'? I'm sure this is probably confusing. I've done lots of searching and can only find information on how to append the current date and time to a file name; I need to append the file's original timestamp when it copies over.
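Two hedged options, with the host and paths as placeholders for whatever the existing script already uses: let scp -p carry box 'B''s modification time across and read it locally, or ask box 'B' for the timestamp over ssh and build the destination filename from it.
Code:
#!/bin/bash
# Option 1: preserve the remote mtime with -p, then rename using it.
scp -p boxB:/path/to/dump.23.gz /backups/                  # placeholder paths
stamp=$(date -r /backups/dump.23.gz +%Y%m%d-%H%M%S)        # GNU date -r reads the file's mtime
mv /backups/dump.23.gz "/backups/dump.23.${stamp}.gz"

# Option 2: query box 'B' first and copy straight to the final name.
stamp=$(ssh boxB 'date -r /path/to/dump.23.gz +%Y%m%d-%H%M%S')
scp boxB:/path/to/dump.23.gz "/backups/dump.23.${stamp}.gz"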
I wrote this little script and I need some help. I am trying to achieve the following: every day I receive a new file in the /home/denis/MyData/ folder, and I don't know what the file name will be, but I want to move any file that arrives there to the new location /media/DataBackup/Linux/backup/ (/media/DataBackup/ is an external 500GB USB drive). The script should automatically create a new folder with a date and time stamp every day and then move the contents of /home/denis/MyData/ into that new folder, so that every day there is a new folder containing only that day's files. My script is as follows:
cd /media/DataBackup/Linux/backup/
mkdir MyData_$(date +%Y%b%d_%HH%MM)   # this creates the folder MyData_<current date and time>
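The script stops after creating the folder; a sketch of one way to finish it, capturing the folder name in a variable and then moving everything from the source folder into it (paths taken from the question):
Code:
#!/bin/bash
# Create a time-stamped folder on the backup drive and move today's files into it.
src=/home/denis/MyData
dest=/media/DataBackup/Linux/backup/MyData_$(date +%Y%b%d_%HH%MM)   # e.g. MyData_2010Sep13_19H55M
mkdir -p "$dest"
mv "$src"/* "$dest"/        # moves whatever arrived today, whatever its name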
When I replace a drive in a RAID 1 array and then resync it, why don't the access dates of the files on the drive I am syncing from change? Shouldn't a file's access date always change when the file is copied? Are there ways to get around this?
The last field of the /etc/passwd file is the login shell; what would happen if I replaced it with /usr/bin/date? By the way, I tried to use "$ su" but don't know the password. What is the default root password for Ubuntu?
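As a hedged illustration of the effect: with /usr/bin/date as the login shell, logging in just prints the current date and the session ends immediately, since the "shell" exits as soon as it has run. (On Ubuntu the root account has no password by default; administration goes through sudo.) A safer way to try it on a throwaway account than editing /etc/passwd by hand:
Code:
# Try it on a disposable account instead of editing /etc/passwd directly.
sudo useradd -m datetest
sudo chsh -s /usr/bin/date datetest   # may warn that the shell isn't in /etc/shells
sudo su - datetest                    # prints the date, then drops straight back out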