Ubuntu :: Backup NTFS Files Modified After A Certain Date?
May 14, 2010
I clone my entire notebook HDD to an identical disk in a USB enclosure once a month using dd. I would like to find a way to do incremental backups, automatically or manually, at shorter intervals.
The first problem is that my incremental backup drive is not the same as my full backup drive (the clone). Is there some way to back up or copy all files on a document partition that were modified after a certain date?
The second problem is that my document partition is NTFS-3G. I guess this could be done pretty easily using "dump" if I stored my docs on ext. [I don't, because I want to make sure my docs are accessible from any machine (say, in an Internet cafe) should my MacBook die and I need to rip out the hard drive and do my homework on another system; that is why I keep my docs on my Vista partition.]
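A minimal sketch of one way to do this, assuming GNU find (for -newermt) and placeholder mount points; it copies everything under the document partition modified since a given date to the incremental drive while keeping the directory layout:
Code:
# /media/docs and /media/incremental are example mount points
cd /media/docs &&
find . -type f -newermt "2010-05-01" -print0 |
    cpio -0 -pdm /media/incremental    # -p pass-through, -d make dirs, -m keep mtimes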
Prior to making a fresh install of 10.04, I made a backup of all my documents by copying them to an NTFS partition. I did this by selecting the files in File Browser, then right-clicking and selecting the Copy command.
When I came to move the files back after the fresh install, I was mortified to find that all the file modification dates had changed to the date I copied them! I've lost all the original file dates, which were the principal way I sorted my files. I guess there's no way of getting them back? It seems that Linux does not store file creation dates either, so I'm stuffed.
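For what it's worth, copies made from the command line can carry the original modification times across, even onto NTFS (ownership/permission warnings aside); the paths below are placeholders:
Code:
cp -a /home/user/Documents /media/ntfs-backup/                    # -a preserves timestamps
rsync -rt /home/user/Documents/ /media/ntfs-backup/Documents/     # -t keeps modification times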
I need to know all files modified within a date and time range. E.g.: all files modified on 20 April 2010 between 1100 and 1200 hrs. I found "find / -mtime +10 ! -mtime +11" for the date, but how do I include the time as well?
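With GNU find (4.3.3 or later) the time can go straight into -newermt; on older versions you can touch two reference files and use -newer instead:
Code:
# all files modified between 11:00 and 12:00 on 20 April 2010
find / -type f -newermt "2010-04-20 11:00" ! -newermt "2010-04-20 12:00"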
Sequentially number files based on date modified (rename cli)
I'm almost done with a larger script which takes all the pictures in a folder, converts them to a video, and emails it to me. Everything worked fine until I realized the picture filenames weren't always starting at 1, and then ffmpeg chokes.
I have a bunch of files in a folder which I need to rename to:
I don't want to install any additional packages and I'd like this to run in a single command if possible.
If not possible, then a bash script would work too.
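The post doesn't show the exact target names, so the sketch below just assumes zero-padded JPEGs (0001.jpg, 0002.jpg, ...) numbered oldest-first; it uses only coreutils, but will misbehave on filenames containing newlines or on files that already match the target names:
Code:
n=1
ls -tr *.jpg | while read -r f; do            # -tr: sort by mtime, oldest first
    mv -- "$f" "$(printf '%04d.jpg' "$n")"
    n=$((n + 1))
done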
Well, I know there are issues when using rsync to copy files to an NTFS partition, like file permissions and so on. The thing is, I need to back up my music files periodically onto an NTFS partition from ext4. I really don't care about file permissions or any other stuff. When I use rsync, it should update the mp3 files on my NTFS (external) disc with the new ones. Can I go ahead with this operation? I have a lot of more important files on the external disc and I don't want rsync to corrupt or delete them, because they are highly important files.
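A conservative way to run it: skip the permission/ownership options entirely (NTFS can't store them anyway), keep only modification times, and leave --delete out so nothing else on the external disc can be removed. The paths are examples, and --dry-run shows what would happen first:
Code:
rsync -rtv --dry-run /home/user/Music/ /media/external/Music/
# re-run without --dry-run once the listed changes look right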
So I was wondering how I could do a simple find which would order the results by most recently modified. Here is the current find I am using (I am doing a shell escape in PHP, so that is the reason for the variables): find '$dir' -name '$str'* -print | head -10
How could I have this order the search by most recently modified? (Note I do not want it to sort 'after' the search, but rather find the results based on what was most recently modified.)
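find itself walks the tree in directory order, so in practice the timestamps have to be printed and sorted; with GNU find that can stay in one pipeline (variables as in the post):
Code:
# %T@ prints the epoch mtime, sort -rn puts newest first, cut strips the timestamp
find "$dir" -name "${str}*" -printf '%T@ %p\n' | sort -rn | head -10 | cut -d' ' -f2-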
I am new to scripting. I am trying to find out in a script whether a particular file was modified in the last hour, and if it was, I need to copy that file to another directory. Can anyone please show me how to check whether the file was modified in the last hour or not?
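A minimal check, assuming GNU find; the directory and file name below are only placeholders:
Code:
# -mmin -60 matches files modified within the last 60 minutes
if [ -n "$(find /path/to/watched -maxdepth 1 -name report.txt -mmin -60)" ]; then
    cp /path/to/watched/report.txt /path/to/backup/
fi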
I have two Linux servers; they back each other up.
1. Server 1 has 3 files named file1, file2, file3 in the path /etc/sysconfig/network-script/.
2. Server 2 has 3 files with the same names and path as server 1.
- How do I make a script that copies the 3 files from server1 and overwrites them on server2? Before overwriting, the script should check and compare the last modified dates of the 3 files on server1 and server2: if file1, file2 or file3 on server1 is newer than its counterpart on server2, the overwrite goes ahead; if not, nothing happens.
- See my script below: it works fine now, but it just overwrites and does not check the last modified date.
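For the "only overwrite if newer" part, rsync's --update (-u) flag already does the date comparison, skipping any file that is newer on the destination. A sketch, using the path from the post; the SSH transport and root login are assumptions:
Code:
#!/bin/bash
DIR=/etc/sysconfig/network-script
rsync -avu "$DIR/file1" "$DIR/file2" "$DIR/file3" root@server2:"$DIR/"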
I am trying to restore an NTFS partition from a backup and I need the new drive to have the old (dead) drive's UUID (which I recorded). I really, really, really cannot use the option of changing fstab to mount using a new UUID; for this case I need the old UUID that existed on the other drive. Is there some NTFS equivalent of tune2fs that'll let me change the UUID on an NTFS partition?
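What blkid reports as an NTFS "UUID" is the volume serial number, and newer releases of ntfs-3g's ntfslabel can rewrite it; treat this as a sketch and check your version, since the option may not exist in older builds. The device name and serial below are placeholders:
Code:
# write the recorded 16-hex-digit serial onto the restored partition
sudo ntfslabel --new-serial=0123456789ABCDEF /dev/sdb1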
I am actually modifying someone else's script and I need some help. The original script rotated image files to the left but it changed the "modified date stamp" which is something I didn't want.
Code:
#!/bin/bash
while [[ -n "$1" ]]; do
    # if a file and not a dir
    if [[ -f "$1" ]]; then
        # the images that I copy from my cell phone don't have exif headers,
        # so I am using the -mkexif switch first to match the exif information
        # to the "created date" in the .jpg file
        jhead -mkexif "$1"
It's important to note that the original script, before I made any edits, did not have this quirk, whereas I needed to "touch" the file to get it to orient itself correctly; the additions above are mine.
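One hedged way to keep the original "modified date" across the rotation is to stash the mtime on a temporary reference file and copy it back afterwards; the jhead -autorot call is just a stand-in for whatever rotation command the script actually uses:
Code:
ref=$(mktemp)
touch -r "$1" "$ref"     # remember the file's current mtime
jhead -autorot "$1"      # rotate (this step normally rewrites the mtime)
touch -r "$ref" "$1"     # put the original mtime back
rm -f "$ref"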
I need to get the modified date of a file in Linux to use in a script. I tried using 'ls -l' on the file, but this caused problems when the date turned from a single digit into a double digit, because I was parsing the result string on spaces. How can I get the date a file was last modified so I can use it in a script? For example, if a file was modified on 1/11/2010, I need the 11.
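Two ways to read the mtime without parsing ls -l at all (GNU coreutils assumed; "myfile" is a placeholder):
Code:
date -r myfile +%d      # just the day of month, e.g. 11 for a file modified 1/11/2010
stat -c %y myfile       # full timestamp, e.g. 2010-01-11 14:02:36.000000000 +0000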
I'm looking for a method for changing the last modification date of some jpg photo files to the corresponding creation timestamp stored in each file. The reason is that Shotwell imports pictures into folders according to the last modification date, which is stupid in my opinion.
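Assuming the creation timestamp you mean is the one in the Exif header and jhead is installed, its -ft switch sets each file's modification time from that header in one pass:
Code:
jhead -ft *.jpg    # -ft: set the file's mtime to the Exif date/time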
Possible Duplicate: Linux equivalent to robocopy? I have two websites - one is basically a development version and the other is a production version of the same site. So I'd like to be able to merge the changes made to the development site based on the modified date of the files. Is this possible with the 'cp' command?
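cp's --update (-u) flag only copies a file when the source is newer than the destination copy (or the destination copy is missing); the paths below are placeholders, and rsync -au does the same job if it's available:
Code:
# the trailing /. copies the contents of the dev tree into the production tree
cp -ruv /var/www/dev-site/. /var/www/production-site/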
I'm about to do a migration on a laptop where I have had to make a number of modifications to files mainly in /etc/ but I have lost track of what I have done. Does anyone have any suggestions as to how to identify those files that have been modified from their packaged versions?
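On Ubuntu/Debian one option is the debsums package (an extra install), which compares installed files against the packaged checksums; treat the flags as a sketch and check the man page:
Code:
sudo apt-get install debsums
sudo debsums -ce    # -c: list only changed files, -e: restrict to conffiles
# note: this only covers files owned by a package, not ones you added by hand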
I want to upload files from my computer to an FTP site and I don't want to upload files that are already on the server. So I need a tool that finds out which local files are different from the ones on the server, or don't exist on the server.
I'm using a cheap provider that does not support rsync or ssh, so I can only use FTP. I generate the files before uploading them, so comparing timestamps is meaningless. I've tried lftp with the mirror command. It's slow (I think it uploads all the files). I upload the files from different computers, so I can't use sitecopy, which uses a local database to keep track of which files are on the server. I'd like to be able to upload all changed files with one command. Preferably no GUI application. And it needs to run in Ubuntu.
I was thinking about creating a tool similar to sitecopy, but which stores checksums of all the files on the FTP server on the server itself. But then I thought that there may already be such a tool.
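A bare-bones sketch of that checksum idea, assuming lftp for the transfer; the host, login, paths and the .manifest name are all made up, it breaks on filenames with spaces, and the actual upload and remote directory handling are left out:
Code:
# fetch the manifest of md5sums previously stored on the server (empty on first run)
lftp -u user,pass ftp.example.com -e 'get /public_html/.manifest -o remote.manifest; quit' \
    || : > remote.manifest
# checksum the freshly generated local files
find . -type f ! -name '*.manifest' -exec md5sum {} + | sort -k 2 > local.manifest
# anything new or changed locally is what needs uploading
comm -13 <(sort remote.manifest) <(sort local.manifest) | awk '{print $2}' > upload.list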
I am searching for a program which may be used in order to display a list of modified (non-distribution-default) configuration files. For example, assume we have installed package "example-utility" which uses /etc/example-utility.conf as one of its configuration files. The package provides a default configuration file upon its installation. Assume we have modified /etc/example-utility.conf according to our needs. This file should be included in the listing produced by the program I am looking for.
If such a tool does not exist, I would like to create it. However, I am new to RPM-based systems, and, as such, I am having difficulties finding the necessary documentation. Should I be reading the yum source code? Is there some sort of document describing the package database on RH/CentOS/etc. systems and how 3rd party applications are supposed to work with this database?
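On an RPM-based system the verify mode of rpm already reports files that differ from what the package database recorded, and config files are flagged with a "c" in its output; the exact column layout varies a little between rpm versions, so treat this as a sketch:
Code:
rpm -Va | grep ' c /'                              # every config file that deviates somehow
rpm -Va | awk '/ c \// && $1 ~ /5/ {print $NF}'    # only those whose contents (md5) changed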
I have a simple scripting question. I am trying to list all files that have been modified in the last day and then collect metadata on those files. This command is going to be run on a number of nodes via ssh, so I would like to prepend the hostname to the start of each line (the example below has blade1 as the hostname). As you can see, the loop is splitting the ls output onto a separate line for each value. What I need to do is keep the `ls -ld` output all on one line and have the hostname echoed in front of each line.
Code:
for i in `find /var -mtime -1 | xargs ls -ld`; do echo `hostname` $i; done
blade1 drwxr-xr-x.
blade1 2
blade1 user
blade1 group
blade1 4096
blade1 Nov
blade1 30
blade1 08:55
blade1 /var/cache/gdm/user
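The splitting happens because the for loop iterates over every whitespace-separated word of the ls output; reading whole lines instead keeps each ls -ld record intact (a sed one-liner would also work):
Code:
HOST=$(hostname)
find /var -mtime -1 -exec ls -ld {} + | while read -r line; do
    echo "$HOST $line"
done
# or: find /var -mtime -1 -exec ls -ld {} + | sed "s/^/$(hostname) /"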