General :: Untar And Only Extract Those Files That Are Above A Certain Date
Jun 16, 2010
Is there any way to untar an archive and extract only those files that are newer than a certain date, including the directory structure?
I restored a backup on a play server, but it was a few days old. However, I have a tar archive of the entire structure that is more up to date and healthy, so now I want to extract all files (including directory structure) based on a date filter on the files, if possible.
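One possible approach, sketched with GNU tar and awk (the archive name and cutoff date are placeholders): list the members with their mtimes, keep only paths dated on or after the cutoff, and extract just that list. The directory structure is recreated for whatever gets extracted; paths containing spaces would need more careful parsing.

# GNU tar's listing prints: perms owner size date time path, and ISO dates
# sort lexically, so a plain string comparison on field 4 works.
tar -tvf backup.tar | awk '$4 >= "2010-06-10" {print $6}' > newer.txt
tar -xvf backup.tar -T newer.txt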
I want to untar a bunch of tar files located in different folders, with the folder depth unknown. I found an old post about this matter, but its suggestion extracts all files into the same folder (your current one). I want to extract each archive into the same folder as the tar file itself. The solution from the old post (extracts all files to the current folder): find . -name "*.tar" -exec tar xvf {} \;
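One way to get the wanted behavior, assuming GNU findutils: -execdir runs the command from the directory containing each match, so every archive unpacks next to itself.

# Unpack each tarball inside its own directory, whatever the depth.
find . -name "*.tar" -execdir tar -xvf {} \;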
Will unzipping always merge new files into directories that already exist? What about tar?
I unzipped an archive; the archive had 3 folders (app, skin, js) and dozens of subfolders within those.
The folder I unzipped it to already had those 3 folders as well as many child folders, etc.
My website is still working, so I'm just wondering: is this safe, or is there anything I should worry about? The other option was to upload each file manually in those 3 folders and the dozens of subfolders.
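For what it's worth, a sketch of the default behaviors (assuming GNU tar and Info-ZIP's unzip; the archive names are placeholders): both tools merge into existing directory trees rather than replacing them, so only files whose paths match get touched.

# GNU tar silently overwrites files that already exist; -k keeps old copies.
tar -xkvf release.tar
# unzip prompts on conflicts; -n never overwrites, -o always overwrites.
unzip -n release.zip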
I cannot for the life of me determine how to get the monitor serial number / manufacture date using Extended Display Identification Data (EDID). Does anyone know? None of the tools I've tried so far provide this info.
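One avenue that may help, sketched under the assumption of a reasonably recent kernel and the edid-decode utility: the raw EDID blob is exposed in sysfs per connector, and edid-decode prints the serial number and the week/year-of-manufacture fields from it.

# Decode every output's EDID (files for disconnected connectors are empty).
for e in /sys/class/drm/card*/edid; do
    echo "== $e"
    edid-decode "$e"
done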
I'm looking for a method to set the last-modification date of some JPG photo files to each file's creation timestamp. The reason is that Shotwell imports pictures into folders according to the last-modification date, which is stupid in my opinion.
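If the creation date in question is the one stored in the photos' EXIF data, exiftool can copy it into the filesystem mtime directly (a sketch; assumes exiftool is installed and the DateTimeOriginal tag is populated):

# Set each file's modification time from its EXIF DateTimeOriginal tag.
exiftool "-FileModifyDate<DateTimeOriginal" *.jpg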
I haven't been to this site in quite a while, since it changed from LNO in fact. Good to see this place is still around, albeit under a newer name.
I'll get on with the problem. I've got a Netgear SC101T that I was using to store my files on. Some of you may know it uses the DataPlow SAN file system. It worked fine until I installed a firmware update which, for some reason, broke the mirror array. I've hated this POS ever since and want to pull the data from the drives and toss the box. The problem is, Linux doesn't have support for this particular file system scheme.
What I'm wondering is, how does 'dd' work with regard to keeping the file system? Does it simply copy files and disregard the structure, or does it make an exact copy, DataPlow FS and all? Has anyone else run into this conundrum?
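To the dd question: dd reads the block device itself, not the files on it, so the copy is an exact bit-for-bit image, DataPlow structures included. A minimal sketch, assuming the disk shows up as /dev/sdb (check with dmesg or fdisk -l first):

# Image the whole disk; noerror keeps going past bad sectors, and sync
# pads failed reads so offsets in the image stay aligned.
dd if=/dev/sdb of=/data/sc101t-disk1.img bs=4M conv=noerror,sync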
I have tar files where I archive about 250 files, each about 80 MB, without compression. In a few cases tar is only returning some of the files. For example, when extracting with tar -xvf 356.tar I got only 103 files when it should return 255, yet tar does not give me an error. Furthermore, the tar archive is 15.8 GB while the extracted folder is just 6.4 GB. The tar files were created using tar -cvf 356.tar 356, where 356 is the name of the folder. All the steps were done on the same machines, under Ubuntu 6 and newer. Any ideas if there is a way to recover the files that are not being extracted?
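One thing worth trying before anything else: tar treats a run of zero blocks as end-of-archive, and stray zero padding mid-stream produces exactly this symptom (no error, archive much larger than what comes out). GNU tar's -i / --ignore-zeros reads past such blocks:

# Re-extract, skipping over zeroed blocks instead of stopping at them.
tar -ixvf 356.tar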
I want to untar a package from one directory into another directory, directly from the command prompt. I want to untar the Joomla package into the htdocs directory of XAMPP. How do I do that directly from the command prompt? The reason I am asking is that the "drag and drop" way won't work, as XAMPP is stored in the /opt directory and without super-user authentication nothing can be saved into it. You could ask why I untarred XAMPP there in the first place, but that had to be done so that Apache doesn't give me any start-up problems.
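A sketch of the usual approach (the archive name and the XAMPP path are assumptions; XAMPP for Linux typically lives in /opt/lampp): tar's -C switches into the target directory before extracting, and sudo supplies the privilege that /opt requires.

# Extract the Joomla package straight into XAMPP's htdocs.
sudo tar -xzvf Joomla_1.5.tar.gz -C /opt/lampp/htdocs/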
1. Is there a way to prohibit a program from writing data to the HDD?
2. Can I have different icons on each GNOME workspace, and how?
3. How do I untar to the current directory? "tar -xvvf blablah.tar.gz" does not work; "tar -xvvf blabla.tar.gz -C ./" does not work either. In both cases, the files go into some strange random locations (see the sketch after this list).
4. How can I change the way colors are displayed for different content in the Xfce terminal? I used to have red for archives, blue for dirs, etc. in KDE, but lost all that after changing to GNOME.
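On question 3: files landing in "strange random locations" usually means the archive itself carries leading path components. A sketch (GNU tar; stripping exactly one leading component is an assumption, adjust after inspecting the listing):

# See where the archive wants to put things before extracting.
tar -tzvf blabla.tar.gz | head
# Extract here, dropping the first path component from each member.
tar -xzvf blabla.tar.gz -C . --strip-components=1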
I had a program run riot and it has created hundreds of spurious files in one directory. Fortunately they are all dated 4th November so are easily identified. What bash command can I use from the console to delete them all?
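A sketch with GNU find (assuming the files' mtime carries the 4 November date and the year is the current one; -newermt needs findutils 4.3.3 or later). Run it with -print first, and only swap in -delete once the listed files look right:

# Files last modified on 4 Nov 2010, in this directory only.
find . -maxdepth 1 -type f -newermt 2010-11-04 ! -newermt 2010-11-05 -print
# When satisfied, replace -print with -delete.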
I am running Ubuntu 10.10 and want to copy all files revised after a certain date (01.02.2011) to a certain location (a USB memory stick) for backup purposes. How do I use the "cp" command, or do I have to use another command? Or maybe this is not possible in Linux?
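It is possible, but cp by itself has no date filter; find can feed it. A sketch with GNU tools (the source directory and the stick's mount point are assumptions, and 01.02.2011 is read as day.month.year):

# Copy files modified on/after 1 Feb 2011, recreating their relative
# paths under the destination (--parents is a GNU cp option).
cd /home/me/documents
find . -type f -newermt 2011-02-01 -exec cp --parents {} /media/usbstick/ \;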
This looks good; the files expected to be seen are output:
find /usr \( -newer /tmp/empty_file -a ! -newer /tmp/empty_file1 \) -print
But this shows me files that should not be output, and likewise when I replace ls with tar it is tarring a whole bunch of stuff I do not want:
find /usr \( -newer /tmp/empty_file -a ! -newer /tmp/empty_file1 \) -exec ls -l {} \;
In the end I would like to replace the "ls" with "tar cvvfp some.tar {} \;", but I can't figure out what is going wrong here.
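Two things look likely here: the parentheses and the -exec terminator need escaping from the shell, and without -type f the match set includes directories, which makes ls (and tar) pull in their entire contents. A sketch of the tar variant under those assumptions:

# Collect only regular files modified between the two marker files,
# then archive exactly that list.
find /usr -type f \( -newer /tmp/empty_file -a ! -newer /tmp/empty_file1 \) -print > /tmp/list
tar -cvpf some.tar -T /tmp/list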
I was going to do rsync -r -a -z -v -p -e ssh to move some files from one server to another, but then realized all I really need are the files dated June 1, 2008 to current. Is there a way to have rsync only sync those files? The directory structure that's my source goes all the way back to 2004.
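rsync has no mtime filter of its own, but it will take an explicit file list. A sketch, assuming GNU find with -newermt and paths kept relative to the source root (host and directories are placeholders):

# Build the "June 2008 onwards" list, then hand it to rsync.
cd /data/source
find . -type f -newermt 2008-06-01 > /tmp/since-june.txt
rsync -avz -e ssh --files-from=/tmp/since-june.txt . user@otherserver:/data/dest/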
I know find can do what I am looking for, but I am wondering if there is an alternative way to find files on the filesystem either created before/after a certain point, or at a certain time.
Typically I rely on updatedb & locate for most of my file-searching needs. The issue with those tools, though, is that the database only has directory and file names, and it only covers local directories, not anything mounted via CIFS|NFS or via -o loop (e.g., .iso images).
So if I need to find files created after yesterday across the entire system (local and remote filesystems), I am currently needing to use find.
What other tools, if any, would accomplish this in a similar fashion?
I have tried ls and grep, but that requires (in my attempts so far) multiple searches:
ls -lR | grep Aug | grep 10
ls -lR | grep Aug | grep 11
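The ls|grep approach needs one pass per day; for reference, the single find invocation the task boils down to (GNU find; -daystart makes -mtime count from midnight rather than from 24 hours ago):

# Everything modified since the start of today, system-wide.
find / -daystart -mtime -1 -type f 2>/dev/null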
I have some basic experience creating simple scripts/making directories/changing permissions/etc., but I'm stumped on this one.
I have two Linux boxes. I have a script set up on box 'A' to SCP into box 'B', grab a copy of a database backup, and store it on box 'A'. It looks like this:
I have generated a public key on box 'A' and placed it into the authorized_keys file on box 'B', so a password is not required and the file copies over successfully when the script is run. On to my problem...
I need to know what date the 'dump.23.gz' file was originally created when I'm viewing it after it's been copied to box 'A'. If I ls -l on box 'A' it only shows me the date it was created on box 'A' when it was copied.
What would I need to add to my script to append the backup's original creation date on box 'B' to the filename, so that when it gets copied to box 'A' I know when the backup was created on box 'B'? I'm sure this is probably confusing. I've done lots of searching and can only find information on how to append the current date and time to a file name. I need to append its original creation timestamp to the filename when it copies over.
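A sketch of one way to do it (GNU stat on box 'B' is assumed, and the paths and hostname are placeholders): ask box 'B' for the file's modification date over ssh (strictly its last-modification time, which is what Unix tracks), then use it in the destination name. Alternatively, scp -p preserves the original modification time on the copy, which ls -l would then show.

# Fetch the backup's date on box B, then name the local copy with it.
STAMP=$(ssh user@boxB stat -c %y /backups/dump.23.gz | cut -d' ' -f1)
scp user@boxB:/backups/dump.23.gz /backups/dump.23.${STAMP}.gz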
I am trying to find a command that will copy all the files in a folder with the extension ".log" that were created one day before the current date. Going through other threads in this forum, I found half a solution to this problem:
find /mnt/hd -mtime -1 -exec scp {} /mnt/usb \;
This command copies all the files created one day before (not only *.log) to the /mnt/usb folder. What is the modification required to the above command?
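Adding a -name test should be the only change needed (a sketch; cp suffices if /mnt/usb is a locally mounted device, so scp isn't required):

# Only .log files modified within the last day.
find /mnt/hd -name "*.log" -mtime -1 -exec cp {} /mnt/usb/ \;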
I currently have a command to back up a directory: it will zip the directory and place it where I have told it to. Now what I am after is a command I can run before my code that will delete any .tar.gz files dated before today. In my ideal world it would be something like delete <'date +%m_%d_%y'.tar, so it would delete all the files in this folder dated before today.
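A sketch with GNU find (the backup directory is a placeholder): with -daystart, "-mtime +0" means "modified before today", which matches the intent. Keep -print until the output looks right, then switch it to -delete:

# List yesterday-and-older archives in the backup folder.
find /backups -maxdepth 1 -name "*.tar.gz" -daystart -mtime +0 -print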
I need to know all files modified within a date and time range, e.g. all files modified on 20 April 2010 between 1100 and 1200 hrs. I found "find / -mtime +10 ! -mtime +11" for the date part, but how do I include the time as well?
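GNU find's -newermt accepts a full timestamp, so the window can be expressed directly (a sketch; assumes findutils 4.3.3 or later):

# Files modified on 20 Apr 2010 between 11:00 and 12:00.
find / -type f -newermt "2010-04-20 11:00" ! -newermt "2010-04-20 12:00"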
I'm trying to write a script that searches my files and lists them by date. Can someone point me in the right direction? I've been looking through the books that I have, but I'm just not finding the right commands to search by date.
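A possible starting point with GNU find (a sketch): print a sortable timestamp in front of each path, then sort numerically. %T@ is seconds since the epoch; the ISO date is just for human-readable output.

# Recursive listing, oldest first.
find . -type f -printf '%T@ %TY-%Tm-%Td %p\n' | sort -n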
I want to copy all folders and files created from 01.01.2011 until today to a new place, i.e.:
cp -r /home/moviecar/public_html/wp-content/uploads/ /home/teaser/public_html/wp-content/uploads
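cp -r copies everything regardless of date; a find-plus-tar pipe can apply the date filter while keeping the directory structure intact (a sketch with GNU tools; assumes -newermt is available):

# Copy only files modified since 1 Jan 2011, preserving the tree.
cd /home/moviecar/public_html/wp-content/uploads
find . -type f -newermt 2011-01-01 -print0 |
    tar --null -cf - -T - |
    tar -xf - -C /home/teaser/public_html/wp-content/uploads/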
I am trying to configure logrotate on APP/DB servers. As per my backup policy, logs will be compressed daily and moved to a central storage device.
My Tomcat generates several application logs with a date extension as well as a .log extension, e.g. app.log, app.log.2010-10-23-14, catalina.out, catalina.2010-10-25.log, etc.
Currently my Tomcat log rotation is under /etc/logrotate.d/:
# cat /etc/logrotate.d/tomcat
/usr/local/tomcat/logs/*log {
[code]....
But it's rotating only logs with a .log extension, i.e. app.log.2010-10-23-14 (with the date extension) is not rotated. If I put "*" instead of "*log", it rotates all files, including already-rotated ones. How can I rotate the files that have a date extension? Also, I don't want to keep rotated logs for more than 3 days.
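Since the date-stamped files (app.log.2010-10-23-14 and friends) are already rotated copies written by Tomcat itself, one workable pattern is to leave logrotate handling only *.log and clean the date-stamped ones with a small cron'd find instead of logrotate (a sketch; the glob is an assumption about the naming):

# Compress date-stamped logs that aren't compressed yet...
find /usr/local/tomcat/logs -name "*.20??-??-??*" ! -name "*.gz" -exec gzip {} \;
# ...and drop anything older than 3 days.
find /usr/local/tomcat/logs -name "*.20??-??-??*" -mtime +3 -delete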
I have a folder with hundreds of .txt files (logs of some Java application) that I have to merge into one single .txt file. The application produces a new log file every day:
day1: logFriday10September2010.txt day2: logSaturday11September2010.txt ... day8: logFriday17September2010.txt ... and so on...
I could merge the files easily with "cat" and ">>"; however, the problem is that I have to do it taking into account the date (creation or modification) of each file.
If I simply use the cat command, the output file will receive, for example, all Fridays in a row, then all Saturdays, etc., and in that way I'm not considering the date.
I've searched through the options of the find command, since the files are not modified after creation... I tried to use this, for example:
$ find . -newer <some old file>
but that lists all files newer than that <some old file>, not ordered by the correct date.
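Since the merge order should follow mtime rather than name, a sketch (GNU find; assumes filenames without embedded newlines):

# Concatenate the logs oldest-first into one file; %T@ has no spaces,
# so everything after the first space is the path.
find . -name "log*.txt" -printf '%T@ %p\n' | sort -n | cut -d' ' -f2- |
    while IFS= read -r f; do cat "$f" >> merged.txt; done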
I'm trying to find a proper command to move a certain set of files according to date/time range. I am thinking that the command should be something like:
I am using CRON to create a new, blank file every minute, in a specific location on my web server. After web searching and reading man pages, I get the impression that the following command is supposed to work:
touch /home/mydomain/var/folder/attachments/`date +%H%M`.txt
This should give me a new file with a file name that is the current hour and minute. However, when executed, the CRON mailer reports:
/bin/sh: -c: line 0: unexpected EOF while looking for matching `
/bin/sh: -c: line 1: syntax error: unexpected end of file
So it looks like the shell is seeing the plus (+) sign as an EOF. Obviously, nothing gets created. What would be the easiest single-line command to create an empty file, at a given location, with a time-based file name?
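The actual culprit here is usually cron, not the shell: in a crontab line a bare % marks the end of the command (everything after it becomes stdin), which truncates the command mid-backtick, exactly as the error suggests. Escaping the percent signs should fix it (a sketch; schedule and path kept from the original):

# Crontab entry: % must be escaped as \% inside crontab lines.
* * * * * touch /home/mydomain/var/folder/attachments/`date +\%H\%M`.txt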
I have bought an external USB hard drive on which I back up my three computers every once in a while. Space will quickly be used up. I can't find that little bit of research that I needed yesterday. Here is what I would like to find: an application that eliminates doubles of identical files and renames files that have changed by appending the last-saved date yyyymmdd to the file name. Does such an application already exist?
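I'm not aware of a single tool that does both, but the pieces exist (a sketch; fdupes handles duplicate detection if installed, and the rename can come from the file's own mtime, shown here on a hypothetical file name):

# Report byte-identical duplicates across the drive.
fdupes -r /media/usbdisk
# Rename one file to carry its last-modified date (GNU date -r).
f=/media/usbdisk/report.txt
mv "$f" "${f%.*}-$(date -r "$f" +%Y%m%d).${f##*.}"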
I am new to Linux (using Ubuntu 10.04). I have noticed that during replacement of a file, the date and size of the new and old files are not shown in the dialogue box, so how do I show that (like the dialogue in Windows)?
I know that it is an easy question, but I really don't know how to do it. By the way, I have checked the folder preferences and System -> Preferences, but I did not find anything for that.
I need a script that will take all the files in a given directory, create new monthly sub-directories, and sort all the files into the appropriate directory based on creation date. For example, all files created between 01/01/09 and 01/31/09 would be placed in 'JAN-2009'.
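A sketch of such a script, with the caveat that most Linux filesystems don't record a creation time, so this buckets by last-modification time instead (the source directory is a placeholder):

#!/bin/bash
# Move each file into a MON-YYYY folder derived from its mtime.
cd /data/incoming || exit 1
for f in *; do
    [ -f "$f" ] || continue
    d=$(date -r "$f" +%b-%Y | tr '[:lower:]' '[:upper:]')   # e.g. JAN-2009
    mkdir -p "$d"
    mv -- "$f" "$d/"
done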