General :: Finding Files To Backup?
Mar 12, 2011
I have a situation where a directory has about 1.5 million files in it. On an hourly basis, I want to be able to find any files that have changed in the last hour, compress them, encrypt them, and then copy them to both a local backup machine and an off-site backup.
Is there any kind of utility or kernel module that creates some type of log of modified files? I know I can use find, but the search for -mtime in this directory takes quite a while and will not suffice for an hourly backup.
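One option, assuming the inotify-tools package is available, is to let inotifywait log changed paths continuously instead of rescanning with find; a minimal sketch (directory and log paths are placeholders):
Code:
# monitor recursively, appending each completed write to a log
inotifywait -m -r -e close_write --format '%w%f' /data/bigdir >> /var/log/changed-files.log &
# note: with 1.5 million files you may need to raise fs.inotify.max_user_watches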
View 3 Replies
May 21, 2011
I'm just setting up a partition on a separate HDD in my system. I plan to use the partition to back up the important files on my main HDD (to guard against a hard drive crash).
The question I have is about the typical location to auto-mount this partition. Which of these would be the normal choice (a sample fstab entry follows the list):
1. /backup/
2. /media/backup/
3. /mnt/backup/
4. /home/chris/backup/
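Whichever location wins out, a minimal /etc/fstab line for auto-mounting it might look like this; the device name and filesystem type are assumptions:
Code:
/dev/sdb1   /mnt/backup   ext4   defaults   0   2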
View 7 Replies
View Related
Jun 17, 2010
What is the best Linux Mint backup tool that is most like Time Machine (that ships on Macs)?
The one thing I want it to have in common with Time Machine is that it only backs up files that have changed, making for faster backups.
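Tools often suggested for this (e.g. Back In Time, rsnapshot) build Time Machine-style snapshots out of rsync hard links, so unchanged files cost no space or copy time; a rough sketch with placeholder paths:
Code:
# files unchanged since the last snapshot become hard links, not copies
rsync -a --link-dest=/backups/latest /home/ /backups/2010-06-17/
ln -sfn /backups/2010-06-17 /backups/latest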
View 2 Replies
View Related
May 10, 2011
I installed and tested Restore EE Backup Server on a test PC with a basic configuration, and it's working fine.
[URL]
The issue I have is: where are these backup snapshots or files being saved? I want to add separate storage to hold the backups.
View 1 Replies
View Related
Nov 29, 2010
To find all files recursively starting with a . (period), is the following OK:
find ./ -name '.'*
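For what it's worth, quoting the pattern keeps the shell from expanding the glob before find sees it; a sketch (GNU find's -mindepth 1 also skips the starting directory itself):
Code:
find . -mindepth 1 -name '.*'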
View 7 Replies
View Related
Apr 19, 2010
I am a newbie in Linux. I am writing a bash script to identify files that are exactly 7 days (a week) old. I tried the command find /var/backup -mtime +7 -exec ls -d {} \; but this gives me even the files which are older than 7 days.
[root@proxy access]# find . -mtime +7 -exec ls -d {} \;
./access.log.1.gz
./access.log.2.gz
[code]...
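With GNU find, -mtime +7 matches files strictly more than 7 days old, which explains the extra results; a bare number matches only that (whole 24-hour period) age, so a sketch for files exactly a week old is:
Code:
find /var/backup -mtime 7 -exec ls -ld {} \;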
View 3 Replies
View Related
Apr 23, 2010
I am looking for this `struct messages_sdd_t` and I need to search through a lot of *.c files to find it. However, I can't seem to find a match, because 'struct' on its own is used many times and I keep getting pages of search results; I want to search only for the exact phrase 'struct messages_sdd_t'. The directory I am searching in contains other directories, so the search will have to be recursive. I have been doing this without success:
Code: find . -type f -name '*.c' | xargs grep 'struct messages_sdd_t'
and this
Code: find . -type f -name '*.c' | xargs egrep -w 'struct|messages_sdd_t'
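Assuming GNU grep, recursing with --include avoids the find/xargs pipeline entirely and keeps the two words together as one phrase; a sketch:
Code:
grep -rn --include='*.c' 'struct messages_sdd_t' .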
View 3 Replies
View Related
Nov 27, 2010
Can anyone give me a script for copying all of the files in a directory that are not directories to another directory? Maybe this could work:
Code:
# glob instead of parsing ls; copy only regular files, skipping directories
for f in *; do
    [ -f "$f" ] && cp "$f" /destination/path/
done
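An equivalent one-liner, assuming the files sit directly in the current directory:
Code:
find . -maxdepth 1 -type f -exec cp {} /destination/path/ \;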
View 14 Replies
View Related
Jan 28, 2010
I'm wondering if you can share some tips on finding .conf files for programs installed using package managers. I'm scratching my head over the fact that when I install a program through yum/apt-get, I don't know what is being installed or where. In Windows, I know that when it installs an application, it goes into the Program Files directory; it's that simple. I know Linux has predefined directories for applications, but sometimes it installs configuration files in /etc or other locations under /usr which I have a tough time sifting through.
Is there a way to trace which .conf files (or any files, for that matter) belong to which piece of software? It's just hard for me to understand which file relates to which application at the moment. As much as I would like to learn more about Linux, this process takes up a lot of time. I hope you can help me out on this one.
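The package managers themselves can answer this; the queries below (package and file names are placeholders) list what a package installed and which package owns a given file:
Code:
dpkg -L packagename     # Debian/Ubuntu: list every file the package installed
dpkg -S /etc/some.conf  # which package owns this file
rpm -ql packagename     # Fedora/RHEL: list the package's files
rpm -qc packagename     # Fedora/RHEL: list only its config files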
View 1 Replies
View Related
Oct 5, 2010
Assume my directory structure looks as follows:
Code:
--root
  --dir1
    --dir1-1
  --dir2
    --dir2-1
    --dir2-2
Under each subdirectory there are some log files ending with .log. Now I want to list all these log files. How do I do that?
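A sketch, assuming the tree is rooted at ./root:
Code:
find ./root -type f -name '*.log'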
View 2 Replies
View Related
Aug 11, 2011
I know find can do what I am looking for, but I am wondering if there is an alternative way to find files on the filesystem either created before/after a certain point, or at a certain time.
Typically I rely on updatedb & locate for most of my file searching needs. The issue with those tools, though, is that the database only holds directory and file names, and it only covers local directories, not anything mounted via CIFS|NFS or via -o loop (e.g., .iso images).
So if I need to find files created after yesterday across the entire system (local and remote filesystems), I am currently needing to use find.
What other tools, if any, would accomplish this in a similar fashion?
I have tried ls and grep, but that requires (in my attempts so far) multiple searches:
ls -lR | grep Aug | grep 10
ls -lR | grep Aug | grep 11
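For the time-based searches themselves, GNU find's -newermt takes a plain date string instead of a reference file, which avoids the two-pass ls|grep approach; a sketch:
Code:
# files modified since the start of yesterday, across mounted filesystems
find / -type f -newermt 'yesterday'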
View 6 Replies
View Related
Aug 24, 2011
My collection contains some MP3s which have some glitches like
displaying the wrong duration on loading
minor jumps
suddenly ending even though the duration claims another minute remains
noise
I'm looking for a tool that can detect as many of these glitches as possible and fix those that can be fixed (obviously e.g. noise can not simply be eliminated in most cases).
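One commonly suggested checker is mp3val, which scans for header/synch problems and can rewrite fixable files; a sketch assuming it is installed:
Code:
mp3val *.mp3     # report glitches
mp3val -f *.mp3  # attempt to fix what it can (keeps .bak copies by default)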
View 2 Replies
View Related
May 29, 2010
I have a website that has a massive list of royalty free stock photos and I want to download all of them. I have bought a membership for [URL] so I am able to download as much as I want from them for the next month.
Instead of going page by page and manually downloading each set of stock photos manually, I would like to automate this process. Here's my idea:
1. Download the website with the links to hotfile [URL]
2. Use grep to retrieve all the links to [URL]
3. Feed the links I receive from grep into wget and download all of them.
The problem I'm getting is that when I use grep, it retrieves the entire line of HTML code where "hotfile.com" appears. So here is an example of one link I receive in the output:
Quote:
./1776-santa-claus-vector-set.html:<div align="center"><a href="http://hotfile.com/dl/18418176/181a55b/Santa_Claus_Vector_Set.rar.html" target="_blank">HotFile</a></div>
Is there a way to just have the link shown in the output?
PS: I have everything else working, I just need an automated process of getting all the links.
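grep's -o option prints only the part of each line that matches, which is exactly what's needed here; a sketch (the file glob and output name are placeholders):
Code:
grep -ohE 'http://hotfile\.com/[^"]+' *.html > links.txt
wget -i links.txt   # wget downloads every URL listed in the file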
View 5 Replies
View Related
Jun 8, 2010
How do I find files with a ctime older than, let's say, 5 seconds? What I would like to do is move all "new" files in a specific folder, but not files that have not yet finished uploading.
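GNU find can compare ctime against a plain date string via -newerct, and negating it gives "older than"; a sketch with placeholder paths:
Code:
# move files whose ctime is at least 5 seconds old
find /incoming -type f ! -newerct '5 seconds ago' -exec mv {} /processed/ \;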
View 1 Replies
View Related
Dec 9, 2010
I've got a quick grep question. I'm trying to work out a command I can use to locate all of the files in a directory that have SQL database connection details. I want to do it by looking for the strings "localhost" and the name of the database.
Code: find . -type f -exec grep -l -E '^(localhost|DATABASE_NAME)' {} \;
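Since connection details are rarely at the start of a line, dropping the ^ anchor and letting grep recurse on its own may be simpler; a sketch (DATABASE_NAME is a placeholder):
Code:
grep -rlE 'localhost|DATABASE_NAME' .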
View 4 Replies
View Related
May 18, 2011
Is there an application which does the following?
1. I have a laptop with an internal 200GB HD.
2. I run the application & it creates a list of all files (size & time-stamp) without actually storing them. Let's call this the "snapshot list".
3. I update some of the files on the laptop.
4. Now I run the application & it only copies the files which have changed on the laptop, that have different size/time-stamp from the snapshot list, onto some external media, such as a memory card. Of course, the files should be copied onto their proper location in the directory tree & not just pile up in one place.
Why is this useful? Although the laptop has a 200GB HD, I typically only update a small number of files, whose total size is maybe 10MB or so. If I could back up only those which have changed, I could do it with a tiny SD card instead of lugging around an external USB HD.
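One way to approximate this without storing a full file list is to "take the snapshot" by touching a stamp file, then copy everything newer than it while preserving paths (GNU cp's --parents rebuilds the directory tree); the paths are placeholders, and this compares timestamps only, not sizes:
Code:
touch ~/.backup-stamp           # step 2: record the moment
# later, step 4: copy only files changed since the stamp
find /home -type f -newer ~/.backup-stamp \
    -exec cp --parents {} /media/sdcard/backup/ \;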
View 8 Replies
View Related
Feb 11, 2010
I am running Live 12 from the CD-ROM drive of my dying laptop. I have a major Windows registry error on that system and am working to recover my files. I have successfully moved a couple of folders from the laptop to my Seagate Free Agent drive as a test. What I would like to know is: is there a way to copy my files and folders without literally dragging and dropping each one? We're talking 140 GB of folders... sigh.
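Assuming both drives are mounted, a single rsync invocation (the mount points are placeholders) copies the whole tree in one go and can be re-run later to pick up where it left off:
Code:
rsync -av /media/laptop-disk/Users/myname/ /media/FreeAgent/rescue/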
View 1 Replies
View Related
Dec 26, 2009
I'm running ubuntu 9.04 64-bit server and am looking to backup my entire OS drive. I've got a 200GB main drive, and a 1TB storage drive mounted at /storage. I'm already good as far as setting up backups of my data - but redoing all of my settings and software would be a nightmare in the event of a HD failure.
So what I'm looking for is a command line utility to do an image of the main 200gb drive to an external usb drive. The software needs to function similar to the Windows Vista/7 System Image utility or DriveImage XML and be able to make the images without shutting down. The best I've found so far was [URL], but it uses a GUI, and doesn't support large files.
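A bare-bones sketch with dd (device and target paths are assumptions); note that imaging a mounted, running filesystem can yield an inconsistent image, which is why snapshot-capable setups such as LVM are usually recommended for true live imaging:
Code:
dd if=/dev/sda bs=4M conv=noerror,sync | gzip -c > /mnt/usb/sda.img.gz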
View 1 Replies
View Related
Sep 18, 2010
What GUI Linux programs are there for finding files based upon their contents?
View 5 Replies
View Related
Mar 13, 2011
I have been using Linux for close to 2 years now. One thing that has always bewildered me is audio support in Linux. These days I log in to Windows only for listening to music. After reading various blogs, I decided to give it a try in Linux with Amarok. There again I am facing a problem:
when I try to scan for music files, it does not find any files.
what I did is as follows:
Quote:
Setting --> configure Amarok --> collection --> scan
I have tried with .mp3 and .wma format files. When I add these files individually, Amarok is able to play them.
I am using Amarok 2.3.2 with KDE 4.4.5. Fedora - 12 is my flavor.
Given below is the log obtained with amarok -debug option
Quote:
TagLib: MPEG::Header::parse() -- First byte did not match MPEG synch.
TagLib: MPEG::Header::parse() -- Invalid sample rate.
TagLib: A frame of unsupported or unknown type 'TSC' has been discarded
TagLib: A frame of unsupported or unknown type 'TSC' has been discarded
TagLib: A frame of unsupported or unknown type 'TSC' has been discarded
[Code].....
View 3 Replies
View Related
Jul 22, 2011
How do I find the total size of all files whose names start with 'a'?
OS: SunOS
du -h a* is giving individual file sizes.
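On SunOS, where du lacks GNU's -c total option, summing the per-file sizes with awk is a portable sketch:
Code:
du -sk a* | awk '{kb += $1} END {print kb " KB total"}'
# with GNU du instead: du -ch a* | tail -1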
View 6 Replies
View Related
Jan 10, 2011
I copied a backup of my Windows 'My Documents' folder and all of its subfolders into my Linux (Mint Debian) Documents directory. I found that many of my files can be found in more than one directory, so what I want to do is find all the dups and deal with them. Is there a good Linux application to resolve this 'duplicates' problem? (I don't want to touch the Linux system files.)
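One frequently suggested tool is fdupes, which finds duplicates by comparing sizes and then checksums; a sketch assuming it is installed:
Code:
fdupes -r ~/Documents    # list duplicate sets recursively
fdupes -rd ~/Documents   # interactively choose which copies to delete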
View 2 Replies
View Related
Feb 2, 2010
I'm using GNOME. I created some text files. After I change something in myfile.txt, a myfile.txt~ is automatically created as a backup. OK... now, in KDE I see these ~ files and I can delete them if I want, but they're hidden in GNOME... GNOME seems to treat them as special hidden files (just like .mydir directories). How do I make ~ files visible in GNOME so I can edit or delete them?
View 1 Replies
View Related
Sep 13, 2010
I am using MySQL as the database system for my application on a Linux system. Every week I update the system and take backups (mysqldump) of the two databases that changed. I then .tar.gz them and FTP the resulting file to a remote server, after which I remove the original backups and tar.gz files from the Linux server. Being a complete novice when it comes to Unix systems, I would like to know if it is possible to write a script which would do all this automatically, i.e. perform the following steps (a sketch follows the list).
1) Backup database A to A.sql (mysqldump)
2) Backup database B to B.sql (mysqldump)
3) tar -cvzf dest.tar.gz A.sql B.sql
4) ftp dest.tar.gz to ftp@remoteserver.com
5) Delete A.sql, B.sql, and dest.tar.gz from the local server
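A sketch of such a script; every credential, database, and host name below is a placeholder:
Code:
#!/bin/bash
# 1) and 2) dump both databases
mysqldump -u backupuser -pSECRET database_A > A.sql
mysqldump -u backupuser -pSECRET database_B > B.sql
# 3) compress them together
tar -czf dest.tar.gz A.sql B.sql
# 4) upload with the stock ftp client
ftp -n remoteserver.com <<'EOF'
user ftpuser ftppassword
binary
put dest.tar.gz
quit
EOF
# 5) clean up the local copies
rm -f A.sql B.sql dest.tar.gz
# run it weekly from cron, e.g.: 0 3 * * 0 /usr/local/bin/db-backup.sh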
View 2 Replies
View Related
Oct 26, 2010
I backed up my Laptop with a script, as follows:
Code:
#!/bin/bash
# 'sudo' must prefix the command; on a line of its own it does nothing
sudo growisofs -Z /dev/dvd -dvd-compat -r -v /home
I then installed a new version of Ubuntu 10.04 from disk and copied the files in /home from the DVD to the hard drive. I am able to open, view, etc. all the files in most directories except those in /home/documents. There are text files created by gedit, OO word processor files, and several PDF files. I cannot open or view these files: gedit and the PDF files get an error message "Don't recognize file type" (they are clearly marked PDF). The OO files look like rows of 'high bits', and a dialogue box opens giving me the options to change character set, font, language, and paragraph break.
View 6 Replies
View Related
Apr 13, 2011
How do I find the file that was most recently written to in a directory of 1000 files?
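A quick sketch: sort by modification time and take the newest entries (path is a placeholder):
Code:
ls -lt /path/to/dir | head -5   # most recently written files first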
View 8 Replies
View Related
Apr 21, 2011
I have around 100 users. I want to take a backup of the files on each user's desktop. My user directory path is: /home/dr/<user_name>/Desktop. The requirements are listed below; a sketch follows the list.
1) Script has to run on a particular time everyday
2) Script has to take backup of all files present in "Desktop" directory
3) Make a tar with name "yyyy-mm-dd-desk-files"
4) Make a directory outside "Desktop" with the name "Desktop-Backup"; if it already exists, don't make it again.
5) The tar has to be moved into this folder.
6) Remove the files from the "Desktop" directory (i.e. Desktop should be empty).
7) Mail the status that "Backup Successful"
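A sketch of such a script; the mail recipient and install path are placeholders, and it assumes the /home/dr/<user_name>/Desktop layout above:
Code:
#!/bin/bash
stamp=$(date +%Y-%m-%d)
for desk in /home/dr/*/Desktop; do
    bdir="$(dirname "$desk")/Desktop-Backup"
    mkdir -p "$bdir"                        # 4) no-op if it already exists
    tar -czf "$bdir/$stamp-desk-files.tar.gz" -C "$desk" .   # 2), 3), 5)
    rm -rf "$desk"/* "$desk"/.[!.]* 2>/dev/null   # 6) empty the Desktop
done
mail -s "Backup Successful $stamp" admin@example.com < /dev/null   # 7)
# 1) run daily via cron, e.g.: 0 2 * * * /usr/local/bin/desktop-backup.sh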
View 2 Replies
View Related
Apr 29, 2011
I keep a backup of a bunch of files on a flash drive, so that whenever I change distributions I can just restore all my Android stuff (saves on re-downloading everything). One of these is the Android SDK.
In my ~/.bashrc I add the paths to some executables in the SDK, only if the directory exists, and only if the path is not already in $PATH. For the Android NDK this works fine, but for the SDK I get this:
Code:
snfo@snfo:~$ adb devices
bash: /home/snfo/Android/sdk/platform-tools/adb: No such file or directory
snfo@snfo:~$ ls -F /home/snfo/Android/sdk/platform-tools/adb
/home/snfo/Android/sdk/platform-tools/adb*
Everything else is fine though, just that one path is causing trouble.
Now, I've seen something similar to this before when you move an executable from one place to another: if you don't re-source your bash config, it keeps looking wherever the executable used to be located. But I've never moved these files.
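One classic cause on a 64-bit distro: adb in the SDK is a 32-bit binary, and when the 32-bit loader/libraries are absent, exec fails with exactly this misleading "No such file or directory" even though the file exists. A quick check (the fix-up package name varies by distro and is an assumption):
Code:
file /home/snfo/Android/sdk/platform-tools/adb   # look for "ELF 32-bit"
sudo apt-get install ia32-libs                   # Debian/Ubuntu, circa 2011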
View 3 Replies
View Related
Dec 8, 2010
I am backing up parts of my computer with dd, and I was wondering if there is a quick way to split the files created into 4.4GB files that will fit onto a DVD. Does anyone have any idea how to do this?
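GNU split understands size suffixes, and dd can be piped straight into it so the full image never has to exist as one file; a sketch with placeholder paths (4300M leaves headroom on a 4.4GB DVD):
Code:
dd if=/dev/sda bs=4M | split -b 4300M - /mnt/storage/sda.img.
cat /mnt/storage/sda.img.* > restored.img   # reassemble later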
View 6 Replies
View Related
Jul 24, 2010
Today I tried to compress some folders containing backup files from last year. I right-clicked on the folders and selected compress as tar.gz. I let it work, and found that hours later, the folders were still compressing. How long is it supposed to take, anyway? I was trying to compress the two sets of backups simultaneously; together they're around 1.5 GB. They have many subdirectories.
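To see whether it is actually progressing, one trick (assuming the pv utility is installed; the folder name is a placeholder) is to run the same compression by hand with a progress gauge:
Code:
tar -cf - backup-folder | pv -s $(du -sb backup-folder | cut -f1) | gzip > backup.tar.gz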
View 4 Replies
View Related