Ubuntu :: Command With The -r Option To Compare A Large Number Of Files And Files In Subdirectories
Jun 16, 2011
I am using the diff command with the -r option to compare a large number of files and files in subdirectories. My main interest is to find out which files have been changed, not what the actual changes are, and since a lot of files have been changed, it would be a lot easier to view the file names only. Is there an option for diff that might do this, or does a similar tool/command exist that could do the job?
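diff itself can do this with -q (--brief) alongside -r; a minimal sketch, with dir1 and dir2 as placeholder directory names:

    # Report only which files differ, not the changes themselves:
    diff -qr dir1 dir2
    # Or strip the output down to just the file names on the left side
    # (breaks on names with spaces; fine as a quick sketch):
    diff -qr dir1 dir2 | awk '/^Files/ {print $2}'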
This problem is not exclusive to Ubuntu; I've experienced it on Windows and OS X as well. It seems that almost every time I transfer a large number of files (i.e. my music collection) between my desktop computer and laptop via my external hard drive, I end up losing files for no reason. I usually don't notice the files are missing until later on, because I am never informed of any data loss. Now, every time I make a large transfer of files, I just do it two or three times to ensure that I don't lose any files.
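Rather than repeating the transfer, one way to verify it is a checksum pass with rsync in dry-run mode; a sketch, with placeholder paths:

    # -r recursive, -c compare by checksum, -n dry run: lists any file
    # that is missing or whose content differs on the destination,
    # without copying anything.
    rsync -rcn --out-format='%n' /source/music/ /external/music/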
I am facing a problem copying a large number of files, 18 lakh (1,800,000), from my personal hard disk to another hard disk. Each file is very small, and the size of the folder is around 3.95 GB. Copying the files with the copy provided by Windows is frustrating, and I am not even able to compress them; it gives me an error that they are not readable. The other problem is that I am not able to open this drive in Linux; it shows an error there saying to run a disk check in Windows, and the Windows disk check is also not able to repair the drive and goes into some unsolvable mode. Is there any way to open the disk despite the error, and if not, is there any way I can copy the data faster? ERROR: Disk labeled EDU is corrupt. Go to Windows and run chkdsk /f there, then reboot into Windows 2 times.
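On the Linux side, one route often suggested for a partition that refuses to mount is GNU ddrescue: image the damaged partition first, then copy files out of the image. A hedged sketch; /dev/sdb1 standing in for the EDU partition is a placeholder:

    # Image the failing partition; the map file lets an interrupted run resume.
    ddrescue -d /dev/sdb1 edu.img edu.map
    # Mount the image read-only and copy the data out at leisure.
    mkdir -p /mnt/rescue
    mount -o loop,ro edu.img /mnt/rescue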
I understand that chroot is usually used to provide security; however, for my issue, security is a big don't-care. I am very new to using chroot and don't fully understand how the chroot'd env works.
Problem: trying to use a vendor-supplied cross-compile environment. The environment runs as a chroot'd env and works just fine. I have a large number of additional modules that I wish to compile in the chroot'd environment. FYI, these modules are also (successfully) compiled for other targets not using chroot'd envs. Copying the source files into the chroot environment is not an option (I don't have hours to wait for copies to finish, and it would break the make system). Having them live in the environment is also not an option (the chroot build is a tiny part of the build process and we cannot revamp our entire source tree to accommodate it).
I am looking for a way to give the compiler in the chroot'd env access to a path that is outside of the env, typically higher up in the same path that holds the chroot'd env. I have tried soft links (they don't work as expected, since the link targets are resolved inside the chroot and so point at nothing). Hard links only work for single files, and there are tens of thousands of files that would need to be linked. I am not sure how I would go about exporting the additional files and then mounting the exported files in the chroot'd env (or if that would even work).
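One mechanism that fits this exactly is a bind mount, which makes an outside directory appear at a path inside the chroot without copying anything; a sketch, with placeholder paths standing in for the real layout:

    # Bind-mount the external source tree into the chroot:
    mount --bind /build/source /opt/vendor-chroot/build/source
    # ... run the chroot'd build against /build/source as usual ...
    # Detach it again when the build is done:
    umount /opt/vendor-chroot/build/source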
I'm trying to extract the sender ID from a fairly large number of files and am having trouble assigning variables from a file. Here is what I have so far (which is fairly kludgy, I know, but it's been some years since I've done any scripting or programming, and I find that I have lost the knack to a large degree).
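For the extraction itself, a loop like the following may be close; this is a rough sketch only, and the 'Sender-ID:' field name, the messages/ path, and the field position are all guesses:

    # Pull the first Sender-ID line out of each file and print it
    # next to the file name (field name and layout are assumptions).
    for f in messages/*; do
        id=$(grep -m1 '^Sender-ID:' "$f" | awk '{print $2}')
        printf '%s\t%s\n' "$f" "$id"
    done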
I want to move files from a $SOURCEDIR to a $DESTBASE/$DESTDIR. Under $DESTBASE there are many directories, and I need to test beforehand if a file from $SOURCEDIR already exists in any of them.
This is obviously extremely slow, and the real use case involves dozens of dirs and thousands of files. Creating a temporary "index" file for the find command (instead of running it every iteration) speeds it up a little, but it's still very clumsy.
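Taking the index idea one step further: load the existing basenames into a bash associative array once (bash 4+), so each membership test is constant-time. $SOURCEDIR, $DESTBASE and $DESTDIR are the poster's variables; everything else below is an assumption:

    # Build the index of basenames already present anywhere under $DESTBASE.
    declare -A seen
    while IFS= read -r -d '' f; do
        seen[${f##*/}]=1
    done < <(find "$DESTBASE" -type f -print0)

    # Move only the files whose names are not in the index.
    for f in "$SOURCEDIR"/*; do
        [ -f "$f" ] || continue
        if [[ -z ${seen[${f##*/}]} ]]; then
            mv -- "$f" "$DESTBASE/$DESTDIR/"
        fi
    done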
Here I want to compare these two files with the diff command and need to put the common lines into a separate file. The file contents are strings 7 characters in length, and everything starts with the character e. File3's contents should be
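diff is not the natural tool for taking the intersection of two files; comm is, though it needs sorted input. A sketch, assuming the poster's files are file1, file2 and file3:

    # comm column 3 holds the lines common to both inputs;
    # -12 suppresses the other two columns. Sort first, as comm requires.
    comm -12 <(sort file1) <(sort file2) > file3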
I need to copy all subdirectories and files from one directory to another every 5 minutes or so, with the old data automatically being overwritten by the new data. I'd also like this to run at startup. Is there any way this can be done? If so, what program would I need to schedule the automation, and what is the command line I would need?
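One way to do this on Linux is cron driving rsync; a sketch with placeholder paths (edit the schedule with crontab -e; @reboot is a common cron extension for run-at-startup):

    # Mirror source to dest every 5 minutes; --delete makes dest an
    # exact copy, removing anything no longer in source.
    */5 * * * * rsync -a --delete /path/to/source/ /path/to/dest/
    @reboot     rsync -a --delete /path/to/source/ /path/to/dest/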
I have two servers: one has an essentially empty /, and the other has a subdirectory with a large amount of data (4 GB) spread across many, many files. I need a way to transfer the files en masse from the server with the large number of files to the one that is essentially blank. I don't have space on the full host to simply gzip all the files. I've googled this and see that there may be some combination of tar and/or gzip that will let me do this with some sort of redirection.
I really need an example line showing how this can be accomplished. If my explanation seems rather sparse, I can supply more details.
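One common pattern is to stream tar over ssh, so nothing is ever staged on the full host; a sketch, with the host name and paths as placeholders:

    # Pack on the fly, pipe across the network, unpack on arrival.
    # The destination directory must already exist on the empty host.
    tar czf - -C /path/to/source . | ssh user@emptyhost 'tar xzf - -C /path/to/dest'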
I've discovered that Dolphin seems to lose random files when copying many large folders.
I first noticed this a few months ago when I tried to copy my music library from one folder to another on the same HDD. It consisted of around 600 folders and 6500 files. During the copy there were no errors but after the copy I found that some of the newly copied folders were missing files. I put it down to human error or a glitch.
Yesterday I tried to copy 13 folders containing rips of some of my DVDs. Each folder basically had one film of either 700MB or 1.4GB. Again no errors showed up during the copy but I found 3 of the newly copied folders were empty.
It's not so critical with music or films but I can't afford to lose work data like this.
Has anyone experienced or seen a similar problem with Dolphin? I'm going to have to do some more extensive testing but this is not good.
The first time I noticed the problem I was running KDE 4.3.4 (I think) and now the latest was with KDE 4.4.0.
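Until the cause is pinned down, a quick post-copy check can at least catch the loss immediately; a sketch with placeholder paths:

    # Compare file counts first, then compare contents name-by-name.
    find /path/to/original -type f | wc -l
    find /path/to/copy -type f | wc -l
    diff -qr /path/to/original /path/to/copy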
I currently have hundreds of files, all in a single directory. What I would like to do is create 8 subdirectories and move the files into them based on the first character of the file name. Ideally, the script would omit any leading 'the' or 'a' and use the second word for filing purposes. No filenames have spaces; instead they use periods. The subdirectories will be:
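Since the list of 8 subdirectory names was cut off above, here is only a sketch of the filing loop; the bucketing below just uses the first character directly as a placeholder and would need to be replaced with the real 8-way mapping:

    #!/bin/bash
    # Run inside the directory holding the files (dot-separated names assumed).
    for f in *; do
        [ -f "$f" ] || continue
        name=$f
        # Drop a leading article ("the." or "a.") for filing purposes.
        case $name in
            [Tt]he.*|[Aa].*) name=${name#*.} ;;
        esac
        bucket=${name:0:1}   # placeholder: map this to one of the real 8 dirs
        mkdir -p "$bucket"
        mv -- "$f" "$bucket/"
    done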
I have a large directory tree with my ebooks, and some of these files are zipped. I would like to move all of the zip files to another directory so I can manipulate them. Since they are scattered throughout the tree, I would like to do it quickly and painlessly from the CLI. How should I proceed?
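A one-pass move with find should do it; a sketch, with the destination directory as a placeholder (mv -t is the GNU coreutils form that takes the target first, so find can batch the arguments):

    # Find every .zip anywhere under the ebooks tree and move it in batches.
    find ~/ebooks -type f -name '*.zip' -exec mv -t /path/to/zips/ -- {} +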
Two machines, both 64-bit, both with the latest TrueCrypt installed, one Windows, one Linux. I created an encrypted pendrive (using partition mode, not a file container on a regular partition) with a FAT filesystem on it. I copied several files/directories there using Windows. Then I unmounted it and mounted it in Linux, but when I checked whether everything was OK, it appeared that only the top level of the file hierarchy was visible, i.e. only the files and directories at the mount point. Directories are seen as empty (in fact they are NOT empty). When I unmount the pendrive and mount it again on Windows, all files/subdirectories are visible again. It is not an issue of an older TC version on Linux, because the version is the same. The question: how do I make the files in subdirectories visible on Linux?
In reading the rsync man page and browsing a lot of websites, I ended up a bit confused, or maybe it was just too much eggnog. Anyway: I'm rsyncing the whole user1 directory to an external drive, and I want to exclude a directory "videos" with everything in it, which is /home/user1/camera/videos. What should the exclude option look like?
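Assuming the source is given as /home/user1/ (trailing slash), an exclude pattern relative to that transfer root should do it; a sketch, with the destination path as a placeholder:

    # camera/videos, and everything under it, is skipped; a pattern
    # without a leading / matches that path anywhere in the tree.
    rsync -av --exclude='camera/videos/' /home/user1/ /mnt/external/user1/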