General :: Finding A Recursive Shell Or Perl Script To Delete Files With The Same Name As The Parent Folder?
Jun 29, 2010
Is there a recursive shell or Perl script to delete files with the same name as the parent folder? I wish to pass the starting folder name as an argument to the script.
I am facing a problem in Windows due to a virus called Newfolder.exe, which creates files with the same name as their parent directory plus an .exe extension, and it does this for every directory in the entire hierarchy of the infected pen drive. The antivirus detects them, but it is painfully slow. So I thought this was a good opportunity to use the concepts of the almighty shell script to remove them, since they all follow the same pattern. Say my complete path is
Code:
/home/pkd/fol1/
The virus would have created a file with the complete path
Quote:
/home/pkd/fol1.exe
If fol1 has two more directories, fol11 and fol12, then there would be two more virus-created .exe files following the same pattern (/home/pkd/fol1/fol11.exe and /home/pkd/fol1/fol12.exe).
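A minimal sketch of such a script, assuming the pattern holds exactly as described (for every directory DIR there may be a sibling file DIR.exe) and that filenames contain no newlines; replace rm with echo first to preview what would be deleted:
Code:
#!/bin/sh
# Usage: ./script.sh /home/pkd/fol1
# For every directory DIR under (and including) the starting folder,
# delete the sibling file DIR.exe if it exists.
start="${1:?usage: $0 start-dir}"
find "$start" -type d | while IFS= read -r dir; do
    [ -f "$dir.exe" ] && rm -v -- "$dir.exe"
done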
I'm just starting out with bash shell scripting, and I'm going around in circles without a solution.
I'm trying to find some files under a folder hierarchy and, in case of errors, move these files to a destination folder under the same hierarchy, recreating that hierarchy if it does not exist.
I need to find all ._* files under /src and move them to /dest, recreating folder1 (or whichever other folders contain ._* files), but without moving files that do not match the pattern.
Code:
I tried the find command and I'm getting all the needed files
Code:
But I don't know how to use the output to get the parent folder of the files that are found, in order to
1- create the folder with mkdir -p /dest/folder1 or /dest/folder1/folder4
2- move the found files from /src/... to /dest/... with the mv command
I'm trying to build a find command that does it all in one line, but I'm a little lost.
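A minimal sketch under those assumptions (/src and /dest as named above, filenames without newlines):
Code:
#!/bin/sh
# Move every ._* file from /src to /dest, recreating the folder hierarchy.
find /src -type f -name '._*' | while IFS= read -r f; do
    rel="${f#/src/}"                      # path relative to /src
    mkdir -p "/dest/$(dirname "$rel")"    # recreate the parent folders
    mv -- "$f" "/dest/$rel"               # move only the matching file
done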
I have 2 external HDDs on which I have all my files. Yesterday I copied all the files from hdd2 to hdd1, and I want to eliminate duplicates, so I used FSlint to find them. Now I have a txt file that looks like this:
Code:
/media/My Book/!!!MIS DOCUMENTOS/Documentos/2 sep2003-jun2009 USB/!TESIS/TESIS/TESIS CVT LABVIEW Y CODEWARRIOR/LabVIEW85RuntimeEngineFull.exe
/media/My Book/HDD_Toshiba/Borrable/Pen_Drive_4GB/Tesis/Super CD de la tesis/LabView/LabVIEW85RuntimeEngineFull.exe
...multiplied by millions of entries.
Now I want to make a shell script to delete all the files/entries (read from the log file) that begin with:
Code:
/media/My Book/HDD_Toshiba/****
since HDD_Toshiba is the folder on hdd1 (My Book) that contains all the files from hdd2.
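A minimal sketch, assuming the list has one absolute path per line and is saved as dupes.txt (a placeholder name; FSlint's raw output may group duplicates differently):
Code:
#!/bin/sh
# Delete only the duplicate entries that live under the hdd2 copy.
while IFS= read -r path; do
    case "$path" in
        "/media/My Book/HDD_Toshiba/"*) rm -v -- "$path" ;;
    esac
done < dupes.txt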
Is there a way to test ownership of all files/folders under a folder, recursively, and then, if they are not user:user, do:
Code: chown user:user thatconcernedfile
The problem with
Code: chown user:user -R /folder
is that it changes ownership on files which are already OK. If you want to maintain specific ownership on a folder, this is really not a good way to do it:
Code:
while [ 1 ] ; do
    chown user:user -R /folder    # /folder contains 6.0 TB
    sleep 2s
done
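A minimal sketch of the selective approach, assuming the owner and group are both named user as in the question; only entries whose owner or group is wrong get touched:
Code:
#!/bin/sh
# Change ownership only where it differs; files already user:user are skipped.
find /folder \( ! -user user -o ! -group user \) -exec chown user:user {} +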
I'm pretty new to Linux and I've hit a wall. I have a folder 'forum'. I need all the files and subfolders, but I don't want them in forum. I have tried it with the GUI, but it won't let me paste them. How do I move all the folder's data without moving the folder itself via the terminal?
I have a folderA that contains folderB, which contains a lot of files. I would like to get rid of folderB, but not its contents; I want those contents to be inside folderA. How can I accomplish this on the command line?
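A minimal sketch for the folderA/folderB case (the same idea works for the forum folder above):
Code:
#!/bin/sh
# Move everything in folderA/folderB, including hidden files, up into
# folderA, then remove the now-empty folderB.
cd folderA || exit 1
mv folderB/* folderB/.[!.]* . 2>/dev/null    # second glob catches dotfiles;
                                             # errors from unmatched globs are silenced
rmdir folderB                                # fails safely if anything remains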
I have an Ubuntu headless server which is running Samba. My mp3 file collection resides on that server and is being shared.
So far, no problems connecting to that drive and writing to that share from my Windows box. But if I use my main laptop, which runs Ubuntu Lucid, and download an mp3 song from Amazon, the moment I move it to the share I get permission problems from the Windows machine. This is clearly a permission issue with group and others; the song is created on the share without read and write permissions for others or for the Samba group I created.
My question is: how can I make this process simple or automatic when moving songs to the share? I don't want to go there every time and run ...
Code:
Which was basically how I reset or fixed the problem.
I've read about umask, but I'm not sure whether it applies here, because I'm not creating the file but moving it.
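One hedged option is to let Samba itself normalize permissions on the share, which covers files written through a mounted Samba share (a move across filesystems is a copy, so it counts as a create). A sketch of the share definition; the share name music, the path, and the group sambausers are placeholders for your own:
Code:
[music]
    path = /srv/music
    writeable = yes
    # force the group and open up the masks so every file written
    # through Samba ends up group-owned and group/world readable
    force group = sambausers
    create mask = 0664
    directory mask = 2775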
I have to write a shell script that will delete all the .dat files in /var/oracle/etl/incoming whose creation date is more than 7 days before the current date.
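A minimal sketch; note that Linux filesystems generally don't record a creation date, so the modification time is the usual stand-in:
Code:
#!/bin/sh
# Delete .dat files whose modification time is more than 7 days old.
find /var/oracle/etl/incoming -name '*.dat' -type f -mtime +7 -exec rm -f -- {} +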
I have a FAT32 SD card with a file on it that, viewed in Windows, has a filename consisting of a long string of nonsense. Viewed in my Android phone's Linux terminal, ls -a shows nothing in the directory. When I try to delete the parent directory with rm -rf deleteme, it fails with "Directory not empty". When I try to delete/move it in Windows 7, it says the filename would be too long, and/or Explorer crashes. Windows disk check doesn't find anything wrong. How can I delete this?
I currently have a command to back up a directory; it will zip the directory and place it where I have told it to. Now what I am after is a command I can run before my code that will delete any tar.gz files dated before today. So in my ideal world it would be something like this: delete <'date +%m_%d_%y'.tar, so it would delete all the files in this folder dated before today.
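A minimal sketch, assuming GNU find and that the archives sit in the current folder; rather than parsing the date out of the filename, it tests each file's timestamp against today's midnight:
Code:
#!/bin/sh
# Delete tar.gz archives last modified before today (GNU find's -newermt).
find . -maxdepth 1 -name '*.tar.gz' -type f ! -newermt '00:00' -exec rm -v -- {} +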
I have an archive directory that needs to be cleaned up once per quarter. The top-level (/data/archive/*) directory names change daily, as do the subdirectories and the filenames (the application names everything according to date). Also, there are two top-level directories, bin and incoming, that we can't touch. I want to write a shell script that loops through the 15 or 20 top-level directories and deletes all files and subdirectories older than 3 days (skipping the bin and incoming folders). Can someone get me started on a script? I am kind of new to shell scripting.
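A minimal sketch to start from, assuming GNU find (for -empty and -delete) and the layout described above:
Code:
#!/bin/sh
# Clean /data/archive: remove files older than 3 days, then prune any
# directories the deletions left empty, skipping bin and incoming.
for dir in /data/archive/*/; do
    base=$(basename "$dir")
    [ "$base" = bin ] && continue
    [ "$base" = incoming ] && continue
    find "$dir" -type f -mtime +3 -exec rm -f -- {} +
    find "$dir" -mindepth 1 -type d -empty -delete
done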
I want a shell script or single-line command to delete all the files with an extension specified in the script (I have bash). For example: delete all files with the .obj extension.
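A minimal sketch that takes the extension as an argument, defaulting to .obj:
Code:
#!/bin/sh
# Recursively delete all files ending in the given extension (default .obj).
ext="${1:-.obj}"
find . -type f -name "*$ext" -exec rm -v -- {} +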
I can call this routine and it works fine when I enter a valid name for $PROJ. If I enter an invalid name, it goes to the else block and prints the statement. However, it does not call itself; instead, the script just exits. I've googled 'perl recursive subroutines' and the examples don't appear to be doing anything different.
There is a folder, and it's empty. Whenever I drag a new file into it, it should echo "there is a file in there" and keep monitoring the folder. How can I do that?
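A minimal sketch using inotifywait from the inotify-tools package, assuming it is installed; the folder name watched is a placeholder:
Code:
#!/bin/sh
# Watch the folder forever and report every file created in it.
inotifywait -m -e create ./watched | while IFS= read -r event; do
    echo "there is a file in there"
done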
I want to compare the following two tab-delimited .txt files (both are subsets of the original files) by comparing Columns 3 and 4 simultaneously. It is easy to compare C3, because both C3s are just numbers. But how to compare C4s? Basically, in File1, "G,G" = G in File2, "C,C" = C in File2, "A,A" = A in File2, and "T,T" = T in File2. In File2, A/T in Column 4 equals "A,T" or "T,A" in Column 4 of File1, C/T in Column 4 equals "C,T" or "T,C" in Column 4 of File1, and so on.
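A minimal awk sketch under those assumptions (file1.txt and file2.txt are placeholder names; File1's column 4 holds comma pairs, File2's holds a single letter or a slash pair, and column 3 is the join key):
Code:
#!/bin/sh
awk -F'\t' '
    NR == FNR { c4[$3] = $4; next }             # pass 1: index File1 by column 3
    {
        split(c4[$3], a, ",")                   # File1 pair, e.g. "G,G" or "A,T"
        if (a[1] == a[2]) f1 = a[1]             # "G,G" collapses to "G"
        else              f1 = a[1] "/" a[2]    # "A,T" becomes "A/T"
        rev = a[2] "/" a[1]                     # accept either order
        if ($4 == f1 || $4 == rev) print $3 "\tmatch"
        else                       print $3 "\tmismatch"
    }
' file1.txt file2.txt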
I need to write a shell script which can read the contents of a folder and place the files on a remote FTP server. I need to make sure that a file already placed on the remote FTP server is not attempted a second time. The file names will be something like Records-2011-05-09. The files will be generated by MySQL every hour.
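A minimal sketch, assuming the lftp client is available; the host, credentials, folder, and the uploaded.log state file are all placeholders:
Code:
#!/bin/sh
# Upload each Records-* file once, remembering what has been sent.
cd /path/to/outgoing || exit 1
touch uploaded.log
for f in Records-*; do
    [ -f "$f" ] || continue
    grep -qxF "$f" uploaded.log && continue    # already uploaded: skip
    lftp -u user,password -e "put $f; bye" ftp.example.com &&
        echo "$f" >> uploaded.log
done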
Recently I set up a system for a non-technical user. He only uses Firefox, Pidgin and OpenOffice for about 2 hours a day. I have created a folder "/home/jim/myFiles" where he can save his document files. But Jim has accidentally deleted his myFiles folder on 2 occasions; he had intended to delete a file in that folder. Is there a way to lock the folder so that the user can create/read/write documents in that folder but not delete the folder itself?
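One hedged approach, since a user who owns the parent directory can normally delete anything in it: make myFiles a mount point, which cannot itself be removed while mounted (the image path and size are placeholders, and the files inside remain deletable as usual):
Code:
#!/bin/sh
# Run as root: back myFiles with a small dedicated filesystem image.
dd if=/dev/zero of=/srv/jim-myfiles.img bs=1M count=1024    # 1 GB image
mkfs.ext4 -F /srv/jim-myfiles.img
mount -o loop /srv/jim-myfiles.img /home/jim/myFiles
chown jim:jim /home/jim/myFiles    # jim gets full access inside
An /etc/fstab entry would make the mount persist across reboots.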
I am writing a Perl script where I fork a child, and the child waits for a response from a C++ exe. Here is the code snippet:
Code:
my $retPid = fork;
if ($retPid) {
    # I am the parent
    ...
    # Install signal handler (note: \&REAPER is a code reference;
    # the original &REAPER would call the handler immediately instead)
    $SIG{CHLD} = \&REAPER;
    ...
    # sleep for a week (6 hours at a time)
    for ($cnt = 0; $cnt < 28; $cnt++) {
        select(undef, undef, undef, 21600);
        last if ( ! -e "/proc/$retPid" );
    }
    ...
} else {
    # I am the child
    ($iid, $rc, %data) = $cpp_exe->getResponse(604800);    # 1 week timeout
    ...
}
If the cpp_exe takes a very long time (around 30 hours) to respond, then the Perl script (both parent and child) dies. The $SIG{CHLD} handler defined in the parent is not getting called. It seems the parent dies first, then the child (or maybe both die at the same time). Why is the script dying?
I need to find a command to search all *.dat files in a certain path (including subdirectories) for the following text:
Code:
--------------------------------------------------------------------------------
Elements with small area
Element      Adjusted nodes
---------    --------------
16294        NO
17889        NO
and getting the list of elements with small area printed in a file "ErrorEl.txt". The output should have this form:
"/path/01/A.dat bad-el#01 bad-el#02
[code]....
I already know how to find the .dat files containing a certain string
Code:
c=/path/
grep -R --include="*.dat" "Elements with small area" $c | cut -d: -f1 >> ErrorEl.txt
but I don't know how to get from there to the element numbers (16294 and 17889 in the example above)
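A minimal sketch building on that grep, assuming every block looks like the example (a header line, a column-heading line, dashed rulers, then one numeric element id per row), and printing each id with a bad-el# prefix to match the requested form:
Code:
#!/bin/sh
c=/path/
# For each .dat file, print its path followed by the element numbers
# listed under the "Elements with small area" header.
find "$c" -name '*.dat' -exec awk '
    /Elements with small area/            { intable = 1; print FILENAME; next }
    intable && $1 ~ /^[0-9]+$/            { printf "bad-el#%s\n", $1; next }
    intable && ($1 == "Element" || /^-/)  { next }        # column headers / rulers
    intable                               { intable = 0 } # any other line ends the table
' {} \; >> ErrorEl.txt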
I am writing a shell script that finds all files named <myFile> in a directory <dir> or any of its subdirectories, recursively. I also need to take care of symbolic links that may form cycles, to avoid infinite loops. I am not supposed to use the find command for this.
I started writing the code but got stuck. I thought recursion might be a smart approach, but it's not working.
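A minimal bash sketch (bash 4+ for the associative array): each directory is resolved to its physical path and remembered, so a symlink cycle is visited at most once:
Code:
#!/bin/bash
# Usage: ./search.sh <dir> <myFile>
declare -A visited    # physical paths we have already walked

search() {
    local dir
    dir=$(cd -- "$1" 2>/dev/null && pwd -P) || return   # resolve symlinks
    [[ -n ${visited[$dir]} ]] && return                 # cycle: already seen
    visited[$dir]=1
    local entry name
    for entry in "$dir"/* "$dir"/.*; do
        name=${entry##*/}
        [[ $name == . || $name == .. ]] && continue
        [[ -e $entry ]] || continue                     # skip unmatched globs
        if [[ -d $entry ]]; then
            search "$entry"
        elif [[ $name == "$2" ]]; then
            echo "$entry"
        fi
    done
}

search "$1" "$2"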
I've been organizing my pictures (i.e. deleting the bad ones). However, I've recently got hold of those same pictures in a higher resolution, and I'd like to repeat the same deletions on the higher-resolution set. This means that I'll have two folders, High Res and Low Res. I'd go through all the files in the High Res folder and check if there's a file with the same name in the Low Res folder. If there is, the file in the High Res folder will be kept; otherwise, I'll delete it. So I want some quick way to delete PIC1.JPG, PIC3.JPG and PIC4.JPG from the High Res folder.
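A minimal sketch, assuming the two folders sit side by side and are literally named "High Res" and "Low Res":
Code:
#!/bin/sh
# Delete every file in High Res that has no same-named file in Low Res.
cd "High Res" || exit 1
for f in *; do
    [ -f "../Low Res/$f" ] || rm -v -- "$f"
done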