Ubuntu :: Easiest Way To Find Files In Folder / Directory?
Oct 10, 2010
With Windows XP, I just right-clicked a folder/directory and pressed FIND, then I could search for whatever file/folder name I wanted. I could even do custom searches based on size, modification date, etc. How do you do this on Ubuntu? There doesn't seem to be a way to do it as easily. So far I've found PLACES -> SEARCH FOR FILES, but that means I have to go into the directory I want. Whereas I would much rather be browsing through directories and THEN quickly search in a particular folder. The SEARCH FOR FILES method in PLACES just wastes more time.
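From a terminal, find covers the same ground as the XP dialog, run from whatever folder you happen to be browsing; a minimal sketch (the name pattern is an assumption):

Code:
# search the current directory tree by name, case-insensitively
find . -iname '*resume*'

# add size and modification-date filters, like the XP custom search
find . -iname '*resume*' -size +1M -mtime -7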
I'm quite new to Linux, but I have configured a simple FTP server and it's working great. I have an FTP-Shared folder with upload and download subfolders. Under uploads and downloads I have identical category subfolders like mp3s, movies, software, etc. in both. As the guys upload, I would like to create a crontab line that moves all the content under /FTP-Shared/upload/mp3/* older than 14 days to /FTP-Shared/downloads/mp3/ recursively (like the cp command), but the timestamp must be checked on the first-level directory and not the files below it, for example: /mp3/Club Dance/CD1/Hallo world.mp3. This is how far I got:

[root@clients ~]# /usr/bin/find /FTP_Shared/upload/Mp3s/ -depth -mindepth 1 -mtime +14 -type d -exec mv -f {} /FTP_Shared/download/Mp3s/ \;

This command moves the directories and files, but it does not behave recursively the way I want.
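A minimal sketch of such a crontab line, assuming the paths from the description; -maxdepth 1 keeps the timestamp test on the first-level category directories only, and mv moves each directory with all of its contents intact, so no separate recursion is needed:

Code:
# run nightly at 02:00; only first-level directories are tested against -mtime
0 2 * * * /usr/bin/find /FTP-Shared/upload/mp3 -mindepth 1 -maxdepth 1 -type d -mtime +14 -exec mv -f {} /FTP-Shared/downloads/mp3/ \;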
I am trying to find a directory named 480debugerror nested somewhere under child directories. I don't know the exact path, or even whether I have the exact spelling of the directory I want to find.
Is there any Linux command to find directories with a given prefix or suffix, for example directories named debug or debugerror with some unknown prefix or suffix?
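A minimal sketch with find, assuming the search should start at /; the same pattern would also match 480debugerror:

Code:
# directories anywhere under / whose names contain "debug",
# discarding permission-denied noise
find / -type d -name '*debug*' 2>/dev/null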
I cannot find the dpkg directory in the /usr/bin folder, and while installing mysql server it gives the following error. Errors were encountered while processing:
/var/cache/apt/archives/mysql-server-5.0_5.1.30really5.0.75-0ubuntu10.5_i386.deb E: Sub-process /usr/bin/dpkg returned an error code (1)
I would like to know how to move all the files from a single folder and its subfolders to a single, different location in as few steps as possible. For example, when I download files from one of my school's websites, the file I want is located in a deep subdirectory. So I have to cd many times just to get to the file I want. Is there a way to recursively move all the files within a folder's subdirectories into a new location?
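A minimal sketch with find, where the source and target paths are assumptions; note that files with identical names will overwrite one another at the destination:

Code:
# move every regular file in the tree, however deep, into one folder
find ~/Downloads/schoolsite -type f -exec mv {} ~/Documents/ \;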
Shell scripting in Fedora 14. I want a script: "Find files in the current folder, and copy the first file found under a name given by the user; if the name already exists, then echo an error message and finish." Command usage: "bash scriptname copyASname"
something like Code:
#!/bin/bash
for files in /home/user/*
do
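A minimal sketch of the whole script under those requirements; the usage check and the exact error messages are assumptions:

Code:
#!/bin/bash
# usage: bash scriptname copyASname
target="$1"
if [ -z "$target" ]; then
    echo "usage: $0 copyASname" >&2
    exit 1
fi
# if the destination name already exists, report and finish
if [ -e "$target" ]; then
    echo "error: $target already exists" >&2
    exit 1
fi
# copy the first regular file found in the current folder
for f in ./*; do
    if [ -f "$f" ]; then
        cp "$f" "$target"
        exit 0
    fi
done
echo "error: no file found in current folder" >&2
exit 1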
I am trying to use "find" but I can't quite get all of the switches right for it to work. I have a folder that contains many folders. Let's call that original folder "MyFiles". The subfolders contain Java files (and those subfolders possibly contain subfolders that contain Java files). Here is what I want to have happen:
0. Create a file to print to, call it "output.ps" 1. Find all of the Java files in the MyFiles tree. 2. For each Java file that is found, append it to output.ps along with its absolute path name.
So far I have:
find . -iname '*.java'
and this finds all of the Java files for me. But then I can't get the files to print to a file using exec.
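A minimal sketch of the missing -exec step, assuming MyFiles sits under the current directory; starting find from an absolute path makes it print absolute names, and the small sh loop appends each path followed by the file's contents:

Code:
# create/empty the output file, then append path + contents per Java file
: > output.ps
find "$PWD/MyFiles" -iname '*.java' -exec sh -c '
    for f; do
        echo "$f" >> output.ps
        cat "$f" >> output.ps
    done' _ {} +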
I was messing around with Fedora 12 yesterday (only on a test installation) and I've hit a snag. I installed Openbox and tint2, nitrogen, obconf, etc. so I could have a #!-style session at startup. It all worked fine until I installed PCManFM and removed Nautilus. The problem is that I can't display files in my home directory, either using PCManFM, a reinstalled Nautilus, or in terminals. Every time I try to point a file browser there it just seems to get stuck searching forever, until I kill it. Weirdly, in Terminator I can do an 'ls' to see visible files, but 'ls -la' causes the problem again.
P.S. I thought permissions might have something to do with it, so I did a 'chmod -R 777' as root. It changed permissions for quite a lot of the files but then froze again, and now the problem persists.
I need to know how to find the number of files in a directory. Are there any system calls for this in Fedora 12? And I need to know how to perform an operation if that count increases by one.
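Directory reads boil down to the getdents system call underneath, but from a script a plain count is easier; a minimal polling sketch, where the path, interval, and action are assumptions (inotify-based tools would be the event-driven alternative):

Code:
#!/bin/bash
# poll a directory and react when the file count grows
dir=/path/to/watch
old=$(ls -1 "$dir" | wc -l)
while true; do
    sleep 5
    new=$(ls -1 "$dir" | wc -l)
    if [ "$new" -gt "$old" ]; then
        echo "count went from $old to $new"   # put the real action here
        old=$new
    fi
done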
I am trying to find files in a directory whose names contain numbers. I have tried ls /etc *[0-9]* but that doesn't work. If I cd to /etc and run ls *[0-9]* it almost works, but it also lists the contents of any matching directories. My last thought was to try find /etc [0-9] -type f, but this does not work either. My second problem is that I am trying to get a list of files in a directory that were changed less than 10 hours ago, using grep, while leaving out directories. I am completely stuck on the second problem.
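find handles both halves without grep; it expects the pattern after -name (a bare [0-9] is read as a second search path, which is why that attempt failed). A minimal sketch against /etc:

Code:
# regular files in /etc whose names contain a digit (directories excluded)
find /etc -maxdepth 1 -type f -name '*[0-9]*'

# regular files changed less than 10 hours (600 minutes) ago
find /etc -maxdepth 1 -type f -mmin -600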
I now have 3 desktop computers hard-wired into my wireless router and another desktop plus 2 laptops connecting wirelessly. All are running Ubuntu 10.04 or 10.10. I have not had the chance to get them all upgraded yet. What is the easiest way to get these computers connected so that they can share files? Do I have to set up a server, or is there a simpler way? I just want to be able to copy a file from one computer to another.
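If the machines can reach each other, scp may be the simplest route for the odd file, with no server setup beyond installing openssh-server on the receiving side; a sketch where the user, address, and file name are assumptions:

Code:
# copy a file straight to another machine over ssh
scp report.pdf user@192.168.0.5:/home/user/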
When we enter a folder, it takes some time to load depending on the number of entries in the folder. If the folder has more entries it takes more time to load, and with fewer entries correspondingly less time. The delay in loading the folder comes from reading the folder entries in advance. So what I want to know is: what is the maximum number of entries read in advance when opening a folder in Linux, and how can we calculate this?
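The batch size depends on the filesystem and the C library's readdir buffering rather than a single fixed constant, so one way to pin it down is to observe it empirically by tracing the getdents64 system calls a listing makes; a sketch:

Code:
# show how many bytes of directory entries each getdents64 call fetches
strace -e trace=getdents64 ls /some/large/directory > /dev/null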
I have a number of crash.log files scattered about my system and I would like to run a command to find all the crash.log files on the system and copy them to a single directory, each with a unique filename. For example, copy crash.log from ~/directory_1, ~/directory_2, ~/directory_3 and so on to ~/crash_logs/crash.log1, ~/crash_logs/crash.log2, ~/crash_logs/crash.log3, etc.
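A minimal sketch, assuming the logs all live somewhere under the home directory:

Code:
#!/bin/bash
# gather every crash.log under $HOME into ~/crash_logs with numbered names
mkdir -p ~/crash_logs
n=1
find ~ -type f -name 'crash.log' | while IFS= read -r f; do
    cp "$f" ~/crash_logs/crash.log$n
    n=$((n+1))
done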
Recently I have moved to F11. While I am trying to install DirectFB explicitly, I am getting an error related to X11 dependencies. It could find no include files under the directory /usr/include/X11/. But the X11 libraries are available on the system. I remember that I had these X11 include files in my F9. Do I need to install any external packages to get the X11 include files in place?
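On Fedora the headers ship separately from the runtime libraries, so having libX11 installed still leaves /usr/include/X11 empty; a sketch, assuming the stock Fedora package names:

Code:
# the headers under /usr/include/X11 come from the -devel packages
yum install libX11-devel xorg-x11-proto-devel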
I need a little help. I want to find all files with the extensions "*.tar", "*.gz" and "*.zip" and move all those files into the "/opt/old" directory. I've tried this command:
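A sketch of a working version, with the grouped -name tests (the escaped parentheses) being the usual sticking point; the starting path is an assumption:

Code:
# collect the three archive types and move them to /opt/old
find /home -type f \( -name '*.tar' -o -name '*.gz' -o -name '*.zip' \) \
    -exec mv {} /opt/old/ \;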
I am looking for a Windows Search equivalent for finding file name patterns (not file contents, but file names)...
I am aware of "globbing" and the wildcard recursive search functionality in ls, but I am still not able to find files under directories.
For example: I want to find all files starting with the string lsnr* under the root directory / and any subdirectories...
i.e. I want to look for files like lsnr*.* anywhere under / and any subdirectories under /, such as /dir1/dir2/dir4 and dir1/other/dir/someotherdir/sub-dir, etc.
So if I have /dir1/lsnrcontrol and also /dir1/dir/2/dir3/lsnr-tinit.dat, then I want to list the file names, etc.
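A minimal sketch with find, which recurses through every subdirectory on its own; the stderr redirect just hides permission-denied noise:

Code:
# every regular file starting with "lsnr", anywhere under /
find / -type f -name 'lsnr*' 2>/dev/null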
I cannot change directory to a destination folder more than three folders deep from ~ in the terminal. I've checked everything; no typos or misspellings. The destination folder is recognized by the "ls" command, but when I try to go to it, the terminal says "no such file or directory."
Is there a way to recreate all the folders from one directory under another without copying over the contents of the folders? I've been trying to do something like this:
Code:
for i in `ls $X`; do mkdir $PATH/$i; done
Unfortunately $i is delimited by the whitespace within the folder names and not by the actual folders.
$X contains only other folders, so I don't have to worry about regular files, but any kind of more "advanced" solution would work.
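A minimal sketch that avoids parsing ls and survives spaces in folder names; the two paths are assumptions, and note that $PATH is the shell's command search path, so a different variable name is safer:

Code:
#!/bin/bash
# mirror only the directory structure of $SRC under $DEST; no files copied
SRC=/path/to/source
DEST=/path/to/target
find "$SRC" -mindepth 1 -type d | while IFS= read -r d; do
    mkdir -p "$DEST/${d#"$SRC"/}"
done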
How would I go about copying files to a directory, yet skipping the files that already exist in the directory, and also removing the files that are in the directory? For example:
Code:
$ ls /dir1
img001.jpg img002.jpg
Now I would like to copy from dir1 to dir2, but the contents of dir2 would be:
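If the first half of the goal is simply not to overwrite what dir2 already holds, GNU cp's no-clobber flag covers it; a minimal sketch:

Code:
# copy from dir1 to dir2, skipping any filename dir2 already has
cp -n /dir1/* /dir2/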
There are millions of files in many directories. Whenever I try rm *, or find, or use xargs, they say 'argument list too long' and exit. How can I delete the files in a directory with so many files without deleting the directory itself?
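Because find iterates internally, no giant argument list is ever handed to the shell; a minimal sketch, with the path being an assumption:

Code:
# delete everything inside the directory, sparing the directory itself
find /path/to/dir -mindepth 1 -delete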
When I used Windows there was this wonderful editor named Notepad++. It was perfect (it still is). Some of its best and most useful features (for me) were:
1 - open all files in a folder when you drag and drop the folder onto it
2 - search and replace a statement in all open files
3 - an extended mode which includes special characters like
and so on. I want to know if there is an editor with these features in Ubuntu?