I noticed that Ubuntu takes some time building the file search index. It seems to do it overnight, while my PC is powered down. Is there a way to build the index with a command, so that I could do searches whenever I wanted, especially immediately after installing a new program?
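A minimal sketch, assuming the stock mlocate tooling Ubuntu ships: the nightly cron job essentially just runs updatedb, and the same rebuild can be triggered by hand right after installing something (the program name below is made up):

```shell
sudo updatedb        # rebuild the locate database now (needs root)
locate newprogram    # searches then reflect the fresh index immediately
```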
I am used to doing very fast searches on the Mac. When I type into the search box in any finder window, the search results present themselves as I type. I am looking for the same functionality in Ubuntu. I tried Catfish but it doesn't have the speed I see on the Mac. Is there a way for me to index my drives so that searches are instantaneous, or are there any options I haven't explored yet?
For example, we search a file for a certain keyword. Is there any application available that will let us search for a single keyword in all the files within a folder? I want to search for a keyword in about 1000 files; doing it manually would take loads of time.
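A recursive grep already does exactly this; a small demo on a scratch folder (the directory name and keyword here are made up):

```shell
# tiny stand-in for the real folder of ~1000 files
mkdir -p docs
printf 'alpha beta\n' > docs/a.txt
printf 'gamma delta\n' > docs/b.txt

grep -rl 'beta' docs   # -r: recurse, -l: list only the files that match
grep -rn 'beta' docs   # -n: show the matching lines with line numbers
```

Add `-i` for case-insensitive matching; on trees of this size grep is usually fast enough that no index is needed.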
I used wget -r to get all the web pages that were linked from index.html. The pages listed in index.html are all chapters. After using wget -r, all the chapters are now in the same folder on my local hard drive. Is there a way to build the chapters in their proper order into a "long"/"full" web page, rather than simply having each chapter as a link/next link on a previous page?
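If the chapter files sort correctly by filename, a crude concatenation of their body contents produces one long page. A sketch under that assumption (the sample files are created here for illustration; the tag stripping is deliberately simple, not a real HTML parser):

```shell
# stand-ins for the chapters wget -r saved (assumed to sort in reading order)
printf '<html><body><p>Chapter one</p></body></html>\n' > chapter01.html
printf '<html><body><p>Chapter two</p></body></html>\n' > chapter02.html

out=full.html
printf '<html><body>\n' > "$out"
for f in chapter*.html; do
  # keep only what lies between <body> and </body> (crude, per-line stripping)
  sed -e 's/.*<body[^>]*>//' -e 's/<\/body>.*//' "$f" >> "$out"
done
printf '</body></html>\n' >> "$out"
```

The result is a single full.html with every chapter in order, which a browser can display or print as one document.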
I use Nautilus at the moment on Ubuntu 10.10 Maverick. I use the grep CLI command to locate words or comments in a folder containing hundreds of files, each having 5 or 6 text documents in Greek and English.
I am looking for a manager that will index all entries and enable me to find a phrase or word within these documents quickly. I seem to recall seeing, some time ago, a manager capable of this that generated an index as data was added, but I have been unable to locate it with Google.
Is there a program available that would allow me to create an index in a PDF file that has no security restrictions on it? I know people can lock their files, so I am not worried about those; but if I have open permissions on a PDF file, how do I go about creating an index? It seems that by default you get the thumbnail view, but I would like to be able to click on an index list to go to a page.
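One free route, assuming Ghostscript is available: the clickable index shown in a PDF reader's sidebar is a set of bookmarks, and those can be added with a pdfmarks file. A sketch — the titles and page numbers below are invented:

```
% pdfmarks — one /OUT entry per bookmark
[ /Title (Chapter 1) /Page 1 /OUT pdfmark
[ /Title (Chapter 2) /Page 12 /OUT pdfmark
```

Then merge it into the file with `gs -o indexed.pdf -sDEVICE=pdfwrite input.pdf pdfmarks`, where input.pdf is the unrestricted original.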
I have a weird issue with my web server. Every time I enter my domain name, instead of processing my index file it downloads it. What could be the problem? This is the first time this has ever occurred. I recently installed Webmin.
I am running Debian Lenny with Apache (latest), and I am trying to set up a web server. I put the index.html file in my www folder, and it shows up when I go to my site (localhost). I then put an index.php file in my www folder, and when I try to go to my site it downloads the index.php file instead of showing it.
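Apache serving index.php as a download usually means there is no PHP handler installed or enabled, so the file is treated as plain content. A sketch of the usual fix on Lenny (run as root; package and module names are Debian's PHP 5 ones):

```shell
apt-get install libapache2-mod-php5   # installs the Apache PHP handler
a2enmod php5                          # enable it, if not already enabled
/etc/init.d/apache2 restart
```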
I have my home network up and running. I can access my index file by the IP I gave to eth0: 192.168.0.2.
I put this same IP into /etc/hosts: 192.168.0.2 localhost localhost.localdomain. One space separates the hostnames from the IP in /etc/hosts. The hostname command returns: localhost.localdomain
I have /etc/resolv.conf configured correctly, so I can use my ISP's DNS servers; I know this because yum works and I can ping anywhere by URL. The next step is to learn to access the index file by URL locally on my own network.
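To reach the server by name rather than IP, one low-tech option is a hosts entry on each client machine; the hostname below is just an example:

```
# /etc/hosts on each client (and optionally the server itself)
192.168.0.2   myserver.lan myserver
```

After that, http://myserver.lan/ in a browser resolves locally without touching the ISP's DNS.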
I have a set of PDFs I'd like to index and make searchable. Most importantly, I'd like a web interface so I can search and access them remotely. This morning I researched several different options, but all seemed to fail:
Google Desktop Search - no good because the web interface is only available on the local host
Beagle - no good because it is no longer supported and there are execution errors when running it on Maverick
Recoll - no web interface
Strigi - no web interface(?)
I recall using one about 2 years ago that was deployed on Apache Tomcat, but I can't remember the name.
I need a script that can do this: it searches all directories and subdirectories for .html files. When a .html file is found, it creates an index.html file in that folder, then edits that index.html and inserts links to all of the .html files in that folder into the body. If no .html files are found, it searches for folders and creates an index.html with links to all of the folders.
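A sketch in POSIX sh under two assumptions: directory names contain no newlines, and the generated index.html should not link to itself.

```shell
make_indexes() {
    find "$1" -type d | while read -r dir; do
        out=$dir/index.html
        printf '<html><body>\n' > "$out"
        found=0
        for f in "$dir"/*.html; do
            [ -e "$f" ] || continue                 # glob matched nothing
            base=$(basename "$f")
            [ "$base" = index.html ] && continue    # skip the index itself
            printf '<a href="%s">%s</a><br>\n' "$base" "$base" >> "$out"
            found=1
        done
        if [ "$found" -eq 0 ]; then                 # no pages: link subfolders
            for d in "$dir"/*/; do
                [ -d "$d" ] || continue
                name=$(basename "$d")
                printf '<a href="%s/">%s</a><br>\n' "$name" "$name" >> "$out"
            done
        fi
        printf '</body></html>\n' >> "$out"
    done
}

# demo on a small sample tree
mkdir -p site/docs
printf '<p>hello</p>\n' > site/docs/page.html
make_indexes site
```

Run it against the document root (`make_indexes /var/www` or wherever the tree lives); every directory ends up with an index.html linking either to its pages or to its subfolders.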
My setup is two laptops connected by a crossover cable. One runs Windows XP, the other Fedora 13. Neither is connected to the Internet. I'm using a subnet of 192.168.1.x. On Fedora, eth0 is up, and Apache runs because it creates its pidfile. Everything pings fine; the Windows XP IP pings fine from the command line. I gave Windows XP a static IP of 192.168.1.1, mask 255.255.255.0, and gateway 1.2, the same as eth0. XP says it sees the server.
DirectoryIndex looks at index.html. I created that file with very simple code and put it in the document root. Document root permissions are 755, access_log 770, error_log 644, Apache user 755, Listen 80. When I type the IP for eth0 (192.168.1.2) into Firefox, it gives me an error message: can't find server. The connection status says it's connected.
The error log includes the line [warn] ./mod_dnssd.c: No services found to register, and I don't know what this means. Apache is not writing to access_log; when I cat the path to access_log I get nothing, just a command prompt. I'm looking for the missing piece that will let Apache serve that index.html file to Firefox, so I can see how my code looks as I go.
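"Can't find server" from the other box while Apache looks healthy locally often comes down to the firewall rather than Apache; the mod_dnssd warning is harmless (it only concerns Avahi service announcements). Some quick checks worth running on the Fedora side, as a sketch:

```shell
netstat -tln | grep ':80'      # confirm something is listening on port 80
curl -s http://127.0.0.1/      # confirm Apache serves index.html locally
# If both work but the XP box still can't connect, try opening port 80 in
# iptables, or stop the firewall briefly to test (Fedora initscripts):
#   service iptables stop
```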
I am trying to install flash-plugin-10.0.45.2-release.i386.rpm, so I entered the following command and got the following error output:
Code:
bash-3.1$ rpm -i *.rpm
error: cannot open Basenames index using db3 - No such file or directory (2)
error: cannot open Providename index using db3 - No such file or directory (2)
error: cannot open Conflictname index using db3 - No such file or directory (2)
error: Failed dependencies:
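"cannot open ... index using db3" usually means the rpm database is missing or corrupt, or that rpm was run without the privileges needed to open it (the prompt above is a plain user shell). The usual remedy, run as root, is to clear stale Berkeley DB files and rebuild; a sketch:

```shell
# run as root
rm -f /var/lib/rpm/__db*   # stale lock/cache files from an interrupted run
rpm --rebuilddb            # rebuild the package database from the installed data
rpm -i flash-plugin-10.0.45.2-release.i386.rpm
```

The "Failed dependencies" lines that follow are a separate issue and list what else must be installed first.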
Dear all, I am new to git (and to storing multiple copies of my code).
I was using git for one month (git status, git add, git commit) to store my files.
I have problems adding more files to my branch:
git add BasisExpansion.R
fatal: Unable to write new index file
This was working great until the day my system administrator destroyed my main partition (it was an accident) and installed everything from scratch. My home directory stayed intact (I did not lose any files, nor the git files stored under /home), but from that point on git always returns that error. My system administrator recreated the same user and group as my old user, but this did not fix the problem.
I am not sure if I can recover my old git structure, or whether it would be better to wipe out the old git dir and start a new one.
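"Unable to write new index file" after a reinstall is very often an ownership problem: the recreated account typically got a new UID, so the old .git files are no longer writable even though the user name matches. `sudo chown -R "$USER": /path/to/repo` usually fixes it, and the history is then fully recoverable. If the index file itself is damaged, it can simply be rebuilt from HEAD; a demonstration in a scratch repository (the path and file names are made up):

```shell
git init -q scratch
git -C scratch -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m 'initial commit'
echo 'x <- 1' > scratch/BasisExpansion.R
rm -f scratch/.git/index             # simulate the broken index
git -C scratch reset -q              # rebuild the index from HEAD
git -C scratch add BasisExpansion.R  # adding works again
```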
Can anyone point me to already-built and mostly working utilities that will catalog the contents of (external) USB and flash media in a way that they can be searched? With cheap (under $100 US) terabyte external drives, it is too easy to get another drive and fill it. I'm looking for utilities or applications that will help me know what I have, cull the duplicates, and avoid the need to spin up a drive just to see if it holds what I seek.
Years ago there were utilities that would read the contents of diskette media and create a printable "index" or "catalog" page. The pages were conveniently sized to match the diskette, so that one could store the page in the diskette sleeve for future reference. Later, the pages were replaced by applications that stored a diskette ID along with each file name; one could then search for a name, or pattern, and discover which diskette held the file(s) of interest. Today we don't have diskette sleeves, but we do have cases for our external USB drives and wallets for our flash media. A printed index would be nice to have.
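Absent a dedicated cataloguer, a plain `find` listing per drive already gives a searchable (and printable) index. A demo with a scratch directory standing in for a mounted drive; the mount point, label, and file names are invented:

```shell
mkdir -p media/backup1/photos
touch media/backup1/photos/vacation.jpg media/backup1/notes.txt

# one catalog file per drive, named after its label
find media/backup1 -type f | sort > catalog-backup1.txt

# later: which drive holds the file? search the catalogs, not the drives
grep -il vacation catalog-*.txt
```

Printing catalog-backup1.txt gives the sleeve-style paper index, and the drives only need to spin when a catalog says they hold what you want.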
I played a bit with the upgrade configuration files to solve a couple of problems I have with Ubuntu 10.04. Now when the upgrade procedure starts I get:
Quote:
Failed to fetch Release: Unable to find expected entry multiversdeb/source/Sources in Meta-index file (malformed Release file?)
Some index files failed to download, they have been ignored, or old ones used instead.
I'm trying to upgrade from 8.04 to 9.10 via 8.10 etc. When I run update manager, I get this:
W: Failed to fetch [URL]: Unable to find expected entry universal/binary-i386/Packages in Meta-index file (malformed Release file?)
E: Some index files failed to download, they have been ignored, or old ones used instead.
and then it closes. I've unselected all the third-party packages and tried various servers, but no difference. I can wget the file, but when I look at it I see entries for "universe/binary-i386/Release" and nothing for "universal".
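That reading is almost certainly right: the Ubuntu component is named "universe", so a sources.list line saying "universal" would produce exactly this "expected entry ... not found" error. A demo of the one-word fix on a copy (edit the real /etc/apt/sources.list with sudo and re-run the upgrade; the deb line below is an example):

```shell
# stand-in for the real /etc/apt/sources.list
printf 'deb http://archive.ubuntu.com/ubuntu intrepid main universal\n' \
    > sources.list.test
sed -i 's/universal/universe/g' sources.list.test
cat sources.list.test
```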
For some reason, Ubuntu's built-in file search app sometimes doesn't find the file I'm looking for. My 10.10 build has been running excellently for some time now and I haven't experienced any problems, but this is a weird one. If I use another app like Catfish, it usually finds the sought file, but sometimes it gives a fatal error and stops working. Others have had this problem with Catfish too, so I'm looking for another reliable file or folder search app. I've gone through what I could find in the Software Center but haven't found anything really good. I need an app that will search for hidden files or folders on the whole PC.
I backed up .themes from /home, but on trying to install them (the files don't show as theme packages) it says "There was an error installing the selected file: index.theme doesn't appear to be a valid theme". Did I back up the right thing?
I can't search for files in any folder other than the home folder. For example, when I try to search in my mounted NTFS drive, it only searches my home folder, not the currently open folder on the mounted drive. I have also installed Beagle Search to index my data and reduce search time.
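For one-off searches outside home, `find` works on any mounted path and sees hidden files by default. A demo on a scratch tree; substitute the real NTFS mount point (e.g. something under /media) for the example directory:

```shell
mkdir -p tree/.hidden
touch tree/.hidden/notes.txt tree/readme.txt
find tree -iname '*notes*'   # -iname: case-insensitive; hidden files included
```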