On Windows, I used to be able to search for items on my home server quickly, if it had indexing enabled, by simply typing a query in the search box while viewing one of its drives (i.e. Samba shares). Since I moved the server to Ubuntu, I can find plenty of indexing software, like Tracker, but I can't figure out whether any of them support querying over the network. I need a fast way to search through all the files on the server, with indexing, without the hassle of having to VNC into it.
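One low-tech way to get network searching without a fancy indexer is to keep a flat file list on the server and grep it over SSH. This is only a sketch: the hostname `homeserver`, the index path `/var/cache/file-index.txt`, and the indexed directory are all placeholder assumptions, and it presumes key-based SSH login is already set up.

```shell
#!/bin/sh
# Sketch: search a pre-built file index on the server over SSH.
# Assumes a cron job on the server refreshes the index, e.g.:
#   find /srv/shares -type f > /var/cache/file-index.txt

search_server() {
    # $1 = server, $2 = case-insensitive substring to look for
    ssh "$1" grep -i -- "$2" /var/cache/file-index.txt
}

# Usage:
#   search_server homeserver 'holiday photos'
```

Because the grep runs server-side against an already-built list, results come back in well under a second even for hundreds of thousands of files.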
I read about how one can use the Nepomuk/Strigi file indexing service. I started up Dolphin, typed the search string in the upper right corner, and clicked Search. The search result is a long list of files; the one I need is a PDF. When I select this file, the info panel on the right-hand side shows: 'Is part of: 15.2 GiB Removable Media', which is clearly wrong, since this file resides in my home directory, which is on a 200 GiB partition. If I double-click this file to open it in Okular, I get an error message:
Code: Could not open nepomuksearch:/nepomuk_3A_2Fres_...... Reason: Please insert the removable medium "15.2 GiB Removable Media" to access this file

I guess this is some old result from Nepomuk's cache. (In the meantime I have had to repartition my hard disk and restore my home directory from a backup copy on an external hard drive, but that was many months ago, so I would expect that Nepomuk has had time to recognise this.)
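If the stale entry really is cached, one common remedy is to wipe Nepomuk's repository and let it re-index from scratch. The path below is the usual KDE 4 location, but that is an assumption; check where your distribution keeps it before deleting anything, and stop Nepomuk first (System Settings, Desktop Search).

```shell
#!/bin/sh
# Sketch: force Nepomuk to rebuild its index by removing the stale
# on-disk repository (typical KDE 4 path -- an assumption, verify it).

clear_nepomuk_index() {
    # $1 = base directory (defaults to $HOME)
    base="${1:-$HOME}"
    rm -rf "$base/.kde/share/apps/nepomuk/repository"
}

# After stopping Nepomuk:
#   clear_nepomuk_index
# then restart it so it re-indexes everything from scratch.
```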
I have noticed on my netbook that F14 is running some sort of indexing service in the notification area. What is this, and how do I turn it off? I know what it is doing, but why has it only become important in F14? It is killing the battery life on my netbook.
I am having trouble with Beagle. I haven't used it for a while (probably since the upgrade to Karmic). I have found now that it does not provide any results for any of my files, specifically my PDF files. (I am doing a PhD, and it is easier to search for journal articles by author/title than it is to find them on the file system, with hundreds of files all over the place. I wish I was more organised!)
Do I need to install a separate backend to index PDFs?
I had a look in the log files, and for all my PDFs (and some others) I get something like this:
Code: 20100202 11:10:42.8736 15579 Beagle DEBUG: Delaying add of file:///home/philip/Desktop/PBR Stuff/Lee 1986.pdf until FSQ comes across it
I have a large number of video files on a couple of high capacity HDDs. The files are reasonably clearly named. I've never bothered creating a database of the files because I am inherently lazy and believe the computer should be doing this for me.
Anyway, crunch time. I had a look at the stuff in the public repositories, but they all involve too much typing, too much work. So, what are others using/doing to maintain their "collection" indexes?
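For reasonably well-named files, the laziest thing that works is a flat text catalogue built with `find` and searched with `grep`. The mount points below are examples, not your actual drives.

```shell
#!/bin/sh
# Sketch: a poor man's collection index -- dump every filename into
# one text file, then grep it. Substitute your own mount points.

build_index() {
    # $1 = output file, remaining args = directories to index
    out="$1"; shift
    find "$@" -type f > "$out"
}

search_index() {
    # $1 = index file, $2 = case-insensitive pattern
    grep -i -- "$2" "$1"
}

# Usage:
#   build_index ~/videos.idx /mnt/media1 /mnt/media2
#   search_index ~/videos.idx 'blade runner'
```

Rebuilding the index is a single command you can stick in cron, and grepping a list of even a million filenames is effectively instant.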
I will be running a network of about 35 Ubuntu PCs from next year on. I have already started setting up remote desktop viewing utilities, but I now need a way to remotely shut down all the PCs at the same time with a single command as admin. Setting up scripts to execute at a certain time is not going to help, as the online time won't always be the same. I want to avoid having to remotely shut down each and every PC, as this would take quite a lot of time.
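The usual approach is an SSH loop from the admin machine. This is a sketch that assumes each PC runs an SSH server and that key-based root login (or a sudo rule allowing `shutdown` without a password) is already configured; the hostnames are placeholders.

```shell
#!/bin/sh
# Sketch: halt every listed machine in parallel over SSH.
# Assumes key-based root SSH access to each host (placeholder names).

shutdown_all() {
    for host in "$@"; do
        # ConnectTimeout stops an offline PC from stalling the loop;
        # "&" runs all the shutdowns in parallel.
        ssh -o ConnectTimeout=5 "root@$host" 'shutdown -h now' &
    done
    wait
}

# Usage:
#   shutdown_all lab-pc01 lab-pc02 lab-pc03
```

Keeping the host list in a text file and feeding it in with `shutdown_all $(cat hosts.txt)` makes it a one-liner for the whole lab.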
I have installed Tomcat on one machine and MySQL on another. When I run Tomcat on the same machine as MySQL and use 'localhost' in the JDBC URL, I have no problem accessing MySQL through JDBC. So my problem is accessing a remote MySQL instance. I have granted privileges in the following way from the MySQL prompt: GRANT ALL PRIVILEGES ON mydatabase.* TO myusername @192.168.0.103 IDENTIFIED BY 'mypassword';
FLUSH PRIVILEGES; Still I get: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Access denied for user 'myusername'@'192.168.0.103' to database 'mydatabase' sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) [Code]....
I have no idea whether this is a problem at the operating system level, i.e. allowing networked hosts, or whether it is something to do with MySQL. I have commented out bind-address and, on this version, there is no 'skip-networking' line to comment out.
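One thing worth ruling out: in the GRANT as written, the user and host are unquoted, and the host must be the address of the machine Tomcat runs on (the client connecting in), not the MySQL server. A quoted version, with a check of what MySQL actually stored, might look like this (the names are the ones from the post, assumed as-is):

```sql
-- Quote both parts; the host is the IP of the *client* (Tomcat box).
GRANT ALL PRIVILEGES ON mydatabase.* TO 'myusername'@'192.168.0.103'
    IDENTIFIED BY 'mypassword';
FLUSH PRIVILEGES;

-- Verify which user@host entries MySQL actually holds:
SELECT user, host FROM mysql.user WHERE user = 'myusername';
```

If the SELECT shows a different host than the one Tomcat connects from, that mismatch alone produces exactly the "Access denied ... to database" error.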
OK, here is the deal: I am allowing my neighbor access to some networked folders on my Ubuntu file server in exchange for access to their washer and dryer. I have already created mapped drives on their XP machines, but now I want to allow them read-only access so they don't accidentally delete anything.
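In Samba this is a per-share setting in smb.conf. The share name, path, and user below are placeholders; the key line is `read only = yes`. Remember to reload Samba after editing (e.g. `sudo /etc/init.d/samba reload` on older Ubuntu releases).

```
[neighbour]
    path = /srv/shared
    browseable = yes
    guest ok = no
    valid users = neighbour
    ; the key line: nobody can write to or delete from this share
    read only = yes
```

If you later want to let specific users write while everyone else stays read-only, `write list = someuser` on a read-only share does exactly that.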
Is there any way to let the laptop that is connected to my printer go into standby while my other systems can still see the printer? Then, if I print something, it would bring the networked printer/laptop out of standby and print my job. I just ran a test, and when my system went into standby the printer went "offline" and I couldn't print to it. I don't want to be forced to leave the computer fully on 24/7 just to have a networked printer.
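There is no automatic wake-on-print in a stock setup, but if the laptop's NIC supports Wake-on-LAN you can wake it just before printing. This is only a sketch: it assumes WoL is enabled in the BIOS, the `wakeonlan` package is installed, and the MAC address, printer queue name, and resume delay are all placeholders.

```shell
#!/bin/sh
# Sketch: wake the print-server laptop, wait for it to resume,
# then send the job. MAC/queue names are placeholder assumptions.

print_via_sleeper() {
    mac="$1"; printer="$2"; file="$3"
    wakeonlan "$mac"        # magic packet wakes the sleeping laptop
    sleep 20                # rough guess at resume + CUPS share time
    lpr -P "$printer" "$file"
}

# Usage:
#   print_via_sleeper 00:11:22:33:44:55 laserjet report.pdf
```

Note that WoL generally works from standby/suspend but only over wired Ethernet on most hardware; waking over Wi-Fi is much less reliable.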
I am running Server 10. I have a requirement to perform an action before the MySQL service starts, and perform another action after MySQL service stops.
I found the init script for MySQL under /etc/init/mysql.conf. I added my thing to the pre-start script there and works fine.
I am having trouble finding the script that stops the server so I can modify it.
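With Upstart there is no separate stop script: the same /etc/init/mysql.conf job file takes `pre-stop` and `post-stop` stanzas, just like the `pre-start` one already edited. The action paths below are placeholders for whatever the required commands are.

```
# /etc/init/mysql.conf (excerpt)

# runs before Upstart signals mysqld to stop
pre-stop script
    /usr/local/bin/my-before-stop-action
end script

# runs after mysqld has actually exited
post-stop script
    /usr/local/bin/my-after-stop-action
end script
```

`post-stop` is usually the right hook for "after MySQL stops", since it fires only once the main process is gone.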
My situation: I have a desktop with the following sound devices internal to it: HDA Intel and SB Live!. I also have a Creative Labs webcam connected via USB, and another device (VR goggles) which has an audio component. My problem: a fair number of games use ALSA card #0 for audio, and since Pulse's ALSA module is lacking at best, I don't use that for games. At this point I have Pulse configured to always use HDA Intel for its sink. What I would like to do is configure things so that the SB Live! card is always ALSA card #0, regardless of whether the two USB audio-ish devices are plugged in. That way Pulse can be running and I still get to play games with native ALSA audio. The trick is: how do I force my SB Live! card to ALWAYS be card #0? Update: there is an easy way, which I should have realised before, that allows ALSA games to use the SB Live! without having it at index 0; below is how to do it:
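The original instructions were cut off, so here is a sketch of the standard modprobe approach for pinning card order, assuming the SB Live! uses the snd-emu10k1 driver (the filename is an example):

```
# /etc/modprobe.d/alsa-order.conf
# Pin the SB Live! to card 0 and the HDA chip to card 1.
options snd-emu10k1 index=0
options snd-hda-intel index=1
# A negative index tells ALSA this driver may never claim card 0,
# so hot-plugged USB audio devices slot in at 2 and above.
options snd-usb-audio index=-2
```

The settings take effect after the modules are reloaded (a reboot is the simple way). Alternatively, many games honour the `ALSA_CARD` environment variable, which lets them target the SB Live! by name without reordering cards at all.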
I'm concerned about my web server. I used Nikto to see where I should improve my configuration, and I just learned my web server has directory indexing enabled. I have searched and found that I should just put
Code:
Options -Indexes

to disable directory indexing. I have already restarted Apache, but directory indexing is still enabled. Here is my httpd.conf. Where did I go wrong?
Code:
ServerTokens OS
ServerRoot "/etc/httpd"
PidFile run/httpd.pid
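A frequent cause of "it's still enabled": `Options -Indexes` has to sit in the `<Directory>` block that actually serves your documents, because a more specific block (or a .htaccess file with `Options Indexes`) overrides a more general one. A minimal fix, assuming the default docroot (adjust the path to yours):

```
<Directory "/var/www/html">
    # the leading "-" removes Indexes from whatever was inherited
    Options -Indexes
</Directory>
```

After editing, reload Apache and also grep your docroot for .htaccess files that might re-enable Indexes (`grep -r Indexes /var/www`).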
I know this has been asked before, but in case there have been significant changes in this area with Ubuntu/Kubuntu 11.04, I'll ask. Are there any good search and indexing tools available, preferably something that also looks at the content of files? Something along the lines of Beagle. Is it still possible to install Beagle?
If I download something from the Internet, it asks where to save it. This is normal; however, even though my two computers are nicely networked, I can't seem to navigate to the network to save onto the other computer.
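One workaround is to mount the other machine's share permanently, so it appears as an ordinary folder in every save dialog. A sketch of an /etc/fstab entry, where the IP, share name, mount point, and credentials file are all placeholders for your own setup:

```
# /etc/fstab -- mount the other PC's share at boot (example values)
//192.168.1.50/shared  /mnt/otherpc  cifs  credentials=/home/me/.smbcred,uid=1000  0  0
```

Create the mount point first (`sudo mkdir -p /mnt/otherpc`), put `username=...` and `password=...` lines in the credentials file, then `sudo mount -a`. From then on, saving "to the network" is just saving into /mnt/otherpc.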
So, having recently set up my server (including some disk space, uShare, and an additional LaserJet printer), I was wondering if anyone had any experience of putting a multifunction printer on an Ubuntu server: adding it locally (to the server), then to the networked computers. I know this printer works perfectly (the scanning too!) when plugged directly into a machine's USB. I suppose my real question is: does anyone know anything about the scanner side of it? Is that going to work? I reckon the printer side will (/should; I know the printer works, and I know the server can share printers), but the scanner side is relatively new territory when shared from my HS. Please note: my server is just running vanilla (plain) Ubuntu 9.10, and my computers are on 10.04 and 9.10. Ubuntu-only network.
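The scanner side can work: SANE ships a network daemon, saned, for exactly this. Roughly (a sketch; the subnet and hostname are placeholders, and you also need saned started via inetd/xinetd on the server and the `net` backend enabled in the clients' /etc/sane.d/dll.conf):

```
# On the server: /etc/sane.d/saned.conf -- who may use the scanner
192.168.0.0/24

# On each client: /etc/sane.d/net.conf -- where the scanner lives
myserver
```

After that, scanning tools on the clients (xsane, Simple Scan) should list the remote scanner through the net backend as if it were local.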
I have developed a site in localhost in my laptop. Here are the specs: Ubuntu 11.04, Joomla! 1.7, Db Ver: 5.1.54-1ubuntu4, PHP Ver: 5.3.5, Web Server Apache/2.2.17
This laptop is connected to a Wi-Fi network. Task: I'd like to view this local site from other desktop computers on the same network. I understand that I need to make some changes in /etc/hosts. What changes do I need to make to be able to see my localhost site on the network?
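The hosts change actually belongs on the *other* machines, not the laptop: each desktop needs a line mapping the laptop's LAN IP to whatever hostname the site is served under. A sketch, where both the IP and the name are examples for your setup:

```
# /etc/hosts on each desktop (example values)
192.168.1.42    mysite.local
```

Then browse to http://mysite.local from the desktop. If the Joomla site doesn't depend on a particular hostname, you can skip /etc/hosts entirely and just browse to http://192.168.1.42/ directly; either way, Apache on the laptop must be listening on the LAN interface and not blocked by a firewall.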
Can I index files, so that whenever I do a find, e.g.

Code: find / -name libSDL-1.2.so.0

it doesn't take 10 minutes? I do know there are packages such as Tracker, but that one does indexing all the time. I would be happy with something that can be run by hand, or something that runs once every 12 hours or so.
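This is essentially what the mlocate package already provides: `updatedb` builds the index (run it by hand whenever you like), and `locate` queries it instantly. Most distributions install a daily cron job for it; to get the 12-hour cadence, a cron fragment along these lines would do (the filename is an example):

```
# /etc/cron.d/updatedb-12h -- rebuild the file index twice a day
0 */12 * * *  root  /usr/bin/updatedb

# then, instead of the slow find:
#   locate libSDL-1.2.so.0
```

Unlike Tracker, updatedb only runs when scheduled (or invoked manually), so there is no background indexing the rest of the time.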
I use the latest Fedora 14 with Evolution 2.32.1 and Tracker 0.8.17. Although the Tracker plugin is enabled in Evolution, Tracker is not indexing my email. The status says: Filesystem 100%, Email 0%, Applications 100%. Why are the emails in Evolution not indexed?
The goal is to have a single folder that has symlinks to all the files on each of the drives; pretty much a poor man's JBOD. Previously, I had problems with conditions like two drives having the same subfolder contents, but I ended up solving that with the script I'm using now. What I'm looking for now is speed. I'm very new to Perl, and the script takes about 12 minutes to complete with the current drives.
Basically, the script makes a list of all directories and files in each drive. First, it makes the directories. I didn't use any validation because if a directory already exists, it simply won't make one. However, with the files, I used a hash to only keep the unique files. Then I use the key/value pairs with ln to create every link to the files only, not directories.
Code:
#!/usr/bin/perl
use warnings;
my @drives_to_sync = qw( /mnt/sda/ /mnt/sdb/ /mnt/sdc/ /mnt/sdd/ );
My friend has a Windows 7 desktop with a large data hard drive which is set up to allow other users to connect to remotely, such that they have read and write access. To set up a connection from another Windows machine, it is as simple as right clicking in the My Computer window and selecting "New Network Connection", then entering the information requested.
The information my friend has given me is: the IP address, the hard drive's name, and a username and password.
Back in school I remember using an application that would identify active IP addresses on a network, and basically show you a log of activity. We actually monitored another lab and went in and showed them what we saw (all the machines had IP addresses on the monitors.) We could see websites, bandwidth, etc.
I'm trying to find an application that would do this again. I've been trying to monitor my network to see which machines are performing unauthorized operations. The ISP is showing high bandwidth usage, and there is no way that checking email and browsing is using this amount: 200 GB a month! Something is going on here.
We have recently upgraded to Ubuntu 11.04 64-bit from 10.04 and 10.10 in a networked office in which our home directories are mounted on a server RAID array. Ubuntu considers the home directory a local drive rather than a network one. We have noticed that file previews are very, very slow in 11.04 for PDFs and pictures in Nautilus, causing a lag of around 10 seconds per file. This is a problem for folders that hold multiple previewable files. Previews worked fine in previous versions of Ubuntu on the same machine, with the same network setup. Previewing actual local files is faster, but still slower on 11.04 than on previous versions.
Both the 10.10 and 11.04 systems are up to date with available updates (2.32.2.1 on 11.04 and 2.30.1 on Ubuntu 10). Obviously, there is a workaround for this problem: turn off previews. But it would be nice to turn them on, as it's a useful feature.
Tab completion indexes system folders (like /usr/bin and /usr/local/bin)! So, say I'm in a folder that has two files, 'text' and 'myprog'. I type in an 'm', then Tab, and I get hundreds of results including 'mysql', 'mysqlconfig', and others, as I'm sure you can imagine. Is there a way to set it to some default, or something else, that will make it complete only from the current folder?
I tried changing my PATH variable so I could execute programs in the current directory without './'; what I added to PATH was ':.' at the end (apparently this is not the way to do it... :S). I tried resetting PATH various times ('unset PATH', 'PATH=$whatever...'), but this has not fixed the problem. Using 'unset PATH', of course, removes everything from PATH, which means that programs (like 'ls') in /usr/bin and /usr/local/bin can't be found. Obviously I want those to be found, but I would rather not tab through them!
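Rather than `unset PATH`, the recovery is to assign PATH a sane baseline and then, if you really want it, append the current directory last. The baseline below is a typical Linux default, an assumption to adjust for your system; put the final version in ~/.bashrc to make it stick.

```shell
#!/bin/sh
# Restore a sane baseline PATH (typical defaults -- adjust to taste)
# instead of "unset PATH", which leaves you with nothing at all.
export PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"

# If you really want ./ lookup, append "." LAST so system binaries
# always win. (Many people avoid "." in PATH entirely for security;
# typing ./myprog is the safer habit.)
export PATH="$PATH:."
```

Note that '.' at the end of PATH is also exactly why completion sweeps in every system binary: completion of command names searches all of PATH, not just the current directory.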
I came across this issue when I saw that my CPU was between 70 and 80 percent usage most of the time, all from a process called tracker-extract. Maybe while Tracker is building its initial index this is acceptable, but the situation repeats every time I boot my notebook, and it lasts at least 3 or 4 hours, or until I tell Tracker to stop indexing. I hard-reset Tracker to see if there were any corrupt indexes, but no luck; the behaviour was the same. BTW, this issue wasn't present with Ubuntu 10.04.
I've got a lan with a mixture of linux and win machines. I've got one of those network addressable printers that I really like since I can access it from any machine on the lan in an os independent manner.
I saw in the local computer store network addressable hard drives, i.e., those that have an ethernet address and port. I really like the concept of having hdd storage that is both machine and os independent, just like my printer. However I don't know how to make it secure from spoofers. The only filter between it and the outside world would be my linksys wireless router, which has an internal firewall, but that doesn't seem to be enough security to me.