Price: $40 shareware. Platform: Perl CGI; should run on Unix, Linux, and Windows (NT, 2000, 95, 98, ME, XP, etc.).
Robot crawler follows links to locate files, or can index the local file system directly.
Indexes multiple web sites.
Indexing filters allow control over which pages are included, by host name, URL, content, and RSACi and PICS headers. Includes 40 preset filter rules.
Indexes HTML and plain text, as well as PDF via a free helper application.
Indexes metadata: keywords, description, title and URL.
Can handle up to about 10,000 documents.
Resource-intensive actions, such as indexing, are spread across multiple CGI executions using META refreshes, preventing web server timeouts.
Recognizes Internet query operators (+ and -) and Boolean AND, OR, and NOT.
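The META-refresh technique above can be sketched as follows. This is an illustrative example, not FDSE's actual code (the product is written in Perl); the batch size, script URL, and parameter name are hypothetical. The idea is that each CGI request processes one batch and then emits a page whose META refresh re-invokes the script at the next offset, so no single request outlives the web server's timeout.

```python
BATCH_SIZE = 50  # hypothetical number of documents indexed per CGI invocation

def index_batch(docs, offset):
    """Index one batch; return the next offset, or None when finished."""
    for doc in docs[offset:offset + BATCH_SIZE]:
        pass  # real code would fetch, parse, and index each document here
    nxt = offset + BATCH_SIZE
    return nxt if nxt < len(docs) else None

def render_response(script_url, docs, offset):
    """Emit either a continuation page (META refresh) or a completion page."""
    nxt = index_batch(docs, offset)
    if nxt is None:
        return "<html><body>Indexing complete.</body></html>"
    # The refresh re-runs the script with the next offset before any timeout.
    return (f'<html><head><meta http-equiv="refresh" '
            f'content="1; url={script_url}?offset={nxt}"></head>'
            f'<body>Indexed {nxt} of {len(docs)} documents so far.</body></html>')

docs = [f"doc{i}.html" for i in range(120)]
page = render_response("/cgi-bin/index.cgi", docs, 0)
```

Each continuation page costs one extra HTTP round trip, but the job survives even on hosts with aggressive CGI time limits.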
Allows phrase searching with quotes.
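The query syntax described above (quoted phrases, +/- prefixes, Boolean keywords) can be sketched with a simple tokenizer. This is a hypothetical illustration of that kind of parsing, not FDSE's implementation; the tag names are made up.

```python
import re

def parse_query(query):
    """Split a query into tagged tokens: phrases, operators, and terms."""
    # Grab quoted phrases as single tokens, everything else as whitespace-split words.
    tokens = re.findall(r'"[^"]*"|\S+', query)
    parsed = []
    for tok in tokens:
        if tok.startswith('"') and tok.endswith('"') and len(tok) > 1:
            parsed.append(("phrase", tok.strip('"')))
        elif tok in ("AND", "OR", "NOT"):
            parsed.append(("op", tok))
        elif tok.startswith("+"):
            parsed.append(("require", tok[1:]))   # term must appear
        elif tok.startswith("-"):
            parsed.append(("exclude", tok[1:]))   # term must not appear
        else:
            parsed.append(("term", tok))
    return parsed
```

For example, `parse_query('"search engine" +perl -java OR cgi')` yields a phrase token for "search engine", a required term, an excluded term, an operator, and a plain term.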
Extended (accented) characters are reduced to their English equivalents.
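One common way to implement this reduction is Unicode decomposition: split accented characters into a base letter plus combining marks, then drop the marks. A minimal sketch, assuming this approach (FDSE's actual method is not documented here):

```python
import unicodedata

def to_english_equivalent(text):
    """Strip accents: café -> cafe. Characters with no decomposition
    (such as German ß) pass through unchanged."""
    decomposed = unicodedata.normalize("NFKD", text)
    return "".join(c for c in decomposed if not unicodedata.combining(c))
```

Reducing both the index and the query this way lets a search for "Muller" match pages containing "Müller".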
Relevance ranking gives extra weight to phrase matches, and to matching words in the title, keywords, and description.
Recommended pages are given extra weight in results.
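The two weighting rules above amount to a field-weighted scoring function. The following is a hypothetical sketch of that scheme; the weight values and field names are invented for illustration and are not FDSE's actual parameters.

```python
# Hypothetical weights -- the real values are not documented here.
FIELD_WEIGHTS = {"title": 5, "keywords": 3, "description": 2, "body": 1}
PHRASE_BONUS = 10       # bonus when the whole phrase appears in a field
RECOMMENDED_BONUS = 20  # flat bonus for administrator-recommended pages

def score(doc, terms, phrase=None):
    """Weighted term-frequency score over the indexed fields of one document."""
    total = 0
    for field, weight in FIELD_WEIGHTS.items():
        text = doc.get(field, "").lower()
        total += weight * sum(text.count(t.lower()) for t in terms)
        if phrase and phrase.lower() in text:
            total += PHRASE_BONUS
    if doc.get("recommended"):
        total += RECOMMENDED_BONUS
    return total
```

Because title matches carry the largest field weight, a term in the title outranks the same term buried in the body, and recommended pages get a flat boost on top.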
Option to allow public submission of URLs, useful for portals.
Searches are logged with time, query, number of matches, and user host.
Runs as a CGI by default; can also be called as an API.
Results pages are template based and easy to edit.
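Template-based results pages typically work by substituting per-hit values into an HTML fragment. A minimal sketch of the idea (the template syntax and variable names here are invented; FDSE's actual template format is not shown):

```python
from string import Template

# Hypothetical per-result fragment an administrator might edit.
RESULT_TEMPLATE = Template(
    '<p><a href="$url">$title</a><br>$description</p>'
)

def render_results(results):
    """Fill the fragment once per hit and join into the results page body."""
    return "\n".join(RESULT_TEMPLATE.substitute(r) for r in results)
```

Keeping the HTML in an editable template means the look of the results page can be changed without touching the search code.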
Browser admin interface.
Extensive help files, an active discussion forum, and email support from the developers.