[Perlfect-search] Scalability

Daniel Naber daniel.naber at t-online.de
Mon Nov 22 08:36:55 GMT 2004
On Monday 22 November 2004 00:57, Philipp Gühring wrote:

> Can we get Perlfect Search to handle 200,000,000 documents?

No, it won't scale well enough. Besides that, Perlfect Search doesn't 
support incremental indexing, i.e. you would need to re-index everything 
even if only a single document changes.

I suggest you try Lucene, which scales much better. However, you cannot 
search 200 million documents on a single machine with acceptable speed; 
you'll need to distribute the index across several machines (unless your 
documents are *very* small, e.g. < 1 KB).
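
For reference, here is a minimal sketch of indexing and searching with the
Lucene 1.x Java API of that era (the index path and field names are just
examples, and later Lucene releases changed several of these signatures):

  import org.apache.lucene.analysis.standard.StandardAnalyzer;
  import org.apache.lucene.document.Document;
  import org.apache.lucene.document.Field;
  import org.apache.lucene.index.IndexWriter;
  import org.apache.lucene.queryParser.QueryParser;
  import org.apache.lucene.search.Hits;
  import org.apache.lucene.search.IndexSearcher;
  import org.apache.lucene.search.Query;

  public class LuceneSketch {
      public static void main(String[] args) throws Exception {
          // Build an index; to add documents later, reopen the writer
          // with create=false instead of re-indexing everything.
          IndexWriter writer =
              new IndexWriter("/tmp/demo-index", new StandardAnalyzer(), true);
          Document doc = new Document();
          doc.add(Field.Keyword("url", "http://example.org/page.html"));
          doc.add(Field.Text("contents", "full text of the page goes here"));
          writer.addDocument(doc);
          writer.optimize();
          writer.close();

          // Search the index.
          IndexSearcher searcher = new IndexSearcher("/tmp/demo-index");
          Query query =
              QueryParser.parse("text", "contents", new StandardAnalyzer());
          Hits hits = searcher.search(query);
          for (int i = 0; i < hits.length(); i++) {
              System.out.println(hits.doc(i).get("url") + " score=" + hits.score(i));
          }
          searcher.close();
      }
  }

With an index like this, changed documents can be deleted and re-added
individually instead of rebuilding the whole index.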

> What is causing the memory consumption here?
> Are the database-tied hashes using so much memory?

Yes, the tied hashes are not optimized for full-text indexing.

> By the way, we will finally have it integrated soon on
> http://www.quintessenz.at/ , which needs to index about 50,000 documents.

Also note that Perlfect Search has a limit of about 64,000 documents 
(though that limit can be removed).

Regards
 Daniel

-- 
http://www.danielnaber.de