[Perlfect-search] Scalability

Philipp Gühring mailinglists at futureware.at
Sun Nov 21 23:57:37 GMT 2004
Hi,

Perlfect Search is really nice, but for only 2,000 documents I can just use grep.
Can we get Perlfect Search to handle 200,000,000 documents?
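
(The grep approach I mean for the small case is roughly the following; just a
sketch with made-up paths, not what runs in production:)

    #!/usr/bin/perl
    # Brute-force search over ~2000 files: scan every document on each query.
    use strict;
    use warnings;
    use File::Find;

    my $term  = shift @ARGV or die "usage: $0 <term>\n";
    my $query = qr/\Q$term\E/i;

    find(sub {
        return unless -f $_;
        open my $fh, '<', $_ or return;
        local $/;                                  # slurp the whole file
        print "$File::Find::name\n" if <$fh> =~ $query;
    }, '/var/www/htdocs');                         # example document root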

Nov 21 16:06:46 linux3 kernel: __alloc_pages: 0-order allocation failed (gfp=0x1f0/0)
Nov 21 16:06:47 linux3 kernel: __alloc_pages: 0-order allocation failed (gfp=0x1d2/0)
Nov 21 16:06:47 linux3 kernel: __alloc_pages: 0-order allocation failed (gfp=0x1d2/0)
Nov 21 16:06:47 linux3 kernel: VM: killing process perl

:-(

What is causing the memory consumption here?
Are the database-tied hashes using that much memory,
or is the problem in the indexer itself?
(I already have $LOW_MEMORY_INDEX = 1;)
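
(For what it is worth, my understanding of the difference is roughly the
following; just a sketch with made-up file names, not Perlfect's actual
indexer code:)

    #!/usr/bin/perl
    # Illustration: a plain hash keeps every entry in RAM, while a
    # DB_File-tied hash stores its entries in a Berkeley DB file on disk.
    use strict;
    use warnings;
    use DB_File;
    use Fcntl qw(O_RDWR O_CREAT);

    my %in_memory;                     # grows the perl process

    tie my %on_disk, 'DB_File', 'term_index.db', O_RDWR|O_CREAT, 0644, $DB_HASH
        or die "Cannot tie term_index.db: $!";

    for my $doc_id (1 .. 100_000) {
        $in_memory{"term$doc_id"} = "$doc_id:1";   # stays resident in RAM
        $on_disk{"term$doc_id"}   = "$doc_id:1";   # written out to disk
    }

    untie %on_disk;

If the indexer keeps something like %in_memory around for all postings, that
would explain the kernel killing perl long before 200,000,000 documents.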

By the way, we will soon have it finally integrated on
http://www.quintessenz.at/, which needs to index about 50,000 documents.

Many greetings,
Philipp Gühring