[Perlfect-search] Perlfect Search
Daniel Naber email@example.com
Wed, 30 May 2001 19:30:14 +0200
On Wednesday 30 May 2001 00:27, you wrote:
> I have installed the Search.pl etc correctly (i think...) and it will
> index directories containing a small number of pages, but once I try to
> index the entire site it just stops at the 129 or 130th page and does
> not create any new data/ files. Is this a memory problem???
Probably. Here's a new FAQ entry about that (Giorgos, can you put that
online, too? I just committed it):
How many pages can it handle?
The latest version of Perlfect Search (v3.20) can easily index sites with
1000+ documents.
The more pages you have, the more memory indexing will need. This is
sometimes a problem for people whose webspace provider doesn't allow
scripts to use much memory. You might get an "out of memory" error, or the
indexer.pl script might just stop before it finishes. In this case, you
should talk to your webspace provider - it is not a bug in Perlfect Search,
and there's no simple way to "optimize" its memory usage without other
drawbacks (the only case that could still be optimized is the indexing of
very large files). With some webspace providers CPU usage is a problem,
too. If the indexer.pl script gets killed because it uses too much CPU
time, that's not a bug in Perlfect Search either, and you'll have to
contact your webspace provider.
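If you have shell access on your webspace, you can sketch a quick check of
the resource limits your provider imposes before blaming the indexer. The
commands below are a minimal illustration using the standard shell builtin
ulimit; the exact limits (and whether your host enforces others, e.g. via
process killers) vary by provider.

```shell
# Show the per-process limits the shell session runs under.
# "unlimited" means no cap is set at this level.
ulimit -v   # max virtual memory, in kilobytes
ulimit -t   # max CPU time, in seconds
```

If ulimit -v reports a low number (say, a few thousand kilobytes), the
indexer running out of memory on a large site is expected behaviour, not a
bug.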
Searching itself is always very fast, even for very large sites. In
general, the performance of version 3.20 is more than enough for all but
the most demanding sites.
Daniel Naber, Paul-Gerhardt-Str. 2, 33332 Guetersloh, Germany
Tel. 05241-59371, Mobil 0170-4819674