Perlfect Solutions

[Perlfect-search] Indexing problem continued
Sun, 9 Sep 2001 00:17:52 +0300
> > Is there a way to tell the
> > script to add these 10 new pages to the index and keep the previously
> > indexed 350 pages unchanged in the index?
> No, that's not possible.
> Regards
>  Daniel

My host gives me only 60 seconds of runtime, and an increase is not possible;
they will not accept one. For a lot of people in my position this is a serious
limitation of the script. After the first 350 pages (most of them under 50 KB
in size) I am no longer able to index anything. Logically, there must be a way
to change the script, because I do not need to re-index 90% of what gets
indexed on every new indexing run. How come this is not taken into
consideration? Or does the database work in a way that makes this kind of
stepped indexing impossible? There should be a way, at least theoretically.

I also tried what was mentioned below on my local machine and uploaded the
result. It did not work. The points of difference are:

Paths are no problem, I imitate the same paths locally.
My local machine runs Windows and the host runs Unix.
I do not know my DB_File version. How can I find it out on my local Windows
machine?
I have asked my host: is there a way or a script to get the DB_File version
on a remote Unix system?
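I guess something like the following small script might print it; I found
$DB_File::VERSION and $DB_File::db_version mentioned in the DB_File
documentation, but I am not sure this is right (the file name dbver.pl is
just an example):

    #!/usr/bin/perl
    # dbver.pl - print the DB_File module version and the Berkeley DB version
    # (run locally as "perl dbver.pl"; on the server it could be uploaded as a
    #  CGI script, which is why the Content-type line is printed first)
    use strict;
    use DB_File;

    print "Content-type: text/plain\n\n";
    print "DB_File module version: $DB_File::VERSION\n";
    print "Berkeley DB version:    $DB_File::db_version\n";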

Thanks for taking the time to answer.


> --
> Daniel Naber, Paul-Gerhardt-Str. 2, 33332 Guetersloh, Germany
> From: Daniel Naber <>
> Subject: Re: [Perlfect-search] Long indexing process
> Date: Wed, 5 Sep 2001 18:32:41 +0200
> Cc: Sascha Claus <>
> On Wednesday 05 September 2001 17:22, you wrote:
> > Is it possible to run the script on a local PC and upload the indexes to
> > the server?
> If all the paths are the same, the operating systems are the same and
> DB_File has the same version on both machines, it *should* be possible.
> Regards
>  Daniel