[Perlfect-search] Re: need support: Response code 500 with HTTP indexing
Adrian Gudas firstname.lastname@example.org
Tue, 1 Apr 2003 14:55:17 -0500
Yup, those files are in place. Perlfect was able to index in directory mode
without a problem, but we need the HTTP mode for our dynamic content. I
thought perhaps it was a perl module issue, but usually perl screams at me
when that happens, and there were no @INC errors. (And according to cpan,
I'm up-to-date.) I'm new to Perl, and someone told me I might need a 'use'
line in the Perlfect configuration to point to the modules that cpan
installed (since I don't have root access on this machine), if that makes any
sense. But I am 90% certain the modules were already installed, are working,
and were installed long before I first logged in to the shell in the first
place.
Also, on another server, this one running Debian instead of RedHat, everything
worked right away with no manual configuration (just running setup.pl and then
setting $HTTP_START_URL to something non-blank).
Which pretty much describes my configuration on both servers; but right now,
the RedHat box seems like it's not connecting to itself to index the site.
My Apache logs contain no info, and yet indexer.pl outputs this HTTP error:

Error: Couldn't get 'http://newweb.greenparty.on.ca/robots.txt': response code 500
Not using any robots.txt.
Error: Couldn't get 'http://newweb.greenparty.on.ca/': response code 500
(newweb.greenparty.on.ca points to 184.108.40.206 -- there is no virtual
directory aliasing or anything like that.)
Thanks for your quick response! :)
Daniel Naber writes:
> On Tuesday 01 April 2003 20:38, Adrian Gudas wrote:
> > Loading http://220.127.116.11/robots.txt...
> What happens if you try to load that file in a browser? I don't mean from
> your desktop computer but from the server where you start the indexer
> (e.g. with lynx http://18.104.22.168/robots.txt)? From here I can index
> http://22.214.171.124/ without a problem (okay, it's just one page, but
> that works).
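One detail worth checking, given that the Apache logs are empty: LWP (which
Perlfect's indexer uses for HTTP fetches) fabricates its own 500 response when
it cannot reach the server at all (e.g. "500 Can't connect to host"), and it
marks such responses with a "Client-Warning: Internal response" header. A
short sketch to reproduce the indexer's fetch and tell the two cases apart:

```perl
#!/usr/bin/perl
# Sketch: fetch the same URL the indexer tries, and report whether the
# 500 came from Apache or was generated inside LWP (connection failure).
use strict;
use warnings;
use LWP::UserAgent;

my $ua  = LWP::UserAgent->new(timeout => 10);
my $res = $ua->get('http://newweb.greenparty.on.ca/robots.txt');

# Full status line, e.g. "500 Can't connect to newweb.greenparty.on.ca:80"
print $res->status_line, "\n";

# LWP sets this header only on responses it generated itself:
if (($res->header('Client-Warning') || '') eq 'Internal response') {
    print "Error generated by LWP locally -- no server was reached,\n";
    print "which would explain the empty Apache logs.\n";
}
```

If the status line says "Can't connect", the problem is name resolution or a
firewall on the RedHat box rather than anything in Apache or Perlfect itself.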