[TYPO3-Solr] Indexing hangs on localized records

Tomita Militaru tmilitaru at arxia.com
Thu Mar 14 10:55:50 CET 2013



> Stephan Schuler <Stephan.Schuler at netlogix.de>
> 14 March 2013 11:42
> You're mixing different things.
>
> CLI and CGI usually use completely different php.ini files. What your browser shows in phpinfo() applies only to CGI and does not count at all for CLI.
>
> The number you mentioned (536870912) is exactly 512 MB. So it's not configured somehow lower than 512 MB for CLI; your CLI simply needs even more than 512 MB.

I don't see it needing more, since it crashes while trying to allocate
only 90 bytes.
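
For reference, this is roughly what I run with the CLI binary to see
which php.ini it loads and what memory_limit it actually gets (just a
quick sketch; I assume the CLI binary is plain "php" and the file name
is only an example, adjust for your setup):

    <?php
    // check_cli_memory.php -- run as: php check_cli_memory.php
    // php_sapi_name() should report "cli" here, not "cgi-fcgi" etc.
    echo 'SAPI:         ', php_sapi_name(), "\n";
    // 536870912 bytes is exactly 512 * 1024 * 1024, i.e. 512M
    echo 'memory_limit: ', ini_get('memory_limit'), "\n";
    // shows which php.ini the CLI actually reads (may differ from CGI)
    echo 'loaded ini:   ', php_ini_loaded_file(), "\n";
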
>
> Frontend indexing works very differently from record indexing.
>
> For record indexing, the CLI loads each record (DAM or news or whatever) and builds its indexing strategy from pure TypoScript. Then it walks through several filter methods and pushes the result array to Solr. The memory here comes only from class loading (which should not be much), the database results for the records (not much either), the accumulated data generated by your indexing configuration (usually not much memory either), and sometimes external values produced by shell commands such as pdftotext.
>
> Frontend indexing is completely different. The indexer loads the page records and fetches the frontend via cURL. So when indexing a single frontend page, there are two PHP processes: one CLI process that runs the queue and another one that was triggered by your webserver and serves your web page. The second one is a little different from default page rendering: a post-processing mechanism cuts out the output data and runs a large string-manipulation operation over all of the output content.
>
> You could try to find out which one dies with memory errors: the record indexer, the CLI part of the page indexer, or the frontend part of the page indexer.
By frontend I meant the initializeTsfe call from solr, because I did
some debugging and that's where it dies.
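
To narrow that down I put a crude memory probe before and after the
suspected call (nothing solr-specific, just standard PHP functions;
the label is whatever fits the call site, in my case around
tx_solr_Util::initializeTsfe):

    <?php
    // log current and peak memory at an interesting point
    error_log(sprintf(
        '[solr-index] %s: current %.1f MB, peak %.1f MB',
        'before initializeTsfe',   // adjust the label per call site
        memory_get_usage(true) / 1048576,
        memory_get_peak_usage(true) / 1048576
    ));
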
>
> I really don't believe that allowing more than 512 MB of memory for a single index document is the way you want this problem to be solved. Which means: I'm completely unsure whether raising the memory limit solves your problem. But even if it does, it's still the wrong way.
I am not trying to increase the 512M. The PHP installed on the server
has Suhosin, which has its own memory limit [1], and the value of
suhosin.memory_limit is not shown in phpinfo(), only suhosin.log and
other Suhosin logging settings, which is why I emailed the hosting
company. The server runs FreeBSD, and since a clone of the website
works fine on a Debian server without Suhosin, that's my guess for
now. I could be wrong, but I have no other explanation for why it
crashes on localized records: for records in a different language,
tx_solr_Util calls initializeTsfe for a different language along the
way.
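
In case it helps, this is the kind of quick check I'm now running on
both servers, once through the web and once with the CLI binary, to
see whether Suhosin is loaded at all for that SAPI and what limit it
enforces (again just a sketch):

    <?php
    // Suhosin directives only exist if the extension is loaded for this SAPI
    echo 'SAPI:           ', php_sapi_name(), "\n";
    echo 'suhosin loaded: ', extension_loaded('suhosin') ? 'yes' : 'no', "\n";
    // ini_get() returns false if the directive is unknown to this SAPI
    var_dump(ini_get('suhosin.memory_limit'));
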

Kind regards,
Tomita

[1] - 
http://www.hardened-php.net/suhosin/configuration.html#suhosin.memory_limit

