Support Article - Collecting Important Log Information from Solr Maintenance Operations and Repairing Solr Index

Product Version

7.10.38.11

Product To Version

7.11

Introduction

If Solr index maintenance operations fail or need to be analyzed, the following guideline helps you collect the relevant log information from Solr maintenance operations as logged by the ICM application server (ICM AS) and the Solr Cloud server.
This article mainly focuses on a custom log appender that collects Solr-index-related database statements, class, pipeline, and pipelet executions from search index maintenance operations triggered by a scheduled job or by back office GUI interaction.

Setting up a Special Logging

To set up the special logging, perform the following steps. As a result, a SolrAdapter.log file is written to \eserver\share\system\log for further analysis.


  1. Use and adjust the log appender (SolrAppender.xml, see below) for your index names/needs.

    The log appender generates dedicated log output from Solr index maintenance operations or index queries.
    You only need to adjust the index names mentioned in the appender configuration, since they are specific to your Intershop Commerce Management installation.
    In the following example, the log appender collects index maintenance information for the ICM back office search index called "product-search-idx-en_US".



  2. Enable ICM performance sensors in \eserver\share\system\config\cluster\appserver.properties and restart the affected application server.

    intershop.monitoring.requests=true
    intershop.monitoring.pipelines=true
    intershop.monitoring.pipelets=true
    intershop.monitoring.queries=true
    intershop.monitoring.sql=true
    intershop.monitoring.class=true
    intershop.monitoring.objectpath=false
    intershop.monitoring.pagelet=false
    intershop.monitoring.pipelinenodes=false
    intershop.monitoring.templates=false


    As a result, the provided Solr log appender can log the runtimes of pipelines, pipelets, and database statements that you may need for (performance) analysis.

  3. Start the search index creation inside ICM, or run the SMC job "Rebuild Search Indexes" or "Update Search Indexes".

  4. Collect and check the following information after the test:
  • ICM Solr appender logs written in \eserver\share\system\log
  • ICM-related batch log files from the affected unit/site, for example: \eserver\share\sites\inSPIRED-inTRONICS-Site\units\inSPIRED-inTRONICS\batch\log\*.*
  • Solr Cloud server logs from all nodes and the ZooKeeper logs
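The collection in step 4 can be scripted. The following is a minimal sketch assuming a Linux-style mount of the ICM share; the SHARE variable, output directory layout, and archive name are illustrative only, not part of the product (the article uses the Windows path \eserver\share):

```shell
#!/bin/sh
# Sketch only -- adjust SHARE to the mount point of your ICM shared file system.
SHARE="${SHARE:-/eserver/share}"
OUT="solr-logs-$(date +%Y%m%d)"
mkdir -p "$OUT"

# ICM Solr appender logs (file names as configured in SolrAppender.xml)
cp "$SHARE"/system/log/SolrAdapter-*.log "$OUT"/ 2>/dev/null
cp "$SHARE"/system/log/SolrServer-*.log  "$OUT"/ 2>/dev/null

# Batch log files from the affected unit (example site/unit names from this article)
cp -r "$SHARE"/sites/inSPIRED-inTRONICS-Site/units/inSPIRED-inTRONICS/batch/log \
   "$OUT"/batch-log 2>/dev/null

# Package everything, e.g. for attachment to a support ticket
tar czf "$OUT.tar.gz" "$OUT"
```

Solr Cloud and ZooKeeper logs live on the Solr server nodes, not on the ICM share, so they have to be collected from each node separately.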


SolrAppender.xml
<?xml version="1.0" encoding="UTF-8" ?>
<included>

	<appender name="SolrServer" class="ch.qos.logback.core.rolling.RollingFileAppender">
		<filter class="ch.qos.logback.classic.filter.ThresholdFilter">
			<level>TRACE</level>
		</filter>
		<File>${intershop.logfile.Directory}/SolrServer-${intershop.logfile.NamePostfix}.log</File>
		<rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
			<FileNamePattern>${intershop.logfile.Directory}/SolrServer-${intershop.logfile.NamePostfix}-%d{yyyy-MM-dd}.log.zip</FileNamePattern>
		</rollingPolicy>
		<encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
			<layout class="ch.qos.logback.classic.PatternLayout">
				<Pattern>
					[%date{yyyy-MM-dd HH:mm:ss.SSS Z}] [%thread] [Job:%mdc{job.name}] [Req:%mdc{request.uuid}] %-5level %logger %marker - %msg %ex%n
				</Pattern>
			</layout>
		</encoder>
	</appender>
	
	<logger name="org.apache.solr">
		<level value="INFO" />
		<appender-ref ref="SolrServer" />
	</logger>	
	

	<appender name="SolrAdapter" class="ch.qos.logback.core.rolling.RollingFileAppender">
		<filter class="ch.qos.logback.classic.filter.ThresholdFilter">
			<level>TRACE</level>
		</filter>
		<filter name="JobContextFilter" class="ch.qos.logback.core.filter.EvaluatorFilter">
			<evaluator name="JobOnly">
				<expression>
				(mdc.containsKey("job.name") &&
					("Cleanup Search Indexes".equals(mdc.get("job.name")) ||
					 "Rebuild Search Indexes".equals(mdc.get("job.name")) ||
					 "Update Search Indexes".equals(mdc.get("job.name")) ||
					 "SearchIndexGenerationproduct-search-idx-en_US".equals(mdc.get("job.name")) ||
					 "SearchIndexGenerationproduct-search-idx-de_DE".equals(mdc.get("job.name"))
					)
				) ||
				logger.startsWith("com.intershop.adapter.search_solr.internal")
				</expression>
			</evaluator>
			<OnMatch>NEUTRAL</OnMatch>
			<OnMismatch>DENY</OnMismatch>
		</filter>
		<File>${intershop.logfile.Directory}/SolrAdapter-${intershop.logfile.NamePostfix}.log</File>
		<rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
			<FileNamePattern>${intershop.logfile.Directory}/SolrAdapter-${intershop.logfile.NamePostfix}-%d{yyyy-MM-dd}.log.zip</FileNamePattern>
		</rollingPolicy>
		<encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
			<layout class="ch.qos.logback.classic.PatternLayout">
				<Pattern>
					[%date{yyyy-MM-dd HH:mm:ss.SSS Z}] [%thread] [Job:%mdc{job.name}] [Req:%mdc{request.uuid}] %-5level %logger %marker - %msg %ex%n
				</Pattern>
			</layout>
		</encoder>
	</appender>
	

	<logger name="com.intershop.component.search.internal">
		<level value="DEBUG" />
		<appender-ref ref="SolrAdapter" />
	</logger>	

	<logger name="com.intershop.adapter.search_solr">
		<level value="DEBUG" />
		<appender-ref ref="SolrAdapter" />
	</logger>	

	<logger name="com.intershop.beehive.core.internal.performance">
		<level value="TRACE" />
		<appender-ref ref="SolrAdapter" />
	</logger>	

	<logger name="com.intershop.beehive.orm.oracle.internal.query">
		<level value="DEBUG" />
		<appender-ref ref="SolrAdapter" />
	</logger>	

	<logger name="com.intershop.beehive.orm.mssql.internal.query">
		<level value="DEBUG" />
		<appender-ref ref="SolrAdapter" />
	</logger>	

</included>
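Logback only evaluates this file if the active configuration references it. If your ICM log configuration does not already pick up additional appender files automatically, logback's standard <include> element can pull it in. The following is a hypothetical excerpt; the main configuration file name and the path to SolrAppender.xml depend on your ICM installation and are placeholders here:

```xml
<!-- Excerpt from the main logback configuration (file name varies by ICM version). -->
<configuration>
	<!-- Placeholder path: point this at the deployed location of SolrAppender.xml. -->
	<include file="/path/to/SolrAppender.xml"/>
</configuration>
```

The <included> root element in SolrAppender.xml is what makes the file consumable through logback's <include> mechanism.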
Disclaimer

The information provided in the Knowledge Base may not be applicable to all systems and situations. Intershop Communications will not be liable to any party for any direct or indirect damages resulting from the use of the Customer Support section of the Intershop Corporate Web site, including, without limitation, any lost profits, business interruption, loss of programs or other data on your information handling system.
