Doc Type: Guidelines, Concepts & Cookbooks
  • ICM 7.10
  • Solr

Guide - Kafka Appender Removed


1 Introduction

The Kafka Appender was used to write logs directly to Kafka queues. With Kubernetes, all logs are printed to standard out, so the JSON format and the Kafka Appender are no longer used.

Some implementation partners want to use Kafka as a messaging platform and ran into library conflicts with our Kafka Appender. The Kafka Appender was therefore removed.
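Under Kubernetes, the Kafka Appender's role is taken over by plain stdout logging. A minimal sketch of such a logback configuration is shown below; the appender name and pattern are illustrative assumptions, not product defaults:

```xml
<?xml version="1.0" encoding="UTF-8" ?>
<!-- Minimal stdout-only logback sketch. The appender name "Console" and the
	pattern are illustrative; ConsoleAppender writes to System.out by default,
	which the container runtime collects as the pod log. -->
<configuration>
	<appender name="Console" class="ch.qos.logback.core.ConsoleAppender">
		<encoder>
			<pattern>[%date{yyyy-MM-dd HH:mm:ss.SSS Z}] %-5level %logger %msg%n</pattern>
		</encoder>
	</appender>

	<root level="INFO">
		<appender-ref ref="Console" />
	</root>
</configuration>
```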

2 Migration

If the Kafka Appender is still used in a project, add the following dependency configuration to a custom cartridge:

dependencies {
    runtime 'com.github.danielwegener:logback-kafka-appender'
    // or with an explicit version
    runtime 'com.github.danielwegener:logback-kafka-appender:0.1.0'
}

The original logback extension for the Kafka Appender, located in the folder staticfiles/cartridge/logback:

<?xml version="1.0" encoding="UTF-8" ?>
<!-- In order to log into Kafka, use the properties intershop.logging.configurationfile.main=logback-kafka.xml
	and intershop.logging.configurationfile.dbinit=logback-kafka.xml -->
<configuration>

	<property name="intershop.CICLayout"
		value="[%date{yyyy-MM-dd HH:mm:ss.SSS Z}] %-5level ${intershop.HostName} ${intershop.InstallationID} ${intershop.ServerName} [%mdc{requestsite}] [%mdc{requestapplication}] %logger [%marker] [%mdc{request.type}] [%mdc{}] [%mdc{request.uuid}] &quot;%thread&quot; %msg %ex%n%mdc{}" />

	<!-- Kafka appender definition -->
	<appender name="kafkaAppender"
		class="com.github.danielwegener.logback.kafka.KafkaAppender">
		<!-- This is the default encoder that encodes every log message to an
			utf8-encoded string -->
		<encoder
			class="com.github.danielwegener.logback.kafka.encoding.LayoutKafkaMessageEncoder">
			<layout class="ch.qos.logback.classic.PatternLayout">
				<pattern>[%date{yyyy-MM-dd HH:mm:ss.SSS Z}] %-5level
					${intershop.HostName} ${intershop.InstallationID}
					${intershop.ServerName} [%mdc{requestsite}]
					[%mdc{requestapplication}] %logger [%marker] [%mdc{request.type}]
					[%mdc{}] [%mdc{request.uuid}] "%thread" %msg %ex%n%mdc{}</pattern>
			</layout>
		</encoder>
		<keyingStrategy
			class="com.github.danielwegener.logback.kafka.keying.RoundRobinKeyingStrategy" />
		<deliveryStrategy
			class="" />

		<!-- each <producerConfig> translates to regular kafka-client config (format:
			key=value) -->
		<!-- producer configs are documented here: -->
		<!-- bootstrap.servers is the only mandatory producerConfig -->
	</appender>

	<!-- Define default appenders because they might be used as references -->
	<appender name="Error" class="ch.qos.logback.core.helpers.NOPAppender" />
	<appender name="Warn" class="ch.qos.logback.core.helpers.NOPAppender" />
	<appender name="ImpexError" class="ch.qos.logback.core.helpers.NOPAppender" />
	<appender name="Job" class="ch.qos.logback.core.helpers.NOPAppender" />

	<!-- Log everything using the kafkaAppender -->
	<root level="INFO">
		<appender-ref ref="kafkaAppender" />
	</root>

	<logger name="org">
		<level value="INFO" />
	</logger>

	<logger name="javax">
		<level value="INFO" />
	</logger>

	<logger name="java.awt">
		<level value="INFO" />
	</logger>

	<logger name="tomcat">
		<level value="INFO" />
	</logger>

	<logger name="sun">
		<level value="INFO" />
	</logger>

	<logger name="com.sun">
		<level value="INFO" />
	</logger>

	<logger name="org.apache.catalina.startup.Catalina">
		<level value="INFO" />
	</logger>

	<logger name="org.apache.jasper.compiler.Compiler" additivity="false">
		<level value="INFO" />
	</logger>

	<!-- Avoid error messages from Sandesha startup:
		"Could not load module policies. Using default values." -->
	<logger name="org.apache.sandesha2.SandeshaModule" additivity="false">
		<level value="OFF" />
	</logger>

</configuration>
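As the comment at the top of the extension file states, this configuration was activated through the following logging properties (the property names are taken from that comment; the properties file location depends on the project setup):

```properties
intershop.logging.configurationfile.main=logback-kafka.xml
intershop.logging.configurationfile.dbinit=logback-kafka.xml
```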



The information provided in the Knowledge Base may not be applicable to all systems and situations. Intershop Communications will not be liable to any party for any direct or indirect damages resulting from the use of the Customer Support section of the Intershop Corporate Web site, including, without limitation, any lost profits, business interruption, loss of programs or other data on your information handling system.
