
Extending logback to Output Logs to a Kafka Instance

Introduction

logback is a powerful Java logging framework and the successor to log4j, offering rich features and configuration options. When processing large amounts of log data, outputting logs to a message queue such as Kafka is a common requirement, as it makes the log data easier to process and analyze. This article explains how to output logs to a Kafka instance by extending logback.

Preparation

Before you start, make sure you have installed the following dependencies:

  • JDK 8 or higher
  • Maven 3 or higher
  • Kafka 2.0 or later

Create a Kafka Log Appender

logback provides the ability to extend log processing: by creating a custom Appender, we can output logs to Kafka. Here is a simple example of a KafkaAppender:

import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.UnsynchronizedAppenderBase;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import java.util.Properties;

public class KafkaAppender extends UnsynchronizedAppenderBase<ILoggingEvent> {
    private KafkaProducer<String, String> producer;

    @Override
    public void start() {
        super.start();
        // Configure the properties of the Kafka producer
        Properties props = new Properties();
        props.put("bootstrap.servers", "your-kafka-broker:port");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        producer = new KafkaProducer<>(props);
    }

    @Override
    public void stop() {
        super.stop();
        producer.close();
    }

    @Override
    protected void append(ILoggingEvent event) {
        String message = event.getFormattedMessage();
        String topic = "your-kafka-topic"; // Replace with the actual Kafka topic
        producer.send(new ProducerRecord<>(topic, message));
    }
}

This Appender inherits from UnsynchronizedAppenderBase, which does not synchronize calls to append(); this is safe here because KafkaProducer is itself thread-safe, so the Appender can be used in a multi-threaded environment. In the start() method, we configure the Kafka producer properties and create the KafkaProducer object. The append() method does the actual work of sending log messages to Kafka.
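
In practice you usually also want to know when a send fails. KafkaProducer.send() accepts an optional Callback that is invoked once the broker acknowledges or rejects the record. Below is a minimal sketch (not the only way to do this) of how the append() method above could report failures through logback's own status system; the topic name your-kafka-topic is a placeholder, as before:

@Override
protected void append(ILoggingEvent event) {
    String message = event.getFormattedMessage();
    // Send asynchronously; the callback runs on the producer's I/O thread
    producer.send(new ProducerRecord<>("your-kafka-topic", message), (metadata, exception) -> {
        if (exception != null) {
            // Surface the failure via logback's status system instead of swallowing it
            addError("Failed to send log event to Kafka", exception);
        }
    });
}

The addError() method is inherited from logback's appender base classes, so the failure shows up in logback's status output rather than being silently dropped.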

Configure logback

Next, we need to add support for the KafkaAppender in the logback configuration file. Create a file named logback.xml on the classpath (for Maven projects, under src/main/resources) and add the following configuration:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <appender name="KAFKA" class="">
        <!-- Other configurations -->
    </appender>
    <root level="info">
        <appender-ref ref="KAFKA" />
    </root>
</configuration>

In this configuration file, we define an Appender named KAFKA and attach it to the root logger. Make sure to replace com.example with the actual package name of your KafkaAppender class.

Using KafkaAppender

Now you can use this custom Appender in your Java application to output logs to Kafka. Under src/main/java/, create a file named Main.java and add the following code:

import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.LoggerContext;
import org.slf4j.LoggerFactory;

public class Main {
    public static void main(String[] args) {
        // Get the logback context
        LoggerContext context = (LoggerContext) LoggerFactory.getILoggerFactory();
        // Get the root logger
        Logger rootLogger = context.getLogger(Logger.ROOT_LOGGER_NAME);
        // Set the log level
        rootLogger.setLevel(Level.INFO);
        // This message is routed to Kafka by the KAFKA appender
        rootLogger.info("Hello from logback to Kafka");
    }
}

In a Java application, using Logback to output logs to Kafka typically involves the following steps:
1. Configure Logback's `<appender>` element to use the KafkaAppender.
2. Configure the KafkaAppender's properties, including the Kafka broker list, topic name, and so on.
3. Create a Kafka producer.
4. Write the logging code that sends log messages to Kafka.

Below is a simple example showing how to configure a KafkaAppender in Logback and use it together with a simple Java application.

First, you need to add the Logback and Kafka dependencies to your Maven or Gradle project. This assumes you have already installed Kafka and its related dependencies.
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.7.25</version>
</dependency>
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <version>1.2.3</version>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>2.4.1</version>
</dependency>

Then, you need to add the KafkaAppender configuration to the Logback configuration file. Here is a simple configuration example:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <appender name="KAFKA" class="">
        <brokers>localhost:9092</brokers>
        <topic>your-topic-name</topic>
        <encoder class="">
            <Pattern>%d{HH:mm:} [%thread] %-5level %logger{36} - %msg%n</Pattern>
        </encoder>
    </appender>
    <root level="info">
        <appender-ref ref="KAFKA" />
    </root>
</configuration>

In this configuration, we set up a KafkaAppender named KAFKA, which sends logs to the your-topic-name topic on the local Kafka broker (localhost:9092). The PatternLayoutEncoder is used to format the log messages.
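
Note that logback's Joran configurator binds configuration elements such as `<brokers>` and `<topic>` to JavaBean-style setters on the appender class. For the custom KafkaAppender sketched in the first example to accept these elements, it needs matching properties. A minimal sketch under that assumption, reworking the earlier class so the broker list and topic come from logback.xml instead of being hard-coded:

import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.UnsynchronizedAppenderBase;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import java.util.Properties;

public class KafkaAppender extends UnsynchronizedAppenderBase<ILoggingEvent> {
    private KafkaProducer<String, String> producer;
    private String brokers;
    private String topic;

    // Joran maps the <brokers> and <topic> elements in logback.xml to these setters
    public void setBrokers(String brokers) { this.brokers = brokers; }
    public void setTopic(String topic) { this.topic = topic; }

    @Override
    public void start() {
        super.start();
        Properties props = new Properties();
        props.put("bootstrap.servers", brokers); // configured via <brokers>
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        producer = new KafkaProducer<>(props);
    }

    @Override
    public void stop() {
        super.stop();
        if (producer != null) {
            producer.close();
        }
    }

    @Override
    protected void append(ILoggingEvent event) {
        // The topic now comes from the <topic> element instead of a hard-coded name
        producer.send(new ProducerRecord<>(topic, event.getFormattedMessage()));
    }
}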

Next, you need to create a simple Java application to test Logback's KafkaAppender. This application will use SLF4J to log.

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class KafkaLogbackExample {
    private static final Logger LOGGER = LoggerFactory.getLogger(KafkaLogbackExample.class);

    public static void main(String[] args) {
        LOGGER.info("This is an info message");
        LOGGER.debug("This is a debug message");
    }
}

When you run this application, the info message should be sent to Kafka's your-topic-name topic (the debug message is filtered out, because the root logger is set to the info level).
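
To check that the messages actually arrive, you can read the topic back with a small Kafka consumer. The following is a minimal sketch, assuming a broker at localhost:9092 and the topic name from the configuration above; the group id log-topic-reader is a hypothetical name chosen for this example:

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class LogTopicReader {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("group.id", "log-topic-reader");        // hypothetical consumer group
        props.put("auto.offset.reset", "earliest");       // read the topic from the beginning
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("your-topic-name"));
            // Poll a few times and print whatever log lines have arrived
            for (int i = 0; i < 10; i++) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println(record.value());
                }
            }
        }
    }
}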

Please note that this example is very basic. In a real application you may need to handle more complex scenarios, such as message partitioning, consumer groups, and message formatting, and you should also consider log reliability, performance, and maintainability. As an alternative to writing your own Appender, you can use the logback-kafka appender plugin, which publishes logback log events to Kafka topics. Here is a simple configuration example showing how to output logs to Kafka with it:

First, you need to add the logback-kafka appender dependency to your project. You can add this dependency via Maven or Gradle.

For a Maven project, add the following dependencies to your pom.xml file:

<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>6.6</version><!-- Please use the latest version -->
</dependency>
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.7.30</version><!-- Please use the latest version -->
</dependency>
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-core</artifactId>
    <version>1.2.3</version><!-- Please use the latest version -->
</dependency>
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <version>1.2.3</version><!-- Please use the latest version -->
</dependency>
<dependency>
    <groupId>com.github.wj89620</groupId>
    <artifactId>logback-kafka-appender</artifactId>
    <version>1.1.1</version><!-- Please use the latest version -->
</dependency>

For Gradle projects, add the following dependencies to your build.gradle file:

dependencies {
    implementation 'net.logstash.logback:logstash-logback-encoder:6.6'
    implementation 'org.slf4j:slf4j-api:1.7.30'
    implementation 'ch.qos.logback:logback-core:1.2.3'
    implementation 'ch.qos.logback:logback-classic:1.2.3'
    implementation 'com.github.wj89620:logback-kafka-appender:1.1.1'
}

Then, you need to configure the Kafka appender in the logback configuration file. Here is a basic configuration example:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <appender name="KAFKA" class=".">
        <topic>your-topic-name</topic>
        <brokers>your-kafka-brokers-list:9092</brokers>
        <encoder class=""/>
    </appender>
    <root level="INFO">
        <appender-ref ref="KAFKA"/>
    </root>
</configuration>

In this configuration, you need to replace your-topic-name with your Kafka topic name and your-kafka-brokers-list:9092 with your Kafka broker list and port number.

Please note that this configuration uses Logstash's Logback encoder (LogstashEncoder) to format log events as JSON so that they can be processed more easily on the consumer side.

Finally, make sure your application loads the logback configuration file correctly. In Java applications, this is usually done by placing a configuration file named logback.xml on the classpath.

When you run the application, all log events matching the root logger (in this case, INFO level and higher) will be sent to Kafka's your-topic-name topic.

Note that this configuration is a simplified example; you may need to adjust it to your specific needs, such as adding error handling logic or setting different log levels.

This concludes this article on extending logback to output logs to Kafka instances.