Beyond structured logs - Part 1

While logging might not seem like the most exciting topic, it’s undoubtedly one of the most crucial aspects of software development. Logs serve as the eyes and ears of your application, providing critical insights into its behavior and helping with debugging, monitoring, and performance optimization.

In this article, we’ll explore how to enhance Java’s logging capabilities beyond the basics. We’ll focus on two key aspects: what we log and how we structure it. Our goal is to create logs that are not just readable but also easily parseable and queryable.

We’ll examine how to create structured logs that are both human-readable and machine-parseable, while also ensuring that the code generating these logs is clean and maintainable. This approach will help you build a logging strategy that seamlessly integrates with your application design.

Level 0 - The Defaults

Let’s start with the basics. While this isn’t bare Java, it represents a standard and commonly used setup. We’ll use slf4j with logback as our logging framework:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.util.UUID;

public class Level0 {

    private static final Logger log = LoggerFactory.getLogger(Level0.class);

    public static void main(String[] args) {
        // Some sample data
        var messageId = UUID.randomUUID();
        var result = "REJECT";
        var exc = new RuntimeException("Exception message");

        // Logging starts here.
        log.info("Message received. Message id: {}. Result: {}", messageId, result);
        log.error("Cannot transform transaction", exc);
    }
}

The output from this code looks like this:

16:32:05.242 [main] INFO com.github.rkondratowicz.Level0 -- Message received. Message id: 77f57357-71f7-44be-b4a8-e92cd87c0b82. Result: REJECT
16:32:05.244 [main] ERROR com.github.rkondratowicz.Level0 -- Cannot transform transaction
java.lang.RuntimeException: Exception message
	at com.github.rkondratowicz.Level0.main(Level0.java:19)

As you can see, these logs lack structure. While we can identify different pieces of information within them, they’re difficult to parse and query programmatically. The readability could also be improved. Let’s enhance this by switching to JSON format.

Level 1 - JSON

To implement JSON logging, we need to add some dependencies to build.gradle:

dependencies {
    implementation 'ch.qos.logback.contrib:logback-json-classic:0.1.5'
    implementation 'ch.qos.logback.contrib:logback-jackson:0.1.5'
    implementation 'com.fasterxml.jackson.core:jackson-databind:2.16.1'
}

And configure Logback with this configuration file:

<configuration>
    <appender name="json" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
            <layout class="ch.qos.logback.contrib.json.classic.JsonLayout">
                <jsonFormatter class="ch.qos.logback.contrib.jackson.JacksonJsonFormatter" />
                <timestampFormat>yyyy-MM-dd'T'HH:mm:ss.SSS</timestampFormat>
                <timestampFormatTimezoneId>UTC</timestampFormatTimezoneId>
                <appendLineSeparator>true</appendLineSeparator>
            </layout>
        </encoder>
    </appender>
    <root level="INFO">
        <appender-ref ref="json"/>
    </root>
</configuration>

Note: While similar functionality can be achieved with Log4j, this article focuses on Logback.

The logging code remains unchanged, but the output now looks like this:

{
  "timestamp": "2025-03-17T16:48:28.125",
  "level": "INFO",
  "thread": "main",
  "logger": "com.github.rkondratowicz.Level1",
  "message": "Message received. Message id: 7b1b3928-cbac-47ee-8395-00c43cf9e061. Result: REJECT",
  "context": "default"
}

Now we have a much better structure. Each piece of information has its own field, making it easier to query and analyze. However, the message content itself is still a single string that bundles several pieces of information: the event description, the message ID, and the result.
Wouldn’t it be great if we could filter logs based on specific fields within the message?

Level 2 - Custom Logger

To achieve this, we’ll leverage MDC (Mapped Diagnostic Context), which provides a map-like structure for storing key-value pairs that the logging framework automatically includes in each log entry. Since we’re using JSON format, these pairs will appear as separate JSON fields.
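Conceptually, MDC is little more than a per-thread map that the layout merges into each log entry. As a rough mental model (a simplified sketch to build intuition, not slf4j’s actual implementation), it behaves like this:

```java
import java.util.HashMap;
import java.util.Map;

// Simplified model of what MDC does under the hood: a map of key-value
// pairs scoped to the current thread. NOT the real slf4j implementation.
public class MiniMdc {

    private static final ThreadLocal<Map<String, String>> CONTEXT =
        ThreadLocal.withInitial(HashMap::new);

    public static void put(String key, String value) {
        CONTEXT.get().put(key, value);
    }

    public static void remove(String key) {
        CONTEXT.get().remove(key);
    }

    public static Map<String, String> copyOfContext() {
        return Map.copyOf(CONTEXT.get());
    }

    public static void main(String[] args) {
        put("message_id", "abc-123");
        // A JSON layout would merge this map into every entry logged
        // by the current thread while the key is present.
        System.out.println(copyOfContext()); // prints {message_id=abc-123}
        remove("message_id");
        System.out.println(copyOfContext()); // prints {}
    }
}
```

Because the context is thread-local, keys set while handling one request don’t leak into logs from other threads, but they do stick around on the same thread until explicitly removed, which is why cleanup matters.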

Let’s create a custom logger that will make our logging code more elegant and maintainable. The goal is to write logs that look like this:

private static final EventLogger log = EventLoggerFactory.getLogger(Level2.class);

// ...

log.info(
    "Message received",
    Map.of(
        "message_id", messageId.toString(),
        "result", result
    )
);

Here’s how we can implement this. To avoid naming conflicts with the framework’s own Logger, we’ll call our new logger EventLogger. For the sake of this article, we’ll implement just the info method and the MDC cleanup it relies on:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

import java.util.Map;

public class EventLogger {

    private static final Logger log = LoggerFactory.getLogger("EventLogger");

    private final String eventSource;

    public EventLogger(String eventSource) {
        this.eventSource = eventSource;
    }

    public void info(
        String eventName,
        Map<String, String> params
    ) {
        try {
            MDC.put("event.name", eventName);
            MDC.put("event.source", this.eventSource);
            if (params != null) {
                params.forEach((k, v) -> MDC.put("event.properties." + k, v));
            }
            log.info(eventName);
        } finally {
            cleanUpMDC(params);
        }
    }

    private void cleanUpMDC(Map<String, String> params) {
        // Remove everything we added so the keys don't leak into
        // unrelated log entries written later on the same thread.
        MDC.remove("event.name");
        MDC.remove("event.source");
        if (params != null) {
            params.keySet().forEach(k -> MDC.remove("event.properties." + k));
        }
    }
}

Now, along with the message, we can pass a map containing our parameters. The logger iterates over this map and adds each key-value pair to MDC. Let’s examine the logs produced by this code:

{
  "timestamp": "2025-03-17T21:41:28.948",
  "level": "INFO",
  "thread": "main",
  "mdc": {
    "event.name": "Message received",
    "event.properties.result": "REJECT",
    "event.properties.message_id": "a4639312-78f1-4443-84ff-d5e6017facf8",
    "event.source": "com.github.rkondratowicz.Level2"
  },
  "logger": "EventLogger",
  "message": "Message received",
  "context": "default"
}

This is a significant improvement! Each property is now its own field within the mdc object, making it easy to create sophisticated queries. However, while our logs are now well-structured, we can still enhance how our custom logger is used.

There are two main issues to address:

  1. Passing the map every time can be cumbersome and lead to bloated code
  2. Some logs might be created in multiple places, and we want to ensure consistency in both the message and the parameters
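To make the second issue concrete, here’s a small hypothetical sketch of how two call sites logging the “same” event can silently drift apart when the keys are typed by hand each time:

```java
import java.util.Map;

public class KeyDriftDemo {
    public static void main(String[] args) {
        // Two call sites that were meant to log the same event,
        // but the hand-written keys have drifted apart over time.
        // Nothing in the type system forces them to stay consistent.
        Map<String, String> siteA = Map.of("message_id", "abc", "result", "REJECT");
        Map<String, String> siteB = Map.of("messageId", "abc", "outcome", "REJECT");

        // Queries written against one set of keys silently miss the other.
        System.out.println(siteA.keySet().equals(siteB.keySet())); // prints false
    }
}
```

Both entries would land in your log store, but a query filtering on `event.properties.message_id` would only ever match half of them.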

Level 3 - Logging Records

We can address these concerns by using predefined records for our logging events. This approach will make our logging code more concise and type-safe. Here’s how the code will look:

log.info(new MessageReceived(messageId, result));

Much cleaner! Instead of creating a map and explicitly naming properties each time, we simply instantiate a record. This approach also provides compile-time safety - if you ever add a third component to an event record, every call site that hasn’t been updated will fail to compile.

Here’s the definition of the log event we’re using:

public record MessageReceived(
    String messageId,
    String result
) implements InfoEvent {
}

We’re using an empty marker interface to distinguish between info and error events, which also improves IDE support and makes it easier for LLMs to understand our code structure.
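The marker interfaces themselves are trivial. The exact hierarchy below is an assumption based on the article’s snippets: the helper’s signature suggests a common Event parent, with InfoEvent (and presumably an ErrorEvent) extending it:

```java
// Hypothetical marker-interface hierarchy; names other than InfoEvent
// are assumptions, since the article only shows that interface directly.
public class Events {

    public interface Event {}

    // Empty marker interfaces: they carry no methods, only intent,
    // letting the logger overload on info(InfoEvent) vs error(ErrorEvent).
    public interface InfoEvent extends Event {}
    public interface ErrorEvent extends Event {}

    public record MessageReceived(String messageId, String result)
        implements InfoEvent {}

    public static void main(String[] args) {
        Event e = new MessageReceived("abc-123", "REJECT");
        System.out.println(e instanceof InfoEvent); // prints true
    }
}
```

With this split, passing an ErrorEvent to `log.info(...)` becomes a compile error rather than a mislabeled log entry.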

Our custom logger implementation remains similar but with some improvements:

private final EventRecordHelper helper;

// ...

public void info(InfoEvent event) {
    String eventName = helper.getName(event.getClass());
    Map<String, String> fields = helper.getFieldsAsMap(event);
    try {
        MDC.put("event.name", eventName);
        MDC.put("event.source", this.eventSource);
        if (fields != null) {
            fields.forEach((k, v) -> MDC.put("event.properties." + k, v));
        }
        log.info(eventName);
    } finally {
        cleanUpMDC(fields);
    }
}

Here’s the helper class that uses reflection to create a map from a record:

import java.lang.reflect.RecordComponent;
import java.util.Arrays;
import java.util.Map;

import static java.util.stream.Collectors.toMap;

public class EventRecordHelper {

    public String getName(Class<?> clazz) {
        return clazz.getSimpleName();
    }

    public Map<String, String> getFieldsAsMap(Event event) {
        return Arrays
            .stream(event.getClass().getRecordComponents())
            .collect(toMap(rc -> rc.getName(), rc -> getValue(event, rc)));
    }

    private String getValue(Event event, RecordComponent rc) {
        try {
            return rc.getAccessor().invoke(event).toString();
        } catch (Exception exc) {
            // in case invoking a getter fails... (shouldn't happen on records)
            return "N/A";
        }
    }
}
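To see the reflection logic in action without any logging dependencies, here’s a self-contained sketch that inlines the same record-to-map conversion (class and record names here are made up for the demo):

```java
import java.lang.reflect.RecordComponent;
import java.util.Arrays;
import java.util.Map;

import static java.util.stream.Collectors.toMap;

public class HelperDemo {

    public record MessageReceived(String messageId, String result) {}

    // Same idea as EventRecordHelper.getFieldsAsMap: walk the record's
    // components and read each value through its generated accessor.
    public static Map<String, String> fieldsAsMap(Record event) {
        return Arrays
            .stream(event.getClass().getRecordComponents())
            .collect(toMap(RecordComponent::getName, rc -> getValue(event, rc)));
    }

    private static String getValue(Record event, RecordComponent rc) {
        try {
            return rc.getAccessor().invoke(event).toString();
        } catch (Exception exc) {
            // shouldn't happen for record accessors
            return "N/A";
        }
    }

    public static void main(String[] args) {
        var fields = fieldsAsMap(new MessageReceived("abc-123", "REJECT"));
        System.out.println(fields.get("messageId")); // prints abc-123
        System.out.println(fields.get("result"));    // prints REJECT
    }
}
```

Note that reflection has a cost per call; for hot logging paths you could cache the RecordComponent array per event class, since it never changes.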

That’s it! Our logging system is now fully type-safe and uses records for all events. I recommend organizing all your application logs in an ‘event catalog’ to maintain a single source of truth for your logging events. This approach also enables automatic documentation generation for your application’s logging system.

You can find a complete example project implementing these concepts in the structured-logging-demo repository.

In Part 2, we’ll explore how to generate documentation from our event catalog, making it easier for teams to understand and maintain their logging strategy. We’ll also look at how to integrate this logging system with popular observability tools and create dashboards from our structured logs.

Stay tuned!