TestContainers

TestContainers is a popular Java-based library that simplifies testing by providing lightweight, disposable instances of databases, message brokers, and other infrastructure services. It uses Docker containers to provide these services, making it easy to create isolated, reproducible test environments.

https://testcontainers.com/

Key Features:

  1. Docker Integration: Leverages Docker to spin up containers for testing purposes.
  2. Disposable Containers: Each container is destroyed after the test, ensuring a clean state.
  3. Wide Ecosystem Support: Supports databases (e.g., PostgreSQL, MySQL), message brokers (e.g., Kafka, RabbitMQ), and other services (e.g., Selenium, Elasticsearch).
  4. Integration with Testing Frameworks: Compatible with JUnit, TestNG, and other testing frameworks (see the sketch after this list).
  5. Network Isolation: Containers are isolated from each other and the host system, reducing interference.
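
For example, the JUnit 5 integration mentioned in item 4 is provided by the org.testcontainers:junit-jupiter module, which manages the container lifecycle declaratively. A minimal sketch, assuming that module and the PostgreSQL module are on the test classpath:

import org.junit.jupiter.api.Test;
import org.testcontainers.containers.PostgreSQLContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;

import static org.junit.jupiter.api.Assertions.assertTrue;

// @Testcontainers starts and stops every @Container field around the tests
@Testcontainers
class DeclarativeContainerTest {

    // A static field is shared by all tests in the class;
    // an instance field would give each test its own container
    @Container
    private static final PostgreSQLContainer<?> POSTGRES =
            new PostgreSQLContainer<>("postgres:16-alpine");

    @Test
    void containerIsRunning() {
        assertTrue(POSTGRES.isRunning(), "The extension should have started the container.");
    }
}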

Common Use Cases:

  1. Database Testing: Spin up database containers with schemas preloaded for integration testing.
  2. Messaging Systems: Test Kafka or RabbitMQ messaging flows.
  3. Web Application Testing: Use Selenium containers to perform browser-based UI tests.
  4. Third-Party Service Testing: Mock external services by running lightweight API servers in containers.

Advantages:

  • Eliminates the need for complex test environment setups.
  • Reduces “works on my machine” issues.
  • Enables parallel, isolated tests.

Limitations:

  • Requires Docker to be installed and running.
  • Slight overhead in starting and stopping containers.

TestContainers is a powerful tool for developers aiming to enhance the reliability and repeatability of their integration and end-to-end tests.
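
To run the examples below, the matching TestContainers modules must be on the test classpath. A sketch of the Maven dependencies, with an illustrative version (check for the current release):

<!-- Versions shown are illustrative; use the latest TestContainers release -->
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>postgresql</artifactId>
    <version>1.19.3</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>kafka</artifactId>
    <version>1.19.3</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>selenium</artifactId>
    <version>1.19.3</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>mockserver</artifactId>
    <version>1.19.3</version>
    <scope>test</scope>
</dependency>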

Code Examples

1. Database Testing: PostgreSQL

import org.junit.jupiter.api.Test;
import org.testcontainers.containers.PostgreSQLContainer;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import static org.junit.jupiter.api.Assertions.assertEquals;

public class DatabaseTest {

    @Test
    void testPostgreSQL() throws Exception {
        // Pin a specific image tag rather than "latest" for reproducible tests
        try (PostgreSQLContainer<?> postgres = new PostgreSQLContainer<>("postgres:16-alpine")) {

            postgres.start();

            Connection connection = DriverManager.getConnection(
                postgres.getJdbcUrl(), postgres.getUsername(), postgres.getPassword()
            );

            Statement statement = connection.createStatement();
            statement.execute("CREATE TABLE test (id SERIAL PRIMARY KEY, name VARCHAR(50));");
            statement.execute("INSERT INTO test (name) VALUES ('Test Name');");

            ResultSet resultSet = statement.executeQuery("SELECT name FROM test WHERE id = 1;");
            if (resultSet.next()) {
                String name = resultSet.getString("name");
                assertEquals("Test Name", name, "Name should match the inserted value.");
            }

            connection.close();
        }
    }
}

2. Messaging Systems: Kafka

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.junit.jupiter.api.Test;
import org.testcontainers.containers.KafkaContainer;
import org.testcontainers.utility.DockerImageName;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import java.util.stream.StreamSupport;

import static org.junit.jupiter.api.Assertions.assertTrue;

public class KafkaTest {

    @Test
    void testKafka() {
        try (KafkaContainer kafka = new KafkaContainer(DockerImageName.parse("confluentinc/cp-kafka:7.4.0"))) {
            kafka.start();

            String bootstrapServers = kafka.getBootstrapServers();

            // Kafka Producer
            Properties producerProps = new Properties();
            producerProps.put("bootstrap.servers", bootstrapServers);
            producerProps.put("key.serializer", StringSerializer.class.getName());
            producerProps.put("value.serializer", StringSerializer.class.getName());

            KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps);
            producer.send(new ProducerRecord<>("test-topic", "key", "value"));
            producer.close();

            // Kafka Consumer
            Properties consumerProps = new Properties();
            consumerProps.put("bootstrap.servers", bootstrapServers);
            consumerProps.put("group.id", "test-group");
            consumerProps.put("key.deserializer", StringDeserializer.class.getName());
            consumerProps.put("value.deserializer", StringDeserializer.class.getName());
            consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

            KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps);
            consumer.subscribe(Collections.singletonList("test-topic"));

            // ConsumerRecords is Iterable, so stream it via StreamSupport
            boolean messageReceived = StreamSupport.stream(consumer.poll(Duration.ofSeconds(10)).spliterator(), false)
                .anyMatch(record -> "value".equals(record.value()));

            assertTrue(messageReceived, "Kafka should receive the message with value 'value'.");

            consumer.close();
        }
    }
}

3. Web Application Testing: Selenium

import org.junit.jupiter.api.Test;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeOptions;
import org.openqa.selenium.remote.RemoteWebDriver;
import org.testcontainers.containers.BrowserWebDriverContainer;

import static org.junit.jupiter.api.Assertions.assertTrue;

public class SeleniumTest {

    @Test
    void testSelenium() {
        try (BrowserWebDriverContainer<?> chrome = new BrowserWebDriverContainer<>()
                .withCapabilities(new ChromeOptions())) {
            chrome.start();

            // The browser runs inside the container; drive it over the Selenium remote protocol
            WebDriver driver = new RemoteWebDriver(chrome.getSeleniumAddress(), new ChromeOptions());
            driver.get("https://www.google.com");

            String title = driver.getTitle();
            assertTrue(title.contains("Google"), "Page title should contain 'Google'.");

            driver.quit();
        }
    }
}

4. Third-Party Service Testing: Mock API

import org.junit.jupiter.api.Test;
import org.mockserver.client.MockServerClient;
import org.mockserver.model.HttpRequest;
import org.mockserver.model.HttpResponse;
import org.testcontainers.containers.MockServerContainer;
import org.testcontainers.utility.DockerImageName;

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpResponse.BodyHandlers;

import static org.junit.jupiter.api.Assertions.assertEquals;

public class MockServerTest {

    @Test
    void testMockServer() throws Exception {
        try (MockServerContainer mockServer =
                 new MockServerContainer(DockerImageName.parse("mockserver/mockserver:latest"))) {
            mockServer.start();

            MockServerClient client = new MockServerClient(mockServer.getHost(), mockServer.getServerPort());

            // Setup mock response
            client.when(HttpRequest.request()
                    .withMethod("GET")
                    .withPath("/api/test"))
                    .respond(HttpResponse.response()
                            .withStatusCode(200)
                            .withBody("Mock response"));

            // Call the mock server with a plain HTTP client and verify the stubbed response
            HttpClient httpClient = HttpClient.newHttpClient();
            java.net.http.HttpRequest request = java.net.http.HttpRequest.newBuilder()
                    .uri(URI.create(mockServer.getEndpoint() + "/api/test"))
                    .GET()
                    .build();
            java.net.http.HttpResponse<String> response = httpClient.send(request, BodyHandlers.ofString());

            assertEquals(200, response.statusCode(), "Status code should be 200.");
            assertEquals("Mock response", response.body(), "Response body should match the mock setup.");
        }
    }
}

Each test includes assertions that validate the expected behavior, making the examples robust and suitable for automated validation.

EmbeddedKafka

EmbeddedKafka refers to running an in-memory Kafka broker inside the test JVM. It is particularly useful for unit and integration testing of applications that use Kafka for messaging, as it eliminates the need for an external Kafka cluster. The embedded-kafka project linked below targets Scala, but the same approach is available to Java projects through the Spring Kafka test support or through utilities such as the kafka-unit library used in the example further down.

https://github.com/embeddedkafka/embedded-kafka
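
For the Spring-based projects mentioned above, the spring-kafka-test artifact offers a comparable embedded broker through the @EmbeddedKafka annotation. A minimal sketch, assuming spring-kafka-test is on the test classpath and using an illustrative topic name:

import org.junit.jupiter.api.Test;
import org.springframework.kafka.test.EmbeddedKafkaBroker;
import org.springframework.kafka.test.condition.EmbeddedKafkaCondition;
import org.springframework.kafka.test.context.EmbeddedKafka;

import static org.junit.jupiter.api.Assertions.assertFalse;

// Outside a Spring application context, @EmbeddedKafka is honored by a JUnit 5
// extension that starts an in-JVM broker before the tests run
@EmbeddedKafka(partitions = 1, topics = "spring-test-topic")
class SpringEmbeddedKafkaTest {

    @Test
    void brokerExposesBootstrapServers() {
        EmbeddedKafkaBroker broker = EmbeddedKafkaCondition.getBroker();
        // Use this value as bootstrap.servers for plain Kafka producers and consumers
        assertFalse(broker.getBrokersAsString().isEmpty(), "Broker address list should not be empty.");
    }
}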


Key Features

  1. In-Memory Kafka Broker:
    • Spins up a Kafka broker and Zookeeper instance within the same JVM.
  2. Lightweight:
    • No external dependencies or Docker required.
  3. Fast Setup:
    • Quick startup and teardown, ideal for automated test pipelines.
  4. Programmatic Control:
    • Configure topics, partitions, and brokers programmatically.
  5. Integrated with Test Frameworks:
    • Works well with JUnit, TestNG, or other testing frameworks.

Advantages

  • Speed: Faster than spinning up Docker-based containers.
  • Portability: Does not depend on external infrastructure.
  • Isolation: Tests are fully isolated, ensuring predictable behavior.
  • Ease of Use: Simplifies Kafka setup and cleanup for tests.

Common Use Cases

  1. Testing Kafka Producers: Verify that messages are being published correctly.
  2. Testing Kafka Consumers: Ensure that messages are consumed and processed as expected.
  3. Integration Testing: Validate interactions between multiple components using Kafka as a message broker.
  4. Error Handling: Test how your application handles Kafka-related errors, such as missing topics or timeouts.

Basic Example: Using EmbeddedKafka

Below is an example of an embedded Kafka broker in a Java-based test setup, using the kafka-unit library:

Maven Dependency

<dependency>
    <groupId>info.batey.kafka</groupId>
    <artifactId>kafka-unit</artifactId>
    <version>4.2</version>
    <scope>test</scope>
</dependency>

Sample Code

import info.batey.kafka.unit.KafkaUnit;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;

import java.util.Collections;
import java.util.Properties;

import static org.junit.jupiter.api.Assertions.assertEquals;

public class EmbeddedKafkaTest {

    private KafkaUnit kafkaUnit;

    @BeforeEach
    void setup() {
        kafkaUnit = new KafkaUnit(5000, 5001); // Ports for Zookeeper and Kafka
        kafkaUnit.start();
        kafkaUnit.createTopic("test-topic");
    }

    @AfterEach
    void tearDown() {
        kafkaUnit.shutdown();
    }

    @Test
    void testKafkaProducerAndConsumer() {
        String topic = "test-topic";
        String messageKey = "key";
        String messageValue = "value";

        // Kafka Producer setup
        Properties producerProps = new Properties();
        producerProps.put("bootstrap.servers", kafkaUnit.getKafkaConnect());
        producerProps.put("key.serializer", StringSerializer.class.getName());
        producerProps.put("value.serializer", StringSerializer.class.getName());

        KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps);
        producer.send(new ProducerRecord<>(topic, messageKey, messageValue));
        producer.close();

        // Kafka Consumer setup
        Properties consumerProps = new Properties();
        consumerProps.put("bootstrap.servers", kafkaUnit.getKafkaConnect());
        consumerProps.put("group.id", "test-group");
        consumerProps.put("key.deserializer", StringDeserializer.class.getName());
        consumerProps.put("value.deserializer", StringDeserializer.class.getName());
        consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps);
        consumer.subscribe(Collections.singletonList(topic));

        ConsumerRecords<String, String> records = consumer.poll(java.time.Duration.ofSeconds(10));

        assertEquals(1, records.count(), "There should be exactly one record in the topic.");
        records.forEach(record -> assertEquals(messageValue, record.value(), "Message value should match."));
        consumer.close();
    }
}

EmbeddedKafka vs. TestContainers

Here’s a detailed comparison of EmbeddedKafka and TestContainers (Kafka):

| Aspect | EmbeddedKafka | TestContainers (Kafka) |
| --- | --- | --- |
| Overview | In-memory Kafka broker for lightweight, fast testing within the same JVM. | Uses Docker to run a real Kafka broker for realistic integration testing. |
| Setup Time | Faster setup (milliseconds). | Slower setup (requires Docker container startup, typically a few seconds). |
| Resource Usage | Minimal, as it runs within the JVM. | Higher due to Docker overhead. |
| Complexity | Simple to set up and use; suitable for unit tests. | Slightly more complex due to the Docker dependency; better for integration tests. |
| Dependency | No external dependencies other than the library itself. | Requires Docker to be installed and running. |
| Realism | Simulates Kafka but does not fully replicate a real Kafka cluster. | Runs a real Kafka broker, providing a production-like environment. |
| Test Isolation | Tests are isolated but run in the same JVM, sharing memory space. | Tests are isolated at the container level, ensuring no interference between tests. |
| Supported Features | Limited support for advanced Kafka features like transactions, security, and custom configurations. | Full support for all Kafka features, including transactions, security, and configuration options. |
| Portability | Fully portable, as it runs in-memory within the JVM. | Depends on Docker’s availability, which might limit portability in environments without Docker. |
| Performance | High performance due to its lightweight nature and absence of network latency. | Slightly slower due to containerized network interactions. |
| Error Simulation | Limited ability to simulate network issues, broker crashes, etc. | Can simulate real-world issues like network partitions, broker restarts, and performance bottlenecks. |
| Use Cases | Unit tests; simple producer/consumer logic tests; quick feedback loops in CI pipelines. | Integration tests; end-to-end tests involving multiple services; simulating production environments. |
| Scalability | Not suitable for testing distributed Kafka clusters or large-scale scenarios. | Can be scaled to test distributed Kafka clusters by spinning up multiple brokers in containers. |
| Ease of Debugging | Easier to debug, as everything runs within the JVM and can be logged directly. | Debugging might involve analyzing Docker logs and container interactions. |
| CI/CD Integration | Very lightweight; integrates well with CI/CD pipelines. | Requires Docker to be available on CI/CD agents, which might add some setup complexity. |
| Realistic Testing | Less realistic; suitable for simple test scenarios. | Provides a production-like environment, ideal for realistic testing. |
| Learning Curve | Low: straightforward API and simple to use. | Moderate: requires understanding Docker and containerized environments. |

Recommendation Based on Use Case

  • Use EmbeddedKafka if:
    • You need quick, lightweight tests for producer/consumer logic.
    • Your tests are limited to single-node Kafka simulations.
    • Your CI/CD pipeline prioritizes speed and minimal resource usage.
  • Use TestContainers (Kafka) if:
    • You need realistic integration tests that mimic production environments.
    • You are testing advanced Kafka features like security, transactions, or multi-broker setups.
    • Your tests involve interactions between multiple services using Kafka.

This comparison provides a comprehensive view to help you choose the right tool for your testing requirements.