Integration Testing with Kafka in Spring Boot

Introduction
Integration testing in software aims to uncover defects that arise from the combination and interaction of different components, ensuring that they work together correctly and as expected. In Spring Boot, this means testing the integration between components of the application, such as controllers, services, repositories, and external dependencies.
Integration testing Kafka with Spring Boot, in turn, means testing the interaction between Kafka producers and consumers in a controlled environment.
Usage of this doc:
You can use this doc to perform integration testing with Apache Kafka in your Spring Boot application.
Prerequisites
You need a good understanding of Java, Spring Boot, Apache Kafka, and Maven or Gradle before going through this doc; otherwise, it is suggested that you first check their official documentation and guides.
Installation
Before writing a Kafka integration test in your project, you must complete a few steps to configure the required test dependency in your pom.xml (Maven) or build.gradle (Gradle):
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka-test</artifactId>
    <scope>test</scope>
</dependency>
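If you use Gradle instead, the equivalent test dependency looks like this (a minimal sketch, assuming the Spring Boot dependency-management plugin supplies the version):

testImplementation 'org.springframework.kafka:spring-kafka-test'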
Writing Integration Test Class
Before writing the integration test class, you should configure component scanning so that your test runs correctly when you invoke the test class on its own instead of running the full Spring Boot application. Add the @ComponentScan annotation to your KafkaIntegrationTestConfig class, as shown below:
package your_kafka_dir_package;

import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;

@Configuration
@ComponentScan("package where your Kafka components (producer and consumer) are written")
public class KafkaIntegrationTestConfig {
}
For example, if your directory structure looks like this:

com
└── app
    └── event
        └── kafka   (producer and consumer classes)

then your component scan looks like this:

@ComponentScan("com.app.event")
Now, you are ready to write your integration test. A sample skeleton is shown below:
import org.junit.jupiter.api.TestInstance;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.kafka.test.context.EmbeddedKafka;
import org.springframework.test.annotation.DirtiesContext;
import org.springframework.test.context.TestPropertySource;

@DirtiesContext
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
@EnableAutoConfiguration
@SpringBootTest(classes = KafkaIntegrationTestConfig.class)
@TestPropertySource("file:src/test/resources/kafka-integration-test-application.properties")
@EmbeddedKafka(partitions = 1, brokerProperties = { "listeners=PLAINTEXT://localhost:9092", "port=9092" })
public class KafkaIntegrationTest {
    // Test methods go here
}
You can see various annotations used above:
- @DirtiesContext: applied at the method or class level, it instructs Spring to reset the application context after the annotated test method, or all test methods within the annotated test class, have completed execution.
- @TestInstance(PER_CLASS): the test instance is created once for the entire class, so instance variables in your test class can preserve state across multiple test methods.
- @SpringBootTest: provides various attributes to configure additional aspects of the test environment, such as activating specific profiles (properties), defining custom environment variables, specifying configuration classes, and more.
- @EmbeddedKafka: sets up an embedded Kafka broker for testing. Specify the number of partitions, broker properties, and other relevant Kafka configuration as needed, as in the example above.
- @TestPropertySource: annotate your test class with it to load custom property sources for the test, such as the kafka-integration-test-application.properties file referenced above.
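For reference, the kafka-integration-test-application.properties file loaded by @TestPropertySource might contain something like the following (assumed contents; spring.kafka.app.topic is a custom property read by the test class below):

# Point the Spring Kafka client at the embedded broker started by @EmbeddedKafka
spring.kafka.bootstrap-servers=localhost:9092
# Consumer settings so the listener picks up messages produced during the test
spring.kafka.consumer.group-id=kafka-integration-test
spring.kafka.consumer.auto-offset-reset=earliest
# Custom property: the topic used by the producer and consumer under test
spring.kafka.app.topic=test-topic

With the configuration class and properties in place, the complete integration test looks like this: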
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.TestInstance;
import org.mockito.ArgumentCaptor;
import org.mockito.Captor;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.mock.mockito.SpyBean;
import org.springframework.kafka.test.context.EmbeddedKafka;
import org.springframework.test.annotation.DirtiesContext;
import org.springframework.test.context.TestPropertySource;

import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.mockito.Mockito.timeout;
import static org.mockito.Mockito.verify;

@TestInstance(TestInstance.Lifecycle.PER_CLASS)
@EnableAutoConfiguration
@SpringBootTest(classes = KafkaIntegrationTestConfig.class)
@TestPropertySource("file:src/test/resources/kafka-integration-test-application.properties")
@DirtiesContext
@EmbeddedKafka(partitions = 1, brokerProperties = { "listeners=PLAINTEXT://localhost:9092", "port=9092" })
class KafkaIntegrationTest {

    private CountDownLatch countDownLatch;

    // Spy on the real consumer bean so its interactions can be verified
    @SpyBean
    private KafkaEventConsumer consumer;

    @Autowired
    private KafkaEventProducer producer;

    @Value("${spring.kafka.app.topic}")
    private String topic;

    // Sample payload sent to the topic before the tests run
    private String message = "test message";

    @Captor
    ArgumentCaptor<String> argumentCaptor;

    @BeforeAll
    public void init() throws InterruptedException {
        producer.send(topic, message);
        // The latch is never counted down; this simply gives the consumer up to 10 seconds to process
        countDownLatch = new CountDownLatch(1);
        countDownLatch.await(10, TimeUnit.SECONDS);
    }

    @Test
    public void whenProduced_ThenConsumed() throws Exception {
        // Verify the consumer received exactly one message and capture its payload
        verify(consumer, timeout(5000).times(1)).consume(argumentCaptor.capture());
        String requestMessage = argumentCaptor.getValue();
        assertNotNull(requestMessage);
    }
}
When invoking a mocked method that you want to capture arguments from, use the capture() method of the ArgumentCaptor instance to capture the argument values.
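The KafkaEventProducer and KafkaEventConsumer referenced above are your own application components; the test only assumes that the producer exposes a send(topic, message) method and that the consumer exposes a consume(message) listener method. A minimal sketch of what they might look like in your components package (class and method names are assumptions, not prescribed by spring-kafka), each in its own file:

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

// Hypothetical producer: publishes String messages via the auto-configured KafkaTemplate
@Component
public class KafkaEventProducer {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public KafkaEventProducer(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void send(String topic, String message) {
        kafkaTemplate.send(topic, message);
    }
}

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

// Hypothetical consumer: the test spies on this bean and verifies consume() is invoked
@Component
public class KafkaEventConsumer {

    @KafkaListener(topics = "${spring.kafka.app.topic}")
    public void consume(String message) {
        // Application-specific processing of the message goes here
    }
}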
Conclusion and Takeaways:
Integration testing Kafka in a Spring Boot application is necessary to ensure the correct functioning of Kafka-based messaging systems. Still, you should carefully consider the points below:
- Every annotation mentioned above plays a crucial role in running your test setup. Ensure proper cleanup of Kafka topics and resources after each test execution to maintain a clean and isolated testing environment. Use the @DirtiesContext annotation to reset the application context if necessary.
- Integration Testing Setup: Configure any necessary dependencies, properties, or environment variables specific to the test environment using the annotations used above.
- Embedded Kafka: Utilize the spring-kafka-test library to set up an embedded Kafka broker for integration testing. This allows you to simulate a Kafka environment without needing an external Kafka cluster.
- Testing Producer: Test Kafka producers by sending messages to Kafka topics and verifying the successful delivery and processing of messages. Use a KafkaTemplate to produce messages to the topics (a minimal sketch follows this list).
- Testing Consumer: Verify that the consumer properly processes and handles the received messages according to your application logic. Use the @KafkaListener annotation or Kafka consumer APIs to consume messages and perform assertions. Ensure that the message processing logic within your Kafka consumers or listeners is working correctly.
- Testing Error Handling: Test the error-handling capabilities of your Kafka-based application. Simulate scenarios where messages fail to be processed or encounter errors. Validate that error-handling mechanisms, such as error-handling callbacks, behave correctly.
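As a sketch of the producer-side check mentioned above (an assumption-based fragment to add to the integration test class, requiring an autowired KafkaTemplate and the import org.springframework.kafka.support.SendResult), blocking on the future returned by send() confirms the embedded broker acknowledged the record:

@Autowired
private KafkaTemplate<String, String> kafkaTemplate;

@Test
public void whenSent_thenBrokerAcknowledges() throws Exception {
    // send() returns a future; get() blocks until the embedded broker acknowledges the record
    SendResult<String, String> result =
            kafkaTemplate.send(topic, "sample payload").get(10, TimeUnit.SECONDS);
    assertNotNull(result.getRecordMetadata());
}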
So, with the above guidelines, you can effectively test the integration between your Spring Boot application and Kafka, ensuring your Kafka-based messaging system's reliability, correctness, and error-handling capabilities.
Thank You!