Written by Patryk Kurczyna
Kotlin/Java Developer
Published November 27, 2023

Integration Testing Deep Dive Part I

 

1. Overview

Professionals well-versed in The Testing Pyramid recognize the pivotal role integration tests play in contemporary backend (micro)services development. In this article I will take a deep dive into one of my favourite ways of setting up an integration test suite for Java/Kotlin applications. We will explore the setup, the essential tools, best practices, and integration with the various third-party services you are likely to encounter.

As an illustrative example, the primary emphasis will be on the Spring Boot framework, but I believe many of the concepts presented in this article apply to other technologies too. Personally, I have also set up similar suites with Ktor and Docker Compose.

I have opted for Kotlin as my language of choice and the Spock framework (based on Groovy) for testing, as I am a big fan of it. However, you are free to use alternative tools aligned with your preferences. If you want to learn more about Spock, read my previous article about it.

2. Preparation

All of the code in this article can be found in this GitHub repository.

Project setup

Let’s start with the project setup and dependencies. This is the build.gradle.kts file that sets up a basic Spring Boot application, plus unit and integration test suites. We will go through the most important parts of it in detail.

Plugins

Here is the list of the plugins you need, each with a short description.
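A minimal plugins block might look like the following; the plugin versions are illustrative and should be aligned with your project:

```kotlin
plugins {
    // Kotlin support, plus the Spring plugin that opens Spring-annotated classes
    kotlin("jvm") version "1.9.20"
    kotlin("plugin.spring") version "1.9.20"
    // Spring Boot packaging and dependency management
    id("org.springframework.boot") version "3.1.5"
    id("io.spring.dependency-management") version "1.1.3"
    // Groovy support, required to compile Spock specifications
    groovy
}
```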

Integration tests task setup

Adhering to best practices, I do recommend separating the unit test suite from the integration test suite.

That is why I opted for having a separate Gradle task for integration tests. This deliberate separation extends to declaring different dependencies for each suite, fostering a high degree of decoupling.

Firstly, let’s create a new sourceSet for integration tests and help the IntelliJ IDE find it. The unit tests will live in the src/test/groovy directory and the integration tests in the src/itest/groovy directory.
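A sketch of the itest source set; by convention the Groovy plugin then looks for sources in src/itest/groovy and resources in src/itest/resources:

```kotlin
sourceSets {
    create("itest") {
        // Give the integration tests access to the compiled application classes
        compileClasspath += sourceSets.main.get().output
        runtimeClasspath += sourceSets.main.get().output
    }
}

// Let the itest configurations reuse the application dependencies,
// while test dependencies stay declared separately per suite
configurations["itestImplementation"].extendsFrom(configurations.implementation.get())
configurations["itestRuntimeOnly"].extendsFrom(configurations.runtimeOnly.get())
```

If IntelliJ does not pick the new directory up automatically, the idea plugin can additionally mark it as a test source root.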

Secondly, let’s create a Gradle task called itest which inherits from the Test task and is responsible for running the integration tests.
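The task definition might look like this (Spock 2 runs on the JUnit Platform, hence useJUnitPlatform()):

```kotlin
val itest = tasks.register<Test>("itest") {
    description = "Runs integration tests."
    group = "verification"
    testClassesDirs = sourceSets["itest"].output.classesDirs
    classpath = sourceSets["itest"].runtimeClasspath
    shouldRunAfter(tasks.test)
    useJUnitPlatform()
}

// The standard unit test task also needs the JUnit Platform for Spock 2
tasks.test {
    useJUnitPlatform()
}
```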

Now, plug the task into the standard Gradle check task.
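This makes ./gradlew check (and therefore ./gradlew build) run the integration tests as well:

```kotlin
tasks.check {
    dependsOn(itest)
}
```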

Last but not least, let’s create a helper method for adding dependencies in the itest scope only.
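A small extension function is enough; it simply delegates to the itestImplementation configuration created together with the source set:

```kotlin
fun DependencyHandler.itestImplementation(dependencyNotation: Any): Dependency? =
    add("itestImplementation", dependencyNotation)
```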

At this point, we can define all the dependencies required for the project.
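For example, the dependency section could look roughly like this (the Spock version is illustrative):

```kotlin
dependencies {
    // Application dependencies
    implementation("org.springframework.boot:spring-boot-starter-web")
    implementation("org.springframework.boot:spring-boot-starter-actuator")
    implementation("com.fasterxml.jackson.module:jackson-module-kotlin")

    // Unit tests (Spock)
    testImplementation("org.spockframework:spock-core:2.3-groovy-4.0")

    // Integration tests only
    itestImplementation("org.springframework.boot:spring-boot-starter-test")
    itestImplementation("org.spockframework:spock-spring:2.3-groovy-4.0")
}
```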

This is all we need to run our first integration test.

3. First Integration Test

Simple Spring Boot application

Let’s write a very basic Spring Boot application and our first integration test.
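A minimal application class is enough for our purposes (the class name is of course up to you):

```kotlin
import org.springframework.boot.autoconfigure.SpringBootApplication
import org.springframework.boot.runApplication

@SpringBootApplication
class SpringItApplication

fun main(args: Array<String>) {
    runApplication<SpringItApplication>(*args)
}
```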

Because spring-boot-starter-actuator is on the classpath, the application can expose management endpoints such as health. To enable our initial integration test, some configuration in the application.yml file is necessary. This setup allows the test to start the application and confirm that the health endpoint works properly.
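For example, assuming the management endpoints run on their own port, the configuration could look like this; in the itest profile the management port can then be set to 0 so a random free port is picked, which is what @LocalManagementPort relies on later:

```yaml
# src/main/resources/application.yml
management:
  server:
    port: 8081            # management endpoints on a separate port
  endpoints:
    web:
      exposure:
        include: health   # expose the health endpoint

# src/itest/resources/application-itest.yml
management:
  server:
    port: 0               # random free port during integration tests
```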

It’s typically advantageous to create a common base for all the tests, encapsulating any boilerplate code and incorporating common configurations. Here’s how the IntegrationTestBase class could be written for Spring Boot.
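A minimal sketch of such a base class, written as a Spock specification:

```groovy
import org.springframework.boot.test.context.SpringBootTest
import org.springframework.test.context.ActiveProfiles
import spock.lang.Specification

import static org.springframework.boot.test.context.SpringBootTest.WebEnvironment.RANDOM_PORT

@SpringBootTest(webEnvironment = RANDOM_PORT)
@ActiveProfiles('itest')
abstract class IntegrationTestBase extends Specification {
}
```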

We can break it down into details:

  • @SpringBootTest annotation tells Spring to run this class as an integration test, which means it will start the whole application.
  • webEnvironment = RANDOM_PORT makes sure the application runs on a random available port on our machine
  • @ActiveProfiles('itest') activates the Spring itest profile, which we will need later to provide custom configuration for our application

As you can see, the test base is simple for now; we will expand it in later sections.

First Integration Test

Let’s write our first test.
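A sketch of what it might look like, assuming the management endpoints are exposed under the default /actuator base path (the class and test names are illustrative):

```groovy
import org.springframework.boot.test.web.client.TestRestTemplate
import org.springframework.boot.test.web.server.LocalManagementPort

class HealthCheckITSpec extends IntegrationTestBase {

    @LocalManagementPort
    int managementPort

    def 'should expose a working health endpoint'() {
        given:
        def restTemplate = new TestRestTemplate()

        when:
        def response = restTemplate.getForEntity("http://localhost:$managementPort/actuator/health", Map)

        then:
        response.statusCode.value() == 200
        response.body.status == 'UP'
    }
}
```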

This concise yet powerful test case shows how easy it is to get started.

The test showcases Spock’s syntax and structure. It also uses @LocalManagementPort to inject the port on which our application exposes its management endpoints. The objective is to execute a straightforward HTTP call to one of those endpoints, the health endpoint, and assert that it responds successfully.

When we execute all of the application’s tests (both unit and integration) from IntelliJ, we get a concise and informative summary.


Furthermore, when building the project with Gradle, it becomes evident that there are two separate tasks, one for unit tests and one for integration tests.

4. Testing REST endpoints

If the application exposes REST endpoints, Spring Boot offers convenient tools to test them in the integration test context. In the block below, we have a simple REST controller featuring a single endpoint for calculating the square root of a given number.
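A sketch of such a controller, with the path matching the request used in the test below:

```kotlin
import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.PathVariable
import org.springframework.web.bind.annotation.RequestMapping
import org.springframework.web.bind.annotation.RestController

@RestController
@RequestMapping("/api/math")
class MathController {

    // GET /api/math/sqrt/9.0 -> 3.0
    @GetMapping("/sqrt/{number}")
    fun sqrt(@PathVariable number: Double): Double = kotlin.math.sqrt(number)
}
```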

To verify that it works, let’s write a test that spins up the Spring Boot application and checks whether the endpoint behaves properly.

In order to do that, you should enhance the IntegrationTestBase first.
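Building on the earlier sketch, the additions might look like this:

```groovy
// ... imports from the earlier sketch, plus:
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.boot.test.web.client.TestRestTemplate
import org.springframework.boot.test.web.server.LocalServerPort

@SpringBootTest(webEnvironment = RANDOM_PORT)
@ActiveProfiles('itest')
abstract class IntegrationTestBase extends Specification {

    @LocalServerPort
    int port

    @Autowired
    TestRestTemplate restTemplate
}
```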

  • @LocalServerPort injects the randomly selected port number on which the application is running into a variable, for subsequent use
  • @Autowired TestRestTemplate restTemplate injects Spring Boot’s TestRestTemplate utility, which can be used for making REST calls to the application.

The test can take the following form:
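For instance (the expected values follow from the controller sketch above; the class name is illustrative):

```groovy
class MathControllerITSpec extends IntegrationTestBase {

    def 'should calculate the square root of a number'() {
        when:
        def response = restTemplate.getForEntity('/api/math/sqrt/9.0', Double)

        then:
        response.statusCode.value() == 200
        response.headers.getContentType().toString().startsWith('application/json')
        response.body == 3.0d
    }
}
```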

An actual GET request to the /api/math/sqrt/9.0 endpoint is made using TestRestTemplate. Several assertions can then follow, including verification of the response status code, the response headers, or the body.

5. Database testing

One of the most common integrations in many applications is an SQL database. In this section, let’s cover the tools and techniques you can use to test this integration in a simple and effective way.

Required dependencies

We have to add some dependencies to our build.gradle.kts file:
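The versions are managed by the Spring Boot BOM, so they can be omitted:

```kotlin
dependencies {
    // JDBC support (JdbcTemplate, NamedParameterJdbcTemplate, connection pool)
    implementation("org.springframework.boot:spring-boot-starter-jdbc")
    // Database schema migrations
    implementation("org.liquibase:liquibase-core")
    // PostgreSQL JDBC driver
    runtimeOnly("org.postgresql:postgresql")
}
```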

Database schema

You should now create a database schema for the application; initially a single table named users is all we need. Liquibase, a widely used database schema change management tool, will be employed to execute the DB migration during application startup.

The sole task required is to place the migration scripts (.yml files) in the proper location; Spring Boot picks them up automatically when the liquibase-core dependency is on the classpath.

The following two files need to be created:

  • src/main/resources/db/changelog/db.changelog-master.yaml
  • src/main/resources/db/changelog/db.changelog-users.yaml

The users table has two columns:

  • id – primary key
  • name – varchar
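A sketch of the two changelog files described above (the changeset id and author values are illustrative):

```yaml
# db.changelog-master.yaml
databaseChangeLog:
  - include:
      file: db/changelog/db.changelog-users.yaml

# db.changelog-users.yaml
databaseChangeLog:
  - changeSet:
      id: create-users-table
      author: springit
      changes:
        - createTable:
            tableName: users
            columns:
              - column:
                  name: id
                  type: bigint
                  constraints:
                    primaryKey: true
              - column:
                  name: name
                  type: varchar(255)
```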

The rows can be represented by this object:
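A simple Kotlin data class is enough:

```kotlin
data class User(
    val id: Long,
    val name: String
)
```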

On the next application startup, Spring Boot will execute this migration and create the users table.

Controller and repository

Let’s implement a simple CRUD API for the application. The API will be responsible for managing user entries in the SQL database. For illustrative purposes, PostgreSQL will be used, but you can use any RDBMS of your choice.

UserRestController

The controller exposes two endpoints:

  • GET /api/users – returns the list of all users from the DB
  • POST /api/users – adds the user to the DB when it does not exist, or updates the existing entry
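Based on that list, the controller could be sketched as follows; the repository method names (findAll, upsert) are assumptions that match the repository sketch below:

```kotlin
import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.PostMapping
import org.springframework.web.bind.annotation.RequestBody
import org.springframework.web.bind.annotation.RequestMapping
import org.springframework.web.bind.annotation.RestController

@RestController
@RequestMapping("/api/users")
class UserRestController(private val userRepository: UserRepository) {

    // GET /api/users - returns the list of all users from the DB
    @GetMapping
    fun getUsers(): List<User> = userRepository.findAll()

    // POST /api/users - inserts a new user or updates an existing one
    @PostMapping
    fun saveUser(@RequestBody user: User) = userRepository.upsert(user)
}
```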

UserRepository

Thanks to spring-boot-starter-jdbc, you can use NamedParameterJdbcTemplate to make database calls using simple queries and map the results to the User object.
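A sketch of the repository, using a PostgreSQL upsert for the save operation:

```kotlin
import org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate
import org.springframework.stereotype.Repository

@Repository
class UserRepository(private val jdbcTemplate: NamedParameterJdbcTemplate) {

    fun findAll(): List<User> =
        jdbcTemplate.query("SELECT id, name FROM users") { rs, _ ->
            User(rs.getLong("id"), rs.getString("name"))
        }

    fun upsert(user: User) {
        jdbcTemplate.update(
            "INSERT INTO users (id, name) VALUES (:id, :name) " +
                "ON CONFLICT (id) DO UPDATE SET name = :name",
            mapOf("id" to user.id, "name" to user.name)
        )
    }
}
```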

Integration Test

Testing scenarios

Multiple testing scenarios can be defined for the two methods that our controller provides:

  1. List all users from the database
    • Prepare the database so the table contains some users
    • Call the GET /api/users endpoint
    • Verify that the response body contains all the users
  2. Add new user
    • Call the POST /api/users endpoint
    • Verify the response status
    • Verify if the user is added to the DB
  3. Update the user
    • Prepare the database so the table contains a user with id x
    • Call the POST /api/users endpoint with a new user name
    • Verify the response status
    • Verify if the user name in the DB is updated

Database test setup

To execute the test scenarios described above, additional setup is required.

First, we need to spin up an actual PostgreSQL database server for our application to connect to. We could set up the server locally or use a database on a development server; however, the primary goal of this setup is to stay independent of the testing infrastructure. This is why Testcontainers will be employed. It’s a library that provides lightweight, throwaway containerised instances of common services, like databases, message buses or basically anything that can run in a Docker container.

  • It requires Docker to be installed
  • It supports many JVM testing frameworks like JUnit4, JUnit5 and Spock

The setup is very straightforward. You need one more dependency (the Testcontainers Postgres module) added in the itest scope in our Gradle script:
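For example (the version is illustrative):

```kotlin
dependencies {
    // Testcontainers PostgreSQL module, integration test scope only
    itestImplementation("org.testcontainers:postgresql:1.19.1")
}
```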

In addition to that, you need to configure your application so it connects to the proper database server. This can be done by specifying the url in the application-itest.yml file. This approach allows for the separation of the application’s production configuration – where it should connect to the ‘real’ production database – from the test configuration.

We need to specify the database driver (so that the Testcontainers one is used) and the database URL; springit is the name of the database we intend to use.
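In application-itest.yml this boils down to the special Testcontainers JDBC driver and a jdbc:tc: URL (the PostgreSQL version in the URL is illustrative):

```yaml
spring:
  datasource:
    driver-class-name: org.testcontainers.jdbc.ContainerDatabaseDriver
    # format: jdbc:tc:postgresql:<image-version>:///<database-name>
    url: jdbc:tc:postgresql:15:///springit
```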

Typically, when using Testcontainers, you need to manage the lifecycle of the containers yourself: start and stop them at the right time, usually when the application starts and stops. However, the PostgreSQL Testcontainers module is a bit special and needs nothing beyond the definitions above. Setting the DB driver and a URL that points to Testcontainers Postgres is enough for Spring Boot to know that it should start the container before the application starts and shut it down after the application stops.

Additional testing tools

Considering the testing scenarios we’ve outlined, it’s evident that we need a tool to help prepare the database before the tests and validate the database state after the tests. Various approaches can be employed for this purpose. I opted to create a small utility class in Groovy that uses NamedParameterJdbcTemplate to make database calls.

You may ask why we cannot use the already existing UserRepository. We could, but I prefer not to reuse application components for test setup and verification when I run integration tests. In fact, UserRepository is itself one of the components “under test”.

First, let’s inject the jdbcTemplate in the IntegrationTestBase:
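Inside the base class this is a plain @Autowired field, plus the test client built on top of it in the setup method:

```groovy
// inside IntegrationTestBase
@Autowired
NamedParameterJdbcTemplate jdbcTemplate

DbTestClient dbTestClient

def setup() {
    dbTestClient = new DbTestClient(jdbcTemplate)
}
```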

And here’s our DbTestClient:
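A possible sketch of the client; the column defaults are illustrative:

```groovy
import org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate

class DbTestClient {

    private final NamedParameterJdbcTemplate jdbcTemplate

    DbTestClient(NamedParameterJdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate
    }

    Map getUserById(Long id) {
        def rows = jdbcTemplate.queryForList('SELECT id, name FROM users WHERE id = :id', [id: id])
        rows.isEmpty() ? null : rows.first()
    }

    void insertUser(Map args = [:]) {
        jdbcTemplate.update(
            'INSERT INTO users (id, name) VALUES (:id, :name)',
            [id: args.id, name: args.name ?: 'test-user']
        )
    }
}
```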

The details of this utility class won’t be explored, but the most important parts are the two methods it exposes:

  • getUserById(Long id) – fetches the user from the DB by its id
  • insertUser(Map args = [:]) – inserts a user into the DB; the input is a map of arguments for the columns of the table

Test implementation

Here is the implementation of the testing scenarios defined above. Spock allows adding descriptions to specific test sections, which improves readability and makes the code self-explanatory.
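A sketch of two of the scenarios (the update case follows the same pattern); RandomUtils comes from Apache Commons Lang, which is assumed to be available on the itest classpath:

```groovy
import org.apache.commons.lang3.RandomUtils
// plus an import of the application's User class

class UserRestControllerITSpec extends IntegrationTestBase {

    def 'should list all users from the database'() {
        given: 'a user exists in the DB'
        def id = RandomUtils.nextLong()
        dbTestClient.insertUser(id: id, name: 'Alice')

        when: 'the GET /api/users endpoint is called'
        def response = restTemplate.getForEntity('/api/users', List)

        then: 'the response contains that user'
        response.statusCode.value() == 200
        response.body.find { it.id == id && it.name == 'Alice' }
    }

    def 'should add a new user'() {
        given:
        def id = RandomUtils.nextLong()

        when: 'the POST /api/users endpoint is called'
        def response = restTemplate.postForEntity('/api/users', new User(id, 'Bob'), Void)

        then: 'the response is successful and the user ends up in the DB'
        response.statusCode.value() == 200
        dbTestClient.getUserById(id).name == 'Bob'
    }
}
```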

A critical point to keep in mind is that we are interacting with a real database. Its lifecycle spans the whole integration test suite run, which means that the DB tables, and more importantly their contents, are carried over from one test case to another and from one test class to another. Always be mindful of that. For instance, you should not assume that the database is empty before a test executes; it might still contain data from previous runs. That’s why it’s considered good practice to use random identifiers for your test entities. Hence, it is recommended to use RandomUtils.nextLong() every time you need to generate a user id in the tests; this way, rows will almost certainly not interfere with each other between different test runs.

6. External REST services

Another common integration point for most applications is external REST APIs. It is highly beneficial to simulate and test various scenarios for the external calls made, from the happy path to different error scenarios such as internal server errors or the service being unavailable. To achieve this, it is advisable to mock the responses from the external REST API, and there are several ways to accomplish this.

Firstly, there are services like https://smartmock.io or https://mockable.io that provide very powerful tools for mocking and stubbing HTTP requests, hosted on their own servers, so they can even be used by applications running in DEV or STAGING environments.

There is, however, a more convenient utility that we can use in the integration test context, and it’s called WireMock. It’s an open-source tool that comes in many different distributions, including libraries that allow creating mock APIs in code.

Payment Service

As an illustrative scenario, let’s test a REST service that processes payments via Stripe. Consider the following setup:
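A sketch of the service under test; the Payment model, the exception type, and the endpoint path /api/pay are assumptions kept consistent with the mocks defined later:

```kotlin
import org.springframework.stereotype.Service
import org.springframework.web.client.RestClientException
import org.springframework.web.client.RestTemplate
import java.math.BigDecimal

data class Payment(val id: String, val amount: BigDecimal, val currency: String)

class PaymentFailedException(message: String, cause: Throwable) : RuntimeException(message, cause)

@Service
class PaymentService(private val stripeRestTemplate: RestTemplate) {

    // Sends the payment to Stripe; a non-2xx response surfaces as an exception
    fun makePayment(payment: Payment) {
        try {
            stripeRestTemplate.postForEntity("/api/pay", payment, Void::class.java)
        } catch (e: RestClientException) {
            throw PaymentFailedException("Payment ${payment.id} failed", e)
        }
    }
}
```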

Required bean definition:
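The bean definition might simply pre-configure a RestTemplate with the Stripe base URL:

```kotlin
import org.springframework.beans.factory.annotation.Value
import org.springframework.boot.web.client.RestTemplateBuilder
import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration
import org.springframework.web.client.RestTemplate

@Configuration
class PaymentConfiguration {

    @Bean
    fun stripeRestTemplate(
        builder: RestTemplateBuilder,
        @Value("\${payment.url}") paymentUrl: String
    ): RestTemplate = builder.rootUri(paymentUrl).build()
}
```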

The URL for the service (payment.url) is defined in application.yml:
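For example (the value below is just a placeholder):

```yaml
payment:
  url: https://stripe.example.com
```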

Integration Test

Testing scenarios

You could define two example testing scenarios for the payment service:

  1. Successful “make payment” call
    • Mock Stripe API to return success status code (e.g. HTTP 202)
    • Call makePayment method
    • Verify that Stripe API has been called exactly once
    • Verify that NO exception has been thrown
  2. Failed “make payment” call
    • Mock Stripe API to return error status code (e.g. HTTP 500)
    • Call makePayment method
    • Verify that Stripe API has been called exactly once
    • Verify that expected exception has been thrown

Test setup

Only a single additional dependency is required for the application to use WireMock.
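For WireMock 3 the coordinates look like this (the version is illustrative); the standalone artifact can be used instead if you run into dependency clashes:

```kotlin
dependencies {
    itestImplementation("org.wiremock:wiremock:3.3.1")
}
```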

Now, it’s important to start the WireMock server before the tests are run. It can be done in the IntegrationTestBase class.
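One way to do it is a statically started server on a random free port, with stubs reset after each test:

```groovy
import com.github.tomakehurst.wiremock.WireMockServer

import static com.github.tomakehurst.wiremock.core.WireMockConfiguration.wireMockConfig

abstract class IntegrationTestBase extends Specification {

    // one WireMock server shared by the whole suite, started before the Spring context
    static WireMockServer stripeWireMock = new WireMockServer(wireMockConfig().dynamicPort())

    static {
        stripeWireMock.start()
    }

    def cleanup() {
        // remove stubs and recorded requests between tests
        stripeWireMock.resetAll()
    }

    // ... the fields and setup() shown in the previous sections ...
}
```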

The WireMock server is now up and running, but how do we “tell” the application to use it? We need to configure the payment.url property to point to the WireMock server. There are multiple ways to achieve this, but my preferred method is to use a port placeholder in application-itest.yml:
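The relevant part of application-itest.yml:

```yaml
payment:
  url: http://localhost:${wiremock.stripePort}
```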

You may ask how Spring Boot knows the value of wiremock.stripePort when it is about to start the application. We can populate this property in the IntegrationTestBase using TestPropertySourceUtils:
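A sketch of the initializer, defined as a static inner class of the base and registered via @ContextConfiguration as noted below:

```groovy
import org.springframework.context.ApplicationContextInitializer
import org.springframework.context.ConfigurableApplicationContext
import org.springframework.test.context.ContextConfiguration
import org.springframework.test.context.support.TestPropertySourceUtils

@ContextConfiguration(initializers = PropertyInitializer)
abstract class IntegrationTestBase extends Specification {

    // ... WireMock server definition from the previous snippet ...

    static class PropertyInitializer implements ApplicationContextInitializer<ConfigurableApplicationContext> {
        @Override
        void initialize(ConfigurableApplicationContext applicationContext) {
            TestPropertySourceUtils.addInlinedPropertiesToEnvironment(
                applicationContext,
                'wiremock.stripePort=' + stripeWireMock.port()
            )
        }
    }
}
```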

Note that the PropertyInitializer is picked up by Spring during context initialization thanks to the @ContextConfiguration annotation.

Now, the application is configured to use the WireMock server.

Defining mocks

To decouple WireMock internals from the integration test logic, let’s create a utility class for mocking all Stripe requests and call it StripeMock.

It offers 3 methods:

  • payRespondWithSuccess – creates a successful stub (HTTP 202 Accepted response) for the POST /api/pay call; we also assume the request body to be the Payment object serialised to JSON
  • payRespondWithFailure – creates a failure stub (HTTP 500 Internal Server Error response) for the POST /api/pay call, with the same request body assumption
  • verifyPayCalled – verifies that the call to POST /api/pay has been made exactly the given number of times (the times parameter), with the proper body
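A sketch of such a class, built on top of the WireMock Java DSL; it serialises the expected Payment with the application's ObjectMapper so the stubbed body matches what the RestTemplate sends:

```groovy
import com.fasterxml.jackson.databind.ObjectMapper
import com.github.tomakehurst.wiremock.WireMockServer

import static com.github.tomakehurst.wiremock.client.WireMock.*

class StripeMock {

    private final WireMockServer server
    private final ObjectMapper objectMapper

    StripeMock(WireMockServer server, ObjectMapper objectMapper) {
        this.server = server
        this.objectMapper = objectMapper
    }

    void payRespondWithSuccess(Payment payment) {
        server.stubFor(post(urlEqualTo('/api/pay'))
            .withRequestBody(equalToJson(objectMapper.writeValueAsString(payment)))
            .willReturn(aResponse().withStatus(202)))
    }

    void payRespondWithFailure(Payment payment) {
        server.stubFor(post(urlEqualTo('/api/pay'))
            .withRequestBody(equalToJson(objectMapper.writeValueAsString(payment)))
            .willReturn(aResponse().withStatus(500)))
    }

    void verifyPayCalled(int times, Payment payment) {
        server.verify(times, postRequestedFor(urlEqualTo('/api/pay'))
            .withRequestBody(equalToJson(objectMapper.writeValueAsString(payment))))
    }
}
```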

You can now initialise the mock class in the IntegrationTestBase, so it can be used by all child test classes.
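In the base class this is one more field, created in the same setup() method as the DbTestClient:

```groovy
// inside IntegrationTestBase
@Autowired
ObjectMapper objectMapper

StripeMock stripeMock

def setup() {
    dbTestClient = new DbTestClient(jdbcTemplate)
    stripeMock = new StripeMock(stripeWireMock, objectMapper)
}
```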

Test implementation

Having all those utilities in place makes it fairly easy to implement the integration test.
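A sketch of the two scenarios, using the class names from the earlier sketches:

```groovy
import org.springframework.beans.factory.annotation.Autowired
// plus imports of the application's PaymentService, Payment and PaymentFailedException

class PaymentServiceITSpec extends IntegrationTestBase {

    @Autowired
    PaymentService paymentService

    def 'should make a successful payment'() {
        given: 'Stripe responds with HTTP 202'
        def payment = new Payment(UUID.randomUUID().toString(), BigDecimal.TEN, 'EUR')
        stripeMock.payRespondWithSuccess(payment)

        when:
        paymentService.makePayment(payment)

        then: 'Stripe has been called exactly once and no exception has been thrown'
        stripeMock.verifyPayCalled(1, payment)
        noExceptionThrown()
    }

    def 'should fail when Stripe responds with an error'() {
        given: 'Stripe responds with HTTP 500'
        def payment = new Payment(UUID.randomUUID().toString(), BigDecimal.TEN, 'EUR')
        stripeMock.payRespondWithFailure(payment)

        when:
        paymentService.makePayment(payment)

        then: 'Stripe has been called exactly once and the expected exception has been thrown'
        stripeMock.verifyPayCalled(1, payment)
        thrown(PaymentFailedException)
    }
}
```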

7. Conclusion

In this article, we implemented a number of integration tests covering the most common external services that applications use, including databases and REST APIs.

We also looked at the exact setup that is needed to create an efficient integration tests suite for your application, using Gradle, Spock and Spring Boot.

However, this is certainly not everything that this mighty setup empowers you to do. There are many more things that can be tested in a similar fashion, such as:

  • Kafka
  • Google Cloud Storage
  • AWS S3
  • AWS SQS
  • AWS SNS
  • AWS DynamoDB
  • Email Sending
  • Elasticsearch
  • … and many more

I will cover some of them in Part II of this article, which is coming soon!

I’m not saying goodbye, stay tuned! 🙂

