Introduction to Acceptance testing
Let's be honest for a moment: testing is hard. The promise of TDD can be deceptive. In the beginning, everything works great - we add new test cases, write the code, and are happy to see green, passing tests everywhere. However, the longer a project is developed, the harder it becomes to maintain those tests. At some point, we're spending twice as much time adding new features and refactoring old ones, because we constantly need to update our test code as well.
I think the main source of this problem is that we tend to focus on the implementation details too much.
What we could do instead is focus on the system as a whole and treat its internals as a black box. This is commonly known as integration and acceptance testing.
Kotlin testing
For the purpose of this article, let's assume that we're working on a simple application for managing Electric Vehicle Charge Points. Imagine that we have a simple service capable of performing basic CRUD operations on Charge Points. We use the Spring Framework, Kotlin, and PostgreSQL as the DB.
The ideas and technologies we're showing here aren't anything new and have been around for a while now. This article describes how to use them together. Although the examples are based on Kotlin and Spring, the general principles they demonstrate are universal and can be applied to any language or tech stack.
Our goal with this architecture is to provide a pleasant developer experience (DX), so that everyone is happy to write tests, the tests themselves are easy to implement and maintain, and the code is easy to follow and understand.
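To make the later examples easier to follow, here is a simplified sketch of the domain model assumed throughout this article. The class and field names are inferred from the examples below; the real project may differ in details such as ID generation and JPA mappings.

```kotlin
import java.util.UUID
import javax.persistence.*
import org.springframework.data.repository.CrudRepository

// Assumed, simplified domain model: a Charge Point with an Address and Geocoordinates.
@Entity
class ChargePoint(
    @Id @GeneratedValue
    var id: UUID? = null,

    @OneToOne(cascade = [CascadeType.ALL])
    var address: Address? = null,

    @OneToOne(cascade = [CascadeType.ALL])
    var geocoordinates: Geocoordinates? = null,

    var comment: String? = null,

    @Enumerated(EnumType.STRING)
    var chargePointState: ChargePointState = ChargePointState.ACTIVE
)

@Entity
class Address(
    var street: String = "",
    @Id @GeneratedValue
    var id: UUID? = null
)

@Entity
class Geocoordinates(
    var latitude: Double = 0.0,
    var longitude: Double = 0.0,
    @Id @GeneratedValue
    var id: UUID? = null
)

// Only ACTIVE appears in the examples; other states are hypothetical.
enum class ChargePointState { ACTIVE, INACTIVE }

// Spring Data repository used by the service and by the tests.
interface ChargePointRepository : CrudRepository<ChargePoint, UUID>
```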
testcontainers in Acceptance tests
Before we dive into the test code, we need to explore testcontainers. For our integration tests, we have to connect to a database. Here are the available options for getting that done:
- Use an in-memory DB, like H2 - this method could cause trouble when the implementation uses DB-specific code (for example, geospatial features).
- Connect to a local DB instance - this option can be troublesome to set up, as it requires each developer (and the CI environment) to have a DB instance running locally.
- Use testcontainers - this will automatically take care of initializing the DB docker image, setting up a running DB instance, and cleaning up after the tests once they're completed.
As you can see, using testcontainers greatly simplifies the DB setup process. Please refer to the testcontainers documentation >> for full setup instructions. Here’s a simplified version:
1. Add testcontainers to build.gradle dependencies.
testImplementation("org.testcontainers:testcontainers:1.11.3") // use the same DB as your production system testImplementation("org.testcontainers:postgresql:1.11.3") testImplementation("org.testcontainers:jdbc:1.11.3")
2. Override the Spring properties for tests, providing the correct testcontainers JDBC connection URL (in application-test.properties):
```properties
spring.datasource.url=jdbc:tc:postgresql://localhost/testdb?TC_DAEMON=true
spring.datasource.driverClassName=org.testcontainers.jdbc.ContainerDatabaseDriver
```
The actual DB name is irrelevant. Adding TC_DAEMON=true keeps the container running so that it can be reused across multiple tests.
3. (Optional) To simplify things further, we'll create a base class for all the integration tests to inherit from. It takes care of selecting the correct test properties file (defined in the previous step) and provides a couple of useful helper methods. Such a class looks as follows:
```kotlin
@RunWith(SpringRunner::class)
@SpringBootTest
@AutoConfigureMockMvc
@ActiveProfiles("test")
abstract class BaseIntegrationTest {

    @Autowired
    lateinit var mockMvc: MockMvc

    val objectMapper = jacksonObjectMapper()

    fun asJson(vararg properties: Pair<String, Any>): String {
        return objectMapper.writeValueAsString(properties.toMap())
    }
}
```
We'll see how those helpers come in handy once we start writing tests.
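For instance, a concrete test class only needs to extend the base class to get mockMvc, the test profile, and the asJson helper for free. A minimal sketch (the class and test names here are assumptions, not part of the demo project):

```kotlin
import org.junit.Test
import org.springframework.beans.factory.annotation.Autowired

class ChargePointApiTest : BaseIntegrationTest() {

    // Repository used to arrange test data and to assert on persisted state.
    @Autowired
    lateinit var chargePointRepository: ChargePointRepository

    @Test
    fun `should list charge points`() {
        // Arrange, Act, Assert - see the following sections.
    }
}
```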
Acceptance tests
Let's take the usual testing mantra:
Arrange, Act, Assert
And take a look at each step in detail.
Arrange
Here's where our custom Kotlin DSL comes into play. We want to be able to set up our DB with test data quickly. However, creating our object hierarchies is sometimes very tedious and requires a lot of boilerplate code. A custom DSL helps reduce that unnecessary noise in the test code by moving the object creation responsibilities into the DSL code. Compare these two examples:
Java
```java
ChargePoint chargePoint = new ChargePoint();
Address address = new Address("Sesame street");
chargePoint.setAddress(address);
Geocoordinates geocoordinates = new Geocoordinates(10.1234, 99.54321);
chargePoint.setGeocoordinates(geocoordinates);
```
Kotlin
```kotlin
val chargePoint = chargePoint {
    address {
        street = "Sesame street"
    }
    geocoordinates {
        latitude = 10.1234
        longitude = 99.54321
    }
}
```
Not only is the Kotlin code less verbose, the DSL can do much more for us - for example, it can initialize default properties by generating random values using a library >>.
The Kotlin code is also more explicit. How would you know whether the Geocoordinates constructor parameter order is lat/long or long/lat without looking into the code or asking the IDE for help?
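The DSL itself is just a handful of builder functions. Below is a minimal sketch of how the chargePoint { } builder could be implemented against the domain model assumed earlier; the demo project's version (and its random-default generation) may differ.

```kotlin
// Top-level entry point of the hypothetical DSL.
fun chargePoint(init: ChargePointBuilder.() -> Unit): ChargePoint =
    ChargePointBuilder().apply(init).build()

class ChargePointBuilder {
    private var address = Address()
    private var geocoordinates = Geocoordinates()
    var comment: String? = null
    var chargePointState = ChargePointState.ACTIVE

    fun address(init: AddressBuilder.() -> Unit) {
        address = AddressBuilder().apply(init).build()
    }

    fun geocoordinates(init: GeocoordinatesBuilder.() -> Unit) {
        geocoordinates = GeocoordinatesBuilder().apply(init).build()
    }

    fun build() = ChargePoint(
        address = address,
        geocoordinates = geocoordinates,
        comment = comment,
        chargePointState = chargePointState
    )
}

class AddressBuilder {
    var street: String = ""
    fun build() = Address(street = street)
}

class GeocoordinatesBuilder {
    var latitude: Double = 0.0
    var longitude: Double = 0.0
    fun build() = Geocoordinates(latitude = latitude, longitude = longitude)
}
```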
Once we have the objects ready, we can easily persist them to the DB:
chargePointRepository.save(chargePoint)
Act
For the Act step, we will reproduce the app's usual use case, that is, handling an HTTP request. Inside our helper base class, we have already initialized mockMvc. Let's take a look at a simple action of retrieving Charge Points by sending a GET request:
```kotlin
val result = mockMvc.perform(get("/api/v1/charge_points"))
    .andExpect(status().isOk)
    .andReturn()
```
Notice that by using mockMvc we're emulating a real-world HTTP request, which also goes through (and validates) elements like Spring validation, authorization, JSON body parsing, etc.
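The controller handling that request isn't shown in this article; for illustration, a minimal, hypothetical endpoint that such a test would exercise could look like this:

```kotlin
import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.RequestMapping
import org.springframework.web.bind.annotation.RestController

@RestController
@RequestMapping("/api/v1/charge_points")
class ChargePointController(
    private val chargePointRepository: ChargePointRepository
) {
    // Returns all Charge Points; serialization to JSON is handled by Spring.
    @GetMapping
    fun findAll(): List<ChargePoint> = chargePointRepository.findAll().toList()
}
```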
Assert
Finally, we'd like to confirm that the HTTP response is correct. We could manually check whether the returned result includes all the correct fields, but that might mean a lot of manual assertions for complex responses. What we could validate instead is whether the response stays the same as before when we make changes inside our code. After all, the most important thing for us is to not accidentally introduce breaking changes to the API.
This is where snapshot tests come into play. Let's see an example (based on the code above):
result.response.contentAsString.matchWithSnapshot()
This single line will:
- Create a new snapshot file when it's run for the first time.
- Check if an existing snapshot is already present in the code and compare the result to see whether it is the same or not.
- If the result and the saved snapshot differ, report a test failure.
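Putting the Arrange, Act, and Assert steps together, a complete snapshot test might look like the following sketch (using the base class, DSL, and repository introduced earlier; the names are assumptions):

```kotlin
@Test
fun `should return all charge points`() {
    // Arrange: persist a Charge Point built with the DSL
    chargePointRepository.save(chargePoint {
        address { street = "street 1" }
        geocoordinates {
            latitude = 50.0646501
            longitude = 19.944544
        }
    })

    // Act: perform the HTTP request through the whole Spring stack
    val result = mockMvc.perform(get("/api/v1/charge_points"))
        .andExpect(status().isOk)
        .andReturn()

    // Assert: compare the JSON response against the stored snapshot
    result.response.contentAsString.matchWithSnapshot()
}
```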
An example snapshot could look like this:
[ { "id": "73759ba8-d9bc-4660-a51a-de69ffaac1e9", "address": { "street": "street 1", "id": "91914e4a-dda1-4a48-85a1-1b9b9233bb51" }, "geocoordinates": { "latitude": 50.0646501, "longitude": 19.944544, "id": "68366b4f-a698-459b-8bd1-045cce344c66" }, "comment": "", "chargePointState": "ACTIVE" } ]
As you can see, that's the exact JSON representation of the HTTP response, as expected. Now we can safely modify the internals of our app, and the test will validate that no breaking changes were made to the API.
However, sometimes we need to apply these changes - for example, when adding new fields. What then? We can easily update the snapshot by executing:
./gradlew updateSnapshots
Note that one should still manually validate that the update makes sense and that the API should indeed change. We can do that easily during the code review process. Remember that snapshots need to be committed to the git repository!
There's one more scenario where we need to perform a different type of validation. Let's look at creating a new Charge Point. We still have to send an HTTP request via mockMvc, but the response could simply be HTTP 201, without any data returned. A snapshot would be of no use to us here.
How do we assert that the request did what it was supposed to do? We could "invert" the process a little. We can create an expected object using our DSL, retrieve the saved entity from the DB, and compare these two.
mockMvc.perform(post("/api/v1/charge_points") .content(asJson( "latitude" to 50.0646501, "longitude" to 19.944544, "street" to "street 1" )) .contentType(MediaType.APPLICATION_JSON)) .andExpect(status().isCreated) assertThat(chargePointRepository.count()).isEqualTo(1) val expectedChargePoint = chargePoint { address { street = "street 1" } geocoordinates { latitude = 50.0646501 longitude = 19.944544 } comment = null chargePointState = ChargePointState.ACTIVE } val dbChargePoint = chargePointRepository.findAll().first() assertThat(dbChargePoint).usingRecursiveComparison() .ignoringFieldsMatchingRegexes("^id$", ".*\\.id$") // Ignore id field of all associated entities .isEqualTo(expectedChargePoint)
Note: since we're clearing the DB for each test, it's fine to call findAll().first(), as no Charge Point other than the one created during this test should exist in the DB. Using the recently introduced AssertJ functionality for comparing objects recursively, we can easily validate that the saved entity matches the expected one. Skipping the id fields is useful since they're only important for the persistence layer and will be regenerated by Hibernate automatically, so there's no point in comparing them.
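The article doesn't show how the DB is cleared between tests; one simple approach (an assumption, not necessarily what the demo project does) is a cleanup hook in the base class that wipes the relevant repositories before each test:

```kotlin
import org.junit.Before
import org.springframework.beans.factory.annotation.Autowired

abstract class BaseIntegrationTest {
    // ...setup from the earlier listing...

    @Autowired
    lateinit var chargePointRepository: ChargePointRepository

    // Runs before every test, so each test starts from an empty DB.
    @Before
    fun cleanDatabase() {
        chargePointRepository.deleteAll()
    }
}
```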
Drawbacks
Of course, there's no silver bullet when it comes to testing, and this approach has some trade-offs that you need to consider.
- Integration tests are usually way "heavier" than unit tests, meaning that the test suite will take longer to complete (having to set up the DB, Spring context, API requests, etc.), leading to longer "feedback cycles."
- It's difficult to reason about failures. Since the entire stack is tested at once, it may be tricky to pinpoint the exact place where an error occurred.
- Snapshots require maintenance.
- It's not obvious what we're asserting.
Summary
If you'd like to check out the full sample project, you can clone it from here:
https://gitlab.com/solidstudio-team/acceptance-testing-demo
We've achieved over 80% code coverage with a relatively small amount of test code. Adding new tests is easy and painless. The tests themselves are clear and concise. Note that nowhere in this article have I said that Unit Tests are bad or unnecessary. On the contrary, I believe that they're still useful and have their place - they are especially useful for testing business logic. However, I believe that they should be applied only to "units" that don't have any dependencies. Once we start introducing mocks, are the tests really validating the correct behavior?
Developing a well-defined set of integration tests that emulates the real-world scenarios performed against our app's API is a great way to ensure that the app is behaving correctly. However, remember that what worked for us here may not work equally well for everyone.