Restore database state after integration test with Docker?

We use PostgreSQL together with an EAR deployed to JBoss. During the build we take a dump of the development database, which is then used in the integration tests: a special artifact is deployed, and the tests interact with the application over an HTTP client.

Currently, the state of the database changes while the tests run, so we have to put extra effort into every test that modifies data to make sure no test depends on another. That costs a lot of time and patience, and such a test can even end up depending on the order of records.

Is there a way to take a snapshot of the database and restore it after each test run without using too many resources? Could Docker help here, or some other approach?

H2 is not an option, because we rely on some PostgreSQL-specific features. The tests can span multiple transactions, so I don't think a rollback would help either.

+5

3 answers

One easy way with Docker is to keep a snapshot of your seeded database as a volume that you set aside, and then mount it at the beginning of each test run.

You can massage this data before running your tests (either a dump of your entire original database setup, or something smaller), then run the PostgreSQL test database with the test data mounted as a volume and point PostgreSQL at it for each test.
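As a rough Java sketch of that idea, driving the docker CLI from the test harness: it assumes a pristine "golden" volume (pg-golden-data) that already holds a seeded PostgreSQL data directory, copies it into a fresh working volume, and starts the database on it. All names, the port and the postgres:15 image tag are hypothetical.

import java.util.List;

// Hypothetical helper: resets the PostgreSQL test instance from a "golden"
// data volume before a test run. Assumes the docker CLI is on the PATH and
// that pg-golden-data contains a complete, seeded PGDATA directory.
public class PostgresVolumeReset {

    public static void resetAndStart() throws Exception {
        run("docker", "rm", "-f", "pg-it");                    // drop any previous test container
        run("docker", "volume", "rm", "-f", "pg-test-data");   // throw away the dirty working volume
        run("docker", "volume", "create", "pg-test-data");     // fresh, empty working volume
        // Copy the golden data directory into the working volume with a throwaway container;
        // cp -a preserves the ownership and permissions PostgreSQL expects.
        run("docker", "run", "--rm",
            "-v", "pg-golden-data:/from:ro",
            "-v", "pg-test-data:/to",
            "alpine", "sh", "-c", "cp -a /from/. /to/");
        // Start PostgreSQL with the restored data mounted as its data directory.
        run("docker", "run", "-d", "--name", "pg-it",
            "-p", "55432:5432",
            "-v", "pg-test-data:/var/lib/postgresql/data",
            "postgres:15");
    }

    private static void run(String... cmd) throws Exception {
        Process p = new ProcessBuilder(cmd).inheritIO().start();
        if (p.waitFor() != 0) {
            throw new IllegalStateException("Command failed: " + List.of(cmd));
        }
    }
}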

+2

One option is to re-seed the data after each integration test completes. Spring offers such a feature through SqlScriptsTestExecutionListener and the @Sql annotation. I know you mentioned EAR and JBoss, so if Spring is not used, I suspect Arquillian can offer something similar.
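If Spring's test support is available, a minimal sketch could look like the following; the script /reseed.sql and the test class are hypothetical, and @Sql relies on the SqlScriptsTestExecutionListener that Spring registers by default.

import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.jdbc.Sql;
import org.springframework.test.context.jdbc.Sql.ExecutionPhase;

@SpringBootTest
class OrderIntegrationTest {

    // Re-seed the database after this test so the next test starts from a known state.
    @Test
    @Sql(scripts = "/reseed.sql", executionPhase = ExecutionPhase.AFTER_TEST_METHOD)
    void createsAnOrder() {
        // ... drive the deployed application over HTTP and assert on the result
    }
}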

In case re-seeding the database becomes an expensive operation because of the amount of data used in the integration tests, another option is to use a Docker image of the database with the data already seeded. You would start the DB-with-data container before each integration test and stop it after the test completes; in other words, you manage the container lifecycle throughout the integration testing. I wrote about how to do this a few months ago: Integration testing using Spring Boot, Postgres and Docker. It again uses Spring Boot, but the ideas carry over to other frameworks. It covers creating a Docker image with the data already imported, generating JPA entities from the existing schema, and adding support to the integration tests for starting/stopping a container before and after each test, and it could easily be extended to start more than one container per test or to run them at the same time, since each Docker container maps to an arbitrary host port.
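As a rough illustration of that per-test lifecycle (not the exact approach from the linked post), a JUnit 5 test could start and stop a pre-seeded container around each test via the docker CLI; the image name myorg/postgres-seeded and the port are hypothetical.

import java.util.List;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;

class SeededDatabaseLifecycleTest {

    private static final String CONTAINER = "it-postgres";

    @BeforeEach
    void startSeededDatabase() throws Exception {
        // Fresh container from the pre-seeded image, so every test sees the same initial data.
        run("docker", "run", "-d", "--name", CONTAINER,
            "-p", "55432:5432", "myorg/postgres-seeded:latest");
        // A real test would also wait here until PostgreSQL accepts connections.
    }

    @AfterEach
    void removeSeededDatabase() throws Exception {
        run("docker", "rm", "-f", CONTAINER);  // stop and discard the container together with its changes
    }

    private static void run(String... cmd) throws Exception {
        Process p = new ProcessBuilder(cmd).inheritIO().start();
        if (p.waitFor() != 0) {
            throw new IllegalStateException("Command failed: " + List.of(cmd));
        }
    }
}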

At this point it is a trade-off between starting each test from a known state (IMHO, the right approach), either by re-seeding the data or by starting/stopping a container for each individual test, versus running the tests in a specific order and simply re-seeding the data once before the whole set of tests.

+1

You can do this by cloning the original database for each test and dropping the newly created test database afterwards. When the source database is used as a TEMPLATE, this will probably take only milliseconds.

-- FIXTURES
-- Create "initial_database" once before all tests run, and load fixtures etc.

-- TEST SETUP (before each test)
CREATE DATABASE some_test_database_name TEMPLATE "initial_database";

-- TEST TEARDOWN (after each test)
DROP DATABASE some_test_database_name;
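A minimal JUnit/JDBC sketch of that setup and teardown, assuming "initial_database" has already been created and seeded once; the admin connection URL, credentials and the test database name are hypothetical. Note that CREATE DATABASE ... TEMPLATE requires no other open connections to the template database, and DROP DATABASE requires the test to have closed its own connections first.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;

class TemplateClonePerTest {

    // Maintenance connection; CREATE/DROP DATABASE must not target the database being cloned or dropped.
    private static final String ADMIN_URL = "jdbc:postgresql://localhost:5432/postgres";

    @BeforeEach
    void cloneTemplate() throws Exception {
        execute("CREATE DATABASE some_test_database_name TEMPLATE \"initial_database\"");
    }

    @AfterEach
    void dropClone() throws Exception {
        // All test connections to some_test_database_name must be closed before this runs.
        execute("DROP DATABASE some_test_database_name");
    }

    private void execute(String sql) throws Exception {
        // CREATE/DROP DATABASE cannot run inside a transaction block, so rely on plain auto-commit.
        try (Connection c = DriverManager.getConnection(ADMIN_URL, "postgres", "postgres");
             Statement s = c.createStatement()) {
            s.execute(sql);
        }
    }
}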

0


