Extending Unit-Testing on Icinga2

Aug 20, 2025

Unit-Testing is important

Obviously, nobody disagrees with this. It's just that during ongoing development, with the focus on features and bug fixes, testing often falls behind in priority. Especially when developers would need to write tests for existing or legacy code, teams can be hesitant to invest the time.

C++ applications have to run on a diverse set of target environments, varying in OS, compiler, C/C++ standard library and dependency versions. This is especially true for Icinga 2, which also has to support Windows (to a degree), macOS (for internal development) and several enterprise Linux distributions that ship older versions of dependencies like Boost, which Icinga 2 heavily relies on.

The initial motivation to give unit-testing another look came while working on pull request #10516, which required more in-depth use of Boost's Beast library to get more memory-efficient streaming of HTTP responses. The initial version failed to compile on several of our GitHub runners. Getting it to compile was easy, but I started wondering what other subtle differences these environments might have, especially since one of our target systems was still on the very first Boost version that introduced Beast (1.66). Manually testing each revision on each target would have been a huge chore, so I set out to add unit tests for the critical components involved in this PR.

On the current master branch we only test the fundamental base classes and a few other selected components, and so far nothing for HTTP. The likely reason nobody has added these tests yet is that, as networking components, they can't work in isolation. A lot of setup needs to happen before you can even start testing, which required the addition of several test fixtures that simplify this setup.

What are fixtures in unit-testing?

Most unit-testing frameworks rely on so-called fixtures that set up and later tear down a well-defined testing environment. They could, for example, prepare files against which to test the component, bring a database into a defined state, set up mocking objects that imitate other components in a way controlled by the test case, or allocate system resources the tested component needs to even start up.

In the case of Icinga 2 we're using Boost.Test, but for most other C++ unit-testing frameworks, like GoogleTest or doctest, the principle is the same. The fixture model is implemented as a class that sets up the test environment (or part of it) in its constructor and tears it down in its destructor.

To test our HTTP server connection class, we need at the very least an SSL connection object, which means we need to generate valid SSL certificates, which in turn means setting up a temporary directory to keep those certificates. Ultimately, we need a fixture that allows the test case to say “give me an SSL connection and throw it away when I’m done”, and that fixture can then in turn say “give me a set of valid certificates and throw them away when I’m done”, and so on.

Example

struct TlsStreamFixture {
    TlsStreamFixture()
    {
        // set up both sides of the TLS stream
    }

    ~TlsStreamFixture()
    {
        // tear down the TLS stream
    }

    Shared<AsioTlsStream>::Ptr client;
    Shared<AsioTlsStream>::Ptr server;
};
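To illustrate the chaining described above, the certificate handling can itself live in a fixture that the TLS fixture then builds on through inheritance. The following is only a minimal sketch with hypothetical names (CertificateFixture, m_CertDir, TlsStreamFixtureSketch), not the actual implementation from the PR:

// Hypothetical sketch: a fixture that provides temporary certificates
// and that other fixtures can build on via inheritance.
struct CertificateFixture {
    CertificateFixture()
    {
        // create a temporary directory and generate certificates into it
    }

    ~CertificateFixture()
    {
        // delete the certificates and the temporary directory
    }

    boost::filesystem::path m_CertDir;
};

// Base classes are constructed first and destroyed last, so the TLS
// fixture can rely on the certificates existing for its whole lifetime.
struct TlsStreamFixtureSketch : CertificateFixture {
    TlsStreamFixtureSketch()
    {
        // set up both sides of the TLS stream using m_CertDir
    }
};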

These fixtures can then be added to test suites or test cases:

BOOST_FIXTURE_TEST_SUITE(remote_httpserverconnection, TlsStreamFixture)
BOOST_AUTO_TEST_CASE(first_test){
    // Test the class using the connection from the fixture.
    // After the test is done, the fixture gets destroyed along with the connection.
}
BOOST_AUTO_TEST_CASE(second_test){
    // Another test gets a freshly constructed version of the same fixture.
}
BOOST_FIXTURE_TEST_CASE(third_test, OtherFixture){
    // If the fixture is not needed or a different one is needed, the default
    // can be overridden.
}
BOOST_AUTO_TEST_SUITE_END()

As we can see in the example above, Boost.Test allows us to group test cases into a number of nested test suites. By default, the testing framework will apply the fixture associated with a test suite to the test cases and suites below it in the hierarchy. This can be overridden as necessary, as the third test case in the example shows.
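As a small illustration of that inheritance (the suite and test names here are hypothetical):

// Hypothetical sketch: nested suites inherit the fixture declared on the
// outer suite, so nested_test still gets a fresh TlsStreamFixture.
BOOST_FIXTURE_TEST_SUITE(outer_suite, TlsStreamFixture)
BOOST_AUTO_TEST_SUITE(inner_suite)

BOOST_AUTO_TEST_CASE(nested_test){
    // Runs with a TlsStreamFixture set up by the framework.
}

BOOST_AUTO_TEST_SUITE_END()
BOOST_AUTO_TEST_SUITE_END()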

It will also set up and tear down these fixtures before and after each test case, which is generally a good thing, because it gives us a fresh state for each test case. But there are situations where this might not be desirable. For example, generating the certificates for the test connection takes about one second on every run. That is not a delay we would want to add to each of potentially dozens of test cases that otherwise take only a few milliseconds to complete. It’s also completely unnecessary, since the certificates are static and not changed by the tests once they’re created.

For situations like this, Boost.Test has so-called global fixtures, which it sets up and tears down only once for the entire test run. In theory that would allow us to keep the certificates until all test cases are completed. In our case, however, they don’t help.
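For reference, this is how such a global fixture would be registered; the BOOST_GLOBAL_FIXTURE macro is real Boost.Test, the fixture class itself is hypothetical:

// Hypothetical global fixture: constructed once before the first test
// case of the run, destroyed once after the last one.
struct GlobalCertificates {
    GlobalCertificates() { /* generate the certificates once */ }
    ~GlobalCertificates() { /* clean them up after the whole run */ }
};

BOOST_GLOBAL_FIXTURE(GlobalCertificates);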

CTest fixtures and Boost.Test fixtures

However, we don’t run our test suites directly, but via CTest, the part of the CMake build system that registers the tests and provides a unified interface for IDEs and CIs to run them, or to combine tests from different sources/frameworks under a single interface. The way CTest works is that it restarts the test executable for each test case. And obviously, Boost.Test fixtures, even global fixtures, will not survive a restart of the test executable.

To work around this limitation, CTest provides a solution in the form of its own concept of CTest fixtures. Unlike fixtures in Boost.Test, these fixtures are themselves test cases on the Boost.Test level, which CTest puts into a dependency relationship with the other tests that require a given fixture.

add_test(remote_certs_fixture/prepare_directory
  <test_executable> --run_test=remote_certs_fixture/prepare_directory
)
set_tests_properties(remote_certs_fixture/prepare_directory
  PROPERTIES FIXTURES_SETUP ssl_certs
)

add_test(remote_certs_fixture/cleanup_certs
  <test_executable> --run_test=remote_certs_fixture/cleanup_certs
)
set_tests_properties(remote_certs_fixture/cleanup_certs
  PROPERTIES FIXTURES_CLEANUP ssl_certs
)

add_test(remote_httpserverconnection/bad_request
  <test_executable> --run_test=remote_httpserverconnection/bad_request
)
set_tests_properties(remote_httpserverconnection/bad_request
  PROPERTIES FIXTURES_REQUIRED ssl_certs
)

We can see that three test cases are added, with prepare_directory acting as the setup step for the ssl_certs fixture, cleanup_certs as the cleanup step, and the actual test case remote_httpserverconnection/bad_request depending on the fixture. So even when we don’t include them in test runs explicitly, CTest will implicitly add them as dependencies.

# ctest -R remote_httpserverconnection/bad_request
Test project /home/jschmidt/Projects/build/icinga2
    Start 173: base-remote_certs_fixture/prepare_directory
1/3 Test #173: base-remote_certs_fixture/prepare_directory ....   Passed    0.03 sec
    Start 186: base-remote_httpserverconnection/bad_request
2/3 Test #186: base-remote_httpserverconnection/bad_request ...   Passed    0.68 sec
    Start 174: base-remote_certs_fixture/cleanup_certs
3/3 Test #174: base-remote_certs_fixture/cleanup_certs ........   Passed    0.03 sec

100% tests passed, 0 tests failed out of 3

Total Test time (real) =   0.74 sec

As you can see from the output of the test run above, even though we selected only a single test, CTest added the fixture steps as dependencies, with the setup running before and the cleanup running after the dependent test.

Automated test discovery with Boost.Test

Another area where the current approach in the icinga2 code base is lacking is the support for Boost.Test in our build system. To add our test targets, we rely on a custom third-party CMake module from 2014, which our developers had to fix and modify over the years to fit our needs. While it still works, its biggest disadvantage is that it requires us to list all ~200 test cases inside the CMakeLists.txt file, and now twice if they depend on or provide a CTest fixture.

Modern unit-test frameworks usually provide discovery methods that make it unnecessary to list test cases individually. For example, GoogleTest has a gtest_discover_tests() command shipped and maintained as part of CMake itself, and Catch2 and doctest each ship their own CMake module. There are third-party modules for Boost.Test that provide similar functionality, but since they made it into neither Boost nor CMake, they are now unmaintained and were never really finished.

But instead of adding such a module as third-party code to icinga2, there might be an easier way to get something like this with Boost.Test:

#include <boost/test/unit_test.hpp>
#include <boost/filesystem.hpp>
#include <cstdlib>
#include <iostream>

class CTestFileGenerator : public boost::unit_test::test_tree_visitor
{
public:
    CTestFileGenerator(boost::filesystem::path testexe, boost::filesystem::path file);

    void visit(boost::unit_test::test_case const& test) override;

    bool test_suite_start(boost::unit_test::test_suite const& suite) override;
    void test_suite_finish(boost::unit_test::test_suite const& suite) override;
};

struct CTestFileGeneratorFixture
{
    CTestFileGeneratorFixture();
};

CTestFileGeneratorFixture::CTestFileGeneratorFixture()
{
    auto& mts = boost::unit_test::framework::master_test_suite();
    int argc = mts.argc;
    for (int i = 1; i < argc; i++) {
        std::string argument(mts.argv[i]);

        if (argument == "--generate_ctest_config") {
            if (i + 1 < argc) {
                boost::filesystem::path testexe(mts.argv[0]);
                boost::filesystem::path file(mts.argv[i + 1]);

                // Walk the test tree and write the CTest configuration,
                // then exit without running any tests.
                CTestFileGenerator visitor{testexe, file};
                traverse_test_tree(mts, visitor);
                std::_Exit(EXIT_SUCCESS);
            }

            std::cerr << "Error: --generate_ctest_config specified with no argument.\n";
            std::_Exit(EXIT_FAILURE);
        }
    }
}

// Register the fixture globally so it runs right after the master test
// suite has been initialized.
BOOST_GLOBAL_FIXTURE(CTestFileGeneratorFixture);

This uses a global fixture that is initialized after the master test suite and looks for an additional command line argument, --generate_ctest_config. When this argument is present, instead of running tests, it uses the traverse_test_tree() function to pass each test case to a visitor object. This visitor object then collects the metadata for each test and writes the CTest configuration to a file. We still need a small CMake function that adds this step to the build process using CMake’s add_custom_command(), which will make CMake regenerate the config any time the test executable is rebuilt:

function(target_discover_boost_tests target)
  set(testfile "${CMAKE_CURRENT_BINARY_DIR}/${target}_tests.cmake")

  # Regenerate the CTest configuration after every build of the test target.
  add_custom_command(TARGET "${target}" POST_BUILD
    COMMAND $<TARGET_FILE:${target}> -- --generate_ctest_config "${testfile}"
  )

  # Make CTest include the generated file when it collects the tests.
  set_property(DIRECTORY
    APPEND PROPERTY TEST_INCLUDE_FILES "${testfile}"
  )
endfunction()
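To sketch what the visitor on the C++ side might do: visit() could emit one add_test() entry per test case, while test_suite_start() and test_suite_finish() maintain the current suite path. The m_Out, m_Path and m_TestExe members here are hypothetical, not the actual implementation from the PR:

// Hypothetical sketch of the generator emitting one CTest entry per test.
void CTestFileGenerator::visit(boost::unit_test::test_case const& test)
{
    // m_Path holds the names of the enclosing suites,
    // e.g. "remote_httpserverconnection".
    std::string name = m_Path + "/" + test.p_name.get();

    m_Out << "add_test(" << name << "\n"
          << "  \"" << m_TestExe.string() << "\" --run_test=" << name << "\n"
          << ")\n";
}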

Discovering CTest properties and labels

In addition to automatically discovering test cases, this approach also allows us to discover CTest fixtures on the Boost.Test level, by adding a custom decorator to test cases and suites:

class CTestProperties : public boost::unit_test::decorator::base
{
public:
    explicit CTestProperties(std::string props) : m_Props(std::move(props)) {}

    std::string m_Props;

private:
    // The decorator only carries metadata for the generator to read,
    // so applying it to a test unit is a no-op.
    void apply(boost::unit_test::test_unit&) override {}

    [[nodiscard]] boost::unit_test::decorator::base_ptr clone() const override
    {
        return boost::unit_test::decorator::base_ptr(new CTestProperties(m_Props));
    }
};

BOOST_FIXTURE_TEST_SUITE(remote_httpserverconnection, HttpServerConnectionFixture,
    *CTestProperties("FIXTURES_REQUIRED ssl_certs")
    *boost::unit_test::label("http"))

The generator will add the CTest properties from these decorators, along with the labels from the label decorators, to the generated CTest configuration. We could easily extend this to specify all the metadata that CTest supports right where we define the test cases. For more details, see the PR.
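The same works on individual test cases, for example (a hypothetical test case, using the standard Boost.Test decorator syntax; TIMEOUT is a regular CTest property):

// Hypothetical example: giving a single test case its own CTest property
// and label in addition to what the surrounding suite declares.
BOOST_AUTO_TEST_CASE(request_timeout,
    *CTestProperties("TIMEOUT 30")
    *boost::unit_test::label("slow"))
{
    // ...
}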

Testing more components

At the moment these fixtures will only be used to test the HttpServerConnection component and the new HTTP message classes. But implementing fixtures for certificates and TLS connections, along with a fixture to capture and verify log messages, was only a start: it gives us the tools to extend testing to other similar components. In the near future, we’re planning to add tests for JsonRpcConnection, which handles connections between Icinga endpoints, the ApiListener, which establishes these connections, and the individual HttpHandlers representing the API endpoints. The ultimate goal is to extend test coverage step by step to all components we can feasibly test in isolation.
