Automatically Testing Icinga DB

Feb 17, 2022

In today’s blog post, I want to present something related to Icinga DB that you most likely will not come in touch with as a regular user: some of the test cases we built for Icinga DB and the tooling created to support them.

The Challenge

Even though Icinga DB is a new project written in the Go programming language, it still requires a considerable amount of supporting code in the Icinga 2 C++ project. The largest file, icingadb-objects.cpp, contains almost 3,000 lines of code. We want to test both components, but testing them individually is hard. Testing just the C++ code would mean verifying huge amounts of data in Redis which in many cases can't simply be compared verbatim, as it contains timestamps that depend on timing within the Icinga 2 process. On the other hand, testing the Go code would require complex mocking of Icinga 2, simply because Icinga 2 itself is already quite complex.

Our Approach

We decided to test what we really care about in the end: if we put something into Icinga 2, the corresponding changes should appear in the database managed by Icinga DB. For example, if we create a host, it should show up in MySQL so that the web interface can display it properly. Of course, this means that in order to perform such tests, we need running and properly configured instances of Icinga 2, Icinga DB, Redis and MySQL.

This is where the new icinga-testing project comes into play. It is intended to be used in conjunction with the Go testing package and provides helpers for dynamically starting and configuring Docker containers with the individual components as required by the test cases. At the moment, this is implemented for the four components mentioned above.
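The shared `it` value used in the example below is created once per test binary, typically in TestMain. Here is a minimal sketch of how that setup could look; the constructor and type names are assumptions based on the library's conventions, not verified API:

package icingadb_test

import (
	"testing"

	icingatesting "github.com/icinga/icinga-testing"
)

// Shared entry point for all test cases in this binary. Type and constructor
// names here are assumptions, not verified icinga-testing API.
var it *icingatesting.IT

func TestMain(m *testing.M) {
	it = icingatesting.NewIT() // containers are only started on demand by individual tests
	defer it.Cleanup()         // tear down any containers the tests started
	m.Run()                    // since Go 1.15, the exit code is handled when TestMain returns
}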

Test cases can now be written in Go just as for any other Go project, and firing up the required components takes only a few lines of code, as in the following example:

func TestExample(t *testing.T) {
    // `it` is the entry point of icinga-testing and is shared between all test cases.
    redis := it.RedisServerT(t)
    mysql := it.MysqlDatabaseT(t)
    icinga2 := it.Icinga2NodeT(t, "master")
    icinga2.EnableIcingaDb(redis)
    icinga2.Reload()
    it.IcingaDbInstanceT(t, redis, mysql)

    // At this point, there is a fully functional Icinga 2 + Icinga DB setup ready for testing.
    // The objects provide additional helper functions abstracting away addresses and authentication:

    // Connecting to MySQL:
    mysqlConn, err := mysql.Open()
    if err != nil {
        t.Fatal(err)
    }

    // Connecting to Redis:
    redisConn := redis.Open()

    // Connecting to the Icinga 2 API:
    apiClient := icinga2.ApiClient()

    // A test case could now be implemented that submits a check result using `apiClient` and then
    // verifies with `redisConn` that it appears in Redis and with `mysqlConn` that it appears in
    // MySQL (see the sketch below).
}
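To make that last comment concrete, here is a rough sketch of what such a check could look like. The API endpoint and the Icinga DB MySQL schema are real, but the host name, credentials, address and timeouts are placeholders, and this is not actual icinga-testing code:

package icingadb_test

import (
	"crypto/tls"
	"database/sql"
	"net/http"
	"strings"
	"testing"
	"time"
)

// verifyCheckResultSynced submits a passive check result through the Icinga 2 API
// and polls MySQL until Icinga DB has written the resulting state. All concrete
// values below (host name, credentials, address, timeouts) are illustrative.
func verifyCheckResultSynced(t *testing.T, db *sql.DB) {
	body := `{"type": "Host", "filter": "host.name==\"example-host\"",
		"exit_status": 2, "plugin_output": "CRITICAL - injected by test"}`
	req, err := http.NewRequest("POST",
		"https://127.0.0.1:5665/v1/actions/process-check-result",
		strings.NewReader(body))
	if err != nil {
		t.Fatal(err)
	}
	req.Header.Set("Accept", "application/json")
	req.SetBasicAuth("root", "icinga") // placeholder API credentials

	// The test setup uses a self-signed certificate, so skip verification here.
	client := &http.Client{Transport: &http.Transport{
		TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
	}}
	resp, err := client.Do(req)
	if err != nil {
		t.Fatal(err)
	}
	resp.Body.Close()

	// Icinga DB writes asynchronously, so poll until the state shows up or we give up.
	deadline := time.Now().Add(20 * time.Second)
	for time.Now().Before(deadline) {
		var state int
		err := db.QueryRow(
			`SELECT s.hard_state FROM host_state s
			 JOIN host h ON h.id = s.host_id
			 WHERE h.name = ?`, "example-host").Scan(&state)
		if err == nil && state == 2 {
			return // the check result made it all the way into MySQL
		}
		time.Sleep(time.Second)
	}
	t.Fatal("check result did not appear in MySQL in time")
}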

Using this, a few test cases exist by now; the most notable are the following two:

  • Object Sync Test: Generates a config file for Icinga 2 containing all kinds of objects with different variations of attributes set (a sketch of this idea follows the list). In addition, it applies runtime configuration changes using the API and checks that everything is properly written to the SQL database.
  • History Sync Test: Uses the Icinga 2 API to perform all the actions that end up in the history tab in Icinga Web 2 (state changes, downtime events and so on) and checks that the appropriate history events are written to the database. Additionally, it checks that in a high-availability Icinga 2 setup, both nodes generate a consistent view of the history.
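As a rough illustration of how such attribute variations can be generated, here is a minimal sketch. The function name and the variants are made up for this post; the real test covers many more object types and attributes:

package icingadb_test

import (
	"fmt"
	"io"
)

// renderHostConfig renders host objects with different attribute combinations
// into Icinga 2 configuration syntax. The variants below are made up.
func renderHostConfig(w io.Writer) {
	variants := []struct {
		name  string
		attrs map[string]string
	}{
		{"host-minimal", nil},
		{"host-notes", map[string]string{"notes": `"some notes"`}},
		{"host-display-name", map[string]string{"display_name": `"Some Display Name"`}},
	}
	for _, v := range variants {
		fmt.Fprintf(w, "object Host %q {\n  check_command = \"dummy\"\n", v.name)
		for attr, value := range v.attrs {
			fmt.Fprintf(w, "  %s = %s\n", attr, value)
		}
		fmt.Fprintln(w, "}")
	}
}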

Conclusion

All in all, these tests have proven quite effective, allowing us to write test cases at a high level while automatically exercising all components involved with good coverage. They do have the disadvantage that when they fail, they don't point to the specific component causing the failure, so some manual work is needed to isolate and diagnose the problem; with some experience, this turned out to be quite manageable. Having a single set of test cases for both components also prevents the tests for the two components from drifting out of sync. Additionally, not specifying the exact format of the data in Redis allowed us to make changes there without having to heavily rework the test cases each time.

These tests have already prevented the introduction of several bugs that otherwise would probably have cost quite a bit of debugging time. But like all tests, they aren't perfect, and there will always be situations you forgot to consider. So I want to finish with my favorite reported bug that was not caught by any of our tests: under some circumstances, the Icinga DB process crashed because it was unable to get a new Redis connection from the connection pool in time. Internally, Icinga DB uses a few connections just to wait for new data written to Redis by the Icinga 2 process, and our Redis client library sizes its connection pool based on the number of available CPU cores. The resulting pool turned out to be too small on a machine with just a single core.
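For context: go-redis, the Redis client library used by Icinga DB, defaults its pool size to 10 connections per CPU core, so a single-core machine gets just 10. The following sketch shows how an explicit pool size can be set; the address and value are made up, and this is not necessarily the fix that went into Icinga DB:

package icingadb_test

import "github.com/go-redis/redis/v8"

// newRedisClient pins an explicit pool size instead of relying on the CPU-based
// default. With only one core, the default pool of 10 connections can be
// exhausted by the connections that block waiting for new data, so further
// requests time out. The value 32 is purely illustrative.
func newRedisClient(addr string) *redis.Client {
	return redis.NewClient(&redis.Options{
		Addr:     addr,
		PoolSize: 32,
	})
}

This shows that external feedback is always valuable, so if you haven't tried Icinga DB 1.0.0 RC2 yet, feel free to do so.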
