Thoughts on Unit Testing

Introduction

These ideas stem from recent work to implement new Juju features; work that I have felt was somewhat hindered in velocity due to working with (against) some of the incumbent Juju test suites.

The current state of unit tests in Juju reflects a historic preference for testing behaviour over implementation.

I will lay out where I think we are now, and propose some notions on where we should aim to be and how we might get there.

The Juju team has formidable engineering talent, so I don’t presume to be revealing anything unknown here; more to be pro-active about striving for the betterment of the project.

Current State of Play

Most of our testing, and the default mode of testing, is “black box”: test suites reside in packages outside of the ones we are testing and are generally named “{package}_test”.

This means that, by default, the API surface we are allowed to manipulate in tests is the set of methods/types/variables exported by the package being tested.

This has the following externalities:

  1. We test in a “loose” loop when writing code, tending to modify large swathes of logic before working out how to test them at arm’s length in the external test suites.
  2. In turn we write less testable code. We see things like:
    a. Methods that are too long.
    b. Instantiation of service dependencies inside logic that should know nothing about such specifics.
    c. Logic with concrete dependencies instead of indirections.
  3. Wrangling with the express purpose of making code testable (see the sketch after this list), such as:
    a. “export_test” files inside the tested package to expose otherwise package-private logic.
    b. Package-level variables that exist for the purpose of being “patched-out” by the test suite.
    c. Esoteric test composition based on the specific wrangling required to test.
  4. Due to (3), intermittent test failures in CI based on hard-to-reason-about test set-ups.
  5. Development is slower than it might be due to the burdens imposed by working with test logic.
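
To make 3b concrete, here is a minimal, hypothetical sketch of the pattern (the names are invented, not real Juju code): a package-level variable whose only reason to exist is to be patched by the external test suite, plus an export_test shim to reach it.

// worker.go (hypothetical)
package worker

import "time"

// timeNow is indirected purely so the external test suite can patch it out.
var timeNow = time.Now

func nextDeadline() time.Time {
    return timeNow().Add(30 * time.Second)
}

// export_test.go (hypothetical)
package worker

// PatchTimeNow exposes the variable to the external "worker_test" package.
func PatchTimeNow(f func() time.Time) {
    timeNow = f
}

With a package-local test suite, both the variable and the shim can usually be dropped in favour of injecting a clock through the constructor.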

A Better Approach

While nothing can replace integration tests, unit tests are not intended to meet quite the same need. The former verify that the end product correctly implements documented behaviour. The latter are more of an aid to development itself, in particular:

  1. To enable testing in a tight loop so that logic emerges as small, well tested, verifiably correct modules.
  2. To produce decoupled code so that modifications are easier/faster to execute.
  3. From (1) and (2) to produce testable logic.

I am not a fan of the dogmatic TDD approach, but my preference is to test in a tighter loop than I currently tend to do when working on Juju.

What I think we should work towards is unit testing inside the tested package by default, with well-chosen integration tests in the external packages; a minimal sketch of the distinction follows the list below. This would manifest in:

  1. Better, faster tests.
  2. Smaller, modular code.
  3. Cleaner logic that is not polluted for exposure to external test suites.
  4. Clean lines of indirection and adherence to the Dependency Inversion Principle (the “D” in SOLID).
  5. Better behaved CI infrastructure.
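
For concreteness, a minimal, hypothetical sketch of the distinction (invented names, not real Juju code): a same-package test file declares the package under test and can exercise unexported logic directly, while external “{package}_test” files are reserved for integration tests.

// status.go (hypothetical)
package status

// severity is unexported; an external test suite could not call it directly.
func severity(code int) string {
    if code >= 500 {
        return "error"
    }
    return "ok"
}

// status_internal_test.go (hypothetical): declares "package status", not "package status_test".
package status

import "testing"

func TestSeverity(t *testing.T) {
    if got := severity(503); got != "error" {
        t.Fatalf("severity(503) = %q, want %q", got, "error")
    }
}

go test runs both styles side by side; the only difference is which package the test file declares, and therefore what it can see.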

Getting There

In a perfect world, with infinite time we would:

  1. Create a same-package test suite for all packages.
  2. Rename the external test files to “{package}_integration_test”.
  3. Delete all export_test files and move tests relying on them into the new unit test suites.
  4. Delete integration tests that flex large swathes of logic repeatedly to poke inner units and rewrite those as local unit tests.

But we can’t turn the Titanic on a dime, so these are some things I think would make things cleaner in the short term:

  1. Write new tests as real unit tests, reducing integration tests over time to a well chosen set.
  2. Write test suites in the “package_test” files and keep them local to the package.
  3. No new “export_test” additions.
  4. Move patching, “export_test” helpers and other test-specific logic inside the package-local suites, gradually deleting the “export_test” files.
  5. Use these test suites for both unit and integration tests, so that “package_test” is the sole location for test composition logic.
  6. Use more factory patterns in the dependency engine/manifolds to reduce concrete service instantiation (a rough sketch follows this list).
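
As a rough, hypothetical sketch of point 6 (none of these names are real Juju types), the manifold would supply a factory so the worker only ever sees a narrow interface, and a package-local test can hand in a fake without any patching:

// worker.go (hypothetical)
package pruner

// Facade is the narrow, point-of-use interface the worker actually needs.
type Facade interface {
    Prune(maxLogs int) error
}

// NewFacadeFunc is the factory injected by the manifold, or by a test.
type NewFacadeFunc func() (Facade, error)

// Config carries the worker's dependencies.
type Config struct {
    NewFacade NewFacadeFunc
}

// Worker holds only the abstraction, never a concrete API client.
type Worker struct {
    facade Facade
}

// NewWorker builds its collaborator via the injected factory, so it knows
// nothing about how the concrete facade is constructed.
func NewWorker(cfg Config) (*Worker, error) {
    facade, err := cfg.NewFacade()
    if err != nil {
        return nil, err
    }
    return &Worker{facade: facade}, nil
}

A test inside the package can then pass Config{NewFacade: func() (Facade, error) { return fakeFacade{}, nil }} and never touch a real API connection.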

I would love to hear others’ thoughts on this.

3 Likes

This feels like a good direction to go in.

Would love a re-architecture of our testing infrastructure to work towards a ‘one true way’ of testing, rather than the sort-of-deprecated-but-not-replaced jellybean jar of old techniques.

Sidenote: I am keen on introducing property-based testing in places where it fits. It was first implemented as QuickCheck (in Haskell) and exists in Go (sort of) as "testing/quick". I’m sure there is a more complete Go implementation, but I haven’t looked.
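
For a flavour of what "testing/quick" gives you out of the box, here is a toy property (not Juju code; the names are made up) asserted over randomly generated inputs:

package prop_test

import (
    "strings"
    "testing"
    "testing/quick"
)

// TestSplitJoinRoundTrip checks a property rather than specific cases:
// joining two strings with "/" and splitting again returns the originals,
// provided neither input contains the separator.
func TestSplitJoinRoundTrip(t *testing.T) {
    property := func(a, b string) bool {
        if strings.Contains(a, "/") || strings.Contains(b, "/") {
            return true // outside the property's precondition; trivially pass
        }
        parts := strings.SplitN(a+"/"+b, "/", 2)
        return parts[0] == a && parts[1] == b
    }
    if err := quick.Check(property, nil); err != nil {
        t.Error(err)
    }
}

quick.Check generates random arguments for the function and reports the first failing input it finds, though (as noted below) it makes no attempt to shrink that input to a minimal counterexample.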

2 Likes

You might be interested in how we test the OpenStack charms! Generally speaking, a charm has unit tests in-tree that validate (with lots of mocking) that templates are rendered as expected and things are called as expected.

We then use Zaza as a test framework to allow us to write functional tests. The OpenStack charms share all of the test code, as we generally want to test that a deployed cloud “works” regardless of which specific piece is being changed.

In addition, Zaza supports in-tree functional tests, as @gnuoy explored. This also works for adding additional testing via shared Python libraries, as long as the library is installed into the virtualenv!

We have gotten a lot of experience with these types of tests in both our classic charms and the newer reactive charms, and are constantly working to make our (testing) lives easier!

1 Like

The problem the "testing/quick" package fails to solve is reducing the failing case it has identified, so that you can see at what point it breaks (the length of a string, numbers that are too large, or the wrong unicode characters, to name just a few). I made shrink to solve that problem; having worked on a large code base with property testing, it’s helpful to know what input broke the property and to have the tool surmise that for you.

3 Likes

That’s really cool! Do you think that property-based testing has a future in Juju?

I can’t see why not; it’s just a case of creating a proof of concept, showing others, and getting their reaction.

I’d like to take this idea of {package}_integration_test one step further and use the Go compiler to our advantage. In order to get better and quicker feedback loops for the tests we write, we should also add // +build integration to the headers of the {package}_integration_test files. We could then use the -tags option to choose whether or not to include integration tests in a given run. Obviously CI would always run with go test -tags "integration" (this would run both unit and integration tests).
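
A rough sketch of what such a file might look like (hypothetical file and test names):

// +build integration

package provider_test

import "testing"

// TestProvisionRoundTrip would exercise a real backend; it only compiles
// and runs when the "integration" tag is supplied:
//
//   go test ./...                      unit tests only
//   go test -tags "integration" ./...  unit and integration tests
func TestProvisionRoundTrip(t *testing.T) {
    t.Skip("sketch only")
}

Without the tag the file is excluded from compilation entirely, so the plain go test run stays fast.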

There are advantages to running without integration tags:

  1. It’s easier to reason about the code and tests at our level of abstraction. I don’t need to know what’s above or below our level. Abstracting our unit of code then means we can change the underlying concrete type, be it a mock or the real thing, for better encapsulation.
  2. Testing becomes faster, as we get a much quicker turnaround (mocks vs. integration).
  3. Tests targeting the code can be a lot more concise, and we can use code coverage to verify this (see point 2).

We can then back up our unit-of-work testing with a full integration test to make sure all the pieces align, and then have an acceptance test for the full e2e QA, which is run often.

1 Like

Let’s see if I can land this: https://github.com/juju/juju/pull/10554/files#diff-f9d397babd6957a20d88e8c3242c3fceR34

I’ve been implementing cmr-migrations in description and the import/export of migrations within the state package. Unfortunately, most if not all tests in the state package are a hybrid of integration and end-to-end tests. They don’t actually test at the unit-of-work level that is required for unit tests.

Coming back fresh from the Paris sprint, where we talked about implementing unit tests at all levels, I thought this would be an ideal opportunity to get it working in the state package. I’ve had to move some files to make this possible and introduce point-of-use interfaces that allow for the correct level of unit testing. The great thing about doing it this way is that we can now achieve 100% coverage of the unit, and we can verify that we did just that with a go tool party trick.

Running the following in the state package root:

go test -v -coverprofile=coverage.out . -check.v -check.f=MigrationImportSuite

This outputs a coverage report. Considering we’re just testing the MigrationImportSuite, the overall coverage results are going to be really low, but we can safely ignore that. What we can do next is view the coverage in HTML form, by doing:

go tool cover -html=coverage.out

This pops open a browser and we can navigate to the file in question and see if it’s correct or not.

The results are as follows:

2 Likes

More information in the same vein

1 Like