Systems testing is about verifying the important properties of your software in various realistic environments. The goal is to gain confidence that properties such as performance and correctness will hold once the software is finally deployed to production.
Some properties, such as the performance profile of your software, can only be verified in an environment that matches production. If your software runs on tens of thousands of customer sites, the number of possible environments is effectively unlimited, which presents further challenges. On top of understanding performance profiles across a given set of environments, there can also be many types of workload, each exhibiting a different profile.
At RabbitMQ we use systems testing to gain confidence in new alpha/beta releases and to catch regressions, but also to make data-driven decisions about experimental optimizations, choose the best defaults, and recommend the best configurations to customers for particular hardware setups.
In this talk Jack goes through some of the strategies, tooling, practices, and lessons learned from building out the systems testing capability at RabbitMQ, finishing with a look at future tooling and testing in Kubernetes.