It’s a shame that unit testing of databases is, relatively speaking, the poor cousin of application unit testing. After all, unit tests bring the same benefits of confidence in the code and ease of refactoring, and ultimately save money by ensuring that the (inevitable) bugs in an application and its database are caught earlier in the product lifecycle.
Whilst unit testing is a key practice in Agile software development, there are too many projects out there where the existing codebase is perceived to be too large to retrofit with unit tests. A wholesale retrofit isn’t the only way to tackle an existing codebase: I’d recommend starting with bug fixes, writing a test for each fix, and building up coverage from there.
I can foresee that an idea common in microservice provision, where a development team “adopts” and retains responsibility for a software product for its whole lifecycle (from initial concept, through rollout, upgrades and support, to final retirement), will be what finally convinces database developers that it isn’t worth risking buggy code, or the chance of breaking something else when the inevitable support call does come. Merging the traditional development and operations functions into one service-oriented team makes it in one’s own interest to minimise the total effort spent on issues, and encourages a few other best practices along the way, such as good documentation and minimising “just in case” code.
As a strong proponent of data integrity, I see unit testing as part of the armoury of tools necessary to develop and maintain a system that captures information in a robust format and ensures that the data retains its integrity throughout its lifetime.
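To make the idea concrete, here is a minimal sketch of what a database unit test for integrity rules can look like. The schema, table name, and constraints are purely illustrative (not from any particular project), and I’ve used Python’s built-in sqlite3 module so the example is self-contained; in practice you would target your own database engine with a framework such as tSQLt or pgTAP.

```python
import sqlite3

# Hypothetical schema: the table, columns, and CHECK constraints
# are illustrative only.
SCHEMA = """
CREATE TABLE orders (
    id       INTEGER PRIMARY KEY,
    quantity INTEGER NOT NULL CHECK (quantity > 0),
    status   TEXT NOT NULL CHECK (status IN ('open', 'shipped', 'cancelled'))
);
"""

def test_rejects_invalid_rows():
    """The database itself must refuse data that breaks its integrity rules."""
    conn = sqlite3.connect(":memory:")  # fresh, isolated database per test
    conn.executescript(SCHEMA)

    # A valid row is accepted.
    conn.execute("INSERT INTO orders (quantity, status) VALUES (?, ?)",
                 (3, "open"))

    # A zero quantity must be rejected by the CHECK constraint.
    try:
        conn.execute("INSERT INTO orders (quantity, status) VALUES (?, ?)",
                     (0, "open"))
        raise AssertionError("quantity constraint failed to fire")
    except sqlite3.IntegrityError:
        pass  # expected: integrity preserved

    # An unknown status must likewise be rejected.
    try:
        conn.execute("INSERT INTO orders (quantity, status) VALUES (?, ?)",
                     (1, "lost"))
        raise AssertionError("status constraint failed to fire")
    except sqlite3.IntegrityError:
        pass  # expected

    conn.close()

test_rejects_invalid_rows()
```

The point is that each test builds a fresh, throwaway database, so it documents and enforces the integrity rules without depending on shared state, which is exactly the property that makes later refactoring safe.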
If you want to find out more about how to justify unit testing for databases, I’ve recently published a course on this very subject through Pluralsight. Whilst Pluralsight isn’t free to access, a free trial is available, so set some time aside and watch it to help you build a business case for adopting this best practice.