I wonder if your git-like DB could solve another, equally important problem space that you didn’t mention in your blog post: unit testing database apps.
Unit testing is great, but when you have a database app, the cost-benefit trade-off starts to change.
Here is how we (and I think most people) currently do it: we create and populate a DB with “test data” in the unit test’s setup method (from a JSON or Excel file, for example), and check that JSON/Excel test data into git as part of the unit tests. But this is a pain to maintain, and it won’t be as useful as testing against real data. And as the “test data” gets larger it’s really slow too — importing a 100MB data file into mysql takes forever.
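A minimal sketch of that pattern, with stdlib sqlite3 standing in for mysql and the JSON payload inlined rather than read from a checked-in file (the table name and data are made up for illustration):

```python
# The "populate a DB in setUp" pattern described above.
# sqlite3 is a stand-in for MySQL; in practice the JSON would be a
# checked-in fixture file, not an inline string.
import json
import sqlite3
import unittest

TEST_DATA = json.loads('[{"id": 1, "name": "alice"}, {"id": 2, "name": "bob"}]')

class UserQueryTest(unittest.TestCase):
    def setUp(self):
        # Every test rebuilds the schema and reloads the fixture --
        # cheap here, painfully slow with a 100MB file and real MySQL.
        self.db = sqlite3.connect(":memory:")
        self.db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
        self.db.executemany(
            "INSERT INTO users (id, name) VALUES (:id, :name)", TEST_DATA
        )

    def tearDown(self):
        self.db.close()

    def test_user_count(self):
        (count,) = self.db.execute("SELECT COUNT(*) FROM users").fetchone()
        self.assertEqual(count, 2)
```

The maintenance pain is that the fixture, the schema DDL, and the real production schema all have to be kept in sync by hand.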
One could write unit tests against a production DB instead, but that has all sorts of issues: every time the DB changes, your tests break. Or worse, if what you are testing mutates the DB, you could break the production DB.
Now, reimagine unit testing DB apps, with git-like DB:
- Each developer workstation (and testing/CI machine) has a full copy of the production DB, including all versions.
- Accessing any particular version of the DB is just a matter of referencing a sha.
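To make the two bullets concrete, here is a purely hypothetical sketch: `GitLikeDB` below is a toy content-addressed store (snapshots keyed by sha in a dict), not any real product or API — it only illustrates how small the test setup could become when a version of the whole DB is just a sha.

```python
# Toy illustration of the imagined workflow -- GitLikeDB and its
# commit/checkout methods are invented for this sketch.
import hashlib
import unittest

class GitLikeDB:
    """Toy content-addressed store: each commit is an immutable snapshot."""
    _snapshots = {}

    @classmethod
    def commit(cls, rows):
        sha = hashlib.sha1(repr(sorted(rows)).encode()).hexdigest()
        cls._snapshots[sha] = tuple(rows)  # immutable snapshot
        return sha

    @classmethod
    def checkout(cls, sha):
        # Hand back a copy, so a test that mutates its data
        # can never corrupt the "production" snapshot.
        return list(cls._snapshots[sha])

# "Production" is committed once; every test just references that sha.
PROD_SHA = GitLikeDB.commit([(1, "alice"), (2, "bob")])

class UserQueryTest(unittest.TestCase):
    def setUp(self):
        # No fixture files, no schema rebuild, no 100MB import:
        # accessing a DB version is just referencing a sha.
        self.rows = GitLikeDB.checkout(PROD_SHA)

    def test_user_count(self):
        self.assertEqual(len(self.rows), 2)
```

The point of the sketch is the shape of `setUp`: one sha lookup replaces the whole create-schema-and-import dance, and because snapshots are immutable, mutating tests can’t damage production data.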
This would make the problem of unit testing databases trivial. Right? Or am I missing something? Aside from all of the other value your solution brings, if all you did was solve the problem of unit testing DB apps, you might win a Nobel Prize for that alone.