Re-thinking the ‘Good’ in ‘Good-Cheap-Fast’

Michelle-Joy Low
Published in reecetech · Dec 8, 2020


I was drawn to Reece in part because of its willingness to play the long game. Spending time in the branches put the business's operational excellence on full display, and it led me to reflect on how core operating disciplines translate when delivering data solutions.

It is rare for a stakeholder discussion to go by without some reference to the proverbial project management triangle. “Good, cheap or fast: pick two” is its most rudimentary form, and it has become a standard resource management framework for the disciplined delivery manager, who must balance the cost, time and scope constraining the quality of outcomes delivered.

Good-Cheap-Fast makes obvious the economics of scarcity in project delivery, but its brevity glosses over two assumptions I’ve come to question: (1) that trade-offs sit on a linear continuum (scope, cost and time increase or decrease on a mental slider scale), and (2) that any scope-cost-time option on this continuum is acceptable for consideration.

If only decision-making were that straightforward for data-powered solutions! Discussions so far with my team at reecetech suggest there is plenty more than meets the eye.

Teamwork makes the team work

I dwelled previously on the multidisciplinary field that is Data Science; by extension, the success of any at-scale data-driven solution, with or without the science, similarly requires that human and machine-based capabilities are deeply interwoven and play nicely together.

Far from being limited to Data, multidisciplinary systems are everywhere in our physical world. From Formula 1, with five types of engineers for one car, to the ten core roles supporting heart surgeries, to something as simple as stewing a pot of beef rendang (food of my childhood!): all are systems of disparate parts that must (1) each perform their own function well (rendang chilli should be spicy; weak chillies should be ashamed), and (2) integrate well (a rendang recipe needs the right proportions; never overdo the star anise!).

Rendang — food for thought and soul

In physical systems, we tend to have a fair appreciation of what Good-Cheap-Fast trade-offs look like, and they often feel reasonably linear. Our choice of stewing beef cuts ranges from corner-store beef chuck to oyster blade from an organic shop. Some cuts tenderise in one and a half hours, others need six hours of love. Often in navigating the trade-offs of a system, we are willing to relent on the aesthetics (rendang is definitely an ‘Ugly Delicious’ food), or some other element deemed not fully necessary for the system’s function.

With data, it’s different

Data-driven solutions are the rendang of building tech. So. Many. Ingredients: when we contemplate data-driven solutions, the decisions we make involve not only the construction of technological components, but also their maintainability, scale, reliability, testing, hardening, and ultimately, longevity in use.

A common pattern in data-driven solutioning is to associate ‘function’ (the ‘Good’ in Good-Cheap-Fast) primarily with infrastructure components; everything else is deemed ‘form’. Faced with the need to deliver value promptly, a common mantra is to ‘build lean’ and enable agility to meet changing business demands.

The first elements removed from delivery scope are often testing, documentation and maintainability plans, as the size of averted cost is unseen — the Prophet’s Dilemma is real. So minimum viable solution roadmaps are often focused on building the thin-slice pipelines: get data, transform and land data, sit models on top of data, sit interfaces on top of model outputs → voila, insights! Everyone is (very quickly) a winner.
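To make the pattern concrete, here is a minimal sketch of what such a thin slice might look like in Python. Everything in it is invented for illustration: the columns, the ‘model’ (a naive per-branch aggregate) and the print-as-interface all stand in for whatever a real team would build. Notice that nothing in it checks its own assumptions about the incoming data.

```python
import pandas as pd


def get_data() -> pd.DataFrame:
    # "Get data": a stand-in for reading from a source system
    # (in reality a warehouse query, an API pull, or a file drop).
    return pd.DataFrame({
        "branch": ["A", "A", "B", "B"],
        "orders": [120, 95, 80, 110],
        "returns": [6, 3, 8, 2],
    })


def transform(raw: pd.DataFrame) -> pd.DataFrame:
    # "Transform and land data": derive a simple return-rate feature.
    out = raw.copy()
    out["return_rate"] = out["returns"] / out["orders"]
    return out


def model(features: pd.DataFrame) -> pd.DataFrame:
    # "Sit models on top of data": a naive per-branch average,
    # standing in for whatever model would actually be fitted.
    return features.groupby("branch", as_index=False)["return_rate"].mean()


def serve(scores: pd.DataFrame) -> None:
    # "Sit interfaces on top of model outputs": a print in place of a dashboard.
    print(scores.to_string(index=False))


if __name__ == "__main__":
    serve(model(transform(get_data())))
```

Fast and cheap to stand up; the question the rest of this post circles is whether it is also ‘good’.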

Up to this point, our linear worldview of Good-Cheap-Fast seems to work just fine. But adopting this view supposes the data world obeys the same rules as the physical world — which in my view is a misplaced assumption. Data changes state and form across time dimensions, ebbing and flowing to the non-linear stochastics of human behaviours applied to technology. A common challenge is the re-mapping of source-to-target transformations as new tools are released — such as the introduction of new customer interfaces necessitating full ETL re-builds to ensure data warehouse tables reflect the same customer information pre- and post-launch.

Replace “algorithms” with “data”… source: https://xkcd.com/1831/

The highly variable, tightly wound nature of such tech-enabled solutions means thin slices can be brittle in the face of moving business parts. And in the absence of automated testing, documentation or change plans, remediation can be a disproportionately expensive exercise, not to mention a blocker to scale.
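As a sketch of what an automated guard could look like, here is an illustrative pytest-style check against the hypothetical transform step from the snippet above (the pipeline module name is assumed). The idea is that when an upstream change starts sending rows the thin slice never expected, a test run goes red instead of NaNs quietly landing in a dashboard.

```python
import pandas as pd

from pipeline import transform  # hypothetical module holding the thin-slice sketch above


def test_return_rate_is_a_valid_proportion():
    raw = pd.DataFrame({"branch": ["A", "B"], "orders": [100, 40], "returns": [5, 2]})
    out = transform(raw)
    # The model and the interface both assume this column exists and is a
    # proportion; encode that assumption once, here, rather than nowhere.
    assert "return_rate" in out.columns
    assert out["return_rate"].between(0, 1).all()


def test_unexpected_zero_order_rows_fail_loudly():
    # Against the thin-slice sketch above this fails today (0/0 gives NaN):
    # the red test run is the early warning that the transform needs hardening
    # before a new customer interface starts emitting zero-order rows for real.
    raw = pd.DataFrame({"branch": ["C"], "orders": [0], "returns": [0]})
    out = transform(raw)
    assert out["return_rate"].notna().all()
```

Tests like these are cheap to write next to the code they protect, and far cheaper than re-mapping a pipeline after something has already broken downstream.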

Re-thinking the ‘Good’ in Good-Cheap-Fast

The point is that the old adage of “Good, cheap or fast: pick two” isn’t as straightforward when it comes to delivering data-powered solutions. The need for speed in competitive conditions can be a barrier to seeing past point solutions; further, cost-benefit-backed, scalable roadmaps are notoriously difficult to write. But the conversations in my team have me optimistic about where we are headed.

We’re currently talking about what a ‘good’ minimum threshold actually looks like. Not all scope-cost-time options are actually viable in data, because ‘good’ (or quality) isn’t exactly on a linear scale. Data that is 50% accurate probably has no value, data that is 80% accurate might have some value, and data that is 100% accurate may be worth its weight in gold — assuming intelligence and healthy skepticism in its use. The joy of unveiling a beautiful interface quickly devolves into fury if its numbers are wrong; so we’re thinking not only about developing minimum viable components, but also minimum responsible disciplines.

A real challenge in the matter is just how counterintuitive (read: un-fun) data disciplines can feel. Few things seem more obtrusive than the Data Fun Police, who respond to each business request for a new feature with a catalogue of Do-Nots-in-the-Data-Lake. In truth, interests must align on both sides: with every Do-Not there should be a What-About-This-Instead, and working rhythms should facilitate two-way value exchanges. For example, new work should be accompanied by a considered re-ranking against existing commitments; and if our data engineer says that new feature will break an existing pipeline, it’s probably a good idea to consider alternatives.

At a broader level is the question of what a ‘good’ solution itself means: for example, what hardening of the ‘thin slice’ do we spend time on now, versus other activities that can wait? The motivation here is that tech and operational debt, commensurate with digital opportunities, tends to grow exponentially. Small delays in prevention today can lead to much larger missed opportunities in a year, as Google Research has found in machine learning. Still, it would be naive to put delivery completely on ice to search for some non-existent ‘perfect’ target state; rather, we need to continually refine the way we stage investments in data, disciplines and culture.

As a data team at Reece we are continuously examining not only what we build, but how we choose what to build. I am excited that as we continue shaping what ‘good’ looks like, we’re making better choices about how we build — hopefully towards a hundred years (and then some) more of improving the lives of our customers and people.
