Don’t Buy the Demo: Why an Iterative Approach To Data Systems Is Better

Here’s an IT scenario you may find all too familiar: You got the green light for a large and important IT project, perhaps a big ERP system, a data warehouse, a migration to the cloud, or a massive Hadoop implementation. You’ve vetted and selected the vendors and even held the kick-off meeting. There’s euphoria at the start and satisfaction that the project is underway. You celebrate with your peers and vendors in the belief that the new system will boost sales, expand market reach, improve your brand, or help you run the company more efficiently.

It’s a big promise, and expectations are high.

But 3–6 months in, you’re no longer celebrating. You’ve entered what I’ve heard called the “valley of despair,” a place where everything is harder than expected, important issues were overlooked, estimates were flawed, time frames were too aggressive, and so on. None of it is good. If you’ve ever remodeled a home, you know what I mean: everything is harder, takes longer, and costs more.

One reason for that is you bought the demo.

It’s tempting to buy into a project that promises the world.

The glossy brochure looks great. You see the marketecture diagram on the whiteboard, the well-crafted PowerPoint slides, and sometimes a professional video extolling the benefits of the project. Manned intergalactic space travel or human colonies on Mars look wonderful in a brochure that fails to fully explain the risks, the costs, and the likelihood of failure.

For the human mind, it’s easy to underestimate the difficulty and exaggerate the benefits. After all, we are “can-do” humans and incurable optimists.

But if you buy the demo, you’re setting yourself up for possible failure.

Now more than ever, businesses can sabotage their chances of making productive use of their data by rushing into the latest and greatest technology without first considering the aims of their project.

Instead of buying software based on the presentation alone, tech decision-makers should take a cautious, iterative approach when implementing new software, especially when it means replacing existing systems.

Here’s why:

Data systems have evolved.

I recall the data warehouse projects from the 1990s. There were so many well-intended projects and yet so many failures. What I learned was that the attendant risks rose geometrically with the scale and complexity of the project.

Fast forward to the present, and we have the Digital Transformation imperative. It’s a common topic in boardrooms. C-level executives, with little understanding of technology, are being challenged to inject radically new technologies into legacy computing environments. And it’s a daunting challenge.

Today’s marketing literature promises real-time decisioning systems that use machine learning and artificial intelligence to lower costs and drive innovation, reduced capital expenditure (CAPEX) from moving to the cloud, or the Holy Grail of “one version of the truth,” so better decisions can be made.

Each of these objectives, while laudable, is fraught with difficulty and risk.

Typically, a large business has a portfolio of various applications, often well over 1,000. These systems can be brand new, under development, in refurbishment, or in hospice care. Over the years, these systems were built with different technologies that might have been in vogue at the time but are incompatible now. They represent a mishmash of different technologies and are expensive and difficult to support.

Corporate data is strewn everywhere across disparate systems, each with its own different, inconsistent update sources.

So, what to do? I always suggest looking for the lowest-hanging fruit: the lowest risk and the best return for the effort. Short projects that prove out the use of new technologies work best.

But there’s always lots of learning along the way. Tract housing is much easier and cheaper to build than something completely custom. Why? Because the builder has learned how to use her tools. She’s learned what materials work best at each phase of the project and knows how to improve the economics as she goes.

It’s hard to make meaningful change in data collection and analysis when most businesses are still relying on legacy systems.

The reality is, 70–80% of a corporate IT budget is dedicated to maintaining existing systems, so there isn’t much money available to do anything new.

Often, companies thoughtlessly waste this already small budget on shiny new demos.

But the most productive efforts are often tiny skunkworks projects that extract meaningful data here and there, cobble it together, and then do basic analysis. The goal is to support specific, discrete decisions, like zeroing in on fraud prevention for a bank.
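To make that concrete, here’s a minimal sketch of what such a skunkworks effort might look like: a short Python script that pulls transaction extracts from two hypothetical legacy systems, stitches them together, and flags outliers for a fraud analyst to review. The file names, columns, and the percentile threshold are all assumptions for illustration, not a prescription.

```python
# Minimal skunkworks-style sketch: combine extracts from two (hypothetical)
# legacy systems and flag unusually large transactions for fraud review.
# File names, columns, and the 99th-percentile threshold are illustrative only.
import pandas as pd

# Extracts exported from the legacy systems (assumed CSV dumps).
core_banking = pd.read_csv("core_banking_transactions.csv")   # account_id, amount, posted_at
card_system = pd.read_csv("card_transactions.csv")            # account_id, amount, posted_at

# Cobble the sources together into one frame, tagging where each row came from.
core_banking["source"] = "core_banking"
card_system["source"] = "card_system"
transactions = pd.concat([core_banking, card_system], ignore_index=True)

# Basic analysis: flag transactions above each account's 99th percentile.
threshold = transactions.groupby("account_id")["amount"].transform(lambda s: s.quantile(0.99))
flagged = transactions[transactions["amount"] > threshold]

# Hand the short list to a human analyst rather than trying to automate everything.
flagged.to_csv("transactions_for_review.csv", index=False)
print(f"{len(flagged)} transactions flagged for manual fraud review")
```

A few dozen lines like these won’t replace anything, but they prove whether the data can be joined at all and whether the analysis is worth scaling up.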

Rather than ripping out old legacy data systems, consider augmenting them with new technology installed alongside them that can scale, reduce latency, improve reliability, lower costs, and boost performance. The old legacy system can still interoperate with the new technology to deliver its original functionality; the new systems are simply an augmentation and improvement, breathing extended life into your team of old workhorses.
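As a rough illustration of that “augment, don’t replace” idea, here’s a minimal Python sketch that puts a read-through cache alongside a hypothetical legacy database: hot reads come from the new, faster layer, and everything else falls back to the legacy system, which stays the untouched source of record. The database file, table, and cache policy are assumptions for illustration; in practice the adjacent technology might be a distributed cache, a streaming layer, or a new analytics engine.

```python
# Rough sketch of augmenting a legacy system instead of replacing it:
# a read-through cache sits alongside the legacy database, which remains
# the untouched system of record. The database, table, and TTL are placeholders.
import sqlite3
import time

LEGACY_DB = "legacy_orders.db"   # stand-in for the existing system of record
CACHE_TTL_SECONDS = 300          # assumption: five minutes of staleness is acceptable

_cache = {}  # order_id -> (fetched_at, row); the "adjacent" new layer


def get_order(order_id):
    """Serve hot reads from the new layer; fall back to the legacy database."""
    entry = _cache.get(order_id)
    if entry is not None and time.time() - entry[0] < CACHE_TTL_SECONDS:
        return entry[1]  # fast path: the new, lower-latency layer

    # Slow path: the legacy system still delivers its original functionality.
    conn = sqlite3.connect(LEGACY_DB)
    try:
        row = conn.execute(
            "SELECT order_id, customer_id, total FROM orders WHERE order_id = ?",
            (order_id,),
        ).fetchone()
    finally:
        conn.close()

    if row is not None:
        _cache[order_id] = (time.time(), row)  # populate the new layer on the way back
    return row
```

The legacy database keeps doing exactly what it did before; the new layer simply absorbs the read traffic the old system was never designed for.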

And, while you’re at it, you’ll get a feel for the newer technology. You can assess whether it works as advertised, learn how to deploy and maintain it, and get to know your vendor. Then, and only then, can you decide whether the new technology is a good fit for other, larger projects.

Iterate, iterate, and iterate.

It’s cheaper, lower risk, easier, and a faster path to success. And you don’t have to bet your job.

It’s easy to be wooed by sexy demos of new software systems. But if you can resist and make careful, strategic decisions that are right for your company, you’ll be glad you did.