Kafka Survivors of the World, Unite!
Why we backed Memphis.dev
A prepared mind. The story of our investment in Memphis.dev, an Israeli company reinventing the data streaming infrastructure stack with a new open-source message broker, actually began about six months before we met the Memphis team, when I met a European team in Copenhagen working on the same problem. Kafka, the Danes argued, was fundamentally broken. Kafka was initially released in 2011 and has since become by far the most widely used data streaming solution in the world. But as applications and infrastructure evolved to become increasingly data-intensive, cloud-native, and complex, Kafka was becoming ever more challenging to maintain as the data streaming infrastructure supporting them.
As I investigated this claim, it became clear that the Danes were right: Kafka was indeed broken and ripe for disruption by something more modern, more flexible, and more powerful. It also became clear to me that whatever replaced Kafka would have to be open source. Stream processing was a core capability of whatever application it was part of, and no customer would be willing to replace an open-source tool (however problematic) with a dependency on a new closed-source vendor. When I suggested this to the Danes, they went off to think for a while. The next time we spoke, they had decided to change tack completely. “You are right,” they said. “It does need to be open source, and we don’t want to build an open-source company. So we are not going to build it at all.” They ceased work on their Kafka alternative and are now working on something else.
Welcome to Memphis, population four.
So it was with that insight in the back of my mind — the necessity for an open-source, more powerful Kafka alternative — that I met the Memphis team. From our first meeting, it was clear that Yaniv, Sveta, Idan, and Avraham were deeply committed to rebuilding the entire data streaming stack from the ground up. They were not afraid of taking on Kafka, but they were thinking carefully about how best to ensure that developers understood Memphis both as a better alternative to Kafka and as a way of making existing Kafka implementations better. That is not an easy balancing act to pull off.
From a VC perspective, the world of dev tools is often frustrating: brilliant people devote themselves to building tools that are ultimately incremental, or to solving a problem that is not widespread enough to represent a real market. Memphis is one of a small set of developer tool companies that represents something much bigger. The pain that Memphis addresses is deep and affects nearly every developer who has touched high volumes of data. High-volume data almost always needs to be stream processed, and the best-in-class building blocks that exist today still leave developers with a daunting amount of work just to get that streaming infrastructure running smoothly. Any attempt to add intelligence to that data streaming process complicates matters further, resulting in even more onerous ongoing maintenance.
It’s one of those cases where a core functionality already exists and is widely adopted, but the obvious next steps to take that technology further are nearly impossible for most users. Ask a Kafka developer whether they can stream a million data points from point A to point B, and they will answer yes. But then ask whether they can run a PII filter on the data, send the PII records to point C instead of point B, and do all of this in a way that dynamically autoscales across parallel streams as the volume of incoming data varies, and their confidence in Kafka plummets. Technically, this sort of thing is possible in Kafka, but it is so difficult to implement, monitor, and maintain that Kafka has become the stuff of nightmares for developers tasked with these complex cases. As data volumes grow and real-time data streaming becomes increasingly mission-critical, this level of complexity is increasingly commonplace. The complexity and pain of attempting it with existing tools has even given rise to companies dedicated to solving it, such as the $9B Confluent, which is essentially a managed, enterprise-grade Kafka implementation.
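To make the gap concrete, here is a minimal sketch of the routing logic a team would have to hand-roll with plain Kafka clients (using the kafka-python library; the topic names and the naive PII check are hypothetical stand-ins). Even this bare-bones version says nothing about the hard parts: monitoring, retries, dead-letter handling, or autoscaling consumers as traffic varies.

```python
# Minimal sketch: consume events, filter for PII, and route to different topics.
# Assumes a local Kafka broker; topic names and the PII heuristic are illustrative only.
import json
from kafka import KafkaConsumer, KafkaProducer

consumer = KafkaConsumer(
    "events.raw",                      # hypothetical source topic ("point A")
    bootstrap_servers="localhost:9092",
    group_id="pii-router",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

PII_FIELDS = {"email", "ssn", "phone"}  # stand-in for a real PII classifier

def contains_pii(event: dict) -> bool:
    return any(field in event for field in PII_FIELDS)

for message in consumer:
    event = message.value
    # Route PII records to a separate topic ("point C"); everything else goes onward ("point B").
    target = "events.pii" if contains_pii(event) else "events.clean"
    producer.send(target, event)
```

Everything around this loop — scaling it out, tracing a lost message, recovering from poison records — is the part that keeps developers up at night, and it is exactly the part Memphis sets out to absorb.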
The message to developers
I asked Yaniv, CEO of Memphis, what’s the one thing he most wants developers to know about the technology. “Memphis,” he said, “is a modern alternative to existing streaming engines like Kafka, but it can also enrich or coexist alongside it and serve other use cases like real-time ML training at scale and streaming pipelines in zero time.”
“So if you are already using Kafka, come check us out. We can improve your existing Kafka-based use cases dramatically by removing most of your client-side logic with ready-to-use functions, reducing data-level troubleshooting with an out-of-the-box dead-letter queue, real-time message tracing, and automatic partitioning, and getting you to production FAST.
“And if you are not using Kafka (or have mixed feelings about Kafka), you will love Memphis.dev…”
Kafka survivors of the world, unite! All you have to lose is your broken pipelines! Check out the slick demo on the Memphis.dev homepage, the company’s extensive documentation, their Discord, or go straight to their GitHub. This team has built a solution with a lot of love, a lot of care, and a deep understanding of the pain caused by existing tools that are just not up to the task. A new day is dawning on one of the biggest development pain points in modern data-intensive applications.
It’s an absolute joy to be a small part of Memphis.dev’s journey along with Shomik Ghosh and our friends at boldstart, our long-time partner-in-crime JFrog co-founder/CTO Fred Simon, our old friend Snyk co-founder Guy Podjarny, our new friend Ran Ribenzaft from Epsagon, and many other impressive angels such as CircleCI CEO Jim Rose, Console.dev co-founder David Mytton, and Priceline CTO Martin Brodbeck. Yaniv, Sveta, Avraham, and Idan: you guys are as good as they come. I don’t know a more mission-driven and down-to-earth team in dev tools — and I’m sure this is just the beginning of a great story.