Gigaom during its heady days

Gigaom: The Life and Death of a Venture Funded Media Startup

As most know, we lost Gigaom last week. It was a sudden passing of a widely beloved tech media company, and it's been touching to watch the Twitter tributes from fellow media and tech readers far and wide.

But now that the memorials are largely done and the soul(s) of the company is rising up and on to better places, it's time to examine the body and determine the cause — or causes — of death.

As a former employee, I have a perspective that's probably a bit more informed and nuanced than many. Not authoritative in any sense, but a perspective informed by four years at the company as a VP and the last two as an external research analyst partner.

Like most, I was shocked when the news came out. After all, Gigaom had just topped up the tank a year ago with an $8 million funding round. Om, in a post announcing his transition to full-time VC, talked about how the company was doing well. Paul Walborsky, my former boss, talked on record about how the company was growing. All of this seemed to make sense and largely fit the picture I was getting as a former employee still semi-tied to the company through friends and ex-coworkers and through my work as an external research partner.

The news also made me sad for what we had lost. Gigaom had gone big, had tried to build something new and different, all the while adhering to an editorial ethos that remained a part of its DNA until the very end. Despite its collapse, Gigaom's story was a worthy one, worth telling, and because of this I figured I would do just that, hopefully providing the necessary context to help myself and others figure out what happened.

As I reflected further, I found I should not have been completely surprised by what happened. It's not unlike when someone suddenly passes and signs of ill health that were previously ignored come into sharper focus. And while I don't have the body in front of me to examine — no one save the company's officers, VCs and now Silicon Valley Bank, its main creditor, has the benefit of the company's financials — I think I can connect some dots from my own knowledge of the business and some of the evidence put into the public domain by former employees.

In the press, some had pointed at research as one of the main causes of Gigaom's demise. I do think research and its cost model played a significant role, but I don't think it was the only cause. Gigaom died a premature death for many reasons, and if there's any one overriding cause, I'd point to the massive amount of VC funding and the resulting cost model put in place to scale the company — across all of its business units — to meet the high growth expectations that come with venture investment.

But I'm getting ahead of myself. Let's start at the beginning.

Gigaom And The Tech Media Landscape in 2007–2009

Back in 2007, the tech blogosphere had started to come out of its early wild west days, but it still wasn't completely corporatized. Sure, AOL had started to roll up companies like Weblogs (owner of Engadget), but it hadn't yet bought Techcrunch, while independents like VentureBeat, ReadWriteWeb and Gigaom were cranking out good work.

Gigaom in the early days, pre-VC money

Gigaom was punching above its weight, earning lots of respect for an editorial ethos instilled by Om. That ethos — which was woven into the company's fiber and never went away (part of the reason for the sadness expressed by so many in the tech community over the past week) — meant the company was becoming a highly influential read among all the serious tech players in Silicon Valley.

This was in part because the writers were really good — Om, Katie, Liz, Stacey — but also because it contrasted so sharply with the general direction of tech media. As a nation, we had just begun to embrace our inner Buzzfeed (or, one could argue at that time, our inner HuffPo), as many blogs began to use provocative headlines and chase page views in ways that Gigaom, then and until the end, eschewed.

This avoidance of gimmicks and the seriousness of the journalists who worked for Om gave Gigaom serious street cred in Silicon Valley. At the same time, the company wasn't widely known outside of the tech insider community, and it never did cross over into a mass-market tech media brand the way Techcrunch or Engadget had. This was fine, however. Being an insider brand works, particularly if you have influence and sway with the companies and folks who are masters of the tech universe. And Gigaom did.

Gigaom decided early to try to build a business that leveraged this premium tech insider brand, and to do so in ways that went beyond simple ad dollars. The first step in this direction was events. And while many in media would say events are a fairly common first step for any fledgling media startup, most would attest Gigaom's events were special exactly because Gigaom was such a tech insider brand.

At Gigaom events you rubbed elbows with the who's who of the tech world, from Reed Hastings of Netflix to Werner Vogels of Amazon to Jack Dorsey of Twitter/Square.

But it went beyond tech star power. Thanks to Surj Patel (Gigaom's former Events VP) and the editorial team, these events were extremely well done. And unlike many in the events business, Gigaom avoided pay-to-play arrangements, meaning sponsors couldn't buy a speaking spot. I remember how angry some would get about this policy, but the editorial/business firewall was sacrosanct and many sponsorships were turned down because of it.

All this meant Gigaom’s events were profitable out of the gate. Over time, the company would expand and soon had the industry’s signature cloud computing event in Structure, and later the premier big data event in Structure Big Data.

The Gigaom team at Mobilize 2008

But even as events were successful and helped the company diversify from ad revenue, they were not inherently scalable. A startup could only do so many events and keep a blog running. Events are hard work. So Paul Walborsky, who was brought on as CEO in 2007, started to look toward new business lines that were scalable.

Like paid content.

Project Condor

Paul was brought on by Om and the board to steer the broader business and its strategy while Om would be the company's editorial leader across all of our businesses. Paul is a former Wall Street guy who was adept at seeing new opportunities in information services and media more broadly, so, empowered by Om and the board, he started to look around for new business opportunities.

Paul knew ad CPMs were heading down over time and that, if Gigaom were to survive, it would have to do so by getting paid for its content. While some blogs, like Ars Technica, had been successful in creating subscription models featuring ad-free content and long-form extras, Om and Paul felt they should keep all of the blog content and the community conversation (which, at the time, before so much of the conversation around blog media moved "off-site" to Twitter, was mainly the comments section) in front of any paywall.

Paul believed there was an opportunity for Gigaom to convert readers interested in its various technology categories, like cleantech, mobile and cloud, to deeper reads in the form of research reports. An early business plan was put together, called Project Condor, and a special projects editor, Celeste LeCompte, was assigned to work on what was now officially a 'skunkworks project'.

This was in late 2008 and early 2009, and it was soon after this I was brought on board. I had early conversations with Om going back to 2007, prior to his heart attack, about Gigaom possibly heading into research, but this incarnation envisioned by Paul was much more evolved and, in my mind, disruptive.

Still in beta: early Gigaom research (then Gigaom Pro) site

At this time, the early business plan was based on the premise that research could be somewhat democratized if Gigaom could make it available to individual subscribers at a low price point. Traditional technology market research from the likes of Gartner and Forrester had been expensive and largely unobtainable for anyone without the backing of a corporate budget. A typical research report from one of these companies cost two or three thousand dollars, and a larger research service subscription could easily cost anywhere from twenty-five to fifty thousand dollars.

But how would this work? It's not like Gigaom could create research out of nothing. Someone had to write reports, and Gigaom, while it had just taken a $4.5 million series C funding round in late 2008, wasn't about to start hiring expensive industry analysts with six-figure salaries.

Instead, we saw an opportunity to leverage a growing trend in market research: the increasing number of analysts thriving outside the confines of traditional research companies. The arrival of push-button publishing on the web had given analysts and consultants more ways to reach customers than in the past, and many star analysts who had begun at 'big research' had started to strike out on their own. Others, many of whom had never worked for traditional research firms but had deep domain expertise, saw an opportunity to provide advisory services to companies.

We thought, what if we could provide a platform for some of these independents to reach a wider audience through a research service from Gigaom? We believed that if we could pay these independents to write reports on a freelance basis, they would get the exposure of being part of a ‘virtual analyst network’ at Gigaom and we would get to tap into their expertise without having to pay the high salaries that often come with such knowledge and backgrounds. Win-win.

And so the plan was set. Over a four-month period — from the time I came on in early February 2009 to late May 2009 — we were in stealth mode, recruiting analysts, setting up research projects, and building a website. The core editorial team was just Celeste LeCompte and me, while the product and tech team included the super-capable Jaime Chen as product lead on the site development side, with assists from Wordpress super-ninja Mark Jaquith and Gigaom's original web guru, Chancey.

Analysts I approached were receptive. They liked the idea of aligning with Gigaom and being part of a fairly new approach to market research, and, of course, they also liked the fact that we would pay them to write the reports.

We also thought at the time we could leverage Gigaom's own stable of writers by having them contribute pieces on a regular basis. These pieces were called "Long Views," and they would allow the writers on the blog to stretch their legs a bit with bigger word counts and deeper analysis than was typical for blogs at the time. This was important because we knew that individual brands mattered, and we thought readers of Katie Fehrenbacher and Liz Gannes, to name a couple, might be convinced to pay money for deeper analysis.

We launched Gigaom Research — then called Gigaom Pro — on May 28, 2009. Om wrote a big post on it, and the story was picked up pretty widely in places like the New York Times. In retrospect it’s amazing how much work we got done in that short amount of time, as we launched with tens of reports and over 20 analysts in the network.

Gigaom Pro mention in New York Times

A part of the launch that got a lot of focus was the price point. We made a decision to launch at what seemed to many a ridiculously low price of $79 per year. Because we were doing something that we believed had largely never been done before, we were trying to convert a significant share of our readership — probably around 2–3 million monthly uniques at the time — to the research product. We thought that if we made the price that low, the 'true fans' of Gigaom could subscribe and support us while we provided immense value in the form of research.
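To put that bet in perspective, here is a purely illustrative back-of-envelope calculation; the conversion rates are hypothetical assumptions of mine, not actual Gigaom numbers, applied to the $79 price and the rough 2–3 million monthly uniques mentioned above.

```python
# Purely illustrative math, not Gigaom's actual numbers: what the $79/year
# price point implied at a few hypothetical conversion rates of the
# roughly 2-3 million monthly unique readers.
PRICE_PER_YEAR = 79
MONTHLY_UNIQUES = 2_500_000  # midpoint of the 2-3 million estimate

for conversion_rate in (0.001, 0.005, 0.01):  # 0.1%, 0.5%, 1% -- all assumed
    subscribers = int(MONTHLY_UNIQUES * conversion_rate)
    annual_revenue = subscribers * PRICE_PER_YEAR
    print(f"{conversion_rate:.1%} -> {subscribers:,} subscribers -> ${annual_revenue:,}/year")
```

Even a full one percent conversion, an aggressive assumption for any paywall, works out to roughly $2 million a year, which is why the model only worked if conversion was genuinely significant.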

Some would argue that by going so low, we were "anchoring" the price at a low level, which would make it difficult to raise over time. Others felt there was a perceived-value problem in going so low: putting a price of less than a hundred bucks on an annual subscription would make research buyers think the quality was low, that the offering wasn't comparable to what you would find at other research houses.

These are both valid arguments, but they were risks we were willing to take. We knew we wanted to be disruptive, and we knew we could only do so by trying to create a product that had never really been created before.

Scaling With Venture Capital

With that context on the blog, events and research sides, it's worth taking a quick look at the company and its financing as a whole. At the beginning of 2009, the company had $4.5 million in the bank and what would be, by the middle of the year, three lines of business.

The large majority of the employees were still on the editorial side. We had a network of blogs, each manned by an editor, sometimes two, with a small editorial staff to support them. On the research side, there were two of us, plus a handful of freelance analysts writing reports. On the events side, it was a few folks and help from an external events team.

When you take on financing as a media startup, you're expected to grow quickly. The investors are hoping for the usual 10x return, which in startup media is extremely hard to do. Some of the early exits for blogs were not astronomical prices — $20–$35 million for Weblogs (about 10x revenues) in 2005, and Ars Technica sold for about $25 million to Conde Nast in 2008. Eventually Techcrunch would sell for between $25 and $40 million to AOL.

Techcrunch’s & HuffPo’s exits to AOL were setting valuations for blog networks

When you look at these valuations, it starts to show the difficulty of getting a 10x return on a blog of Gigaom's audience size. Gigaom had a smaller readership than these sites — in 2009 it was probably 2–3 million monthly uniques — and getting 10x off of nearly $6 million (the company had taken earlier funding rounds of $325 thousand and $1 million) means you're looking at a $60 million payout. Add in another $2.5 million in late 2010, and now you're looking at an $80 million-plus exit.

That’s a tall order. It also tells you why Paul and the board started looking for new business models (research) to get to that type of payday. Outside of Huffington Post (which sold for over $300 million to AOL), that type of acquisition price would be near the top of blog exit valuations to that point.

Where’d The Money Go?

And now — without the benefit of detailed financials — let’s look at where the money was eventually spent. In short, everywhere.

The editorial team on the blog side grew. According to Mathew Ingram, it was a 22-person editorial staff by the time Gigaom shut its doors. On the research side, the business eventually went beyond two people and we hired salespeople and some research directors. Sales and marketing grew. On the technology and website side, the team grew. In a recent tweet, Casey Bisson, one of the company's former lead developers, shouted out five others on site engineering and QA. Throw in a couple of product management types and soon you're looking at 8–10 people, a development team larger than some small venture-funded software startups.

The company also had significant costs locked up in real estate — when it acquired PaidContent in 2011, it assumed the rent on that company's Manhattan office space. It also had office space in high-rent San Francisco.

In other words, Gigaom had a lot of overhead across all the business units. Events had the least — in part because the company always made significant use of external agencies to help pull them off — but overall the company had grown staff, had rising fixed costs in the form of real estate, and continued to service interest payments on its growing debt load over these last few years.

One of the areas of staff growth was salespeople for Gigaom Research. In 2010, as individual subscriptions were not hitting our targets, we decided to go after enterprise money. Originally it was a fairly modest effort, with me doing some of the original deals, and eventually we brought on a couple of salespeople.

But in the last few years that grew significantly. I left the company as a full-time employee in late 2012, but after that the sales staff continued to grow. A quick look at LinkedIn, searching "Gigaom sales," suggests there were anywhere from 7–10 research salespeople. There were also sales folks for events and ads. That's a lot of salespeople.

Over time, increasing the sales and marketing mix toward research did make sense. In an interview, Paul said that research made up 60% of the company's revenue. In the same article, revenue was estimated at $15 million, so that translates to about a $9 million research business.

Certainly, it's worth noting that this $9 million is a hard-won $9 million. There are lots of salespeople to pay, and over time the cost of paying freelance analysts went up as more research was put out. And no doubt, this type of business is a pretty radical departure from the original vision of Gigaom Research, which was centered on a high volume of individual consumer subscriptions.

Should Gigaom have continued to invest in research? I'd suggest that, with the large amount of venture capital it had taken — and the expectation of a payout for investors — Paul and the board decided it had no choice.

But in reality it did have a choice. It could have decided to build a more modest business, one centered around the blog and events while perhaps trying to grow the individual research subscription model. However, I think because the board wanted to see growth, rapid growth, to justify its investments, the decision was made to continue to grow all aspects of the business, including the now corporate-sales-centric research model.

The End

Looking at all the information before us, some would say that the movement toward a corporate-sales research business was the cause of Gigaom's eventual demise. I'd say it certainly contributed, but I'd also say let's not confuse cause and effect.

What I mean by this is that all the over-investment and high operating expense built into all aspects of the business — not just research — was an effect of the company taking lots of venture capital and venture debt and the expectations that go along with that. Gigaom had an editorial staff of 22 for a blog that had 6.5 million monthly visitors. It had 8–10 product and website people. It had research directors, salespeople and others. It experimented with new events like Structure Europe in 2013 (an event that only happened once — a good sign it lost money). Add in freelance analysts, temporary event staff, and other costs and you have an expensive, high-opex business.

All in the name of growing and scaling to hit revenue targets that probably were not reachable. Revenue targets that were, no doubt, chased in the name of hitting a certain multiple to recoup the investment made by the venture capitalists who wanted a 10x return.

Gigaom shut down abruptly on March 9, 2015

The most confusing thing for me initially, as someone who left a few years ago, was the quickness of the demise so soon after an $8 million funding round. That's a lot of money to burn through in just one year, particularly after the company had survived for over 7 years to that point on a total of $12 million.

But as I've thought about it and talked to others over the last few days, things seem a bit clearer. I've been told by some that a balloon payment on debt owed to Silicon Valley Bank came due. That may be true, and if it is, it doesn't change things much — it's just another sign that things caught up to them after years of running at a loss, and that they were never able to turn toward a business that ran in the black, or at least kept operating losses low enough that the burn rate and borrowed debt were manageable.

Wrapping It All Up

Gigaom is a company that started going down the venture capital and (presumably) venture debt path early on. By late 2008, before I came on board, they had taken on nearly $6 million in venture funding. They were running at a loss the entire time, as they continued to take new funding up to nearly the very end.

I think the fact that they were heavily capitalized early on forced the company to look toward new monetization models outside of ads — which were experiencing declining value-per-reader and continued to do so — and research was the big bet they made. The company went big by trying to create a disruptive research model, based on a belief that a) Gigaom's core readership could be converted at a decent rate to create a real subscription business and b) individual subscribers would be open to buying research.

The entire company — across all the divisions (blog, research, events) — was expected to grow significantly over time to the point where revenue would outgrow operating expenses, the burn rate would lower, and the company could eventually turn a profit.

Over the years, operating expenses grew across the blog and research — and the related support staff in the form of sales, marketing and technology/development — the entire time. We do know this growing cost base was supporting higher revenues. We also know the majority of this revenue growth was due to research — this can be deduced from the fact that the revenue mix shifted from 0% of revenues in 2008 and single-digit percentages in 2009 to 60% by 2014.

We can assume the losses were mounting the entire time, and we could speculate that the late-stage round they took in early 2014 was, in large part, going toward debt service. And if debt was indeed a factor, the loan may have been structured so that a balloon payment was just too big to overcome, even with the significant final funding round.

Most of us will never know the actual specifics of the capital on hand, where it all went, and why it got to the point of no return so quickly at the end. All I can say is the company was playing a dangerous game the entire time, using a combination of venture capital and debt that forced the business to keep investing heavily in pursuit of growth, hoping it would eventually reach a scale where revenue growth outran expenses, the debt was manageable, and there would be an exit.

But neither a manageable burn rate nor an exit ever happened. There's a good chance these two things are related. I do know — it's a pretty open secret in the tech media landscape — that Gigaom had talked to various suitors and some deals were almost consummated. I personally believe that the high amount of money invested in the company, combined with the expectations of a venture board, meant there was an expectation of a high return in the form of a high purchase price. It's clear now, since no one ultimately bought the company, that the gap between the expected return and the value potential acquirers put on the company remained too wide.

And so the investors, the stock-option-holding employees (including myself) and others with an interest in a Gigaom exit got nothing. Money paid toward exercising now-worthless stock options is gone, and some of us are just hoping to get some of the money owed to us for work we have done.

Perhaps the biggest mystery — and sadness — in all of this is the decision to simply shut the company down. I know they owe money to a creditor who wants it, but why then did they keep operating 'business as normal' for pretty much the entire last year? Why couldn't the company have gone to a much-reduced staff six months ago, to manage the capital burn and keep the lights on? Why not explore options such as operating under bankruptcy or something else that would let Gigaom continue?

We don't have these answers and we may never. All of this is a bit sad, and the reasoning is hard to understand. Having watched the outpouring of sadness over the loss of Gigaom over the past week, clearly many saw value in the company and felt that its reputation and credibility for creating good, thoughtful content in a Buzzfeedified media world were something worth keeping.

Apparently not.

I think a lot of the reasons Gigaom was so respected — its dedication to editorial integrity, the deep analysis across the blogs and the research, the intimate, high-touch events — were also part of the reason it ultimately couldn't reach the audience and the scale demanded of it by heavy venture capital. By not chasing page views, by not heading down the path of cheap headlines (Buzzfeed, et al) and relentless self-promotion of its products within its highest page-view editorial (Business Insider, et al), the company essentially put a speed governor on its growth. But in the end, this conscious decision to chase quality, to be deliberate, to protect the brand and to speak to a core set of readers who wanted good and thoughtful content was what made Gigaom so treasured.

It’s too bad the company wasn’t as thoughtful and deliberate about how it managed its money.


With Do-Anything IoT Button, Is Amazon Laying Groundwork For The Physical Interface of Consumer IoT?

The Dash IoT Button

I must admit, I was pretty excited reading the news of the AWS-powered Dash IoT button. After all, while I've written a lot about how the Dash button effectively demonstrates the power of a singular, simplistic physical interface for IoT, so far Dash buttons have been (purposefully) limited to being single-brand purchase machines.

But what if Amazon enabled the consumer to use the button to purchase anything or, even better, set into motion any action in their connected home or connected lives? Now that would be something.

But alas, this is not that, at least not yet. In a post on The Verge, Paul Miller does a good job of lamenting what could have been with this latest Dash button.

“The real ideal would be a button we can register with an app and have it trigger any action on the internet. It would be the perfect way to make IFTTT physical.”

Exactly.

But imagine for a moment if Amazon truly did make an all-purpose, do-anything button for the Dash, one that didn't require an AWS account. And imagine if such a button enabled consumers to connect to any number of third-party smart devices and online services to initiate, engage or transact.

That would be huge and, as Miller says, probably do nothing for Amazon’s button line. And therein probably lies the problem, at least if you’re Amazon.

The Dash IoT button for developers starts down this path, but it’s not really a consumer product. It’s really a developer product and, because it’s a developer product, requires an account with AWS and all the technical acumen and hassle that comes with that.
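For a sense of what that acumen involves, here is a minimal sketch of the kind of AWS Lambda function a developer might wire the button to. The webhook URL is hypothetical, and the shape of the button's event payload here is an assumption of mine rather than Amazon's documented contract.

```python
import json
import urllib.request

# Hypothetical endpoint you'd want a button press to trigger.
WEBHOOK_URL = "https://example.com/trigger/button-pressed"

def lambda_handler(event, context):
    # Assumption: the press arrives as an event carrying a 'clickType'
    # field (single, double or long press); adjust to the real payload.
    click_type = event.get("clickType", "SINGLE")

    # Forward the press to the webhook as a small JSON payload.
    payload = json.dumps({"action": "button-pressed", "click": click_type}).encode("utf-8")
    request = urllib.request.Request(
        WEBHOOK_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return {"status": response.status, "click": click_type}
```

That's only twenty-odd lines, but it still means an AWS account, permissions and a deployment step, which is exactly the hurdle that keeps this a developer product rather than a consumer one.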

But I'm still hopeful. By starting down this path, Amazon may be laying the groundwork for developers to experiment with the IoT button and create compelling integrations, ones which, I have no doubt, Amazon will begin to showcase as examples of what's possible with a do-anything button. And who knows, maybe down the road they'll release a more consumer-friendly version that people can simply buy, register and assign an action to.

If I know anything about Amazon, I suspect they might be thinking exactly along these same lines.

But what about making money, you ask? OK, so while a general-purpose Dash button may not be as directly tied to Amazon purchases as the first generation, the Seattle online giant might realize that owning the one-button physical interface for IoT could actually be an indirect way of becoming an even bigger consumer IoT powerhouse than it has already started to become with Echo and Alexa. With a do-anything button, not only would they own the voice interface layer for our connected lives, but they could start to pave the way toward owning the physical interface.

The benefits of having access to all the data associated with tens of millions of distributed buttons in our homes, our workplaces and everywhere would be amazingly powerful. And, yes, a little scary.

But when’s that ever stopped Amazon?


Wilson’s Connected Football Will Create New Russell Wilsons At a Faster Rate

Hacking Gladwell: Welcome To The Era of Augmented Expertise

A few years ago, author Malcolm Gladwell posited a fairly straightforward idea that fast became a core tenet of the modern self-improvement industrial complex, this belief that if anyone spends a whole lotta time doing something — ten thousand hours to be precise — they’ll become really, really good at it.

Gladwell's idea, since coined the 'ten thousand hour rule', is based on research by K. Anders Ericsson, who developed a hypothesis that a person becomes elite at something, a true expert, through years and years of deliberate practice. Gladwell took this concept and ran with it in his book Outliers, where he pointed to people like Bill Gates and The Beatles as examples of those who developed elite skill by logging the necessary ten thousand hours at their respective crafts over the course of many years.

Hacking Gladwell’s Law

Increasingly, however, instead of doing it the old-fashioned way, people are gaining expertise, or at the very least producing elite, expert-like results, in much shorter time periods with the help of advanced technologies. Whether it's perfecting your shot with sensor-powered basketballs, learning how to code online with e-learning services or using an app-driven cooking device that gives you chef-like results, today's technology increasingly offers products that can help someone become very good at something in a much shorter time period.

One major part of this trend is the democratization of online learning. Whether it's the availability of coding courses online that provide access to technology education that previously required a college course in computer science, or the proliferation of how-to videos on YouTube, the Internet provides access to world-class expertise in ways that were never possible before.

But it goes beyond online education. In fact, one could argue that it's the application of newer technologies to provide real, physical-world guidance that is the driving force behind completely new expertise-building experiences unlike any we've seen before.

Take sports. With the arrival of connected footballs, kids — whether in their backyard or within a formalized sports program — will know pretty quickly how tight their spirals are and how quickly they are improving, both against their own past performance and against that of others. In short, technology is bringing Moneyball-level analytics to the backyard, not to mention gamifying the experience so kids don't need an entire team to get better, but can simulate the experience of a basketball or football game with a little brother or sister.

And how about becoming a master brewer? Before, brewing great beer usually involved thousands of hours and hundreds of late nights before you got to the point where that grain concoction of yours was good enough to win an award, let alone sell at the local pub. But now, companies like Picobrew and Brewie are applying IoT technology to the process of beer brewing, not only to assure consistent results but also to enable the aspiring brewer to brew recipes from master brewers almost from the get-go.

Forget Augmented Reality: Here Comes The Era of Augmented Expertise

In some ways, much of this isn't any different from what's been happening for as long as mankind has existed: the arrival of ever more advanced toolsets, which bring about expert results and give people repeatable elite skills in ways that had previously been out of reach. The most obvious fields here are athletics and the military, where the application of new technology results in measurable improvements for people in shorter periods of time.

But unlike before, the fusion of computing technology with the physical world has created an acceleration of understanding in what seems like every conceivable arena. We'll call this trend augmented expertise, where the arrival of low-cost sensors, GPS-like guidance systems for nearly everything and real-time analytics are combining to accelerate learning in practically every discipline. Whether it's future musicians using connected pianos or would-be chefs achieving restaurant-like results with a new sous vide cooker, the time it takes to achieve high-quality — or even expert — results across a range of disciplines is shrinking.

All this said, God-given talent still matters. If the last few decades in the world of sports have shown us anything, it's that amazing results happen when you combine born talent with advanced technology and training techniques. We are in an era where world records hardly stand for a year or two, let alone a decade. Still, with the arrival of new technology and the era of 'augmented expertise', we are entering a new age where technology will help us amateurs go from zero to sixty much faster, and maybe even make a really good steak while we're at it.

Follow Mike on Twitter or read his (semi) weekly blog/newsletter on smart home and IoT trends.

This post was first published in Forbes.


Why The Kitchen Is Now The Most Interesting Battleground In The Connected Home

When I first started writing about the connected home back in the 90s, it was apparent even in those early days that the Internet and in-home networks were a foundation for massive change. Whether it was new Internet-based phone services like Skype, streaming music services like Rhapsody or some of the early smart home products, it was clear that the advancement of computer technology and the network would change things in significant and unforeseen ways.

But for all the ways the connected home has changed our lives, it became clear in the intervening decade that the defining battle in this space was in the digital living room. Just about the time Wi-Fi became a household word, a steady drumbeat of innovation around Internet video could be heard, growing louder each year, until today, when we find ourselves living in a completely new TV landscape, one where the barbarians have not only broken down the gate, but have moved in and kicked out some of the incumbents.

Just consider: Netflix is now the biggest paid video subscription provider in the US. Apple TV and Roku households number in the tens of millions. And nowadays, the biggest talents in Hollywood are more interested in doing creative deals with Amazon and Netflix than with old-school TV networks and movie studios. At the same time, mobile screens have become part of our mobile living rooms, changing how a generation of consumers consume and interact around entertainment. The end result is that a hundred-billion-dollar-plus industry has been transformed, as tens of billions have shifted from the incumbents to companies that rethought an industry.

Is June The Roku Of The Connected Kitchen?

But more than a decade into the Netflix era, the dust is settling. Sure, there’s still lots of excitement ahead around ever-more immersive video experiences, exciting new technologies like 4K and VR (both of which are immensely interesting), but at this point we all know the old living room is not the new living room and have a pretty good idea where this story is going to go.

That’s not the case with the kitchen. In fact, this central gathering place in the home is on the precipice of a massive wave of change and, like the digital living room a decade ago, is showing many of the same early signs:

-Lots of startup activity and investment
-Incumbents actively trying early experiments
-Some early big swings at trying to reimagine how cooking, food delivery and the kitchen itself could look

Ok, so where exactly IS the opportunity here?

Simple: Everywhere. Just as we saw how new technology changed how people consume, acquire, store and talk about entertainment in the living room, so are we beginning to see how technology will change the entire delivery, storage and consumption of food in the coming decade.

But the comparison to the living room isn't perfect, since food and the connected kitchen are a much bigger opportunity, if for the simple reason that everyone eats. Overall food budgets are probably 5–10 times those of video entertainment, and the issue of being more efficient with food extends well beyond middle- and upper-class households (video entertainment's sweet spot) to every region worldwide. And unlike video entertainment, those of us who cook are engaging in the act of creating something. Consumers spend massive amounts of time and money trying to create better food, and for many of us technology will change how we do that in nearly every way in the coming decade.

This will happen as new technology and resulting business models are applied to every step of putting food on the table: shopping/replenishment, delivery, storage, preparation, serving and consumption. Part of it is the network itself, as connectivity and computing enable us to better understand what food we have, how to shop for it and how to cook it. But it's so much more, with new cooking technologies such as RF cooking, imaging, molecular sensing and more all making their way into our kitchens. We're seeing professional cooking techniques democratized and made available to consumers, and high-end, messy hobbies becoming more and more automated to enable busy consumers to try their hand at them. And we haven't even mentioned the widespread health benefits of being able to better understand your food through technology.

Predicting the future is hard, but I think it’s safe to say that at some point we’ll likely see all of the following become commonplace:

-Food subscriptions, connected food commerce and near real-time, automated food delivery (drone delivery of your Blue Apron fresh meal anyone?).
-More and more levels of cooking automation and advanced tools to help us take our cooking "craft" to higher levels if we so desire (or we may just let the machines do it all; yes, in some form, cooking "robots" are coming).
-Better information to help us handle our food so we don't waste nearly as much as we do today.
-More unprocessed foods coming into the home, as technology allows those who want to reduce the "industrial" processing of the past to collapse some of those functions into devices in the home.

I could go on and on, but the reality is I expect all of this to happen. The one ingredient that's missing is that this industry still needs some companies to step up and set the pace, some leaders to set the template.

In other words, we’re still searching for the Netflix and Roku for the kitchen.

Michael Wolf is the creator of the Smart Kitchen Summit, the first event that looks at the future of the connected kitchen.


The Coming Smart Home Shakeout

I hate to be the bearer of bad news, but I think we’re in for a little turbulence in the smart home market.

Not because the market isn’t growing. It is. It’s just not growing in a way that can make every investor, startup and big tech company happy.

This isn’t completely unexpected. Markets — especially new consumer technology markets as they are searching for defining use-cases, form factors and hero products — take time to figure themselves out.

The smart home industry is trying to figure itself out.

In the meantime, some of those investing in the space are disappointed with their early results. Companies like Best Buy, Lowe's and others that have jumped into this market with gusto aren't always seeing the type of demand they want for the products, given the amount of shelf space they've allocated.

The early success has come in a few product categories like cameras, thermostats, maybe a few door locks. In other words, product categories consumers get immediately. Smart home "systems" that include hubs and multiple devices have been selling more tepidly. This is a market education and messaging issue, as consumers still don't fully get the concept of the smart home and certainly aren't convinced they need to plop down hard-earned cash for one.

But let’s not blame the consumer. They can’t be expected to understand the smart home while the smart home industry is still trying to figure itself out.

It reminds me of the prehistoric days of the video streaming market in 2004 when we in the industry knew at some point consumers would stream video and other great content to their TVs and around the home, but the consumers didn’t yet know it. Startups were taking early stabs at creating new categories like the media adapter (nostalgia link: here’s me writing about one for Network World), while others like Microsoft tried to build media streaming systems around their existing strengths and ultimately failed.

But the comparison isn’t perfect, because unlike whole-home video streaming, the smart home has been around in some form or another for 30–40 years. This early market of X-10 and other tech led to a more modern generation of DIY smart home products, brands such as INSTEON and then Mi Casa Verde, businesses built upon the hard work of their founders, who through grit and determination managed to create a market and community around their products and ecosystems. Other companies like Belkin entered the market with WeMo, finding success with an approach built around Wi-Fi. And then we saw a rush of upstart efforts like SmartThings, Revolv, Wink, Iris and Staples Connect, each with different but similar approaches.

All the while, consumers weren’t paying all that much attention. When they did decide to buy a product we in the industry consider a “smart home” system, the consumer thought they were buying a network camera to watch their dog or a connected thermostat to maybe save a little money.

Disconnect.

In the meantime, we have new efforts like HomeKit and Google’s Weave/Brillo that hold some promise. We have obvious demand for products like smart home security systems because, well, consumers understand home security.

In other words, the industry is slowly figuring itself out, doing the hard, grinding work it takes to develop the underlying technology that can lead to new services consumers eventually will "get" and maybe someday find indispensable. But in the meantime, you have companies like Wink, which we recently learned seems to be on the brink, beleaguered by a combination of the near-blinding audacity of its parent company Quirky, product recalls and softer-than-expected demand for things called hubs. And Wink is probably the tip of the iceberg. I think there may be other companies or divisions like Wink, wholly invested in product strategies that might not be resonating with the market, that we'll likely learn about soon enough.

The good news is we'll get there. We have some hero products that are seeing strong demand, we have big service providers investing tens of millions of dollars in market creation, and Apple and Google are doing what they do. We have innovative startups still innovating in horizontal categories like interfaces, cloud rules engines and more, while there is also some exciting stuff going on in product categories like water, yards, the smart kitchen and more.

Wikipedia defines a market shakeout as “a term used in business and economics to describe the consolidation of an industry or sector, in which businesses are eliminated or acquired through competition.”

It goes on to say that “shakeouts can often occur after an industry has experienced a period of rapid growth in demand followed by overexpansion by manufacturers.”

I think we'll see more of both of these: rapid growth and consolidation. Rapid growth in hero categories and, eventually, fuller smart home "systems" as the market figures out what those should look like and the consumer tells them what they like. Consolidation, too, as companies that have developed technology either don't find enough of a market themselves or fall into the hands of someone with deeper pockets who thinks they can develop the market.

Either way, buckle up. It’s going to be an exciting but bumpy ride.

Originally published at my company blog.
