Google is About to Enter the Field of Battle — Part 2: Here Comes the Hammer

Brooks Hamilton
12 min read · Feb 19, 2023


Forth, and fear no AI challenge!

Arise, arise, Googlers of Mountain View!
Mice shall be shaken, keyboards shall be splintered!
An AI-day, a Code Red day, ere the sun rises!

Code now, code now, code!
Code for victory, code for Google, and the world’s transformation!

(Inspired by Théoden, King of Rohan, and produced via ChatGPT)

A Tale in Three Parts

Part I: Bad Demo Day tackles the demo and stock sell-off events and sets the context for what really drives these two companies. Part II: Here Comes the Hammer addresses Google’s capabilities, challenges, and opportunities. Part III: Predictably Unpredictable delves into why Bing’s seemingly unpredictable behavior was actually predictable.

The Fight Has Barely Started

Let’s return to the original questions about Google’s core ability to innovate and compete:

  • Does Google have the resources to compete?
  • Did Google see this coming or miss it entirely? Why was it not leading?
  • Is the team capable of succeeding?

My Take

Google has not yet engaged the enormous, highly specialized war chest it has carefully accumulated in preparation for this exact confrontation.

I would absolutely not count Google out, nor do I think it is asleep at the wheel.

This will be a challenge for the team but not an insurmountable one. It could even be the existential challenge that galvanizes the organization.

Let’s dig into what Google brings to the negotiating table.

Hagar the Horrible, by Gary Hallgren, on negotiation strategies

Does Google have the Resources to Compete?

Oh, baby, do they ever. Google brings a ton to the party: an AI research team without peer, an innovative engineering team accustomed to working at tremendous scale, an enormous existing user base, and shipping containers of greenbacks.

Research Team

Google’s research team dwarfs the others. Google’s research arm is spectacularly larger and better resourced than any other AI research group on the planet (excluding China, because no one knows exactly what is going on there). Google has the accumulated research horsepower, experience, hardware, and budget to out-compete any company or national government in the field of AI today (again, excluding China, maybe). Lemme explain why.

Google has not just one, but three highly capable research units:

  • Google Brain. This is Google’s in-house, AI-focused research team, based in Mountain View with 20 remote offices. It is a component of the Google AI organization, which handles both hardware and software. In fall 2022, this team numbered roughly 3,500 people, approximately 10x OpenAI’s headcount of about 375 (as of late January 2023).
  • DeepMind. Google directly owns this quasi-independent unit based in London. Remember the team that beat world champion Lee Sedol at the ancient game of Go? That’s them. Remember the team that figured out how protein folding works and then predicted the structure of nearly every protein in the human body over Christmas break? Yeah, that’s them too. And those are just their popular projects.
  • Anthropic. The third arm is different from the other two. Here, Google is a major investor ($400 million Quahog clams) in the AI research group but does not outright own it [Bloomberg]. This parallels the relationship Microsoft has with OpenAI. Anthropic is a team with extremely high mental horsepower. For example, one senior member is a co-inventor of the transformer model, the innovation that kicked this wave into high gear with the 2017 paper, “Attention Is All You Need” (FYI, that work was done while he was at Google. Just sayin’). Anthropic has since done groundbreaking work on transformer interpretability and on value-aligning AI systems. (A minimal sketch of the attention operation at the heart of the transformer follows this list.)
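For readers who have never seen what the transformer actually computes, here is a minimal, illustrative sketch of single-head scaled dot-product attention, the core operation from “Attention Is All You Need.” This is a simplification for intuition only (plain numpy, one head, no masking), not anyone’s production code.

```python
# Minimal single-head scaled dot-product attention, the core operation behind
# the transformer ("Attention Is All You Need", Vaswani et al., 2017).
# Simplified for intuition: real transformers add multiple heads, learned
# projections, masking, positional information, and much more.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # how much each query attends to each key
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability for the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the keys
    return weights @ V                               # weighted mix of the values

# Toy example: 3 tokens with 4-dimensional embeddings
rng = np.random.default_rng(0)
Q = K = V = rng.normal(size=(3, 4))
print(scaled_dot_product_attention(Q, K, V))
```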

Google defines the AI research agenda globally. What does all this investment buy you? One way to measure thought leadership is the number of papers accepted at major conferences. Google, along with DeepMind, produced more accepted AI papers at the major conferences (NeurIPS and ICML) than the rest of the corporate field combined (30%+ more).

Academic labs are now minor players. The major universities (not included in the comparison above) publish quite a bit of research but have paltry budgets. At this point, they contribute important theoretical work but are not equipped to meaningfully compete with the industry AI labs.

Model-building expertise. The Google AI team does more than write papers, though. They train models. Big models! Here is a quick size comparison of the models, measured in number of parameters (think of parameters as something like neurons; more of them generally means more capability).

Courtesy of LifeArchitect.ai: OpenAI (red), Google (light blue), Microsoft (yellow), Facebook (mauve).

Source: LifeArchitect.ai’s overview and catalog of language models, “Inside Language Models”.

Google’s models dominate the research field. Notice how two of the three largest models are from the Google / DeepMind team: PaLM (540 billion parameters) and Gopher (280 billion parameters). For comparison, GPT-3, the model behind ChatGPT, has 175 billion parameters. The research team does not just sit around and think big thoughts. This is a group that has trained models far larger than OpenAI’s and developed, from scratch, much of the hardware and software needed to do so. They are not just thinkers; they are doers.

Special sauce. The person who leads the Google Brain team, Jeff Dean, was directly responsible for inventing and developing a number of those technologies. In fact, there is a website dedicated to collecting Chuck Norris-style anecdotes about him, except that most of them are true:

  • “Jeff Dean can optimize a piece of hardware by dropping it on the floor.”
  • “He wrote an application to help the CDC manage specialized statistics for epidemiologists. It’s still in use today. He wrote it as a summer intern in high school.”
  • Likely not true: “Jeff Dean writes directly in binary. He then writes the source code as documentation for other developers.”

OpenAI’s contributions. While this is about Google, I would like to call out OpenAI’s research contributions. OpenAI is far smaller and does not produce as many papers, but it has made two strategic contributions.

The first contribution was to test a theory that, at the time, seemed pretty wild and speculative: if they increased the model size to a ludicrously large number, novel behavior would emerge. Yes, you read that right. The thinking was, “If we make it really big, then new capabilities just…happen.” And it worked. They scaled the model from GPT-2’s 1.5 billion parameters to 175 billion parameters (GPT-3, “Language Models are Few-Shot Learners”) and may have just nudged the course of human history.

The other is the research done by physicist Jared Kaplan and colleagues to work out the specific relationship between a model’s parameter count and its error rate, i.e., how much smarter it gets as it grows (“Scaling Laws for Neural Language Models”). Crucially, that work also found no observed end in sight to making these models more capable.
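To make the shape of that relationship concrete, here is a rough, illustrative sketch of the parameter scaling law. The power-law form and the constants are approximate values reported in Kaplan et al. (2020); the printed loss numbers are indicative only, not authoritative benchmarks.

```python
# Illustrative sketch of the parameter scaling law from "Scaling Laws for
# Neural Language Models" (Kaplan et al., 2020): test loss falls as a power
# law in model size, L(N) ~ (N_c / N) ** alpha_N, with no cliff in sight.
# The constants are approximate values from the paper; outputs are indicative only.

ALPHA_N = 0.076   # approximate scaling exponent for (non-embedding) parameters
N_C = 8.8e13      # approximate reference parameter count from the paper's fit

def predicted_loss(n_params: float) -> float:
    """Cross-entropy loss predicted purely from parameter count."""
    return (N_C / n_params) ** ALPHA_N

for name, n in [("GPT-2 (1.5B)", 1.5e9), ("GPT-3 (175B)", 175e9), ("PaLM (540B)", 540e9)]:
    print(f"{name:>13}: predicted loss ~ {predicted_loss(n):.2f}")
```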

Engineering Team

In addition to AI research talent, Google has some of the best technical engineering on the planet, with the talent and experience to implement AI solutions at massive scale. Other organizations may be able to pull off the research, but few can scale it to hundreds of millions of users constantly banging on it.

A few things Google’s Engineering team has delivered or is working on:

  • World-class, extreme-scale products: Gmail, Maps, Docs, Android, Google Fiber, YouTube…you know, stuff you depend on like a utility.
  • AI Hardware: Tensor Processing Units (TPUs). When Google realized that modern CPU and GPU chips were not well suited to training AI models, it designed and built its own chips and supercomputers.
  • AI Software: TensorFlow and Keras, two of the most broadly used software frameworks in the AI industry. Jeff Dean was one of the major contributors to TensorFlow via his work on DistBelief with Geoff Hinton and Andrew Ng. The author of Keras is the brilliant François Chollet, a Google researcher who has also done very thought-provoking work on how to evaluate intelligence. (A minimal tf.keras sketch follows this list.)
  • Core Cloud Technologies. The team developed software that can be distributed across thousands of computers. Their work broke the conceptual logjams and ushered in the rise of cloud computing: Bigtable, Spanner, MapReduce (the inspiration for Apache Hadoop), etc. Guess who was a big contributor? Jeff Dean.
  • Magic. Quantum Computing. Yeah, so they’ve got that going for them.
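As a tiny illustration of why TensorFlow and Keras became so ubiquitous, here is a minimal sketch of a toy classifier using the public tf.keras API. It shows the style of the framework and nothing more; it is not a description of anything Google runs internally, and the layer sizes are arbitrary.

```python
# A minimal sketch of a toy classifier in the public tf.keras API, just to
# show why the framework caught on: a few lines describe, compile, and
# inspect a model. Layer sizes here are arbitrary placeholders.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),                      # e.g. flattened 28x28 images
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),   # 10-class output
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.summary()  # prints the layer-by-layer parameter counts
```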

Gigantic User Base

I struggle a bit with how to even measure this, since it varies considerably by product category. For example, Android has 2.5 billion users. To focus on the search business: Google Search is used by approximately 4.3 billion people annually.

Cash and Other Things People Like

Cashola. Like some other tech companies, Google has a massive cash position and a giant equity base with which to build teams or acquire companies. Their cash position is $114 billion (give or take a few hundred million).

Equity. As of mid-February, Google’s market cap is $1.2 trillion (give or take a few hundred billion).

Did they see this coming?

Why didn’t Google move first?

Google did not have a publicly available, AI-enhanced search product on the market before Microsoft released the “enhanced” Bing. With all of those expensive toys we just reviewed, why not? There are four reasons Google did not move first:

  1. They were aware of the PR shitstorm that would likely accompany a first mover.
  2. It would erode their high margins.
  3. The user interface for “AI-enhanced search” isn’t clear yet (see point #1).
  4. No one fully understands these models (see point #1, again).

First Out of the Gate Steps on the Rake

Good decisions come from experience. Where does experience come from? Bad decisions. Google has been down this path before, is familiar with how AI can surprise you, and knows just how badly it can burn. One of the better-known incidents was the gorilla image-labeling debacle. Very, very bad.

The search business can be a fickle one. Which search engine you use is 100% voluntary. If consumer opinion shifts en masse to see Google in a bad light and individuals switch to an alternative search method, marketing budgets will be spent elsewhere. Caution here isn’t just a high moral stance; it is existentially financial as well.

Demo Law: 60% of the time, it works every time. At this stage, all of these products are prototypes. Google likely understood the dynamics of combining a new technology that behaves in surprising ways with hundreds of millions of users and use cases. The likely result is a public relations disaster. That does not help you sell ads.

Let others be the lightning rod.

Doc from Back to the Future about to get beamed up

Self-Induced Margin Erosion

Getting ahead of yourself? Let’s shift perspectives from product strategy to the CFO. To preemptively launch this product, they would need to tell Wall Street, “Hey, we are going to decrease our margins on our primary money making engine to address a competitive concern that doesn’t exist today. You all good with that?”

Organizational Lego. Instead of pursuing financial self-immolation, Google took a pragmatic and flexible approach. They have developed the core pieces (research, engineering, users, and cash) and are ready to configure them in whatever way is needed to address the competition.

You Want Me to Talk to My Search Results?

There is an open question as to whether placing a chatbot as an intermediary between you and your search results is necessarily the best user experience (even when the chatbot isn’t arguing with you).

Surprise, Surprise!

The fact is that no one fully understands every aspect of these models or how they will respond across diverse scenarios. Google may not want to expose its revenue stream to the predictable outcome that the model will behave in unpredictable and undesirable ways. We are certainly seeing that play out with Microsoft and Bing. Live and in full color.

Can the Google Team Execute?

This is the area where I have some concern. Google has its work cut out for it: transforming its leadership and culture and managing public-market expectations while it shifts to compete across its major lines of business.

Time for Transformation

Here is a question that may inform our view of whether Google will weather this: why didn’t they have an AI-enhanced search engine just sitting on a runway, fully gassed up, and ready to roll? Instead, they apparently threw something together at the last minute for a demo. As Andy Grove said, “Success breeds complacency. Complacency breeds failure. Only the paranoid survive.”

Peacetime vs. Wartime Consigliere

Different competitive landscapes call for different leadership personalities and cultural styles to thrive. Google may need to transform its leadership culture to weather this fight. It will likely require hard decisions in the days ahead.

“You’re out, Tom.”

“Why am I out?”

“You are not a wartime consigliere. Things may get rough with the moves we are trying.”

“Are you going to come along with me with these things I have to do, or what?” Coppola, The Godfather, 1972.

Wartime Consigliere. Ben Horowitz crystallized and transformed the term “wartime consigliere” for the business world in his post “Peacetime CEO / Wartime CEO”. The concept is borrowed from Mario Puzo’s The Godfather and influenced by Andy Grove’s management writing. Here are a few nuggets from that post:

  • Peacetime CEO spends time defining the culture. Wartime CEO lets the war define the culture.
  • Peacetime CEO knows that proper protocol leads to winning. Wartime CEO violates protocol in order to win.
  • Peacetime CEO aims to expand the market. Wartime CEO aims to win the market.
  • Peacetime CEO knows what to do with a big advantage. Wartime CEO is paranoid.

Founders’ Permission. Many times, founders are tapped to transform their creations. For whatever reason, they are endowed with the authority to make difficult changes, and the organization is more willing to accept those changes from them. I would not be surprised to find founders Sergey and Larry spending more time on campus. Recall that this is the same team that figured out how to consistently own 80%+ market share of one of the largest, highest-profit revenue streams on the planet.

Culture. No large organization is immune from bureaucratic bloat. At the end of 2022, Google employed approximately 190,000 people, though that number may be down about 10,000 after recent layoffs. That is an enormous number of people to motivate and shape.

Public Company Handcuffs. Herein lie the golden handcuffs of the public company. Google needs the flexibility to react quickly and make bets that will not pay off for years, whereas Wall Street penalizes missed expectations on a quarterly basis. This is the challenge of differing investment horizons between the public markets and operating companies like Google. When companies return results below expectations, the Street rarely responds with smiles, and the stock price may suffer.

Equity is a big part of compensation for employees. The higher up the hierarchy one goes, the larger equity tends to be as a component of compensation. Some people are going to look for greener pastures and easier roads.

While Google is absolutely enormous, I wonder if it would make sense to take it private. It would lose some ability to use its stock to acquire companies and retain talent, but it might gain the latitude and freedom of operation it needs. In a brilliantly creative move, Dell went private, acquired a much larger public company, then went public again, making money each time. Michael Dell, every bit as much as Steve Jobs, can create reality-distortion fields.

Next Move on the Board

Google has declared a Code Red, is drawing resources together, and will respond in the short term. In addition, it will likely green-light the product teams to engage more deeply with the research teams and build a pipeline of products to release over the coming quarters and years. There are likely some very astute strategic minds grinding on what the AI campaign will look like over the next decade.

Hi! My name is Jeff Dean. I’m here with the Coders of the Westmark. Jackson, The Lord of the Rings: The Two Towers, 2002.

Google has an interesting window ahead of it. Microsoft is experiencing significant negative feedback in the public and press with the “enhanced” version of Bing. Google now has an opportunity to shape the conversation and lead from high ground.

Game on. Jackson, The Lord of the Rings: The Two Towers, 2002.

I’ll get the popcorn.

Next up: why Bing’s rather unpredictable (and shocking) behavior was actually rather predictable.


Brooks Hamilton

Austinite. AI Entrepreneur. 15+ years building and implementing enterprise machine learning products.