Predict

where the future is written


“Navigating the AI Stocks Landscape” by Dr. Brian Scott Glassman

9 min read · Oct 28, 2024


Artificial intelligence (AI) holds immense potential for revolutionizing work practices; however, it is also a rapidly evolving, intricate, and dynamic industry in which only those deeply involved can accurately identify the current leaders. In particular, a select number of billion-dollar-valued companies (AI originators) are creating the technology that powers AI applications and providing those tools to a large slate of AI startups. This article provides a high-level overview of these AI originators and their market categories, makes predictions about the industry's future, and discusses the publicly traded companies available for investment. Note: Stock growth figures represent the one-year change ending October 28, 2024.

Types of Generative AI

To understand the AI market landscape, it is crucial to grasp the function and capabilities of AI, particularly its most advanced and powerful subset: generative AI models. These models can interact with users in various ways, including text, voice, images, and videos. Generative AI for text input/output was popularized by OpenAI (Private VC Backed), in which Microsoft (NASDAQ:MSFT — One Year Growth 27%) holds a 49% stake. Beginning with the release of GPT-3 in June 2020, popularity surged with the launch of ChatGPT, powered by GPT-3.5, in November 2022. These models enable users to engage in chat-based interactions to create and manipulate text for a broad range of purposes, from simple grammar and spelling corrections to composing entire books and developing new concepts. Currently, the text-based generative AI market remains the most expansive area of ongoing development and competition among major players, with the focus on making the models smarter.

Simultaneously, generative AI for image identification, creation, and manipulation was popularized by companies like Midjourney (Private VC Backed). This innovation inspired both creative professionals and a new wave of amateur creators to produce stunning AI-generated art collections. By 2024, conversational generative AI applications had emerged, allowing users to communicate with AI in natural, multi-language conversations, with minimal lag between questions and answers, creating a seamless user experience.

Generative AI for video, however, remains a niche area, primarily explored by smaller creators and studios due to high operational costs driven by its electrical power requirements. Lastly, spatial AI applications, which operate in 3D environments, are currently limited to systems such as Tesla's (NASDAQ:TSLA — One Year Growth 34%) Full Self-Driving system, still in beta, and AI robotics developed by companies like Tesla and Boston Dynamics.

Tesla's solution is in a key position to dominate the future of self-driving due to its camera-based approach, which can be more easily integrated into other OEMs' platforms. Further, over the last year, each release has demonstrated increasing levels of ability and safety, putting it on track to achieve human-level or better driving safety in 2025. With the opportunity to sell its solution via its current vehicle lineup, license it to other OEMs, or launch a fleet of delivery vehicles or robotic taxis, Tesla is in a market-leading position for AI-enabled self-driving.

AI Silicon Hardware Providers

The development and operation of these computationally intensive generative AI programs depend heavily on AI hardware providers and cloud infrastructure. Among hardware manufacturers, Nvidia (NASDAQ:NVDA — One Year Growth 243%) stands out, offering high-end graphics cards ranging from $20,000 to $50,000. These GPUs serve two main functions: training AI models and running them for end users, also known as inference. It is projected that training accounts for only 10–20% of computational usage (both in electricity and hardware), while the majority, 80–90%, is consumed by inference, i.e., running the AI for customers.
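The projected split above can be sketched as simple arithmetic. In this minimal sketch, only the 10–20% training share comes from the article; the total GPU-hour budget is a hypothetical figure chosen purely for illustration.

```python
# A minimal sketch of the training-vs-inference compute split described above.
# The 1,000,000 GPU-hour total is a hypothetical assumption; only the
# 10-20% training share is taken from the article's projection.

def split_compute(total_gpu_hours: float, training_share: float):
    """Divide a GPU-hour budget between training and inference."""
    training = total_gpu_hours * training_share
    inference = total_gpu_hours - training
    return training, inference

total = 1_000_000.0  # hypothetical annual GPU-hours for one provider
for share in (0.10, 0.20):  # the article's projected training range
    train_h, infer_h = split_compute(total, share)
    print(f"{share:.0%} training -> {train_h:,.0f} h training, {infer_h:,.0f} h inference")
```

Even at the high end of the training range, four of every five GPU-hours go to serving customers, which is why the inference market matters so much in the next section.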

Training an AI model from scratch is akin to compressing centuries of knowledge into a compact software algorithm in a matter of months, requiring thousands of high-end GPUs. This process demands meticulous planning and significant electricity consumption, making it feasible only for companies with substantial financial resources. Nvidia currently dominates this space, and thanks to its specialized chip designs and supply chain partner TSMC (NYSE:TSM — One Year Growth 127%), its lead is expected to remain unsurpassed for the next two years.

However, the landscape changes when it comes to running AI models. By design, AI models are created to be compact and quick, utilizing the vast knowledge embedded in them to generate answers in real time. These models can run on a variety of hardware, though Nvidia's and AMD's optimized GPUs remain preferred. Yet, companies like Amazon (NASDAQ:AMZN — One Year Growth 42%) and Groq.com (Private VC Backed) are developing a new class of AI-optimized inference accelerators (in the spirit of Google's Tensor Processing Units, or TPUs), which can be five to ten times more efficient at running AI models than Nvidia's high-end GPUs. These accelerators are expected to be the future, and while they are still in limited production due to long manufacturing timelines, they could significantly erode Nvidia's dominance in the inference market; this shift is unlikely to occur until at least the end of 2025.
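The efficiency claim above translates directly into cost per token. The sketch below assumes a hypothetical $2.00 per million tokens GPU baseline (an illustrative figure, not a vendor price); only the five-to-ten-times efficiency range comes from the article.

```python
# Hypothetical cost comparison for the efficiency claim above: if an
# inference accelerator is 5-10x more efficient than a GPU, the cost per
# token falls proportionally. The $2.00 baseline is an assumed figure.

def accelerator_cost(gpu_cost_per_mtok: float, efficiency_multiple: float) -> float:
    """Cost per 1M tokens on an accelerator that is `efficiency_multiple`
    times more efficient than the GPU baseline."""
    return gpu_cost_per_mtok / efficiency_multiple

gpu_baseline = 2.00  # assumed $ per 1M tokens on a high-end GPU
for mult in (5, 10):  # the article's 5-10x efficiency range
    print(f"{mult}x accelerator: ${accelerator_cost(gpu_baseline, mult):.2f} per 1M tokens")
```

A five-to-ten-fold drop in serving cost is the kind of margin pressure that could reshape the inference market the paragraph describes.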

AI Software Landscape

On the software side of AI, there are three high-level categories: foundational AI models, AI framework providers, and AI knowledge hubs. Foundational AI models are large, complex software algorithms costing tens to hundreds of millions of dollars to develop. These models can be further tailored (aka fine-tuned) to suit specific enterprises' needs. Due to the immense cost of building foundational models from scratch, only a handful of players with billions in financial backing can compete in this space with unique offerings. Next, AI frameworks provide enterprises with the tools to leverage these models; most of these players are below $3B in market cap and privately held. Finally, open-source AI models and resources are shared through platforms like HuggingFace.com (Private VC Backed) and GitHub.com (a Microsoft subsidiary). Let's now take a closer look at the key players in foundational AI models.

AI Foundational Model Players: Open-Source Competitors

Meta (NASDAQ:META — One Year Growth 91%) leads the open-source AI foundational model space with its Llama series, which encompasses text, voice, and image-based models. These models are the default choice for enterprises looking to customize AI solutions. Meta offers access to these models through its Facebook app and Meta.ai platform. However, the monetization strategy for these models is still unclear; one possible use is to support Meta's core Facebook platform. Meta has allocated $40 billion to AI development in 2024, with a potential strategy of extracting licensing revenue from enterprises using its models. However, little news about Meta's AI licensing revenues has emerged.

Google (NASDAQ:GOOG — One Year Growth 34%) is close behind with its Gemini series of AI models, but the company faces an existential challenge. AI-driven competitors like OpenAI's ChatGPT and the privately VC-backed Perplexity.ai have demonstrated effective web search capabilities, posing a threat to Google's core search advertising revenue stream. In response, Google has integrated AI-generated results into its search platform, incurring operational losses. Additionally, AI-powered search tends to divert users' attention away from paid search results. Google hopes that reducing inference costs in the near future will make AI-powered searches more financially viable. Another key player is Elon Musk's xAI, whose large, high-quality Grok-2 model is integrated into X (formerly Twitter). Other companies developing open-source AI models include Snowflake (NYSE:SNOW — One Year Decline 18%), Mistral, BLOOM, and more. A trend appears to be emerging, with large technology companies building their own foundational AI models to avoid being left behind; however, their monetization strategies remain largely unclear.

Closed Source Foundational AI Models

In the closed-source arena, key players include private companies like OpenAI, with its ChatGPT models, and Anthropic, known for Claude, both privately backed by venture capital. These companies have well-defined monetization models through consumer subscriptions and enterprise usage plans. For instance, OpenAI is projected to generate $3.7 billion in revenue in 2024 and $11.6 billion in 2025, although it is expected to incur losses of $5 billion in 2024, primarily due to operational costs, salaries, and overhead. OpenAI and Anthropic have also spurred the growth of numerous startups that develop custom AI applications leveraging their enterprise usage plans. Both companies’ pricing is positioned at the higher end of the spectrum. While their AI models are powerful and highly attractive, medium to large enterprises may initially consider lower-cost open-source AI models as a way to mitigate high operational expenses.

Apple (NASDAQ:AAPL — One Year Growth 37%) is also expected to release its own AI, called Apple Intelligence, which is optimized for Apple’s ecosystem. Designed to run on iPhones, iPads, and Mac devices, Apple Intelligence handles simple queries on-device while sending more complex ones to Apple’s servers. Given Apple’s reputation for quality and reliability, as well as the infrastructure needed to support concurrent queries from millions of users, the delay in releasing this AI was expected. If Apple Intelligence succeeds, it could strengthen Apple’s iOS offerings and positively impact its stock price.

Similarly, Microsoft is piggybacking on its partner OpenAI's expertise to deploy scalable generative AI via Microsoft Copilot across its operating systems and MS Office suite of products. With a first-to-market enterprise solution, it is the default market leader for enterprises seeking to foray into AI with low-risk, easy-to-adopt solutions.

Open Source vs. Closed Source

The artificial intelligence landscape is currently witnessing a face-off between open-source and closed-source models, each presenting unique advantages and challenges.

Open-source models provide large enterprises with the flexibility to download, fine-tune, and deploy AI models on their own infrastructure or cloud services. This approach offers greater control over costs and allows organizations to maintain data within their own systems, which is particularly beneficial for industries with stringent regulatory requirements, such as banking and healthcare. While this option is feasible for large corporations, small and medium-sized enterprises may find it challenging due to the need for specialized AI talent to maintain these systems — a skillset that remains in limited supply.

Conversely, closed-source models, often more powerful and expansive, offer simplified access through enterprise-level connections. However, they typically come with significantly higher costs compared to their open-source counterparts. This pricing structure makes closed-source solutions like ChatGPT and Claude particularly attractive to consumers, startups, and mid-sized businesses. Further, closed-source providers offer value-added features beyond their core models, such as AI-powered databases and built-in AI memory, that make their offerings highly attractive. Notably, closed-source providers are continuously innovating to bridge the gap with open-source alternatives in terms of fine-tuning and data privacy.
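The open- versus closed-source tradeoff described above can be framed as a break-even calculation: self-hosting an open-source model carries a fixed infrastructure and talent cost but a low marginal cost per token, while a closed-source API charges more per token with no fixed cost. Every price in this sketch is a hypothetical assumption, not a vendor figure.

```python
# Hypothetical break-even sketch for the open- vs closed-source tradeoff:
# at what monthly volume does self-hosting an open-source model undercut
# a closed-source API? All figures below are illustrative assumptions.

def breakeven_mtokens(api_per_mtok: float, selfhost_fixed: float,
                      selfhost_per_mtok: float) -> float:
    """Monthly volume, in millions of tokens, at which both options cost
    the same: fixed / (API price - self-host marginal price)."""
    return selfhost_fixed / (api_per_mtok - selfhost_per_mtok)

# Assumed: $10 per 1M tokens via API; $20,000/month fixed self-hosting
# (infrastructure plus AI talent) with $2 per 1M tokens marginal cost.
volume = breakeven_mtokens(10.0, 20_000.0, 2.0)
print(f"self-hosting breaks even at {volume:,.0f}M tokens per month")
```

Under these assumed numbers, self-hosting only pays off at billions of tokens per month, which is consistent with the observation that large enterprises, not small ones, pursue the open-source route.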

Large Cloud Service Providers

As open-source and customized generative AI models require specialized hardware, cloud providers such as Amazon's AWS, Microsoft's Azure, and Google's Cloud are ramping up their inventories of AI-capable servers to meet the growing demand and will profit from the increased resource usage associated with AI services. AWS and Azure are taking the lead in offering servers for hosting open-source AI models, while Azure also offers the higher-powered ChatGPT models via its OpenAI partnership. Clearly, enterprises with privacy and security in mind will opt for AI powered by a provider like AWS or Azure. Large cloud providers are expected to be winners in the AI game, as a significant percentage of future applications are being planned with AI integrations in mind and will be deployed via cloud service providers.

In Closing

The AI market is experiencing significant momentum, with hundreds of billions of dollars in investments flowing into the sector. While numerous large enterprises across various industries may claim to be AI innovators, the true originators of the AI technology used by today’s leaders are limited to a select group of publicly traded companies and billion-dollar, VC-backed AI firms. The AI landscape is considerably clearer than it was two years ago; however, substantial progress is still needed before definitive market leaders emerge in each segment. Investors considering entry into the AI market should conduct thorough research into these publicly traded firms. Given the rapid rate of advancement in the field, it would not be surprising if major breakthroughs continue to shift the leadership positions of AI players. One thing is certain: the world will continue to adopt and be astounded by AI’s technological advancements.

About the Authors

Brian Glassman Ph.D.

Dr. Brian Glassman has a background in innovation and engineering and over 20 years of leadership experience commercializing disruptive technology in enterprise software, where he has led large engineering and product teams. He is currently the Chief Product Officer at a generative AI products and consulting firm. An alumnus of Purdue University and Duke University, Dr. Glassman is also a former professor at NYU. Learn more about him at https://drbrianglassman.com.
