When Your Board Asks About Your AI Strategy

Peter · Published in AIoD · 6 min read · Feb 20, 2024

Will you have an answer?

DALL·E 3 generated image

“In a world of well-defined problems, directors are required to exercise influence over volatility, manage uncertainty, simplify complexity, and resolve ambiguity in the 21st-century digital environment.”

Pearl Zhu, Digitizing Boardroom: The Multifaceted Aspects of Digital Ready Boards

Introduction

As companies work out how to leverage AI to enhance their business, it’s imperative that CEOs and other business leaders be prepared to answer questions about their AI strategy. Board members are likely to ask how the company is using Large Language Models (LLMs), what its strategy is, and how it is managing the risks associated with AI. That means CEOs and business leaders will need to educate board members and the senior management team on what LLMs are, how they work, and how they can be leveraged to help the organization. They will also need to establish an ongoing dialog with the board about AI and how the company will use it.

Board-Level Awareness of AI is Growing

Over the last year, we’ve witnessed the rapid expansion of LLMs, such as those behind ChatGPT, and other forms of Generative AI. While the underlying techniques are not exactly new, the rapid growth in capability and availability has captured the public imagination and sent companies everywhere scrambling to respond. Companies need to figure out how they can use LLMs, as well as other forms of AI, to enhance their business and empower their teams. They also need to think about how they will manage the risks of AI and ensure they create safe and effective solutions.

As board members exercise their responsibilities for oversight and strategy, they are going to have questions for executive leadership about how it will approach the challenges and opportunities presented by modern AI. Board members should be asking questions such as:

  • “What are we doing with LLMs?”
  • “What’s our AI strategy?”
  • “How are we keeping up with competitors?”
  • “Should we be investing heavily in AI or should we have a more measured pace?”
  • “What impact will this have on our customers?”
  • “Will AI help us create new products and services, or will it be used mainly to improve operational efficiency?”
  • “What’s this going to cost, and what returns will we see?”
  • “Do we have the right talent to drive AI initiatives?”

Board members also monitor reputational, political, financial, and other risks, along with how the company responds to and mitigates them. CEOs should anticipate board members asking about the risks of AI and how the organization will effectively control them. There are numerous risks that come with all forms of AI, as well as unique risks that have emerged with LLMs and Generative AI, and board members are likely to have concerns such as:

  • “How do we know that an AI system isn’t biased or unfair?”
  • “Can we ensure the accuracy of the AI models we build or use?”
  • “What can we do to prevent toxic model behavior and hallucinations?”
  • “Are our customers going to balk at the ways we may use AI models? How will we be transparent and accountable to them?”
  • “How will our employees feel about this? Can we allay their fears of being replaced by AI?”
  • “How will we navigate the uncertain regulatory landscape around AI?”
  • “How do we manage the risks around AI models embedded in vendor products we deploy?”
  • “What about the legal issues? Can we protect our proprietary information? Can we be sued for copyright infringement if we use an LLM?”

And of course, the board also has a fiduciary duty to ensure the financial health of the company. They will no doubt have questions about the costs and financial benefits of AI investments. Training and running AI models can be expensive, and the “hidden” costs, such as making reliable data available, can escalate rapidly. Business leaders should anticipate queries from boards, including:

  • “How much is building and training AI models going to cost?”
  • “Who’s going to do this work? Will we need a new team, or can we up-skill existing teams to do this work?”
  • “What kind of training will be required for technologists and business users?”
  • “Where are we going to get the data to train AI models, and how much effort will that require?”
  • “Are we prepared to support the ongoing operations and monitoring of these AI systems?”
  • “How does the ‘promise’ of AI translate into quantifiable financial benefits, e.g., cost savings or new revenue growth?”

Creating an AI-Savvy Board

It is imperative that CEOs and other business leaders be prepared to answer those questions and engage in a constructive dialog. To accomplish this, they may have to educate board members, as well as the rest of the senior management team, on what LLMs really are and how they work.

One of the most important concepts for business leaders to understand is the relationship between an LLM’s output and “truth.” Large Language Models do not have an inherent concept of truth; rather, they are best thought of as a (very) advanced form of mimicry. They can produce well-written, authoritative-sounding responses that are nonetheless entirely inaccurate. These “hallucinations” can be both entirely lucid and entirely incorrect.

There are ways to reduce hallucinations in LLMs, but they cannot be eliminated entirely. One hedge against hallucinations is the “human in the loop” approach: having a person review and fact-check LLM output in situations where accuracy really matters. Not only is this expensive and time-consuming, it’s also subject to failure: LLM output can sound so authoritative and compelling that humans are prone to defer to AI-generated answers more than they would to answers from their peers.
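To make the “human in the loop” idea concrete for a technical audience, here is a minimal Python sketch. The call_llm and send_to_reviewer functions are hypothetical stand-ins for whatever model API and review workflow a company actually uses, and the keyword check is only a placeholder for a real risk-triage policy; the point is the pattern, not the implementation: high-stakes output never ships without a human sign-off.

```python
from dataclasses import dataclass

@dataclass
class Draft:
    prompt: str
    text: str
    approved: bool = False

def call_llm(prompt: str) -> str:
    # Hypothetical stand-in for a real model API call.
    return f"Model-generated answer to: {prompt}"

def send_to_reviewer(draft: Draft) -> bool:
    # Hypothetical stand-in for a human review step
    # (e.g., a ticket in a review queue). Returns True once a person approves.
    print(f"REVIEW NEEDED for prompt: {draft.prompt!r}")
    return False  # nothing ships until someone signs off

# Placeholder triage rule; a real policy would be far more nuanced.
HIGH_STAKES_KEYWORDS = {"legal", "medical", "financial", "contract"}

def answer(prompt: str) -> Draft:
    """Generate a draft, but require human sign-off for high-stakes topics."""
    draft = Draft(prompt=prompt, text=call_llm(prompt))
    high_stakes = any(word in prompt.lower() for word in HIGH_STAKES_KEYWORDS)
    if high_stakes:
        draft.approved = send_to_reviewer(draft)  # human fact-check required
    else:
        draft.approved = True                     # low-risk: auto-approve
    return draft

if __name__ == "__main__":
    print(answer("Summarize our vacation policy").approved)           # True
    print(answer("Draft a response to this legal notice").approved)   # False until reviewed
```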

Senior business leaders and board members will need to be familiar with concepts like hallucination. Businesses will have to invest in educating boards on these topics and in establishing regular communication and updates with their leaders. It’s certainly not reasonable to expect every business leader and board member to become an AI expert, but some familiarity with the topic is a necessity.

AI as a Board Focus

Boards will become increasingly involved in their companies’ AI strategies and in the management of AI risks. While this may initially sound unorthodox, there’s actually a historical precedent: cybersecurity. There are lessons to be learned from how, in just a few years, cybersecurity risk went from an arcane technical niche to a board-level concern. If you look back several years, boards weren’t particularly well-informed about cybersecurity; it was viewed as the domain of the CIO/CTO and the technology department to manage.

As corporate boards became more aware of the severe damage a significant security breach could cause, they started to closely track cybersecurity risks and companies’ investments in enhancing their security. Today, it’s not uncommon for boards to have a technology committee, and one of the domains a technology committee monitors or oversees is cybersecurity. And I’ve known many CISOs (Chief Information Security Officers) who make regular board reports on their cybersecurity readiness, recent incidents, and security enhancement initiatives.

We will see the same thing happen with AI, as it becomes a topic that the entire board or a subcommittee of the board is tracking. Technology and business leaders who are implementing and governing AI in a company will have to make regular reports to the board on how that’s being done and how those risks are being managed.

Engaging Your Board

If you are a CEO or business leader and have not yet fielded a request from your board about AI strategy, you probably will soon, and you should be prepared. Better yet, get ahead of the board: go to them with information about what your AI strategy is, how you’re going to create business value with AI, how you will prepare your company to use it, and how you’re going to manage its risks.

To be proactive, business leaders should:

  • Begin a process of educating the entire board on general AI topics, including LLMs and Generative AI. Some of your board members may already be experts in this space, but it’s important for the entire board to share a basic level of understanding.
  • Develop an AI strategy, or update your existing AI strategy, to reflect emerging opportunities and concerns driven by Generative AI; brief your board on the strategy and get their buy-in.
  • Regularly include updates on AI-related topics in board packages.
  • If you don’t already have a Technology Committee on the board, consider establishing one and giving it primary oversight of AI.
  • Designate a senior business leader (or business-savvy technology leader) as the primary point of contact with the board on AI topics.

A proactive approach to collaborating with your board on AI helps ensure the board is kept current on critical opportunities and issues, and it lets the business benefit from the board’s input and collective experience to guide your AI efforts.
