Plugging in AI, part 2: The solutions — better planning & pre-construction, sophisticated orchestration, and scaling behind-the-meter generation

Innovation Endeavors
12 min read · May 2, 2024


By Josh Rapperport and Sam Smith-Eppsteiner

In Part 1, we laid out the history and inflection point behind how we arrived at the emerging electricity bottleneck. Now, we will discuss some of the potential solutions. These proposals will focus on areas where technology and startups can play a key role.

Planning and pre-construction

If we are going to make it through the “electricity gauntlet,” we need sophisticated processes and workflows before anything gets built. This planning extends from the management of the interconnection pipeline, discussed previously, through to the energy developers that actually build projects. Given that parts of the utility workflow are difficult for technology to reach (due to slower adoption cycles and regulatory limitations), much of the heavy lifting, qualification, and planning for projects falls to the energy developers.

Today, energy developers are woefully underserved by modern technology and software. For the average utility-scale project, soft costs (costs beyond the actual materials and construction) run around $400k per project, roughly 12% of total system costs, and climbing. That represents $10B of soft costs annually for U.S. energy developers, a figure expected to grow multiple times over as the market expands and we continue to accelerate the deployment of clean power, energy storage, and data centers.

The challenge for energy developers is largely one of messy, decentralized data across a multi-step workflow. The complexity lies in the combination of land parcel information, grid data, local permitting challenges, and the boots-on-the-ground human behavior in any given municipality. There are more than 20,000 townships in the U.S., all with unique codes, siting constraints, and permitting requirements. This massive amount of data is, at best, hidden away in hundred-page PDF documents on government-maintained websites and, at worst, not digitally recorded at all. The grid is an even more vexing data problem. Not only are public records of grid transmission lines unreliable and outdated, but the load capacity maps that indicate where there is free capacity are often not public and require expensive pre-applications to unlock.

At a fundamental level, the rate of project delays and cancellations is not improving, but is instead climbing. Developers see 80%+ failure rates today — potential projects that die somewhere along the multi-year journey prior to construction. If you ask a big developer about their pipeline, they will gladly share the gigawatts of projects they have underway. But if you ask about their risk-adjusted pipeline — accounting for permitting, siting, feasibility, interconnection, and financing risk — they likely don’t have a precise answer. Ideally, developers would spend their time on the most valuable 20% of projects that will actually get built, but having this type of continuous pipeline analysis is difficult.
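As a concrete illustration of what that continuous pipeline analysis could look like, here is a minimal sketch that probability-weights a pipeline by development stage. Every stage probability and project below is an illustrative assumption, not real developer data.

```python
# Minimal sketch: risk-adjusting a development pipeline by stage.
# All probabilities and project records are illustrative assumptions.

# Assumed probability that a project clears each individual stage.
STAGE_SUCCESS = {
    "site_control": 0.60,
    "permitting": 0.55,
    "interconnection": 0.45,
    "financing": 0.80,
}
STAGES = list(STAGE_SUCCESS)

# (name, capacity in MW, current stage) for a toy pipeline.
PIPELINE = [
    ("Project A", 200, "site_control"),
    ("Project B", 150, "permitting"),
    ("Project C", 300, "interconnection"),
    ("Project D", 100, "financing"),
]

def survival_probability(stage: str) -> float:
    """Probability a project at `stage` clears all remaining stages."""
    p = 1.0
    for s in STAGES[STAGES.index(stage):]:
        p *= STAGE_SUCCESS[s]
    return p

nameplate = sum(mw for _, mw, _ in PIPELINE)
risk_adjusted = sum(mw * survival_probability(stage) for _, mw, stage in PIPELINE)

print(f"Nameplate pipeline:     {nameplate:>6.0f} MW")
print(f"Risk-adjusted pipeline: {risk_adjusted:>6.0f} MW")
```

Note that a project entering at the first stage survives with probability 0.60 × 0.55 × 0.45 × 0.80 ≈ 12%, consistent with the 80%+ failure rates developers report.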

Given the rapidly expanding ecosystem of small energy developers (aka wildcatters), the development process is becoming even more cumbersome, as some developers resort to buying up as many easements and land options as possible, then sitting on them to wait for buyers or future projects. Because big tech companies have excellent credit and strong cash flows, and are distinctly price-insensitive but time-sensitive in getting their systems up and running, we can expect these dynamics to escalate as they try to get an edge in the interconnection queue for their data centers.

Ultimately, this macro challenge comes down to understanding the best use of any given parcel of land, based on real-time constraints of electricity supply and demand. Where should we build data centers vs. charging vs. battery storage vs. solar? It’s untenable to brute-force the problem with more transmission alone, given the $50B price tag in California.

Today, developers use a wide swath of point-solution tools to try to answer these questions and manage the multi-year project pipeline (a simplified screening sketch follows the list below).

  1. In GIS (a decades-old mapping tool), they maintain buildable-acres maps that layer in data like flood plains, endangered species, slope, and indigenous land.
  2. Then, they search for interconnection data across public grid capacity maps, often needing to manually draw 3-phase transmission lines and map substation capacity.
  3. Next, the developer collects land parcel data across county, municipality, township, etc., looking at ownership data, permitting, zoning, setbacks, and siting.
  4. Then the human element comes in, which is distinctly difficult to capture with data. Developers work with land agents, send out mailers, visit city council members, and meet with local politicians and lawyers to find opportunities where individuals are open to solar or data centers.
  5. Finally, geotechnical and environmental consultants are needed to sign off and green-light the project.

Many of these steps are run in parallel to keep projects moving forward, making it difficult to keep the numerous workstreams aligned given the complex project management required.
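To make the first few steps concrete, here is a minimal parcel-screening sketch under stated assumptions: every field name, threshold, and parcel record below is hypothetical, and a real screen would pull from GIS layers, grid capacity maps, and county records.

```python
# Minimal sketch: screening land parcels across the data layers named above.
# All fields, thresholds, and parcel records are hypothetical.

from dataclasses import dataclass

@dataclass
class Parcel:
    parcel_id: str
    buildable_acres: float      # after flood plain / slope / habitat exclusions
    miles_to_substation: float  # from manually mapped 3-phase lines
    substation_headroom_mw: float
    zoning_allows_solar: bool
    setback_compliant: bool

def passes_screen(p: Parcel) -> bool:
    """Apply hard constraints drawn from each workflow step."""
    return (
        p.buildable_acres >= 40            # rough minimum for utility-scale solar
        and p.miles_to_substation <= 2.0   # interconnection cost rises with distance
        and p.substation_headroom_mw >= 5  # needs free grid capacity
        and p.zoning_allows_solar
        and p.setback_compliant
    )

parcels = [
    Parcel("TX-001", 120.0, 0.8, 30.0, True, True),
    Parcel("TX-002", 25.0, 0.5, 50.0, True, True),   # fails: too small
    Parcel("TX-003", 200.0, 4.5, 80.0, True, True),  # fails: too far from grid
]

shortlist = [p.parcel_id for p in parcels if passes_screen(p)]
print("Parcels worth a land agent's time:", shortlist)
```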

Startups and what we hope to see

Numerous exciting new companies have emerged to address this painful and inefficient workflow. Paces aims to build a much easier-to-use platform that captures all these layers of data in a single source. Furthermore, Paces hopes to capture the complex interdependencies across projects and geographies, some of which are counterintuitive. For example, most energy developers are looking at the same hundreds of counties to build projects, depending on where there are favorable permitting and regulatory frameworks and available grid capacity. One might think assessing neighboring counties would provide the most greenfield opportunity, but these geographies actually become less receptive to development as local sentiment changes and NIMBYism kicks in. Defining and surfacing ever-evolving project risk also requires highly up-to-date data on the grid and on quickly changing regulations, such as the ruling last week announcing that the Energy Department will take over as the lead agency in charge of federal environmental reviews for certain interstate power lines.

This intersection of land information and grid constraints is a compelling opportunity for startups, with folks like Kevala, Transect, Neara, and Spark all building software solutions. For permitting specifically, tools like PermitFlow and Pulley are gaining traction within energy development and more broadly. To win, these tools must be 10x better than what developers use today, given a strong build-vs.-buy bias within the largest players.

We expect to see new kinds of solutions both in business model and technology:

  • Grid modeling: Given that utilities remain the least innovative stakeholder, there may be an opportunity for third parties to provide advanced modeling and scenario planning for the grid. Building a data lake for utilities and running the multi-dimensional analysis of planning and best-use is very compelling, though it’s worth noting that utilities can be challenging customers for a swift go-to-market.
  • Energy developer soft-cost automation: Another interesting potential innovation is a more full-stack vendor that offloads much of the manual and expensive soft costs from developers. We imagine a third-party standalone company that can provide the extensive suite of required paperwork and analysis, such as FEMA studies, habitat and wetlands studies, wildlife studies, geotechnical analysis, surveys, title search, title insurance, easements, lien search, historical designation studies, and more. This kind of work is well-suited to automation and LLMs.
  • Government data API: Given the amount of government-related data (from regulation, to permitting and siting, to subsidies) required not only in grid-scale energy development but also in small residential projects (e.g., a heat pump install), we envision a data infrastructure layer that surfaces all relevant information to developers, contractors, and financing stakeholders (a rough schema sketch follows below). There may be an opportunity to build horizontally here to support the myriad end use cases that require an understanding of government context.
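As a sketch of what such a data layer could return, consider a normalized permitting profile per jurisdiction. Every field, value, and document link below is hypothetical, intended only to show how fragmented sources might be unified behind one interface.

```python
# Hypothetical response schema for a government-data layer.
# Nothing here reflects a real API; it only illustrates normalization.

from dataclasses import dataclass, field

@dataclass
class PermittingProfile:
    jurisdiction: str               # e.g., township or county name
    jurisdiction_type: str          # "township", "county", or "municipality"
    solar_permitted_by_right: bool
    special_use_permit_required: bool
    typical_review_days: int        # from historical filings, where digitized
    setback_feet: dict = field(default_factory=dict)      # boundary type -> feet
    source_documents: list = field(default_factory=list)  # provenance links

# A client would resolve a parcel to its jurisdictions, then fetch one
# normalized profile instead of parsing hundred-page PDFs per township.
profile = PermittingProfile(
    jurisdiction="Example Township",
    jurisdiction_type="township",
    solar_permitted_by_right=False,
    special_use_permit_required=True,
    typical_review_days=120,
    setback_feet={"property_line": 50, "road": 100},
    source_documents=["https://example.gov/zoning-ordinance.pdf"],
)
print(profile.jurisdiction, "review takes ~", profile.typical_review_days, "days")
```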

Crypto was one of the first industries to get down to this level of detail, building bitcoin mining operations near the cheapest 24/7 power or running on curtailed power, such as excess solar. This brings us to the next component: the orchestration of electricity with compute and other time-dependent loads.

Orchestration

To avoid hundreds of billions being spent on grid upgrades, we need to capture the maximum value of flexible assets (assets that can time-shift electrical supply and demand) like batteries, charging, and compute. Time-shifting can be thought of as the ability to temporally coordinate electrical demand and supply at very granular levels and smooth out the sharp peaks throughout the day. Historically, electricity on the grid has been consumed in real time, the moment it is generated, meaning the entire grid is designed to accommodate the peak load in a given year, which is typically a hot summer day as cooling systems kick in. In summer 2019, electricity prices in Texas spiked to a record $9,000 per MWh (200x the median price), and a similar peak event occurred in the winter of 2021, when the massive Winter Storm Uri froze the state and caused hundreds of billions of dollars in damages. With warming summers, more extreme weather, and increasing grid unreliability (driven by both rising electricity demand and rising supply intermittency), these events will only escalate in frequency and severity.

The good news is that flexible loads have huge potential to solve this problem. For example, in mid-August last year, a 1.8 GW discharge from batteries onto the grid in Texas quickly reduced wholesale electricity prices from $5,000 to $2,700 per MWh. Energy storage, largely in the form of batteries today, is the ultimate flexible asset, as it enables optimization of both charge and discharge. However, vehicle charging and some kinds of data centers also have very compelling time-shifting potential on the demand side. Unlike crypto mining operations, compute doesn’t always need to run 24/7 at constant intensity (inference demand, which constitutes 85% of all data center AI workloads, arrives mostly during the busiest hours of the day). Certain computing workloads can be timed to when prices are cheapest, cutting operational costs by up to 50%, as can a significant share of vehicle charging.
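Here is a minimal sketch of where those savings come from, assuming a made-up hourly price curve and a fully deferrable job; a real scheduler would work from day-ahead or real-time market prices and respect job deadlines. The exact savings depend entirely on the shape of the price curve.

```python
# Minimal sketch: shifting a deferrable compute job to the cheapest hours.
# The price curve, load size, and job length are illustrative assumptions.

# Made-up day-ahead prices in $/MWh, midnight through 11pm.
prices = [28, 25, 22, 20, 19, 21, 30, 45, 60, 70, 75, 80,
          85, 90, 95, 110, 120, 100, 80, 65, 50, 40, 35, 30]

JOB_HOURS = 8   # hours of compute needed within the day
LOAD_MW = 10    # assumed flexible data center load

# Naive schedule: run during business hours (8am-4pm).
naive_cost = sum(prices[h] for h in range(8, 16)) * LOAD_MW

# Time-shifted schedule: pick the 8 cheapest hours of the day.
cheap_hours = sorted(range(24), key=lambda h: prices[h])[:JOB_HOURS]
shifted_cost = sum(prices[h] for h in cheap_hours) * LOAD_MW

print(f"Naive cost:   ${naive_cost:,.0f}")
print(f"Shifted cost: ${shifted_cost:,.0f}")
print(f"Savings:      {1 - shifted_cost / naive_cost:.0%}")
```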

It’s difficult to say exactly how much electricity is time-shifted today, but it’s likely in the low single-digit percentages. Many experts believe that number could reach fifty percent or higher. Getting there will require sophisticated technology to orchestrate millions of vehicles, batteries, high-performance computing systems, and more.
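On the supply side, the simplest form of that orchestration is price arbitrage with a battery. The toy dispatcher below charges in the cheapest periods and discharges in the priciest ones; the prices, battery parameters, and greedy logic are all illustrative assumptions, and a real system would enforce state-of-charge sequencing and co-optimize across energy and ancillary markets.

```python
# Minimal sketch: greedy price-arbitrage dispatch for one battery.
# All numbers are illustrative assumptions.

prices = [22, 20, 19, 25, 60, 120, 95, 40]  # $/MWh over eight 1-hour periods

CAPACITY_MWH = 100
POWER_MW = 25       # max charge/discharge per 1-hour period
EFFICIENCY = 0.90   # assumed round-trip efficiency

n = CAPACITY_MWH // POWER_MW  # periods needed to fully charge or discharge
order = sorted(range(len(prices)), key=lambda t: prices[t])
charge_hours = set(order[:n])      # cheapest periods: charge
discharge_hours = set(order[-n:])  # priciest periods: discharge

# Note: this greedy pick ignores time ordering; it happens to be feasible
# here because the cheap hours precede the expensive ones. A real dispatcher
# must enforce that energy is stored before it is sold.
profit = 0.0
for t, p in enumerate(prices):
    if t in charge_hours:
        profit -= POWER_MW * p               # buy energy
    elif t in discharge_hours:
        profit += POWER_MW * p * EFFICIENCY  # sell energy, net of losses

print(f"Arbitrage profit over the window: ${profit:,.0f}")
```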

Startups and what we hope to see

The critical components for making advanced orchestration work are twofold: data and controls, and financial infrastructure. We are seeing exciting innovation in both dimensions and expect many generational companies to be built here. Orchestration, as discussed here, is inclusive of programs like demand response, virtual power plants, and synchronization, and goes hand in hand with the increasingly decentralized nature of electricity.

On the data and controls side, the key is to have real-time awareness of as many electrical loads as possible, accurate forecasts of both generation and load, and the ability to optimize the dispatch across as much of the grid as possible. On the more commercial side, companies like Equilibrium Energy are building advanced models of the grid to unlock situational awareness and then using large battery portfolios — enabled by a tolling fund or agreement — to optimize price and performance across consumers. Relatedly, Verse is building tools for corporates to unlock the PPA-for-storage model, which will allow customers to orchestrate and optimize their own energy needs. On the consumer side, companies like Branch Energy and David Energy are reimagining the energy retail model, which has historically been infamous for its low NPS, by orchestrating batteries and home devices to provide cheaper, cleaner power.

On the financial side, for energy markets to become more highly orchestrated and efficient, we need the type of advanced financial market infrastructure that many other asset classes have had for decades. Today in the U.S., tens of billions of dollars of power are traded daily using antiquated, over-the-counter approaches that only allow for large power blocks and 4–8 hour time windows in day-ahead markets. We see an opportunity for real-time, granular futures products in energy that would enable sophisticated hedging and risk management for asset owners and operators (more exciting news from Innovation Endeavors to come here soon!).
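To illustrate why granular futures matter, here is a toy calculation of how an hourly futures position could cap exposure during a scarcity event. The contract structure and every number are hypothetical; today’s over-the-counter blocks cannot target a single hour this way.

```python
# Toy hedge: a granular hourly futures position during a price spike.
# The contract and all numbers are hypothetical.

LOAD_MW = 50          # hourly load to hedge
FUTURES_PRICE = 45.0  # $/MWh, assumed locked in for the 5-6pm hour
spot_price = 900.0    # assumed realized spot price during a scarcity event

# Unhedged: buy the full load at spot.
unhedged_cost = LOAD_MW * spot_price

# Hedged: still buy at spot, but the futures position pays out the
# difference between spot and the locked-in price.
futures_payoff = LOAD_MW * (spot_price - FUTURES_PRICE)
hedged_cost = unhedged_cost - futures_payoff  # equals LOAD_MW * FUTURES_PRICE

print(f"Unhedged cost: ${unhedged_cost:,.0f}")
print(f"Hedged cost:   ${hedged_cost:,.0f}")
```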

We are keen to meet teams rethinking this massive orchestration challenge across data, storage, time-shifting, and financial instruments:

  • Grid-enhancing technology: This includes innovations across a variety of systems, such as dynamic line rating, which uses better data to understand the real-time conditions of the grid and allow more power to flow, optimizing beyond today’s conservative line ratings (a simplified rating sketch follows this list). Transformers, as a grid-enhancing technology, have also not been updated in decades; they are both supply-chain constrained and not designed for more advanced orchestration, contributing to conservative line ratings.
  • Bi-directional EVs: We hope that EVs can be fully utilized and orchestrated as part of the solution to time-shift power, both on the demand-response side and by offering capacity and power back to the grid through vehicle-to-home and vehicle-to-grid approaches.
  • Beyond bilateral energy: More generally, on the financial-instruments front, the current Power Purchase Agreement (PPA) model is a fairly brittle and crude system, and we welcome conversations with teams innovating beyond bilateral agreements in power.
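As referenced above, here is a heavily simplified sketch of dynamic line rating. Real ratings follow the IEEE 738 heat-balance standard; the scaling below, including the made-up wind factor, only illustrates why cooler, windier conditions let the same conductor carry more current than its conservative static rating assumes.

```python
# Heavily simplified dynamic line rating sketch. Not IEEE 738; it only
# illustrates the direction and rough magnitude of the effect.

import math

STATIC_RATING_A = 1000   # amps, set for worst-case assumptions (hot, still air)
T_CONDUCTOR_MAX = 75.0   # max allowed conductor temperature, deg C
T_STATIC_AMBIENT = 40.0  # ambient temperature assumed by the static rating

def dynamic_rating(t_ambient_c: float, wind_factor: float = 1.0) -> float:
    """Toy rating: current capacity scales with the square root of thermal
    headroom (conductor heating goes as I^2), boosted by convective cooling.
    `wind_factor` >= 1 is a made-up multiplier for wind-driven cooling."""
    headroom = (T_CONDUCTOR_MAX - t_ambient_c) / (T_CONDUCTOR_MAX - T_STATIC_AMBIENT)
    return STATIC_RATING_A * math.sqrt(max(headroom, 0.0) * wind_factor)

# A cool, breezy evening vs. the worst-case static assumption:
print(f"Static rating:  {STATIC_RATING_A} A")
print(f"Dynamic rating: {dynamic_rating(t_ambient_c=15.0, wind_factor=1.3):.0f} A")
```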

Building more behind-the-meter generation

Perhaps the most fundamental solution to the backlog on the electricity grid is to not rely on it at all. This is what behind-the-meter energy assets represent: instead of consuming power from a faraway generation system and patiently waiting in the interconnection queue to gain access to the grid, energy developers and consumers can build generation and consumption in one location. This means they can use their own power, on-site and “behind” their meter, and scale their infrastructure accordingly. There are important cost considerations, given that behind-the-meter generation can be more expensive, but to put it simply: as much energy as possible should be behind-the-meter.
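A back-of-the-envelope comparison makes the tradeoff concrete. Every number below is an illustrative assumption, not a market quote: behind-the-meter power can cost more per MWh, but it avoids delivery charges and, crucially, the multi-year interconnection wait.

```python
# Back-of-the-envelope sketch: delivered grid power vs. behind-the-meter
# generation. All figures are illustrative assumptions, not market quotes.

GRID_ENERGY = 60.0     # $/MWh, assumed wholesale energy cost
GRID_DELIVERY = 45.0   # $/MWh, assumed transmission + distribution charges
GRID_WAIT_YEARS = 4.0  # assumed interconnection queue wait

BTM_LCOE = 85.0        # $/MWh, assumed levelized cost of on-site generation
BTM_BUILD_YEARS = 1.5  # assumed time to stand up on-site generation

grid_delivered = GRID_ENERGY + GRID_DELIVERY
print(f"Grid, delivered:  ${grid_delivered:.0f}/MWh after ~{GRID_WAIT_YEARS:.0f} years in queue")
print(f"Behind-the-meter: ${BTM_LCOE:.0f}/MWh after ~{BTM_BUILD_YEARS} years of build")
# For a price-insensitive but time-sensitive buyer (e.g., a data center),
# the earlier in-service date can outweigh the higher per-MWh cost.
```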

Behind-the-meter is a return to the earliest use of electricity, long before the complex network of transmission and distribution was built, when generation and consumption were locally linked. Modern innovations like microgrids have seen meaningful uptake, allowing customers to build their own infrastructure on their side of the meter. This approach can be difficult to scale, but it provides a compelling alternative to large-scale grid upgrades.

Startups and what we hope to see

More recently, many core innovations in both local generation and behind-the-meter compute have taken off. Most fundamentally, the self-supply of energy is beginning to realize its huge potential as modular reactors become cheaper and much more efficient. Small modular reactors (SMRs) and microreactors for nuclear fission, long used only in maritime applications, are being pushed forward by companies like Radiant. Microsoft has hired a director of nuclear and plans to make major investments in scaling nuclear-powered data centers. Thermal and battery energy storage can typically reduce carbon emissions most effectively when deployed behind-the-meter, with companies like Antora bringing more scalable and cost-effective thermal storage solutions to market and Form Energy developing long-duration batteries. Advanced geothermal energy is gaining traction, with companies like Fervo and Quaise offering the ability to build behind-the-meter baseload.

AI brings a unique opportunity to optimize across the entire hardware stack, from energy through to the chips themselves. Companies like Crusoe have pioneered utilizing what would otherwise be wasted behind-the-meter energy, in the form of flared gas, to power data centers. Verrus, which spun out of SIP last month, plans to use microgrids, sophisticated energy management, battery storage, and advanced chipsets to build cheaper and more optimized data centers.

  • Behind-the-meter full-stack: We welcome conversations with teams building next-gen behind-the-meter systems that combine on-site generation with energy use, storage, or compute.
  • Behind-the-meter analytics and software: Many energy and data center players won’t want to build software layers for integration, controls, and orchestration, and may outsource them, creating opportunities for startups. Generally speaking, AI training clusters can be deployed essentially anywhere in the world that makes economic sense (as opposed to locally constrained assets like charging), so we are interested in analytics and infrastructure-intelligence software that helps find untapped value. Oftentimes, siting behind-the-meter is about understanding where there are already sunk infrastructure costs and innovating in how we can repurpose them for an expanding electric and AI future.

Closing thoughts

It’s worth noting that much of our ability to navigate the electricity gauntlet will depend on policy, which is not discussed here, given it remains generally outside the scope of technology innovation. Structural changes like “first-ready, first-served” for the interconnection queue or policy that mandates that data centers participate in demand response programs would have a major impact.

Additionally, huge amounts of development effort are going into designing more energy-efficient chips and compute-efficient models, which will hopefully help mitigate the ballooning electricity demand for AI.

Independent of better policy, chips, and models, there is an enormous opportunity to reimagine the power sector so that we can continue accelerating the decarbonization of our global economy and scale AI infrastructure. At the most basic level, energy needs to be at the center of any techno-optimistic vision of the future, and we look forward to supporting founders working toward this ambitious goal.

Thank you to Evan Caron, Aram Shumavon, and James McWalter for reviewing this.

Innovation Endeavors

Investing in visionary founders, transformational technology and emergent ecosystems for a new world. For more follow: https://medium.com/innovationendeavors