Building for the real world

Next-generation tools for hardware engineering

Sam Smith-Eppsteiner
Innovation Endeavors
Jun 24, 2024


By Sam Smith-Eppsteiner and Thilo Braun

Hardware is everywhere, from the phones we use and the planes we travel on to the furniture in our homes. The pace at which we develop and build hardware drives how fast we can deploy critical new technologies in key areas such as clean energy and defense. However, despite significant improvements over the past decades, including rapid prototyping technologies, developing and making hardware largely remains slow and tedious. This is in stark contrast to software engineering, where modern tools, copilots, and continuous testing enable rapid product development. To reach the sci-fi utopias we’ve imagined of abundant energy, flying vehicles, and space exploration, we need new approaches to building hardware. How do we make engineering atoms look more like bits?

We believe that we are at an inflection point, where the Super Evolution will have profound consequences for how we design and make things. We envision a future of “intent to action,” where a product designer can describe their design intent and, with little to no manual intervention, receive a physical product. While this world remains in the future, many of the building blocks to get us there will solve real and meaningful problems today.

“Intent to action”

Several drivers put us at an inflection point that will accelerate progress toward this future:

  1. Generative AI: While Generative AI has already had profound impacts across many industries, its use in hardware engineering and manufacturing remains in its infancy. Significant use cases exist across the entire value chain.
  2. Agile robotics: While robots are nothing new, they have largely been confined to predictable and repetitive tasks. New generations of AI-based controls and sensing hardware will enable much broader economic deployment of automation.
  3. Changing macro environment: ‘Friend-shoring,’ a severe industrial workforce shortage, and extensive investment in innovative hardware-driven industries including climate tech and defense will drive adoption of new technologies.

This post will focus on the engineering process for discrete products. In a future post, we will do a deep dive into manufacturing.

Status Quo

Today, hardware engineering is largely a linear process with information primarily flowing downstream. Decisions early on drive the bulk of a product’s cost.

Figure from “Systems Engineering Framework for Integrated Product and Industrial Design Including Trade Study Optimization”

Several studies across different industries have found that more than 70% of lifecycle cost is committed by the end of the concept engineering phase, when requirements are frozen, while only a small fraction of lifecycle cost has been incurred by that point. Today’s engineering processes are prone to suboptimal early decisions and are slow to correct them, if they correct them at all.

The predominant challenges from today’s hardware engineering process include:

  • Difficult collaboration: Most engineering software is inherently ‘single player.’ Both within and across functions, collaboration is stymied by a lack of tools; there is no equivalent of Figma for hardware engineers. Given the high degree of specialization within hardware engineering organizations, where experts in different disciplines need to work together, creating paths for collaboration is essential.
  • Design for X: Given the outsized impact that design and early engineering have on product cost and manufacturability, finding ways to incorporate supply chain and manufacturing considerations (DfX) as early as the industrial design phase is critical. Today, collaboration between suppliers, manufacturers, and engineering teams happens largely over email, Dropbox, and phone calls. Cost models live mostly in Excel, and it is difficult to quantify the impact of design decisions.
  • Antiquated software tools: Software tools ranging from CAD (3D design) and CAE (simulation) to PLM (data management) are complex and difficult to use. In fact, most of these tools were conceived in the 1970s and ’80s, and there are entire undergraduate and graduate-level courses dedicated to using them. While they are very powerful, extensive time is spent navigating these systems rather than on constructive engineering work. In addition, these tools were not built for a world of electronics and software-defined hardware, where there are near-infinite configurations and products are updated throughout their life.
  • Slow simulation: Over the last decades, simulation has become core to the hardware engineering process. While the first Boeing 747 was designed with slide rules and wind-tunnel tests, new aircraft such as the Boeing 787 are designed relying almost entirely on simulation. However, physics simulation remains computationally intensive and slow, taking hours to days for complex systems. Furthermore, systems are not set up to close the loop between simulation results, test results, and the real world, leaving blind spots in both simulation and testing.

Future vision

The quality of a design process can be measured in both its efficiency and effectiveness. Efficiency is how quickly a design is created and how many resources are required for the process. Effectiveness is how well the product meets its requirements and optimization functions, including weight, durability, manufacturing cost, maintainability, embedded carbon footprint, etc.

Humans are not particularly good at structured design space exploration. In 1997, Deep Blue beat world champion Garry Kasparov in chess. The computer achieved this by systematically evaluating vast numbers of future moves, the ‘design space’ of chess. In hardware engineering, we anticipate that computers will similarly outperform humans in systematically exploring a design space, evaluating different design choices against the optimization functions defined in the design intent. Such optimization functions specify the priority of different design trade-offs, for example how many additional dollars a product may cost if an extra kilogram of weight is shaved off the design.
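To make this concrete, here is a minimal, hypothetical sketch of such an optimization function: a brute-force search over a toy design space in which every name and number (the materials, wall thicknesses, cost and mass models, and the $50-per-kg trade-off) is an assumption for illustration, not a real costing model.

    # Hypothetical sketch: exhaustive design space exploration against a simple
    # optimization function that trades unit cost off against mass.
    from dataclasses import dataclass
    from itertools import product

    @dataclass
    class Candidate:
        material: str
        wall_mm: float
        unit_cost: float  # $ per unit (toy estimate)
        mass_kg: float    # kg per unit (toy estimate)

    # Toy material data: ($ per kg, density in g/cm^3). A real tool would call
    # simulation and costing services here instead.
    MATERIALS = {"aluminum": (4.0, 2.70), "steel": (2.5, 7.85)}

    def enumerate_candidates():
        for material, wall_mm in product(MATERIALS, [1.5, 2.0, 3.0]):
            price_per_kg, density = MATERIALS[material]
            mass_kg = 0.1 * density * wall_mm           # toy mass model
            unit_cost = 12.0 + price_per_kg * mass_kg   # toy cost model
            yield Candidate(material, wall_mm, unit_cost, mass_kg)

    def score(c: Candidate, dollars_per_kg_saved: float = 50.0) -> float:
        # Design intent: paying up to $50 extra per unit is acceptable for each
        # kilogram of mass removed from the design.
        return c.unit_cost + dollars_per_kg_saved * c.mass_kg

    best = min(enumerate_candidates(), key=score)
    print(best, round(score(best), 2))

The point is not the toy models but the shape of the loop: once the trade-off is encoded as a number, a machine can search the space exhaustively and consistently.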

The necessary pieces to achieve near-autonomous engineering design

Working backward from a future where computers are able to conduct systematic design space explorations for complex engineering systems, several key pieces are needed:

  • Generative design
  • Simulation
  • Verification and validation
  • Design for X (DfX, where X stands for the respective category; most engineering projects today consider design for manufacturing and assembly (DfM or DFMA), design for cost (DfC or DtC), and design for maintainability)
  • Collaboration and design orchestration

Generative design

Generative design means generating new engineering designs from requirements and text inputs. Several companies, including Quilter and nTop, are building generative design tools at the component level, while Zoo and others are working on platform-level text-to-CAD tools. In the future, generative design will likely span from industrial design to engineering considerations. These tools must integrate aesthetic design with functional engineering and systems reasoning.

We anticipate new foundation models built for 3D systems will be needed to achieve the full potential here. This would enable generating and iterating on complex 3D assemblies based on input functional requirements. The key challenge to overcome is the aggregation of training data to develop such a foundation model.

We’re excited to see new generative design tools that solve real engineering pain points today, as well as creative solutions for building training data sets.

Simulation

Physics-based simulation today is slow and computationally intensive. Physics-informed machine learning has the potential to accelerate simulation by several orders of magnitude, achieving near-perfect results in seconds to minutes rather than days to weeks. Faster simulation will enable more systematic and exhaustive exploration of design spaces, allowing multivariate optimization of systems with little human intervention to find system optima, rather than slow iterations driven by human intuition. Example companies in this space include Navier AI, Beyond Math, PhysicsX, and SimScale. We’re still in the first inning and look forward to novel approaches to simulation.
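To make the term concrete, below is a minimal, hypothetical sketch of the physics-informed idea using JAX: a tiny neural surrogate is trained against the residual of a toy governing equation (du/dx = -u with u(0) = 1) rather than against labeled simulation data. It is illustrative only; the network size, the single training step, and the toy ODE are all assumptions, and none of the companies above necessarily work this way.

    # Minimal physics-informed neural network (PINN) sketch for the toy ODE
    # du/dx = -u with u(0) = 1 on [0, 1]; the exact solution is u(x) = exp(-x).
    import jax
    import jax.numpy as jnp

    def init_params(key, sizes=(1, 32, 32, 1)):
        params = []
        for m, n in zip(sizes[:-1], sizes[1:]):
            key, sub = jax.random.split(key)
            params.append((jax.random.normal(sub, (m, n)) / jnp.sqrt(m), jnp.zeros(n)))
        return params

    def u(params, x):
        # Small MLP surrogate u_theta(x).
        h = jnp.atleast_1d(x)
        for w, b in params[:-1]:
            h = jnp.tanh(h @ w + b)
        w, b = params[-1]
        return (h @ w + b)[0]

    def loss(params, xs):
        # Physics residual: du/dx + u should vanish at every collocation point.
        du_dx = jax.vmap(jax.grad(lambda x: u(params, x)))(xs)
        us = jax.vmap(lambda x: u(params, x))(xs)
        residual = jnp.mean((du_dx + us) ** 2)
        boundary = (u(params, 0.0) - 1.0) ** 2  # enforce u(0) = 1
        return residual + boundary

    key = jax.random.PRNGKey(0)
    params = init_params(key)
    xs = jnp.linspace(0.0, 1.0, 64)

    # One step of plain gradient descent (full training loop omitted for brevity).
    grads = jax.grad(loss)(params, xs)
    params = jax.tree_util.tree_map(lambda p, g: p - 1e-2 * g, params, grads)
    print(float(loss(params, xs)))

The appeal of such surrogates is that, once trained, they can be evaluated in milliseconds, which is part of what makes exhaustive design space exploration tractable.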

Verification and validation

Verification and validation (V&V) is the process of ensuring that, if the requirements are met, the product will perform its intended use (‘Are you building the right thing?’) and that these requirements have in fact been met (‘Are you building it right?’). V&V can make up 10% or more of project time and cost for complex engineering projects, yet lacks dedicated modern software tooling. The process includes planning and executing tests, interpreting results, and documenting them for safety and certification purposes. Today, these workflows are largely handled through a combination of requirements management systems (Polarion, DOORS, etc.), document management systems, Word, and Excel. Dedicated tools could not only make these workflows more seamless, but with Generative AI could also automate significant parts of extracting information from regulations, drafting verification plans, and interpreting results. This would allow engineers to focus on the more intellectual parts of the job while reducing product risk from oversights in the verification process, such as Rivian’s recent recalls for various non-conformances with headlamp and backup lamp regulations. Startups operating in this space include Nominal, Stell, Cadstrom, and Valispace.
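As a small illustration of the workflow such tooling would absorb, consider a bare-bones traceability check that flags which requirements lack planned tests or passing evidence; the requirement texts, IDs, and data model below are hypothetical and deliberately simplistic.

    # Hypothetical sketch: a minimal requirements-to-verification traceability check.
    from dataclasses import dataclass, field

    @dataclass
    class Requirement:
        req_id: str
        text: str
        verified_by: list = field(default_factory=list)  # IDs of verifying tests

    @dataclass
    class TestResult:
        test_id: str
        passed: bool

    requirements = [
        Requirement("REQ-001", "Headlamp meets FMVSS 108 photometric limits", ["TST-014"]),
        Requirement("REQ-002", "Backup lamp activates within 500 ms of reverse gear", ["TST-022"]),
        Requirement("REQ-003", "Enclosure survives 1 m drop onto concrete", []),
    ]
    results = {"TST-014": TestResult("TST-014", True), "TST-022": TestResult("TST-022", False)}

    def verification_status(req: Requirement) -> str:
        if not req.verified_by:
            return "NO TEST PLANNED"
        if all(t in results and results[t].passed for t in req.verified_by):
            return "VERIFIED"
        return "OPEN"

    for req in requirements:
        print(f"{req.req_id}: {verification_status(req)}")
    # Prints: REQ-001 VERIFIED, REQ-002 OPEN, REQ-003 NO TEST PLANNED

In practice this check sits on top of requirements, test, and results systems; the hard part is keeping those sources connected, which is exactly where dedicated tooling helps.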

An additional component of verification is ensuring that the product performs as intended in real-world use. Today, warranty data is often buried in broad codes and handwritten notes. Other information from usage may make its way back to an organization in the form of telematics, Jira tickets captured by customer representatives, or posts hidden in online message boards such as Reddit. We believe that new tools will help connect the dots and capture issues faster, as well as ensure that findings are implemented in future product iterations. Companies working at this intersection include Axion Ray and Pull Systems.

Design for X

Today, design for X considerations are often an afterthought. Design engineers may have hundred-page design guideline documents to work with and will manually review designs with manufacturing engineers. There is little quantification of the cost impact of design decisions, bar some Excel models (or rudimentary tools such as McKinsey’s Cleansheet and Teamcenter product costing) for high-level cost estimates. Better tools are needed to evaluate the impact of design decisions and enable informed design trade-offs across manufacturability, assembly, supply chain, and cost considerations. This spans everything from initial tolerance stacks and manufacturing process selection for new products to quantifying where the highest ‘bang for buck’ lies when iterating on a design to reduce product cost. Beyond the design itself, there is a lack of tools to support translating designs into manufacturing, including creating the assembly sequence (M-BoM), drawings, and work instructions. These activities are tedious, and engineers often see them as a distraction from their core work. Startups tackling this space include Threaded, Dirac, and Drafter. The interface between design and manufacturing remains underserved, and we’re keen to see new solutions.
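As a trivial, hypothetical illustration of the missing quantification, even ranking candidate design changes by expected savings per unit of engineering effort makes the ‘bang for buck’ question explicit; the changes, savings, volumes, and effort figures below are invented for the example.

    # Hypothetical sketch: rank candidate design changes by "bang for buck",
    # i.e. expected annual savings per engineer-week of rework.
    candidate_changes = [
        # (description, cost saving per unit in $, annual volume, engineer-weeks)
        ("Replace machined bracket with sheet-metal part", 3.20, 50_000, 6),
        ("Loosen tolerance on housing bore from ±0.01 to ±0.05 mm", 1.10, 50_000, 2),
        ("Consolidate two fasteners into one snap fit", 0.40, 50_000, 4),
    ]

    def bang_for_buck(change):
        _, saving_per_unit, volume, effort_weeks = change
        return (saving_per_unit * volume) / effort_weeks  # $ saved per engineer-week

    for desc, saving, volume, effort in sorted(candidate_changes, key=bang_for_buck, reverse=True):
        print(f"{desc}: ${saving * volume:,.0f}/yr for {effort} engineer-weeks")

The real difficulty is producing credible inputs (cost deltas, tolerance impacts, process constraints) automatically from the design data, which is what the tools in this category aim to do.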

Design orchestration and collaboration

Putting it all together is key. To enable faster design iterations and better engineering designs, pulling together data and orchestrating the various parts of the design process is critical. We believe that design engineers will increasingly spend their time orchestrating design processes, deciding design trade-offs, and refining design inputs. We are excited by new design orchestration platforms that have the potential to become core to the hardware engineering process. Players working on this include Generative Engineering and Synera. As the tools described above mature, we envision engineers spending an increasing amount of time in these orchestration platforms, where specialized services such as design and simulation are run by agents.

In the meantime, with various disciplines and stakeholders needing to work together, new spaces for collaboration are needed. In particular, data is distributed across various formats and locations, shared over emails, calls, and PowerPoint presentations. This hinders effective information flow and decision making. We see immediate opportunities to improve the data fabric, documentation, and collaboration for hardware engineering. Some companies tackling this include SygmaHQ, Violet Labs, and Quarter20.

What is needed to build a large company

The industrial landscape is littered with startups that failed to reach scale. Nonetheless, it is a large market, with over 1,000,000 hardware engineers in the U.S. alone (BLS). Through our experience working with companies in this field, we have identified several key tenets for building a successful software company serving industrials:

  • Avoid rip and replace (…at least to start): While legacy systems including CAD, CAE, and PLM often leave a lot of room for improvement, companies are unlikely to replace them with a startup’s product. Instead of attempting to replace legacy solutions, build in the gaps of underserved users and processes.
  • Business benefit: This feels obvious, but find solutions that deliver significant business benefits. While there are significant opportunities for bottom-up sales motions that establish tools with individual users who love the product, large contracts will need to be justified top-down, underpinned by real business benefits. Solutions that significantly accelerate development, reduce product cost, or otherwise have a quantifiable business benefit are the most likely to scale.
  • Human in the loop: While we believe in a future with high automation in the design process, keeping the human in the loop is critical in the interim. Engineers are inherently skeptical, and the cost of getting it wrong is high in these industries. Start with 80% solutions that augment and delight engineers rather than attempting to replace them.

Closing thoughts

We’re at a critical inflection point for hardware engineering: to get where we want to go, we need to build in the real world. And we need to do it faster, more effectively, and more sustainably than ever before. Legacy industries are redefining products for a carbon-free world. The war in Ukraine has made the need for a new generation of defense systems apparent. At the same time, on the enabling technology side, Generative AI and physics-informed ML simulation are enabling a new generation of engineering tools.

We’re excited to talk to builders building for builders — we see a huge opportunity in enabling and accelerating the future of hardware.

