Role of Machine Learning in Chip Design

Darshan C Ganji
AITS Journal
Jul 28, 2019


AI has been applied to the design of computer chips for decades; in fact, chip design was one of the first applications of AI. Techniques taught in introductory AI courses are routinely used by computer-aided design (CAD) tools every day. Modern chips are so large that doing all the design by hand would be hopelessly slow, and verification is far too complex to do manually. Many tasks have therefore been automated over the last 40 years, and more are being automated. Every time someone comes up with a new way to design chips (e.g., new interconnect fabrics, new combinations of memory and computation), the design process needs to be automated as well. Intel and IBM maintain internal CAD organizations for these purposes, and several companies (Cadence, Synopsys, Mentor Graphics) sell CAD tools, which can themselves be viewed as applications of AI to chip design.

Machine learning has recently been applied to chip design in several ways. You can learn the best configurations of the software tools to improve the results of chip optimization. You can learn to predict the places on a chip most likely to experience manufacturing defects. And if you can predict the load on a CPU, you can dynamically scale down the voltage to save energy.
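The last idea, predicting CPU load to drive dynamic voltage scaling, can be sketched very simply. This is a hypothetical illustration, not a real DVFS policy: the moving-average predictor, the tier thresholds, and the voltage values are all invented for the example.

```python
# Sketch of load-prediction-driven voltage scaling (DVFS).
# Predictor, thresholds, and voltage tiers are illustrative assumptions.

def predict_load(history, window=4):
    """Predict the next load sample (0-100%) as the mean of the last `window` samples."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def select_voltage(predicted_load):
    """Map a predicted load to an illustrative supply-voltage tier (volts)."""
    if predicted_load < 30:
        return 0.8   # low-power tier
    if predicted_load < 70:
        return 1.0   # nominal tier
    return 1.2       # performance tier

# Recent load has ramped up, so the predictor keeps the chip at full voltage.
samples = [20, 25, 22, 28, 65, 80, 85, 90]
print(select_voltage(predict_load(samples)))  # -> 1.2
```

A real governor would use a richer model (and hysteresis to avoid oscillating between tiers), but the structure — predict load, then choose an operating point — is the same.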

ASIC design is about:

->RTL code (HDL).

->Verification (simulation).

->Synthesis (HDL-to-netlist translation).

->Static Timing Analysis.

->Design for Test.

->Pre- and Post-layout simulation.

->Formal Verification of RTL vs. the pre-layout netlist (scan-inserted) and the post-layout netlist (clock tree, buffering).

Applying machine learning could be interesting for:

->RTL code: analysis of large bodies of code to detect and correct problems with scan insertion or coding-guideline violations.

->Verification: regression analysis (to identify the most interesting test cases to run).

->Synthesis: processing netlists before and after layout to detect floorplanning or congestion issues early.
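As a toy illustration of the last point, congestion prediction can be framed as classification over features extracted from the placed netlist. Everything here is an assumption for the sketch: the two features (cell density, average net fanout), the training points, and the 1-nearest-neighbour model; a real flow would extract far richer features from the placement database.

```python
# Hypothetical sketch: flag likely-congested floorplan regions with a
# 1-nearest-neighbour classifier over two invented features.
import math

# (cell_density, avg_net_fanout) -> congested? (1 = yes, 0 = no)
training = [
    ((0.2, 2.0), 0), ((0.3, 2.5), 0), ((0.4, 3.0), 0),
    ((0.8, 5.0), 1), ((0.9, 6.0), 1), ((0.7, 5.5), 1),
]

def predict_congestion(features):
    """Return the label of the nearest training point (1-NN, Euclidean distance)."""
    nearest = min(training, key=lambda t: math.dist(t[0], features))
    return nearest[1]

print(predict_congestion((0.85, 5.8)))  # near the congested cluster -> 1
print(predict_congestion((0.25, 2.2)))  # near the uncongested cluster -> 0
```

The value of such a model in practice is catching congestion hot spots at synthesis time, before an expensive place-and-route iteration reveals them.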

Machine Learning in EDA:

Machine learning is beginning to have an impact on the EDA tools business, cutting design costs by allowing tools to suggest solutions to common problems that would otherwise take design teams weeks or even months to work through. It also potentially expands the market for EDA tools, opening the door to more design starts and more chips from more companies.

EDA vendors are also collaborating with semiconductor companies to develop new ML-based EDA products.

Cadence and other EDA companies such as Synopsys are already applying machine learning to improve tools and design flows, to improve intellectual-property cores, and to target ML workloads with new cores. They are working to automate the routing and tuning of devices to improve reliability, circuit performance, and resilience, and to improve power, performance, and area (PPA) results using machine learning, analytics, and optimization. These programs will accelerate their roadmaps toward intelligent design flows, the next big leap in design productivity, and set the stage for enhancing the entire span of analog, digital, verification, package, and PCB EDA technologies, providing customers with more advanced system design enablement solutions.

Promises:

With long-term advances in VLSI, computer hardware, and AI, it is reasonable to expect VLSI design tools whose performance exceeds human capabilities. It is reasonable to assume that a machine can eventually do a better job of evaluating complex trade-offs and selecting the best design from among many attempts. With more powerful programming paradigms comes the ability to create more powerful tools. In the long term, synthesis tools such as a complete general-purpose silicon compiler will emerge; such a tool would leverage scarce engineering resources tremendously and greatly shorten VLSI design times.

Conclusion:

From this brief survey of the VLSI design and AI fields, it is evident that AI technology will significantly alter the way VLSI design is done today. Many human design tasks will be automated, leaving designers to deal with the most difficult and obscure design problems. These advances will pave the way for major revolutions in computing hardware and AI research.

Thanks for Reading :-)
