Utilizing More Data to Improve Chip Design

By Ann Steffora Mutschler, Semiconductor Engineering

Different ways of collecting, analyzing and applying data to improve efficiency and reliability.

Just about every step of the IC tool flow generates some amount of data. But certain steps generate a mind-boggling amount of data, not all of which is of equal value. The challenge is figuring out what’s important for which parts of the design flow. That determines what to extract and loop back to engineers, and when that needs to be done in order to improve the reliability of increasingly complex chips and reduce the overall time to tapeout.

This has become essential in safety-critical markets, where standards require tracking of data analysis. But beyond that, analyzing data can help pinpoint issues in design and manufacturing, from thermal and other physical effects to transient anomalies and latent defects. Along with that data, new techniques for data analysis are being developed, including digital twins, artificial intelligence and machine learning.

[...]

“In the verification space, analytics data and metrics are used both to make better progress on a particular project and to improve tools for future projects,” said Dominik Strasser, vice president of engineering at OneSpin Solutions. “In fact, analytics and metrics are at the very heart of modern functional verification. Coverage metrics and verification results are the keys to knowing what has been verified, highlighting gaps in verification, and determining when the process is done. Metrics and results from formal analysis can be integrated with those from simulation and emulation with the assistance of new commercial tools. Results can be back-annotated to the original verification plan, and coverage results can be used to determine a minimal set of simulation tests. These steps converge on full verification and functional sign-off more quickly.” 
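The last step Strasser mentions, using coverage results to determine a minimal set of simulation tests, is essentially a set-cover problem. Below is a minimal sketch in Python, assuming merged coverage data is already available as a mapping from each test to the coverage points it hits; the test names and coverage-point IDs are hypothetical, not output from any particular tool.

# Hypothetical sketch: pick a small set of simulation tests that together
# hit every coverage point seen in the merged coverage data.
# Test names and coverage-point IDs are illustrative only.

def minimize_tests(coverage_by_test):
    """Greedy set cover: repeatedly keep the test that adds the most
    not-yet-covered points, until no test adds anything new."""
    remaining = set().union(*coverage_by_test.values())
    selected = []
    while remaining:
        best_test, best_gain = None, set()
        for test, points in coverage_by_test.items():
            gain = points & remaining
            if len(gain) > len(best_gain):
                best_test, best_gain = test, gain
        if best_test is None:      # leftover points no test can hit
            break
        selected.append(best_test)
        remaining -= best_gain
    return selected, remaining     # chosen tests and any uncovered points


if __name__ == "__main__":
    # Toy merged-coverage data: test -> coverage points it hits
    coverage = {
        "smoke_test":   {"cp_reset", "cp_fifo_empty"},
        "full_regress": {"cp_reset", "cp_fifo_empty", "cp_fifo_full", "cp_err_irq"},
        "irq_corner":   {"cp_err_irq", "cp_nested_irq"},
    }
    tests, uncovered = minimize_tests(coverage)
    print("keep:", tests, "still uncovered:", uncovered)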

During formal verification runs, additional metrics can be gathered on which formal engines are most effective for different types of designs and assertions, Strasser pointed out. "Modern formal tools have many such engines. Experts may wish to control them directly, but most users just want the best possible results (proofs and bugs) as quickly as possible."
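Gathering that kind of engine-effectiveness data presupposes a consistent per-run record. The sketch below shows one plausible shape for such a record; the field names, engine names, and values are illustrative assumptions rather than any vendor's actual schema.

# Hypothetical sketch of the per-run data a formal tool could log so that
# engine effectiveness can be compared across designs and assertions.
from dataclasses import dataclass

@dataclass
class EngineRunRecord:
    design: str          # block or module name
    assertion: str       # property identifier
    engine: str          # e.g. "bmc", "ic3", "k_induction"
    result: str          # "proven", "counterexample", or "timeout"
    runtime_s: float     # wall-clock time for this engine on this property

# A project-wide collection of records like these is what a later
# learning step would mine for patterns.
log = [
    EngineRunRecord("fifo_ctrl", "no_overflow", "bmc", "counterexample", 4.2),
    EngineRunRecord("fifo_ctrl", "no_overflow", "ic3", "proven", 38.0),
]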

And now, with machine learning, the algorithms that select which engine to run improve based on what has been found to work best, which in turn speeds up the verification process on a given design. But the real win, Strasser said, is that over time the collective wisdom from multiple projects improves the formal tools themselves.
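A minimal sketch of how such engine selection could be learned from past runs is shown below, using a generic classifier over simple property features. The features, engine labels, and use of scikit-learn are assumptions for illustration, not a description of how any commercial formal tool implements this.

# Hypothetical sketch: learn which engine to try first from past results.
# Feature choices, engine names, and scikit-learn usage are illustrative.
from sklearn.ensemble import RandomForestClassifier

# Each row: simple features of a property/design from earlier projects.
# Features: [proof depth reached, approximate state bits, has_memory (0/1)]
X = [
    [10,   200, 0],
    [40,  5000, 1],
    [12,   350, 0],
    [60, 12000, 1],
    [8,    150, 0],
    [55,  9000, 1],
]
# Label: which engine converged fastest on that property in past runs.
y = ["bmc", "ic3", "bmc", "ic3", "bmc", "ic3"]

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(X, y)

# For a new assertion, try the predicted engine first; fall back to the
# full engine portfolio if it does not converge within its time budget.
new_property = [[45, 7000, 1]]
print("try first:", model.predict(new_property)[0])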

 
