Integrating Results And Coverage From Simulation And Formal
By Tom Anderson, Technical Marketing Consultant
Not so long ago, formal verification was considered an exotic technology used only by specialists for specific verification challenges such as cache coherency. As chips grew ceaselessly in size and complexity, simulation, the traditional verification method, could not keep pace. The task of generating and running enough tests consumed enormous resources in terms of engineers, simulation licenses, and servers. Yet even unlimited simulation capability provided no guarantee of functional correctness. Constrained-random simulation, by its very nature, is probabilistic and has little chance of exercising enough of the design to find deep, corner-case bugs.
For these reasons, verification engineers turned increasingly to formal. They highly value the ability to exercise 100% of design behavior, find obscure bugs, and prove mathematically that no bugs remain. Today, virtually every chip development team makes at least some use of formal techniques. Applications (“apps”) handle many common verification tasks automatically, standardized assertion languages provide portability across tools, formal engines (“solvers”) are more powerful, and debug tools have improved. Methodologies provide guidance and give even new users the confidence to be successful.
Despite all this progress, there was an important aspect of using formal that was not well addressed until recently: integration of its results and coverage metrics with those from simulation. A unified view of verification status is essential for engineering leads and project managers. They need to account for the effort expended on formal verification using measures and methods familiar from the simulation domain. Adopting formal enables more complete verification and reduces overall verification effort. Managers making decisions on tools and methodologies look for proof of these benefits.
Formal tools have always generated results and coverage, but looking at this data independently from simulation doesn’t provide the project-level view that managers need. Results and coverage from formal and simulation must be integrated to provide this view. Integration boils down to three main requirements: using formal coverage analysis to make simulation more efficient, presenting side-by-side views of formal and simulation structural coverage, and annotating both formal and simulation results into a comprehensive verification plan.
One of the reasons that simulation alone is insufficient for verification is that tests may run for days or weeks trying to hit coverage targets that are actually unreachable. Fortunately, formal verification can analyze coverage targets, determine unreachability, and pass this information to the coverage viewer. This prevents the verification team from wasting simulation time trying to reach these targets, yielding a major increase in verification productivity. Further, the coverage viewer can adjust the metrics so that unreachable targets do not artificially depress coverage completion percentages. Overall, formal unreachability analysis significantly speeds up the process of coverage closure.
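The metric adjustment described above can be illustrated with a small sketch. This is a hypothetical model of what a coverage viewer does once formal analysis has flagged targets as unreachable; the target names and data layout are invented for illustration and do not reflect any tool's real database format.

```python
# Hypothetical sketch: adjusting a coverage completion percentage so that
# formally proven-unreachable targets no longer depress the metric.
# Target names and data structures are illustrative only.

def coverage_percent(targets, hit, unreachable):
    """Coverage measured over reachable targets only."""
    reachable = [t for t in targets if t not in unreachable]
    covered = [t for t in reachable if t in hit]
    return 100.0 * len(covered) / len(reachable) if reachable else 100.0

targets = {"fifo_full", "fifo_empty", "err_parity", "dbg_mode"}
hit = {"fifo_full", "fifo_empty"}   # reached by simulation tests
unreachable = {"dbg_mode"}          # proven unreachable by formal

print(coverage_percent(targets, hit, set()))         # naive: 2 of 4 -> 50.0
print(coverage_percent(targets, hit, unreachable))   # adjusted: 2 of 3 reachable
```

The team also stops spending simulation cycles on `dbg_mode`: the remaining gap (`err_parity` here) is known to be closable, which is what accelerates coverage closure.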
The second way that formal can integrate with simulation is to contribute its own coverage metrics. Simulation coverage shows whether any test reached a particular target. This is helpful for identifying areas of the design not being exercised. However, just hitting a coverage target does not mean that a bug in that part of the design could be detected. The only way to resolve this is to use mutation, in which errors are inserted into the design and tests are run to see whether the errors are detected. This adds a huge amount of overhead to simulation and is rarely used as part of the verification process.
Formal, model-based mutation analysis is automatic and much more efficient. No changes to the design code are required; there are no instrumentation or re-compilation steps. The resulting metrics show the parts of the design where errors would not be detected by any assertion. This information is helpful in determining whether the design has enough assertions and where additional assertions should be added. Model-based mutation coverage allows for a meaningful integration with simulation coverage. The verification team and project managers can view formal and simulation coverage results side by side, reduce effort overlap, and accelerate signoff, thus maximizing the return on investment (ROI) of the formal verification effort.
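The core mutation idea can be sketched in software terms: inject an error into a "design" and see whether the checks, playing the role of assertions, catch it. This is a minimal illustrative sketch of simulation-style mutation as described above; formal model-based mutation performs the equivalent analysis on the design model itself, with no source edits, instrumentation, or re-compilation. All names here are invented.

```python
# Minimal sketch of the mutation concept. A mutation that no check
# detects exposes a hole: that design behavior is unobserved by any
# assertion, so a real bug there could escape.

def design_ok(a, b):
    return a + b            # correct "design" behavior

def design_mutated(a, b):
    return a - b            # injected error (the mutation)

def checks(dut):
    # Assertion-like checks over a few stimulus values.
    return all(dut(a, b) == a + b for a, b in [(1, 2), (3, 4)])

assert checks(design_ok)            # original design passes all checks
assert not checks(design_mutated)   # mutation detected: a check fails
```

In the formal flow, the tool exhaustively determines which such injected errors no assertion would ever catch, pointing to where additional assertions are needed.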
Finally, it is vital that formal results be reflected in the verification plan. Modern planning tools allow for the definition of design requirements linked back to the product specification. Requirements are broken down into design features and then into the verification steps to test those features. Simulation results are annotated back into the plan so that managers know when all tests have been run successfully. Similarly, formal tools can annotate their results back into the plan to show when their part of the verification process is complete. This provides a unified view of verification progress, helping teams determine next steps and decide when the design is ready for tape-out.
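The annotation flow above can be sketched as merging two result streams into one plan. This is a hypothetical model with invented feature names and status values; real planning tools maintain their own databases and formats.

```python
# Hypothetical sketch: a verification plan whose features are covered by
# either simulation or formal, with results from both flows annotated
# back in to give a single, unified view of progress.

plan = {
    "fifo_overflow":  {"method": "formal",     "status": None},
    "reset_sequence": {"method": "simulation", "status": None},
}

sim_results = {"reset_sequence": "pass"}        # from the simulator
formal_results = {"fifo_overflow": "proven"}    # from the formal tool

for feature, entry in plan.items():
    results = sim_results if entry["method"] == "simulation" else formal_results
    entry["status"] = results.get(feature, "not run")

complete = all(e["status"] in ("pass", "proven") for e in plan.values())
print("plan complete:", complete)
```

With both flows annotating the same plan, a manager sees one completion status rather than two disconnected reports.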
Over the past few years, vendor-specific solutions providing all three of these integration points have emerged. However, verification teams do not want to be restricted to a single vendor's flow; such a restriction inhibits innovation and makes it impossible to choose best-in-class tools from multiple EDA vendors. This limitation was removed recently by the introduction of OneSpin's PortableCoverage solution, which allows its formal tools to integrate with simulators, coverage databases, coverage viewers, and verification planning tools from any vendor or combination of vendors. The final requirement for mainstream formal verification has at last been addressed.
More information can be found at: http://bit.ly/2PO0Ezt.
About the author
Tom Anderson is technical marketing consultant at OneSpin Solutions. His previous roles have included vice president of marketing at Breker Verification Systems, vice president of applications engineering at 0-In Design Automation, vice president of engineering at IP pioneer Virtual Chips, group director of product management at Cadence, and director of technical marketing at Synopsys. He holds a Master of Science degree in Computer Science and Electrical Engineering from M.I.T. and a Bachelor of Science degree in Computer Systems Engineering from the University of Massachusetts at Amherst.