In the news

EDA Challenges Machine Learning

By Brian Bailey, Semiconductor Engineering

People are looking at other areas in which machine learning may provide value. “Electronic system design, verification and project management data could be leveraged to improve the overall workflow efficiency or manage project risks,” points out Raik Brinkmann, president and CEO of OneSpin Solutions. “This would require the collection and consolidation of data along the design process across multiple tools. On a smaller scale, the effectiveness of individual methods could be further improved by combining advanced data analytics with new verification workflows, such as formal verification methods. In particular, gathering performance data during runtime over several episodes allows building predictive models for fine-tuning heuristics or projecting tool runtimes and verification results. Although the underlying predictive models will be specific to the task at hand, they share the requirement for good training data during their construction.”
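Brinkmann's point about projecting tool runtimes from gathered performance data can be sketched with a toy model. Everything below is an assumption made for illustration: the feature (gate count), the data points, and the linear form are invented, and do not describe any actual OneSpin flow.

```python
# Hedged sketch: fit past verification runs to project future tool
# runtimes, in the spirit of the quote above. Pure-Python ordinary
# least squares; the data points are invented for illustration.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical past episodes: (gate count in thousands, runtime in seconds).
runs = [(10, 12.0), (20, 25.0), (40, 47.0), (80, 98.0)]
a, b = fit_line([g for g, _ in runs], [t for _, t in runs])

def predict_runtime(gates_k):
    """Project runtime for a design of the given size."""
    return a * gates_k + b
```

As the quote notes, any such model is specific to the task at hand; the only shared requirement is good training data.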

Read more

Could Liquid IP Lead To Better Chips?

By Brian Bailey, Semiconductor Engineering

I know this is good for verification. Every one of those trial-and-error runs has to be verified. The move to abstraction is a huge one, and it makes verification more efficient and effective: the verification problem can be split into a high-level functional job and equivalence checking. Equivalence checking here means proving consistency between the high-level model and the RTL, which is hard.
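The split into a high-level model plus equivalence checking can be illustrated with a toy example. This is only a sketch: a real equivalence checker reasons formally about the RTL, whereas the hypothetical 4-bit saturating adder below is checked by brute force, which is feasible only at toy bit widths.

```python
# Toy illustration of equivalence checking: a high-level reference
# model vs. a bit-level "RTL-like" implementation, compared over the
# full input space. (Real checkers prove this formally; exhaustive
# simulation works only for tiny designs.)

WIDTH = 4
MAX = (1 << WIDTH) - 1

def spec_saturating_add(a, b):
    """High-level model: clamp the true sum to the representable range."""
    return min(a + b, MAX)

def impl_saturating_add(a, b):
    """Bit-level implementation: detect carry-out of the top bit."""
    s = (a + b) & MAX
    carry = ((a + b) >> WIDTH) & 1
    return MAX if carry else s

def check_equivalence():
    """Return a counterexample (a, b) if the models diverge, else None."""
    for a in range(MAX + 1):
        for b in range(MAX + 1):
            if spec_saturating_add(a, b) != impl_saturating_add(a, b):
                return (a, b)
    return None
```

If the two models ever disagree, the counterexample pinpoints exactly where the implementation departs from the high-level intent.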

Read more


Which Verification Engine?

By Ed Sperling, Semiconductor Engineering

Experts at the Table, part 2: The real value of multiple verification engines; cloud-based verification gains some footing, particularly with internal clouds.

The cloud is a great approach. We had a cloud solution early on, and it didn’t go anywhere. There were two issues. One was legal IP concerns: companies don’t want to send their IP to a cloud, and even if the engineers are willing, they don’t want to go to the company’s lawyers to make the case. However, companies now have their own clouds, which are very easy to set up. You can have emulators in huge rooms with fans, and that solves the problem. The bigger issue is the business model. The cloud provides a pay-per-use model, which is very effective for verification. You can buy some core verification licenses, and then pay per use for the surge when you need the extra runs. We’ve employed that quite successfully recently.

On the big data side, if you’re dealing with engines running very quickly and trying to speed them up, or with a massive amount of data coming out of those engines, it’s still the same issue. You have to find smart ways of handling verification collaboratively between the customers and the people building the platforms. If you look at Portable Stimulus or formal, you can think of these as smart verification platforms that can be configured to solve problems, whether they are working with post-processed data or processing with the engines directly. It doesn’t matter.

Read more

Doc Formal: the crisis of confidence facing verification III

By Dr. Ashish Darbari, Tech Design Forum

In the first two parts of this series, I described the verification crisis, explained how it came about, and began to describe the pillars of a responsive design verification (DV) flow and the components within them.

Part One defined a verification meta-model as a way to describe the key aspects of the crisis and laid out high-level ideas for alleviating it.

Part Two considered two of the four main pillars of an optimized DV flow: ‘Requirements and Specifications’ and ‘Verification Strategy and Plan’.

Read more

The Week In Review: Design

OneSpin returns with another holiday puzzle, this year challenging people to use formal tools to solve what may be the world’s hardest Sudoku grid. The deadline is Jan. 7th.
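For readers curious what solving Sudoku with formal tools means in spirit: a formal tool encodes the row, column, and box constraints and lets a solver search for a satisfying assignment. The backtracking sketch below mirrors that constraint structure in plain Python; it is illustrative only, not the intended formal-tool solution, and it says nothing about the actual contest grid.

```python
# Hedged sketch: Sudoku as a constraint problem. A formal flow would
# emit the row/column/box constraints as SAT or SMT clauses; this toy
# solver enforces the same constraints via backtracking search.

def candidates(grid, r, c):
    """Values not yet ruled out by the row, column, and 3x3 box."""
    used = set(grid[r]) | {grid[i][c] for i in range(9)}
    br, bc = 3 * (r // 3), 3 * (c // 3)
    used |= {grid[i][j] for i in range(br, br + 3)
                        for j in range(bc, bc + 3)}
    return [v for v in range(1, 10) if v not in used]

def solve(grid):
    """Fill empty cells (0) in place; return True if a solution exists."""
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                for v in candidates(grid, r, c):
                    grid[r][c] = v
                    if solve(grid):
                        return True
                    grid[r][c] = 0
                return False  # no candidate fits this cell
    return True  # no empty cells left
```

A hard grid is hard precisely because the givens prune very few candidates early on, forcing the search (or the solver) to explore a large space.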

Read more

Big Challenges, Changes For Debug

By Ann Steffora Mutschler, Semiconductor Engineering

Indeed, as silicon geometries continue to shrink, SoC platforms on single devices become larger and more complex, noted Dave Kelf, vice president of marketing at OneSpin Solutions. “The debug complexity of these devices increases exponentially with design size. Furthermore, the error conditions that occur may be due to complex corner-case problems, which are hard to track down. Innovative debug techniques are required, and these might make use of unexpected alliances between different tools. For example, a fault that becomes apparent during SoC emulation can be debugged using bug-hunting techniques applied with a formal tool, with assertions created to exhaustively analyze the specific condition. The continued shrinkage of geometries essentially calls for inventive and diverse combinations of tools, stretching their capabilities to meet unexpected requirements.”

Read more

The Uncontrolled Rise Of Functional Safety Standards

By Sergio Marchese, Semiconductor Engineering

Over the past 30 years, advances in software and hardware have made it possible to create sophisticated systems controlling crucial aspects of complex equipment, from rolling and pitching in aircraft to steering and braking in cars. The processes and methods defined in functional safety standards are crucial to ensuring that these systems behave as expected, and safely, even when certain parts, such as a microprocessor or other hardware component, malfunction. Standards often require strict processes to identify potential hazards in the final product, assess the associated risk, mitigate it with appropriate safety measures, and provide evidence that the residual risk is acceptable.

Read more

Press Contact

Nanette Collins
» nanette@nvc.com
» +1 617 437 1822