Here is the concluding part of my discussion with Dr. Walden C. Rhines, chairman and CEO, Mentor Graphics.
Getting billion-gate design correct
In EDA, is there now some chance of getting a billion-gate design correct on first pass?
Dr. Rhines said: “Absolutely! Today’s methodology is up to the task, and customers have already reported ‘billion gate equivalent’ designs, i.e., 4 billion transistors, correct on first pass. Correct logic is a much easier challenge than full production readiness on first pass!
“Achieving targeted power dissipation and timing has been more of a challenge but that’s where recent tool improvements are having their greatest impact. Almost all designs of this size now go through exhaustive verification, including power analysis, using emulation. That change in methodology has increased the cycles of verification by more than three orders of magnitude.
“Beyond simply achieving functional silicon with acceptable power and timing, more and more companies are now using EDA tools to assure a rapid ramp to high yield in production. This requires a whole new generation of ‘design for test’ tools directed at defect-driven yield analysis.
“By our measures, some of the top semiconductor companies analyze more than 500,000 defective parts every day to identify design and process problems.”
Standardization of SoC verification flow
Next, what is the status of the standardization of SoC verification flow today?
He said that Mentor Graphics has long worked on providing leading functional verification products. “We are doubling down on perfecting tools that are part of an enterprise platform, where common testbench stimulus, verification IP, and standard verification languages can be used up and down the tool chain. However, the flow belongs to the customer.
“We do not try to enforce a ‘standard verification flow’. We are happy to accommodate unique customer needs, and we trust our customers to know the unique requirements of their own markets.”
It would also be interesting to know what has been happening regarding coverage and power across all aspects of verification.
According to Dr. Rhines, power-management debug has permeated all aspects of traditional HDL-based verification. For large SoCs, debugging power-management-related problems is a very difficult task, because power is managed wholly or in part by software. Increasingly, validation of power-managed designs, including power estimation, requires hardware-accelerated solutions such as emulation and prototypes.
New releases of the UPF standard include many new capabilities that help verify power usage, but they require additional analysis effort. Examples include dynamic power-related messages, automatic power-specific assertion generation, and support for the entire flow from simulation through emulation and prototypes.
In addition, many designs now use new tools for power-management verification, static analysis, rule-based power checks, and power-aware logic equivalence checking.
Similarly, what is happening in active power management today?
He said that active power management creates new functional verification requirements. Traditionally, power has been managed via clock gating, power gating, and dynamic voltage and frequency scaling.
The first two methods (clock and power gating) directly impact functionality. They necessitate isolation with clamp values on the inputs or outputs of a power-gated block of logic, retention registers, and gating logic for clocks, as well as the associated control signals or registers and the state machines that manage the transitions from one power state to another.
Verifying the active power-management logic and control states requires UPF support in verification solutions. The difficulty of debugging power-management issues drives the value of dynamic checks that ensure valid power-down and power-up sequences, save/restore or reset/write-before-read behavior of registers in power domains, and proper activation and de-activation of isolation logic values.
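The kind of dynamic power-sequencing check described above would normally be written as UPF-driven assertions inside a verification tool. As a tool-neutral illustration only, here is a minimal Python sketch of such a monitor; the event names, the single power domain, and the assumption that retention (save/restore) is always required are all hypothetical choices for this example.

```python
# Hypothetical sketch of a dynamic power-sequencing check for one power
# domain, in the spirit of the UPF-style checks described above.
# Assumption (illustrative): isolation must be enabled and state saved
# before power-off, and state restored before isolation is released.

LEGAL_NEXT = {
    # event        -> events that may legally follow it
    "iso_disable": {"iso_enable"},
    "iso_enable":  {"save"},
    "save":        {"power_off"},
    "power_off":   {"power_on"},
    "power_on":    {"restore"},
    "restore":     {"iso_disable"},
}

def check_sequence(events, start="iso_disable"):
    """Walk an event trace; return (ok, offending_event).

    Only the event names in LEGAL_NEXT are expected in the trace.
    """
    state = start
    for ev in events:
        if ev not in LEGAL_NEXT[state]:
            return False, ev  # illegal transition: flag the event
        state = ev
    return True, None

# A legal power-down/up cycle, and one that skips retention save/restore.
good = ["iso_enable", "save", "power_off", "power_on", "restore", "iso_disable"]
bad  = ["iso_enable", "power_off", "power_on", "iso_disable"]
```

Running `check_sequence(good)` accepts the full cycle, while `check_sequence(bad)` flags the `power_off` that occurs before retention state was saved.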
Intelligent software driven verification
Let us also look at the status of intelligent, software-driven verification today.
According to Dr. Rhines, to bring automated tests to environments beyond transaction-oriented, block-level environments, a standardized input-specification language is needed to specify these tests. That is why Accellera has launched a working group, titled the Portable Stimulus Working Group (PSWG).
The goal is to collect requirements, garner technology contributions, and specify a standardized input language to specify test intent that can be targeted to a variety of verification platforms.
Mentor is helping to drive the activity in the PSWG by contributing technology and expertise to the standardization process. A key area of focus is on developing a new specification language called the “Portable Stimulus Standard” (PSS).
Since the PSS has yet to be formally released, Mentor has created its own version of the PSS language and a methodology around the Questa inFact Intelligent Testbench Automation solution, one of the leaders in this space.
The value of a portable stimulus specification is that it can generate tests that drive verification in simulation, emulation or prototypes and allows verification engineers to work at a higher level of abstraction.
It isolates the specification from the implementation of how the test is generated and driven to the design allowing higher levels of reuse. Test specifications could be implemented as a SystemVerilog-UVM sequence or as C code run on the processor subsystem of an SoC design in emulation or prototype.
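The separation of test intent from test implementation can be illustrated outside any real PSS tooling. The sketch below is not the PSS language; it is a hypothetical Python rendering of one abstract scenario retargeted to two back-ends, echoing the idea that the same specification can become a SystemVerilog-UVM sequence body or C code for a processor subsystem. All function and register names are invented for this example.

```python
# Illustrative only: one abstract test scenario, two hypothetical targets.
# Each scenario step is (operation, address, data-or-None).
SCENARIO = [("write", 0x10, 0xAB), ("read", 0x10, None)]

def to_uvm(scenario):
    """Render the scenario as SystemVerilog-UVM-style sequence body text."""
    lines = []
    for op, addr, data in scenario:
        if op == "write":
            lines.append(f"do_write('h{addr:x}, 'h{data:x});")
        else:
            lines.append(f"do_read('h{addr:x});")
    return "\n".join(lines)

def to_c(scenario):
    """Render the same scenario as C code for an embedded processor test."""
    lines = []
    for op, addr, data in scenario:
        if op == "write":
            lines.append(f"reg_write(0x{addr:x}, 0x{data:x});")
        else:
            lines.append(f"(void)reg_read(0x{addr:x});")
    return "\n".join(lines)
```

The point of the sketch is the reuse: the scenario is written once, and each back-end decides how the test is generated and driven to the design.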
In addition to the value of portable specifications, the standard allows vendors to optimize test generation so as to close more quickly on coverage of all the test sequences that can be generated from the specification. A 10x improvement in test-coverage efficiency is a key value of the Questa inFact solution.
Integrated simulation/emulation/software verification environments
What has been going on in the integrated simulation/emulation/software verification environments that have emerged?
As per Dr. Rhines, recognizing the demand for a seamless verification environment, Mentor’s Enterprise Verification Platform (EVP) combines Questa advanced verification solutions, Veloce OS3 global emulation resourcing technology, and Visualizer, a powerful debug environment, into a globally accessible, high-performance datacenter resource.
It features global resource management that supports project teams around the world, maximizing both user productivity and total verification return on investment. As a result, the verification process is completely abstracted from the underlying verification engines from first design thoughts, through silicon, to final product.
This eliminates the barriers to hardware acceleration, combining the functionality and observability of simulation-based verification with the speed of emulation.
Mentor has also expanded its portable test and stimulus technology across the EVP to enable users to verify more with the same resources. Now RTL design and verification engineers can use the same advanced test techniques across block, subsystem, and SoC level testing, and execute these tests on simulation, emulation, and post-silicon testing platforms.
The Portable Test and Stimulus technology provides a single description of scenarios to be exercised that can be reused by all stakeholders across the various stages of the verification cycle: from architects to RTL verification engineers to post-silicon validation engineers and software teams on a variety of platforms from simulation and virtual platforms to FPGA prototypes, emulation, and even post-silicon.
The value of integrated verification flows is time: getting the product to market sooner. Time savings come from the ability to move quickly from one verification engine to the next while reusing as much of the verification and validation environments and tests as is relevant and desired.
For example, Mentor has defined a way to use UVM such that testbenches are easily reusable and guaranteed portable from simulation to emulation, where tests can achieve a 500x increase in throughput. Mentor also collects coverage information from all verification engines, including formal, into a common database, enabling verification teams to easily see what coverage has been achieved and what coverage holes remain.
Mentor’s CoverageCheck solution uses that database to determine whether a coverage hole is reachable given the specific configuration of the device under test. Users also experience time savings through continuity of solutions throughout the flow, eliminating the need to learn different tools with different capabilities.
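The common-database idea, merging coverage from every engine and then discounting holes proven unreachable for the device configuration, can be sketched in a few lines of Python. This is a conceptual illustration, not Mentor's database format; the bin names and the three engines are invented for the example.

```python
# Hedged sketch: union coverage hits from multiple verification engines,
# then report holes after filtering bins proven unreachable for this
# configuration of the device under test. All names are illustrative.

def merge_coverage(*engine_results):
    """A coverage bin hit by any engine counts as covered."""
    covered = set()
    for hits in engine_results:
        covered |= hits
    return covered

def coverage_holes(all_bins, covered, unreachable=frozenset()):
    """Holes are bins neither covered nor proven unreachable."""
    return sorted(all_bins - covered - unreachable)

all_bins = {"reset", "fifo_full", "ecc_err", "dbg_mode", "irq_storm"}
sim = {"reset", "fifo_full"}       # hit in simulation
emu = {"reset", "ecc_err"}         # hit in emulation
covered = merge_coverage(sim, emu)
holes = coverage_holes(all_bins, covered, unreachable={"dbg_mode"})
```

In this toy run, `dbg_mode` is excluded as unreachable for the chosen configuration, leaving `irq_storm` as the only genuine hole for the team to target.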
Mentor’s Visualizer solution provides a single, high-powered, high-productivity debug environment for simulation, formal, and hardware acceleration. Mentor’s Enterprise Verification Platform can save weeks to months in getting a product to market.
Emerging trends in chip design
Finally, what are the emerging trends in chip design today?
Dr. Rhines noted: “As usual, Moore’s Law is driving new challenges. Rapid development of 7nm technology now makes possible a big increase in design complexity but it comes at the cost of a lot of process complexity.
“Initial generations will not benefit from EUV, so double and even quadruple patterning are increasing the number of mask levels substantially, and thus driving up the cost of prototyping.
“One new trend is the emergence of multi-technology designs implemented in multi-chip, or single-chip, packages. IoT requires combinations of analog, digital, RF, MEMS or photonics; ‘system’ simulation of these functions is a real challenge.
“Another is the increasing number of mixed-signal designs that place unique requirements on both integrated simulation and physical implementation. Finally, the move to emulation continues to gather momentum, as verification of complex digital chips exceeds the capabilities of traditional simulation.”
Finally, with Siemens buying Mentor, is it still going to be business as usual, or will there be some change?
Dr. Rhines concluded: “It will be business as usual, but, Mentor will have access to more resources. You can expect an increased investment in targeted areas of IC design as well as EDA applied to system design. I’ll continue in my role.”