Dr. Wally Rhines
Here is the concluding part of my discussion with Dr. Walden C. Rhines, chairman and CEO, Mentor Graphics.
Getting billion-gate design correct
In EDA, is there now some chance of getting a billion-gate design correct on first pass?
Dr. Rhines said: “Absolutely! Today’s methodology is up to the task and customers have already reported ‘billion gate equivalent’ designs, i.e., 4 billion transistors, correct on first pass. Correct logic is a much easier challenge than full production readiness on first pass!
“Achieving targeted power dissipation and timing has been more of a challenge but that’s where recent tool improvements are having their greatest impact. Almost all designs of this size now go through exhaustive verification, including power analysis, using emulation. That change in methodology has increased the cycles of verification by more than three orders of magnitude.
“Beyond simply achieving functional silicon with acceptable power and timing, more and more companies are now using EDA tools to assure a rapid ramp to high yield in production. This requires a whole new generation of ‘design for test’ tools directed at defect-driven yield analysis.
“By our measures, some of the top semiconductor companies analyze more than 500,000 defective parts every day to identify design and process problems.”
Standardization of SoC verification flow
Next, what is the status of the standardization of SoC verification flow today?
He said that Mentor Graphics has long worked on providing leading functional verification products. “We are doubling down on perfecting tools that are part of an enterprise platform where common testbench stimulus, verification IP, and standard verification languages can be used up and down the tool chain. However, the flow belongs to the customer.
“We do not try to enforce a ‘standard verification flow’. We are happy to accommodate unique customer needs and trust our customers to know the unique requirements of their own markets.”
Next, what has been happening with coverage and power across all aspects of verification?
According to Dr. Rhines, power-management debug has permeated all aspects of traditional HDL-based verification. For large SoCs, debugging power-management-related problems is a very difficult task, because power is managed wholly or in part by software. Increasingly, validation of power-managed designs, including power estimation, requires hardware-accelerated solutions such as emulation and prototypes.
New releases of the UPF standard include many new capabilities that help verify power usage, though they require additional effort to analyze. Examples include dynamic power-related messages, automatic power-specific assertion generation, and support for the entire flow from simulation through emulation and prototypes.
In addition, many designs now use new tools for power-management verification: static analysis, rule-based power checks, and power-aware logic equivalence checking.
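To make the idea of power-aware logic equivalence checking concrete, here is a minimal sketch in Python. Real equivalence checkers operate formally on gate-level netlists; this toy version simply compares a reference description of isolation-clamp behavior against an optimized gate-level form by exhaustive enumeration. All function names and the clamp-to-0 policy are illustrative assumptions, not from any specific tool.

```python
from itertools import product

def isolation_ref(iso_en, sig):
    """Reference power intent: when isolation is enabled, clamp the
    power-gated block's output to 0; otherwise pass the signal through."""
    return 0 if iso_en else sig

def isolation_impl(iso_en, sig):
    """Hypothetical optimized gate-level form: AND the signal with the
    inverted isolation enable."""
    return sig & (1 - iso_en)

def equivalent(f, g, n_inputs):
    """Toy equivalence check: exhaustively compare two combinational
    functions over every input combination."""
    return all(f(*v) == g(*v) for v in product((0, 1), repeat=n_inputs))

result = equivalent(isolation_ref, isolation_impl, 2)  # True
```

Exhaustive enumeration only scales to a handful of inputs; production tools use formal methods (BDDs, SAT) to prove the same property on full netlists, with the power intent taken from the UPF description rather than hard-coded.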
Similarly, what is happening in active power management today?
He said that active power management creates new functional verification requirements. Traditionally, power has been managed via clock gating, power gating, and dynamic voltage and frequency scaling.
The first two methods (clock and power gating) directly impact functionality. They necessitate isolation with clamp values on the inputs or outputs of a power-gated block of logic, retention registers, and gating logic for clocks, as well as the associated control signals, registers, and state machines that manage the transitions from one power state to another.
Verification of the active power-management logic and control states requires UPF support in verification solutions. The difficulty of debugging power-management issues drives the value of dynamic checks that ensure valid power-down and power-up sequences, save/restore or reset/write-before-read behavior of registers in power domains, and proper activation and de-activation of isolation logic.
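The dynamic sequence checks described above can be sketched as a simple protocol monitor. The Python below models one hypothetical power-gated domain whose legal ordering is: isolate, save retention state, power off, then restore and de-isolate on the way back up. The state names and ordering rules are illustrative assumptions for the sketch, not the actual checks performed by any UPF-based tool.

```python
# Toy dynamic checker for a power-gated domain's down/up sequencing.
# Each state maps to the set of legal next events (illustrative rules).
LEGAL_NEXT = {
    "ON":        {"ISOLATE"},        # must isolate before touching power
    "ISOLATE":   {"SAVE", "ON"},     # save retention state, or abort
    "SAVE":      {"OFF"},            # retention save just before power-down
    "OFF":       {"RESTORE"},        # restore registers after power returns
    "RESTORE":   {"DEISOLATE"},      # only then release isolation clamps
    "DEISOLATE": {"ON"},
}

def check_sequence(events, start="ON"):
    """Walk an event trace and flag the first transition that violates
    the power-down/power-up protocol. Returns (ok, message)."""
    state = start
    for ev in events:
        if ev not in LEGAL_NEXT[state]:
            return False, f"illegal transition {state} -> {ev}"
        state = ev
    return True, "sequence OK"

# A complete, correctly ordered power cycle passes the check...
ok, msg = check_sequence(["ISOLATE", "SAVE", "OFF", "RESTORE", "DEISOLATE", "ON"])
# ...while powering down without saving retention state is flagged.
bad, why = check_sequence(["ISOLATE", "OFF"])
```

In a real flow these checks run as assertions during simulation or emulation, with the legal sequences derived from the UPF power intent rather than a hand-written table.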
It has always been a great pleasure chatting with Dr. Walden (Wally) C. Rhines, chairman and CEO of Mentor Graphics. It has been a while since we discussed the global semiconductor industry in such detail, which makes this latest conversation all the more memorable.
First, I asked Dr. Wally Rhines how Mentor expects the global semiconductor industry to perform in 2017.
He said that the growth of the semiconductor market has averaged anywhere between 3-5 percent throughout the 2000s. In the most recent years, since 2008/2009, IC units have grown consistently at a rate of 6-8 percent, indicating continued demand for semiconductor technology.
ASPs have been drifting downward for a long time, with only a handful of exceptions since 1995. The trend appears to have moderated in the last decade, but the downward pressure on pricing has held overall revenue to the 3-5 percent growth rate mentioned earlier.
For 2017, most industry research firms and analysts are expecting a relatively positive year (the average forecast across 10 semiconductor research firms is currently at 5.5 percent). Mentor Graphics expects that number to be very conservative due to changing dynamics in the semiconductor industry and how the market is actually measured.
Dr. Rhines said: “Within the last decade there has been an emergence of systems companies designing chips for internal consumption. Those chips are typically not measured at all, or are only partially measured if they are produced by a foundry company. We see the trend in several important areas like smartphone manufacturers.
“Apple and Samsung have been manufacturing their own application processors for some time. However, there is an ever increasing list of other market leaders following with their own designs.
“We are also seeing a number of companies involved with cloud services or other data center intensive companies designing chips for their internal consumption. Unless something unexpected occurs, 2017 should be a good year for the semiconductor industry with growth above average.”
Next, how was the growth in EDA for 2016? How will the industry perform in 2017? And where does Mentor fit in all of this?
According to Dr. Rhines, EDA had a strong year in 2016 with total growth of about 9 percent overall, including SIP. Tools alone had growth of about 7.5 percent. Demand for tools continues to look strong for the industry as there remains a strong focus on the leading-edge driving the need for advanced design technology.
Moore’s law continues to drive the industry into smaller nodes as companies are preparing for 10nm, 7nm and 5nm nodes. Advanced nodes continue to drive EDA tool adoption in manufacturing and design.
Additionally, increased challenges and methodologies in functional verification drive technology adoption in enterprise verification including areas like emulation, which had a resurgence in 2016 after several flat years.
Dr. Rhines elaborated: “We are seeing increased activity in areas like ESL as well with ESL Synthesis having its best year ever. PCB design is also enjoying a resurgence in growth as designs become more complex and new design methodologies like System of Systems design begin to emerge.
“In 2016, Mentor Graphics reported an all-time record revenue of $1.285 billion, 8.6 percent growth, a full point higher than industry tool growth of 7.5 percent. The proposed merger with Siemens should bring additional resources to Mentor’s R&D and customer support capability, and our fourth-quarter results suggest that customers are pleased with the outlook.”
Part II of this interview appears in April where I will be discussing standardization of SoC verification flow, billion-gate design, power management, etc.
Thanks are due to Raghu Panicker, country sales director, and Veeresh Shetty, marketing manager – Europe and India, Mentor Graphics.