PSS complementary to UVM: Dr. Wally Rhines

Here is the concluding part of my discussion with Dr. Walden Rhines, chairman and CEO, Mentor, A Siemens Company.

Has the PSS been formally released? What are its implications?

Dr. Rhines said: “Accellera released an Early Adopter spec for public review at DAC in June 2017 and is currently working to complete it in preparation for a 1.0 release in 2018. Accellera plans to have a ‘1.0 Preview’ version available in February 2018 (at DVCon US) for another 30-day public review period. Then, it will do one more cleanup pass and submit to the Accellera Board for approval in May 2018.

“The expectation is that the Board will approve the Portable Stimulus Standard 1.0 version in June 2018, prior to DAC. Mentor plans to have Questa inFact updated by then, so that it fully supports the new standard when it comes out.

“As for the implications, we expect the Portable Stimulus standard to be the next advancement in abstraction and productivity for SoC verification. It is not expected to replace UVM, but rather be complementary to UVM to improve coverage closure, verification efficiency, and effectiveness at the block level.

“The ability to re-use the verification intent expressed in PSS from a block-level UVM environment to a software-driven, embedded-processor SoC environment, on multiple platforms (simulation, emulation, FPGA prototyping, etc.), will provide a quantum leap in productivity.

“Since the Portable Stimulus specifications are declarative, tools can fully analyze the verification-intent description at the system level and generate multiple correct-by-construction implementations of use-case tests, on multiple platforms, from a single specification, without requiring the verification team to rewrite the tests in UVM for the blocks and C for the system.”
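The idea of one declarative intent description driving multiple generated test implementations can be sketched in plain Python. This is a toy model, not PSS syntax, and every name in it (the actions, templates, and function) is hypothetical, purely for illustration:

```python
# Toy illustration of the declarative-intent idea behind PSS (NOT real PSS
# syntax): one abstract model of legal stimulus is expanded into concrete
# tests for different target platforms. All names here are hypothetical.

from itertools import product

# Declarative intent: legal actions and their parameters, not test code.
INTENT = {
    "actions": ["dma_xfer", "crypto_op"],
    "buffer_sizes": [64, 256],
}

# Platform-specific realizations of the same abstract action.
TEMPLATES = {
    "uvm": "start_sequence({action}, size={size});",
    "c_sw": "run_test_{action}({size});",
}

def generate_tests(intent, platform):
    """Expand the declarative intent into concrete tests for one platform."""
    tmpl = TEMPLATES[platform]
    return [tmpl.format(action=a, size=s)
            for a, s in product(intent["actions"], intent["buffer_sizes"])]

uvm_tests = generate_tests(INTENT, "uvm")   # block-level UVM environment
sw_tests = generate_tests(INTENT, "c_sw")   # software-driven SoC environment
```

The point of the sketch is that the intent model is written once; each platform gets its own generated realization, which is the reuse Dr. Rhines describes.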

By the way, are the semiconductor/EDA companies re-examining designs, rather than analyzing more than 500,000 defective parts every day to identify design and process problems? If yes, how?

He said: “With today’s increased design complexity, they do both: re-examine designs before manufacturing and analyze parts afterwards. The complexity of today’s designs and manufacturing processes requires multiple approaches to achieve high yields in each new node that is rolled out.

“Design for manufacturing and design for yield are a must. However, the knowledge of the specific design practices that need to be followed for a new node is developed in multiple stages: pre-silicon, test chips, the first production design, and high production volume.

* Pre-silicon: Simulation models are used for initial design rules. Many assumptions are made and care must be taken to balance the benefit with potential overdesign for a process that will mature over time.

* Test chips: Early test chips try to mimic the major features of a real design; however, their limited complexity and volume mean some design rules can’t be discovered at this stage.

* First production design: Additional complexity of a real design and increased volumes expose more issues that need to be fed back to design for future revisions or the next design on a node.

* High production volume and additional designs introduced: High production volume and each subsequent design can benefit from the learnings at the previous stages. Many issues during this phase are resolved with process improvements, but continuous learning still remains key.

“The challenge is not eliminating the later learning phases, as these will never go away. Rather, the challenge is for the industry to maximize the learning at each phase and establish a continuous improvement cycle in design to take advantage of the knowledge gained. This is the foundational idea in closed-loop DFM, a process that maximizes the design-for-manufacturing benefit throughout all phases.”

Let’s also look at verification. What is the latest regarding coverage and power across all the aspects of verification?

Dr. Rhines added: “Actually, the recent trends have expanded to multiple concerns that cut across all aspects of verification, beyond coverage and power, such as security and safety. One driving force behind these trends is the convergence of computing, networking, and communications technologies. This is driving new markets, such as the Internet-of-things (IoT) ecosystem and automotive.

“A common theme across these emerging systems is the need for security, safety, and low power, whether you are talking about IoT edge devices or high-availability systems in the cloud. These new challenges have opened innovation opportunities, enabling us to rethink the way we approach verification. For example, concerning coverage, new statistical metrics have emerged that provide deep system-level analysis capabilities by leveraging data analytics techniques. This insight has become essential for system-level performance analysis.”

Trends in chip design
Industry folks would like to know: what are the new, emerging trends in chip design for 2018?

Dr. Rhines said: “We are now seeing multiple opportunities to leverage data analytics and machine learning to solve a variety of engineering problems—ranging from significantly reducing standard cell, memory, and I/O characterization time to providing deep functional and performance insight through system-level analysis.

“In addition, high-level design has finally taken off, as algorithmic experts develop AI and pattern-recognition approaches and implement them in the data paths of chips, or FPGAs, to produce order-of-magnitude improvements.”

For that matter, besides AI/robotics and IoT, what are the top trends that you foresee in 2018?

He added: “EDA tools have traditionally dealt with two-dimensional routing problems on an integrated circuit or printed circuit board. We are now successfully applying basic EDA technology to design everything from integrated circuits to full systems. For example, automotive design has traditionally been driven by mechanical design. Now, the differentiation and capability of cars are increasingly determined by electronics. That’s why total system simulation has become a requirement.

“The electronic wiring in a car or plane is a three-dimensional problem. With the increase in electronics comes increasing concern about the safety of the electronics. Automotive electronics developers must deal with new regulations like ISO 26262, which pertains to electronics safety, in addition to environmental requirements.

“The basic task of “sensor fusion” has stimulated new electronic architectures, as we attach more and more visual, radar, LiDAR, and other sensors to the car. There is no way to reliably design vehicles and aircraft without virtual simulation of electrical behavior.

“Beyond the automotive and aerospace markets, there are a host of new technologies that require new electronic and integrated circuit (IC) capabilities. These include the Internet of Things, Artificial Intelligence (AI) and machine learning, to name a few. These technologies will drive new computer architectures, innovative packaging, and the next generation of semiconductor technology.

“Thermal analysis, which was once considered important only for system design, has now become a key part of IC design. We are capitalizing on our long history with thermal and computational fluid dynamics. Finally, new IT companies like Google, Alibaba, Facebook and others are designing their own integrated circuits. They will need complete flow solutions for sensor design, and today, Tanner EDA provides the oldest and best capability.”


