Design and verification

How do we build the verification engineers of the future?

At the Verification Futures Conference 2023, Francois Cerisier, CEO of AEDVICES Consulting, presented on building the verification engineers of the future.

We need more verification engineers! Fewer students are enrolling in electrical engineering programs, yet the future of microelectronics involves more than 1 million additional skilled jobs by 2030. On top of that, there is a global semiconductor talent shortage.

How many of these jobs will be about verification? Let’s assume 15 percent, or 150,000 verification engineers to be found over the next six years. That is about 25,000 verification engineers per year.

There are several challenges. How do we attract students and other engineers to verification? Can we teach the ‘verification mindset’? If not, how can we identify it up front, before we invest in training? How do we accelerate the ramp-up on techniques and verification Methodology with a big ‘M’? Can we train verification engineers for new paradigms, such as the use of AI to help in the verification process?

To attract engineers to verification, we need to improve the perception of verification. Next comes the employer brand, followed by testimonials. Will it be enough?

Can we accelerate the ramp-up? Where are the universities? They need to teach computer science, architecture, SoC, and more, along with RTL design, FPGA, backend, process, and so on. And what about verification? How many hours are spent learning verification? What verification do they teach? Is it only VHDL testbenches? The SystemVerilog language, UVM classes, and state-of-the-art UVM architecture all matter; a minimal example follows.
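
For flavor, here is a minimal sketch, assuming nothing beyond a standard UVM installation, of the kind of class structure a language/methodology course introduces. All names are illustrative.

    // A minimal UVM test skeleton; all names are illustrative.
    `include "uvm_macros.svh"
    import uvm_pkg::*;

    class my_env extends uvm_env;
      `uvm_component_utils(my_env)
      function new(string name, uvm_component parent);
        super.new(name, parent);
      endfunction
      // Agents, scoreboard, and coverage would be built in build_phase().
    endclass

    class base_test extends uvm_test;
      `uvm_component_utils(base_test)
      my_env env;
      function new(string name, uvm_component parent);
        super.new(name, parent);
      endfunction
      function void build_phase(uvm_phase phase);
        super.build_phase(phase);
        env = my_env::type_id::create("env", this);
      endfunction
    endclass

Courses tend to stop at this mechanical layer; the harder questions below are about what to put inside it.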

There is good training on languages, tools, and more. So, what about methodology? There is very little about methodology in UVM itself. What about ‘how to write a verification plan’? What about asking the right questions? What about the verification mindset?

AEDVICES Consulting has worked with STMicroelectronics to develop a full program. This definitely accelerates the study of verification, but it is still not enough!

Verification mindset
In microelectronics, a verification mindset refers to the approach of thoroughly verifying the functionality and performance of an electronic design before it is manufactured and deployed. It is an essential step in the design process, ensuring that the final product works as intended and meets the required specifications. Even ChatGPT knows about it.

A verification mindset requires a systematic and rigorous approach to verifying the design through a series of tests and simulations. It involves analyzing the design at every level of abstraction, from high-level system design down to individual components and circuits.
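
As a concrete, hedged illustration of that rigor, here is a minimal sketch of turning design intent into executable checks with SystemVerilog assertions. The signal names and timing are assumptions, not taken from any particular design.

    // Hypothetical protocol checks: every request must be acknowledged
    // within 4 cycles, and the FIFO must never be pushed while full.
    module handshake_checks (
      input logic clk, rst_n,
      input logic req, ack,
      input logic push, full
    );
      req_ack: assert property (@(posedge clk) disable iff (!rst_n)
        req |-> ##[1:4] ack)
        else $error("req not acknowledged within 4 cycles");

      no_overflow: assert property (@(posedge clk) disable iff (!rst_n)
        !(push && full))
        else $error("push while FIFO full");
    endmodule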

And what about AI? Is the verification job going to move to AI? Is AI going to help, or are we going to be replaced? If AI can verify, why can’t it design without bugs? AI can accelerate things: it can compare tons of data with your design, help in debug, and accelerate coverage closure. If the job is going to change, how are we going to be trained for it?

There are some directions. We can improve marketing and communications. We can get the universities involved and address the need for verification engineers. Technical training alone is not enough; we need to create the verification mindset. We also need to recognize the expertise.

CHIPS Act and its impact on design and verification markets

Design and Verification Conference, DVCon USA 2023, started yesterday at the DoubleTree by Hilton Hotel in San Jose, USA.

Accellera Systems Initiative hosted a luncheon featuring Bob Smith, Executive Director of the SEMI ESD Alliance, who spoke on “The CHIPS Act and Its Impact on the Design & Verification Markets.” SEMI is said to have over 2,400 members globally.

Bob Smith began with a view of the design ecosystem today. The electronic systems and products market is currently worth over $2 trillion. The fabless/foundry and IDM market is said to be around $419 billion. The equipment and materials and OSATs/C-Subs market is worth about $154 billion. Finally, the design automation and IP/services market is said to be about $13 billion today.

CHIPS stands for Creating Helpful Incentives to Produce Semiconductors; the official name is the CHIPS and Science Act of 2022. It was signed into law in August 2022.

The CHIPS Act’s mission is to strengthen and revitalize semiconductor R&D, semiconductor manufacturing, and investment in American workers. Priorities include meeting economic and national security needs, ensuring long-term leadership in the semiconductor sector, strengthening and expanding regional clusters, catalyzing private-sector investment, generating benefits for a broad range of stakeholders and communities, and protecting taxpayer dollars.

The US Department of Commerce (DoC) will oversee $50 billion in investments to expand domestic manufacturing of mature and advanced semiconductors. The budget allocation for the CHIPS Act stands at $52.7 billion: $39 billion for manufacturing, providing incentives to spur the development of new semiconductor fabs in the USA; $11 billion for R&D programs and workforce development; and $2 billion for legacy chip production and the Microelectronics Commons.

For domestic manufacturing, there are incentives to develop domestic semiconductor manufacturing, including ‘legacy’ chip production. For Commerce R&D and workforce development, there are plans for the National Semiconductor Technology Center (NSTC), the National Advanced Packaging Manufacturing Program (NAPMP), and other R&D and workforce development programs.

The Microelectronics Commons (CHIPS for America Defense Fund) covers university-based prototype-to-fab semiconductor technology transition, DoD-unique applications, and workforce training. The CHIPS for America International Technology Security and Innovation Fund will work with foreign governments to coordinate security, supply chain, and communications. The CHIPS for America Workforce and Education Fund addresses the 90,000 new domestic workers needed by 2025.

Design and verification focus
Where do design and verification fit in? There are four entities under the DoC. The NSTC will serve as the focal point for research and engineering throughout the semiconductor ecosystem, advancing and enabling disruptive innovation to provide US leadership in the industries of the future. The NAPMP will strengthen the semiconductor advanced test, assembly, and packaging capability in the domestic ecosystem. It includes heterogeneous integration, tooling and automation, and wafer/panel and substrate technology.

NAPMP target areas include co-design and verification, chiplets, pilot packaging facilities, tooling and automation, and materials and substrates. There are also Metrology R&D (NIST) and the Manufacturing USA institute(s).

The Industrial Advisory Committee (IAC) provides advice and recommendations to Commerce and the NSTC. Its 24 members are drawn from industry, academia, federal laboratories, and elsewhere, including representatives from AMD, Applied Materials, ASML, Ford, Intel, Micron, Microsoft, Stanford, Synopsys, Texas Instruments, UCB, etc.

The IAC R&D Gaps Working Group has set out the R&D vision for the CHIPS Act program as a set of grand challenges. These include capabilities with ecosystem gaps, and applications with research gaps.

The IAC’s first set of recommendations includes establishing easily accessible prototyping capabilities in multiple facilities, with the ability to rapidly try out CMOS+X at a scale that is relevant for the semiconductor industry. There is a need to create a ‘semiverse’ digital twin. We also need to establish a chiplets ecosystem and a 3D heterogeneous integration platform for innovation and advanced packaging. We need to build an accessible platform for chip design, and enable new EDA tools that treat 3D (monolithic or stacked) as an intrinsic assumption. We also need to create a nurturing ecosystem for promising startups.

The IAC’s role is to provide input and make recommendations; it does not make policy or rules, being only an advisory body. But the IAC clearly recognizes the need for new design and verification automation tools to support the missions of the NSTC and NAPMP (among others).

First-order effects are the direct effects that the CHIPS Act would have on the design ecosystem (EDA, IP, etc.). Second-order effects are the positive benefits and opportunities that arise from the US focus on re-tooling domestic semiconductor manufacturing capabilities.

Among the first-order effects, the CHIPS Act recognizes that the future of semiconductor design is moving to chiplets and heterogeneous integration. The need for new automation tools for design and verification is clearly recognized as essential. Opportunity exists for new technologies and commercial solutions. It is important to follow the direction of the NSTC (the hub for CHIPS Act activities and organizations) and the NAPMP.

Potential second-order effects include workforce development (WFD) programs that should help address the chronic shortage of talent across our industry. Expansion of domestic manufacturing capability should lead to more on-shoring of design activities, which would require automation tools.

R&D programs in various projects under the CHIPS Act/NSTC will require tools, and may offer partnerships or collaboration to solve critical challenges in advancing the state-of-the-art. There are likely many others since the scope of the CHIPS Act is so broad.

May I take a moment to thank my dear friends, Bob Smith, ESD Alliance, Ms. Barbara Benjamin, HighPointe Communications, and Ms. Nanette Collins, NVC, for inviting me to be part of DVCon USA 2023.

Verification challenges, trends and their practical adoption

Madhav Rao, SVP, VLSI BU, Tessolve Semiconductor Pvt Ltd, presented on verification challenges, trends, and their practical adoption from an ASIC manager’s perspective at the recently held DVCon India 2022 conference in Bangalore, India.

An ASIC project manager’s key verification expectations include adherence to project milestones, staying within the approved project budget, and the quality needed for first-pass silicon success. Typical verification tasks and challenges include the verification plan, IP/block verification, SoC verification, functional and code coverage, gate-level simulations, and tapeout to fab.

Verification planning involves using a well-defined flow, defining a re-use strategy, doing as much as possible at lower hierarchy levels, and making the right choice of tools and methodology. At the IP/block level, we look at whether the verification is audited, formal checks, re-use from previous projects, whether proven VIP is available and affordable, and protocol compliance. At the SoC level, we need to look at how to save time, PSS abstract models and objectives, CDC checks, connectivity, x-propagation, and so on; a simple connectivity check is sketched below.
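
As a hedged illustration of an SoC-level connectivity check, the sketch below asserts that an IP’s interrupt output reaches the interrupt controller unmodified. The hierarchy and signal names are hypothetical, and the SoC instance itself is elided.

    module tb_top;
      logic clk;
      // ... clock generation and the 'soc' instance are elided ...

      // IP1's interrupt output must reach line 3 of the interrupt
      // controller unmodified.
      irq_conn: assert property (@(posedge clk)
        soc.u_ip1.irq_out == soc.u_intc.irq_in[3])
        else $error("IP1 -> INTC interrupt connectivity broken");
    endmodule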

We can maximize re-use through PSS by raising the level of abstraction from stimulus and code to scenarios, micro-architectural configuration, and constraints. We can map tests to abstract actions (e.g., configure an IP, write data from IP1 to IP2), with those actions mapped to real actions at the testbench level (e.g., to existing UVM sequences and C code), along the lines of the sketch below.
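
A hedged sketch of what that mapping could look like on the testbench side: a scenario-level UVM sequence that a PSS tool might generate or invoke, delegating to existing sequences. The sequence and sequencer names are hypothetical.

    `include "uvm_macros.svh"
    import uvm_pkg::*;

    // Stub for the abstract "configure IP1" action.
    class ip1_config_seq extends uvm_sequence;
      `uvm_object_utils(ip1_config_seq)
      function new(string name = "ip1_config_seq"); super.new(name); endfunction
      task body();
        // ... drive the register writes that configure IP1 ...
      endtask
    endclass

    // Stub for the abstract "write data from IP1 to IP2" action.
    class ip1_to_ip2_write_seq extends uvm_sequence;
      `uvm_object_utils(ip1_to_ip2_write_seq)
      function new(string name = "ip1_to_ip2_write_seq"); super.new(name); endfunction
      task body();
        // ... drive the data transfer from IP1 towards IP2 ...
      endtask
    endclass

    // Scenario-level sequence composing the two abstract actions.
    class ip1_to_ip2_scenario_seq extends uvm_sequence;
      `uvm_object_utils(ip1_to_ip2_scenario_seq)
      uvm_sequencer_base ip1_sqr; // hypothetical sequencer handle
      function new(string name = "ip1_to_ip2_scenario_seq"); super.new(name); endfunction
      task body();
        ip1_config_seq       cfg = ip1_config_seq::type_id::create("cfg");
        ip1_to_ip2_write_seq wr  = ip1_to_ip2_write_seq::type_id::create("wr");
        cfg.start(ip1_sqr); // abstract action: configure IP1
        wr.start(ip1_sqr);  // abstract action: write data IP1 -> IP2
      endtask
    endclass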

PSS can combine IP scenarios at higher levels to create new scenarios (e.g., at the subsystem and SoC levels), and it makes porting across hierarchies, platforms, and projects easier. We need to ensure bug avoidance, bug hunting, proof of bug absence, and bug analysis. As always: think re-use first. Is AI/ML to the rescue? It is still in its infancy and evolving, but we can apply ML to regression-suite pruning and optimization, and ML can accelerate coverage closure (see the sketch below).
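
For context, coverage closure works against functional coverage models such as the covergroup sketched below; ML-based flows mine regression data to rank the tests most likely to hit the remaining bins. The transaction fields are hypothetical.

    // Hypothetical functional coverage model for a packet transaction.
    class packet_cov;
      typedef enum {READ, WRITE, ATOMIC} op_e;
      op_e op;
      int  len;

      covergroup cg;
        cp_op  : coverpoint op;  // one bin per operation type
        cp_len : coverpoint len {
          bins small  = {[1:16]};
          bins medium = {[17:256]};
          bins large  = {[257:1024]};
        }
        op_x_len : cross cp_op, cp_len;  // the bins that are hard to close
      endgroup

      function new();
        cg = new();
      endfunction

      function void sample(op_e o, int l);
        op  = o;
        len = l;
        cg.sample();
      endfunction
    endclass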

For gate-level simulations, we need to ensure speed, testbench re-use from RTL simulation, and a sensible choice of which tests to run. One way is a hybrid GLS (RTL + netlist) solution: netlists are introduced in place of their respective RTL modules while re-using the RTL testbench environment, as sketched below.
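
One common mechanism for such a hybrid setup is a SystemVerilog configuration that binds selected instances to their netlist versions while everything else stays RTL, leaving the testbench untouched. The library, module, and instance names below are assumptions.

    // Hybrid GLS: only the DDR controller runs as a gate-level netlist.
    config hybrid_gls_cfg;
      design rtl_lib.tb_top;
      default liblist rtl_lib;                              // everything else: RTL
      instance tb_top.soc.u_ddr_ctrl use gate_lib.ddr_ctrl; // this one: netlist
    endconfig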

As for infrastructure challenges, planning upfront for the right number of compute servers and the right amount of storage is difficult. Engineers always complain of infrastructure resource shortages in the final phases towards tape-out. We can transition to a secure private cloud for compute and storage. There are also emerging EDA tools incorporating PSS models, formal methodologies, and AI/ML algorithms to help alleviate some of these challenges.

Standards evolve to incorporate emerging technology: DVCon India 2022

Design and Verification Conference (DVCon) India 2022 was recently held in Bangalore, India. Dave Rich, Verification Architect, Siemens EDA, presented a keynote on emerging design/verification technologies and standards: which comes first?

Why do we have standards? They are more than just layers of bureaucracy! Standards promote efficient use of resources and keep you from re-inventing the same wheel. They enable communication: common languages, protocols, and terminology, with a focus on public health and safety, process and tracking, understanding the value of competing solutions, shared qualifying metrics, etc. Besides, competition promotes innovation!

There are challenges in developing standards. To gain interest and momentum, we need to identify clear goals. To achieve consensus, we need to use existing technology and develop by committee. We need to ensure that feature creep does not set in, making a standard overly complex and unstable. A standard should not take so long to publish that it becomes obsolete. He referred to a NASA example from 1999 (likely the Mars Climate Orbiter, lost to a metric/imperial units mix-up), where the cost of not following standards, or not agreeing on which standard to adopt, can be astronomical.

Today, we have new technologies with practical applications largely unrealized, vs. the continuing development of existing technology. There is a dilemma between innovation and stability.

There have to be investments in the ecosystem: capital (systems used to build, test, and manufacture), content (existing IP available for re-use), training and education (time and people resources), compatibility (migration effort), and confidence. He gave SystemVerilog as an example.

New technology does not always mean a new standard; standards evolve to incorporate emerging technology. It is difficult to get people together with cross-disciplinary knowledge. For example, low-power concerns cut across many technology domains, such as simulation, synthesis, mixed-signal interfaces, and clock/reset domain crossing.

Analog verification has emerged as a technical challenge. We can use digital verification techniques with real number modeling, across constrained randomization, functional coverage, and assertions; a minimal flavor is sketched below. Standardization efforts include Verilog-AMS, SystemVerilog-AMS, SystemC-AMS, UVM-AMS, and VHDL-AMS.
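
For a minimal flavor, here is a hedged real-number-modeling sketch: an analog gain stage modeled with real-valued ports so it runs in an ordinary event-driven simulation, plus a digital-style assertion on its output. The gain and rail values are arbitrary assumptions.

    // Hypothetical analog gain stage as a real number model (RNM).
    module amp_rnm #(parameter real GAIN = 2.5) (
      input  real vin,
      output real vout
    );
      always_comb vout = GAIN * vin;
    endmodule

    // Digital-style check on the analog behavior: the output must stay
    // within hypothetical +/-5 V rails.
    module amp_rails_check (input logic clk, input real vout);
      rails: assert property (@(posedge clk) vout >= -5.0 && vout <= 5.0)
        else $error("vout outside +/-5 V rails");
    endmodule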

The impact of AI/ML on real productivity is not yet measurable! ML is not a panacea for every problem!! Accellera made a great start on coverage standards with UCIS, but momentum waned due to lack of customer demand. ML needs lots of data for training.

Other emerging technologies with standards concerns include functional safety, with fault modeling across different domains (dynamic vs. formal) and lifecycle management. There is also encryption, where cryptography is not the weakest link in security. And there are constraints for clock and reset domain crossings, where IP needs a common language to go with the descriptions.

Standards participation at Siemens EDA has been going well. It spans the Accellera and IEEE P2851 Functional Safety Working Groups; the Accellera IP Security and IEEE 1735 Working Groups; IEEE 1801, Design and Verification of Low-Power, Energy-Aware Systems (UPF); IEEE 1800, the SystemVerilog Hardware Design and Verification Language Standard; IEEE 1076, the VHDL Hardware Design Language Standard; IEEE 1666, the SystemC Language Standard, plus several Accellera SystemC Working Groups; IEEE 1800.2, the UVM Standard, plus the Accellera UVM Working Groups; the Accellera Portable Stimulus Working Group; and the Accellera UVM-AMS Working Group.

Old or new, technology is always evolving. We need to balance revolution and evolution, and we can appreciate the value of standards to the users. Vendors co-operate on standards and compete on technology.