Introduction to UVM

Digital design verification is a time- and energy-consuming task. In an attempt to streamline it, Accellera created UVM, which stands for Universal Verification Methodology. The UVM standard focuses on interoperability and Verification IP reuse. It can be summarized in three points:
  • a methodology
  • based on SystemVerilog (SV) but not limited to it
  • a set of SV classes to support the methodology (a minimal sketch follows this list)
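As an illustration of that last point, here is a minimal sketch of a component built on the UVM class library: a test derived from the standard uvm_test base class. The class library names (uvm_test, run_test, the phase and objection calls) are standard UVM; the test name itself is hypothetical.

    import uvm_pkg::*;
    `include "uvm_macros.svh"

    // A minimal test built on the UVM base-class library.
    class my_base_test extends uvm_test;
      `uvm_component_utils(my_base_test)  // register with the UVM factory

      function new(string name, uvm_component parent);
        super.new(name, parent);
      endfunction

      task run_phase(uvm_phase phase);
        phase.raise_objection(this);      // keep the simulation running
        `uvm_info("TEST", "Hello from UVM", UVM_LOW)
        phase.drop_objection(this);
      endtask
    endclass

    module top;
      initial run_test("my_base_test");   // UVM entry point
    endmodule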
The benefits of UVM are:
  1. Predictability: benchmarks can easily be defined
  2. Proven / Complete: used by many companies in many projects
  3. Independent / Open: not tied to a tool vendor
  4. Existing: no need to re-invent the wheel
  5. Ease of maintenance: because it is a standard based on SV
  6. Reuse / Scalable: through verification components but also test cases
  7. Interoperable: verification assets (e.g., eVCs) can be reused
“It exists, it is supported, it is well defined, there is no alternative.”

SystemVerilog

UVM is primarily based on the SystemVerilog language but is not limited to it. SystemVerilog has numerous advantages as a language, amongst which:
  • It is a standardized language
  • Tool / vendor independent (now built into all major tools)
  • Easy to learn (it is a superset of Verilog)
  • Object-oriented, with extra data types compared to Verilog
  • It supports Constrained Random Generation (see the sketch after this list)
  • Assertion-Based Verification to check the design, plus scoreboards to check output data
  • Coverage Driven Verification (functional coverage)
  • It provides easy backdoor accesses
  • DPI (Direct Programming Interface) to other languages, more efficient than Verilog's PLI
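To make the constrained-random point concrete, here is a small sketch; the transaction class, its fields, and the address window are invented for illustration:

    // Hypothetical bus transaction showing constrained random generation.
    class bus_txn;
      rand bit [31:0] addr;
      rand bit [7:0]  data;
      rand bit        is_write;

      // Keep addresses in a legal window and bias toward writes.
      constraint c_addr  { addr inside {[32'h1000:32'h1FFF]}; }
      constraint c_write { is_write dist {1 := 3, 0 := 1}; }
    endclass

    module tb;
      initial begin
        bus_txn t = new();
        repeat (10) begin
          if (!t.randomize()) $fatal(1, "randomize() failed");
          $display("addr=%h data=%h write=%0d", t.addr, t.data, t.is_write);
        end
      end
    endmodule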

Building a CDV environment

The following picture depicts a classical Coverage Driven Verification (CDV) environment.

[Figure: a classical Coverage Driven Verification environment]

Starting from a design specification or a feature list, a test plan and a verification plan are created (the verification plan may simply be the feature list). The DUT (Design Under Test) interfaces are exercised by verification components (the blue box in the figure). Passive monitors can be plugged onto the other interfaces. The monitored data are checked by scoreboards, and functional coverage is collected (a sketch of the monitor-to-scoreboard connection follows). Randomization is used in test generation, and finally reports are generated automatically from the coverage information and the verification plan.
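Here is a sketch of how a passive monitor typically feeds a scoreboard in UVM: the monitor broadcasts observed transactions through an analysis port, and the scoreboard's write() callback checks each one. The component and transaction names are hypothetical.

    import uvm_pkg::*;
    `include "uvm_macros.svh"

    // A hypothetical transaction observed on the DUT interface.
    class my_txn extends uvm_sequence_item;
      `uvm_object_utils(my_txn)
      bit [31:0] data;
      function new(string name = "my_txn");
        super.new(name);
      endfunction
    endclass

    // The monitor observes the interface and broadcasts transactions.
    class my_monitor extends uvm_monitor;
      `uvm_component_utils(my_monitor)
      uvm_analysis_port #(my_txn) ap;
      function new(string name, uvm_component parent);
        super.new(name, parent);
        ap = new("ap", this);
      endfunction
    endclass

    // The scoreboard receives transactions and checks them.
    class my_scoreboard extends uvm_scoreboard;
      `uvm_component_utils(my_scoreboard)
      uvm_analysis_imp #(my_txn, my_scoreboard) imp;
      function new(string name, uvm_component parent);
        super.new(name, parent);
        imp = new("imp", this);
      endfunction
      // Called for each transaction the monitor publishes;
      // the comparison against a reference model would go here.
      function void write(my_txn t);
      endfunction
    endclass

    // In the enclosing env's connect_phase: mon.ap.connect(sb.imp);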

So much for the big picture. Let's now have a look at the flow.

What is the flow?

The first focus must be on the verification plan, which defines what to test. The items of the plan are translated into coverage points, in a language the simulator understands (a sketch of such a coverage model follows). Then a first version of the verification environment is developed; it is the framework defining how the DUT is exercised. Start with a few highly random test cases and analyze the coverage. The analysis should reveal coverage holes, which can be targeted by adjusting the constraints and adding new test cases. We can already see how important the verification plan is: it must be broad yet detailed enough.
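As an example, a verification-plan item such as "all packet lengths and packet types are exercised" might translate into a covergroup like this; the signal names and bins are invented:

    // Hypothetical coverage model for a packet interface.
    module coverage_model(input logic        clk,
                          input logic [10:0] pkt_len,
                          input logic        pkt_type);
      // Each coverpoint maps back to an item of the verification plan.
      covergroup pkt_cg @(posedge clk);
        cp_len : coverpoint pkt_len {
          bins small  = {[1:15]};
          bins medium = {[16:255]};
          bins large  = {[256:1500]};
        }
        cp_type : coverpoint pkt_type {
          bins data = {0};
          bins ctrl = {1};
        }
      endgroup
      pkt_cg cg = new();  // instantiate so sampling actually happens
    endmodule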

At this point, the coverage model can be enlarged in order to explore new combinations and, if necessary, the verification plan revisited (see the cross-coverage sketch below).
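Enlarging the model often means adding cross coverage. Continuing the hypothetical covergroup above, a cross of length and type exposes combinations that the individual coverpoints miss:

    // Added inside pkt_cg: every (length, type) combination must be hit.
    len_x_type : cross cp_len, cp_type;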
