
Mentor Makes the Pitch for SystemVerilog with Questa

Earlier this week I attended a seminar put on by Mentor entitled "Advanced Verification and How to Get Started with Questa".  Tom Fitzpatrick, a Verification Technologist at Mentor, was the presenter.  Though much of my verification background up to this point has involved e, SystemC, or straight C++, I've been doing research lately on SystemVerilog and was interested to hear Mentor's pitch on why I would want to use SV over the multitude of other options that exist for chip design and verification.  The presentation itself was an excellent overview of why teams should be using assertions, functional coverage, and constrained-random verification techniques - perhaps the best I've seen in a long while.  I'd like to explain in a bit more detail what the talk was about and give my opinion on how SystemVerilog will evolve over the next two to three years.

It's really great to see the big EDA vendors such as Mentor out evangelizing what I consider to be the correct way to do verification. There are a lot of companies out there still writing Verilog or VHDL testbenches, blissfully unaware that the times, they are a changin'...

According to Mentor, Questa provides the tools required to enable several types of verification:

  1. Assertion-Based Verification
  2. Constrained-Random Verification
  3. Coverage-Driven Verification

Assertions

Assertion-based verification using SystemVerilog Assertions (SVA) or PSL (both supported by Questa) is a way to give designers a more active role in the verification process by letting them specify properties that should not be violated during the operation of their particular module.  They can also specify directly the types of scenarios they'd like to see covered (more on that later).  Assertions can be added in-line with the design code (a good rule of thumb is to add them wherever you would otherwise put a comment).  They are always active during simulation runs and will cause the simulation to fail immediately if they encounter any problems.  Since a failing assertion is likely to be very close to the source of the problem, assertions can speed up debug times dramatically.  Kelly Larson from Analog Devices gave an interesting talk at the Austin Verification luncheon held earlier this week about the benefits they've seen from having designers add PSL assertions to their code.  Whether you're using SVA, PSL, or some other language, everyone should be seriously considering using assertions on upcoming designs.
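To make that concrete, here's a minimal sketch of what an in-line SVA assertion might look like.  The clock, reset, and req/gnt handshake signals are invented for illustration and don't come from any particular design:

module arbiter_checks (input logic clk, rst_n, req, gnt);

   // Once req is asserted, gnt must follow within 1 to 4 clock cycles
   property p_req_gets_gnt;
      @(posedge clk) disable iff (!rst_n)
         req |-> ##[1:4] gnt;
   endproperty

   a_req_gets_gnt: assert property (p_req_gets_gnt)
      else $error("req was not followed by gnt within 4 cycles");

   // The same property can also serve as a functional coverage point
   c_req_gets_gnt: cover property (p_req_gets_gnt);

endmodule

In practice a check like this would sit right next to (or inside) the RTL it describes, which is exactly what makes a failing assertion point so directly at the source of the problem.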

Constrained-Random

Constrained-random verification using SV is an area that people accustomed to doing verification with basic Verilog testbenches should find very exciting.  It's possible to modify existing Verilog tests to enable random activities that would be difficult or impossible to do in straight Verilog.  The true benefit comes, of course, when building up a verification environment from scratch around constrained-random testing methodologies.  The methodology issues involved in setting something like this up can get ugly in a hurry if you've never done it before.  That's where Mentor hopes to come to the rescue with their "Getting Started Guide", which should be available to the public soon, along with a set of open source examples.  Their goal is to provide a set of guidelines that will make it easier for newcomers to adopt the constrained-random verification techniques espoused by SystemVerilog.  In this respect they are following in the footsteps of Verisity (eRM) and Synopsys (RVM).  However, by making their methodology guidelines public I believe they will be able to make a big impact on the industry as a whole.
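As a rough illustration (and not something taken from Mentor's guide), a bare-bones constrained-random stimulus class in SV might look like the sketch below; the field names and ranges are made up for the example:

class eth_packet;
   rand bit [47:0] dest_addr;
   rand int unsigned length;

   // Keep lengths in the legal Ethernet range by default
   constraint c_length { length inside {[64:1518]}; }
endclass

module tb;
   initial begin
      eth_packet pkt = new();
      repeat (10) begin
         if (!pkt.randomize())
            $error("randomization failed");
         // drive pkt onto the interface here
      end
   end
endmodule

The class itself is the easy part; the methodology questions - how to sequence stimulus, build scoreboards, and keep everything reusable - are what the guidelines are meant to address.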

Speaking of Verisity, those who have spent the last several years using Specman may wonder how the SystemVerilog constrained-random methodology compares with that of e.  As would be expected, SV doesn't currently support Aspect-Oriented Programming (AOP).  For those of you unfamiliar with AOP, it's basically a mechanism that allows a user to add fields, methods, or constraints to code that was previously written (perhaps by a third party), potentially only under certain circumstances (e.g., we're using PCI-X instead of PCI-Express, so change the contents of the write() method of a BFM to reflect the difference).  Method bodies can also be overridden, appended to, or prepended to without the need to modify the original source code.  If you really want to learn more about AOP, do an Internet search or ask your friendly Cadence representative.  The lack of AOP can be a royal pain in the ass for those of us who have used it to quickly put together complex, reusable testbenches.  Another feature missing from SV is soft constraints.  Soft constraints are constraints that are usually true but can be violated in order to satisfy a hard constraint.  For example, let's say I have a packet whose length can be between 64 and 1518 bytes.  I might write a soft constraint like this:

struct packet {
   length : uint;
   keep soft length in [64..1518];
};

Later (in e), I might come along and say something like this:

// Fixed length jumbo frame
extend packet {
   keep length == 3000;
};

Cliff Cummings defends the fact that there are no soft constraints in SV in his 2003 SNUG paper entitled "SystemVerilog - Is This The Merging of Verilog & VHDL?".  Since each constraint is named, any individual constraint can be turned off for a particular instance of a class, effectively making it possible for any constraint to be soft.  I need to do some more experimentation with this feature before I can say for sure whether I agree with Cliff or not.  I like the fact that constraints are named and are easy to turn off, but I'm really struggling with the lack of soft constraints.  I really missed them in SystemC (yes, I know they exist there, but try violating one and see what happens to the rest - unless that's been fixed in a recent version of the SystemC libraries).
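For reference, here's roughly what the named-constraint workaround looks like in SV, using a hypothetical packet class along the lines of the e example above:

class packet;
   rand int unsigned length;

   // Named constraint, so it can be disabled per instance
   constraint c_length { length inside {[64:1518]}; }
endclass

module tb;
   initial begin
      packet p = new();
      // Turn off the default length constraint for this instance...
      p.c_length.constraint_mode(0);
      // ...then force a fixed-length jumbo frame with an in-line constraint
      if (!p.randomize() with { length == 3000; })
         $error("randomization failed");
   end
endmodule

It gets the job done, but the test writer has to know which constraint to disable, whereas the e extension simply overrides the soft constraint without touching the original class.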

Coverage

The final piece of the verification puzzle covered by Questa is coverage.  Questa has tools available to calculate both functional coverage (using PSL or SVA) and code coverage.  The tools seemed to work well in the demo, but personally I still prefer the way functional coverage is collected in e.  Again, just a spidey-sense thing at the moment.  I need to back this up with some additional research.
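For comparison, SystemVerilog also has covergroups (separate from the PSL/SVA cover directives) for collecting functional coverage.  A minimal sketch, reusing the hypothetical packet class from earlier with arbitrarily chosen bins, might look like this:

class packet;
   rand int unsigned length;
   constraint c_length { length inside {[64:1518]}; }

   // Embedded covergroup sampling the randomized length
   covergroup cg_length;
      coverpoint length {
         bins minimum = {64};
         bins small   = {[65:511]};
         bins large   = {[512:1517]};
         bins maximum = {1518};
      }
   endgroup

   function new();
      cg_length = new();
   endfunction
endclass

module tb;
   initial begin
      packet p = new();
      repeat (100) begin
         void'(p.randomize());
         p.cg_length.sample();
      end
   end
endmodule

Structurally this is the closest SV counterpart to e's coverage groups.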

So what's not to love?

The big question mark with Questa is how robust the tool will be and how the missing features (as in, features that are part of the SV spec but not yet implemented by Mentor) will affect its usability in real-world situations.  My experience last year trying to build a verification environment from scratch using SystemC and SCV taught me that though a tool might look good on paper, the devil is in the details.  How quickly will teams be able to pick up the new language?  How hampered will they be by the bugs that are inevitable when new features of this magnitude are added to vendor tools?  My guess -- the tools will be much more robust two years from now.  It remains to be seen whether they are robust enough today for a full-scale design and verification effort.
