
A SpecManiac's Take on Vera

At Verilab we have consultants who are experts in all sorts of languages used in hardware verification.  The big ones at the moment are Specman, Vera, SystemVerilog, and SystemC.  Having experts in all of these languages makes it interesting when we discuss verification issues internally, and allows us to provide advice to clients who are trying to decide which verification language to use on future projects. My languages of choice on my last several projects have been Specman and SystemC.  In preparation for future projects I decided to mix things up a little and ramp up on Vera. 

This is the first time I've spent any serious effort coming up to speed on Vera.  All of the high-level verification languages have their fervent supporters, many certain that supporters of one of the other languages are Satan's spawn (or worse!).  However, I've found that each language tends to have its positive and negative aspects - it all depends on what you're using it for. 

One of the most difficult things for someone moving between languages is the mental shift that must be made in order to write code that is optimized for a given HVL.  For example, one of the biggest differences between Specman and Vera (actually, between Specman and all other HVLs) is that in Specman all struct members are automatically randomized unless you specify otherwise.  In Vera, you need to specify the "rand" or "randc" qualifiers in order for a variable to be randomized, and you must call .randomize() in order to kick off the randomization process in the first place.  Many people who come from traditional C++ or Verilog test environments find it difficult to understand why you would want so many variables to be randomized.  Those of us who've spent more time with Specman, on the other hand, can't quite fathom why you wouldn't want everything to be randomized.
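To make the contrast concrete, here is a minimal OpenVera sketch (the class, field, and constraint names are my own invention, not from any real testbench):

```
// OpenVera sketch: nothing is random unless you say so, and nothing
// is generated until randomize() is called.
class packet {
    rand  bit [7:0] addr;   // randomized on each randomize() call
    randc bit [3:0] kind;   // cycles through all values before repeating
    bit [7:0] scratch;      // plain field -- never touched by the solver

    constraint legal {
        addr < 8'hF0;       // constraints apply only to rand/randc fields
    }
}

program test {
    packet p = new();
    void = p.randomize();   // the explicit call kicks off generation
}
```

The equivalent struct in e would need no qualifiers and no call at all: every field is generated when the struct is, unless you prefix the field with "!" to opt it out.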

Another difference between Specman and Vera is the focus on Aspect Oriented Programming (AOP).  Vera has had AOP capabilities for the last couple of years, but I would say most people don't use them as frequently in Vera as AOP is used in Specman.  I personally find AOP quite useful when designing testbenches, but it requires one to think about a problem in a completely different manner.  People who are pretty familiar with OOP techniques tend to balk at the "spaghetti code" that can result from poorly implemented AOP code.
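Since AOP sees most of its use on the Specman side, an e sketch shows the style best (the struct, unit, and method names here are hypothetical):

```
// e sketch: a test file layers new behavior onto existing structs
// and units without touching their source.
extend packet {
    keep addr in [0x10..0x1f];    // test-specific constraint, added from outside
};

extend driver {
    send(p : packet) is also {
        // "is also" appends this body after the original method runs
        message(LOW, "sent packet, addr=", p.addr);
    };
};
```

This is also exactly where the spaghetti risk comes from: the behavior of send() now depends on which extension files happen to be loaded.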

Setting randomization and AOP aside, the first thing I noticed about Vera was the documentation.  Synopsys and Verisity (now Cadence) present their documentation in dramatically different ways.  The Synopsys docs are presented in PDF format.  The Specman docs are available both in PDF and in an easy-to-browse, searchable web format.  I've seen other tools (not verification related) use the same doc format as Specman.  I think it is far superior to the Synopsys, Mentor, or Cadence (think NCVerilog, Incisive, etc.) docs.  Perhaps others should be encouraged to give it a try. 

Another area that struck me was the number of methods available for controlling randomization in Vera.  It is interesting that Vera appears to have more functions for controlling the way randomization occurs than Specman does, given that everything is random in Specman.  There are some obvious differences in the random generators as well.  My understanding is that the generator in Specman doesn't do backtracking (meaning it won't iterate through the constraints over and over again until the problem is finally solved), at least not without setting some special debug flags.  That means in Specman you can't always solve every constrained problem, even if a solution exists.  Vera is different.  Vera's solver is guaranteed (if I read the manual correctly) to find a solution if one exists.  Depending on the complexity of the problem it may take a while for it to come to a solution, though.  I ran into this last year when I was ramping up on SystemC.  There is a puzzle the Synopsys apps engineers like to pass around that can only easily be solved in Vera.  The solution is tricky in both e and SystemC because neither one supports backtracking. 
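A hypothetical OpenVera example (not the actual Synopsys puzzle, just an illustration of the shape of the problem) shows the kind of interdependent constraints where backtracking matters:

```
// OpenVera sketch: if the solver picks a first, only 2 of its 256
// values leave any legal choice for b. A single-pass generator that
// never revisits a will usually fail; a backtracking solver retries
// a until the whole constraint set is satisfiable.
class pair {
    rand bit [7:0] a;
    rand bit [7:0] b;

    constraint linked {
        a + b == 255;
        b == 0 || b == 255;   // forces a to be 255 or 0
    }
}
```

Without backtracking you end up re-ordering the generation by hand (solve b before a, here) so that each field can be picked in a single pass.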

I've still got some learning to do when it comes to Vera, but I feel it's important to share my initial observations as soon as I can.  Newcomers to a language always bring with them an interesting perspective that starts to be lost as they become comfortable in their new setting.  However, it is exactly this perspective that is the most useful to people trying to decide whether to take the plunge into a new verification language. 
