The Next Chapter

Around this time back in 2004, I was discussing the possibility of joining Verilab's just-opened Austin office. My goals were pretty basic: I was unhappy in my job at a local startup and was looking for a change. I figured I'd get some experience with international travel, and maybe learn a thing or two more about verification. Little did I know then where that journey would take me.

I made some great friends and learned a ton along the way, but after 9 years, I decided that it was again time to try something new. So back in August, I joined Cadence as a Sr. Architect. In my new role I'll be working across multiple groups and divisions at Cadence, and I may be popping up at various locations around the world as I ramp up on my position and on Cadence products and services. I look forward to getting a chance to meet more of you face to face than I had the opportunity to while at Verilab.

You may be wondering - why Cadence? Did something happen to turn me to the Dark Side of the Force? Possibly. But another explanation is that I was presented with an opportunity to make a positive impact on the EDA industry in collaboration with a group of people I have a lot of respect for. Is everything perfect here? No. But if it were, what would be the point of coming into work every day?

For my friends at other EDA companies, I hope we can still stay in touch. Things should be easier now. Many of you always harbored deep-seated suspicions that I was on the Cadence payroll, given my never-ending passion for Specman (except for folks at Cadence, who thought I was on the payroll of some other EDA firm, go figure!). And though those suspicions about Cadence were ill-founded in the past, they are certainly true now. So we all now know where we stand on the matter ;).

My writing on Cool Verification has been pretty limited over the last couple of years. I hope to keep going with the site, though it's possible my interests may change moving forward. 

There are some significant challenges coming over the months and years ahead. If we're still designing and verifying chips using today's techniques 10-15 years in the future, we'll be in a world of hurt. It's exciting to have a chance to work on defining the future, and I'm looking forward to collaborating with all of you both inside and outside of Cadence to do so.


Interview: Thoughts on Verification

My colleague Alex Melikian and I sat down for a chat about verification back in June. The results have been posted on the Verilab blog over the last few weeks. If you happened to miss them, you can check them out here:

Verilab is pleased to introduce “Conversations About Verification”, a monthly publication featuring a discussion on VLSI verification topics. In this inaugural edition, Verilab consultant Alex Melikian discusses first experiences and adoptions of modern verification technologies with JL Gray, Vice President and General Manager, North America of Verilab.

In part 1, JL and Alex discuss their first experiences with advanced verification languages and methodologies. They also discuss why and how ASIC/FPGA development centers adopt and integrate Hardware Verification Languages (HVLs) and related methodologies into their workflows, along with some of the reasons others hesitate to make the switch.

In part 2 of 3 of this conversation, JL and Alex discuss the risks and rewards involved in adopting an HVL workflow, as well as the diverging perspectives of management and engineers within a company. They also discuss the state of HVL technologies today and what might evolve next.

In part 3, JL and Alex discuss methodologies outside of, but complementary to, HVL technologies, such as continuous integration. Typical mistakes and growing pains of adopting HVL methodologies are also reviewed. Finally, JL discusses his verification blog, along with the various discussions and debates it has generated.


Public Discourse and Open Standards

Back in May I announced that Verilab had joined Accellera and that specifically, I was a member of the Verification IP technical standards committee.  Later that month, I posted some comments about the discussions going on in the VIP TSC.  Then... nothing.  Of course, there have been several conference calls and a face to face at DAC since then.  Why the radio silence? 

Continue reading "Public Discourse and Open Standards" »


Cadence Offers to Buy Mentor - A Verification Consultant's Perspective

Update: June 18, 2008: Added links to additional commentary about the merger at the bottom of this post.

Update 2, June 18, 2008: What is Certe?  Clarified language and made reference to comments of this post for more discussion.

Last Thursday, after DAC had pretty much come to a close, my colleagues and I headed over to the California Adventure theme park for a bit of fun.  One of the highlights was the Tower of Terror, based on the old TV series, The Twilight Zone. "The Twilight Zone" always opened with the following intro from Rod Serling:

"You unlock this door with the key of imagination. Beyond it is another dimension. A dimension of sound. A dimension of sight. A dimension of mind. You're moving into a land of both shadow and substance, of things and ideas. You've just crossed over into...The Twilight Zone."

I felt like I'd entered into "The Twilight Zone" this morning when I opened my email to see mail from current and former colleagues sharing the announcement that Cadence has offered to purchase Mentor Graphics for $16 per share.  The impact such a purchase could have on the EDA industry is enormous, spanning many technical and business areas.  As it turns out, I'm only really qualified to comment on one - verification - so the rest of this post will focus on that topic.  How would a merger affect the lives of those of us who do hardware verification for a living?

Continue reading "Cadence Offers to Buy Mentor - A Verification Consultant's Perspective" »


The AOP vs. OOP Saga Continues

A couple of weeks ago I posted a link to an article from Mentor describing how OOP techniques make it unnecessary to use AOP, and supposedly do an even better job than AOP in many cases.  The topic has picked up now over on the Verification Guild - where Adam Rose from Mentor, Janick Bergeron from Synopsys, and a cast of others have been responding to the age-old question - "What's the difference between AOP and OOP?"  Check out responses from my co-consultant in crime David Robinson and me.

Also, if you'd like, I'd recommend checking out a couple of interesting articles written by people using AOP as part of the AspectJ programming environment:

AOP@Work: AOP myths and realities

In the article, Ramnivas discusses in great detail several myths about AOP development, some of which directly address points made in this discussion thread:

  • Myth 1: AOP is good only for tracing and logging
  • Myth 2: AOP doesn't solve any new problems
  • Myth 3: Well-designed interfaces obviate AOP
  • Myth 4: Design patterns obviate AOP
  • Myth 5: Dynamic proxies obviate AOP
  • Myth 6: Application frameworks obviate AOP
  • Myth 7: Annotations obviate AOP
  • Myth 8: Aspects obscure program flow
  • Myth 9: Debugging with aspects is hard
  • Myth 10: Aspects can break as classes evolve
  • Myth 11: Aspects can't be unit tested
  • Myth 12: AOP implementations don't require a new language
  • Myth 13: AOP is just too complex
  • Myth 14: AOP promotes sloppy design
  • Myth 15: AOP adoption is all or nothing

I'd also like to point out another article by Nicholas Lesiecki, a software engineer at Google:

AOP@Work: Enhance design patterns with AspectJ, Part 1
AOP makes patterns lighter, more flexible, and easier to (re)use

It took years for the software development community to learn how to use OOP effectively.  It will probably take years more for AOP techniques to fully take hold.  That doesn't mean AOP doesn't add value - in my experience, it adds tremendous value.  Nor does it mean that OOP techniques are obsolete.  If the tools, languages, and libraries most commonly used for hardware verification weren't under the control of the Big Three EDA companies, we might actually be able to get past this perpetually silly AOP vs. OOP argument and focus on applying the right solutions where appropriate in service of our primary goal - taping out reliable products as quickly as possible.


They Just Don't Get It

I'm on the "Verification Horizons" mailing list from Mentor Graphics.  Today, one of the items caught my attention.  It was a link to an article entitled It's a Matter of Style: SystemVerilog for the e User.  The article describes how, given the lack of AOP in SystemVerilog, a user can implement some of the features available in Specman.  Technically speaking, they are exactly correct.  Anyone using SystemVerilog (especially if you're used to using e) should read the article and follow the recommendations.  However, the conclusions they draw - namely that there is either no difference between SystemVerilog and e or that SystemVerilog is inherently better - are completely false! 

I can't believe that anyone at Mentor has ever written a serious testbench in e.  The article deserves a point-by-point analysis which I don't have time to write up at the moment.  I'll give it a shot over the next week or two.  In the meantime, check out the article, and let me know what you think!


Creating Corporate Standards? Beware...

As chip design organizations grow and mature, they inevitably start to look at creating a global set of standard tools and methodologies.  There are several benefits to this approach, but there are some major drawbacks that need to be taken into account when developing a corporate strategy for design and verification.

Continue reading "Creating Corporate Standards? Beware..." »


The Wisdom of Crowds

I was fiddling around with my original Blogger-based blog this afternoon and noticed something new.  At the top of the page there was an icon called "Flag".  Apparently Google has instituted a mechanism through which they can unlist objectionable content (the blog is still available but won't be publicized through Blogspot).  They reference a book entitled The Wisdom of Crowds by James Surowiecki.  According to Surowiecki, "decentralized decisions can be vastly better than experts (or the public) expect them to be, or even than any one expert can make."  There are some caveats though -- if the decision structures in place are too centralized, too decentralized, or too imitative in nature they can distort the wisdom of the crowds. 

Interestingly enough, we seem to be going through an inflection point (to quote Andy Grove) in the verification industry related to what types of tools and methodologies we use.  Are we acting as a wise crowd would -- soon to come up with some outstanding solutions to the world's verification issues -- or are our thought processes being distorted by the structure of the EDA industry and the marketing gurus at the Big Three?  Government agencies have talked about using futures markets to predict things like where terrorists are going to strike or where hurricanes will make landfall.  Maybe we need something similar to help us wade through the muddied waters of the verification marketplace and focus on the most promising new technologies.