I've been having a discussion on Twitter over the last couple of days that the rest of you may find interesting. The question I posed was:
"Can someone give me a reason why I would ever use ovm_transaction instead of ovm_sequence_item? I can't think of any..."
Paul Marriott suggested you should use ovm_transaction to signify intent not to use the data item in a sequence. Dallas McNally suggested the OVM should have used composition instead of inheritance to add the sequence functionality to an OVM transaction. What do I think?
As best I can tell, there is no compelling reason to ever use ovm_transaction. If you use that class for your data item, it can never be used in a sequence. You might think, "Hey, no one would want to use this packet data class in a sequence." But a year or two later (or a group or two later) someone might want exactly that, and they would be prevented from doing so because the class was based on ovm_transaction. If you have control of the code that's easy to fix, but if not (think VIP) you are stuck.
The only things ovm_sequence_item adds over ovm_transaction are functions to deal with sequences, so it's not clear to me that there would be any performance penalty at all. If I really didn't want something to be usable as a sequence item, I would just derive it directly from ovm_object. For example, Paul suggested on Twitter that a configuration value should be an ovm_transaction, not an ovm_sequence_item. That sounds bogus to me - if it really is just a generic data structure, it should just be based on ovm_object!
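To make the distinction concrete, here's a minimal sketch of the two derivations I'm arguing for. The class names (`packet`, `bus_config`) and fields are hypothetical; the base classes and `ovm_object_utils` macros are the standard OVM ones:

```systemverilog
// A data item derived from ovm_sequence_item: usable in a driver today,
// and still usable from a sequence if someone wants that a year from now.
class packet extends ovm_sequence_item;
  rand bit [7:0] payload;

  `ovm_object_utils_begin(packet)
    `ovm_field_int(payload, OVM_ALL_ON)
  `ovm_object_utils_end

  function new(string name = "packet");
    super.new(name);
  endfunction
endclass

// A generic data structure such as a configuration object: no sequence
// machinery needed, so derive from ovm_object, not ovm_transaction.
class bus_config extends ovm_object;
  int unsigned num_masters;

  `ovm_object_utils_begin(bus_config)
    `ovm_field_int(num_masters, OVM_ALL_ON)
  `ovm_object_utils_end

  function new(string name = "bus_config");
    super.new(name);
  endfunction
endclass
```

Note that ovm_transaction appears in neither declaration - the data item goes one level further down the hierarchy, and the config object stops one level further up.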
However, I'm happy for someone to prove me wrong. Anyone want to take a shot? :-)
Also, remember to vote for EDA's Next Top Blogger!