I received a link to the paper "Large-Scale Hardware Simulation: Modeling and Verification Strategies" from a colleague last month, but just got around to reading it today. It's an interesting analysis of verification strategies written by Douglas Clark of DEC back in 1990. In the paper, Doug argues for using the bug rate, rather than checking boxes off a test plan, as the determining factor for when verification is complete. He also discusses the importance of random verification, the benefits of simulating the real design instead of a model, and the value of doing everything possible to trade engineering time for CPU cycles.
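The bug-rate idea is easy to sketch. Here's a minimal, hypothetical illustration in Python: the `bugs_this_week` data source, the threshold, and the window size are all my own assumptions for demonstration, not anything from the paper. The point is just that "done" is defined by the discovery rate flattening out, not by a checklist:

```python
import random

def bugs_this_week(week):
    # Hypothetical stand-in for querying a real bug tracker:
    # discoveries taper off as verification matures.
    random.seed(week)
    return max(0, 10 - week + random.randint(-2, 2))

def verification_complete(history, threshold=1, window=3):
    """Declare verification done only when the bug discovery rate
    stays below `threshold` bugs/week for `window` straight weeks."""
    return len(history) >= window and all(
        n < threshold for n in history[-window:]
    )

history = []
week = 0
while not verification_complete(history):
    week += 1
    history.append(bugs_this_week(week))

print(f"Bug-rate criterion met after week {week}: {history}")
```

In practice the interesting part is choosing the threshold and window: too short a window and a quiet week ends verification prematurely; too long and you burn schedule confirming what the curve already shows.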
The thing that strikes me most about this paper is that it is still relevant, even after 16 years. The other thing I find interesting is that some people feel directed testing is fine for the small designs of today. If you think about it, the large designs of 1990 were potentially even smaller than today's small designs. Anyway, take a look at the paper and let me know what you think!