Source: EDACafé
A panel in DAC’s technical program this year continues to yield returns. I looked over my notes the other day and found that the moderator and the five panelists identified a few trends that fall outside the bounds of verification as it has been practiced for many years.
Trend #1: Engineering and verification teams are becoming more strategic. They are looking more carefully at the objectives to determine which verification engine is best suited for the task.
Trend #2: Engineering and verification teams acknowledge that verification today encompasses much more than simulation. It includes hardware-based verification, whether in the form of emulation or FPGA prototyping, as well as formal verification. It spans pre-silicon verification and post-silicon validation.
Trend #3: Emulation is playing a bigger role in verification today than it ever did. While that’s not news to me, it was interesting to hear it from an emulation user and to have it confirmed by the other panelists.
Trend #4: Verification engineers are using actual silicon for post-silicon validation, with emulation bridging the gap between pre-silicon verification and post-silicon validation. Emulation has become a platform to leverage verification and to find bugs that escaped pre-silicon.
Trend #5: Coverage is a key verification area, and there is good work going on in the coverage space. EDA companies are making assertions more effective and more widely available to engineering teams.
Trend #6: Coverage has moved to hardware emulation. In the past, verification engineers performed system-level verification in simulation only, where “system level” referred to the top of the hardware hierarchy, still hardware with no software. Performing these tasks on a simulator meant the engineer needed a system-level environment with all of the lower-level environments integrated into it. That was costly to maintain, because even a minor problem in one of the lower-level environments could break the unified environment. By moving coverage to emulation and using post-silicon tools, engineers reduce the development costs associated with merging environments.
Trend #7: The industry is shifting into the era of synthesizable VIP that is programmable and able to take on any number of verification tasks.
Trend #8: Verification engineers are using formal to verify fixed-point and floating-point digital signal processing hardware.
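To make the idea concrete for readers who don’t work with formal day to day, here is a minimal Python sketch. It is entirely my own illustration, not anything presented on the panel: it brute-forces two properties of a toy Q1.7 fixed-point multiplier, the sort of datapath property a formal engine would prove symbolically for realistically wide operands. The format, names and bounds (q7_round_mul, FRAC_BITS, the -1.0 * -1.0 corner case) are assumptions chosen for the example.

```python
# Toy stand-in for a formal proof over a fixed-point datapath: exhaustively
# check two properties of a Q1.7 multiply with round-to-nearest:
#   1. the rounding error never exceeds half an LSB, and
#   2. the only operand pair whose result overflows the Q1.7 range is
#      -1.0 * -1.0 (the classic corner case that saturation logic must handle).
# Brute force is viable here only because the operand space is 256 x 256;
# a formal engine proves the same thing symbolically for wide datapaths.

FRAC_BITS = 7
LO, HI = -128, 127  # signed Q1.7 integer range


def q7_round_mul(a: int, b: int) -> int:
    """Q1.7 * Q1.7 -> Q1.7, round to nearest, no saturation."""
    product = a * b                                    # Q2.14 intermediate
    return (product + (1 << (FRAC_BITS - 1))) >> FRAC_BITS


def check_all() -> None:
    for a in range(LO, HI + 1):
        for b in range(LO, HI + 1):
            r = q7_round_mul(a, b)
            # Property 1: rounding error bounded by 0.5 LSB.
            assert abs(r - a * b / 2**FRAC_BITS) <= 0.5, (a, b, r)
            # Property 2: overflow happens only for -1.0 * -1.0.
            if not LO <= r <= HI:
                assert (a, b) == (LO, LO), f"unexpected overflow: {a} * {b}"
    print("both properties hold for all 65,536 operand pairs")


if __name__ == "__main__":
    check_all()
```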
Trend #9: Formal engines and simulation-based engines are better integrated. Coverage closure is a good example: the dynamic simulation and formal engines create a rich set of coverage points, and formal technology can then determine whether a coverage point is reachable given the IP block’s configuration and the design’s mode of operation.
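For illustration only, the sketch below shows that reachability question in miniature. The design, a made-up two-bit counter with a WRAP_AT configuration parameter, and the explicit-state breadth-first search standing in for a formal engine are both my own assumptions; the point is simply that, once the configuration is fixed, a coverage point can be proven reachable or unreachable rather than chased endlessly in simulation.

```python
# Minimal sketch of the reachability question formal answers for coverage
# closure: given a fixed configuration, can a coverage point ever be hit?
# The "design" is a toy 2-bit counter whose configuration parameter WRAP_AT
# caps the count; the cover point is count == 3. Explicit-state breadth-first
# search plays the role of the formal engine on this tiny model.

from collections import deque


def next_count(count: int, enable: bool, wrap_at: int) -> int:
    """One clock of the toy counter: increment when enabled, wrap at wrap_at."""
    if not enable:
        return count
    return 0 if count == wrap_at else count + 1


def cover_point_reachable(wrap_at: int, target: int = 3) -> bool:
    """Exhaustively explore states reachable from reset (count == 0)."""
    seen = {0}
    frontier = deque([0])
    while frontier:
        count = frontier.popleft()
        if count == target:
            return True
        for enable in (False, True):
            nxt = next_count(count, enable, wrap_at)
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return False


if __name__ == "__main__":
    # With the configuration capped at 2, the cover point can never fire,
    # so a team would waive it instead of burning simulation cycles on it.
    print("wrap_at=3:", cover_point_reachable(wrap_at=3))  # True  (reachable)
    print("wrap_at=2:", cover_point_reachable(wrap_at=2))  # False (unreachable)
```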
Trend #10: Verification teams now include formal experts who are teaching other verification engineers what’s possible with formal. Many verification engineers are focused on UVM simulation environments and use some formal to write properties. They are experiencing the power of formal and seeing how it can flush out bugs that would otherwise make it into simulation and emulation downstream.
Trend #11: Educational information about verification, covering simulation, emulation and formal, is far more available and accessible than ever. Four examples are Mentor Graphics’ Verification Academy, FormalWorld.org, TV&S Formal Day and Oski Technologies’ Decoding Formal Club events.
In case you were wondering, the panel was titled “The Great Simulation/Emulation Faceoff” and moderated by John Sanguinetti of Adapt-IP. Panelists were Mentor Graphics’ Stephen Bailey, Dave Kelf of OneSpin, Ronny Morad from IBM, Cadence’s Frank Schirrmeister and Alex Starr of Advanced Micro Devices.