Here are the answers to the five most common questions posed by chip designers and verification engineers to Dr. Lauro Rizzatti in 2016.
Source: EETimes
Business travel took me to several continents in 2016 and gave me the opportunity to talk with chip designers and verification engineers, many of whom had questions about design verification strategies and hardware emulation. Here I answer the five most common questions posed in 2016.
1. Why is emulation suddenly the foundation of design verification flows? It once was considered too difficult to use and too costly for all but the most complex chip designs.
For two unrelated reasons: one, the ever-increasing performance demands placed on verification tools, and two, the remarkable progress in hardware emulation technology. The convergence of the two has propelled hardware emulation to a position of prominence in the verification toolbox of any design team.
Today's SoC designs span two fast-growing domains: staggering hardware complexity and escalating software content. Only hardware emulation can handle the demanding task of verifying the integration of the two and tracing design bugs across their boundary.
The invention of virtualization technology in support of hardware emulation, pioneered by IKOS Design Systems in the late 1990s, opened the path to new deployment modes, including the creation of emulation data centers. Emulation data centers magnify the return on investment (ROI) of emulators.
2. Emulation is now considered a data center resource. How is this possible and why is it significant?
As mentioned above, it was the ingenuity of virtualization that made emulation data centers possible. Hardware emulation was conceived three decades ago to test the design-under-test (DUT) with real-world traffic in place of the slow and limited software testbench, an approach called in-circuit emulation (ICE). The remarkable idea came with a bag of thorns: the emulator was an expensive, single-user resource; it depended heavily on the non-deterministic behavior of the physical world; it was costly to acquire and maintain; and it was cumbersome and unfriendly to deploy.
Replacing the physical world with a functionally equivalent virtual environment removed most of the drawbacks of ICE. Today, emulators are concurrent multi-user, multi-project resources. They can be accessed remotely from anywhere in the world. Despite their perceived high cost of ownership, their ROI is at least two orders of magnitude better than in the old days.
3. The “apps” model has come to hardware emulation, with apps available for ICE, power, and design for test. An emulator isn’t a cell phone. How is this possible?
Borrowing the “apps” idea from Apple was one of the biggest boosts to emulation in the past couple of years, since it expanded the emulator’s use into new verification territories to solve challenges not normally addressed by emulation. Implementing the “apps” approach required re-architecting the software supporting the emulator. It called for the creation of an operating system that plays the same role as the OS in a general-purpose computer: it encapsulates the underlying hardware and, in effect, separates it from the application software. When properly architected and designed, the OS can accommodate different generations of the hardware and ensure that any software application runs across all of them.
As in a computer, “apps” can be developed with little dependency on the underlying hardware, which the OS shields from them, as the sketch below illustrates.
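To make the layering concrete, here is a minimal sketch in Python of the separation described above. All of the names (EmulatorOS, Gen3Hardware, power_app) are hypothetical, invented purely for illustration; no vendor's actual API is implied.

```python
from abc import ABC, abstractmethod

class EmulatorOS(ABC):
    """Stable contract that every hardware generation must implement."""

    @abstractmethod
    def load_design(self, netlist: str) -> None: ...

    @abstractmethod
    def run(self, cycles: int) -> dict: ...

class Gen3Hardware(EmulatorOS):
    """One hardware generation hiding behind the OS interface."""

    def load_design(self, netlist: str) -> None:
        print(f"compiling {netlist} for gen-3 boards")

    def run(self, cycles: int) -> dict:
        # Stand-in for a real run: return captured switching activity.
        return {"cycles": cycles, "toggles": 1_234_567}

def power_app(emu: EmulatorOS, netlist: str) -> float:
    """A power-estimation 'app': written once against EmulatorOS, it runs
    unchanged on any hardware generation that implements the interface."""
    emu.load_design(netlist)
    activity = emu.run(cycles=1_000_000)
    return activity["toggles"] / activity["cycles"]  # crude toggle-rate proxy

print(power_app(Gen3Hardware(), "soc_top.v"))
```

Because power_app touches only the EmulatorOS interface, swapping Gen3Hardware for a newer generation requires no change to the app itself, which is exactly the decoupling the OS provides.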
Today, “apps” are available for power analysis and estimation, for design-for-testability (DFT), and for removing the non-deterministic behavior of ICE. More “apps” will be devised in the years to come.
4. What would you consider to be emulation’s value proposition?
Whether we like it or not, market laws rule our lives. They can generate wealth and they can destroy fortunes. Miss a market window for a new product in a highly competitive market and you risk killing the product and taking down the company.
In the electronic design world, missing a market window is often caused by a silicon re-spin; more generally, it stems from a poorly scheduled roadmap backed by inadequate manpower and design tools.
The more advanced the process node, the higher the cost of a re-spin. But no matter how costly a re-spin may be, late market entry is vastly more expensive: shipping three months late can wipe out a third of the total potential revenue, as the simple model below illustrates.
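To see where a figure like that comes from, here is a back-of-the-envelope sketch in Python using a triangular revenue model. The model and the 18-month window are illustrative assumptions of mine, not figures from any particular study.

```python
def revenue_lost(delay_months: float, window_months: float) -> float:
    """Fraction of lifetime revenue forfeited by entering the market late.

    Assume revenue over a fixed market window W is a triangle: it ramps up
    linearly to a peak at W/2, then declines to zero at W. A product that is
    D months late ramps at the same rate, but the window still closes at W,
    so it captures a smaller, similar triangle holding (W - D)^2 / W^2 of
    the full revenue. The rest is lost for good.
    """
    d, w = delay_months, window_months
    return 1.0 - ((w - d) / w) ** 2

# With an assumed 18-month market window, a 3-month slip forfeits about 31%
# of total potential revenue, consistent with the one-third rule of thumb.
print(f"{revenue_lost(3, 18):.0%}")  # prints 31%
```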
What it boils down to is this: It is mandatory to remove the risk of missing a market window. Hardware emulation is the best de-risking verification tool for the task. By virtue of its thorough and fast hardware/software verification capabilities, it can eliminate re-spins, accelerate the roadmap schedule and, at the same time, increase the quality of the product.
5. What kinds of announcements about hardware emulation can we expect to see in 2017? What about support for new markets, such as IoT, automotive, safety, and security?
In 2017, hardware emulation will continue to be adopted on an even larger scale. Emulation data centers provide optimum return on investment and will become more and more popular. We will also see new emulation “apps” targeting new verification tasks in emerging industries such as IoT and automotive. Safety is going to be paramount in the automotive industry.
In light of recent Internet hacks, security will be a pervasive requirement in the IoT industry. Expect to see announcements about hardware emulation’s support for these markets.
Dr. Lauro Rizzatti is a verification consultant and industry expert on hardware emulation (www.rizzatti.com). Previously, Dr. Rizzatti held positions in management, product marketing, technical marketing, and engineering. He can be reached at lauro@rizzatti.com.