Source: EDACafe
Emulation is enjoying its moment in the spotlight, and none too soon. Design complexity of every kind has conspired to make chip verification an arduous task. These days, the fabric of a system-on-chip (SoC) design includes several processing cores, large sets of specialized IP, a plethora of peripherals and complex memories, routinely pushing the design size into the hundreds of millions of gates. Embedded software now exceeds the complexity of the hardware.
Consider that for every hardware designer there are at least five software developers. It is no surprise that chip verification and validation have become an overriding concern for all project teams, particularly where hardware and software integration is concerned. This is where the rubber meets the road, and where the verification challenges reach their peak.
Hardware emulation has moved to center stage in many verification strategies, and for good reason. Gone are the days when processor and graphics designs were the only candidates for hardware emulation. Those days left a bad taste, along with the harsh expression “Time-to-Emulation,” meaning the time required to prepare a design for emulation. It was measured in months, often exceeding the time it took the foundry to deliver first silicon.
Today’s hardware emulation solutions have the capacity to support the largest designs ever developed. The time-to-emulation is measured in days. Full visibility and interactive debugging are achieved without compiling instrumentation into the design. Overall functionality has improved substantially. Execution speed has increased to the point where software developers now use emulation to trace hardware issues when booting an operating system or exercising software drivers.
With emulation’s new-found celebrity, it seemed fitting to put together a panel discussion to explore in more detail why it’s so popular. Please join me for Emulation –– Why So Much Talk? It will be held Tuesday, June 9, at 4 p.m. during the 52nd Design Automation Conference (DAC) at the Mentor Graphics booth, #1432. Panelists include Alex Starr of AMD, Laurent Ducousso of STMicroelectronics, and Jim Hogan of Vista Ventures.
Now considered the most versatile and powerful of verification tools, hardware emulation also enjoys the distinction of being the least expensive on a cost-per-verification-cycle basis. While the traditional in-circuit emulation (ICE) mode is still in use, new modes have expanded its usage. From simulation acceleration to transaction-based acceleration to stand-alone emulation, users can pick and choose what is best for the task at hand. Project teams apply it to a range of verification and validation tasks: hardware debugging, hardware/software co-verification and integration, system-level prototyping, low-power verification, power estimation and performance characterization.
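Transaction-based acceleration deserves a word of explanation. The idea is to raise the interface between the test and the emulated design from cycle-by-cycle pin activity to untimed transactions, so the emulator is not throttled by a slow host connection. The C++ sketch below illustrates the concept only: the BusTransaction struct and TransactorProxy class are hypothetical stand-ins for the channel a real flow would implement over a vendor-supplied SCE-MI or DPI-C link, with the transactor itself compiled into the emulator alongside the design.

```cpp
// Illustrative sketch of transaction-based acceleration -- not any vendor's API.
// The host-side test builds untimed bus transactions; in a real flow an
// in-emulator transactor (bus-functional model) would expand each one into
// pin-level activity. Here the emulator link is a stub so the example is
// self-contained and compilable.
#include <cstdint>
#include <cstdio>
#include <vector>

// One untimed transaction: the unit of traffic crossing the host/emulator link.
struct BusTransaction {
    std::uint32_t address;
    std::uint32_t data;
    bool          is_write;
};

// Hypothetical stand-in for the channel a real flow implements over SCE-MI or DPI-C.
class TransactorProxy {
public:
    // Hand a transaction to the (stubbed) transactor; return read data.
    std::uint32_t execute(const BusTransaction& txn) {
        if (txn.is_write) {
            if (txn.address >= memory_.size()) {
                memory_.resize(txn.address + 1, 0);   // stub: model a simple memory
            }
            memory_[txn.address] = txn.data;
            return 0;
        }
        return txn.address < memory_.size() ? memory_[txn.address] : 0;
    }
private:
    std::vector<std::uint32_t> memory_;
};

int main() {
    TransactorProxy dut;   // would wrap the emulator channel in a real setup
    // The test streams transactions instead of driving individual pins each
    // clock cycle, which is what lets the emulator run at speed.
    for (std::uint32_t addr = 0; addr < 4; ++addr) {
        dut.execute({addr, addr * 10 + 1, true});            // write
        std::uint32_t readback = dut.execute({addr, 0, false}); // read back
        std::printf("addr %u -> %u\n",
                    static_cast<unsigned>(addr),
                    static_cast<unsigned>(readback));
    }
    return 0;
}
```

The point of the sketch is the division of labor: the host stays at the transaction level while the timing detail lives inside the emulator, which is what makes this mode so much faster than lock-step co-simulation.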
Even more interesting is emulation’s move into the design datacenter. Enterprise emulation servers are gaining traction, enabling a centralized team of emulation experts to support a multitude of users across continents and time zones. Because the servers can be accessed remotely without on-site supervision, multiple users can work on them concurrently, debugging large designs or any combination of large and small designs.
New architectures will ensure the longevity of hardware emulation. New ideas will expand its deployment beyond today’s boundaries. Emulation-driven design verification, hardware/software integration and design validation are here to stay and will lead the way into the future.
The panel and I plan to cover these use models and where emulation is headed. We look forward to seeing you at DAC.