FY2010 (Heisei 22) VDEC High-Level Design Seminar

Wednesday, February 2, 2011, 10:00-20:00 (reception 17:00-20:00)
Takeda Hall, 5th floor, Takeda Building, The University of Tokyo


Abstracts

Keynote: Acceleration of Verification and Verification of Acceleration
Oskar Mencer (CEO, Maxeler Technologies)
Abstract: Acceleration and verification are mutually reinforcing. Accelerating an individual computer application via special hardware/software extensions benefits from verification, i.e. making sure that the accelerated application still produces the correct result for all relevant input patterns. Conversely, verification can take a long time when there are very many such relevant inputs, so acceleration is of key value to verification itself. Maxeler provides acceleration solutions, and we encounter a range of verification approaches depending on the domain and people involved. In addition, we will show an instance-specific approach to accelerating key verification algorithms, such as SAT, using FPGAs.

Efficient and Practical Prevention of X-Related Bugs
Pranav Ashar (CTO, Real Intent)
Abstract: Designers must ensure that their gate-level netlist produces the same results as RTL simulation. X-propagation is a major cause of differences between gate-level and RTL functionality. It is a painful and time-consuming process to identify X sources and chase their propagation from RTL to gates. Logical equivalence checkers ignore X-propagation, and gate-level simulations are very slow. Such "X-prop" issues often lead to dangerous masking of real bugs. This presentation explains the common sources of X's, shows how they can mask real bugs that affect functionality, and explains why they are difficult to avoid. It also discusses the challenges that Real Intent overcame in developing a unique and efficient solution that assists designers in catching bugs caused by X propagation.
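The masking the abstract describes can be illustrated with a minimal sketch (hypothetical Python model, not Real Intent's analysis): an RTL `if` is X-optimistic, treating an unknown condition as false and returning a clean but arbitrary value, while a gate-level mux propagates the X, so the two simulations diverge.

```python
X = "X"  # unknown value, e.g. from an uninitialized register

def rtl_if(cond, then_val, else_val):
    # RTL simulation semantics are X-optimistic: an X condition is
    # treated as false, so the else branch is taken silently.
    return then_val if cond == 1 else else_val

def gate_mux(sel, a, b):
    # Gate-level 2:1 mux, out = (sel & a) | (~sel & b): an X select
    # yields X whenever the two data inputs differ.
    if sel == X:
        return a if a == b else X
    return a if sel == 1 else b

sel = X                     # e.g. an uninitialized control register
print(rtl_if(sel, 1, 0))    # RTL sim: clean 0 -- the X is masked
print(gate_mux(sel, 1, 0))  # gate sim: X -- mismatch against RTL
```

If the else branch happens to be wrong, the RTL simulation passes while the silicon can fail, which is exactly the dangerous masking described above.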

Functional Qualification of Verification Environments for Digital Logic Design
Bindesh Patel (Technology manager, SpringSoft)
Abstract: Today's leading-edge designs are verified by sophisticated and diverse verification environments, the complexity of which often rivals or exceeds that of the design itself. Despite advancements in the area of stimulus generation and coverage, existing techniques provide no comprehensive, objective measurement of the quality of your verification environment. They do not tell you how good your testbench is at propagating the effects of bugs to observable outputs or detecting the presence of bugs. The result is that decisions about when you are "done" verifying are often based on partial data or "gut feel" assessments. These shortcomings have led to the development of a new approach, known as Functional Qualification, which provides an objective measure of the quality of your verification environment and guidance on how to improve it. If used effectively, Functional Qualification can help you in the early stages of verification environment development. This seminar provides background information on mutation-based techniques - the technology behind Functional Qualification - and how they are applied to assess the quality of your verification environment. We'll discuss the problems and weaknesses that Functional Qualification exposes and how they translate into fixes and improvements that give you more confidence in the effectiveness of your verification efforts.

Acceleration of numeric calculations on FPGAs
Akira Fukui (Graduate student, The University of Tokyo)
Abstract: Two FPGA acceleration topics are introduced. First, an implementation of the Smith-Waterman algorithm on a Virtex-5 SX240T that achieved a 138x speedup over SSEARCH on a CPU (Xeon X5570). The speedup was obtained through pipelining, parallelization, and several other optimizations, and demonstrates the FPGA's ability to handle numeric computation. Second, we discuss accelerating the TUNAMI simulator, where sufficient speedup may enable real-time tsunami simulation. The TUNAMI project has just started and is still in the discussion phase, so the flow of the whole program and some implementation bottlenecks are introduced.
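The Smith-Waterman recurrence itself is a small dynamic program, and cells on the same anti-diagonal are mutually independent, which is the parallelism a pipelined FPGA implementation exploits. A scalar reference sketch in Python (the scoring parameters here are illustrative, not those used in the cited SSEARCH comparison):

```python
def smith_waterman(s, t, match=2, mismatch=-1, gap=-1):
    """Local alignment score via the Smith-Waterman DP recurrence."""
    H = [[0] * (len(t) + 1) for _ in range(len(s) + 1)]
    best = 0
    for i in range(1, len(s) + 1):
        for j in range(1, len(t) + 1):
            sub = match if s[i - 1] == t[j - 1] else mismatch
            H[i][j] = max(0,                  # local-alignment floor
                          H[i - 1][j - 1] + sub,  # match/mismatch
                          H[i - 1][j] + gap,      # gap in t
                          H[i][j - 1] + gap)      # gap in s
            best = max(best, H[i][j])
    return best

print(smith_waterman("ACACACTA", "AGCACACA"))
```

Since H[i][j] depends only on the previous two anti-diagonals, a systolic array of processing elements can compute an entire anti-diagonal per clock, which is how the FPGA turns this O(len(s) * len(t)) loop nest into a linear-time pipeline.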

Assertion Synthesis: Enabling Assertion-Based Verification For Simulation, Formal and Emulation Flows
Yunshan Zhu (CEO, Nextop)
Abstract: Assertion-based verification (ABV) helps design and verification teams accelerate verification sign-off by enhancing RTL and test specifications with assertions and functional coverage properties. The effectiveness of the ABV methodology has been limited by the manual process of creating adequate assertions. Assertion synthesis leverages the RTL and testbench to automatically create high-quality functional assertions and coverage properties, thereby removing this bottleneck to ABV adoption. The synthesized properties can be seamlessly integrated into simulation, formal, and emulation flows to find bugs, identify coverage holes, and improve verification observability.
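A toy sketch of the underlying trace-mining idea (hypothetical traces and signal names; Nextop's tool works on RTL and testbenches, not Python tuples): propose simple single-cycle implications as candidate assertions and keep only those that hold over every observed simulation cycle.

```python
from itertools import permutations

def mine_implications(traces, signals):
    """Return (a, b) pairs such that a == 1 implies b == 1
    in every observed cycle -- candidate assertions."""
    candidates = set(permutations(signals, 2))
    for cycle in traces:
        candidates = {(a, b) for (a, b) in candidates
                      if not (cycle[a] == 1 and cycle[b] != 1)}
    return sorted(candidates)

# Each dict is one simulated cycle of a toy handshake interface.
traces = [
    {"req": 0, "gnt": 0, "busy": 0},
    {"req": 1, "gnt": 1, "busy": 1},
    {"req": 0, "gnt": 0, "busy": 1},
    {"req": 1, "gnt": 1, "busy": 1},
]
# Survivors include ("req", "gnt"), i.e. req |-> gnt; the converse
# of busy, ("busy", "req"), is refuted by the third cycle.
print(mine_implications(traces, ["req", "gnt", "busy"]))
```

Candidates that survive many traces become assertions to check in simulation or formal flows; candidates refuted by only a few cycles often mark interesting coverage points.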

Innovative Efficiencies for Understanding SystemVerilog Testbench Behavior
Bindesh Patel (Technology manager, SpringSoft)
Abstract: The adoption of SystemVerilog as the core of a modern constrained-random verification environment is ever-increasing. The automation and sophisticated stimulus and checking capabilities are a large reason why. The supporting standard libraries and methodologies that have emerged have made the case for adoption even stronger, and all the major simulators now support the language nearly 100%. A major consideration in verification is debugging, and naturally, debug tools have to extend and innovate around the language. Because the language is object-oriented and more software-like, the standard techniques that have helped with HDL-based debug no longer apply. For example, event-based signal dumping provides unlimited visibility into the behavior of an HDL-based environment; unfortunately, such straightforward dumping is not especially meaningful for SystemVerilog testbenches. Innovation is necessary. This seminar will discuss the use of message logging and how to leverage the transactional nature of OVM- and UVM-based SystemVerilog testbenches to automatically record transaction data. We'll show you how this data can be viewed in a waveform or a sequence diagram to give you a clearer picture of the functional behavior of the testbench. For more detailed visibility into testbench execution, we will also discuss emerging technologies that will allow you to dump dynamic object data and view it in innovative ways, as well as using this same data to drive other applications such as simulation-free virtual interactive capability.

What You Need to Know for Effective CDC Verification
Pranav Ashar (CTO, Real Intent)
Abstract: The complexity of clock architectures is growing with larger designs. Functionality that was traditionally distributed among multiple chips is now integrated into a single chip. As a result, the number of clock domains is increasing, and clock-domain-crossing (CDC) verification has become correspondingly important and complex. For CDC analysis tools to be effective, designers and verification engineers must have a good knowledge of the design's clock/reset architecture, so that complete and accurate constraints can be provided to the tools. This knowledge also helps them interpret CDC analysis results meaningfully and efficiently. This seminar discusses what designers and verification engineers need to know in order to perform effective CDC verification.
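A toy model of why synchronization matters at a crossing (illustrative Python, not a CDC tool): a flop sampling an asynchronous input near its clock edge may go metastable, modeled here as an "X" value; a second flop gives that value a full cycle to settle before downstream logic sees it.

```python
import random

X = "X"  # stand-in for a metastable/unknown flop output

def run(cycles, two_flop, seed=1):
    rng = random.Random(seed)
    src = 0
    ff1, ff2 = 0, 0
    x_seen = 0
    for t in range(cycles):
        src ^= 1                        # asynchronous source toggles
        near_edge = rng.random() < 0.3  # toggle lands in the unsafe window
        nxt1 = X if near_edge else src  # first flop may go metastable
        # A metastable flop resolves to some valid level before the next
        # edge; the second flop then captures that settled value.
        resolved = rng.choice([0, 1]) if ff1 == X else ff1
        ff1, ff2 = nxt1, resolved
        out = ff2 if two_flop else ff1  # what downstream logic samples
        if out == X:
            x_seen += 1
    return x_seen

print(run(1000, two_flop=False))  # unsynchronized: downstream sees X
print(run(1000, two_flop=True))   # two-flop synchronizer: never sees X
```

The model also hints at what a CDC tool must be told: which domains are asynchronous to each other, where the synchronizers are, and which paths are safe by design; missing or wrong constraints make the analysis either noisy or silently incomplete.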

VLSI Design and Education Center (VDEC), The University of Tokyo