Abstracts
Test's Changing Role in the Late-Silicon Era
Tim Cheng (University of California, Santa Barbara)
Abstract: Test will continue to play a critical role in the late-silicon era, but it must be part of a total system validation and reliability solution, rather than an isolated task solely for the purpose of ensuring individual component quality. Traditional test solutions have been application- and system-independent, which increases the demand for test resources at the component level without necessarily contributing to system quality and robustness. Also, test must share the same DFX (Design for X) resources with other critical quality assurance tasks -- where “X” includes verification, post-silicon validation, testability, fault diagnosis, and yield improvement -- for overall cost reduction and quality improvement. In this talk, I will show specific examples and trends in both digital and mixed-signal/RF domains to illustrate the changing role of test.
IFRA: Instruction Footprint Recording and Analysis for Post-Silicon Validation of Robust Systems
Subhasish Mitra (Stanford University)
Abstract: IFRA, an acronym for Instruction Footprint Recording and Analysis, overcomes major challenges associated with post-silicon bug localization, a very expensive step in post-silicon validation of processors. Post-silicon bug localization is a major cost in chip development: it involves pinpointing a hardware bug location and identifying the instruction sequence that exposes the bug (also called the exposing stimulus) from a system failure, such as a crash. The major benefits of IFRA over traditional techniques for post-silicon bug localization are that (1) IFRA does not require full system-level reproduction of bugs, and (2) IFRA does not require full system-level simulation. Hence, IFRA can overcome major hurdles that limit the scalability of traditional post-silicon validation methodologies. Results on a complex open-source superscalar processor, as well as on a commercial state-of-the-art complex processor, demonstrate that IFRA is effective in accurately localizing electrical bugs with very little chip-level area impact. This talk will also discuss the applicability of IFRA to multi-core and many-core processor SoCs.
Testing of 3D Integrated Circuits: Challenges and Emerging Solutions
Krishnendu Chakrabarty (Duke University)
Abstract: Three-dimensional (3D) integrated circuits promise to overcome barriers in interconnect scaling, thereby offering an opportunity to achieve higher performance using CMOS technology. Despite these benefits, testing remains a major obstacle that hinders the adoption of 3D integration. Test techniques and design-for-testability (DfT) solutions for 3D ICs have remained largely unexplored in the research community, even though experts in industry have identified a number of test challenges related to the lack of probe access for wafers, test access to modules in stacked wafers/dies, thermal concerns, test economics, and new defects arising from unique processing steps such as wafer thinning, alignment, and bonding. In this talk, the speaker will present an overview of 3D integration, its unique processing and assembly steps, testing and DfT challenges, and some of the solutions being advocated for these challenges. The talk will focus on the use of through-silicon vias (TSVs) for 3D integration and related processing steps such as via-first/via-last assembly, face-to-face bonding, and face-to-back bonding. The implications of these processing steps for testing will also be discussed.
Issues and Challenges of Analog Circuit Testing in Mixed-Signal SOC
Haruo Kobayashi (Gunma University)
Abstract: "Cost" is the most important criterion in LSI testing, and viewing the issues and challenges of analog circuit testing in mixed-signal SOCs from the cost perspective makes them clear and logical. This talk will discuss analog circuit testing issues from this cost viewpoint -- such as low-cost testing, analog BIST, ADC testing, VCO testing, on-wafer testing, and ATE technology, as well as management strategy -- and introduce some related research topics.
A Characteristic Function Based Method for Identifying a Deterministic Jitter Model in a Total Jitter Distribution
Takahiro Yamaguchi (Advantest Corporation)
Abstract: A new method for identifying a deterministic jitter (DJ) model in a total jitter (TJ) distribution is introduced in this paper. The new method is based on the characteristic function and identifies the DJ model from a given TJ PDF contaminated by an unknown DJ PDF. Benchmark testing using sinusoidal jitter validates the new method's high-performance identification of the DJ model. Experiments on a variety of TJ PDFs also confirm the very low false-alarm probability of the new method.
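The general principle behind characteristic-function methods can be sketched briefly. Since TJ is commonly modeled as the convolution of random jitter (RJ) and DJ, the TJ characteristic function is the product of the two; dividing the measured TJ characteristic function by a Gaussian RJ characteristic function leaves a DJ characteristic function that can be matched against candidate models, e.g. J0(A*w) for sinusoidal jitter of amplitude A or cos(D*w/2) for a dual-Dirac model with separation D. The sketch below only illustrates this generic principle, not the paper's specific algorithm; the amplitudes, frequency grid, and mismatch metric are assumptions.

```python
import numpy as np
from scipy.special import j0

def empirical_cf(samples, omega):
    # Empirical characteristic function E[exp(j*omega*X)] of the jitter samples.
    return np.array([np.mean(np.exp(1j * w * samples)) for w in omega])

# Synthesize a TJ sample: Gaussian RJ plus sinusoidal DJ (their PDFs convolve).
rng = np.random.default_rng(0)
sigma, amp, n = 1.0e-12, 4.0e-12, 200_000        # assumed RJ sigma and SJ amplitude in seconds
tj = rng.normal(0.0, sigma, n) + amp * np.sin(rng.uniform(0.0, 2 * np.pi, n))

omega = np.linspace(1e9, 5e11, 400)              # angular-frequency grid (rad/s)
cf_tj = empirical_cf(tj, omega)
cf_rj = np.exp(-0.5 * (sigma * omega) ** 2)      # Gaussian RJ CF (sigma assumed known or estimated)
cf_dj = (cf_tj / cf_rj).real                     # product of CFs, so deconvolution is a division

# Match the recovered DJ characteristic function against candidate DJ models.
candidates = {
    "sinusoidal (J0(A*w))":    j0(amp * omega),
    "dual-Dirac (cos(D*w/2))": np.cos(0.5 * (2 * amp) * omega),
}
for name, cf_model in candidates.items():
    print(f"{name:25s} mismatch = {np.mean(np.abs(cf_dj - cf_model)):.3f}")
```

Run on this synthetic data, the sinusoidal model shows a much smaller mismatch than the dual-Dirac model, which is the kind of discrimination the identification method formalizes.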
Signature-Based Testing for Adaptive Mixed-Signal Systems
Mohamed Abbas (University of Tokyo)
Abstract: In this presentation, I will introduce a cost-effective test methodology for adaptive mixed-signal systems that have a digitally implemented adaptation engine. In these systems, the convergence trajectories of the adaptation unit's coefficients carry information about the health of both the analog and digital parts, and this information can be used to test such systems for fault detection. As an example, the testing of adaptive equalizers that follow the digitally-assisted analog design style will be explained in detail. In these designs, the states in the digital adaptation engine can be non-invasively observed during the adaptation process in response to the test stimulus. We explore the use of dynamic signatures to differentiate between good and faulty devices for fault detection. With the application of specific test stimuli, the dynamic signatures are derived from the state sequences in the digital adaptation engine sampled during the adaptation process. By comparing the dynamic signatures of the device under test (DUT) with the expected fault-free signatures, the health of the adaptive equalizer can be determined. In addition, an automatic technique to generate test stimuli that maximize the difference between the dynamic signatures of fault-free and faulty circuits, for a better detection margin, is also presented. The experimental results clearly show that our method achieves higher and more comprehensive test coverage than existing approaches.
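As a rough illustration of the dynamic-signature idea, the toy sketch below runs an LMS adaptive equalizer on a known test stimulus, records the tap-weight trajectory as the signature, and compares the DUT trajectory against a fault-free reference. The LMS engine, the analog gain-fault model, and the distance threshold are illustrative assumptions, not the methodology presented in the talk.

```python
import numpy as np

def lms_signature(channel_gain, taps=4, steps=200, mu=0.02, seed=1):
    """Run a toy LMS adaptive equalizer on a known +/-1 test stimulus and return the
    tap-weight trajectory -- the 'dynamic signature' visible in the digital adaptation engine."""
    rng = np.random.default_rng(seed)
    tx = rng.choice([-1.0, 1.0], size=steps + taps)   # known test stimulus
    w = np.zeros(taps)
    signature = []
    for n in range(steps):
        x = channel_gain * tx[n:n + taps][::-1]       # samples seen by the digital back end
        e = tx[n] - w @ x                             # error against the known symbol
        w = w + mu * e * x                            # LMS coefficient update
        signature.append(w.copy())
    return np.array(signature)

golden = lms_signature(channel_gain=1.0)              # fault-free reference signature
dut    = lms_signature(channel_gain=0.6)              # hypothetical analog front-end gain fault
distance = np.linalg.norm(golden - dut)
print(f"signature distance = {distance:.2f} ->", "FAIL" if distance > 1.0 else "PASS")
```

The point of the comparison is that an analog defect bends the coefficient convergence trajectory even when the final output might still look plausible, so the trajectory itself becomes the observable test signature.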
Manufacturing Test of Nanometer Integrated Circuits
Shawn Blanton (Carnegie Mellon University)
Abstract: Research work in the Advanced Chip Testing Laboratory (www.ecec.cmu.edu/~actl) of the Electrical and Computer Engineering Department at Carnegie Mellon University is centered on extracting valuable information from the data generated by testing integrated circuits (ICs). Since yield is not 100%, the main objective of test is to determine whether an IC is good or bad. Today, however, test is also being used to provide more information about failing chips, answering questions about whether the design, the process, or some combination of the two is responsible for failure. The information extracted is, ideally, used to improve design, fabrication, and even test itself. In this talk, an overview of the research in ACTL will be given, followed by a detailed description of a diagnosis-based approach for effectively evaluating test metrics and emerging fault models without performing expensive silicon-based test experiments. Silicon results from the application of this methodology to chips from LSI, IBM, and Nvidia will be presented.
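One way to picture a diagnosis-based evaluation of a fault model (a loose illustration only, not ACTL's actual methodology) is to ask, for each failing chip, whether some fault in the candidate model predicts the observed set of failing test patterns; the fraction of failing chips explained then serves as a crude model-quality score. The fault dictionary, fail logs, and scoring rule below are all hypothetical.

```python
def model_score(fail_logs, fault_dictionary):
    """Toy diagnosis-based score for a fault model: the fraction of failing chips whose
    observed failing patterns are explained by at least one fault in the candidate model.
    (Real diagnosis tolerates partial matches; exact matching keeps the sketch simple.)"""
    explained = sum(
        1 for observed in fail_logs
        if any(observed == predicted for predicted in fault_dictionary.values())
    )
    return explained / len(fail_logs)

# Hypothetical fault dictionary: fault -> set of test patterns it is simulated to fail.
fault_dictionary = {
    "net7_stuck_at_0":  {1, 4, 9},
    "net3_bridge_net5": {2, 4},
}
# Hypothetical per-chip tester fail data (sets of failing test-pattern indices).
fail_logs = [{1, 4, 9}, {2, 4}, {3, 8}]
print(f"candidate model explains {model_score(fail_logs, fault_dictionary):.0%} of failing chips")
```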
On Compositional Observational Equivalence Checking of Hardware
Zurab Khasidashvili (Intel Haifa)
Abstract: Hardware is specified with a Hardware Description Language, and the description of the specification model is very close to its logic description. On the other hand, hardware is manufactured based on a model derived from a transistor-level description. To make sure that the specification model has the intended logic functionality, assertions are written for the specification model in a temporal logic language. The aim of equivalence checking is therefore to ensure the correct behavior of the implementation model in the operation states; this includes both the correct input-output behavior and the validity of the temporal assertions of the specification model. Here the operation states are the states into which the hardware is brought by a reset sequence; reset is needed to ensure that the hardware will be deterministic after power-up. In this talk, we discuss recent advances in equivalence checking of hardware. In particular, we will discuss compositional methods for proving post-reboot equivalence in a divide-and-conquer fashion. Further, we will briefly describe the underlying algorithms and model-checking techniques used in Intel's equivalence checking tool. Finally, we will present some experimental data on equivalence verification of Intel designs.
VLSI Design and Education Center (VDEC), The University of Tokyo