Session 1 - Presentation 1
Title: Advanced MOS device technology for ultra-low power IoT applications
Speaker: Shinichi Takagi (The University of Tokyo)
Abstract: Low power consumption is one of the most important requirements for present and future integrated systems, particularly IoT applications, some of which must be driven by a battery with no need for replacement or by an energy-harvesting power supply. Here, reducing the supply voltage is the most effective means of power reduction. Low-supply-voltage operation of charge-based logic switch devices can typically be achieved by two strategies. One is the realization of higher Ion through higher-mobility or higher-velocity channels. Since the on-current under ballistic/quasi-ballistic transport is proportional to the injection velocity at the source edge, low-effective-mass materials are preferred. From this viewpoint, III-V/Ge materials are promising. The other strategy is to develop steep-slope devices with a lower sub-threshold swing than CMOS. For this purpose, there are two directions: increasing the sensitivity of the surface potential to changes in gate voltage, and introducing a new carrier conduction mechanism. Examples of the former and the latter are negative-gate-capacitance MOSFETs and tunnel FETs (TFETs), respectively. TFETs in particular have recently stirred strong interest because of their high compatibility with the CMOS platform. Here, materials with a small and/or direct band gap, such as III-V and Ge, are preferred to enhance the on-current of TFETs. In this presentation, the critical issues and difficult challenges of such ultra-low-power MOS devices are addressed with an emphasis on Ge and III-V channels. Some viable technologies and demonstrated devices on the Si CMOS platform are introduced for solving these problems.
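As a back-of-the-envelope illustration of why steep-slope devices matter (my sketch, not material from the talk): the sub-threshold swing of a conventional MOSFET is bounded by the thermionic limit SS = ln(10)·(kT/q)·(1 + Cd/Cox), roughly 60 mV/decade at room temperature, which band-to-band-tunneling devices such as TFETs are not bound by.

```python
# Sketch of the thermionic sub-threshold-swing limit that steep-slope
# devices (e.g. TFETs) aim to beat. Constants are standard SI values.
import math

k = 1.380649e-23      # Boltzmann constant, J/K
q = 1.602176634e-19   # elementary charge, C

def subthreshold_swing(T=300.0, cd_over_cox=0.0):
    """SS = ln(10) * (kT/q) * (1 + Cd/Cox), returned in mV/decade.
    cd_over_cox is the depletion-to-oxide capacitance ratio; 0 is ideal."""
    return math.log(10) * (k * T / q) * (1.0 + cd_over_cox) * 1e3

ideal = subthreshold_swing()            # ~59.5 mV/dec at 300 K
nonideal = subthreshold_swing(300.0, 0.5)  # body capacitance degrades SS
```

A TFET's band-to-band tunneling injection is not thermionic, so its swing can fall below this value, which is what makes it attractive for sub-threshold supply voltages.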
|
Session 1 - Presentation 2
Title: Going Digital: Transformation of Society, Industry, and Life
Speaker: Hiroyuki Morikawa (The University of Tokyo)
Abstract: Data forms a key pillar of 21st-century sources of growth, and the OECD is discussing the value of data as a new source of growth. Large data sets are becoming a core asset in the economy, fostering new industries, processes, and products and creating significant competitive advantages. Big Data, the IoT (Internet of Things), and M2M (machine-to-machine) communication will be key to designing the future. Peter Drucker investigated the impact of the railroad on society: "The technology of the steam engine did not end with the railroad." Although "the railroad made the Industrial Revolution accomplished fast," the boom it triggered lasted almost a hundred years. The dynamics of the technology shifted to totally new "social institutions: the modern postal service, the daily paper, investment banking, and commercial banking, to name just a few." Information and communication technology has also had an enormous impact on society. However, just as Drucker revealed in the case of the railroad, information and communication technology did not end with broadband infrastructure. The most important impact of the technology should be the creation of totally new industries, just as the railroad gave rise to the postal service, the daily paper, and banking. The talk begins with the value of data. The effect of data on our society is shown to be similar to that of the PLC (Programmable Logic Controller). Digitalization promotes the re-definition of business and R&D in all industrial segments, increases productivity, and creates value. Next, the value of data is shown in the areas of health care, social infrastructure, agriculture, city planning, and maintenance. Finally, the directions and challenges for realizing a data-driven economy are identified for designing the future from the viewpoints of the digitalization of physical assets, general-purpose technology, the re-definition of business and organization, the importance of "marines"-type units, and the difference between invention and innovation.
|
Session 2 - Presentation 1
Title: Power Supply Impedance Emulation to Eliminate Overkills and Underkills due to the Impedance Difference between ATE and Customer Board
Speaker: Toru Nakura (The University of Tokyo)
Abstract: This paper proposes a new type of power supply circuit for automatic test equipment (ATE) that can emulate arbitrary power supply impedance. It emulates the power supply impedance of the customer environment so as to match the power supply voltage fluctuation waveforms of the ATE and of the customer environment, thereby eliminating overkills/underkills that arise from the voltage fluctuation difference caused by the impedance difference between the ATE and a practical operating environment. Our technique adjusts the equivalent impedance by injecting a compensation current from a current source attached in parallel with the power supply. The compensation current is calculated and injected in real time, in a feedback manner, based on the measured power supply voltage together with the impedance characteristics of the ATE's original power delivery network (PDN) and the customer PDN. Experimental results from prototype circuits demonstrate that the compensation current emulates the impedance and that both power supply voltage fluctuation waveforms agree well. Limitations and applications of our method are also discussed.
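A deliberately simplified, purely resistive sketch of the idea (my assumption; the actual technique works on full PDN impedance profiles measured in real time) shows how a parallel compensation current can make the ATE rail droop match what the customer board would produce. The resistance values are illustrative only.

```python
# Toy resistive PDN model: inject a compensation current in parallel with
# the ATE supply so the ATE's voltage droop equals the customer board's.
VDD = 1.0                      # nominal supply, V
R_ATE, R_CUST = 0.10, 0.04     # assumed ATE / customer PDN resistances, ohms

def emulated_voltage(i_load):
    """Return (ATE rail with compensation, customer-board rail) for a
    given load current. The compensation current cancels the droop
    difference caused by the PDN impedance mismatch."""
    i_comp = i_load * (1.0 - R_CUST / R_ATE)  # current injected at the rail
    v_ate = VDD - R_ATE * (i_load - i_comp)   # ATE rail after injection
    v_cust = VDD - R_CUST * i_load            # target: customer-board rail
    return v_ate, v_cust

v_ate, v_cust = emulated_voltage(0.5)  # the two droops now coincide
```

In the resistive case the required injection is simply proportional to the load current; the paper's feedback scheme generalizes this to frequency-dependent PDN impedances using measured voltage waveforms.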
|
Session 2 - Presentation 2
Title: Common Pitfalls in Application of a Threshold Detection Comparator to a Continuous-Time Level Crossing Quantization
Speaker: Takahiro J. Yamaguchi (Advantest Laboratories Ltd.)
Abstract: Propagation delay variation of the threshold detection comparator in current-day level-crossing ADCs is a fundamental impediment to their performance. This paper reviews why commonly used threshold detection comparators can be inappropriate for detecting the level-crossing times of high-frequency signals. The analysis presented in this paper helps establish a new common ground for developing a high-performance LCADC.
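A toy model (my own, not the paper's analysis) of why comparator propagation delay hurts more at high frequency: if the crossing of a level is reported t_d seconds late, the resulting amplitude error is roughly the input slew rate at the crossing times t_d, and the slew rate of a sinusoid grows linearly with frequency.

```python
# First-order estimate of the amplitude error caused by reporting a
# level crossing t_d seconds late: error ~ slew rate at the crossing * t_d.
import math

def level_cross_error(freq_hz, amplitude, t_d, level=0.0):
    """Amplitude error for a sine A*sin(2*pi*f*t) whose crossing of
    `level` is detected t_d seconds late (|level| <= amplitude)."""
    # Slew rate at the crossing point of a sinusoid:
    slew = 2 * math.pi * freq_hz * amplitude * math.cos(math.asin(level / amplitude))
    return slew * t_d

err_1m = level_cross_error(1e6, 1.0, 1e-9)    # 1 MHz input, 1 ns delay
err_100m = level_cross_error(1e8, 1.0, 1e-9)  # 100 MHz input: 100x worse
```

Under this simplification the error at 100 MHz is 100 times that at 1 MHz for the same delay, which is one intuition for why high-frequency inputs expose the comparator's delay (and its signal-dependent variation).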
|
Session 3 - Presentation 1
Title: Variation and Failure Characterization Through Test Data Analytics
Speaker: Kwang-Ting Cheng (Hong Kong University of Science and Technology)
Abstract: We describe a framework for characterizing systematic variations and failures by exploring the hidden patterns in test data from multiple test stages. The framework first predicts process variations at a fine resolution based on a limited number of probed process parameters. An unsupervised biclustering technique is then utilized to extract spatial patterns from process parameters and production test results, respectively, by analyzing both item-to-item and die-to-die correlations in subsets of the test data. A template matching technique exploits these spatial patterns to discover connections between process variations and failures detected by production tests. The proposed framework has been verified on an industrial test dataset of a non-volatile memory product. The discovery of comprehensible correlations between process parameters and some production test items was confirmed by engineers who have insight into the test dataset.
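The template-matching step can be illustrated in miniature (an assumed simplification, not the authors' algorithm) by scoring how well a spatial failure pattern tracks a process-parameter pattern using normalized cross-correlation; a high score suggests the two patterns share a common systematic cause. The die maps below are invented for illustration.

```python
# Tiny illustration of matching a process-parameter map against a
# production-test failure map via normalized cross-correlation.
import math

def ncc(a, b):
    """Normalized cross-correlation of two equal-length die-map vectors,
    in [-1, 1]; 1 means the spatial patterns coincide exactly."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

process_map = [0.1, 0.2, 0.8, 0.9, 0.85, 0.15]  # probed parameter per die
fail_map    = [0, 0, 1, 1, 1, 0]                # pass/fail per die
score = ncc(process_map, fail_map)              # high: patterns are linked
```

The framework described in the talk operates on patterns extracted by biclustering rather than raw maps, but the matching intuition is the same.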
|
Session 3 - Presentation 2
Title: Test Chip Design for Yield Learning in 7nm Semiconductor Technologies
Speaker: Shawn Blanton (Carnegie Mellon University)
Abstract: It is common practice for design houses (e.g., Nvidia and Qualcomm), IC foundries (GlobalFoundries and TSMC), and integrated device manufacturers (IDMs) such as Samsung and Intel to fabricate test chips that have functional characteristics similar to customer products. These test chips are not meant to be sold to customers but instead serve as example ICs that provide feedback about the design methodology and the underlying fabrication technology. Because test chips are not sold for profit, their volume is typically kept low to minimize cost. More importantly, test-chip design is currently ad hoc in nature: test chips are typically composed of smaller portions of existing or past designs that have been scaled to the current technology node. Moreover, such designs are not optimal in that they are not ideal characterization vehicles (CVs) for providing design and fabrication feedback. In this talk, a new type of logic characterization vehicle (LCV) that simultaneously optimizes design, test, and diagnosis for yield learning is described. The Carnegie Mellon LCV is a test chip composed of logic standard cells that uses constant-testability theory and logic/layout diversity to create a parameterized design that exhibits both the front- and back-end demographics of a product-like, customer design. Analysis of various CMU-LCV designs (one of which has >4M gates) demonstrates that design time, density, test, and diagnosis can all be optimized simultaneously. Several of our designs have been taped out in volume in state-of-the-art technologies, with the first test results now under analysis.
|
Session 4 - Presentation 1
Title: Rethinking Design in the IoT Era - How Formal Methods Help to Meet the Challenges
Speaker: Wolfgang Kunz (Technische Universitat Kaiserslautern)
Abstract: This talk discusses the possible role of formal verification techniques in system-level design flows. To meet the challenges of the IoT era, it is argued that the role of formal verification should not be limited to "bug hunting" alone. Instead, formal technology should assume an entirely new role during the design process: it should be applied in such a way that a formal relationship is established between an abstract system model and its concrete implementation at the Register Transfer Level (RTL). This will allow for new and highly effective approaches to achieving IoT-related design goals such as low power consumption and functional safety. The talk will present several industrial case studies demonstrating the potential of the proposed design methodology.
|
Session 4 - Presentation 2
Title: Software in a Hardware View: New Models for Firmware Development and Safety Analysis in IoT Systems
Speaker: Dominik Stoffel (Technische Universitat Kaiserslautern)
Abstract: The Internet of Things builds upon large populations of embedded devices, many of which perform very specific, limited functions within a given application domain. There is a strong and increasing demand for small and efficient hardware/software computing platforms at different tradeoff points for performance and power consumption, and with strong requirements for safety and reliability. In highly competitive markets, the employed hardware/software platform should ideally be nearly cost-optimal for the target application, and design productivity is a key factor for success. This poses great challenges for design and verification that can only be met with higher degrees of automation. In this talk, we present a new computational model for hardware-dependent software that represents the execution behavior of a processor running low-level software such as the firmware in IoT devices. The model efficiently captures not only the effects of program execution but also the behavior at the hardware/software interface, such as the interaction between the firmware and peripheral devices. We show and discuss applications of this model in firmware development and verification as well as in safety analysis.
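A hypothetical miniature of such a model (the peripheral, register names, and behavior are illustrative only, not the model presented in the talk) treats the hardware/software interface as memory-mapped registers with side effects that a pure-software view of the firmware would miss.

```python
# Toy hardware/software-interface model: a memory-mapped timer whose state
# advances independently of the firmware, so register reads have
# hardware-determined results invisible to a software-only model.
class TimerPeripheral:
    CTRL, STATUS = 0x00, 0x04   # illustrative register offsets

    def __init__(self):
        self.ticks, self.running = 0, False

    def write(self, addr, value):
        """Firmware-visible register write; CTRL bit 0 starts the timer."""
        if addr == self.CTRL:
            self.running, self.ticks = bool(value & 1), 0

    def read(self, addr):
        """Firmware-visible register read; STATUS reports expiry."""
        return int(self.ticks >= 3) if addr == self.STATUS else 0

    def clock(self):
        """Hardware side: the timer advances on its own clock."""
        if self.running:
            self.ticks += 1

timer = TimerPeripheral()
timer.write(TimerPeripheral.CTRL, 1)          # firmware starts the timer
for _ in range(3):
    timer.clock()                             # hardware time passes
expired = timer.read(TimerPeripheral.STATUS)  # firmware polls the status
```

Capturing both directions of this interaction, firmware writes with hardware side effects and hardware state feeding firmware reads, is what distinguishes a hardware-dependent software model from plain program analysis.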
|
VLSI Design and Education Center (VDEC), The University of Tokyo