Massive numbers of defects are predicted in nanometer-scale chips due to process variation and quantum effects, resulting in low yields. By quantifying various attributes of the erroneous yet acceptable behavior of faulty chips, error tolerance seeks to improve the effective yield of IC products. One such attribute is error rate. A fault-oriented test methodology based on error-rate estimation was previously proposed. Without violating the user-specified system error-rate constraint, this methodology identifies a set of faults that can be ignored during testing, and can therefore lead to significant yield improvement. However, the problem of generating an appropriate set of test patterns that detect only unacceptable faults has not been addressed. In this paper we show that an ordinary ATPG procedure targeting unacceptable faults usually generates a test set that also detects a large number of acceptable faults, significantly degrading the achievable yield improvement. We therefore propose a novel test pattern selection procedure and an output masking technique to address this problem. Experimental results show that with the proposed techniques, only a small number of acceptable faults are still detected; in many cases, the actual yield improvement approaches the predicted optimal value.
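The selection problem sketched in the abstract can be illustrated with a simple greedy heuristic: pick patterns that cover the remaining unacceptable faults while penalizing patterns that would also detect acceptable (ignorable) faults. This is only an illustrative sketch of the general idea, not the paper's actual procedure; all pattern and fault names below are hypothetical.

```python
# Hypothetical greedy sketch: cover unacceptable faults while penalizing
# detection of acceptable faults. Not the paper's actual algorithm.

def select_patterns(detects, unacceptable, acceptable):
    """detects: dict mapping pattern name -> set of faults it detects."""
    remaining = set(unacceptable)
    chosen = []
    while remaining:
        # Score each useful pattern: newly covered unacceptable faults,
        # minus a penalty for acceptable faults it would also detect.
        best = max(
            (p for p in detects if detects[p] & remaining),
            key=lambda p: len(detects[p] & remaining) - len(detects[p] & acceptable),
            default=None,
        )
        if best is None:
            break  # remaining unacceptable faults undetectable by this pattern set
        chosen.append(best)
        remaining -= detects[best]
    return chosen

# Toy instance: u1, u2 are unacceptable faults; a1, a2 are acceptable.
detects = {
    "t1": {"u1", "a1", "a2"},   # detects u1, but also two acceptable faults
    "t2": {"u1"},               # detects u1 without collateral detection
    "t3": {"u2", "a1"},
}
print(select_patterns(detects, {"u1", "u2"}, {"a1", "a2"}))  # → ['t2', 't3']
```

In this toy instance the heuristic prefers t2 over t1 because t1, while covering the same unacceptable fault, would also reject chips containing only acceptable faults.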
Number of pages: 9
Journal: International Journal of Electrical Engineering
Publication status: Published - 2007 Jun
All Science Journal Classification (ASJC) codes
- Electrical and Electronic Engineering