Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
545077 | Microelectronics Reliability | 2012 | 5 |
Abstract
The only way to keep pace with Moore’s Law is to use probabilistic computing for memory design. Probabilistic computing becomes unavoidable as scaled memory dimensions shrink to levels where variability takes over. Printing features below 20 nm requires novel lithographies such as extreme ultraviolet (EUV) lithography. However, transistor structures and memory arrays are strongly affected by the pattern roughness caused by the stochastic nature of such lithography, leading to variability-induced data errors in the memory read-out. This paper takes a probabilistic, holistic look at how to handle bit errors in NAND Flash memory and trades off lithography process quality against error-correcting codes to ensure data integrity.
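The trade-off the abstract describes can be made concrete with the standard binomial model of ECC failure. Below is a minimal sketch, assuming independent raw bit errors at rate p and a hypothetical t-error-correcting code over n-bit codewords; the codeword size and all numbers are illustrative assumptions, not values from the paper.

```python
from math import comb

def uncorrectable_prob(n: int, t: int, p: float) -> float:
    """Probability that an n-bit codeword suffers more than t bit errors,
    assuming independent raw bit errors with probability p each.
    A t-error-correcting code fails to recover exactly in that case."""
    # Sum the complement (<= t errors) so the binomial terms stay small.
    p_ok = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(t + 1))
    return 1.0 - p_ok

# Illustration: a rougher (cheaper) lithography process raises the raw
# bit-error rate p; a stronger code (larger t) can buy the reliability back.
# n = 4096 bits per codeword is a hypothetical sector size.
for p_raw, t in [(1e-4, 4), (1e-3, 4), (1e-3, 8)]:
    print(f"p={p_raw:.0e}, t={t}: P(uncorrectable) = "
          f"{uncorrectable_prob(4096, t, p_raw):.3e}")
```

In this model, relaxing the lithography (higher p) while keeping t fixed degrades the post-ECC error probability, and increasing t restores it at the cost of extra redundancy, which is the kind of trade-off the paper evaluates.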
Related Topics
Physical Sciences and Engineering
Computer Science
Hardware and Architecture
Authors
Pavel Poliakov, Pieter Blomme, Alessandro Vaglio Pret, Miguel Miranda Corbalan, Roel Gronheid, Diederik Verkest, Jan Van Houdt, Wim Dehaene