Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
747566 | Solid-State Electronics | 2006 | 6 Pages |
Abstract
In recent years, several publications have reported reductions in the low-frequency noise of MOSFETs under large-signal excitation. These observations are important for modern analog and RF circuits. The low-frequency noise models classically used for circuit simulation cannot explain this effect. In this paper, we extend the classical approach to non-equilibrium biasing conditions and give a device-physics-based explanation for the reduction in noise amplitude. In addition, we present measurements that are in good agreement with the derived model, and we suggest approaches for implementing the model within standard compact models.
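For context, the "classical" low-frequency (flicker) noise description referred to in the abstract is commonly written in SPICE-type compact models as a drain-current noise spectral density of roughly the following form; this is the standard textbook expression and is an assumption for illustration, not a formula taken from the paper itself:

```latex
% Classical SPICE-style flicker-noise model (illustrative):
% S_{I_D}(f): drain-current noise power spectral density
% KF, AF:     empirical fitting parameters
% C_{ox}:     gate-oxide capacitance per unit area
% L_{eff}:    effective channel length
S_{I_D}(f) \;=\; \frac{KF \cdot I_D^{AF}}{C_{ox}\, L_{eff}^{2}\, f}
```

Because such models evaluate the noise only at a static (equilibrium) bias point, they predict no dependence on large-signal switching history, which is why they cannot reproduce the reported noise reduction.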
Related Topics
Physical Sciences and Engineering
Engineering
Electrical and Electronic Engineering
Authors
Ralf Brederlow, Jeongwook Koh, Roland Thewes