Random telegraph noise (RTN) has recently become an important reliability issue in nanoscale circuits. This study proposes a simulation framework to evaluate the timing performance of digital circuits under the impact of RTN at the 16 nm technology node. Two fast algorithms with linear time complexity are proposed: statistical critical-path analysis and normal-distribution-based analysis. The simulation results reveal that the circuit delay degradation and variation induced by RTN both exceed 20%, and that the maximum degradation and variation can exceed 30%. The effect of power-supply tuning and gate-sizing techniques on mitigating RTN is also investigated.
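The normal-distribution-based analysis mentioned above can be illustrated with a minimal sketch: if each gate's delay along a critical path is modeled as an independent normal random variable and RTN contributes a per-gate mean shift, the path delay is again normal, with the means and variances simply accumulated in one linear pass. The function name, the gate delay values, and the RTN shift values below are all illustrative assumptions, not taken from the paper.

```python
import math

def path_delay_distribution(gate_delays, rtn_shifts):
    """Estimate a critical path's delay distribution in linear time.

    gate_delays: list of (mean, std) per gate, assumed independent normals.
    rtn_shifts:  illustrative per-gate RTN-induced mean delay shifts.
    The sum of independent normals is normal: mean = sum of means,
    variance = sum of variances, so one pass over the path suffices.
    """
    mean = sum(m + shift for (m, _), shift in zip(gate_delays, rtn_shifts))
    var = sum(s * s for _, s in gate_delays)
    return mean, math.sqrt(var)

# Hypothetical 5-stage path: (mean, std) gate delays in ps and RTN shifts
gates = [(10.0, 1.0), (12.0, 1.5), (8.0, 0.8), (15.0, 2.0), (9.0, 1.2)]
shifts = [2.0, 2.5, 1.5, 3.0, 1.8]

mu, sigma = path_delay_distribution(gates, shifts)
nominal = sum(m for m, _ in gates)
degradation = (mu - nominal) / nominal * 100.0
print(f"nominal={nominal:.1f} ps, with RTN={mu:.1f} ps, "
      f"degradation={degradation:.1f}%, sigma={sigma:.2f} ps")
```

The linear complexity comes from visiting each gate on the path exactly once; no Monte Carlo sampling of RTN traps is needed under the normal-sum assumption.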
- Original language: English (US)
- Number of pages: 10
- Journal: IET Circuits, Devices and Systems
- State: Published - 2013
ASJC Scopus subject areas
- Control and Systems Engineering
- Electrical and Electronic Engineering