Title: Analog/RF performance comparison of underlap gate stack DG NMOSFETs in sub 65nm regime
Abstract: This paper presents a comparative study of underlap gate-stack double-gate NMOSFETs (U-DG-GS-NMOSFETs) for different channel lengths (L_ch). Channel lengths below 100 nm lead to short-channel effects (SCEs); gate-induced drain leakage (GIDL) and drain-induced barrier lowering (DIBL) are the major problems for any short-channel device. A gate-stack arrangement is used to reduce GIDL, and source/drain underlap regions are used to minimize the effects of DIBL. The length of the underlap region (L_u) is optimized for a 32 nm channel length. With the optimized underlap length, the device performance is studied for channel lengths of 32 nm, 45 nm, and 65 nm. The figures of merit studied are the transfer characteristics, drain characteristics, transconductance (g_m), transconductance generation factor (g_m/I_ds), output resistance (R_o), intrinsic gain (g_m·R_o), total gate capacitance (C_gg), cut-off frequency (f_T), and maximum frequency of oscillation (f_max).
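For reference, the analog/RF figures of merit named in the abstract have standard small-signal definitions; the forms below are the usual textbook ones, not equations quoted from the paper (R_g, the gate resistance, and C_gd, the gate-drain capacitance, appear only in the common f_max approximation and are not symbols used in the abstract):

\[
g_m = \frac{\partial I_{ds}}{\partial V_{gs}}, \qquad
R_o = \left(\frac{\partial I_{ds}}{\partial V_{ds}}\right)^{-1}, \qquad
A_v = g_m R_o,
\]
\[
f_T = \frac{g_m}{2\pi C_{gg}}, \qquad
f_{max} \approx \sqrt{\frac{f_T}{8\pi R_g C_{gd}}}.
\]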
Publication Year: 2017
Publication Date: 2017-03-01
Language: en
Type: article
Indexed In: Crossref