Bypassing Material Limits in Graphene Transistors Through Gate Engineering

Tseng, Frank
Thesis/Dissertation; Online
Ghosh, Avik
Graphene is the first of what are now several two-dimensional materials to garner significant interest for potential transistor applications in digital and radio-frequency electronics. Its natural chemical 'flat-land' offers several advantages unique to its hexagonal network of carbon atoms. First, graphene has a measured mobility of 230,000 cm²/V·s, compared to 1,400 cm²/V·s for silicon, which means electrons in graphene can respond to faster changes in input voltage, i.e., higher clock cycles. Second, graphene's atomically thin body allows the channel conductance to be modulated easily. Third, it is naturally compatible with the advanced planar fabrication processes already developed for silicon complementary metal-oxide-semiconductor (Si-CMOS) transistors. However, switching a graphene field-effect transistor (FET) off remains a challenge. In the literature, the various methods of reducing OFF-current and achieving output current saturation have come at the cost of reduced mobility. While graphene's future within the CMOS switching paradigm may therefore seem unsalvageable, we show how OFF-current can be reduced and current saturation extended, without hurting the ON-current or mobility, through momentum filtering aided by gate-geometry engineering. This work starts by investigating the limitations on electron transport in a conventional graphitic FET with no band gap through a unified, physics-based model of graphene current-voltage characteristics, spanning the ballistic to diffusive limits and low to high bias. At low bias, we show how band structure sets the fundamental material trade-off between opening a bandgap and preserving mobility. We find that opening a band gap increases the effective mass and reduces the scattering time (because the band-edge density of states grows), so mobility falls as 1/Eg². This holds for all graphitic derivatives.
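The 1/Eg² mobility scaling can be illustrated with a toy Python sketch. This is not the thesis's transport model; it only encodes the two stated proportionalities, a band-edge effective mass that grows linearly with the gap (m* ~ Eg/2vF² for a gapped Dirac band) and a scattering time that falls as 1/Eg with the band-edge density of states. The normalization constants are illustrative, not fitted values.

```python
# Toy sketch of the bandgap-mobility trade-off: mu = q*tau/m* ~ 1/Eg^2.
# Assumptions (not from the thesis): m* = Eg / (2 vF^2) near a gapped Dirac
# point, and tau ~ 1/Eg because band-edge density of states grows with Eg.

Q = 1.602e-19   # electron charge (C)
VF = 1.0e6      # graphene Fermi velocity (m/s)

def effective_mass(eg_joule):
    """Band-edge effective mass of a gapped Dirac band, m* = Eg / (2 vF^2)."""
    return eg_joule / (2.0 * VF ** 2)

def scattering_time(eg_joule, tau0=1e-13, eg0=0.1 * Q):
    """Toy scattering time ~ 1/DOS ~ 1/Eg, normalized at a 0.1 eV gap."""
    return tau0 * eg0 / eg_joule

def mobility(eg_ev):
    """Drude-style mobility mu = q * tau / m* for a gap given in eV."""
    eg = eg_ev * Q
    return Q * scattering_time(eg) / effective_mass(eg)

# Doubling the gap quarters the mobility, i.e. the 1/Eg^2 scaling:
print(round(mobility(0.1) / mobility(0.2), 3))  # -> 4.0
```

The ratio is what matters here: any prefactors cancel, leaving only the quadratic dependence on the gap.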
Also at low bias, we show how the minimum conductivity behaves in the ballistic and diffusive limits in the presence of impurities. We extract the entire phase space and show a flip in curvature followed by saturation as impurity density increases. At high bias, we benchmark our model against experiments and convert the device model to Verilog for circuit-level simulation in Cadence. We also show how optical phonons influence the high-bias current-voltage behavior, leading to current saturation. Finally, we address the trade-off between mobility and opening a bandgap with a proof-of-concept way to bypass these material limits for a narrow-bandgap channel through contact engineering, an approach unique to the device community. The real merit of our model is its simplicity, and the use of contact engineering alone for momentum rather than energy filtering to reduce OFF-current and extend current saturation without hurting the ON-current (Fig. 5.8) that made graphene so promising in the first place. Gates are uniquely positioned and biased so that their local narrow bandgaps cascade along a staircase potential profile (Fig. 5.1), suppressing the transmission of intermediate conducting modes between the highest conduction band and the lowest valence band (Fig. 5.4). This effective mode filtering widens a gap in the transmission spectrum. The conventional approach of widening a real bandgap in the channel to reduce OFF-current comes at the expense of ON-current through decreased mobility; we show a way to bypass this fundamental material limitation. We establish a proof of concept with a 5 nm wide graphene nanoribbon and 2-D bilayer graphene; their bandgaps of less than 200 meV made them convenient channels for simulation, but the concept of gate engineering can be generalized to other narrow-bandgap materials with higher mobility.
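The staircase-cascade idea can be sketched with a toy Python model. This is emphatically not the thesis's quantum-transport calculation; it only illustrates the geometric point that several narrow local gaps, offset along a staircase potential, block a combined energy window much wider than any single gap. The step potentials and gap size below are made-up illustrative numbers.

```python
# Toy sketch of cascaded staircase gates: each gated segment opens a narrow
# local gap Eg centered on its own potential step U_i. An electron at energy
# E is transmitted only if it avoids the local gap of EVERY segment, so the
# cascade blocks a window roughly as wide as the staircase span plus one gap.
# Numbers are illustrative, not taken from the thesis.

def transmits(energy, steps, eg):
    """True if energy (eV) lies outside the local gap of every gated segment."""
    return all(abs(energy - u) > eg / 2.0 for u in steps)

def blocked_window(steps, eg, emin=-0.5, emax=0.5, de=1e-3):
    """Width (eV) of the largest contiguous blocked energy window."""
    widest = run = 0.0
    e = emin
    while e <= emax:
        run = 0.0 if transmits(e, steps, eg) else run + de
        widest = max(widest, run)
        e += de
    return widest

single = blocked_window([0.0], 0.1)              # one gate: ~0.1 eV blocked
cascade = blocked_window([-0.1, 0.0, 0.1], 0.1)  # three-step staircase
print(round(single, 2), round(cascade, 2))       # -> 0.1 0.3
```

With a single gate the blocked window equals the local 0.1 eV gap; cascading three gates whose gaps just overlap triples the effective transmission gap, without any one segment needing a wide (mobility-killing) real bandgap.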
University of Virginia, Department of Electrical Engineering, PHD (Doctor of Philosophy), 2013
Libra ETD Repository
In Copyright