Description of project relevance to DOE mission. For decades, semiconductor technology has followed Moore's law, making newer devices more powerful and more energy efficient. Recently, however, it has reached a point where device performance and energy efficiency no longer improve with shrinking device size. One fundamental reason for this deviation from the historical trend is interconnect resistance, which grows as dimensions shrink, delaying signals and heating the circuits. Devices are now so small that quantum mechanical effects can no longer be ignored, and traditional continuum simulation tools such as TCAD become inadequate. In this project, Samsung Semiconductor Inc. and Lawrence Berkeley National Laboratory will collaborate to perform first-of-a-kind device-scale ab initio simulations to optimize materials and interconnect morphology to minimize interconnect resistance. Mitigating this problem will allow the continued down-scaling of devices.

To optimize interconnect performance, the semiconductor industry relies heavily on computer simulations. So far, industrial simulation has been based mostly on TCAD methods that use bulk diffusion equations and ad hoc scattering parameters (either empirical or heavily calibrated to experiment). For sub-10 nm interconnects, the following quantum mechanical effects become important: electron wave-function confinement and atomic-scale scattering at metal/metal interfaces, metal grain boundaries, and metal surfaces. The continuum TCAD method does not treat these effects accurately, and it has become increasingly hard to reproduce experimental results without heavy parameter calibration to hardware data. To overcome this problem, we will use atomistic first-principles quantum mechanical methods to simulate and optimize the interconnect geometry and material composition to reduce its resistance.
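The severity of the scaling problem can be illustrated even with the classical continuum formula R = ρL/A, which is the regime where TCAD-style models operate: halving a wire's cross-sectional dimensions quadruples its resistance. A minimal sketch, assuming a standard textbook value for bulk copper resistivity and hypothetical line dimensions (at sub-10 nm scales this formula itself breaks down, since surface and grain-boundary scattering raise the effective ρ, which is the regime the proposed ab initio simulations target):

```python
# Illustrative only: classical resistance of a rectangular interconnect line,
# R = rho * L / A. The resistivity is a textbook bulk-copper value; the
# line dimensions below are hypothetical examples, not project parameters.
RHO_CU_BULK = 1.68e-8  # ohm*m, bulk copper resistivity near room temperature

def line_resistance(length_nm: float, width_nm: float, height_nm: float,
                    rho: float = RHO_CU_BULK) -> float:
    """Resistance (ohms) of a rectangular wire from the continuum formula."""
    area_m2 = (width_nm * 1e-9) * (height_nm * 1e-9)
    return rho * (length_nm * 1e-9) / area_m2

# Halving both cross-sectional dimensions quadruples the resistance,
# even before the nanoscale rise in rho itself is accounted for.
r_wide = line_resistance(length_nm=1000, width_nm=20, height_nm=40)
r_narrow = line_resistance(length_nm=1000, width_nm=10, height_nm=20)
print(r_wide, r_narrow, r_narrow / r_wide)  # ratio is 4.0
```

In real sub-10 nm lines the measured resistance rises faster than this classical 1/A trend, precisely because of the confinement and scattering effects listed above that continuum models cannot capture without heavy calibration.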
| Source | Hours | Start Date | End Date |
|--------|-------|------------|----------|