Date: Friday, February 9, 2024
Speaker: Dott.ssa Valeria Bragaglia
Abstract:

Deep Neural Networks have achieved excellent performance on a variety of artificial intelligence (AI) applications, thanks to their ability to learn from large, unstructured data sets. Nevertheless, the constant shuffling of big data sets between the physically separated memory and processing units of standard von Neumann architectures leads to tremendous power inefficiencies. Dedicated neuromorphic hardware can help overcome this issue. Memristors are especially well suited to such hardware: by allowing information to be stored and processed in the same location, they circumvent the biggest bottleneck for computing speed and power [1]. In this seminar I will focus on the role of materials science in the development of devices based on Phase Change Materials and Oxide Resistive Random Access Memories, key building blocks for realizing artificial neural and synaptic functions in neuromorphic computing. These devices rely on diverse physical mechanisms and materials, and understanding them through experimental and theoretical means is pivotal to device optimization and to coupling with the higher layers of the computer architecture [2,3].
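As a toy illustration of the in-memory computing principle the abstract refers to (a sketch for intuition, not part of the seminar material): a memristive crossbar can perform a matrix-vector multiplication in a single analog step, because Ohm's law multiplies each input voltage by a device conductance and Kirchhoff's current law sums the contributions along each output line. The function name and the example values below are hypothetical.

```python
import numpy as np

def crossbar_mvm(conductances, voltages):
    """Model the ideal analog readout of a memristive crossbar.

    Each device conductance G[i, j] (in siemens) encodes one synaptic
    weight; applying the input voltages V (in volts) to the rows yields
    the output currents I = G @ V (in amperes) in one step, with no
    data movement between memory and a separate processor.
    """
    return conductances @ voltages

# Hypothetical 3x2 weight matrix stored as conductances
G = np.array([[1e-6, 2e-6],
              [3e-6, 4e-6],
              [5e-6, 6e-6]])
V = np.array([0.1, 0.2])  # input voltages applied to the rows

I = crossbar_mvm(G, V)  # column currents, read out in parallel
```

A real device adds nonidealities (conductance drift, noise, limited precision) that the materials work described in the talk aims to understand and mitigate; this sketch captures only the ideal computation.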

[1] A. Sebastian et al., "Memory devices and applications for in-memory computing", Nature Nanotechnology, 15, 529-544, 2020.

[2] T. Stecconi et al., "Analog Resistive Switching Devices for Training Deep Neural Networks with the Novel Tiki-Taka Algorithm", Nano Lett., 24, 866-872, 2024.

[3] G. Syed et al., "In-memory compute chips with carbon-based projected phase change memory devices", IEDM 2023.