Johannes Leugering (Fraunhofer IIS): Neuromorphic Computing and its Mathematical Challenges
The stellar rise of Deep Learning in recent years has produced models with millions, if not billions, of parameters that consume vast amounts of compute resources. Consequently, much theoretical work and implementation effort has been invested in optimizing these models for efficient execution on CPUs and GPUs – in terms of operations performed, memory used, or the number of parameters. But instead of adapting our models to the available hardware, we could also develop novel hardware on which to execute our models efficiently – that is the premise of Neuromorphic Computing. In this talk, I’d like to present the basic idea behind Neuromorphic Computing and highlight some of the mathematical challenges for Deep Learning that arise in this context.
The Zoom link can be found in the StudOn group “AG Mathematics of Deep Learning”.