Making Neuromorphic Computing Mainstream

Join us for a workshop with Timoleon Moraitis, a research group leader in neuromorphic computing working at the interface of computational neuroscience and artificial intelligence.

Neuromorphic computing (NC) has recently focused on decreasing the energy consumption of artificial intelligence (AI) through efficient approximations of more conventional methods. This talk argues that this approach may prevent NC from significantly impacting the mainstream market: on the one hand, performance is then inherently limited, at best, to that of the conventional methods; on the other hand, efficiency as a goal is not unique to NC.

Our recent series of results shows that carefully designed and suitably applied neuromorphic models are not only efficient but also expand the capabilities of the state of the art (SOTA) in AI, surpassing it in accuracy and reward while also improving the speed of inference and learning, even on GPUs. These advantages are obtainable in tasks that were previously often out of reach for neuromorphic models.

The talk will present our work on short-term plasticity, meta-learning, Hebbian learning, self-supervised learning, and partially spiking neural networks. It will also briefly touch on the physical realization of some of these mechanisms in extremely efficient neuromorphic hardware, namely memristive nanodevices. Thus, Dr Moraitis proposes, we as a field should not aim for efficiency-performance trade-offs, but rather for biological mechanisms that improve SOTA performance and are also efficient. This strategy has the potential to bring NC to the mainstream.


Upcoming Workshops

Tonic: Building the PyTorch Vision of Neuromorphic Data Loading
Gregor Lenz
September 29, 2025
20:00 - 21:30 CEST

About the Speaker

Dr Timoleon Moraitis has lately focused on demonstrating that the potential of neuromorphic computing extends beyond efficiency, into capabilities and performance that surpass the state of the art in conventional AI. His work with his team ranges from computational neuroscience to deep learning, from theoretical modelling to neuromorphic hardware emulation in nanodevices, and from academic publications to some of the first neuromorphic products on the market. Most recently he led Huawei's neuromorphic computing group in Zurich, following a position at IBM Research – Zurich. Earlier, during his PhD studies at the Institute of Neuroinformatics (University of Zurich and ETH Zurich), his work included machine learning models of the sensorimotor system, implementation of neuromorphic brain-machine interfaces, surgery and electrophysiology experiments in rats, psychophysics in humans, and configuring and using spiking neuromorphic processors.

Inspired? Share your work.

Share your expertise with the community by speaking at a workshop, student talk, or hacking hour. It’s a great way to get feedback and help others learn.

Related Workshops

Does the Brain do Gradient Descent?

Explore the brain's potential use of gradient descent in learning processes with Konrad Kording in this engaging recorded session.

Programming Scalable Neuromorphic Algorithms with Fugu

Explore neural-inspired computing with Brad Aimone, a leading neuroscientist at Sandia Labs. Join us for insights into next-gen technology and neuroscience.

Accelerating Inference and Training at the Edge

Join us for a talk by Maxence Ernoult, Research Scientist at Rain, on accelerating inference and training at the edge.