Does the Brain Do Gradient Descent?

Explore whether the brain implements something like gradient descent during learning with Konrad Kording in this recorded session.

About the Speakers

Konrad Kording

Professor at UPenn, researching credit assignment in the brain and causality in biomedical research. Trained at ETH Zurich, UCL, and MIT.
Jason Eshraghian

Assistant Professor at UC Santa Cruz, leading UCSC Neuromorphic Computing Group. Focuses on brain-inspired circuits for AI & SNNs. Maintainer of snnTorch.
Fabrizio Ottati

AI/ML Processor Engineer at NXP, PhD from Politecnico di Torino. Focuses on event cameras, digital hardware, and deep learning. Maintains Tonic & Expelliarmus.

Related Workshops

Open-Source Neuromorphic Research Infrastructure: A Community Panel

Join leading maintainers of neuromorphic software libraries for a panel discussion on building open-source infrastructure, sharing lessons learned, and shaping the future of the neuromorphic ecosystem.

Hands-On with Sinabs and Speck

Join Gregor Lenz for an engaging hands-on session featuring Sinabs and Speck. Explore the world of neuromorphic engineering and spike-based machine learning.

C-DNN and C-Transformer: mixing ANNs and SNNs for the best of both worlds

Join us for a talk by Sangyeob Kim, a postdoctoral researcher at KAIST, on designing efficient accelerators that mix SNNs and ANNs.