Spyx Hackathon: Speeding Up Neuromorphic Computing

Explore the power of Spyx in a hands-on hackathon session and dive into the world of neuromorphic frameworks with Kade Heckel.

Join us on December 13th for an exciting Spyx hackathon and ONM talk! Learn how to use and contribute to Spyx, a high-performance spiking neural network library, and gain insights into the latest developments in neuromorphic frameworks. The session will cover how Spyx leverages memory and the GPU to maximize training throughput, along with discussions on the evolving landscape of neuromorphic computing.

Don’t miss this opportunity to engage with experts, collaborate on cutting-edge projects, and explore the potential of Spyx in shaping the future of neuromorphic computing. Whether you’re a seasoned developer or just curious about the field, this event promises valuable insights and hands-on experience.
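To give a flavor of the training-throughput theme, here is a minimal, generic JAX sketch (an illustration only, not Spyx's actual API) of the core trick behind differentiable spiking networks: a leaky integrate-and-fire neuron whose hard threshold gets a smooth surrogate gradient, with the whole time loop JIT-compiled for the accelerator.

```python
import jax
import jax.numpy as jnp

# Hypothetical illustration (not the Spyx API): a leaky integrate-and-fire
# (LIF) step with a surrogate gradient, which makes the non-differentiable
# spike function trainable with backpropagation.

@jax.custom_vjp
def spike(v):
    # Forward pass: hard threshold at v = 1 (derivative is zero a.e.).
    return (v > 1.0).astype(v.dtype)

def spike_fwd(v):
    return spike(v), v

def spike_bwd(v, g):
    # Backward pass: replace the true derivative with a smooth
    # fast-sigmoid-style surrogate centered on the threshold.
    surrogate = 1.0 / (1.0 + 10.0 * jnp.abs(v - 1.0)) ** 2
    return (g * surrogate,)

spike.defvjp(spike_fwd, spike_bwd)

def lif_step(v, x, beta=0.9):
    """One LIF update: leaky integration, spike, soft reset."""
    v = beta * v + x
    s = spike(v)
    return v - s, s  # subtract the threshold when a spike fires

# Scanning over the spike train inside jax.jit fuses the entire
# time loop into one compiled kernel on the GPU/TPU -- the kind of
# throughput win the session discusses.
@jax.jit
def run(inputs):
    v0 = jnp.zeros(inputs.shape[1:])
    _, spikes = jax.lax.scan(lif_step, v0, inputs)
    return spikes

if __name__ == "__main__":
    x = jnp.full((20, 4), 0.3)  # 20 time steps, 4 neurons
    print(run(x).sum(axis=0))   # spike counts per neuron
```

The `jax.custom_vjp` pair is the standard JAX way to decouple the forward threshold from its backward surrogate; libraries like Spyx build on this same mechanism at scale.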

Agenda:

  • 18:00 - 19:00: Spyx Introduction
    • Dive into Spyx, its features, and how to contribute
    • Hands-on session: Explore Spyx functionalities and tackle real-world challenges
    • Q&A and collaborative discussions
  • 19:00 - 20:00: Hackathon
    • Collaborate on cutting-edge projects and explore the potential of Spyx
    • Q&A and collaborative discussions

Speakers:

  • Kade Heckel

Note: The event will be hosted virtually. Stay tuned for the video link and further updates. Let’s come together to push the boundaries of neuromorphic computing!


About the Speaker

Kade studied Computer Science and Computer Engineering at the U.S. Naval Academy. Studying in the UK as a Marshall Scholar, he completed an MSc in AI and Adaptive Systems with distinction from the University of Sussex and is currently pursuing an MPhil in Machine Learning and Machine Intelligence at the University of Cambridge. His Sussex dissertation compared surrogate-gradient methods with large-scale neuroevolutionary algorithms for optimizing spiking neural networks.


Related Workshops

C-DNN and C-Transformer: mixing ANNs and SNNs for the best of both worlds

Join us for a talk by Sangyeob Kim, Postdoctoral researcher at KAIST, on designing efficient accelerators that mix SNNs and ANNs.

Accelerating Inference and Training at the Edge

Join us for a talk by Maxence Ernoult, Research Scientist at Rain, on accelerating inference and training at the edge.

Hands-on with Xylo and Rockpool

Discover Xylo and Rockpool in a hands-on session with Dylan Muir, exploring cutting-edge neural computation architectures and signal processing.