Event Start Date: 24 October 2018 | Event End Date: 24 October 2018
Spiking neural network simulation on SpiNNaker
A massively parallel neuromorphic computing platform
Wednesday, 24 October, 14.00-15.00
Physiology Lunch Room/Library, Domus Medica, UiO (next to Rikshospitalet and Gaustad Hotel)
SpiNNaker: SpiNNaker is the world's largest massively parallel neuromorphic computing platform. With a million cores, it can simulate hundreds of millions of neurons and hundreds of billions of synapses.
Abstract: Simulating the brain is a challenge, even for modern supercomputers. As simulations grow in
size, traditional memory and communications mechanisms do not scale, and energy consumption can
become prohibitive. A solution to this problem is to develop bespoke hardware tailored to brain
simulation, also known as neuromorphic hardware. This seminar will introduce SpiNNaker (Spiking
Neural Network Architecture), a neuromorphic platform developed at the University of Manchester.
This massively parallel machine, comprising over 1 million programmable ARM cores with a unique
routing framework, enables real-time, power-efficient simulation of large-scale spiking neural
networks. In addition to introducing the machine architecture and software, the seminar will present
a range of simulations and discuss how simulating the brain can in turn inform the field of computer
science, helping us develop more efficient, fault-tolerant computing machines.
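For readers unfamiliar with how such simulations are typically specified, the sketch below shows a minimal spiking network described with the PyNN API, the standard interface to the SpiNNaker software stack. It is an illustrative assumption on my part, not material from the seminar; the module name, neuron model, and parameter values are placeholders.

# Minimal sketch of a PyNN network (assumed sPyNNaker backend); values are illustrative only.
import pyNN.spiNNaker as sim

sim.setup(timestep=1.0)  # 1 ms resolution, typical for real-time SpiNNaker runs

# Poisson spike sources driving a small population of leaky integrate-and-fire neurons
stimulus = sim.Population(100, sim.SpikeSourcePoisson(rate=10.0), label="input")
neurons = sim.Population(100, sim.IF_curr_exp(), label="excitatory")

# One-to-one excitatory connections with fixed weight and 1 ms delay
sim.Projection(stimulus, neurons, sim.OneToOneConnector(),
               synapse_type=sim.StaticSynapse(weight=0.5, delay=1.0))

neurons.record(["spikes"])
sim.run(1000)  # simulate 1 second of biological time

spikes = neurons.get_data("spikes")
sim.end()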
Oliver Rhodes is a researcher at the University of Manchester,
UK. He earned a master's degree in Mechanical Engineering from the
University of Exeter, and was awarded a PhD in Aeronautical
Engineering from Imperial College London, for his work exploring
optimal design methods for flexible morphing structures. After several
years in industry with SIMULIA, the simulation division of the Dassault
Systemes software company, he joined the lab of Prof. Steve Furber
in Manchester to work on neural simulation software for the
SpiNNaker neuromorphic platform.