Dear Sonja,
Some work that sits on the border between spiking neural networks and deep learning:
https://arxiv.org/abs/1901.09049
"Biologically inspired alternatives to backpropagation through time for learning in recurrent neural nets", by Guillaume Bellec, Franz Scherr, Elias Hajek, Darjan Salaj, Robert Legenstein, and Wolfgang Maass.
How recurrently connected networks of spiking neurons in the brain acquire powerful information processing capabilities through learning has remained a mystery. This lack of understanding is linked to a lack of learning algorithms for recurrent networks of spiking neurons (RSNNs) that are both functionally powerful and can be implemented by known biological mechanisms. Since RSNNs are simultaneously a primary target for implementations of brain-inspired circuits in neuromorphic hardware, this lack of algorithmic insight also hinders technological progress in that area. The gold standard for learning in recurrent neural networks in machine learning is back-propagation through time (BPTT), which implements stochastic gradient descent with regard to a given loss function. But BPTT is unrealistic from a biological perspective, since it requires a transmission of error signals backwards in time and in space, i.e., from post- to presynaptic neurons. We show that an online merging of locally available information during a computation with suitable top-down learning signals in real-time provides highly capable approximations to BPTT. For tasks where information on errors arises only late during a network computation, we enrich locally available information through feedforward eligibility traces of synapses that can easily be computed in an online manner. The resulting new generation of learning algorithms for recurrent neural networks provides a new understanding of network learning in the brain that can be tested experimentally. In addition, these algorithms provide efficient methods for on-chip training of RSNNs in neuromorphic hardware.
It is my understanding that work is also underway to implement this method in NEST.
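To give a feel for the eligibility-trace idea from the abstract, here is a rough, rate-based sketch in plain NumPy. This is my own toy illustration, not the authors' code: it uses a tanh unit instead of spiking neurons, a made-up random task, and a learning signal fed back through the readout weights, purely to show how a per-synapse trace of filtered presynaptic activity can be combined online with a top-down error signal instead of backpropagating through time.

```python
import numpy as np

# Toy e-prop-style sketch (illustrative only): one layer of leaky
# recurrent units, rate-based for readability. All names, sizes and
# the dummy task are assumptions, not taken from the paper's code.

rng = np.random.default_rng(0)
n_in, n_rec, n_out, T = 3, 5, 2, 20
alpha = 0.9  # leak factor of the hidden ("membrane") state

W_in = rng.normal(0.0, 0.3, (n_rec, n_in))
W_rec = rng.normal(0.0, 0.3, (n_rec, n_rec))
np.fill_diagonal(W_rec, 0.0)          # no self-connections
W_out = rng.normal(0.0, 0.3, (n_out, n_rec))

x = rng.normal(size=(T, n_in))        # input stream
y_target = rng.normal(size=(T, n_out))  # dummy regression targets

h = np.zeros(n_rec)                   # hidden state
z_prev = np.zeros(n_rec)              # previous unit activity
e_rec = np.zeros((n_rec, n_rec))      # one eligibility trace per synapse
grad_rec = np.zeros_like(W_rec)

for t in range(T):
    h = alpha * h + W_in @ x[t] + W_rec @ z_prev
    z = np.tanh(h)                    # smooth stand-in for a spike function
    # Eligibility trace: low-pass filtered presynaptic activity, scaled
    # by the postsynaptic pseudo-derivative (1 - z^2 for tanh).
    # This is purely local and computed online, forward in time.
    e_rec = alpha * e_rec + np.outer(1.0 - z**2, z_prev)
    # Online learning signal: readout error projected back per neuron
    # (random feedback weights would work here too, as in the paper).
    err = W_out @ z - y_target[t]
    L = W_out.T @ err
    # Gradient approximation: learning signal times eligibility trace,
    # accumulated online, with no backward pass through time.
    grad_rec += L[:, None] * e_rec
    z_prev = z

W_rec -= 0.01 * grad_rec              # one plain gradient step
```

The point of the sketch is the structure of the update: everything a synapse needs is either local (its trace) or broadcast top-down (the learning signal), which is what makes the scheme plausible for brains and for on-chip learning.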
Greets,
Wouter
On 15-Jun-20 14:24, s.kraemer96@gmx.net wrote:
Dear all, I'm writing a master's thesis on spiking neural networks and how transparent they are. For that I need to implement an SNN and train it. I started with Brian, but that is much too complex and I don't need anything special, so I decided to use PyNEST. I did all the tutorials, but I'm missing a tutorial on how to train the network: I don't know how to feed in a dataset to train the model, and I haven't found anything on this topic. So my questions are:
- Can PyNEST set up an SNN and train it on data, and if not, is there another simulator that can do this?
- How do I do it? Is there something I missed reading, or can someone send me an example? That would be very helpful.
Thanks for your help.
Best,
Sonja

_______________________________________________
NEST Users mailing list -- users@nest-simulator.org
To unsubscribe send an email to users-leave@nest-simulator.org
-- Wouter Klijn w.klijn@fz-juelich.de
Team Leader Multiscale Simulation and Design
SimLab Neuroscience, Jülich Supercomputing Centre
Institute for Advanced Simulation, Forschungszentrum Jülich
http://www.fz-juelich.de/ias/jsc/slns
Office: +49 2461 61-3523 Fax # : +49 2461 61-6656