Hi everyone, has anyone successfully added a gap junction to an existing model using NESTML? I would like to add gap junction support to the aeif_cond_alpha neuron model, but it is unclear how this can be done within NESTML. As suggested by Charl, I started looking into updating the Jinja templates, but I am not sure this will be possible, since both the input/output of the neuron and the model's equations must be updated to support gap junctions. If anyone has suggestions, or has NESTML code for adding a gap junction to an existing model, I would be very appreciative. Thanks!
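Not a NESTML answer, but for reference: stock NEST ships one model with built-in gap-junction support (hh_psc_alpha_gap), and gap junctions are created as symmetric one-to-one connections. A minimal PyNEST sketch, assuming NEST 3.x; the weight value is arbitrary:

```python
def gap_junction_specs(weight=0.5):
    """Build the conn_spec/syn_spec dicts NEST expects for a gap junction.

    Gap junctions must be symmetric, hence "make_symmetric": True
    together with the "one_to_one" rule.
    """
    conn_spec = {"rule": "one_to_one", "make_symmetric": True}
    syn_spec = {"synapse_model": "gap_junction", "weight": weight}
    return conn_spec, syn_spec

# Usage with a NEST installation (illustrative):
# import nest
# a = nest.Create("hh_psc_alpha_gap")
# b = nest.Create("hh_psc_alpha_gap")
# nest.Connect(a, b, *gap_junction_specs(0.5))
```

Connecting an ordinary model (one without gap-junction support in its C++ event handling) this way will be rejected, which is why the NESTML templates alone are probably not enough.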
Hello NEST users,
I'm trying to connect some devices in a black box manner and run into this
error:
```
nest.lib.hl_api_exceptions.NESTErrors.IllegalConnection: IllegalConnection
in SLI function Connect_g_g_D_D: Creation of connection is not possible
because:
Source node does not send output.
Note that recorders must be connected as Connect(neuron, recorder).
```
Is there a way for me to inspect whether I should call nest.Connect(A, B) or
nest.Connect(B, A), or should I catch this error and try the reverse order? Is
this the only `IllegalConnection` error that can be thrown, or would I have to
parse the error message (which is prone to change)?
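A sketch of the "inspect first" approach, assuming NEST 3.x, where every node exposes an "element_type" property ("neuron", "recorder", "stimulator"). The get_type callback stands in for node.get("element_type") so the logic is shown without requiring NEST:

```python
def ordered_for_connect(a, b, get_type):
    """Return (source, target) so that nest.Connect(source, target) is
    legal: recorders never send events, stimulators never receive them.
    get_type(node) -> "neuron" | "recorder" | "stimulator"."""
    if get_type(a) == "recorder" or get_type(b) == "stimulator":
        return b, a
    return a, b

# With real NEST (illustrative):
# src, tgt = ordered_for_connect(A, B, lambda n: str(n.get("element_type")))
# nest.Connect(src, tgt)
```

For the error-handling route, catching the exception class (nest.NESTErrors.IllegalConnection) is more robust than parsing the message text, but note that other illegal combinations (e.g. an incompatible synapse type) can raise the same class, so the inspection approach is the cleaner of the two.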
--
Robin De Schepper, MSc (they/them)
Department of Brain and Behavioral Sciences
Unit of Neurophysiology
University of Pavia, Italy
Via Forlanini 6, 27100 Pavia - Italy
Tel: (+39) 038298-7607
http://www-5.unipv.it/dangelo/
Dear NEST users,
We are going to buy some nodes for the departmental HPC cluster, and the plans are currently oriented towards nodes with AMD processors (e.g., AMD EPYC 7413 and AMD EPYC 9654). In past years we have always used Intel-based nodes (e.g., Xeon Gold) for HPC. I just wanted to check whether there are possible issues in using these nodes, or whether others have already been using multi-CPU AMD nodes successfully.
Thank you and have a great weekend!
Best,
Alberto
--
Alberto Antonietti, Ph.D.
Assistant Professor
Nearlab - NeuroEngineering And medical Robotics Laboratory
Department of Electronics, Information and Bioengineering
Politecnico di Milano
http://www.nearlab.polimi.it/
Dear NEST Users & Developers!
I would like to invite you to our next fortnightly Open NEST Developer Video Conference, today
Monday May 22, 11.30-12.30 CEST (UTC+2).
One point of discussion will be the required preparations for the next hackathon (overflow of last meeting).
Also, as usual, in the Project team round a contact person from each team will give a short statement summarizing ongoing work in the team and cross-cutting points that need discussion among the teams. In the remainder of the meeting we go into a more in-depth discussion of topics that came up on the mailing list or that are suggested by the teams.
Feel free to join the meeting even if it's just to bring your own quick questions for direct discussion in the in-depth section.
Agenda
* Welcome
* Review of NEST User Mailing List
* Project team round
* In-depth discussion
The agenda for this meeting is also available online, see
https://github.com/nest/nest-simulator/wiki/2023-05-22-Open-NEST-Developer-…
Looking forward to seeing you!
Cheers,
Dennis Terhorst
------------------
Log-in information
------------------
We use a virtual conference room provided by DFN (Deutsches Forschungsnetz).
You can use the web client to connect. However, we encourage everyone to
use a headset for better audio quality, or even a proper video
conferencing system (see below) or software when available.
Web client
* Visit https://conf.dfn.de/webapp/conference/97938800
* Enter your name and allow your browser to use camera and microphone
* The conference does not need a PIN to join, just click join and you're in.
In case you see a dfnconf logo and the phrase "Auf den
Meetingveranstalter warten", just be patient, the meeting host needs to
join first (a voice will tell you).
VC system/software
How to log in with a video conferencing system depends on your VC system
or software.
- Using the H.323 protocol (e.g. Polycom): vc.dfn.net##97938800 or
  194.95.240.2##97938800
- Using the SIP protocol: 97938800@vc.dfn.de
- By telephone: +49-30-200-97938800
For those who do not have a video conference system or suitable
software, Polycom provides a pretty good free app for iOS and Android,
so you can join from your tablet (Polycom RealPresence Mobile, available
from AppStore/PlayStore). Note that firewalls may interfere with
videoconferencing in various and sometimes confusing ways.
For more technical information on logging in from various VC systems,
please see http://vcc.zih.tu-dresden.de/index.php?linkid=1.1.3.4
Hi,
I am in trouble here: I would like to stimulate a random subset of neurons in a population every delta_t.
Should I create, for a population of size N, N spike generators, each provided with its own random spike time series, and connect them one-to-one to the N neurons?
Or is there a cleverer way to do it?
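One spike_generator per neuron works, and the random choice of subsets can be precomputed in plain Python. A minimal sketch (function and parameter names are illustrative, not NEST API):

```python
import random

def subset_spike_times(n_neurons, t_sim, delta_t, subset_size, seed=42):
    """Per-neuron lists of stimulation times (ms): every delta_t, a fresh
    random subset of subset_size neurons receives one stimulus spike."""
    rng = random.Random(seed)
    times = [[] for _ in range(n_neurons)]
    n_steps = int(t_sim / delta_t)
    for k in range(1, n_steps + 1):
        for idx in rng.sample(range(n_neurons), subset_size):
            times[idx].append(k * delta_t)
    return times

# With NEST (illustrative):
# gens = nest.Create("spike_generator", n_neurons)
# for g, ts in zip(gens, subset_spike_times(N, 1000.0, 50.0, 10)):
#     g.spike_times = ts
# nest.Connect(gens, neurons, "one_to_one")
```

Keep delta_t a multiple of the simulation resolution so the spike times fall on the time grid that spike_generator requires.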
Thanks for any help!
Best,
Adrien
Dear friends,
Thanks to all your help, I've successfully reproduced the behavior of the modified Izhikevich neuron described in the paper:
Chen, L., Campbell, S.A. Exact mean-field models for spiking neural networks with adaptation. J Comput Neurosci 50, 445–469 (2022). https://doi.org/10.1007/s10827-022-00825-9
But when I try to connect a population of neurons among themselves, I have problems. In particular, imagine I create 10 neurons with the following Python code:
neurons = nest.Create("izhikevich_ODE", 10)
Then, I found in some of the examples that the neurons do not come connected, and to connect them all-to-all, I had to do the following:
nest.Connect(neurons, neurons, 'all_to_all')
Is that right?
Then, in the NESTML model, how do I receive the spikes into the gating variable? If I am not wrong, I have to set the input as
```
input:
    spikes real <- spike
```
and then add them to the synaptic gating variable as a simple convolution:
```
kernel G = delta(t)
SS' = ((-SS / tau) + S_jump / N * convolve(G, spikes)) / s
```
Note: the /s is because the equations in the paper are without units.
Is this right? As the kernel, I also tried exp(-t/tau) with the same tau as the one used for SS, but then it seemed to take forever to compute, so I kept the delta. However, if the spikes themselves are Dirac deltas, I do not completely understand the meaning of this convolution. What is the correct way to receive the spikes from the other neurons?
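For intuition, here is what the delta-kernel convolution amounts to, written out as a plain Euler step (illustrative only, not the code NESTML generates): with kernel G = delta(t), convolve(G, spikes) contributes an instantaneous jump at each spike arrival, proportional to the spike's weight, while between spikes SS just decays with time constant tau.

```python
def step_gating(ss, weight_sum, tau, s_jump, n, dt):
    """One update of the gating variable SS.

    weight_sum: summed weights of spikes arriving in this time step.
    Decay follows SS' = -SS/tau; each unit of spike weight adds S_jump/N.
    """
    ss = ss + dt * (-ss / tau)            # continuous decay between spikes
    ss = ss + (s_jump / n) * weight_sum   # instantaneous per-spike jump
    return ss
```

So the convolution with a delta kernel is just "add S_jump/N times the incoming weight whenever a spike arrives", which matches the jump condition in the mean-field paper; an exp(-t/tau) kernel would instead introduce a second, filtered state variable, which also explains the slowdown you observed.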
thanks!
gus.-
Dear NEST Users & Developers!
I would like to invite you to our next fortnightly Open NEST Developer Video Conference, today
Monday May 8, 11.30-12.30 CEST (UTC+2).
One point on the agenda is the recent burst of activity on refactoring of the test suite.
Also, as usual, in the Project team round a contact person from each team will give a short statement summarizing ongoing work in the team and cross-cutting points that need discussion among the teams. In the remainder of the meeting we go into a more in-depth discussion of topics that came up on the mailing list or that are suggested by the teams.
Feel free to join the meeting even if it's just to bring your own quick questions for direct discussion in the in-depth section.
Agenda
* Welcome
* Review of NEST User Mailing List
* Project team round
* In-depth discussion
* Test suite refactoring
The agenda for this meeting is also available online, see
https://github.com/nest/nest-simulator/wiki/2023-05-08-Open-NEST-Developer-…
Looking forward to seeing you!
Cheers,
Dennis Terhorst
Dear all,
The abstract submission deadline for the NEST Conference 2023 has been extended to 21 May.
We are looking forward to your contributions!
The NEST Conference provides an opportunity for the NEST Community to meet, exchange success stories, swap advice, and learn about current developments in and around NEST spiking network simulation and its applications.
This year's conference will again take place as a virtual conference on Thursday/Friday 15/16 June 2023.
We are inviting contributions to the conference, including talks, "posters" and workshops on specific topics.
For more information on how to submit your contribution, register and participate, please visit the conference website
https://nest-simulator.org/conference
Important dates
21 May 2023 — Deadline for submission of contributions
26 May 2023 — Notification of acceptance
5 June 2023 — Registration deadline
15 June 2023 — NEST Conference 2023 starts
We are looking forward to seeing you all in June!
Hans Ekkehard Plesser and the conference organizing committee
--
Prof. Dr. Hans Ekkehard Plesser
Head, Department of Data Science
Faculty of Science and Technology
Norwegian University of Life Sciences
PO Box 5003, 1432 Aas, Norway
Phone +47 6723 1560
Email hans.ekkehard.plesser@nmbu.no
Home http://arken.nmbu.no/~plesser
Hi all,
I am currently using NEST to build a Liquid State Machine (LSM) as a preprocessing step for my work. I am setting the amplitude_times and amplitude_values of multiple step_current_generators as input, which is read from .npy files as batches of size 32.
However, the issue is that the first dimension of the input data, which is 32 in this example, is not directly handled as the batch size in NEST. To work around this issue, I have implemented the following approach:
I first get the shape of the input and determine the size of the first dimension, which in this case is 32.
I then set up a for-loop that repeats 32 times.
Within each repetition of the for-loop, I set the parameters of the step current generators as input, and then simulate for some time using the Simulate function (e.g. nest.Simulate(3000)).
While this approach works, it is not ideal as it requires me to simulate each data sample individually. For example, if I have 100 batches, each with 32 data samples, I would need to simulate 100 x 32 x 3000ms to finish processing all the data samples.
Therefore, I am wondering if there is a way to handle the input data directly in batches, as is typically done with tools like PyTorch.
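One way to avoid a Simulate call per sample is to shift each sample's amplitude_times into its own time slot and run the whole batch in a single long simulation. A pure-Python sketch (helper name is illustrative); the caveat is that neuron state then carries over between samples unless you leave a quiet gap or reset between slots:

```python
def concat_batch(times_per_sample, values_per_sample, t_sample):
    """Concatenate per-sample step-current schedules into one schedule,
    forcing the current back to 0.0 at the end of each sample's slot.
    Assumes every time lies strictly inside (0, t_sample)."""
    all_t, all_v = [], []
    for i, (ts, vs) in enumerate(zip(times_per_sample, values_per_sample)):
        offset = i * t_sample
        all_t.extend(offset + t for t in ts)
        all_v.extend(vs)
        all_t.append(offset + t_sample)  # zero the current at the slot end
        all_v.append(0.0)
    return all_t, all_v

# With NEST (illustrative), for a batch of 32 samples of 3000 ms each:
# t, v = concat_batch(batch_times, batch_values, 3000.0)
# gen.set(amplitude_times=t, amplitude_values=v)
# nest.Simulate(32 * 3000.0)
```

This reduces the Python-side overhead of 100 × 32 separate Simulate calls to 100 (or even 1), though the total simulated biological time is of course unchanged.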
Thank you very much for your assistance!
Best,
Yin
Hi all,
I’m relatively new to NEST as well and would like to provide custom input to neurons over the course of the simulation. I’m trying to model a ring-attractor network that accepts continuously changing input (different neurons receive input over the course of the simulation). I specifically want to do this to update the neural inputs as I simulate animal movement in space (see https://www.pnas.org/doi/10.1073/pnas.2102157118 <https://www.pnas.org/doi/10.1073/pnas.2102157118> for details). Is there a way I can continuously change the neurons in the network that receive external input?
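A sketch of the usual chunked approach: simulate in short chunks and retarget the external drive between chunks. The helper below only builds the per-chunk amplitudes; active_fn is a user-supplied callback (illustrative name) mapping a chunk index to the set of neuron indices that should receive input during that chunk:

```python
def chunk_amplitudes(active_fn, n_neurons, n_chunks, amp_on):
    """Per-chunk list of DC amplitudes: amp_on for active neurons, else 0."""
    return [[amp_on if i in active_fn(k) else 0.0
             for i in range(n_neurons)]
            for k in range(n_chunks)]

# With NEST (illustrative):
# gens = nest.Create("dc_generator", n_neurons)
# nest.Connect(gens, ring, "one_to_one")
# for amps in chunk_amplitudes(position_to_neurons, N, n_chunks, 300.0):
#     for g, a in zip(gens, amps):
#         g.amplitude = a          # retarget the drive between chunks
#     nest.Simulate(chunk_ms)
```

Because the input depends on simulated movement, active_fn can read your animal-position model at each chunk boundary, which makes the input effectively closed-loop at the resolution of chunk_ms.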
Thank You.
Vivek