Dear NEST users,
We are going to buy some nodes for the departmental HPC cluster, and the current plan is oriented towards nodes with AMD processors (e.g., AMD EPYC 7413 and AMD EPYC 9654). In past years we have always used Intel-based nodes (e.g., Xeon Gold) on our HPC systems. I just wanted to check whether there are possible issues in using these nodes, or whether others have already been using multi-CPU AMD nodes successfully.
Thank you and have a great weekend!
Best,
Alberto
--
Alberto Antonietti, Ph.D.
Assistant Professor
Nearlab - NeuroEngineering And medical Robotics Laboratory
Department of Electronics, Information and Bioengineering
Politecnico di Milano
http://www.nearlab.polimi.it/
Dear NEST Users & Developers!
I would like to invite you to our next fortnightly Open NEST Developer Video Conference, today
Monday May 22, 11.30-12.30 CEST (UTC+2).
One point of discussion will be the required preparations for the next hackathon (overflow of last meeting).
As usual, in the Project team round, a contact person from each team will give a short statement summarizing ongoing work in the team and cross-cutting points that need discussion among the teams. In the remainder of the meeting, we will go into a more in-depth discussion of topics that came up on the mailing list or that are suggested by the teams.
Feel free to join the meeting even if it's just to bring your own quick questions for direct discussion in the in-depth section.
Agenda
* Welcome
* Review of NEST User Mailing List
* Project team round
* In-depth discussion
The agenda for this meeting is also available online, see
https://github.com/nest/nest-simulator/wiki/2023-05-22-Open-NEST-Developer-…
Looking forward to seeing you!
Cheers,
Dennis Terhorst
------------------
Log-in information
------------------
We use a virtual conference room provided by DFN (Deutsches Forschungsnetz).
You can use the web client to connect. However, we encourage everyone to
use a headset for better audio quality, or even a proper video
conferencing system or software (see below) when available.
Web client
* Visit https://conf.dfn.de/webapp/conference/97938800
* Enter your name and allow your browser to use camera and microphone
* The conference does not need a PIN to join, just click join and you're in.
In case you see a dfnconf logo and the phrase "Auf den
Meetingveranstalter warten", just be patient, the meeting host needs to
join first (a voice will tell you).
VC system/software
How to log in with a video conferencing system depends on your VC system
or software.
- Using the H.323 protocol (e.g. Polycom): vc.dfn.net##97938800 or
194.95.240.2##97938800
- Using the SIP protocol: 97938800@vc.dfn.de
- By telephone: +49-30-200-97938800
For those who do not have a video conference system or suitable
software, Polycom provides a pretty good free app for iOS and Android,
so you can join from your tablet (Polycom RealPresence Mobile, available
from AppStore/PlayStore). Note that firewalls may interfere with
videoconferencing in various and sometimes confusing ways.
For more technical information on logging in from various VC systems,
please see http://vcc.zih.tu-dresden.de/index.php?linkid=1.1.3.4
Hi,
I am stuck here: I would like to stimulate, every delta_t, a random subset of neurons in a population.
Should I create, for a population of size N, N spike generators and connect them one-to-one to the N neurons, each generator provided with its own random spike time series?
Or is there a cleverer way to do it?
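For what it's worth, a minimal sketch of the N-generator approach I have in mind (assuming NEST 3.x; the model name, the 20% subset size, and all times below are made-up values for illustration):

```python
import numpy as np

def random_subset_spike_times(n_neurons, t_sim, delta_t, frac, seed=42):
    """For every interval of length delta_t, pick a random subset
    (a fraction `frac` of the population) and schedule one spike per
    chosen neuron at the start of that interval."""
    rng = np.random.default_rng(seed)
    times = [[] for _ in range(n_neurons)]
    n_pick = max(1, int(frac * n_neurons))
    t = delta_t  # NEST spike times must be strictly positive
    while t < t_sim:
        for idx in rng.choice(n_neurons, size=n_pick, replace=False):
            times[idx].append(round(t, 1))
        t += delta_t
    return times

# Wiring it up in NEST (sketch; one generator per neuron, one-to-one):
#   import nest
#   N = 100
#   pop = nest.Create("iaf_psc_alpha", N)
#   gens = nest.Create("spike_generator", N)
#   for g, st in zip(gens, random_subset_spike_times(N, 1000.0, 10.0, 0.2)):
#       g.spike_times = st
#   nest.Connect(gens, pop, "one_to_one")
```

This keeps everything in a single Simulate() call; whether it beats stepping the simulation and rewiring each delta_t depends on how the subsets must be drawn.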
Thanks for any help!
Best,
Adrien
Dear friends,
Thanks to all your help, I've successfully reproduced the behavior of the modified Izhikevich neuron described in the paper:
Chen, L., Campbell, S.A. Exact mean-field models for spiking neural networks with adaptation. J Comput Neurosci 50, 445–469 (2022). https://doi.org/10.1007/s10827-022-00825-9
But when I try to connect a population of neurons among themselves, I have problems. In particular, imagine I create 10 neurons with the following Python code:
neurons = nest.Create("izhikevich_ODE", 10)
I found in some of the examples that the neurons do not come connected by default, and that to connect them all-to-all I had to do the following:
nest.Connect(neurons, neurons, 'all_to_all')
Is that right?
Then, in the NESTML model, how do I receive the spikes into the gating variable? If I am not wrong, I have to set the input as
input:
spikes real <- spike
and then add them to the synaptic gating variable as a simple convolution:
kernel G = delta(t)
SS' = ((-SS/tau) + S_jump/N * convolve(G, spikes)) /s
Note: the /s is because the equations in the paper are without units.
Is this right? As kernel, I also tried exp(-t/tau), with the same tau as the one used for SS, but then it seemed to take forever to compute, so I left the delta... However, if the spikes themselves are Dirac deltas, I do not completely understand the meaning of this convolution... What is the correct way to get the spikes from the other neurons?
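Regarding the meaning of that convolution: as far as I understand (a sketch of the math, not an authoritative answer), convolving with a delta kernel just reproduces the incoming spike train, so the gating variable obeys

```latex
% With kernel G = \delta(t), convolve(G, spikes) is the spike train itself:
\frac{dS}{dt} = -\frac{S}{\tau}
  + \frac{S_{\mathrm{jump}}}{N} \sum_{j} \sum_{k} w_j \,\delta\!\left(t - t_j^{(k)}\right)
```

i.e., S decays exponentially with time constant tau and jumps by w_j * S_jump / N at each incoming spike time t_j^{(k)}. An exp(-t/tau) kernel would instead spread each increment out in time, which may also explain the slower simulation observed with it.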
thanks!
gus.-
Dear NEST Users & Developers!
I would like to invite you to our next fortnightly Open NEST Developer Video Conference, today
Monday May 8, 11.30-12.30 CEST (UTC+2).
One point on the agenda is the recent burst of activity on refactoring of the test suite.
As usual, in the Project team round, a contact person from each team will give a short statement summarizing ongoing work in the team and cross-cutting points that need discussion among the teams. In the remainder of the meeting, we will go into a more in-depth discussion of topics that came up on the mailing list or that are suggested by the teams.
Feel free to join the meeting even if it's just to bring your own quick questions for direct discussion in the in-depth section.
Agenda
* Welcome
* Review of NEST User Mailing List
* Project team round
* In-depth discussion
* Test suite refactoring
The agenda for this meeting is also available online, see
https://github.com/nest/nest-simulator/wiki/2023-05-08-Open-NEST-Developer-…
Looking forward to seeing you!
Cheers,
Dennis Terhorst
Dear all,
The abstract submission deadline for the NEST Conference 2023 has been extended to 21 May.
We are looking forward to your contributions!
The NEST Conference provides an opportunity for the NEST Community to meet, exchange success stories, swap advice, and learn about current developments in and around NEST spiking network simulation and its applications.
This year's conference will again take place as a virtual conference on Thursday/Friday 15/16 June 2023.
We are inviting contributions to the conference, including talks, "posters" and workshops on specific topics.
For more information on how to submit your contribution, register and participate, please visit the conference website
https://nest-simulator.org/conference
Important dates
21 May 2023 — Deadline for submission of contributions
26 May 2023 — Notification of acceptance
5 June 2023 — Registration deadline
15 June 2023 — NEST Conference 2023 starts
We are looking forward to seeing you all in June!
Hans Ekkehard Plesser and the conference organizing committee
--
Prof. Dr. Hans Ekkehard Plesser
Head, Department of Data Science
Faculty of Science and Technology
Norwegian University of Life Sciences
PO Box 5003, 1432 Aas, Norway
Phone +47 6723 1560
Email hans.ekkehard.plesser@nmbu.no
Home http://arken.nmbu.no/~plesser
Hi all,
I am currently using NEST to build a Liquid State Machine (LSM) as a preprocessing step for my work. I am setting the amplitude_times and amplitude_values of multiple step_current_generators as input; the data are read from .npy files in batches of size 32.
However, the issue is that the first dimension of the input data, which is 32 in this example, is not directly handled as the batch size in NEST. To work around this issue, I have implemented the following approach:
I first get the shape of the input and determine the size of the first dimension, which in this case is 32.
I then set up a for-loop that repeats 32 times.
Within each repetition of the for-loop, I set the parameters of the step current generators as input, and then simulate for some time using the "Simulate" function (e.g. using "Simulate(3000)").
While this approach works, it is not ideal, as it requires me to simulate each data sample individually. For example, if I have 100 batches, each with 32 data samples, I would need to simulate 100 x 32 x 3000 ms to finish processing all the data samples.
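One workaround I have considered (a sketch; `scg` and the 3000 ms sample length are assumed names/values): shift the stimulus times of sample i by i * 3000 ms and concatenate, so a single long Simulate() call sweeps through the whole batch back-to-back. Note that network state carries over between samples unless you let it decay or reset it.

```python
import numpy as np

def concat_batch(batch_times, batch_values, t_sample):
    """Concatenate a batch of stimuli along the time axis: the times of
    sample i are shifted by i * t_sample, so one long Simulate() call
    covers the whole batch back-to-back."""
    all_t, all_v = [], []
    for i, (t, v) in enumerate(zip(batch_times, batch_values)):
        all_t.append(np.asarray(t, dtype=float) + i * t_sample)
        all_v.append(np.asarray(v, dtype=float))
    return np.concatenate(all_t), np.concatenate(all_v)

# Sketch of use with a step_current_generator `scg` (assumed name):
#   t, v = concat_batch(batch_times, batch_values, 3000.0)
#   scg.set(amplitude_times=t.tolist(), amplitude_values=v.tolist())
#   nest.Simulate(len(batch_times) * 3000.0)  # one call per batch of 32
```

This does not give true parallel batching as in PyTorch, but it replaces 32 Simulate() calls with one.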
Therefore, I am wondering if there is a way to handle the input data directly in batches, as is typically done with tools like PyTorch.
Thank you very much for your assistance!
Best,
Yin
Hi all,
I’m relatively new to NEST as well and would like to provide custom input to neurons over the course of the simulation. I’m trying to model a ring-attractor network that accepts continuously changing input (different neurons receive input over the course of the simulation). I specifically want to do this to update the neural inputs as I simulate animal movement in space (see https://www.pnas.org/doi/10.1073/pnas.2102157118 for details). Is there a way I can continuously change the neurons in the network that receive external input?
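One pattern that might fit (a sketch under assumptions: NEST 3.x, one dc_generator per ring neuron; `trajectory`, the bump width and the amplitudes are made-up names/values): compute a ring-shaped input profile for each time step and update the generators between short nest.Run() calls.

```python
import numpy as np

def ring_input_amplitudes(n_neurons, center, width, peak):
    """Gaussian bump of input current on a ring of neurons,
    centred on index `center`, using circular distance."""
    idx = np.arange(n_neurons)
    d = np.minimum(np.abs(idx - center), n_neurons - np.abs(idx - center))
    return peak * np.exp(-0.5 * (d / width) ** 2)

# Sketch of the NEST side (update amplitudes between short Run() calls
# as the simulated animal moves):
#   import nest
#   gens = nest.Create("dc_generator", N)
#   nest.Connect(gens, ring_population, "one_to_one")
#   with nest.RunManager():
#       for center in trajectory:          # `trajectory`: made-up name
#           amps = ring_input_amplitudes(N, center, 3.0, 100.0)
#           gens.amplitude = amps.tolist()
#           nest.Run(10.0)                 # advance 10 ms per update
```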
Thank You.
Vivek
Dear NEST Developers and Users,
Those of you following the NEST Github repository will have noticed a high level of activity last week and an all new appearance of Github actions for NEST (see https://github.com/nest/nest-simulator/actions/runs/4858443145 for an example).
First of all, thanks to a major effort by Dennis Terhorst in particular, we have a completely re-organized continuous integration test setup. A considerably larger set of static code checks is now run, independently of each other, in a first stage of testing before NEST is built on Linux and macOS runners for the actual testing. Furthermore, the documentation is also linted and test-built. Overall, this setup runs faster and provides more information and assurances.
Second, we have started a major effort to port all tests from the SLI-based testsuite to Python using pytest, so that we will eventually be able to remove the SLI interpreter and still have a full set of tests for NEST. Over time, the number of tests in `testsuite/{unittests, regressiontests, mpitests}` will decrease as tests are moved to `testsuite/pytests`. In that directory, new tests are distributed into several subdirectories according to the part of NEST they cover. Those directories are all prefixed with `sli2py_` for now; the prefix will be dropped when the transition is complete.
We have tried to come up with good pytest test designs, but we are still developing our skills and style in this area. In general, consider the tests in the sli2py_ directories as the reference. Tests in the main pytests directory and its other subdirectories are mostly written in the older unittest style and have not undergone thorough review. We plan to bring them up to date later.
As a side effect of this transition, the number of tests reported by the testsuite has increased significantly. This has two main reasons: many *.sli test files contained multiple tests but were counted only as one, and pytest's parametrize support allows us to run tests more systematically across models.
If you would like to join our effort in porting tests from SLI to Python, why not drop by the Open NEST Developer VC on Monday?
Best,
Hans Ekkehard
--
Prof. Dr. Hans Ekkehard Plesser
Head, Department of Data Science
Faculty of Science and Technology
Norwegian University of Life Sciences
PO Box 5003, 1432 Aas, Norway
Phone +47 6723 1560
Email hans.ekkehard.plesser@nmbu.no
Home http://arken.nmbu.no/~plesser
Hi All,
I am using the aeif_cond_alpha neuron model in NEST, where the synaptic dynamics are calculated together with the membrane potential of the neuron. I understand that the synapse uses an alpha function; however, it is unclear what the actual equation for calculating the synaptic conductance is. After looking through the documentation and the C++ code it was still unclear to me. However, I found this journal article, which defines an equation for the synaptic conductance (equation 4): https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0057134. Is this the correct equation? Thanks for any help!
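For reference, my understanding of the alpha-shaped conductance in the *_cond_alpha models (please double-check against the model documentation and source) is that it is normalized so the peak equals the synaptic weight:

```latex
% Alpha-shaped conductance for a single spike arriving at t = 0 (t >= 0):
g(t) = \bar{g} \,\frac{t}{\tau_{\mathrm{syn}}}\, e^{\,1 - t/\tau_{\mathrm{syn}}}
```

Here \bar{g} is set by the synaptic weight (in nS); the conductance rises from zero, peaks at g(\tau_syn) = \bar{g}, and then decays exponentially, and the total conductance is the sum of such kernels over all incoming spikes.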
Best,
Beck