Hi, Charl
Thanks for your reply, and sorry for taking so long to get back to you on this question. My main research interest is biophysical modelling of glutamatergic cortico-striatal projections.
I previously constructed a striatal circuit (containing two neuron populations, MSN and FSI) using a simple integrate-and-fire model, and it successfully reproduced the firing states of striatal MSNs and FSIs. In that case I simplified the glutamatergic cortical input from multiple synapses down to a single synapse.
Since glutamatergic cortico-striatal projections involve both AMPA and NMDA receptors, I then changed the simple integrate-and-fire model into a multi-synapse integrate-and-fire model using NESTML. However, the results did not look good, and they got even worse when I hand-tuned some parameters. I have been confused for a long time, and I suspect my multi-synapse definition might account for this.
I read about the "cm_default" neuron model in the NEST model directory, which you describe as containing AMPA_NMDA receptors. The difference between your definition of this receptor and mine is the NMDA ratio term (the ratio of NMDA to AMPA channels). I thought that might help me resolve my issue, which is why I wrote to ask how to customize the receptor types of the "cm_default" neuron model through NESTML.
So, briefly:
I wonder how I can apply the NMDA ratio in my own multi-synapse integrate-and-fire model, or whether it is possible to extract the AMPA_NMDA receptors from "cm_default".
For reference, the documentation of the "cm_default" neuron model in the NEST model directory states: "For receptors, the choice is AMPA, GABA or NMDA or AMPA_NMDA. Ion channels and receptor types can be customized with NESTML."
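For concreteness, the kind of setup I have in mind could be prototyped without NESTML using the built-in iaf_cond_alpha_multisynapse model, with two receptor ports and the NMDA ratio entering as a weight scaling. This is only a rough sketch with assumed placeholder values (time constants, rate, weights), and it omits the voltage-dependent Mg2+ block that distinguishes real NMDA receptors:

import nest

nest.ResetKernel()

# Two conductance-based receptor ports on one neuron:
# port 1 = fast AMPA-like, port 2 = slow NMDA-like (time constants are assumed placeholders).
nrn = nest.Create("iaf_cond_alpha_multisynapse", params={
    "E_rev": [0.0, 0.0],       # both glutamatergic, reversal potential 0 mV
    "tau_syn": [2.0, 100.0],   # ms; fast vs. slow decay
})

pg = nest.Create("poisson_generator", params={"rate": 800.0})

w_ampa = 1.0        # peak AMPA conductance (nS), assumed value
nmda_ratio = 0.5    # assumed NMDA-to-AMPA ratio

# The same presynaptic source drives both ports; the NMDA ratio enters as a weight scaling.
nest.Connect(pg, nrn, syn_spec={"weight": w_ampa, "receptor_type": 1})
nest.Connect(pg, nrn, syn_spec={"weight": nmda_ratio * w_ampa, "receptor_type": 2})

nest.Simulate(1000.0)

I imagine a NESTML version of the same idea would declare two spike input ports with different kernels and scale the NMDA-like kernel by the ratio in the membrane equation.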
------------------------------------------------------------------
Sender: Charl Linssen <nest-users(a)turingbirds.com>
Sent At: 2022 Dec. 8 (Thu.) 16:56
Recipient: users <users(a)nest-simulator.org>
Subject:[NEST Users] Re: how to customize Ion channels and receptor types of "cm_default" neuron model through nestml
Dear Zirui,
Thank you for writing in. Just to double-check: the compartmental ("cm_default") model in NEST is intended for use in morphologically detailed models containing dozens or hundreds of compartments. If your model can be described by only a few compartments, it could be easier to define these manually (say, a separate somatic membrane potential and a dendritic potential, plus the coupling between them; see https://nestml.readthedocs.io/en/v5.1.0/tutorials/active_dendrite/nestml_ac… for an example).
If you do need the full morphological complexity in your simulations, then the cm_default model is indeed the way to go. We are currently working on extending NESTML so that the biophysics (ion channels, membrane dynamics, etc.) can be defined in NESTML and then combined with a morphology in NEST. For a prototype of this functionality, please see the branch in https://github.com/nest/nestml/pull/772. The current effort there is focused on adding the ability to specify different ion channels in a more flexible manner.
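For basic (non-customized) use of cm_default from PyNEST, a minimal sketch along the lines of the model's documentation could look like the following. The compartment parameters and the weight are placeholder values, and the dictionary keys shown (parent_idx, comp_idx, g_C) should be checked against the current cm_default documentation:

import nest

nest.ResetKernel()

cm = nest.Create("cm_default")

# Soma (root compartment, parent_idx = -1) plus one dendritic compartment coupled to it.
cm.compartments = [
    {"parent_idx": -1, "params": {"e_L": -70.0}},
    {"parent_idx": 0,  "params": {"e_L": -70.0, "g_C": 0.02}},
]

# One glutamatergic (AMPA_NMDA) receptor placed on the dendritic compartment.
cm.receptors = [
    {"comp_idx": 1, "receptor_type": "AMPA_NMDA"},
]

# Spikes are routed to a receptor via its index in the receptors list (here 0).
sg = nest.Create("spike_generator", params={"spike_times": [10.0, 20.0, 30.0]})
nest.Connect(sg, cm, syn_spec={"synapse_model": "static_synapse",
                               "weight": 0.1, "receptor_type": 0})

nest.Simulate(100.0)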
If you could describe a little bit more about your use case and what exactly you are trying to achieve, we could probably give you some more specific advice.
Cheers,
Charl
On Mon, Dec 5, 2022, at 05:54, 王梓瑞 wrote:
Dear nest community,
I wonder if any of you know how to customize the ion channels and receptor types of the "cm_default" neuron model through NESTML.
I noticed some information under "Extending NESTML - Running NESTML with custom templates", but I still found myself confused due to my limited understanding.
Is there any way to acquire the .nestml file of the cm_default neuron model?
Thank you.
Best,
Zirui
_______________________________________________
NEST Users mailing list -- users(a)nest-simulator.org
To unsubscribe send an email to users-leave(a)nest-simulator.org
Hi,
I am using Ubuntu and just updated to NEST 3.4; however, I get the following error:
Feb 26 15:53:59 SLIStartup [Fatal]:
SLI initialisation file not found at
/build/nest-xF9H0B/nest-3.4/debian/nest/usr/share/nest/sli/sli-init.sli.
Please check your NEST installation.
Moreover, is there a simple way to roll back and install NEST 3.3, for example with: sudo apt install nest=3.3?
Thanks,
Inton
Hello,
For the code below, I get a Cython error related to __cinit__ when performing a deepcopy of a dictionary that includes a nest.spatial specification.
I am using NEST 3.3 and Python 3.8.12.
Best,
Xavier
----
import nest
import copy
my_dict = {'rule': 'pairwise_bernoulli',
           'p': nest.spatial_distributions.gaussian(nest.spatial.distance, std=1.0),
           'mask': {'circular': {'radius': 4.}}}
new_dict = copy.deepcopy(my_dict)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib64/python3.8/copy.py", line 146, in deepcopy
y = copier(x, memo)
File "/usr/lib64/python3.8/copy.py", line 230, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "/usr/lib64/python3.8/copy.py", line 172, in deepcopy
y = _reconstruct(x, memo, *rv)
File "/usr/lib64/python3.8/copy.py", line 270, in _reconstruct
state = deepcopy(state, memo)
File "/usr/lib64/python3.8/copy.py", line 146, in deepcopy
y = copier(x, memo)
File "/usr/lib64/python3.8/copy.py", line 230, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "/usr/lib64/python3.8/copy.py", line 161, in deepcopy
rv = reductor(4)
File "stringsource", line 2, in pynestkernel.SLIDatum.__reduce_cython__
TypeError: no default __reduce__ due to non-trivial __cinit__
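A workaround I am considering, assuming the goal is just to obtain independent copies of the connection specification, is to rebuild the dictionary from a small helper function instead of deepcopying the NEST parameter objects (which appear to wrap SLI datums that cannot be deepcopied). The helper name is my own and this is only a sketch, not an official fix:

import nest

def make_conn_spec():
    # Re-create the spec (including the nest.spatial parameter object) on every call,
    # so no deepcopy of a SLIDatum-backed object is needed.
    return {'rule': 'pairwise_bernoulli',
            'p': nest.spatial_distributions.gaussian(nest.spatial.distance, std=1.0),
            'mask': {'circular': {'radius': 4.}}}

spec_a = make_conn_spec()
spec_b = make_conn_spec()   # independent copy without copy.deepcopy()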
Hello everyone,
we are pleased to announce that NESTML and NEST Desktop will be
represented at the satellite training sessions
(https://flagship.kip.uni-heidelberg.de/jss/HBPm?mI=252&m=showAgenda&showAbs…)
as part of the HBP Summit 2023 in Marseille! These sessions will take
place on *27 March 2023*.
Participation is free of charge, but registration is mandatory. Please
note that only a limited number of participants can register - at the
moment there are still places available! The registration page is
https://flagship.kip.uni-heidelberg.de/jss/HBPm?meetingID=252. You might
need to create a free EBRAINS account for that, if you do not have one yet.
The abstract of the workshop can be found below, as well as on the
conference page. We also have a dedicated page for this tutorial with
detailed information on the topics covered
(https://clinssen.github.io/HBP-summit-2023-workshop/). Catering will be
provided on site.
We are looking forward to meeting you at the workshop!
On behalf of the tutorial organizers,
Jens Bruchertseifer
PS: Please also have a look at the other exciting topics at the training
session! ;)
-----
License to Spike - A NEST Desktop and NESTML Workshop
NEST is an established, open-source simulator for spiking neuronal
networks, which can capture a high degree of detail of biological
network structures while retaining high performance and scalability from
laptops to HPC [1]. This tutorial provides hands-on experience in
building and simulating neuron, synapse, and network models. It
introduces several tools and front-ends to implement modeling ideas most
efficiently. Participants do not have to install software as all tools
can be accessed via the cloud.
First, we look at NEST Desktop [2], a web-based graphical user interface
(GUI), which allows the exploration of essential concepts in
computational neuroscience without the need to learn a programming
language. This advances both the quality and speed of teaching in
computational neuroscience. To get acquainted with the GUI, we will
create and analyze a balanced two-population network.
In the second half of the session, we will create a new, custom neuron
model that extends the capabilities of NEST Simulator by introducing new
mechanisms, such as an active spiking dendritic compartment. NESTML [3]
makes it quick and easy to implement and simulate model variants.
A neuronal plasticity rule is then introduced, which allows a network to
be trained by means of reinforcement learning. This is accomplished by
combining a typical spike-timing dependent plasticity learning rule
with a global neuromodulatory dopamine signal. We will use the new
learning rule to train a stimulus preference in the balanced network.
Citations
[1] https://nest-simulator.readthedocs.org/
[2] https://nest-desktop.readthedocs.org/
[3] https://nestml.readthedocs.org/
Dear Colleagues,
The NEST Initiative is excited to invite everyone interested in Neural Simulation Technology and the NEST Simulator to the NEST Conference 2023. The NEST Conference provides an opportunity for the NEST Community to meet, exchange success stories, swap advice, and learn about current developments in and around NEST spiking network simulation and its applications. We particularly encourage young scientists to participate in the conference!
This year's conference will take place as a virtual event on 15-16 June 2023.
Register now!
For more information please visit the conference website
https://nest-simulator.org/conference
We are looking forward to seeing you all in June!
Hans Ekkehard Plesser and colleagues
--
Prof. Dr. Hans Ekkehard Plesser
Head, Department of Data Science
Faculty of Science and Technology
Norwegian University of Life Sciences
PO Box 5003, 1432 Aas, Norway
Phone +47 6723 1560
Email hans.ekkehard.plesser(a)nmbu.no
Home http://arken.nmbu.no/~plesser
Dear NEST Users & Developers!
I would like to invite you to our next fortnightly Open NEST Developer Video Conference, today
Monday February 27, 11.30-12.30 CET (UTC+1).
* The special topic for today's in-depth discussion will be an introduction to "NEAT", a Neural Analysis Toolkit, by @WillemWybo.
* Additionally, @heplesser will shed more light on performance optimizations described in #2617: Integrating "deliver events first".
Feel free to join the meeting even if it is just to bring your own quick questions for direct discussion in the in-depth section.
As usual, in the Project team round, a contact person from each team will give a short statement summarizing ongoing work in the team and cross-cutting points that need discussion among the teams. The remainder of the meeting will be devoted to a more in-depth discussion of topics that came up on the mailing list or that are suggested by the teams.
Agenda
* Welcome
* Review of NEST User Mailing List
* Project team round
* In-depth discussion
The agenda for this meeting is also available online, see
https://github.com/nest/nest-simulator/wiki/2023-02-27-Open-NEST-Developer-…
Looking forward to seeing you!
Cheers,
Dennis Terhorst
------------------
Log-in information
------------------
We use a virtual conference room provided by DFN (Deutsches Forschungsnetz).
You can use the web client to connect. However, we encourage everyone to
use a headset for better audio quality, or even a proper video
conferencing system (see below) or software when available.
Web client
* Visit https://conf.dfn.de/webapp/conference/97938800
* Enter your name and allow your browser to use camera and microphone
* The conference does not need a PIN to join, just click join and you're in.
In case you see a dfnconf logo and the phrase "Auf den
Meetingveranstalter warten", just be patient, the meeting host needs to
join first (a voice will tell you).
VC system/software
How to log in with a video conferencing system depends on your VC system
or software.
- Using the H.323 protocol (e.g. Polycom): vc.dfn.net##97938800 or
194.95.240.2##97938800
- Using the SIP protocol: 97938800@vc.dfn.de
- By telephone: +49-30-200-97938800
For those who do not have a video conference system or suitable
software, Polycom provides a pretty good free app for iOS and Android,
so you can join from your tablet (Polycom RealPresence Mobile, available
from AppStore/PlayStore). Note that firewalls may interfere with
videoconferencing in various and sometimes confusing ways.
For more technical information on logging in from various VC systems,
please see http://vcc.zih.tu-dresden.de/index.php?linkid=1.1.3.4
Hello,
In my model I am using STDP synapse connections. When running simulations with nest.Simulate(), everything seems to work fine. When using nest.Run() inside a "with nest.RunManager():" block, execution is *MUCH* faster, but the STDP connections are not updated. Is this the correct behaviour?
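For reference, this is a minimal sketch of the two call patterns I am comparing (the network construction with STDP synapses is omitted; the 100 ms chunk size and the number of chunks are just arbitrary examples):

import nest

# ... build network with STDP synapse connections here ...

# Pattern A: one monolithic call
nest.Simulate(1000.0)

# Pattern B: chunked simulation; Prepare/Cleanup are handled by the RunManager context
with nest.RunManager():
    for _ in range(10):
        nest.Run(100.0)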
Thanks a lot in advance,
Xavier
Dear nest community,
Does anyone know how to record the local field potential from the network? So far I have only found the spike and weight recorders in NEST.
Thank you.
Best,
Zirui
Dear NEST Users,
I would like to "record" or read the calcium concentration of the neurons
(let's say aeif_cond_exp). At the moment the Ca variable is 0 both before and after I
simulate, even though the neuron spikes.
Is there any neuron model that supports this, or do I need to edit the
NESTML of the desired model to achieve this?
If so, a sample code snippet would be helpful.
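For reference, what I would like to do is roughly the following. This is only a sketch: the beta_Ca and tau_Ca values, the input rate, and the weight are placeholders, and I am assuming that the spike-driven calcium trace of NEST's archiving nodes (readable as the "Ca" property) is the relevant quantity:

import nest

nest.ResetKernel()

# beta_Ca: increment of the Ca trace per emitted spike; tau_Ca: its decay time constant (ms).
nrn = nest.Create("aeif_cond_exp", params={"beta_Ca": 0.001, "tau_Ca": 10000.0})
pg = nest.Create("poisson_generator", params={"rate": 50000.0})
nest.Connect(pg, nrn, syn_spec={"weight": 5.0})

print("Ca before:", nrn.get("Ca"))
nest.Simulate(1000.0)
print("Ca after:", nrn.get("Ca"))   # stays at 0 only if the neuron never spiked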
--
Thanks and Regards
*Maryada*
Dear NEST Users & Developers!
I would like to invite you to our next fortnightly Open NEST Developer Video Conference, today
Monday February 13, 11.30-12.30 CET (UTC+1).
Feel free to join the meeting even if it is just to bring your own quick questions for direct discussion in the in-depth section.
As usual, in the Project team round, a contact person from each team will give a short statement summarizing ongoing work in the team and cross-cutting points that need discussion among the teams. The remainder of the meeting will be devoted to a more in-depth discussion of topics that came up on the mailing list or that are suggested by the teams.
Agenda
* Welcome
* Review of NEST User Mailing List
* Project team round
* In-depth discussion
The agenda for this meeting is also available online, see
https://github.com/nest/nest-simulator/wiki/2023-02-13-Open-NEST-Developer-…
Looking forward to seeing you!
Cheers,
Dennis Terhorst
------------------
Log-in information
------------------
We use a virtual conference room provided by DFN (Deutsches Forschungsnetz).
You can use the web client to connect. However, we encourage everyone to
use a headset for better audio quality, or even a proper video
conferencing system (see below) or software when available.
Web client
* Visit https://conf.dfn.de/webapp/conference/97938800
* Enter your name and allow your browser to use camera and microphone
* The conference does not need a PIN to join, just click join and you're in.
In case you see a dfnconf logo and the phrase "Auf den
Meetingveranstalter warten", just be patient, the meeting host needs to
join first (a voice will tell you).
VC system/software
How to log in with a video conferencing system depends on your VC system
or software.
- Using the H.323 protocol (e.g. Polycom): vc.dfn.net##97938800 or
194.95.240.2##97938800
- Using the SIP protocol: 97938800@vc.dfn.de
- By telephone: +49-30-200-97938800
For those who do not have a video conference system or suitable
software, Polycom provides a pretty good free app for iOS and Android,
so you can join from your tablet (Polycom RealPresence Mobile, available
from AppStore/PlayStore). Note that firewalls may interfere with
videoconferencing in various and sometimes confusing ways.
For more technical information on logging in from various VC systems,
please see http://vcc.zih.tu-dresden.de/index.php?linkid=1.1.3.4