Hi!
I just have some minor additions to what Hans said.
You mention that you collect senders, times, sources and targets. What do you mean by
senders vs sources? And wouldn’t it suffice to just record the senders and times (using
the NEST spike_detector)? If you then have a one-time dump of the connection information,
you could infer the targets for each spike during analysis.
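To make that concrete, the join between a static connection table and the recorded (sender, time) pairs could look like the sketch below. This is plain Python with hypothetical data; in NEST the connection table could come from nest.GetConnections() and the spikes from the recorder's events dictionary.

```python
from collections import defaultdict

def infer_spike_targets(spikes, connections):
    """Expand recorded (sender, time) pairs into (sender, target, time)
    triples using a static table of (source, target) connection pairs."""
    targets_of = defaultdict(list)
    for source, target in connections:
        targets_of[source].append(target)
    return [(sender, target, t)
            for sender, t in spikes
            for target in targets_of[sender]]

# Hypothetical data: a tiny static network and two recorded spikes.
connections = [(1, 2), (1, 3), (2, 3)]
spikes = [(1, 10.0), (2, 12.5)]
print(infer_spike_targets(spikes, connections))
# -> [(1, 2, 10.0), (1, 3, 10.0), (2, 3, 12.5)]
```

Since the connectivity is dumped only once, this removes the need to store targets per spike at all.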
The spike_detector is called spike_recorder in the current master branch. For highly
parallel simulations, current master (NEST 3) also allows you to use SIONlib for
efficient parallel recording.
The documentation for this is
here<https://nest-simulator.readthedocs.io/en/latest/guides/recording_from_simulations.html#store-data-to-an-efficient-binary-format>.
Some revisions to it are currently under review in
#1806<https://github.com/nest/nest-simulator/pull/1806>.
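Switching to that backend is essentially a one-line change when creating the recorder. A minimal sketch, assuming a NEST 3 build with SIONlib support (see the linked guide for the available backend parameters):

```python
import nest

# Record to the SIONlib backend instead of per-rank ASCII files:
# all ranks write into a single binary .sion container per job.
sr = nest.Create("spike_recorder", params={"record_to": "sionlib"})
```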
Cheers,
Jochen!
--
Prof. Dr. Hans Ekkehard Plesser
Head, Department of Data Science
Faculty of Science and Technology
Norwegian University of Life Sciences
PO Box 5003, 1432 Aas, Norway
Phone +47 6723 1560
Email hans.ekkehard.plesser@nmbu.no<mailto:hans.ekkehard.plesser@nmbu.no>
Home
http://arken.nmbu.no/~plesser
On 02/11/2020, 18:35, "swathi anil"
<swathi.anil@anat.uni-freiburg.de<mailto:swathi.anil@anat.uni-freiburg.de>>
wrote:
Hello all,
I use NEST with OpenMPI on a high performance cluster to run plasticity related network
simulations.
So far I have divided my whole job into:
phase 1: data collection (data collected: senders, times, sources, targets for each time
point)
phase 2: data analysis (spike count and connectivity calculation)
I face an issue with data handling. In phase 1, the data for each of the four variables
is saved rank-wise in X different files (X = number of virtual processes). This means
that the total number of files generated is n(time points) * X * 4, which exceeds the
per-user file storage limit on the cluster. Each file here is an ndarray saved as a
*.npy file.
I wonder if there is a way to retrieve the data from each of the X processes during data
collection, concatenate it, and then save it, so that instead of X files I only save the
concatenated version. This probably involves having a single VP collect, concatenate, and
save the data points, but I am not quite sure how to do this in NEST. Any help would be
highly appreciated!
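Concretely, what I have in mind is something like the following post-processing step (not NEST-specific; the per-rank file naming scheme here is hypothetical), which merges the X per-rank *.npy files for one variable into a single file:

```python
import numpy as np
from pathlib import Path

def merge_rank_files(directory, variable, n_ranks):
    """Concatenate per-rank arrays <variable>_rank<k>.npy into a single
    <variable>.npy file and delete the per-rank pieces."""
    directory = Path(directory)
    parts = [np.load(directory / f"{variable}_rank{k}.npy")
             for k in range(n_ranks)]
    merged = np.concatenate(parts)
    np.save(directory / f"{variable}.npy", merged)
    # Remove the per-rank files so only one file per variable remains.
    for k in range(n_ranks):
        (directory / f"{variable}_rank{k}.npy").unlink()
    return merged
```

This would reduce the file count from n(time points) * X * 4 to n(time points) * 4, but I would prefer to avoid writing the per-rank files in the first place.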
Thanks in advance!
Best,
Swathi
_______________________________________________
NEST Users mailing list --
users@nest-simulator.org<mailto:users@nest-simulator.org>
To unsubscribe send an email to
users-leave@nest-simulator.org<mailto:users-leave@nest-simulator.org>
--
Dr. Jochen Martin Eppler
Phone: +49(2461)61-96653
----------------------------------
Simulation Laboratory Neuroscience
Jülich Supercomputing Centre
Institute for Advanced Simulation
------------------------------------------------------------------------------------------------
Forschungszentrum Juelich GmbH
52425 Juelich