CNS 2016 Jeju, South Korea: Tutorials

This year OCNS will organize a number of tutorials at the CNS meeting in Jeju, South Korea. The tutorials will be held before the main meeting on Saturday 2nd July 2016.
Each tutorial will cover a substantial part of its topic, including theory and practice and commonly used tools, and will give examples, demonstrations and/or exercises.


[Full day] Dr. Iain Hepburn (Okinawa Institute of Science and Technology, Japan) and Dr. Andrew Gallimore (Okinawa Institute of Science and Technology, Japan)

[Full day] Dr. Ben Torben-Nielsen (University of Hertfordshire, UK) and Dr. Yann Le Franc (e-Science Data Factory, France)

[Full day] Dr. Jun Igarashi (RIKEN and Okinawa Institute of Science and Technology, Japan) and Dr. Hannah Bos (Jülich Research Centre and JARA, Jülich, Germany)

[Half day, morning] Prof. Jaeseung Jeong (Korea Advanced Institute of Science and Technology, South Korea)

Prof. Gaute T. Einevoll (Norwegian University of Life Sciences & University of Oslo, Norway) and Dr. Espen Hagen (Jülich Research Centre and JARA, Jülich, Germany)

Details


T1: Subcellular modeling (full day)

Lecturers:
Dr. Iain Hepburn (Okinawa Institute of Science and Technology, Japan)
Dr. Andrew Gallimore (Okinawa Institute of Science and Technology, Japan)

Description of the tutorial:
Many important neural functions are controlled by complex networks of intracellular proteins and signalling molecules. A variety of modular signalling pathways connect and interact to form large networks possessing emergent properties irreducible to individual molecules or pathways. These include bistable and ultrasensitive switches, feedback regulation, and synchronisation. These properties are essential for the induction and regulation of critical neural functions, such as long-term depression and potentiation. The complexity of these networks renders their analysis by inspection alone infeasible, and we must turn to computational modelling to understand them.

The first half of this tutorial will focus on the structure and function of intracellular networks and deterministic methods for modelling and analysing them. We will use a number of important subcellular pathways to illustrate the key concepts and demonstrate the importance and utility of deterministic methods in their modelling and simulation. We will discuss both the biochemistry of these pathways and their mathematical representation. We will then discuss how these modular pathways connect and interact to form large networks. Important network motifs and their emergent properties will also be explained with specific examples given, as well as mathematical methods for their analysis. We will discuss a number of tools for simulating these differential equation models, but will use the open source software Copasi in the tutorial, owing to its ease of installation and use. Participants will have the opportunity to build and simulate their own signalling pathway model in Copasi. This part of the tutorial will serve as a good introduction to molecular systems modelling for those with little prior experience.
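For readers with no prior exposure to this style of modelling, the sketch below (illustrative only, not part of the tutorial materials) shows the general idea behind a deterministic, mass-action description of a signalling step: a reversible phosphorylation cycle written as ordinary differential equations and integrated numerically with SciPy. Species names and rate constants are made up for illustration; Copasi lets you build and solve exactly this kind of system through its graphical interface.

```python
# Deterministic (ODE) model of a phosphorylation/dephosphorylation cycle
# S <-> Sp, driven by a kinase K and a phosphatase P under mass action.
# Species names and rate constants are purely illustrative.
import numpy as np
from scipy.integrate import odeint

k_phos = 0.5     # kinase rate constant (1/(uM*s))
k_dephos = 0.2   # phosphatase rate constant (1/(uM*s))
K, P = 1.0, 1.0  # fixed kinase and phosphatase concentrations (uM)

def rhs(y, t):
    S, Sp = y
    v_f = k_phos * K * S     # phosphorylation flux
    v_b = k_dephos * P * Sp  # dephosphorylation flux
    return [v_b - v_f, v_f - v_b]

t = np.linspace(0.0, 60.0, 601)     # time points (s)
traj = odeint(rhs, [10.0, 0.0], t)  # start with 10 uM unphosphorylated S
print("steady-state [Sp] ~ %.2f uM" % traj[-1, 1])
```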

The second half of the tutorial will focus on more advanced modelling approaches based on several state-of-the-art software packages. We will explain how the time evolution of real molecular systems can diverge from a differential equation-based description due to effects such as probabilistic interactions in small volumes and spatial heterogeneity. We will describe mathematical approaches to modelling stochastic effects and diffusion and introduce a number of software tools that are based on such descriptions. These include particle-tracking packages such as MCell and Smoldyn, and voxel-based packages such as NeuroRD and STEPS. The features of the different software tools will be discussed and illustrated with specific practical examples. Finally, we will briefly discuss recent advances and expected near-future directions of the field, including massively parallel implementations and membrane potential coupling.
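As a conceptual bridge to these tools, the following sketch (again illustrative only and not taken from any of the packages named above) implements Gillespie's direct method for the same phosphorylation cycle, showing how molecule counts fluctuate when copy numbers are small:

```python
# Gillespie direct-method simulation of the same two-state cycle (S <-> Sp)
# with small molecule numbers, to illustrate stochastic fluctuations.
import numpy as np

rng = np.random.default_rng(0)
c_phos, c_dephos = 0.5, 0.2  # stochastic rate constants (1/s per molecule)
S, Sp = 20, 0                # initial molecule counts
t, t_end = 0.0, 60.0
times, counts = [t], [Sp]

while t < t_end:
    a1 = c_phos * S      # propensity of S -> Sp
    a2 = c_dephos * Sp   # propensity of Sp -> S
    a0 = a1 + a2
    if a0 == 0.0:
        break
    t += rng.exponential(1.0 / a0)  # waiting time to the next reaction
    if rng.random() < a1 / a0:      # choose which reaction fires
        S, Sp = S - 1, Sp + 1
    else:
        S, Sp = S + 1, Sp - 1
    times.append(t)
    counts.append(Sp)

print("final Sp count:", counts[-1])
```

Dedicated simulators such as STEPS, MCell, Smoldyn and NeuroRD add spatial resolution (particle positions or voxel meshes) and far more efficient algorithms on top of this basic idea.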

References and background reading:
[1] Antunes, G., De Schutter, E. A Stochastic Signaling Network Mediates the Probabilistic Induction of Cerebellar Long-Term Depression. Journal of Neuroscience 32, 9288-9300, 2012.
[2] Bhalla, U.S., Iyengar, R. Emergent properties of networks of biological signaling pathways. Science 283, 381-387, 1999.
[3] Eungdamrong, N.J., Iyengar, R. Computational approaches for modeling regulatory cellular networks. Trends in Cell Biology 14, 661-669, 2004.
[4] Gallimore, A.R., Aricescu, A.R., Yuzaki, M., Calinescu, R. A Computational Model for the AMPA Receptor Phosphorylation Master Switch Regulating Cerebellar Long-Term Depression. PLoS Computational Biology 12, 23, 2016.
[5] Kotaleski, J.H., Blackwell, K.T. Modelling the molecular mechanisms of synaptic plasticity using systems biology approaches. Nature Reviews Neuroscience 11, 239-251, 2010.

Software:
Copasi: http://copasi.org/
SimBiology (Matlab): http://uk.mathworks.com/products/simbiology/
Genesis: http://www.genesis-sim.org/
STEPS: http://steps.sourceforge.net/STEPS/default.php
MCell: http://mcell.org/
Smoldyn: http://www.smoldyn.org/
NeuroRD: http://krasnow1.gmu.edu/CENlab/software.html


T2: Detailed modeling of structure and function at the cellular level (full day)

Lecturers:
Dr. Ben Torben-Nielsen (University of Hertfordshire, UK)
Dr. Yann Le Franc (e-Science Data Factory, France)

Description of the tutorial:
In the morning session, we introduce the morphology of dendrites and axons, the specialised input and output arborisations of neurons. Their shape is pivotal for brain function for two reasons: first, the overlap between dendrites and axons defines the micro-circuit; second, the shape and membrane composition of dendrites define how inputs are transformed into relevant outputs. In this tutorial, we will start by explaining the importance of morphologies and how to quantify them (for example, to distinguish healthy from pathological morphologies). We will touch on algorithmic synthesis of large numbers of unique neuronal morphologies for application in large-scale modelling efforts. We finish the morning session with a hands-on tutorial using btmorph [1] to analyse populations of neuronal morphologies.
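To give a flavour of such morphometrics, the short sketch below computes two basic measures (total cable length and number of terminal tips) directly from an SWC morphology file with plain NumPy. It is deliberately independent of btmorph, which provides these and many richer statistics through its own API; the file name is a placeholder.

```python
# Basic morphometrics straight from an SWC file
# (columns: id, type, x, y, z, radius, parent).
import numpy as np

swc = np.loadtxt("neuron.swc")   # placeholder file name
ids = swc[:, 0].astype(int)
xyz = swc[:, 2:5]
parent = swc[:, 6].astype(int)
row_of = {i: row for row, i in enumerate(ids)}

# Total cable length: sum of distances from each node to its parent.
total_length = sum(np.linalg.norm(xyz[row] - xyz[row_of[p]])
                   for row, p in enumerate(parent) if p != -1)

# Terminal tips: nodes that are nobody's parent.
n_tips = len(set(ids) - set(parent))

print("total length (um): %.1f, terminal tips: %d" % (total_length, n_tips))
```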

In the afternoon session, we explain how neuronal dynamics takes place at the single-neuron level and how dendrites turn input signals into an output. We briefly explain the conductance-based and compartmental-modelling paradigms used to simulate the dynamics of neurons with detailed membrane composition and elaborate neuronal morphologies. We then proceed to show several free community resources to construct, simulate, share and analyse single-neuron models. We end the afternoon session with a hands-on demonstration of how to construct and simulate detailed models of neurons using NEURON and Python [2].
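As a small taste of that hands-on part, the sketch below builds and runs a single-compartment Hodgkin-Huxley model through NEURON's Python interface (all parameter values are illustrative); the detailed models treated in the tutorial extend this pattern with reconstructed morphologies and richer channel sets.

```python
# Single-compartment Hodgkin-Huxley neuron in NEURON's Python interface.
from neuron import h
h.load_file("stdrun.hoc")   # load NEURON's standard run system

soma = h.Section(name="soma")
soma.L = soma.diam = 20.0   # um
soma.insert("hh")           # built-in Hodgkin-Huxley channels

stim = h.IClamp(soma(0.5))  # current injection at the mid-point
stim.delay, stim.dur, stim.amp = 5.0, 40.0, 0.2  # ms, ms, nA

v_vec = h.Vector()
v_vec.record(soma(0.5)._ref_v)   # record membrane potential

h.finitialize(-65.0)
h.continuerun(50.0)              # simulate 50 ms
print("peak Vm: %.1f mV" % v_vec.max())
```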

References and background reading:
[1] Torben-Nielsen B. An efficient and extendable Python library to analyze neuronal morphologies. Neuroinformatics 12:619-622, 2014.
[2] King J.G., Hines M., Hill S., Goodman P.H., Markram H., Schürmann F. A Component-Based Extension Framework for Large-Scale Parallel Simulations in NEURON. Front. Neuroinformatics, 3:1-12, 2009.
[3] Torben-Nielsen B., Cuntz H. Introduction to dendritic morphology, The computing dendrite, Springer, 2014.
[4] London M., Häusser M. Dendritic computation. Annu Rev Neurosci. 28:503-32, 2005.
[5] Parekh R., Ascoli G. Neuronal Morphology Goes Digital: A Research Hub for Cellular and System Neuroscience. Neuron 77(6): 1017–1038, 2013.
[6] Silver A. Neuronal arithmetic. Nature Reviews Neuroscience 11, 474-489, 2010.


T3: Simulation of large-scale neural networks (full day)

Lecturers:
Dr. Jun Igarashi (RIKEN and Okinawa Institute of Science and Technology, Japan)
Dr. Hannah Bos (Jülich Research Centre and JARA, Jülich, Germany)

Description of the tutorial:

The first part of this tutorial is concerned with the emergence of large-scale neuronal network models in neuroscience and the resulting challenges in software and hardware that are necessary to support large-scale simulations. We will start with an introduction covering the development of networks examined in neuroscience and give an overview of existing large-scale models. Subsequently, we will give an overview of the history of supercomputers used for simulations of large-scale networks. The introduction is followed by two lectures going into the details of the implementation of neuronal networks, shedding light on the software as well as the hardware aspects. We will first discuss how a neural simulator can be implemented, using NEST [1] as an example. The lecture on the hardware aspect will explain how a neural network simulation is executed on the processors and memory of a computer, illustrated with recent representative semiconductor chips and supercomputers.
The second part of the tutorial focuses on hands-on exercises using NEST. The tutorial does not assume any prior knowledge of NEST; however, it is recommended that participants install NEST on their laptops beforehand [2]. We will start by introducing the basic commands of NEST and work our way up to the implementation of a random balanced network [3,4]. The session is planned as an interactive mixture of lectures and exercises. At the end, a final lecture on a basal ganglia-thalamo-cortical circuit model that helps to understand the motor symptoms of Parkinson's disease will introduce an example of a large-scale network in more detail.
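For orientation before the session, the sketch below wires up a small excitatory-inhibitory network in the spirit of the Brunel model [3] at a fraction of the size treated in the tutorial. It is written against the NEST 2.x PyNEST syntax current at the time; parameter values are illustrative and some names differ in later NEST releases.

```python
# Small excitatory/inhibitory random network in PyNEST (NEST 2.x syntax),
# loosely following the balanced random network of Brunel (2000).
import nest

nest.ResetKernel()
NE, NI = 400, 100   # excitatory / inhibitory neurons
g, J = 5.0, 0.1     # relative strength of inhibition, EPSP weight (mV)

exc = nest.Create("iaf_psc_alpha", NE)
inh = nest.Create("iaf_psc_alpha", NI)
noise = nest.Create("poisson_generator", params={"rate": 15000.0})
spikes = nest.Create("spike_detector")

# Each neuron receives 40 excitatory and 10 inhibitory inputs at random.
nest.Connect(exc, exc + inh, {"rule": "fixed_indegree", "indegree": 40},
             {"weight": J, "delay": 1.5})
nest.Connect(inh, exc + inh, {"rule": "fixed_indegree", "indegree": 10},
             {"weight": -g * J, "delay": 1.5})
nest.Connect(noise, exc + inh, syn_spec={"weight": J})
nest.Connect(exc[:50], spikes)   # record spikes from a subset

nest.Simulate(1000.0)            # simulate one second
print("recorded spikes:", nest.GetStatus(spikes, "n_events")[0])
```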

References and background reading:

[1] Gewaltig MO, Diesmann M. NEST (NEural Simulation Tool). Scholarpedia. 2007;2(4):1430.
[2] http://www.nest-simulator.org/installation/
[3] Brunel N. Dynamics of sparsely connected networks of excitatory and inhibitory spiking neurons. J Comput Neurosci. 2000;8(3):183–208.
[4] Potjans TC, Diesmann M. The Cell-Type Specific Cortical Microcircuit: Relating Structure and Activity in a Full-Scale Spiking Network Model. Cereb Cortex. 2014;24(3):785–806. doi: 10.1093/cercor/bhs358.


T4: Nonlinear dynamical analysis of brain datasets (half day, morning)

Lecturer:

Prof. Jaeseung Jeong (Korea Advanced Institute of Science and Technology, South Korea)

Description of the tutorial:

To be announced


T5: Modeling and analysis of extracellular potentials

Lecturers:
Prof. Gaute T. Einevoll (Norwegian University of Life Sciences & University of Oslo, Norway)
Dr. Espen Hagen (Jülich Research Centre and JARA, Jülich, Germany)

Description of the tutorial:
While extracellular electrical recordings have been one of the main workhorses in electrophysiology, the interpretation of such recordings is not trivial [1,2,3]. The recorded extracellular potentials in general stem from a complicated sum of contributions from all transmembrane currents of the neurons in the vicinity of the electrode contact. The duration of spikes, the extracellular signatures of neuronal action potentials, is so short that the high-frequency part of the recorded signal, the multi-unit activity (MUA), often can be sorted into spiking contributions from the individual neurons surrounding the electrode [4]. No such simplifying feature aids us in the interpretation of the low-frequency part, the local field potential (LFP). To take full advantage of the new generation of silicon-based multielectrodes recording from tens, hundreds or thousands of positions simultaneously, we thus need to develop new data analysis methods grounded in the underlying biophysics [1,3,4]. This is the topic of the present tutorial.
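The biophysical link between transmembrane currents and the recorded potential can be written down compactly: in the widely used point-source approximation for an infinite homogeneous medium with conductivity sigma, the potential at position r is phi(r, t) = 1/(4*pi*sigma) * sum_n I_n(t) / |r - r_n|, where I_n are the transmembrane currents at positions r_n. The sketch below simply evaluates this sum for arbitrary illustrative currents; the tools covered in the tutorial implement this and refined variants (line sources, etc.) on top of detailed neuron models.

```python
# Point-source forward model: extracellular potential at one electrode
# from a set of transmembrane currents I_n(t) at positions r_n, in an
# infinite homogeneous medium of conductivity sigma. Values are illustrative.
import numpy as np

sigma = 0.3                                   # extracellular conductivity (S/m)
r_sources = np.array([[0.0, 0.0, z * 1e-6] for z in range(0, 500, 100)])  # m
I = np.random.randn(len(r_sources), 1000) * 1e-9                          # A
r_electrode = np.array([30e-6, 0.0, 100e-6])                              # m

dist = np.linalg.norm(r_sources - r_electrode, axis=1)         # distances (m)
phi = (I / dist[:, None]).sum(axis=0) / (4.0 * np.pi * sigma)  # potential (V)
print("LFP trace: %d samples, std %.2e V" % (phi.size, phi.std()))
```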
In the first part of this tutorial we will go through
1) the biophysics of extracellular recordings in the brain,
2) a scheme for biophysically detailed modeling of extracellular potentials and the application to modeling single spikes [5-7], MUAs [8] and LFPs, both from single neurons [9] and populations of neurons [8,10,11],
3) methods for estimation of current source density (CSD) from LFP data, such as the iCSD [12-14] and kCSD methods [15],
4) decomposition of recorded signals in cortex into contributions from various laminar populations, using either (i) laminar population analysis (LPA) [16,17], based on joint modeling of LFP and MUA, or (ii) a scheme using LFP and known constraints on the synaptic connections [18].
In the second part, the participants will get demonstrations and, if wanted, hands-on experience with
1) LFPy (github.com/LFPy) [19], a versatile tool based on Python and the simulation program NEURON [20] (www.neuron.yale.edu) for calculation of extracellular potentials around neurons (a minimal usage sketch follows after this list), and
2) new results from applying the biophysical forward-modelling scheme to predict LFPs from comprehensive point-neuron network models, in particular Potjans and Diesmann's model of the early sensory cortical microcircuit using hybridLFPy [22,23].
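For participants who want to try LFPy in advance, a minimal usage sketch follows. It is written against the LFPy 1.x API of the time (LFPy.Cell, LFPy.Synapse, LFPy.RecExtElectrode); the morphology file name is a placeholder and all parameter values are illustrative, so treat it as a starting point rather than tutorial material.

```python
# Minimal LFPy example (LFPy 1.x API): one cell, one synapse, one
# extracellular recording site. File name and parameters are placeholders.
import numpy as np
import LFPy

cell = LFPy.Cell(morphology="morphology.hoc", tstartms=0.0, tstopms=100.0)

syn = LFPy.Synapse(cell, idx=cell.get_closest_idx(z=200.0),
                   syntype="ExpSyn", weight=0.005, record_current=True)
syn.set_spike_times(np.array([20.0, 40.0]))   # synapse activation times (ms)

electrode = LFPy.RecExtElectrode(cell=cell, sigma=0.3,
                                 x=np.array([30.0]),
                                 y=np.array([0.0]),
                                 z=np.array([0.0]))

cell.simulate(rec_imem=True)   # run NEURON and store transmembrane currents
electrode.calc_lfp()           # forward-model the extracellular potential
print("LFP trace shape:", electrode.LFP.shape)
```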

References and background reading:
[1] Pettersen K.H. et al., "Extracellular spikes and CSD" in Handbook of Neural Activity Measurement, Cambridge (2012).
[2] Buzsaki G. et al., Nat Rev Neurosci 13:407 (2012)
[3] Einevoll G.T. et al., Nat Rev Neurosci 14:770 (2013)
[4] Einevoll G.T. et al., Curr Op Neurobiol 22:11 (2012)
[5] Holt G., Koch C, J Comp Neurosci 6:169 (1999)
[6] Gold J. et al., J Neurophysiol 95:3113 (2006)
[7] Pettersen K.H., Einevoll G.T., Biophys J 94:784 (2008)
[8] Pettersen K.H. et al., J Comp Neurosci 24:291 (2008)
[9] Lindén H. et al., J Comp Neurosci 29: 423 (2010)
[10] Lindén H. et al., Neuron 72:859 (2011)
[11] Łęski S. et al., PLoS Comp Biol 9:e1003137 (2013)
[12] Pettersen K.H. et al., J Neurosci Meth 154:116 (2006)
[13] Łęski S. et al., Neuroinform 5:207 (2007)
[14] Łęski S. et al., Neuroinform 9:401 (2011)
[15] Potworowski J. et al., Neural Comp 24:541 (2012)
[16] Einevoll G.T. et al., J Neurophysiol 97:2174 (2007)
[17] Blomquist P. et al., PLoS Comp Biol 5:e1000328 (2009)
[18] Gratiy S.L. et al., Front Neuroinf  5:32 (2011)
[19] Lindén H. et al., Front Neuroinf 7:41 (2014)
[20] Hines M.L. et al., Front Neuroinf 3:1 (2009)
[21] Glabska H. et al., PLoS ONE 9:e105071 (2014)
[22] Potjans T.C., Diesmann M., Cereb Cortex 24:785 (2014)
[23] Hagen E. et al., arXiv:1511.01681 [q-bio.NC].