CNS*2022 Melbourne: Tutorials 

Tutorials are intended as introductions to the main methodologies of various fields in computational neuroscience. This year, CNS tutorials offer introductory full-day courses covering a wide range of topics, as well as specialized half-day tutorials. Tutorials are particularly tailored to early-stage researchers and to researchers entering a new field of computational neuroscience. 

Tutorials are grouped into the categories listed below.

For inquiries related to tutorials, please contact the tutorials organizer: [email protected]. Please note that the program is not final.

Satellite Tutorials, June 27 - July 1

These are free online tutorials organized by the INCF/OCNS Software Working Group. 

 

List of Onsite Tutorials, Saturday, July 16

Whole day tutorials

Title | Lecturers | Reference | Location
From single-cell modeling to large-scale network dynamics with NEST Simulator | Charl Linssen, Agnes Korcsak-Gorzo, Jasper Albers, Pooja Babu, Joshua Böttcher, Jessica Mitchell, Willem Wybo, Jens Bruchertseifer, Sebastian Spreizer, and Dennis Terhorst | T1 | 101
Models of Neuron-Glial Interactions | Daniel Park, Afroditi Talidou, Hugues Berry, Jeremie Lefebvre, Marinus Toman, Liam McDaid, and Maurizio De Pittà | T2 | 103
A step-by-step tutorial on active inference and its application to empirical data | Ryan Smith and Christopher J. Whyte | T3 | 102
Building mechanistic multiscale models from molecules to circuits using NEURON and NetPyNE | Salvador Dura-Bernal, William W Lytton, and Robert A McDougal | T4 | 104

Half-day tutorials

Title | Lecturers | Reference | Location
Characterizing neural dynamics using highly comparative time-series analysis | Ben D. Fulcher, Oliver Cliff, Trent Henderson, and Annie Bryant | T5 | 107 (morning)
GPU enhanced Neuronal Networks | James C Knight and Thomas Nowotny | T6 | 107 (afternoon)
Spectral analysis of neural signals | Axel Hutt | T7 | 108 (afternoon)

Showcase

Title | Lecturers | Reference | Location
Introduction to the Brain Dynamics Toolbox | Stewart Heitmann | T8 | 111 + 112


Descriptions

T1: From single-cell modeling to large-scale network dynamics with NEST Simulator.

  • Charl Linssen - Jülich Research Centre, Germany
  • Agnes Korcsak-Gorzo - Jülich Research Centre, Germany
  • Jasper Albers - Jülich Research Centre, Germany
  • Pooja Babu - Jülich Research Centre, Germany
  • Joshua Böttcher - Jülich Research Centre, Germany
  • Jessica Mitchell - Jülich Research Centre, Germany
  • Willem Wybo - Jülich Research Centre, Germany
  • Jens Bruchertseifer - University of Trier, Germany
  • Sebastian Spreizer - University of Trier, Germany
  • Dennis Terhorst - Jülich Research Centre, Germany

Description of the tutorial

Tutorial website

NEST is an established, open-source simulator for spiking neuronal networks, which can capture a high degree of detail of biological network structures while retaining high performance and scalability from laptops to HPC [1]. This tutorial provides hands-on experience in building and simulating neuron, synapse, and network models. It introduces several tools and front-ends to implement modeling ideas most efficiently. Participants do not have to install software as all tools can be accessed via the cloud.

First, we look at NEST Desktop [2], a web-based graphical user interface (GUI), which allows the exploration of essential concepts in computational neuroscience without the need to learn a programming language. This advances both the quality and speed of teaching in computational neuroscience. To get acquainted with the GUI, we will create and analyze a balanced two-population network.
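
For readers who want a preview of the scripting side, the following sketch builds a toy balanced two-population network directly in PyNEST (NEST 3 syntax); population sizes, rates, weights, and indegrees are illustrative choices, not the tutorial's parameters:

```python
import nest

nest.ResetKernel()

# Two populations of leaky integrate-and-fire neurons (sizes illustrative)
exc = nest.Create("iaf_psc_alpha", 800)
inh = nest.Create("iaf_psc_alpha", 200)

# Independent Poisson background drive to every neuron
noise = nest.Create("poisson_generator", params={"rate": 15000.0})
nest.Connect(noise, exc + inh, syn_spec={"weight": 20.0})

# Sparse recurrent connectivity; inhibition scaled to roughly balance excitation
conn = {"rule": "fixed_indegree", "indegree": 100}
nest.Connect(exc, exc + inh, conn, syn_spec={"weight": 20.0, "delay": 1.5})
nest.Connect(inh, exc + inh, conn, syn_spec={"weight": -100.0, "delay": 1.5})

# Record excitatory spikes and run
rec = nest.Create("spike_recorder")
nest.Connect(exc, rec)
nest.Simulate(1000.0)
print(f"{rec.n_events} excitatory spikes in 1 s")
```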

The model is then exported to a Jupyter notebook and endowed with a data-driven spatial connectivity profile of the cortex, enabling us to study the propagation of activity. Then, we make the synapses in the network plastic and let the network learn a reinforcement learning task, whereby the learning rule goes beyond pre-synaptic and post-synaptic spikes by adding a dopamine signal as a modulatory third factor. NESTML [3] makes it easy to express this and other advanced synaptic plasticity rules and neuron models, and automatically translates them into fast simulation code.

More morphologically detailed models, with a large number of compartments and custom ion channels and receptor currents, can also be defined using NESTML. We first implement a simple dendritic layout and use it to perform a sequence discrimination task. Next, we implement a compartmental layout representing semi-independent subunits and recurrently connect several such neurons to elicit an NMDA-spike driven network state.

 Background reading and software tools

  1. https://nest-simulator.readthedocs.org
  2. https://nest-desktop.readthedocs.org
  3. https://nestml.readthedocs.org
Back to top

T2: Models of Neuron-Glial Interactions.

Organizer: Maurizio De Pittà, UHN Krembil Brain Institute, Toronto, ON, Canada

Presenters: 

  • Daniel Park
  • Afroditi Talidou
  • Jeremie Lefebvre 
  • Hugues Berry
  • Marinus Toman
  • Liam McDaid
  • Maurizio De Pittà

Description of the tutorial

Tutorial Website

This tutorial aims to introduce essential state-of-the-art modeling approaches for studying neuron-glial interactions in the brain. The tutorial is organized into four self-contained sections that capture current trends in the emerging field of computational glioscience – that is, the branch of computational neuroscience that deals with characterizing neuron-glial signaling. These are:

  1. Modeling of oligodendrocytes and myelination plasticity.
  2. Models of astrocytic calcium signaling.
  3. Interaction models for astrocyte-synapse cross talks.
  4. Mass-balance models for glia-mediated ion homeostasis.

Each session lasts approximately 1 hour and 45 minutes and combines presentations with worked-through examples of mathematical models of neuron-glial interactions, along with code snippets for simulating most of the models presented. 
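
For a flavor of the kind of model covered (here, the astrocytic calcium signaling of Session 2), below is a minimal sketch in the style of the two-variable Li-Rinzel astrocyte model, integrated with forward Euler. The parameter values are indicative numbers from the modeling literature, not necessarily those used in the tutorial:

```python
import numpy as np

# Two-variable Li-Rinzel-style astrocyte calcium model: cytosolic Ca2+ (c, uM)
# and IP3 receptor gating variable (h). Parameter values are indicative only.
ip3 = 0.4                      # IP3 concentration (uM), the control parameter
c_tot, c1 = 2.0, 0.185         # total free Ca2+ and ER/cytosol volume ratio
v1, v2, v3 = 6.0, 0.11, 0.9    # channel, leak, and pump rate constants
k3, a2 = 0.1, 0.2
d1, d2, d3, d5 = 0.13, 1.049, 0.9434, 0.08234

def derivs(c, h):
    c_er = (c_tot - c) / c1                       # ER Ca2+ from mass balance
    m_inf = ip3 / (ip3 + d1)                      # IP3 binding gate
    n_inf = c / (c + d5)                          # Ca2+ activation gate
    j_chan = c1 * v1 * (m_inf * n_inf * h) ** 3 * (c_er - c)  # IP3R release
    j_leak = c1 * v2 * (c_er - c)                 # passive ER leak
    j_pump = v3 * c**2 / (k3**2 + c**2)           # SERCA pump uptake
    q2 = d2 * (ip3 + d1) / (ip3 + d3)
    dh = a2 * (q2 * (1.0 - h) - c * h)            # slow IP3R inactivation
    return j_chan + j_leak - j_pump, dh

dt, t_end = 1e-3, 60.0                            # seconds
c, h = 0.1, 0.6
trace = []
for _ in range(int(t_end / dt)):                  # forward-Euler integration
    dc, dh = derivs(c, h)
    c, h = c + dt * dc, h + dt * dh
    trace.append(c)
print(f"Ca2+ range over {t_end:.0f} s: {min(trace):.3f}-{max(trace):.3f} uM")
```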

Program

9am-10.45am, Session 1: Oligodendrocyte modeling, presented in two parts:

  • 9am-10am: Modeling adaptive myelination and its effects on oscillatory synchrony using the Kuramoto network model.
  • 10am-11am: Activity-dependent myelination as a regulator of axonal conduction velocity.
Speakers: Daniel Park, Afroditi Talidou, Jeremie Lefebvre (Sync Lab, University of Ottawa, ON, Canada) 

10.45am-11am Coffee break

11am-12.45pm, Session 2: Modelling astrocytic calcium dynamics in the soma or thin branchlets
Speaker: Hugues Berry (INRIA Lyon Research Center, Villeurbanne, France)

12.45pm-2.30pm Lunch

2.30pm-4.15pm, Session 3: The tripartite synapse and neuron-glial network models
Speaker: Maurizio De Pittà, UHN Krembil Brain Institute, Toronto, ON, Canada

4.15pm-4.30pm Coffee break

4.30pm-6.15pm, Session 4: Ionostasis at the tripartite synapse
Speakers: Marinus Toman, Liam McDaid (School of Computing, Engineering and Intelligent Systems, University of Ulster, Northern Ireland, United Kingdom)

Background reading and software tools

Back to top

T3: A step-by-step tutorial on active inference and its application to empirical data.

  • Ryan Smith -  Laureate Institute for Brain Research, The University of Tulsa, OK, USA
  • Christopher J. Whyte - Brain and Mind Research Institute, The University of Sydney, Australia
     

Description of the tutorial

Active inference is a recently developed computational framework for jointly modeling perception, learning, and decision-making as partially observable Markov decision processes (POMDPs). This framework assumes the brain represents a generative model of the (internal and external) causes of the sensory input it receives; and that it uses approximate Bayesian inference to infer hidden states of the world causing sensory input (perception), model parameters (learning), and optimal action policies (decision-making) within this generative model. It places special emphasis on how the brain infers the observations that will be generated by its own actions. In empirical studies, generative models of task behavior can be constructed; and the parameters of these models can be fit to behavior in individual participants. These individual parameter estimates can then be used as predictors in between-subjects analyses. This framework also affords simulation studies in which precise predictions can be derived from generative models designed to capture broader theoretical proposals.
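
To make the POMDP notation concrete, the sketch below performs the most basic operation in this framework: inferring a posterior over hidden states from a single observation, given a likelihood mapping A and prior beliefs. The matrix values are made up for illustration; the tutorial's MATLAB scripts implement the full variational scheme:

```python
import numpy as np

# Pieces of a minimal discrete generative model (values are illustrative)
A = np.array([[0.9, 0.2],       # A[o, s] = p(observation o | hidden state s)
              [0.1, 0.8]])
prior = np.array([0.5, 0.5])    # prior beliefs over the two hidden states

def infer_states(obs: int, A: np.ndarray, prior: np.ndarray) -> np.ndarray:
    """Posterior over hidden states via Bayes' rule.

    Active inference computes this approximately, via variational message
    passing, but for a single time step the exact posterior is the fixed
    point the messages converge to.
    """
    joint = A[obs] * prior       # likelihood of obs under each state x prior
    return joint / joint.sum()   # normalize to a probability distribution

print(infer_states(0, A, prior))  # beliefs shift toward state 0, which best explains obs 0
```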

At the algorithmic level, active inference proposes that the brain uses variational message passing approaches to approximate Bayesian inference in a biologically plausible manner. This entails a neural process theory of how messages passed between neurons in the brain can be viewed as variational messages – providing information about the sufficient statistics of various probability distributions. Following assumptions associated with neural mass models, this allows active inference to provide empirically testable predictions of neural responses that could be measured with fMRI and EEG, among other neuroimaging measures.

This tutorial will first introduce the active inference framework at a conceptual and mathematical level. It will then focus on teaching attendees how to build generative models of multiple types of behavioral tasks, and how to fit these models to behavioral data. It will also introduce the neural process theory and teach attendees how to derive simulated neuronal responses that could be applied in their own neuroscientific research.

Software tools:

 Background reading:

This tutorial will be based largely around the following paper:

  1. Smith, R., Friston, K. J., & Whyte, C. J. (2022). A step-by-step tutorial on active inference and its application to empirical data. Journal of Mathematical Psychology, 107, 102632.
     *  MATLAB code from this paper that we will use can be found here: https://github.com/rssmith33/Active-Inference-Tutorial-Scripts

Some other papers that may be of interest for further conceptual background or mathematical details include:

  1. Smith, R., Badcock, P., & Friston, K. J. (2021). Recent advances in the application of predictive coding and active inference models within clinical neuroscience. Psychiatry and Clinical Neurosciences, 75(1), 3-13.
  2. Da Costa, L., Parr, T., Sajid, N., Veselic, S., Neacsu, V., & Friston, K. (2020). Active inference on discrete state-spaces: A synthesis. Journal of Mathematical Psychology, 99, 102447.
  3. Sajid, N., Ball, P. J., Parr, T., & Friston, K. J. (2021). Active inference: demystified and compared. Neural computation, 33(3), 674-712.
  4. Friston, K. J., Parr, T., & de Vries, B. (2017). The graphical brain: belief propagation and active inference. Network neuroscience, 1(4), 381-414.
  5. Friston, K. J., Rosch, R., Parr, T., Price, C., & Bowman, H. (2018). Deep temporal models and active inference. Neuroscience & Biobehavioral Reviews, 90, 486-501.
  6. Parr, T., Markovic, D., Kiebel, S. J., & Friston, K. J. (2019). Neuronal message passing using Mean-field, Bethe, and Marginal approximations. Scientific reports, 9(1), 1-18.
  7. Parr, T., & Friston, K. J. (2018). The anatomy of inference: generative models and brain structure. Frontiers in Computational Neuroscience, 12, 90.

Schedule:

All times listed are Melbourne, Australia Time Zone (AEST). Note that these are approximate, as some sections are likely to go longer (or perhaps shorter) than scheduled.

  • 9:00 – 10:00 – Initial introduction to the theory and mathematics of perception and decision-making within active inference
  • 10:00 – 10:05 – Short break
  • 10:05 - 10:50 – Introduction to building generative (POMDP) models of behavioral tasks, with worked examples
  • 10:50 – 11:00 – Coffee break
  • 11:00 – 12:00 – Introduction to learning in active inference & practice exercises building generative models of reinforcement learning tasks
  • 12:00 – 1:00 – Lunch break
  • 1:00 – 2:00 – Introduction to fitting models to behavioral data, with practice exercises
  • 2:00 – 2:05 – Short break
  • 2:05 – 2:50 – Introduction to variational (marginal) message passing and building deep temporal models
  • 2:50 – 3:00 – Coffee break
  • 3:00 - 4:00 – Introduction to simulating neural responses with worked examples
Back to top

T4: Building mechanistic multiscale models from molecules to circuits using NEURON and NetPyNE.

  • Salvador Dura-Bernal
  • William W Lytton
  • Robert A McDougal

Description of the tutorial

Understanding brain function requires characterizing the interactions occurring across many temporal and spatial scales. Mechanistic multiscale modeling aims to organize and explore these interactions. In this way, multiscale models provide insights into how changes at molecular and cellular levels, caused by development, learning, brain disease, drugs, or other factors, affect the dynamics of local networks and of brain areas. Large neuroscience data-gathering projects throughout the world (e.g. US BRAIN, EU HBP, Allen Institute) are making use of multiscale modeling, including the NEURON ecosystem, to better understand the vast amounts of information being gathered using many different techniques at different scales.

This tutorial will introduce multiscale modeling using two NIH-funded tools: the NEURON 8.0 simulator [1], including the Reaction-Diffusion (RxD) module [2,3], and the NetPyNE tool [4]. The tutorial will combine background, examples, and hands-on exercises covering the implementation of models at four key scales: (1) intracellular dynamics (e.g. calcium buffering, protein interactions), (2) single neuron electrophysiology (e.g. action potential propagation), (3) neurons in extracellular space (e.g. spreading depression), and (4) networks of neurons. For network simulations, we will use NetPyNE, a high-level interface to NEURON supporting both programmatic and GUI specification that facilitates the development, parallel simulation, and analysis of biophysically detailed neuronal circuits. We conclude with an example combining all three tools that links intracellular molecular dynamics with network spiking activity and local field potentials. The tutorial will incorporate the recent substantial developments and new features in both the NEURON and NetPyNE tools [5].
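
As a minimal taste of scale (2), here is a NEURON/Python sketch of a single Hodgkin-Huxley soma driven by a current step; the geometry and stimulus values are arbitrary illustrations, not the tutorial's exercises:

```python
from neuron import h
h.load_file("stdrun.hoc")   # standard run system (finitialize/continuerun)

# Single-compartment soma with built-in Hodgkin-Huxley channels
soma = h.Section(name="soma")
soma.L = soma.diam = 20     # microns; arbitrary illustrative geometry
soma.insert("hh")

# Current-clamp stimulus at the middle of the soma
stim = h.IClamp(soma(0.5))
stim.delay, stim.dur, stim.amp = 5, 50, 0.2   # ms, ms, nA

# Record membrane potential and time
v = h.Vector().record(soma(0.5)._ref_v)
t = h.Vector().record(h._ref_t)

h.finitialize(-65)          # mV
h.continuerun(60)           # ms
print(f"peak Vm = {max(v):.1f} mV")   # spikes if the stimulus is suprathreshold
```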

Basic familiarity with Python is recommended. No prior knowledge of NEURON or NetPyNE is required. The tutorial will use these tools on the cloud via the Open Source Brain v2.0 platform [6], so no software installation is necessary.
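
For scale (4), a NetPyNE network is specified declaratively. The following minimal sketch, loosely modeled on NetPyNE's introductory tutorial examples with illustrative parameter values, builds and simulates a small recurrently connected population:

```python
from netpyne import specs, sim

# Network parameters: declarative dictionaries (values are illustrative)
netParams = specs.NetParams()

netParams.popParams['E'] = {'cellType': 'PYR', 'numCells': 40}
netParams.cellParams['PYRrule'] = {
    'conds': {'cellType': 'PYR'},
    'secs': {'soma': {'geom': {'diam': 18.8, 'L': 18.8, 'Ra': 123.0},
                      'mechs': {'hh': {'gnabar': 0.12, 'gkbar': 0.036,
                                       'gl': 0.003, 'el': -70}}}}}
netParams.synMechParams['exc'] = {'mod': 'Exp2Syn', 'tau1': 0.1, 'tau2': 5.0, 'e': 0}

# Background NetStim drive and sparse recurrent excitation
netParams.stimSourceParams['bkg'] = {'type': 'NetStim', 'rate': 10, 'noise': 0.5}
netParams.stimTargetParams['bkg->E'] = {'source': 'bkg', 'conds': {'pop': 'E'},
                                        'weight': 0.01, 'delay': 5, 'synMech': 'exc'}
netParams.connParams['E->E'] = {'preConds': {'pop': 'E'}, 'postConds': {'pop': 'E'},
                                'probability': 0.1, 'weight': 0.005, 'delay': 5,
                                'synMech': 'exc'}

# Simulation configuration: duration, recording, and built-in analysis
simConfig = specs.SimConfig()
simConfig.duration = 500                   # ms
simConfig.recordTraces = {'V_soma': {'sec': 'soma', 'loc': 0.5, 'var': 'v'}}
simConfig.analysis['plotRaster'] = True

sim.createSimulateAnalyze(netParams=netParams, simConfig=simConfig)
```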

Software links (the tutorial will be on the cloud, so no software installation is necessary):

 Background reading

If you're new to Python programming, there are many excellent tutorials online. Python lectures with exercises and solutions, taught by one of us in a previous summer course, can be found under "Monday" at https://ycmi.github.io/summer-course-2020/
  1. Hines M, Carnevale T, McDougal RA. (2019) NEURON Simulation Environment. In: Jaeger D, Jung R (eds) Encyclopedia of Computational Neuroscience. Springer, New York, NY. doi: 10.1007/978-1-4614-7320-6_795-2
  2. McDougal R, Hines M, Lytton W. (2013) Reaction-diffusion in the NEURON simulator. Front. Neuroinform. 7, 28. doi: 10.3389/fninf.2013.00028
  3. Newton AJH, McDougal RA, Hines ML, Lytton WW. (2018) Using NEURON for Reaction-Diffusion Modeling of Extracellular Dynamics. Front. Neuroinform. 12, 41. doi: 10.3389/fninf.2018.00041
  4. Dura-Bernal S, Suter B, Gleeson P, Cantarelli M, Quintana A, Rodriguez F, Kedziora DJ, Chadderdon GL, Kerr CC, Neymotin SA, McDougal R, Hines M, Shepherd GMG, Lytton WW. (2019) NetPyNE: a tool for data-driven multiscale modeling of brain circuits. eLife 8:e44494.
  5. Awile O, Kumbhar P, Cornu N, Dura-Bernal S, King JG, Lupton O, Magkanaris I, McDougal RA, Newton AJH, Pereira F, Săvulescu A, Carnevale NT, Lytton WW, Hines ML, Schürmann F. (2022) Modernizing the NEURON Simulator for Sustainability, Portability, and Performance. bioRxiv 2022.03.03.482816. doi: 10.1101/2022.03.03.482816
  6. Gleeson P, Cantarelli M, Quintana A, Earnsah M, Piasini E, Birgiolas J, Cannon RC, Cayco-Gajic A, Crook S, Davison AP, Dura-Bernal S, Ecker A, Hines ML, Idili G, Lanore F, Larson S, Lytton WW, Majumdar A, McDougal RA, Sivagnanam S, Solinas S, Stanislovas R, van Albada S, van Geit W, Silver RA. (2019) Open Source Brain: a collaborative resource for visualizing, analyzing, simulating and developing standardized models of neurons and circuits. Neuron. doi: 10.1016/j.neuron.2019.05.019

Schedule (10am-4pm Melbourne time; 8pm-2am NY time):

  • 10:00 - 10:15 - Bill Lytton - Overview: Implementing the Conceptual Model
  • 10:15 - 11:30 - Robert McDougal - NEURON scripting basics
  • 11:30 - 11:45 - coffee break
  • 11:45 - 12:45 - Adam Newton - Reaction-Diffusion
  • 12:45 - 1:15 - lunch break
  • 1:15 - 2:30 - Salvador Dura - NetPyNE GUI-based tutorials
  • 2:30 - 2:45 - coffee break
  • 2:45 - 4:00 - Salvador Dura - NetPyNE scripting tutorial
Back to top


T5: Characterizing neural dynamics using highly comparative time-series analysis.

  • Ben D. Fulcher -  School of Physics, The University of Sydney, Australia
  • Oliver Cliff -  School of Physics, The University of Sydney, Australia
  • Trent Henderson -  School of Physics, The University of Sydney, Australia
  • Annie Bryant -  School of Physics, The University of Sydney, Australia

Description of the tutorial

Massive open datasets of neural dynamics, from microscale neuronal circuits to macroscale population-level recordings, are becoming available to the computational neuroscience community. There are myriad ways to quantify the univariate dynamics of any individual component of a spatially distributed neural system, including methods from statistical time-series modeling, from the physics literature on nonlinear time-series analysis, and from information theory. Similarly, when analyzing multiple time series simultaneously (multivariate time series), there are hundreds of methods for quantifying pairwise dependencies between elements of the system, from Pearson correlations (as ‘functional connectivity’) through to distance correlations, wavelet coherence, dynamic time warping, and many others.

Across this highly interdisciplinary range of possible time-series analysis methods, each of which provides unique information about the measured dynamics, the choice of method for a given dataset is typically subjective, leaving open the possibility that alternative methods might yield better understanding or performance. In this tutorial, we will demonstrate the highly comparative approach to time-series analysis, in which the behavior and performance of large numbers of scientific analysis methods are compared on a dataset, providing a systematic means of selecting useful methods.
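
The logic of the approach can be shown in a few lines: extract a set of candidate features from each time series, then rank the features by how well they separate labeled classes. The sketch below uses just three toy features and a two-sample t-statistic; hctsa scales this idea to a library of over 7000 features:

```python
import numpy as np

rng = np.random.default_rng(0)

def features(x: np.ndarray) -> np.ndarray:
    """A tiny stand-in for an interdisciplinary feature library."""
    return np.array([
        np.std(x),                           # distributional spread
        np.corrcoef(x[:-1], x[1:])[0, 1],    # lag-1 autocorrelation
        np.mean(np.abs(np.diff(x))),         # mean successive change
    ])

def ar1(n: int, phi: float) -> np.ndarray:
    """A smoother AR(1) process, to contrast with white noise."""
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = phi * x[i - 1] + rng.standard_normal()
    return x

# Feature matrices for two toy classes of time series (20 series each)
class0 = np.array([features(rng.standard_normal(500)) for _ in range(20)])
class1 = np.array([features(ar1(500, 0.8)) for _ in range(20)])

# Rank each feature by a two-sample t-statistic (bigger = more discriminative)
pooled = np.sqrt((class0.var(0, ddof=1) + class1.var(0, ddof=1)) / 2)
t_stat = np.abs(class0.mean(0) - class1.mean(0)) / (pooled * np.sqrt(2 / 20))
print("feature ranking (best first):", np.argsort(t_stat)[::-1])
```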

Software tools:

In the tutorial, we will demonstrate three software tools:

  1. hctsa (Matlab): partially automates the selection of useful time-series analysis methods from an interdisciplinary library of over 7000 time-series features.
  2. theft (R): tools for time-series feature extraction and analysis, leveraging a range of open feature sets from python and R.
  3. pyspi (python): software for computing hundreds of statistics of pairwise interactions from multivariate time-series data.

These tools will be demonstrated on a range of applications: (i) determining the relationship between structural connectivity and fMRI dynamics in mouse and human, (ii) understanding the effects of targeted brain stimulation with DREADDs using mouse fMRI, (iii) classifying seizure dynamics from EEG, and (iv) extracting biomarkers of brain disorders from fMRI.

Background reading:

  1. Fulcher BD, Jones NS. hctsa: A computational framework for automated time-series phenotyping using massive feature extraction. Cell Systems. 2017 5:527.
  2. Cliff OM, Lizier JT, Tsuchiya N, Fulcher BD. Unifying Pairwise Interactions in Complex Dynamics. arXiv:220111941. 2022: http://arxiv.org/abs/2201.11941

Schedule:

  • Introduction to the approach and software

0–30 min: Introduction to highly comparative time-series analysis (univariate and bivariate).

30–60 min: Software implementations of feature-based time-series analysis: hctsa, theft, and pyspi, and examples of their application to problems in neuroscience.

  • –Break–
  • Brief software demos

1h 15 min – 1h 30 min: Demonstration of hctsa (Matlab) functionality [Ben Fulcher]

1h 30 min – 1h 45 min: Demonstration of theft (R) functionality [Trent Henderson]

1h 45 min – 2h: Demonstration of pyspi (python) functionality [Annie Bryant]

  • 2h-3h: Interactive session

Working through sample datasets with audience members.

Back to top

T6: GPU enhanced Neuronal Networks.

  • James C Knight - School of Engineering and Informatics, University of Sussex 
  • Thomas Nowotny - School of Engineering and Informatics, University of Sussex

Description of the tutorial

Fancy running your brain simulation 10x faster? The GPU enhanced Neuronal Networks (GeNN) software ecosystem [1,2] is freely available from https://github.com/genn-team/genn and provides an environment for GPU accelerated spiking neural network simulations. GeNN is capable of simulating large spiking neural network (SNN) models at competitive speeds, even on single, commodity GPUs. In GeNN, SNN models are described using a simple model description API, through which variables, parameters, and C-like code snippets describing various aspects of the model elements can be specified, e.g. neuron and synapse update equations or learning dynamics. Model elements of neuron and synapse types are combined into neuron and synapse populations to form a full spiking neural network model. GeNN takes the model description and generates optimised code to simulate the model. Current code-generation backends include CUDA for NVIDIA GPUs and OpenCL for other accelerators, as well as a C++ CPU-only mode.
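
To illustrate what "C-like code snippets" means in practice: the update rule a user hands to GeNN for a leaky integrate-and-fire neuron is essentially its Euler update. The sketch below shows that same mathematics as a plain NumPy reference implementation; this is not GeNN's actual API, just the update equation one would express as a GeNN code snippet:

```python
import numpy as np

# Leaky integrate-and-fire update, written out the way one would express it
# in a GeNN neuron code snippet (here in plain NumPy for illustration)
n, dt = 1000, 0.1                 # neurons, timestep (ms)
tau, v_thresh, v_reset = 20.0, 1.0, 0.0

rng = np.random.default_rng(1)
v = np.zeros(n)
spike_counts = np.zeros(n, dtype=int)

for _ in range(int(200 / dt)):    # 200 ms of simulation
    i_in = 0.06 * rng.standard_normal(n) + 0.055   # noisy input current
    v += dt * (-v / tau + i_in)   # the 'code snippet': dV = dt*(-V/tau + I)
    spiked = v >= v_thresh        # threshold crossing
    v[spiked] = v_reset           # reset
    spike_counts[spiked] += 1

print(f"mean firing rate: {spike_counts.mean() / 0.2:.1f} Hz")
```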

In recent years the GeNN ecosystem has expanded rapidly with a Python wrapper, PyNN interface, OpenCL backend and, most recently, mlGeNN [3], an interface to machine learning workflows. Furthermore, large strides have been made with concurrent initialisation methods [4], improved recording of results [5] and “procedural connectivity” [6].

In this tutorial we will first introduce GeNN and then walk through two tutorials that give a flavour of the capabilities and user experience of GeNN and enable you to start using it for your own work. The first tutorial will cover the basic usage of GeNN to simulate a population of neurons with different parameters, and the second will cover the development of an insect mushroom body-inspired model for MNIST classification. The tutorials are available on Google Colab without the need for installation, but we will also help you to install GeNN on your own computer for use in your own work.

Software tools:

Background reading:

  1. E. Yavuz, J. Turner and T. Nowotny (2016). GeNN: a code generation framework for accelerated brain simulations. Scientific Reports 6:18854. doi: 10.1038/srep18854
  2. GeNN, https://github.com/genn-team/genn
  3. J. Turner, J. C. Knight, A. Subramanian and T. Nowotny, T. (2022). mlGeNN: accelerating SNN inference using GPU-enabled neural networks. Neuromorphic Computing and Engineering 2(2). doi: 10.1088/2634-4386/ac5ac5
  4. J. C. Knight, T. Nowotny (2018) GPUs outperform current HPC and neuromorphic solutions in terms of speed and energy when simulating a highly-connected cortical model. Front Neurosci 12:941. doi: 10.3389/fnins.2018.00941
  5. J. C. Knight, A. Komissarov, T. Nowotny (2021) PyGeNN: A Python Library for GPU-Enhanced Neural Networks. Frontiers in Neuroinformatics 15: 10. doi: 10.3389/fninf.2021.659005
  6. J. C. Knight and T. Nowotny (2021) Larger GPU-accelerated brain simulations with procedural connectivity. Nat Comput Sci 1(2): 136-42. doi: 10.1038/s43588-020-00022-7

Schedule:

  1. Introduction of GeNN (10 min)
  2. CompNeuro 101 neurons tutorial walk-through (20 min)
  3. Interactive programming / exercises & installation (20 min)
  4. Break (10 min)
  5. Insect-inspired MNIST classification
    1. Projection Neurons tutorial walk-through (15 mins)
    2. Interactive programming (20 mins)
    3. Kenyon Cells tutorial walk-through (15 mins)
    4. Interactive programming (20 mins)
    5. Output neurons and testing tutorial walk-through (15 mins)
    6. Interactive programming / exercises (35 mins)

Back to top

T7: Spectral analysis of neural signals.

  •  Axel Hutt

Description of the tutorial

The spectral analysis of observed neural activity is essential in a large part of experimental research. To apply the by-now large number of different spectral analysis techniques successfully, it is important to understand the fundamental aspects of spectral analysis methods in detail.

The tutorial is targeted at experimentalists at all levels and will only touch on theoretical details. It will be a hands-on tutorial based on practical problems. As additional support, Python source-code scripts will be provided for several of the analysis problems discussed in the tutorial. These scripts allow participants to implement the different techniques discussed themselves and support further understanding.
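
In the same spirit as the provided scripts, here is a minimal SciPy sketch of two techniques on the schedule: a Welch power-spectral-density estimate and the Hilbert-transform analytic signal. The signal and parameters are toy choices, not the tutorial's examples:

```python
import numpy as np
from scipy.signal import welch, hilbert

fs = 250.0                                  # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)

# Toy 'neural' signal: a 10 Hz rhythm buried in white noise
x = np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)

# Power spectral density via Welch's method; the window length trades
# frequency resolution against variance of the estimate
f, pxx = welch(x, fs=fs, nperseg=512)
print(f"spectral peak at {f[np.argmax(pxx)]:.1f} Hz")

# Analytic signal via the Hilbert transform -> instantaneous phase/frequency
analytic = hilbert(x)
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi)
print(f"median instantaneous frequency: {np.median(inst_freq):.1f} Hz")
```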

Software tools:

  • https://www.geocities.ws/digitalbath/Spectral_Analysis_slides_1.pdf
  • https://www.geocities.ws/digitalbath/Spectral_Analysis_slides_2.pdf
  • https://www.geocities.ws/digitalbath/Spectral_Analysis_slides_3.pdf

Background reading:

  1. S. Mallat, A wavelet tour of signal processing: the sparse way (1998), 3rd edition, Academic Press, London
  2. B. Boashash, Estimating and Interpreting The Instantaneous Frequency of a Signal, Part 1: Fundamentals, Proc. IEEE 80(4): 520-538(1992)

Schedule:

  • 1:30 pm: Fundamentals of sampling theory, Fourier theory and related artifacts (aliasing, spectral leakage)
  • 2:30 pm: Linear filters and spectral power in stationary signals
  • 3:00 pm: Break
  • 3:10 pm: Spectral power in non-stationary signals: windowed Fourier transform and time-frequency spectral analysis [1].
  • 4:00 pm: The concept of the analytic signal [2]: Hilbert transform, phase synchronization, Empirical Mode Decomposition

Back to top

T8: Introduction to the Brain Dynamics Toolbox.

  • Stewart Heitmann - Victor Chang Cardiac Research Institute, Australia.

Description of the showcase

The Brain Dynamics Toolbox (bdtoolbox.org) is open-source MATLAB software for simulating dynamical systems in neuroscience – specifically, initial-value problems in systems of Ordinary Differential Equations (ODEs), Delay Differential Equations (DDEs), Stochastic Differential Equations (SDEs) and Partial Differential Equations (PDEs). Users define their equations as custom MATLAB functions, which are then loaded into the graphical interface for simulation. Interchangeable solvers and display panels can be applied to any system with no additional programming effort. Large-scale simulations can also be scripted with command-line tools. This software showcase aims to introduce the toolbox to a wide audience through a series of real-time demonstrations. The audience will learn how to get started with the toolbox, how to run existing models, and how to semi-automate the controls to generate a bifurcation diagram. Further training courses are available from the bdtoolbox.org website.
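
The toolbox itself is MATLAB, but the class of problem it targets is easy to show in any language. The sketch below mirrors the same workflow (write the right-hand side of an ODE system, hand it to a general-purpose solver) using SciPy, with the FitzHugh-Nagumo equations as an illustrative system:

```python
import numpy as np
from scipy.integrate import solve_ivp

# FitzHugh-Nagumo neuron model: the user supplies only the right-hand side,
# mirroring how Brain Dynamics Toolbox models define their ODEs in MATLAB
def fhn(t, y, I=0.5, a=0.7, b=0.8, tau=12.5):
    v, w = y
    dv = v - v**3 / 3 - w + I     # fast voltage-like variable
    dw = (v + a - b * w) / tau    # slow recovery variable
    return [dv, dw]

# Solve the initial-value problem and report the oscillation amplitude
sol = solve_ivp(fhn, t_span=(0, 200), y0=[-1.0, 1.0], max_step=0.1)
print(f"v oscillates between {sol.y[0].min():.2f} and {sol.y[0].max():.2f}")
```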

Background reading and software tools:

  1. Brain Dynamics Toolbox Website. https://bdtoolbox.org
  2. Heitmann S, Breakspear M (2022) Handbook for the Brain Dynamics Toolbox: Version 2022. bdtoolbox.org. 7th Edition, ISBN 978-0-6450669-2-0.
  3. Heitmann S, Aburn M, Breakspear M (2017) The Brain Dynamics Toolbox for Matlab. Neurocomputing. Vol 315. p82-88. doi:10.1016/j.neucom.2018.06.026.

Schedule:

All times listed are Melbourne, Australia Time Zone (AEST). 

  • 15:30 – 16:00 – Showcase

Back to top