Yoshua Bengio

University of Montréal


Bridging the gap between brains, cognition and deep learning
We start by reviewing connectionist ideas from three decades ago that have fuelled a revolution in artificial intelligence with the rise of deep learning methods. We then discuss new ideas from deep learning, including the recently acquired theoretical understanding of the advantages brought by jointly optimizing a deep architecture.

Moritz Helmstaedter

Max Planck Institute for Brain Research

Will talk about: Reconstruction approaches and new discoveries

Sharon Crook

Arizona State University

Will talk about: Standardization of circuit models

Karel Svoboda

Howard Hughes Medical Institute, Janelia Research Campus

Yiota Poirazi

Foundation for Research and Technology - Hellas
Session: developing data-driven models of synapses and neurons


Dendritic contributions to complex functions: insights from computational modeling
My lab (www.dendrites.gr) uses computational modelling to investigate the role of dendrites in learning and memory. Our models range in complexity from detailed biophysical single cells to reduced microcircuits and large-scale simplified neuronal networks. Brain areas of interest include the hippocampus, the amygdala, the prefrontal cortex and the visual cortex.

Russ Poldrack

Stanford University
Session: brain imaging standards and best practices


Towards a robust data organization scheme for neuroimaging: the Brain Imaging Data Structure
The field of neuroimaging has been at the vanguard of data sharing, but the utility of shared data has been limited by the lack of standards for data and metadata organization. I will describe our data sharing efforts via the OpenfMRI and OpenNeuro projects, and how those efforts have been supported by the development of a community standard for data organization: the Brain Imaging Data Structure (BIDS).
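A minimal sketch of what a BIDS-organized dataset looks like on disk may help; the subject label, task name, and dataset fields below are illustrative, and real datasets carry more required metadata than shown here:

```python
import json
import tempfile
from pathlib import Path

# Build a minimal, illustrative BIDS-style layout for one subject.
root = Path(tempfile.mkdtemp())

# Required top-level metadata file describing the dataset.
(root / "dataset_description.json").write_text(json.dumps({
    "Name": "Example dataset",       # illustrative
    "BIDSVersion": "1.8.0",
}))

# Anatomical and functional data live in per-subject folders, with
# filenames that encode the subject, task, and data type ("suffix").
for rel in [
    "sub-01/anat/sub-01_T1w.nii.gz",
    "sub-01/func/sub-01_task-rest_bold.nii.gz",
]:
    path = root / rel
    path.parent.mkdir(parents=True, exist_ok=True)
    path.touch()  # empty placeholder instead of real imaging data

print(sorted(p.relative_to(root).as_posix()
             for p in root.rglob("*") if p.is_file()))
```

Because the standard encodes subject, task, and data type directly in folder and file names, tools can locate and interpret data without dataset-specific configuration.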

Camille Maumet

Session: brain imaging standards and best practices

Neuroimaging is becoming increasingly collaborative. More and more brain imaging datasets are made available to the community, effectively creating a massive distributed resource for neuroscientists. But to make the best use of this asset, we need tools and standards to model and understand the diverse sources of variability present in these data. This talk will discuss recent initiatives to represent and share neuroimaging metadata and how these can help us leverage heterogeneous datasets.

Andrew Davison

Centre National de la Recherche Scientifique (CNRS)
Session: reproducible neuroscience + open science


Improving reproducibility and reuse in computational and systems neuroscience
I will survey recent initiatives to improve reproducibility in computational and systems neuroscience, focusing on automatic capture of workflow provenance, on model sharing, and on community-driven open source tool development.

Michel Dumontier

Maastricht University
Session: can we harmonize metadata?

Will talk about: computational methods for scalable integration and reproducible analysis of FAIR (Findable, Accessible, Interoperable and Reusable) data across scales – from molecules, tissues, organs, individuals and populations to the environment.

Kenneth Harris

University College London
Session: computational infrastructure for neuroscience: automation / pipelines


High-Dimensional Geometry of the Cortical Population Code as Revealed by 10,000-Cell Recordings
We used two-photon calcium imaging and improved analysis methods to record the responses of more than 10,000 neurons in the visual cortex of awake mice to thousands of natural images. The recorded population code was high-dimensional, with the variance of its dimensions following a power law.
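As a hedged illustration of the kind of analysis involved (not the speakers' actual pipeline), one can estimate the power-law exponent of a population code's eigenspectrum from a stimuli-by-neurons response matrix; the synthetic data and all sizes below are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a stimuli x neurons response matrix whose
# latent dimensions have power-law variances with exponent alpha = 1,
# roughly as reported for the visual cortical population code.
# Sizes here are illustrative, far smaller than the real recordings.
n_stimuli, n_neurons, n_dims, alpha = 1000, 400, 200, 1.0
u, _ = np.linalg.qr(rng.standard_normal((n_stimuli, n_dims)))
v, _ = np.linalg.qr(rng.standard_normal((n_neurons, n_dims)))
sing = np.arange(1, n_dims + 1) ** (-alpha / 2.0)
responses = (u * sing) @ v.T

# PCA via SVD of the mean-centered data: the squared singular values
# give the variance along each principal dimension.
centered = responses - responses.mean(axis=0)
variances = np.linalg.svd(centered, compute_uv=False) ** 2 / (n_stimuli - 1)

# Fit the power-law exponent as the slope in log-log coordinates,
# skipping the first few dimensions where edge effects are largest.
dims = np.arange(5, 150)
slope, _ = np.polyfit(np.log(dims + 1), np.log(variances[dims]), 1)
print(f"estimated power-law exponent: {-slope:.2f}")
```

The fitted exponent should come out close to the built-in alpha; with real recordings the interesting question is what exponent the data themselves exhibit.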

Upinder Bhalla

National Centre for Biological Sciences
Session: standardization in multiscale modeling – connecting the levels


From neural recordings to subcellular neuronal computation
Neural circuit activity is as much an outcome of subcellular and molecular computation as it is of spike integration. We are interested in three closely coupled problems: how to simulate multiscale neuronal models that span molecular, electrical, and structural events; how to define such models; and how to systematically parameterize them.
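One common strategy for coupling such levels, shown here as a toy sketch under assumed dynamics rather than the speaker's method, is to integrate the fast electrical variables with a small time step and the slow chemical variables with a larger one, exchanging values at the coarse step boundaries:

```python
# Two-timescale (multirate) integration sketch; the dynamics and
# all constants below are illustrative, not a real neuron model.
dt_fast, dt_slow = 0.01, 1.0   # ms; illustrative step sizes
v, ca = -65.0, 0.1             # membrane potential (mV), [Ca2+] (uM)

def dv_dt(v, ca):
    # Fast electrical variable: leak toward rest plus a small
    # calcium-dependent current.
    return -0.1 * (v + 65.0) + 0.5 * ca

def dca_dt(v, ca):
    # Slow chemical variable: voltage-dependent influx minus
    # first-order removal.
    influx = 0.01 * max(v + 60.0, 0.0)
    return influx - 0.05 * ca

t = 0.0
while t < 100.0:  # total simulated time in ms
    # Many fast (electrical) steps per slow (chemical) step,
    # with the slow variable held fixed in between.
    for _ in range(int(dt_slow / dt_fast)):
        v += dt_fast * dv_dt(v, ca)
    ca += dt_slow * dca_dt(v, ca)
    t += dt_slow

print(f"v = {v:.2f} mV, ca = {ca:.3f} uM")
```

Real multiscale simulators handle far harder versions of this coupling, including stochastic chemistry and structural change, but the step-size mismatch shown here is the core of the problem.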

Fernando Perez

University of California, Berkeley
Session: computational infrastructure for neuroscience: automation / pipelines


Open science and reproducible research on Jupyter
Project Jupyter, which evolved from the IPython environment, provides a platform for interactive computing that is widely used today in research, education, journalism and industry. The core premise of the Jupyter architecture is to design tools around the experience of interactive computing: it provides an environment, protocol, file format and libraries optimized for computation with a human in the loop, iterating live on ideas and data with the computer's assistance.
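The file format in question is plain JSON (nbformat v4), which is part of what makes notebooks shareable and diffable; a minimal notebook document can be sketched as follows, with made-up cell content:

```python
import json

# A minimal notebook document in Jupyter's on-disk JSON format
# (nbformat v4). Field names follow the published format; the
# cell contents are illustrative.
notebook = {
    "nbformat": 4,
    "nbformat_minor": 5,
    "metadata": {},
    "cells": [
        {
            "cell_type": "markdown",
            "metadata": {},
            "source": ["# A reproducible analysis\n"],
        },
        {
            "cell_type": "code",
            "metadata": {},
            "execution_count": None,  # serializes to null
            "outputs": [],
            "source": ["print(2 + 2)\n"],
        },
    ],
}

# Writing this string to a .ipynb file yields a notebook that
# Jupyter can open.
serialized = json.dumps(notebook, indent=1)
print(len(serialized), "bytes")
```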


Doina Precup

McGill University
Session: machine learning in neuroscience

A pioneer in reinforcement learning, an area of machine learning in which programs learn to solve problems through rewards that encourage desired behaviour.
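The idea can be illustrated with tabular Q-learning, a classic reinforcement-learning algorithm, on a made-up five-state chain task; this is a generic textbook sketch, not Precup's own work:

```python
import random

random.seed(0)

# Toy 5-state chain: start at state 0, step left or right, and
# receive reward 1 only on reaching state 4. All constants are
# illustrative textbook choices.
N_STATES, GOAL = 5, 4
ACTIONS = (-1, +1)

q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
step_size, gamma, epsilon = 0.5, 0.9, 0.3

for _ in range(500):  # episodes
    s = 0
    while s != GOAL:
        # Epsilon-greedy: mostly exploit the current value
        # estimates, sometimes explore a random action.
        if random.random() < epsilon:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: q[(s, act)])
        s2 = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s2 == GOAL else 0.0
        # Temporal-difference update toward reward plus discounted
        # value of the best next action (zero at the terminal state).
        best_next = 0.0 if s2 == GOAL else max(q[(s2, b)] for b in ACTIONS)
        q[(s, a)] += step_size * (r + gamma * best_next - q[(s, a)])
        s = s2

# The learned greedy policy should prefer stepping right (toward
# the reward) in every non-terminal state.
policy = [max(ACTIONS, key=lambda act: q[(s, act)]) for s in range(GOAL)]
print(policy)
```

The agent is never told how to reach the goal; the reward alone, propagated backward through the value estimates, shapes the behaviour.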

Pierre Bellec

Université de Montréal
Session: reproducible neuroscience + open science
Will talk about: the generalizability of biomarkers of Alzheimer’s disease – and will present the first release of standardized biomarkers from the Canadian Consortium on Neurodegeneration in Aging.