Graeme Moffat
PhD, VP of Scientific & Regulatory Affairs, Muse, Interaxon Brain Sensing Technologies
Interaxon builds the world's bestselling EEG system and leading consumer neurotechnology platform, Muse. By integrating sparse, wearable neurotechnology and large-scale neurodata into everyday experiences for consumers, Muse makes brain health research and neuroinformatics possible at an unprecedented scale.
Misha Benjamin
Legal counsel, Element AI
Element AI is a Montreal-based platform that advances cutting-edge AI research and turns it into scalable products that make businesses safer, stronger, and more agile. Misha Benjamin completed his Civil and Common Law degrees at the University of Ottawa, and started his career as an articling student and then associate in the corporate and commercial group at BLG.
Christian Dansereau
CEO & Co-founder, Montreal-based Perceiv AI
Perceiv AI, a data-driven precision-medicine company, provides AI-based actionable biomarkers for the early detection of Alzheimer's disease and dementia progression at the prodromal stage.
Richard Gold
James McGill Professor, Faculty of Law, McGill University
Dr. Richard Gold was the founding Director of the Centre for Intellectual Property Policy. He teaches in the area of intellectual property and innovation. His research centres on the nexus between innovation, development, and commerce, particularly with respect to biotechnology.
Yoshua Bengio
University of Montréal
Bridging the gap between brains, cognition and deep learning
We start by reviewing connectionist ideas from three decades ago which have fuelled a revolution in artificial intelligence with the rise of deep learning methods. We also discuss new ideas from deep learning, including the newly acquired theoretical understanding of the advantages brought by jointly optimizing a deep architecture.
Moritz Helmstaedter
Max Planck Institute for Brain Research
Cerebral Cortex Connectomics
Brains are highly interconnected networks of millions to billions of neurons. For a century, we have not been able to map these connectivity networks. Only recently, using novel electron microscopy techniques and machine-learning-based data analysis, has the mapping of neuronal networks become possible at larger scales.
Sharon Crook
Arizona State University
Reproducibility and rigor in computational neuroscience: Testing the data-driven model
As computational models in neuroscience increase in complexity, there are additional barriers to their creation, exchange, and re-use. Successful projects have created standards and open-source tools to address these issues, but specific, rigorous criteria for evaluating models against experimental data during model development remain rare.
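As an illustration of the kind of quantitative, criterion-based check the abstract calls for, here is a minimal sketch of a model-vs-data validation test. All names and values are hypothetical; frameworks such as SciUnit formalize this pattern far more thoroughly.

```python
# A minimal, generic sketch of a data-driven model validation test
# (illustrative only; not a specific tool's API).
from dataclasses import dataclass

@dataclass
class Observation:
    mean: float   # experimentally measured value
    std: float    # experimental standard deviation

def z_score_test(model_prediction: float, obs: Observation,
                 threshold: float = 2.0) -> bool:
    """Pass if the prediction lies within `threshold` SDs of the data."""
    z = abs(model_prediction - obs.mean) / obs.std
    return z <= threshold

# Hypothetical example: firing rate (Hz) predicted by a neuron model
observed_rate = Observation(mean=12.0, std=1.5)
predicted_rate = 13.1
print("model passes:", z_score_test(predicted_rate, observed_rate))
```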
Yiota Poirazi
Foundation for Research and Technology-Hellas
Session: Developing data-driven models of synapses and neurons
Dendritic contributions to complex functions: insights from computational modeling
My lab (www.dendrites.gr) uses computational modelling approaches to investigate the role of dendrites in learning and memory processes. Our models range in complexity from detailed biophysical single cells to reduced microcircuits and large-scale simplified neuronal networks. Brain areas of interest include the hippocampus, the amygdala, the prefrontal cortex, and the visual cortex.
Russ Poldrack
Stanford University
Session: Brain imaging standards and best practices
Towards a robust data organization scheme for neuroimaging: the Brain Imaging Data Structure
The field of neuroimaging has been at the vanguard of data sharing, but the utility of shared data has been limited by the lack of standards for data and metadata organization. I will describe our data sharing efforts via the OpenfMRI and OpenNeuro projects, and how those efforts have been supported by the development of a community standard for data organization: the Brain Imaging Data Structure (BIDS) standard.
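For readers unfamiliar with BIDS, the sketch below creates a minimal dataset skeleton illustrating its key-value file-naming convention and required top-level metadata. The dataset name, subject, and task labels are illustrative, not taken from the talk.

```python
# A minimal BIDS-style dataset skeleton (labels are hypothetical).
from pathlib import Path
import json

root = Path("my_bids_dataset")          # hypothetical dataset root
anat = root / "sub-01" / "anat"
func = root / "sub-01" / "func"
for d in (anat, func):
    d.mkdir(parents=True, exist_ok=True)

# Required top-level metadata file
(root / "dataset_description.json").write_text(
    json.dumps({"Name": "Example dataset", "BIDSVersion": "1.8.0"}, indent=2))

# Imaging files follow the key-value naming convention:
#   sub-<label>[_task-<label>]_<suffix>.<extension>
(anat / "sub-01_T1w.nii.gz").touch()
(func / "sub-01_task-rest_bold.nii.gz").touch()
(func / "sub-01_task-rest_bold.json").write_text(
    json.dumps({"RepetitionTime": 2.0, "TaskName": "rest"}, indent=2))
```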
Camille Maumet
Inria
Session: Brain imaging standards and best practices
Tools and standards to make neuroimaging derived data reusable
Neuroimaging is becoming increasingly collaborative. More and more brain imaging datasets are made available to the community, effectively creating a massive distributed resource for neuroscientists. But to make the best use of this asset, we need tools and standards to model and understand the diverse sources of variability present in these data. This talk will discuss recent initiatives to represent and share neuroimaging metadata and how these can help us leverage heterogeneous datasets.
Andrew Davison
Centre National de la Recherche Scientifique (CNRS)
Session: Reproducible neuroscience + open science
Improving reproducibility and reuse in computational and systems neuroscience
I will survey recent initiatives to improve reproducibility in computational and systems neuroscience, focusing on automatic capture of workflow provenance, on model sharing, and on community-driven open source tool development.
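In the spirit of the provenance-capture initiatives mentioned, here is a generic sketch of recording what was run, with which parameters, and in which environment. It is not the API of any particular tool; systems such as Sumatra do this automatically and in far more detail.

```python
# A generic sketch of workflow provenance capture (illustrative only).
import json
import platform
import subprocess
import sys
from datetime import datetime, timezone

def current_git_commit():
    """Return the current git commit hash, or None outside a repository."""
    try:
        out = subprocess.run(["git", "rev-parse", "HEAD"],
                             capture_output=True, text=True)
        return out.stdout.strip() or None
    except OSError:
        return None

def run_with_provenance(func, params, record_file="provenance.json"):
    """Run func(**params) and save a machine-readable provenance record."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "python": sys.version,
        "platform": platform.platform(),
        "git_commit": current_git_commit(),
        "parameters": params,
    }
    result = func(**params)
    record["result_summary"] = repr(result)
    with open(record_file, "w") as f:
        json.dump(record, f, indent=2)
    return result

# Hypothetical usage: a toy "analysis" with one parameter
result = run_with_provenance(lambda gain: gain * 2.0, {"gain": 1.5})
```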
Michel Dumontier
Maastricht University
Session: Can we harmonize metadata?
Accelerating biomedical discovery with FAIR
Kenneth Harris
University College London
Session: Computational infrastructure for neuroscience: automation / pipelines
High-Dimensional Geometry of the Cortical Population Code as Revealed by 10,000-Cell Recordings
We used 2-photon calcium imaging and improved analysis methods to record the responses of >10,000 neurons in the visual cortex of awake mice to thousands of natural images. The recorded population code was high-dimensional, with the variance of its dimensions following a power law.
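As an illustration of this style of analysis, the sketch below estimates the eigenspectrum of a population response matrix and fits a power-law decay exponent. The data are synthetic (constructed to have a ~1/n spectrum), not the recordings from the talk.

```python
# Sketch: eigenspectrum of a neural population code and its power-law fit.
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_stimuli = 1000, 2000

# Synthetic responses whose covariance eigenvalues decay roughly as 1/n
latent = rng.standard_normal((n_neurons, n_stimuli))
scales = np.arange(1, n_neurons + 1) ** -0.5   # sqrt of a 1/n spectrum
responses = latent * scales[:, None]

# Eigenspectrum: eigenvalues of the neuron-by-neuron covariance matrix
cov = np.cov(responses)
eigvals = np.linalg.eigvalsh(cov)[::-1]        # descending order

# Fit the exponent alpha in eigval_n ~ n^(-alpha) on a log-log scale
n = np.arange(1, len(eigvals) + 1)
mask = eigvals > 0
slope, _ = np.polyfit(np.log(n[mask][:500]),
                      np.log(eigvals[mask][:500]), 1)
print(f"estimated power-law exponent: {-slope:.2f}")  # ~1 by construction
```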
Upinder Bhalla
National Centre for Biological Sciences
Session: Standardization in multiscale modeling – connecting the levels
From neural recordings to subcellular neuronal computation.
Neural circuit activity is as much an outcome of subcellular and molecular computation as it is of spike integration. We are interested in three closely coupled problems: how to simulate multiscale neuronal models that span molecular, electrical, and structural events; how to define such models; and how to systematically parameterize them.
Fernando Perez
University of California, Berkeley
Session: Computational infrastructure for neuroscience: automation / pipelines
Open science and reproducible research on Jupyter
Project Jupyter, evolved from the IPython environment, provides a platform for interactive computing that is widely used today in research, education, journalism, and industry. The core premise of the Jupyter architecture is to design tools around the experience of interactive computing. It provides an environment, protocol, file format, and libraries optimized for the computational process when there is a human in the loop, iterating live on ideas and data with the computer's assistance.
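To make the "file format" concrete: a Jupyter notebook (.ipynb) is a plain JSON document containing an ordered list of cells. The minimal, hand-written example below is valid nbformat 4; the filename and cell contents are illustrative.

```python
# A minimal Jupyter notebook file written by hand (.ipynb is plain JSON).
import json

notebook = {
    "nbformat": 4,
    "nbformat_minor": 4,
    "metadata": {
        "kernelspec": {"name": "python3",
                       "display_name": "Python 3",
                       "language": "python"}
    },
    "cells": [
        # Markdown cells carry narrative; code cells carry executable steps.
        {"cell_type": "markdown", "metadata": {},
         "source": "# Analysis notes\nNarrative lives next to the code."},
        {"cell_type": "code", "metadata": {},
         "execution_count": None, "outputs": [],
         "source": "print('a reproducible step')"},
    ],
}

with open("example.ipynb", "w") as f:
    json.dump(notebook, f, indent=1)
```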
Adriana Romero Soriano
McGill University
Session: Machine learning in neuroscience
Deep learning for genomics and graph-structured data
In recent years, deep learning has achieved promising results in medical imaging analysis. However, in order to fully exploit the richness of healthcare data, new models able to deal with a variety of modalities have to be designed.
Ivan Soltesz
Stanford University
Pierre Bellec
Université de Montréal
Session: Reproducible neuroscience + open science
Dealing with clinical heterogeneity in the discovery of new biomarkers of Alzheimer’s disease
Simon Schultz
Imperial College London
Session: Building a framework for understanding circuit function
From photon to pipette: optical and electrophysiological tools for studying intact neural circuits.
Advances in neurotechnology are revolutionising our ability to determine how neural circuit function underpins behavioural phenomena such as perception, memory and action, and how its dysfunction leads to disruption of those behaviours. In particular, optical approaches, in combination with electrophysiology, allow the role of targeted individual circuit elements in behaviour to be studied.
Nolan Nichols
Genentech
Session: Can we harmonize metadata?
Meaningful (meta)data at scale: removing barriers to precision medicine research
Randomized controlled trials (RCTs) are the gold standard for evaluating therapeutics in patient populations. The data collected during RCTs include a wealth of clinical measures, biomarkers, and tissue samples – the analysis of which can lead to the approval of new medicines that improve the lives of patients.
Gael Varoquaux
Inria
Session: Machine learning in neuroscience
Towards psychoinformatics with machine learning and brain imaging
Informatics in the psychological sciences brings fascinating challenges, as concepts and pathologies have fuzzy boundaries and are hard to quantify. Brain imaging provides rich data on the neural substrate of these concepts, yet linking the two is non-trivial.
Tatyana Sharpee
Computational Neurobiology Laboratory, Salk Institute for Biological Studies
Session: Building a framework for understanding circuit function
Cortical representation of natural stimuli
In this talk I will describe our recent findings of several organizing principles for how natural visual and auditory stimuli are represented across stages of cortical processing. For visual processing, I will describe how signals in the secondary cortical visual area build on the outputs provided by the first cortical visual area, and how they relate to the representations found in subsequent visual areas, such as area V4.
Shaul Druckmann
Stanford University
Session: Building a framework for understanding circuit function
Relating circuit dynamics to computation: robustness and dimension-specific computation in cortical dynamics
Sonia Israel
Co-founder, Montreal-based Aifred Health
Aifred Health is a Montreal-based start-up that uses artificial intelligence (AI) to help physicians make better treatment decisions in mental health. Founded in August 2017, the company is initially developing clinical decision aids for personalized treatment selection for major depression, and plans to undertake ease-of-use and open-label trials in early 2019.
Randy McIntosh
The Virtual Brain
The Virtual Brain, developed on a non-profit model, is an open-source simulation platform that brings together computational, cognitive, and clinical neuroscientists. It can be run on any laptop, is scalable to HPC clusters, and has over 10,000 installations worldwide.