Article ID: 5631160
Journal: NeuroImage
Published Year: 2017
Pages: 16
File Type: PDF
Abstract

• Neuroscience started collecting multi-modal datasets from thousands of individuals.
• Non-parametric models will increase neurobiological insight as data accumulate.
• Generative models will reveal candidate mechanisms underlying behavior and disease.
• The advantages of frequentist and Bayesian modeling will be more often combined.
• Null-hypothesis testing and out-of-sample generalization will draw formal inference.

Neuroscience is undergoing faster changes than ever before. For over 100 years, our field qualitatively described and invasively manipulated single or a few organisms to gain anatomical, physiological, and pharmacological insights. In the last 10 years, neuroscience has spawned quantitative datasets of unprecedented breadth (e.g., microanatomy, synaptic connections, and optogenetic brain-behavior assays) and size (e.g., cognition, brain imaging, and genetics). While growing data availability and information granularity have been amply discussed, we direct attention to a less explored question: how will this unprecedented data richness shape data-analysis practices? Statistical reasoning is becoming more important for distilling neurobiological knowledge from healthy and pathological brain measurements. We argue that large-scale data analysis will rely more on statistical models that are non-parametric, generative, and mixed frequentist-Bayesian, while supplementing classical hypothesis testing with out-of-sample predictions.
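The abstract's central contrast between classical null-hypothesis testing and out-of-sample generalization can be made concrete with a small sketch. This is our illustration, not code from the paper: the two-group data are simulated, and NumPy, SciPy, and scikit-learn are assumed available.

# A minimal sketch (illustrative, not from the article) contrasting the two
# inference modes discussed above: an in-sample null-hypothesis test versus
# cross-validated out-of-sample prediction. All data are simulated.
import numpy as np
from scipy import stats
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Simulate a brain measurement for two groups (e.g., patients vs. controls)
# with a modest true group effect on the first of 10 imaging-derived features.
n = 200
X = rng.normal(size=(n, 10))       # 10 features per individual
y = rng.integers(0, 2, size=n)     # group labels
X[y == 1, 0] += 0.5                # group difference on feature 0

# 1) Classical inference: does the first feature differ between groups
#    in the sample at hand?
t, p = stats.ttest_ind(X[y == 0, 0], X[y == 1, 0])
print(f"null-hypothesis test: t = {t:.2f}, p = {p:.4f}")

# 2) Predictive inference: does a model fit on some individuals predict
#    group membership in held-out individuals? Cross-validated accuracy
#    estimates this out-of-sample generalization.
acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
print(f"out-of-sample accuracy: {acc.mean():.2f} +/- {acc.std():.2f}")

The t-test quantifies evidence against a null hypothesis within the observed sample, whereas the cross-validated accuracy asks whether the learned pattern generalizes to unseen individuals, which is the supplementary criterion the abstract argues for.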

Related Topics
Life Sciences, Neuroscience, Cognitive Neuroscience
Authors