Statistics Colloquium: Scott Linderman, Stanford University

This event is part of the Spring 2022 Statistics Colloquium


Point Process Models for Sequence Detection in Neural Spike Trains

Presented by Scott Linderman, Assistant Professor, Department of Statistics, Stanford University

Wednesday, April 13, 2022
4:00 p.m. ET
Online

Sparse sequences of neural spikes are posited to underlie aspects of working memory, motor production, and learning. Discovering these sequences in an unsupervised manner is a longstanding problem in statistical neuroscience. I will present our recent work using Neyman-Scott processes, a class of doubly stochastic point processes, to model sequences as a set of latent, continuous-time, marked events that produce cascades of neural spikes. Bayesian inference in this model requires integrating over the set of latent events, akin to inference in mixture of finite mixtures (MFM) models and Dirichlet process mixture models (DPMMs). I will show how recent work on MFMs can be adapted to develop a collapsed Gibbs sampling algorithm for Neyman-Scott processes. Finally, I will present an empirical assessment of the model and algorithm on spike-train recordings from songbird HVC and rodent basal ganglia, which suggests novel connections between sequential activity in the brain and the generation of natural behavior.
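To make the generative structure in the abstract concrete, below is a minimal simulation sketch of a Neyman-Scott-style spike-train model: latent sequence events are drawn from a Poisson process, and each event triggers a cascade of spikes across neurons. All parameter names and values (event rate, per-neuron delays, jitter, background rate) are illustrative assumptions for this sketch, not details of the speaker's model or inference algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Illustrative parameters (assumptions, not from the talk) ---
T = 10.0            # recording length (s)
N = 5               # number of neurons
lam_events = 0.5    # rate of latent sequence events (events / s)
lam_bg = 0.2        # background spike rate per neuron (spikes / s)
A = 20.0            # expected spikes per neuron evoked by one event
delays = np.linspace(0.0, 0.5, N)  # per-neuron offset within a sequence (s)
width = 0.02        # temporal jitter of evoked spikes (s)

# 1) Latent events: a homogeneous Poisson process on [0, T].
n_events = rng.poisson(lam_events * T)
event_times = rng.uniform(0.0, T, size=n_events)

# 2) Each latent event produces a cascade of spikes: for every neuron,
#    a Poisson number of spikes jittered around that neuron's delay.
spikes = []  # list of (time, neuron) pairs
for t_event in event_times:
    for n in range(N):
        n_spk = rng.poisson(A)
        times = t_event + delays[n] + width * rng.standard_normal(n_spk)
        spikes += [(t, n) for t in times if 0.0 <= t <= T]

# 3) Background spikes: an independent homogeneous Poisson process per neuron.
for n in range(N):
    n_bg = rng.poisson(lam_bg * T)
    spikes += [(t, n) for t in rng.uniform(0.0, T, size=n_bg)]

spikes.sort()
print(f"{n_events} latent events -> {len(spikes)} total spikes")
```

The unsupervised-learning problem discussed in the talk is the reverse direction: given only the observed spikes, infer the number, times, and marks of the latent events, which is where the collapsed Gibbs sampler adapted from MFM inference comes in.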

Speaker Bio

Dr. Linderman is an Assistant Professor of Statistics and, by courtesy, Electrical Engineering and Computer Science at Stanford University. He is also an Institute Scholar in the Wu Tsai Neurosciences Institute and a member of Stanford Bio-X and the Stanford AI Lab. Previously, he was a postdoctoral fellow with Liam Paninski and David Blei at Columbia University, and he completed his PhD in Computer Science at Harvard University with Ryan Adams and Leslie Valiant. Dr. Linderman obtained his undergraduate degree in Electrical and Computer Engineering from Cornell University and spent three great years as a software engineer at Microsoft before graduate school.

Dr. Linderman’s research is focused on machine learning, computational neuroscience, and the general question of how computational and statistical methods can help decipher neural computation. His work focuses on developing rich statistical models for analyzing neural data.