Scott W. Linderman completed his postdoctoral work in the Department of Statistics at Columbia University. His PhD in Computer Science was awarded by Harvard University, where his thesis, "Bayesian methods for discovering structure in neural spike trains," on networks, point processes, and state space models for neural data analysis, received the 2016 Leonard J. Savage Award for Outstanding Dissertation in Applied Bayesian Methodology from the International Society for Bayesian Analysis. In 2017 Scott coauthored "Reparameterization gradients through acceptance-rejection sampling algorithms," which received the Best Paper Award at the 20th International Conference on Artificial Intelligence and Statistics (AISTATS). His work has been supported by a National Defense Science and Engineering Graduate Fellowship, the Siebel Scholarship, and a Simons Collaboration on the Global Brain Postdoctoral Fellowship.
Scott received his Bachelor of Science degree in Electrical and Computer Engineering from Cornell University. After completing his undergraduate work, he spent three years as a software development engineer at Microsoft. As a graduate student and postdoctoral fellow, he assisted in teaching courses in computational learning theory and advanced machine learning, as well as co-organizing multiple workshops at the intersection of machine learning and neuroscience.
Professor Linderman's research focuses on machine learning, computational neuroscience, and the general question of how computational and statistical methods can help decipher neural computation. His work aims to develop rich statistical models for analyzing neural data. This endeavor comprises two intimately related challenges: developing new statistical methodologies well suited to extracting information from such data, and applying those methods to experimental data. His work has helped reveal latent structure underlying neural activity and its relation to sensory inputs and behavioral outputs.