A unified, scalable framework for neural population decoding

M Azabou, V Arora, V Ganesh, X Mao, S Nachimuthu, M Mendelson, B Richards, M Perich
Advances in Neural Information Processing Systems, 2024. proceedings.neurips.cc
Abstract
Our ability to use deep learning approaches to decipher neural activity would likely benefit from greater scale, in terms of both the model size and the datasets. However, the integration of many neural recordings into one unified model is challenging, as each recording contains the activity of different neurons from different individual animals. In this paper, we introduce a training framework and architecture designed to model the population dynamics of neural activity across diverse, large-scale neural recordings. Our method first tokenizes individual spikes within the dataset to build an efficient representation of neural events that captures the fine temporal structure of neural activity. We then employ cross-attention and a PerceiverIO backbone to further construct a latent tokenization of neural population activities. Utilizing this architecture and training framework, we construct a large-scale multi-session model trained on large datasets from seven nonhuman primates, spanning over 158 different recording sessions, over 27,373 neural units, and over 100 hours of recordings. In a number of different tasks, we demonstrate that our pretrained model can be rapidly adapted to new, unseen sessions with unspecified neuron correspondence, enabling few-shot performance with minimal labels. This work presents a powerful new approach for building deep learning tools to analyze neural data and stakes out a clear path to training at scale for neural decoding models.
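The pipeline the abstract describes, tokenizing each spike as a (unit id, spike time) event and then letting a small, fixed set of learned latent queries cross-attend to the variable-length token sequence, can be sketched as follows. This is a minimal illustration, not the paper's implementation: all sizes, the sinusoidal time feature, and the single-head attention are assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the paper): 10 units, 16-dim embeddings, 4 latents.
n_units, d, n_latents = 10, 16, 4

# Each spike is tokenized as (unit id, spike time): the unit id indexes a
# learned embedding table, and the time enters via a positional feature.
unit_ids = np.array([3, 7, 3, 1, 9])
times = np.array([0.01, 0.02, 0.05, 0.07, 0.09])

unit_emb = rng.normal(size=(n_units, d))
# Sinusoidal time feature (a stand-in for the paper's temporal encoding).
freqs = 2.0 ** np.arange(d // 2)
time_feat = np.concatenate(
    [np.sin(np.outer(times, freqs)), np.cos(np.outer(times, freqs))], axis=1
)
tokens = unit_emb[unit_ids] + time_feat          # (5, d) spike tokens

# PerceiverIO-style encoder step: learned latent queries cross-attend to the
# spike tokens, compressing a variable-length sequence to a fixed-size state.
latents = rng.normal(size=(n_latents, d))
Wq = rng.normal(size=(d, d))
Wk = rng.normal(size=(d, d))
Wv = rng.normal(size=(d, d))

q, k, v = latents @ Wq, tokens @ Wk, tokens @ Wv
scores = q @ k.T / np.sqrt(d)
attn = np.exp(scores - scores.max(axis=1, keepdims=True))
attn /= attn.sum(axis=1, keepdims=True)          # softmax over spike tokens
pooled = attn @ v                                # (n_latents, d) latent summary

print(pooled.shape)
```

Because the latent array has a fixed size regardless of how many spikes or which neurons appear in a session, this design is what lets one model ingest recordings with different, unmatched neuron populations: only the unit embedding table is session-specific.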