Wasserstein dictionary learning is an unsupervised approach to learning a collection of probability distributions that generate observed distributions as Wasserstein barycentric …
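The "Wasserstein barycentric combination" the snippet refers to has a simple closed form in one dimension: the barycenter's quantile function is the coefficient-weighted average of the input quantile functions. A minimal numpy sketch under that 1D assumption (function names are ours, not the paper's):

```python
import numpy as np

def quantile_function(weights, grid, qs):
    # Inverse CDF of a discrete distribution with masses `weights`
    # located at the points in `grid`, evaluated at quantile levels `qs`.
    cdf = np.cumsum(weights)
    idx = np.searchsorted(cdf, qs, side="left")
    return grid[np.minimum(idx, len(grid) - 1)]

def wasserstein_barycenter_1d(dists, grid, coeffs, qs=None):
    # In 1D, the W2 barycenter with barycentric coefficients `coeffs`
    # is obtained by averaging the quantile functions of the inputs.
    if qs is None:
        qs = (np.arange(100) + 0.5) / 100  # quantile levels to sample
    quantiles = np.stack([quantile_function(w, grid, qs) for w in dists])
    return coeffs @ quantiles  # samples of the barycenter's quantile function
```

For example, an equal-weight barycenter of point masses at 0 and 1 is a point mass at 0.5, so the returned quantile samples are all 0.5. General (multi-dimensional) Wasserstein barycenters require an optimal-transport solver; this quantile-averaging shortcut is special to 1D.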
We propose K-Deep Simplex (KDS) which, given a set of data points, learns a dictionary comprising synthetic landmarks, along with representation coefficients supported on a …
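Coefficients "supported on a simplex", as in KDS, are nonnegative and sum to one. A standard way to enforce that constraint during optimization is Euclidean projection onto the probability simplex; the sort-based routine below is a common textbook algorithm, not necessarily the one used in the KDS paper:

```python
import numpy as np

def project_simplex(v):
    # Euclidean projection of v onto the probability simplex
    # {x : x >= 0, sum(x) = 1}, via the sort-and-threshold algorithm.
    u = np.sort(v)[::-1]                      # sort descending
    css = np.cumsum(u)
    ks = np.arange(1, len(v) + 1)
    rho = np.nonzero(u + (1.0 - css) / ks > 0)[0][-1]
    theta = (1.0 - css[rho]) / (rho + 1)      # shift that lands on the simplex
    return np.maximum(v + theta, 0.0)
```

For instance, `project_simplex(np.array([2.0, 0.0, -1.0]))` returns `[1.0, 0.0, 0.0]`: the excess mass is clipped and the result is a valid set of barycentric coefficients.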
The classical sparse coding model represents visual stimuli as a linear combination of a handful of learned basis functions that are Gabor-like when trained on natural image data …
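The classical sparse coding inference step described here, representing a stimulus as a sparse linear combination of dictionary atoms, is typically posed as an l1-regularized least-squares problem. A minimal sketch using ISTA (proximal gradient descent with soft-thresholding), one standard solver for this objective:

```python
import numpy as np

def ista_sparse_code(D, x, lam=0.1, n_iter=200):
    # Solve min_a 0.5 * ||x - D @ a||^2 + lam * ||a||_1 with ISTA.
    L = np.linalg.norm(D, 2) ** 2         # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - x)          # gradient of the quadratic term
        a = a - grad / L                  # gradient step
        a = np.sign(a) * np.maximum(np.abs(a) - lam / L, 0.0)  # soft-threshold
    return a
```

With an orthonormal dictionary (e.g. `D = np.eye(2)`), the solution reduces to soft-thresholding the input, so `x = [1, 0]` with `lam = 0.1` yields the code `[0.9, 0]`. Dictionary learning alternates this coding step with updates to `D` itself.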
We pursue local sparse representations of data by considering a common data model where representations are formed as a combination of atoms that we call a dictionary. Our focus is …
JR Huml, DE Ba - SVRHM 2022 Workshop @ NeurIPS - openreview.net
Sparse coding is a pillar of computational neuroscience, learning filters that describe well, in a least-squares sense, the sensitivities of mammalian simple cell receptive fields (SCRFs) …