Uses and abuses of the cross-entropy loss: case studies in modern deep learning. E Gordon-Rodriguez, G Loaiza-Ganem, G Pleiss, JP Cunningham. NeurIPS ICBINB, PMLR 137, 2020. Cited by 86.
Learning sparse log-ratios for high-throughput sequencing data. E Gordon-Rodriguez, TP Quinn, JP Cunningham. Bioinformatics 38 (1), 157-163, 2022. Cited by 34.
The continuous categorical: a novel simplex-valued exponential family. E Gordon-Rodriguez, G Loaiza-Ganem, J Cunningham. International Conference on Machine Learning, 3637-3647, 2020. Cited by 27.
Data augmentation for compositional data: advancing predictive models of the microbiome. E Gordon-Rodriguez, T Quinn, JP Cunningham. Advances in Neural Information Processing Systems 35, 20551-20565, 2022. Cited by 10.
A critique of differential abundance analysis, and advocacy for an alternative. TP Quinn, E Gordon-Rodriguez, I Erb. arXiv preprint arXiv:2104.07266, 2021. Cited by 9.
Uses and abuses of the cross-entropy loss: case studies in modern deep learning. E Gordon-Rodriguez, G Loaiza-Ganem, G Pleiss, JP Cunningham. arXiv preprint arXiv:2011.05231, 2020. Cited by 7.
On disentanglement and mutual information in semi-supervised variational auto-encoders. EG Rodriguez. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2021. Cited by 5.
Advances in Machine Learning for Compositional Data. EG Rodriguez. Columbia University, 2022. Cited by 2.
On the Normalizing Constant of the Continuous Categorical Distribution. E Gordon-Rodriguez, G Loaiza-Ganem, A Potapczynski, JP Cunningham. arXiv preprint arXiv:2204.13290, 2022. Cited by 1.