Program synthesis with large language models. J Austin, A Odena, M Nye, M Bosma, H Michalewski, D Dohan, E Jiang, ... arXiv preprint arXiv:2108.07732, 2021. Cited by 890.
Show your work: Scratchpads for intermediate computation with language models. M Nye, AJ Andreassen, G Gur-Ari, H Michalewski, J Austin, D Bieber, ... arXiv preprint arXiv:2112.00114, 2021. Cited by 456.
DreamCoder: Growing generalizable, interpretable knowledge with wake-sleep Bayesian program learning. K Ellis, L Wong, M Nye, M Sablé-Meyer, L Cary, L Anaya Pozo, L Hewitt, ... Philosophical Transactions of the Royal Society A 381 (2251), 20220050, 2023. Cited by 206.
DreamCoder: Bootstrapping inductive program synthesis with wake-sleep library learning. K Ellis, C Wong, M Nye, M Sablé-Meyer, L Morales, L Hewitt, L Cary, ... Proceedings of the 42nd ACM SIGPLAN International Conference on Programming ..., 2021. Cited by 175.
Write, execute, assess: Program synthesis with a REPL. K Ellis, M Nye, Y Pu, F Sosa, J Tenenbaum, A Solar-Lezama. Advances in Neural Information Processing Systems 32, 2019. Cited by 156.
Implicit representations of meaning in neural language models. BZ Li, M Nye, J Andreas. arXiv preprint arXiv:2106.00737, 2021. Cited by 128.
Learning to infer program sketches. M Nye, L Hewitt, J Tenenbaum, A Solar-Lezama. International Conference on Machine Learning, 4861-4870, 2019. Cited by 120.
Learning compositional rules via neural program synthesis. M Nye, A Solar-Lezama, J Tenenbaum, BM Lake. Advances in Neural Information Processing Systems 33, 10832-10842, 2020. Cited by 112.
Improving coherence and consistency in neural sequence models with dual-system, neuro-symbolic reasoning. M Nye, M Tessler, J Tenenbaum, BM Lake. Advances in Neural Information Processing Systems 34, 25192-25204, 2021. Cited by 96.
The variational homoencoder: Learning to learn high capacity generative models from few examples. LB Hewitt, MI Nye, A Gane, T Jaakkola, JB Tenenbaum. arXiv preprint arXiv:1807.08919, 2018. Cited by 74.
Communicating natural programs to humans and machines. S Acquaviva, Y Pu, M Kryven, T Sechopoulos, C Wong, G Ecanow, M Nye, ... Advances in Neural Information Processing Systems 35, 3731-3743, 2022. Cited by 49.
Introducing our multimodal models. R Bavishi, E Elsen, C Hawthorne, M Nye, A Odena, A Somani, S Tasırlar. Adept AI blog, https://www.adept.ai/blog/fuyu-8b, 2023. Cited by 33.
Representing partial programs with blended abstract semantics. M Nye, Y Pu, M Bowers, J Andreas, JB Tenenbaum, A Solar-Lezama. arXiv preprint arXiv:2012.12964, 2020. Cited by 25.
A large-scale benchmark for few-shot program induction and synthesis. F Alet, J Lopez-Contreras, J Koppel, M Nye, A Solar-Lezama, ... International Conference on Machine Learning, 175-186, 2021. Cited by 21.
Are efficient deep representations learnable? M Nye, A Saxe. arXiv preprint arXiv:1807.06399, 2018. Cited by 21.
Language modeling with latent situations. BZ Li, M Nye, J Andreas. arXiv preprint arXiv:2212.10012, 2022. Cited by 8.
LARC: Language annotated abstraction and reasoning corpus. S Acquaviva, Y Pu, M Nye, C Wong, MH Tessler, J Tenenbaum. Proceedings of the Annual Meeting of the Cognitive Science Society 43 (43), 2021. Cited by 4.