We survey recent work on neurosymbolic programming, an emerging area that bridges deep learning and program synthesis. As in classic machine learning, the goal …
PW Koh, S Sagawa, H Marklund… - International …, 2021 - proceedings.mlr.press
Distribution shifts—where the training distribution differs from the test distribution—can substantially degrade the accuracy of machine learning (ML) systems deployed in the wild …
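As a concrete illustration of the claim in this snippet (not taken from the cited paper), the toy sketch below fits a classifier on one synthetic distribution and evaluates it on a mean-shifted copy; the data generator, the logistic-regression model, and the shift size are all illustrative assumptions.

```python
# Hypothetical sketch: a classifier fit on one distribution loses accuracy
# when the test-time distribution is shifted. Not from the cited benchmark.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def sample(n, shift=0.0):
    """Two Gaussian classes; `shift` moves both class means at test time."""
    y = rng.integers(0, 2, size=n)
    x = rng.normal(loc=y[:, None] * 2.0 + shift, scale=1.0, size=(n, 2))
    return x, y

x_train, y_train = sample(2000)               # training distribution
x_iid, y_iid = sample(2000)                   # test set from the same distribution
x_shift, y_shift = sample(2000, shift=1.5)    # shifted test distribution

clf = LogisticRegression().fit(x_train, y_train)
print("in-distribution accuracy:", clf.score(x_iid, y_iid))
print("shifted-test accuracy   :", clf.score(x_shift, y_shift))
```

With these settings the shifted-test accuracy comes out noticeably lower than the in-distribution accuracy, which is the degradation the abstract describes.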
You are holding in your hands… oh, come on, who holds books like this in their hands anymore? Anyway, you are reading this, and it means that I have managed to release one of …
Recent work learns contextual representations of source code by reconstructing tokens from their context. For downstream semantic understanding tasks like summarizing code in …
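To make "reconstructing tokens from their context" concrete, here is a minimal, generic masked-token sketch in PyTorch. It is not the cited paper's model; the toy code snippet, the vocabulary handling, and the mean-pooled context encoder are all simplifying assumptions.

```python
# Hypothetical sketch of learning code representations by reconstructing
# masked tokens from their context, with a toy bag-of-context predictor.
import torch
import torch.nn as nn

code = "def add ( a , b ) : return a + b".split()
vocab = {tok: i for i, tok in enumerate(sorted(set(code)), start=1)}
vocab["<mask>"] = 0
ids = torch.tensor([vocab[t] for t in code])

class MaskedTokenModel(nn.Module):
    def __init__(self, vocab_size, dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.out = nn.Linear(dim, vocab_size)

    def forward(self, token_ids):
        # Predict every position from the mean of the (partially masked)
        # context embeddings -- a crude stand-in for a contextual encoder.
        ctx = self.embed(token_ids).mean(dim=0, keepdim=True)
        return self.out(ctx.expand(len(token_ids), -1))

model = MaskedTokenModel(len(vocab))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for step in range(200):
    masked = ids.clone()
    pos = torch.randint(len(ids), (1,))    # mask one random position
    masked[pos] = vocab["<mask>"]
    logits = model(masked)
    loss = loss_fn(logits[pos], ids[pos])  # reconstruct the original token
    opt.zero_grad(); loss.backward(); opt.step()
```

A real encoder would use a Transformer over the full token sequence; the mean-pooled embedding here only keeps the example self-contained.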
Recent Language Models (LMs) achieve breakthrough performance in code generation when trained on human-authored problems, even solving some competitive-programming …
X Chen, D Song, Y Tian - Advances in Neural Information …, 2021 - proceedings.neurips.cc
Program synthesis from input-output (IO) examples has been a long-standing challenge. While recent work has demonstrated limited success on domain-specific languages (DSLs), it …
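For readers unfamiliar with the setup this snippet refers to, the sketch below shows the classic programming-by-example formulation: enumerate candidate programs over a tiny DSL and return the first one consistent with every input-output pair. The DSL, its primitives, and the examples are illustrative assumptions, not taken from the cited work.

```python
# Hypothetical sketch: brute-force enumerative synthesis over a toy DSL.
from itertools import product

# A toy DSL: each primitive is a name plus an int -> int function.
PRIMITIVES = {
    "inc": lambda x: x + 1,
    "dec": lambda x: x - 1,
    "double": lambda x: x * 2,
}

def run(program, x):
    """A program is a sequence of primitive names applied left to right."""
    for op in program:
        x = PRIMITIVES[op](x)
    return x

def synthesize(examples, max_len=4):
    """Return the shortest primitive sequence consistent with all examples."""
    for length in range(1, max_len + 1):
        for program in product(PRIMITIVES, repeat=length):
            if all(run(program, i) == o for i, o in examples):
                return program
    return None

# f(x) = 2x + 1 recovered from two I/O examples.
print(synthesize([(1, 3), (4, 9)]))   # -> ('double', 'inc')
```

Neural approaches to this problem typically aim to guide or replace such brute-force search with learned models.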
Generative neural models hold great promise in enhancing programming education by synthesizing new content. We seek to design neural models that can automatically generate …
Advances in machine learning have encouraged researchers to apply these techniques to a myriad of software engineering tasks that rely on source code analysis …
Deep learning techniques have driven significant progress in program synthesis from input-output examples. However, when the program semantics become more …