| Title | Authors | Venue | Cited by | Year |
|---|---|---|---|---|
| Transition-based dependency parsing with stack long short-term memory | C Dyer, M Ballesteros, W Ling, A Matthews, NA Smith | arXiv preprint arXiv:1505.08075 | 1025 | 2015 |
| Finding function in form: Compositional character models for open vocabulary word representation | W Ling, T Luís, L Marujo, RF Astudillo, S Amir, C Dyer, AW Black, ... | arXiv preprint arXiv:1508.02096 | 774 | 2015 |
| Two/too simple adaptations of word2vec for syntax problems | W Ling, C Dyer, AW Black, I Trancoso | Proceedings of the 2015 Conference of the North American Chapter of the … | 528 | 2015 |
| Program induction by rationale generation: Learning to solve and explain algebraic word problems | W Ling, D Yogatama, C Dyer, P Blunsom | arXiv preprint arXiv:1705.04146 | 473 | 2017 |
| Latent predictor networks for code generation | W Ling, E Grefenstette, KM Hermann, T Kočiský, A Senior, F Wang, ... | arXiv preprint arXiv:1603.06744 | 441 | 2016 |
| Character-based neural machine translation | W Ling, I Trancoso, C Dyer, AW Black | arXiv preprint arXiv:1511.04586 | 284 | 2015 |
| Generative and discriminative text classification with recurrent neural networks | D Yogatama, C Dyer, W Ling, P Blunsom | arXiv preprint arXiv:1703.01898 | 249 | 2017 |
| Learning to compose words into sentences with reinforcement learning | D Yogatama, P Blunsom, C Dyer, E Grefenstette, W Ling | arXiv preprint arXiv:1611.09100 | 202 | 2016 |
| Evaluation of word vector representations by subspace alignment | Y Tsvetkov, M Faruqui, W Ling, G Lample, C Dyer | Proceedings of the 2015 Conference on Empirical Methods in Natural Language … | 193 | 2015 |
| Not all contexts are created equal: Better word representations with variable attention | W Ling, Y Tsvetkov, S Amir, R Fermandez, C Dyer, AW Black, I Trancoso, ... | Proceedings of the 2015 Conference on Empirical Methods in Natural Language … | 167 | 2015 |
| Neural network-based abstract generation for opinions and arguments | L Wang, W Ling | arXiv preprint arXiv:1606.02785 | 163 | 2016 |
| Reference-aware language models | Z Yang, P Blunsom, C Dyer, W Ling | arXiv preprint arXiv:1611.01628 | 106 | 2016 |
| Learning the curriculum with Bayesian optimization for task-specific word representation learning | Y Tsvetkov, M Faruqui, W Ling, B MacWhinney, C Dyer | arXiv preprint arXiv:1605.03852 | 100 | 2016 |
| Semantic parsing with semi-supervised sequential autoencoders | T Kočiský, G Melis, E Grefenstette, C Dyer, W Ling, P Blunsom, ... | arXiv preprint arXiv:1609.09315 | 98 | 2016 |
| A linguistically motivated taxonomy for Machine Translation error analysis | Â Costa, W Ling, T Luís, R Correia, L Coheur | Machine Translation 29, 127-161 | 97 | 2015 |
| Microblogs as Parallel Corpora | W Ling, G Xiang, C Dyer, A Black, I Trancoso | Association for Computational Linguistics | 96 | 2013 |
| Automatic keyword extraction on Twitter | L Marujo, W Ling, I Trancoso, C Dyer, AW Black, A Gershman, ... | Proceedings of the 53rd Annual Meeting of the Association for Computational … | 77 | 2015 |
| Learning and evaluating general linguistic intelligence | D Yogatama, CM d'Autume, J Connor, T Kočiský, M Chrzanowski, L Kong, ... | arXiv preprint arXiv:1901.11373 | 72 | 2019 |
| A mutual information maximization perspective of language representation learning | L Kong, CM d'Autume, W Ling, L Yu, Z Dai, D Yogatama | arXiv preprint arXiv:1910.08350 | 70 | 2019 |
| Memory architectures in recurrent neural network language models | D Yogatama, Y Miao, G Melis, W Ling, A Kuncoro, C Dyer, P Blunsom | International Conference on Learning Representations | 61 | 2018 |