BLOOM: A 176B-parameter open-access multilingual language model. T Le Scao, A Fan, C Akiki, E Pavlick, S Ilić, D Hesslow, R Castagné, et al. 2023. Cited by 1306.
SciFive: a text-to-text transformer model for biomedical literature. LN Phan, JT Anibal, H Tran, S Chanana, E Bahadroglu, A Peltekian, et al. arXiv preprint arXiv:2106.03598, 2021. Cited by 126.
CoTexT: Multi-task Learning with Code-Text Transformer. L Phan, H Tran, D Le, H Nguyen, J Anibal, A Peltekian, Y Ye. ACL NLP4Prog, 2021. Cited by 115.
Representation Engineering: A Top-Down Approach to AI Transparency. A Zou, L Phan, S Chen, J Campbell, P Guo, R Ren, A Pan, X Yin, et al. arXiv preprint arXiv:2310.01405, 2023. Cited by 114.
ViT5: Pretrained Text-to-Text Transformer for Vietnamese Language Generation. L Phan, H Tran, H Nguyen, TH Trinh. NAACL SRW, 2022. Cited by 48.
HarmBench: A Standardized Evaluation Framework for Automated Red Teaming and Robust Refusal. M Mazeika, L Phan, X Yin, A Zou, Z Wang, N Mu, E Sakhaee, N Li, et al. arXiv preprint arXiv:2402.04249, 2024. Cited by 36.
The WMDP Benchmark: Measuring and Reducing Malicious Use with Unlearning. N Li, A Pan, A Gopal, S Yue, D Berrios, A Gatti, JD Li, AK Dombrowski, et al. arXiv preprint arXiv:2403.03218, 2024. Cited by 19.
Hierarchical Transformer Encoders for Vietnamese Spelling Correction. H Tran, CV Dinh, L Phan, ST Nguyen. Advances and Trends in Artificial Intelligence …, 2021. Cited by 9.
SPBERT: An Efficient Pre-training BERT on SPARQL Queries for Question Answering over Knowledge Graphs. H Tran, L Phan, J Anibal, BT Nguyen, TS Nguyen. Neural Information Processing: 28th International Conference, ICONIP 2021 …, 2021. Cited by 7.
MTet: Multi-domain Translation for English and Vietnamese. C Ngo, TH Trinh, L Phan, H Tran, T Dang, H Nguyen, M Nguyen, et al. arXiv preprint arXiv:2210.05610, 2022. Cited by 6.
Enriching Biomedical Knowledge for Low-Resource Language Through Large-Scale Translation. L Phan, T Dang, H Tran, TH Trinh, V Phan, LD Chau, MT Luong. arXiv preprint arXiv:2210.05598, 2022. Cited by 5.
VieSum: How Robust Are Transformer-Based Models on Vietnamese Summarization? H Nguyen, L Phan, J Anibal, A Peltekian, H Tran. arXiv preprint arXiv:2110.04257, 2021. Cited by 5.
HAL-X: Scalable Hierarchical Clustering for Rapid and Tunable Single-Cell Analysis. J Anibal, AG Day, E Bahadiroglu, L O'Neil, L Phan, A Peltekian, A Erez, et al. PLoS Computational Biology 18 (10), e1010349, 2022. Cited by 4*.
Improving Alignment and Robustness with Short Circuiting. A Zou, L Phan, J Wang, D Duenas, M Lin, M Andriushchenko, R Wang, et al. arXiv preprint arXiv:2406.04313, 2024. Cited by 2.