S Caton, C Haas - ACM Computing Surveys, 2024 - dl.acm.org
When Machine Learning technologies are used in contexts that affect citizens, companies as well as researchers need to be confident that there will not be any unexpected social …
Y Huang, L Sun, H Wang, S Wu… - International …, 2024 - proceedings.mlr.press
Large language models (LLMs) have gained considerable attention for their excellent natural language processing capabilities. Nonetheless, these LLMs present many …
Pretrained language models (PLMs) have demonstrated remarkable performance in various natural language processing tasks: Unidirectional PLMs (e.g., GPT) are well known for their …
A Parnami, M Lee - arXiv preprint arXiv:2203.04291, 2022 - arxiv.org
Few-Shot Learning refers to the problem of learning the underlying pattern in the data just from a few training samples. Requiring a large number of data samples, many deep learning …
With the emergence of machine learning methods, data-driven fault diagnosis has gained significant attention in recent years. However, traditional data-driven diagnosis approaches …
Decades of research have shown the superiority of machine learning in discovering highly nonlinear patterns embedded in electroencephalography (EEG) records compared with …
H Tang, J Liu, M Zhao, X Gong - … of the 14th ACM Conference on …, 2020 - dl.acm.org
Multi-task learning (MTL) has been successfully applied to many recommendation applications. However, MTL models often suffer from performance degeneration with …
Sequential fine-tuning and multi-task learning are methods aiming to incorporate knowledge from multiple tasks; however, they suffer from catastrophic forgetting and difficulties in …