Bridging pre-trained models and downstream tasks for source code understanding

D Wang, Z Jia, S Li, Y Yu, Y Xiong, W Dong, X Liao - Proceedings of the 44th …, 2022 - dl.acm.org
With the great success of pre-trained models, the pretrain-then-finetune paradigm has been
widely adopted on downstream tasks for source code understanding. However, compared to …
