SHIH-HENG WANG
Verified email at ntu.edu.tw
Title · Cited by · Year
Dynamic-superb: Towards a dynamic, collaborative, and comprehensive instruction-tuning benchmark for speech
C Huang, KH Lu, SH Wang, CY Hsiao, CY Kuan, H Wu, S Arora, ...
ICASSP 2024-2024 IEEE International Conference on Acoustics, Speech and …, 2024
Cited by 32 · 2024
Dynamic-superb phase-2: A collaboratively expanding benchmark for measuring the capabilities of spoken language models with 180 tasks
C Huang, WC Chen, S Yang, AT Liu, CA Li, YX Lin, WC Tseng, A Diwan, ...
arXiv preprint arXiv:2411.05361, 2024
Cited by 3 · 2024
ML-SUPERB 2.0: Benchmarking Multilingual Speech Models Across Modeling Constraints, Languages, and Datasets
J Shi, SH Wang, W Chen, M Bartelds, VB Kumar, J Tian, X Chang, ...
arXiv preprint arXiv:2406.08641, 2024
Cited by 3 · 2024
How to Learn a New Language? An Efficient Solution for Self-Supervised Learning Models Unseen Languages Adaption in Low-Resource Scenario
SH Wang, ZC Chen, J Shi, MT Chuang, GT Lin, KP Huang, D Harwath, ...
arXiv preprint arXiv:2411.18217, 2024
Cited by 1 · 2024
Fusion of Discrete Representations and Self-Augmented Representations for Multilingual Automatic Speech Recognition
S Wang, J Shi, C Huang, S Watanabe, H Lee
SLT 2024, 2024
2024
Systematic Analysis for Pretrained Language Model Priming for Parameter-Efficient Fine-tuning
SC Huang, SH Wang, MH Shih, S Sahay, HY Lee
Proceedings of the 2024 Conference of the North American Chapter of the …, 2024
2024
General Framework for Self-Supervised Model Priming for Parameter-Efficient Fine-tuning
SC Huang, SH Wang, MH Shih, S Sahay, H Lee
arXiv preprint arXiv:2212.01032, 2022
2022
Articles 1–7