Sungjae Lee
NAVER Clova
Verified email at yonsei.ac.kr
Title
Cited by
Year
What changes can large-scale language models bring? Intensive study on HyperCLOVA: Billions-scale Korean generative pretrained transformers
B Kim, HS Kim, SW Lee, G Lee, D Kwak, DH Jeon, S Park, S Kim, S Kim, ...
Proceedings of the 2021 Conference on Empirical Methods in Natural Language …, 2021
Cited by 101 · 2021
LUT-GEMM: Quantized matrix multiplication based on LUTs for efficient inference in large-scale generative language models
G Park, B Park, M Kim, S Lee, J Kim, B Kwon, SJ Kwon, B Kim, Y Lee, ...
arXiv preprint arXiv:2206.09557, 2022
Cited by 79 · 2022
DFX: A low-latency multi-FPGA appliance for accelerating transformer-based text generation
S Hong, S Moon, J Kim, S Lee, M Kim, D Lee, JY Kim
2022 55th IEEE/ACM International Symposium on Microarchitecture (MICRO), 616-630, 2022
Cited by 35 · 2022
Energy-efficient acceleration of deep neural networks on realtime-constrained embedded edge devices
B Kim, S Lee, AR Trivedi, WJ Song
IEEE Access 8, 216259-216270, 2020
Cited by 26 · 2020
The Nebula benchmark suite: Implications of lightweight neural networks
B Kim, S Lee, C Park, H Kim, WJ Song
IEEE Transactions on Computers 70 (11), 1887-1900, 2020
Cited by 5 · 2020
HPCClusterScape: Increasing Transparency and Efficiency of Shared High-Performance Computing Clusters for Large-scale AI Models
H Park, A Cho, H Jeon, H Lee, Y Yang, S Lee, H Lee, J Choo
2023 IEEE Visualization in Data Science (VDS), 21-29, 2023
Cited by 2 · 2023
HyperCLOVA X Technical Report
KM Yoo, J Han, S In, H Jeon, J Jeong, J Kang, H Kim, KM Kim, M Kim, ...
arXiv preprint arXiv:2404.01954, 2024
2024