What changes can large-scale language models bring? Intensive study on HyperCLOVA: Billions-scale Korean generative pretrained transformers B Kim, HS Kim, SW Lee, G Lee, D Kwak, DH Jeon, S Park, S Kim, S Kim, ... Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, 2021 | 101 | 2021 |
LUT-GEMM: Quantized matrix multiplication based on LUTs for efficient inference in large-scale generative language models G Park, B Park, M Kim, S Lee, J Kim, B Kwon, SJ Kwon, B Kim, Y Lee, ... arXiv preprint arXiv:2206.09557, 2022 | 79 | 2022 |
DFX: A low-latency multi-FPGA appliance for accelerating transformer-based text generation S Hong, S Moon, J Kim, S Lee, M Kim, D Lee, JY Kim 2022 55th IEEE/ACM International Symposium on Microarchitecture (MICRO), 616-630, 2022 | 35 | 2022 |
Energy-efficient acceleration of deep neural networks on realtime-constrained embedded edge devices B Kim, S Lee, AR Trivedi, WJ Song IEEE Access 8, 216259-216270, 2020 | 26 | 2020 |
The Nebula benchmark suite: Implications of lightweight neural networks B Kim, S Lee, C Park, H Kim, WJ Song IEEE Transactions on Computers 70 (11), 1887-1900, 2020 | 5 | 2020 |
HPCClusterScape: Increasing Transparency and Efficiency of Shared High-Performance Computing Clusters for Large-scale AI Models H Park, A Cho, H Jeon, H Lee, Y Yang, S Lee, H Lee, J Choo 2023 IEEE Visualization in Data Science (VDS), 21-29, 2023 | 2 | 2023 |
HyperCLOVA X Technical Report KM Yoo, J Han, S In, H Jeon, J Jeong, J Kang, H Kim, KM Kim, M Kim, ... arXiv preprint arXiv:2404.01954, 2024 | | 2024 |