XAI-driven knowledge distillation of large language models for efficient deployment on low-resource devices

R Cantini, A Orsino, D Talia - Journal of Big Data, 2024 - Springer
Abstract: Large Language Models (LLMs) are characterized by their inherent memory inefficiency and compute-intensive nature, making them impractical to run on low-resource devices and …
