A systematic review of Green AI

R Verdecchia, J Sallou, L Cruz - Wiley Interdisciplinary Reviews …, 2023 - Wiley Online Library
With the ever‐growing adoption of artificial intelligence (AI)‐based systems, the carbon
footprint of AI is no longer negligible. AI researchers and practitioners are therefore urged to …

The importance of resource awareness in artificial intelligence for healthcare

Z Jia, J Chen, X Xu, J Kheir, J Hu, H Xiao… - Nature Machine …, 2023 - nature.com
Artificial intelligence and machine learning (AI/ML) models have been adopted in a wide
range of healthcare applications, from medical image computing and analysis to continuous …

Llama 2: Open foundation and fine-tuned chat models

H Touvron, L Martin, K Stone, P Albert… - arXiv preprint arXiv …, 2023 - arxiv.org
In this work, we develop and release Llama 2, a collection of pretrained and fine-tuned large
language models (LLMs) ranging in scale from 7 billion to 70 billion parameters. Our fine …

Harnessing the power of LLMs in practice: A survey on ChatGPT and beyond

J Yang, H Jin, R Tang, X Han, Q Feng, H Jiang… - ACM Transactions on …, 2024 - dl.acm.org
This article presents a comprehensive and practical guide for practitioners and end-users
working with Large Language Models (LLMs) in their downstream Natural Language …

Power hungry processing: Watts driving the cost of AI deployment?

S Luccioni, Y Jernite, E Strubell - The 2024 ACM Conference on …, 2024 - dl.acm.org
Recent years have seen a surge in the popularity of commercial AI products based on
generative, multi-purpose AI systems promising a unified approach to building machine …

Large language models can accurately predict searcher preferences

P Thomas, S Spielman, N Craswell… - Proceedings of the 47th …, 2024 - dl.acm.org
Much of the evaluation and tuning of a search system relies on relevance labels---
annotations that say whether a document is useful for a given search and searcher. Ideally …

Efficient methods for natural language processing: A survey

M Treviso, JU Lee, T Ji, B Aken, Q Cao… - Transactions of the …, 2023 - direct.mit.edu
Recent work in natural language processing (NLP) has yielded appealing results from
scaling model parameters and training data; however, using only scale to improve …

Scalable, Distributed AI Frameworks: Leveraging Cloud Computing for Enhanced Deep Learning Performance and Efficiency

N Mungoli - arXiv preprint arXiv:2304.13738, 2023 - arxiv.org
In recent years, the integration of artificial intelligence (AI) and cloud computing has
emerged as a promising avenue for addressing the growing computational demands of AI …

Educational data augmentation in physics education research using ChatGPT

F Kieser, P Wulff, J Kuhn, S Küchemann - Physical Review Physics Education …, 2023 - APS
Generative AI technologies such as large language models show novel potential to enhance
educational research. For example, generative large language models were shown to be …

AdapterSoup: Weight averaging to improve generalization of pretrained language models

A Chronopoulou, ME Peters, A Fraser… - arXiv preprint arXiv …, 2023 - arxiv.org
Pretrained language models (PLMs) are trained on massive corpora, but often need to
specialize to specific domains. A parameter-efficient adaptation method suggests training an …