Flora: Federated fine-tuning large language models with heterogeneous low-rank adaptations

Z Wang, Z Shen, Y He, G Sun, H Wang, L Lyu… - arXiv preprint arXiv …, 2024 - arxiv.org
The rapid development of Large Language Models (LLMs) has been pivotal in advancing AI,
with pre-trained LLMs being adaptable to diverse downstream tasks through fine-tuning …

LLM4AD: A platform for algorithm design with large language model

F Liu, R Zhang, Z Xie, R Sun, K Li, X Lin… - arXiv preprint arXiv …, 2024 - arxiv.org
We introduce LLM4AD, a unified Python platform for algorithm design (AD) with large
language models (LLMs). LLM4AD is a generic framework with modularized blocks for …

Provably Transformers Harness Multi-Concept Word Semantics for Efficient In-Context Learning

D Bu, W Huang, A Han, A Nitanda, T Suzuki… - arXiv preprint arXiv …, 2024 - arxiv.org
Transformer-based large language models (LLMs) have displayed remarkable creative
prowess and emergent capabilities. Existing empirical studies have revealed a strong …

Monte Carlo Tree Search for Comprehensive Exploration in LLM-Based Automatic Heuristic Design

Z Zheng, Z Xie, Z Wang, B Hooi - arXiv preprint arXiv:2501.08603, 2025 - arxiv.org
Handcrafting heuristics for solving complex planning tasks (e.g., NP-hard combinatorial
optimization (CO) problems) is a common practice but requires extensive domain …