M Asai,
S Wissow - Proceedings of the International Symposium on …, 2024 - ojs.aaai.org
Abstract: Monte-Carlo Tree Search (MCTS) combined with Multi-Armed Bandit (MAB) has
had limited success in domain-independent classical planning until recently. Previous work …