Scalable, MDP-based planning for multiple, cooperating agents

JD Redding, NK Ure, JP How - 2012 American Control Conference (ACC), 2012 - ieeexplore.ieee.org
This paper introduces an approximation algorithm for stochastic multi-agent planning based on Markov decision processes (MDPs). Specifically, we focus on a decentralized approach for planning the actions of a team of cooperating agents with uncertainties in fuel consumption and health-related models. The core idea behind the algorithm presented in this paper is to allow each agent to approximate the representation of its teammates. Each agent therefore maintains its own planner that fully enumerates its local states and actions while approximating those of its teammates. In prior work, the authors approximated each teammate individually, which greatly reduced the planning space but left computational scalability exponential (in n - 1 rather than in n, where n is the number of agents). This paper extends that approach and presents a new approximation that aggregates all teammates into a single, abstracted entity. Under a persistent search-and-track mission scenario with 3 agents, we show that while the resulting performance decreases by nearly 20% compared with the centralized optimal solution, the problem size becomes linear in n, a very attractive feature when planning online for large multi-agent teams.
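The scaling argument in the abstract can be sketched numerically. The following is an illustrative back-of-the-envelope comparison, not code from the paper: the per-agent state counts (`local_states`, `approx_states`, `agg_states_per_agent`) are made-up placeholders, and the aggregated formula simply assumes, as the abstract states, that the abstracted-teammate representation grows linearly with team size.

```python
def centralized_size(n, local_states):
    """Full joint MDP over n agents: exponential in n."""
    return local_states ** n

def per_teammate_size(n, local_states, approx_states):
    """Prior work: each of the n - 1 teammates approximated
    individually, so still exponential in n - 1."""
    return local_states * approx_states ** (n - 1)

def aggregated_size(n, local_states, agg_states_per_agent):
    """This paper's approximation: all teammates collapsed into one
    abstracted entity whose state count is assumed linear in n."""
    return local_states * agg_states_per_agent * n

# Hypothetical sizes: 20 fully enumerated local states per agent,
# 5 abstracted states per teammate.
for n in (3, 5, 10):
    print(n,
          centralized_size(n, 20),
          per_teammate_size(n, 20, 5),
          aggregated_size(n, 20, 5))
```

Even at n = 3 the aggregated formulation is smaller than the per-teammate one, and the gap widens rapidly: the first two columns grow exponentially with n while the last grows linearly, which is the "attractive feature" for online planning the abstract highlights.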