Grounded predictions of teamwork as a one-shot game: A multiagent multi-armed bandits approach

Published: 01 Jan 2025, Last Modified: 26 Sept 2025 · Artif. Intell. 2025 · CC BY-SA 4.0
Abstract: Highlights
• We introduce a novel category of mixed games that represents how a manager can assess the team outcome and facilitates the computation of scalable Nash equilibria.
• We propose a Multiagent Multi-Armed Bandits (MA-MAB) framework to approximate the Nash equilibria of teamwork games.
• We empirically demonstrate that the strategies learned by the MA-MAB model converge to approximate Nash equilibria of the game.
• Leveraging the MA-MAB predictions, we reveal how team composition, task dynamics, and evaluation criteria affect productivity in the game.
• We use game theory to explain motivation loss in teams, modeling agents with human-like behaviors, including the Köhler effect.
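The bandit-based equilibrium approximation named in the highlights can be sketched in miniature. This is a minimal illustration under assumed details, not the paper's implementation: two agents each run an independent ε-greedy bandit over discrete effort levels in a hypothetical weakest-link teamwork game, and the final greedy joint action empirically settles on a Nash equilibrium. The payoff function, effort levels, and all function names below are assumptions made for this sketch.

```python
import random

# Hypothetical weakest-link teamwork game (illustration only, not the
# paper's model): each of two agents picks an effort level (a bandit arm).
# The shared team outcome is driven by the minimum effort; each agent
# additionally pays a private cost for its own effort.
EFFORTS = [0, 1, 2]

def reward(i, joint):
    """Agent i's payoff under the joint effort profile."""
    team_output = 3 * min(joint)  # team outcome set by the weakest member
    cost = joint[i]               # private effort cost
    return team_output - cost

def is_nash(joint):
    """True if no agent gains by unilaterally deviating from `joint`."""
    for i in range(2):
        for a in EFFORTS:
            dev = list(joint)
            dev[i] = a
            if reward(i, tuple(dev)) > reward(i, joint) + 1e-9:
                return False
    return True

def run(rounds=20000, seed=0):
    """Each agent runs an independent epsilon-greedy bandit with
    sample-average value estimates; returns the final greedy joint action."""
    rng = random.Random(seed)
    q = [[0.0] * len(EFFORTS) for _ in range(2)]  # value estimates per arm
    n = [[0] * len(EFFORTS) for _ in range(2)]    # pull counts per arm
    for t in range(rounds):
        eps = max(0.01, 1.0 / (1 + t / 100))  # decaying exploration rate
        joint = tuple(
            rng.randrange(len(EFFORTS)) if rng.random() < eps
            else max(range(len(EFFORTS)), key=lambda a: q[i][a])
            for i in range(2)
        )
        for i in range(2):  # sample-average update for the pulled arm
            a = joint[i]
            n[i][a] += 1
            q[i][a] += (reward(i, joint) - q[i][a]) / n[i][a]
    return tuple(max(range(len(EFFORTS)), key=lambda a: q[i][a]) for i in range(2))
```

In this toy game every symmetric effort profile is a pure Nash equilibrium, and which one the learners coordinate on depends on the exploration path — a simple analogue of the motivation-loss dynamics the highlights allude to.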