Keywords: Differentiable Games, Direct Sum Decomposition, Nash Equilibrium, Gradient Descent
TL;DR: We show that any Differentiable game can be decomposed as a direct sum of a Potential game, a near Solenoidal game, and a non-strategic game
Abstract: To understand the complexity of learning dynamics in Differentiable games, we decompose the game into components whose dynamics are well understood. One possible tool is Helmholtz's theorem, which decomposes a vector field into a potential component and a harmonic component. This approach has proved effective in finite and normal-form games. However, applying Helmholtz's theorem by connecting it with the Hodge theorem on $\mathbb{R}^n$ (the strategy space of a Differentiable game) is non-trivial due to the non-compactness of $\mathbb{R}^n$. Bridging the dynamic-strategic disconnect via the Hodge/Helmholtz theorem in Differentiable games was therefore left as an open problem by Letcher et al. (2019). In this work, we provide two decompositions of Differentiable games that answer this question: the first into an exact Potential part, a near Solenoidal part, and a non-strategic part; the second into a near Potential part, an exact Solenoidal part, and a non-strategic part. We show that our Potential games coincide with the potential games proposed by Monderer and Shapley (1996), for which the gradient descent dynamic successfully finds a Nash equilibrium. For Solenoidal games, we show that the individual gradient field is divergence-free, in which case the gradient descent dynamic may be either divergent or recurrent.
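Illustrative sketch (not the paper's construction): the split of a game's joint gradient field into a potential-like part and a divergence-free, solenoidal-like part can be seen numerically via the symmetric/antisymmetric decomposition of the game Jacobian, in the spirit of Letcher et al. (2019). The helper `game_jacobian` and the two toy gradient fields below are assumptions chosen for illustration only.

```python
import numpy as np

# For a two-player game with losses f1(x, y) and f2(x, y), the simultaneous
# gradient field is xi(x, y) = (d f1 / d x, d f2 / d y). The game Jacobian
# J = grad xi splits into a symmetric part (potential-like component, where
# gradient descent converges) and an antisymmetric part (solenoidal-like,
# divergence-free component, where the dynamics can cycle).

def game_jacobian(xi, z, eps=1e-5):
    """Central-difference Jacobian of the joint gradient field xi at point z."""
    n = z.size
    J = np.zeros((n, n))
    for j in range(n):
        dz = np.zeros(n)
        dz[j] = eps
        J[:, j] = (xi(z + dz) - xi(z - dz)) / (2 * eps)
    return J

# Toy potential game: f1 = f2 = x*y, so xi = (y, x) is the gradient of phi = x*y.
xi_potential = lambda z: np.array([z[1], z[0]])
# Toy zero-sum game: f1 = x*y, f2 = -x*y, so xi = (y, -x) is divergence-free.
xi_solenoidal = lambda z: np.array([z[1], -z[0]])

z0 = np.array([0.7, -0.3])
for name, xi in [("potential", xi_potential), ("solenoidal", xi_solenoidal)]:
    J = game_jacobian(xi, z0)
    S, A = (J + J.T) / 2, (J - J.T) / 2   # symmetric / antisymmetric parts
    divergence = np.trace(J)              # vanishes for the divergence-free field
    print(name, "\nsymmetric part:\n", S.round(3), "\ndivergence:", round(divergence, 3))
```

In this sketch the first field is purely symmetric (it integrates to a potential, so gradient descent converges to the equilibrium), while the second is purely antisymmetric with zero divergence, matching the recurrent behaviour the abstract attributes to the Solenoidal component.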
Supplementary Material: pdf
Primary Area: learning theory
Submission Number: 10256