Keywords: Implicit Neural Representation, AI for Math, Extremal Graph Theory, Homomorphism Density, Graphon
TL;DR: We propose a novel implicit neural representation of graphons for automatically solving asymptotic extremal problems in graph theory.
Abstract: Machine learning is opening new directions in mathematical discovery, from automated provers for IMO problems
to new insights and progress on mathematical conjectures guided by machine-learning techniques. We extend this line of research by proposing a neural architecture that tackles asymptotic extremal problems in graph theory.
For an optimisation question on graphs with $n$ vertices, an asymptotic extremal problem investigates the limiting behaviour of the answer to the question as $n$ tends to infinity.
We start with a well-known fact that these problems are often equivalently formulated as optimisation problems over graphons, which are symmetric measurable functions from $[0,1]^2$ to $[0,1]$.
We represent graphons as neural networks from $[0,1]^2$ to $[0,1]$, and solve graphon-optimisation problems by gradient descent.
Because optimal graphons for these problems are usually discontinuous, naively applying a standard neural architecture and a standard learning algorithm does not lead to mathematically interesting near-optimal solutions. To overcome this challenge, we propose a new neural architecture inspired by diffusion models, which uses multiple varying-scale sinusoidal encodings of a given input. Moreover, to handle the constraints in these graphon-optimisation problems effectively, our architecture includes a constraint solver that finds a solution of an empirical counterpart of each constraint,
and backpropagates the gradient signal through the solver using implicit differentiation.
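As a concrete illustration of the representation described above, the following minimal sketch (with hypothetical scale and width choices; the paper's exact encoding and training details are not specified here) builds a symmetric neural graphon $W\colon [0,1]^2 \to [0,1]$ from multi-scale sinusoidal features, with a sigmoid output to keep values in $[0,1]$ and symmetrisation by averaging $W(x,y)$ and $W(y,x)$:

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(t, num_scales=6):
    """Multi-scale sinusoidal encoding of a scalar in [0, 1].

    The geometric scales 2^k * pi are a hypothetical choice standing in
    for the paper's varying-scale encodings.
    """
    freqs = (2.0 ** np.arange(num_scales)) * np.pi
    return np.concatenate([np.sin(freqs * t), np.cos(freqs * t)])

# Tiny MLP with random weights standing in for a trained network.
D = 2 * 2 * 6  # two coordinates, each mapped to 12 sinusoidal features
W1 = rng.normal(size=(32, D)) / np.sqrt(D)
b1 = np.zeros(32)
W2 = rng.normal(size=(1, 32)) / np.sqrt(32)
b2 = np.zeros(1)

def graphon(x, y):
    """Neural graphon W: [0,1]^2 -> [0,1], symmetrised by averaging."""
    def net(u, v):
        h = np.tanh(W1 @ np.concatenate([encode(u), encode(v)]) + b1)
        z = (W2 @ h + b2)[0]
        return 1.0 / (1.0 + np.exp(-z))  # sigmoid keeps the output in [0, 1]
    return 0.5 * (net(x, y) + net(y, x))
```

In an actual training loop, the parameters would be updated by gradient descent on a graphon-optimisation objective (e.g. a combination of homomorphism densities); the averaging trick guarantees $W(x,y)=W(y,x)$ exactly, for any parameter values.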
For some generalised Turán problems and domination-exponent problems previously solved through extensive human effort, our method rediscovers known optimal graphons without human intervention.
For some open problems of the same kind, it finds novel candidate optima and counterexamples, which may be of interest to the mathematics community.
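The implicit-differentiation step mentioned in the abstract can be illustrated on a toy scalar constraint (a hypothetical stand-in for the empirical counterpart of a homomorphism-density constraint): solve $c(\lambda,\theta)=\lambda^3+\theta\lambda-1=0$ for $\lambda(\theta)$ with Newton's method, then obtain $d\lambda/d\theta$ from the implicit function theorem rather than by differentiating through the solver's iterations:

```python
import numpy as np

def c(lam, theta):
    # Toy constraint c(lam, theta) = 0 defining lam implicitly as a
    # function of theta.
    return lam**3 + theta * lam - 1.0

def solve(theta, lam=1.0, iters=50):
    # Newton's method: find lam(theta) with c(lam, theta) = 0.
    for _ in range(iters):
        lam -= c(lam, theta) / (3.0 * lam**2 + theta)
    return lam

def dlam_dtheta(theta):
    # Implicit function theorem:
    #   dlam/dtheta = -(dc/dtheta) / (dc/dlam) = -lam / (3*lam^2 + theta).
    lam = solve(theta)
    return -lam / (3.0 * lam**2 + theta)
```

The implicit gradient can be checked against a central finite difference of `solve`; in the architecture described above, the same idea lets the loss gradient flow through the constraint solver without unrolling it.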
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 10962