Potts Relaxations and Soft Self-labeling for Weakly-Supervised Segmentation

13 May 2024 (modified: 06 Nov 2024) · Submitted to NeurIPS 2024 · CC BY 4.0
Keywords: Soft pseudo-labels; Potts model; Scribble-supervised semantic segmentation
Abstract: We consider weakly-supervised segmentation, where only a fraction of pixels have ground-truth labels (scribbles), and focus on a self-labeling approach where soft pseudo-labels on unlabeled pixels optimize a relaxation of the standard unsupervised CRF/Potts loss. While weakly-supervised semantic segmentation (WSSS) methods can directly optimize CRF losses via gradient descent, prior work suggests that higher-order optimization, e.g. jointly estimating pseudo-labels via discrete graph-cut sub-problems, can lead to better network training. The inability of hard pseudo-labels to represent class uncertainty motivates relaxed, soft pseudo-labels. We systematically evaluate standard and new CRF relaxations, neighborhood systems, and losses connecting network predictions with soft pseudo-labels, and we propose a general continuous solver for such pseudo-label sub-problems. A soft self-labeling loss combining the log-quadratic Potts relaxation with collision cross-entropy achieves state-of-the-art results and can outperform full pixel-precise supervision on PASCAL.
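To make the loss mentioned in the abstract concrete: collision cross-entropy measures the negative log-probability that two independent samples, one from the soft pseudo-label distribution and one from the network's predicted distribution, land on the same class. Below is a minimal NumPy sketch (not the paper's code) contrasting it with standard cross-entropy on an illustrative example; the distributions used are assumptions for demonstration only.

```python
import numpy as np

def collision_cross_entropy(p, q, eps=1e-12):
    """Negative log-probability that samples from p and q collide:
    H_c(p, q) = -log(sum_k p_k * q_k)."""
    return -np.log(np.sum(np.asarray(p) * np.asarray(q), axis=-1) + eps)

def standard_cross_entropy(p, q, eps=1e-12):
    """Usual cross-entropy H(p, q) = -sum_k p_k * log(q_k)."""
    p, q = np.asarray(p), np.asarray(q)
    return -np.sum(p * np.log(q + eps), axis=-1)

# Illustrative example: an uncertain soft pseudo-label paired with a
# confident network prediction (values chosen for demonstration).
pseudo = np.array([0.5, 0.5])  # pseudo-label hedges between two classes
pred = np.array([0.9, 0.1])    # network commits to class 0

print(collision_cross_entropy(pseudo, pred))  # ~0.693
print(standard_cross_entropy(pseudo, pred))   # ~1.204
```

Note how the standard cross-entropy penalizes the confident prediction much more heavily under the uncertain pseudo-label, while the collision variant stays moderate; this tolerance of label uncertainty is one motivation for pairing it with soft pseudo-labels.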
Primary Area: Machine vision
Submission Number: 7619