Discovering a Zero (Zero-Vector Class of Machine Learning)

Published: 01 May 2025 · Last Modified: 18 Jun 2025 · ICML 2025 Spotlight Poster · CC BY-NC-SA 4.0
TL;DR: Classes are defined as vectors in a vector space, where addition corresponds to the union of classes and scalar multiplication resembles the set complement of a class. The zero vector in that vector space has many useful applications.
Abstract:

In machine learning, separating data into classes is a fundamental problem. This work presents a mathematical framework around classes to deepen our understanding of them. Classes are defined as vectors in a vector space, where addition corresponds to the union of classes and scalar multiplication resembles the set complement of a class. The zero vector in this vector space corresponds to a class referred to as the Metta-Class. This discovery enables numerous applications. One such application, termed 'clear learning' in this work, focuses on learning the true nature (manifold) of the data instead of merely learning a boundary sufficient for classification. Another application, called 'unary class learning', involves learning a single class in isolation rather than learning by comparing two or more classes. Additionally, 'set operations on classes' is another application highlighted in this work. Furthermore, continual learning of classes is facilitated using smaller networks. The Metta-Class enables neural networks to learn only the data manifold; therefore, it can also be used to generate new data. Results for the key applications are shown using the MNIST dataset. To further strengthen the claims, some results are also produced using CIFAR-10 and ImageNet-1k embeddings. The code supporting these applications is publicly available at: github.com/hm-4/Metta-Class.
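To make the vector-space framing concrete, the following is a minimal numpy sketch under one plausible encoding; the signed {-1, 0, +1} membership vectors and the class_vector helper are illustrative assumptions of ours, not the paper's exact construction:

```python
# Toy reading of the class-as-vector idea: a class over a finite pool of
# N examples is a signed membership vector in {-1, 0, +1}^N, where
#   +1 -> the example belongs to the class,
#   -1 -> it does not,
#    0 -> "neither in nor out".
import numpy as np

N = 8  # size of a small example pool

def class_vector(members, n=N):
    """Signed membership vector: +1 on members, -1 everywhere else."""
    v = -np.ones(n)
    v[list(members)] = 1.0
    return v

A = class_vector({0, 1, 2})
B = class_vector({2, 3, 4})

# Scalar multiplication by -1 flips membership, so -v(A) encodes the complement.
assert np.array_equal(-A, class_vector({3, 4, 5, 6, 7}))

# Addition relates to union: v(A) + v(B) is non-negative exactly on A or B;
# the +1 offset before sign() breaks ties for points in exactly one class.
union_AB = np.sign(A + B + 1)
assert np.array_equal(union_AB, class_vector({0, 1, 2, 3, 4}))

# The zero vector commits to no membership at all, which is one natural
# reading of the Metta-Class.
metta = np.zeros(N)
```

In this toy encoding the zero vector is the unique class vector that asserts nothing about any example, which is why it can stand for the raw data manifold rather than any particular class boundary.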

Lay Summary:

Machine learning models are often designed to tell different categories apart, such as separating cats from dogs, by simply drawing a boundary between them. This idea comes from traditional statistical thinking, where the goal is to find a separating point or surface that distinguishes classes. For example, logistic regression and linear discriminant analysis (LDA) are well-known techniques that focus on learning such decision boundaries. While effective for classification, this approach does not help the model truly understand what each group looks like on its own. As a result, the model may struggle to generalize to new situations, adapt to changing data, or generate meaningful new examples.

Classical statistical models are often limited to simple two-class problems and cannot capture the rich complexity of the real world. Neural networks, in contrast, are far more powerful: they can model multiple categories at once and learn more complex structure within the data, which makes them better suited to real-world tasks. An important property of classes is that set operations can be performed on them. We leveraged this idea in our work by incorporating set operations on the classes learned by the neural network: we model classes as vectors, so that set operations like union and complement correspond to vector addition and scalar multiplication (see the sketch below).
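As a sketch of what composing independently learned classes could look like, the toy code below treats unary classes as membership predicates; the ClassFn type and the stand-in predicates are our illustrative assumptions, not the trained models from the paper:

```python
# Illustrative sketch: once classes can be learned in isolation ("unary
# class learning"), set operations let you compose them without retraining.
# The lambdas below are stand-ins for trained unary class models.
from typing import Callable

ClassFn = Callable[[str], bool]   # a unary class: does x belong?

def union(a: ClassFn, b: ClassFn) -> ClassFn:
    """A class whose members belong to a or to b."""
    return lambda x: a(x) or b(x)

def complement(a: ClassFn) -> ClassFn:
    """A class whose members do not belong to a."""
    return lambda x: not a(x)

is_cat: ClassFn = lambda x: x == "cat"   # stand-in for a learned cat model
is_dog: ClassFn = lambda x: x == "dog"   # stand-in for a learned dog model

is_pet = union(is_cat, is_dog)
is_not_cat = complement(is_cat)

print(is_pet("dog"), is_pet("bird"))     # True False
print(is_not_cat("bird"))                # True
```

In the paper's framework these compositions happen in the vector space of classes, via addition and scalar multiplication, rather than on Boolean predicates; the sketch only shows the set algebra being expressed.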

This opens up new capabilities for machine learning, such as learning one class in isolation, generating new examples, and learning continuously over time. In doing so, we help neural networks break free from the boundary-drawing constraints of the classical statistical mindset.

Application-Driven Machine Learning: This submission is on Application-Driven Machine Learning.
Primary Area: General Machine Learning
Keywords: Metta, Metta-Class, Metta Class, Machine Learning, ICML, Class Vector, Class Tensor Equation, Class Integration, Repository of Classes, Continual Learning, Class Addition, Class Subtraction, Class Invert, Zero Vector Class, Set operations on Classes, Boolean operation on Classes, Unary classification, Manifold learning, Lifelong Learning, Classifier as Generator
Submission Number: 4514