Abstract: We introduce the Neural Conditioner (NC), a self-supervised machine able to learn about all the conditional distributions of a random vector X. The NC is a function NC(x⋅a, a, r) that leverages adversarial training to match each conditional distribution P(X_r | X_a = x_a). After training, the NC generalizes to sample from conditional distributions never seen during training, including the joint distribution. The NC is also able to auto-encode examples, providing data representations useful for downstream classification tasks. In sum, the NC integrates different self-supervised tasks (each being the estimation of a conditional distribution) and levels of supervision (partially observed data) seamlessly into a single learning experience.
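As a rough illustration of the training procedure described above, the sketch below shows one way to set up the adversarial game: sample disjoint "available" (a) and "requested" (r) masks, have NC(x⋅a, a, r) generate the requested coordinates, and train a discriminator to tell generated blocks from real ones. This is a minimal sketch assuming a PyTorch-style GAN setup; the mask distributions, network sizes, noise dimension, and loss are illustrative placeholders rather than the authors' exact choices, for which see the code link below.

```python
# Minimal illustrative sketch of Neural Conditioner training (not the
# reference implementation). Architectures and hyperparameters are assumed.
import torch
import torch.nn as nn

d = 16  # dimensionality of the random vector X (assumed)

def sample_masks(batch_size, d):
    """Sample disjoint binary masks: available (a) and requested (r)."""
    a = torch.bernoulli(torch.full((batch_size, d), 0.5))
    r = torch.bernoulli(torch.full((batch_size, d), 0.5)) * (1 - a)
    return a, r

class NeuralConditioner(nn.Module):
    """NC(x*a, a, r): maps observed coordinates, both masks, and noise
    to a sample of the requested coordinates."""
    def __init__(self, d, noise_dim=8, hidden=128):
        super().__init__()
        self.noise_dim = noise_dim
        self.net = nn.Sequential(
            nn.Linear(3 * d + noise_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, d),
        )

    def forward(self, x, a, r):
        z = torch.randn(x.size(0), self.noise_dim, device=x.device)
        out = self.net(torch.cat([x * a, a, r, z], dim=1))
        return out * r  # only the requested coordinates are generated

class Discriminator(nn.Module):
    """Scores (x*a, x_r, a, r) tuples: real requested block vs. NC sample."""
    def __init__(self, d, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(4 * d, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x_a, x_r, a, r):
        return self.net(torch.cat([x_a, x_r, a, r], dim=1))

nc, disc = NeuralConditioner(d), Discriminator(d)
opt_g = torch.optim.Adam(nc.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(disc.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    x = torch.randn(64, d)  # stand-in for a minibatch of real data
    a, r = sample_masks(x.size(0), d)

    # Discriminator step: real requested block x*r vs. generated NC(x*a, a, r).
    fake = nc(x, a, r).detach()
    d_real = disc(x * a, x * r, a, r)
    d_fake = disc(x * a, fake, a, r)
    loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator (NC) step: fool the discriminator on the requested coordinates.
    fake = nc(x, a, r)
    g_score = disc(x * a, fake, a, r)
    loss_g = bce(g_score, torch.ones_like(g_score))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```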
Code Link: https://github.com/IshmaelBelghazi/learning_an_exponential_amount_of_conditional_distributions/
CMT Num: 7615