KBFormer: A Transformer-based Diffusion Model of Structured Entities with Heterogeneous Properties

23 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: generative models
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Knowledge Bases, Structured Data, Discrete State Diffusion
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: We present a generative attention-based architecture that models structured entities comprising heterogeneous property types: numerical, categorical, string, and composite. The architecture handles this heterogeneity through a mixed continuous-discrete diffusion process over the properties. This flexible framework can model entities with arbitrary hierarchical properties, enabling applications to structured knowledge-base (KB) entities and tabular data. Experiments with a device KB and a nuclear physics dataset demonstrate the model's ability to learn representations useful for entity completion in diverse settings. This supports many downstream use cases, including modeling numerical properties with high accuracy, which is critical for scientific applications. The model is also inherently probabilistic, so its predictions come with uncertainties. We leverage these capabilities on a nuclear physics dataset to make precise predictions for various properties of nuclei.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 8454
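As a rough illustration of the mixed continuous-discrete diffusion described in the abstract, the sketch below applies Gaussian noise to numerical properties and absorbing-state ("mask") corruption to discrete ones. The noise schedules, function names, and the treatment of string properties as discrete tokens are illustrative assumptions, not the paper's exact formulation.

```python
import math
import random

# Assumed absorbing state for discrete (categorical/string) properties.
MASK = "<mask>"

def noise_numerical(x, t, T):
    """Gaussian forward step: x_t = sqrt(a_bar)*x_0 + sqrt(1-a_bar)*eps.
    The cosine schedule here is a common choice, assumed for illustration."""
    a_bar = math.cos(0.5 * math.pi * t / T) ** 2
    eps = random.gauss(0.0, 1.0)
    return math.sqrt(a_bar) * x + math.sqrt(1.0 - a_bar) * eps

def noise_categorical(v, t, T):
    """Absorbing-state forward step: replace the value with MASK
    with probability t/T (a simple linear corruption schedule)."""
    return MASK if random.random() < t / T else v

def diffuse_entity(entity, schema, t, T):
    """Apply the type-appropriate corruption to each property of an entity."""
    out = {}
    for key, value in entity.items():
        if schema[key] == "numerical":
            out[key] = noise_numerical(value, t, T)
        else:  # categorical and string properties treated as discrete
            out[key] = noise_categorical(value, t, T)
    return out

# Hypothetical entity with heterogeneous properties (e.g., a nuclide record).
schema = {"mass": "numerical", "charge": "numerical", "element": "categorical"}
entity = {"mass": 55.85, "charge": 26.0, "element": "Fe"}
noisy = diffuse_entity(entity, schema, t=80, T=100)
```

A denoising model would then be trained to reverse these corruptions jointly, predicting clean values for all property types from the noisy entity.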