Abstract: In the propositional setting, the marginal problem is to
find a (maximum-entropy) distribution that has some given
marginals. We study this problem in a relational setting and
make the following contributions. First, we compare two different
notions of relational marginals. Second, we show a duality
between the resulting relational marginal problems and
the maximum likelihood estimation of the parameters of relational
models, which generalizes a well-known duality from
the propositional setting. Third, by exploiting the relational
marginal formulation, we present a statistically sound method
to learn the parameters of relational models that will be applied
in settings where the number of constants differs between
the training and test data. Furthermore, based on a relational
generalization of marginal polytopes, we characterize
cases where the standard estimators based on a feature's number
of true groundings need to be adjusted, and we quantitatively
characterize the consequences of these adjustments.
Fourth, we prove bounds on expected errors of the estimated
parameters, which allows us to lower-bound, among other
things, the effective sample size of relational training data.
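
For concreteness, the propositional maximum-entropy marginal problem referenced above can be written as follows (the notation here is illustrative and not taken from the paper): given feature functions $\phi_1,\dots,\phi_k$ over states $x$ and target marginals $\mu_1,\dots,\mu_k$, find

$$p^\ast \;=\; \arg\max_{p}\; -\sum_{x} p(x)\log p(x) \quad \text{s.t.} \quad \sum_{x} p(x)\,\phi_i(x) = \mu_i \;\;\text{for all } i, \qquad \sum_{x} p(x) = 1.$$

Its Lagrangian dual is maximum-likelihood estimation of the weights $\theta$ in the exponential-family model $p_\theta(x) \propto \exp\!\big(\sum_i \theta_i \phi_i(x)\big)$; this is the well-known propositional duality that the abstract says is generalized to relational marginal problems.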