Abstract: The tensor Ising model is a discrete exponential family used for modeling binary data on networks with not just pairwise, but higher-order dependencies. A particularly important class of tensor Ising models is the class of tensor Curie-Weiss models, where all tuples of nodes of a given order interact with the same intensity. A computationally efficient alternative to the intractable maximum likelihood estimator (MLE) in this model is the maximum pseudolikelihood estimator (MPLE). In this paper, we show that the MPLE is in fact as efficient as the MLE (in the Bahadur sense) in the 2-spin model, and for all values of the null parameter above log 2 in higher-order tensor models. Moreover, there exists an estimation threshold below which consistent estimation of the model parameter is impossible; hence, even if the null parameter lies within the narrow window between this threshold and log 2, the MPLE and the MLE are equally efficient unless the alternative parameter is large. Therefore, the MPLE is not only computationally preferable to the MLE, but also theoretically as efficient as the MLE over most of the parameter space. Our results extend to the more general class of Erdős-Rényi hypergraph Ising models, even under slight sparsity.
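As a concrete illustration of the pseudolikelihood approach summarized above, the following sketch numerically maximizes the sum of conditional log-likelihoods of a p-spin Curie-Weiss sample over the inverse temperature. The mean-field form of the Hamiltonian (H_beta(x) = beta * N * xbar^p), the helper names log_pseudolikelihood and mple, and the search bracket are illustrative assumptions, not the paper's own implementation.

import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative sketch, not the paper's implementation: the p-spin Curie-Weiss
# Hamiltonian is taken in its mean-field form H_beta(x) = beta * N * xbar**p,
# with xbar the sample mean of the +/-1 spins x_1, ..., x_N.

def log_pseudolikelihood(beta, x, p):
    """Sum over sites of log P_beta(x_i | x_{-i}) under the assumed Hamiltonian."""
    n = x.size
    s = x.sum()
    # Hamiltonian values with x_i set to +1 and to -1, vectorized over i.
    h_plus = beta * n * ((s - x + 1.0) / n) ** p
    h_minus = beta * n * ((s - x - 1.0) / n) ** p
    h_curr = np.where(x > 0, h_plus, h_minus)  # Hamiltonian at the observed spin
    return np.sum(h_curr - np.logaddexp(h_plus, h_minus))

def mple(x, p=2, beta_max=5.0):
    """MPLE: numerically maximize the pseudolikelihood over beta in [0, beta_max]."""
    res = minimize_scalar(lambda b: -log_pseudolikelihood(b, x, p),
                          bounds=(0.0, beta_max), method="bounded")
    return res.x

# Toy usage: spins drawn as independent fair coins (true beta = 0).
rng = np.random.default_rng(0)
x = rng.choice([-1.0, 1.0], size=1000)
print(mple(x, p=2))

Unlike the MLE, whose normalizing constant requires summing over all 2^N spin configurations, each evaluation above costs only O(N) operations, which is the computational advantage the abstract refers to.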