Fast Relational Probabilistic Inference and Learning: Approximate Counting via Hypergraphs

Published in the Thirty-Third AAAI Conference on Artificial Intelligence (AAAI'19), Honolulu, HI, 2019

Recommended citation: M. Das, D. S. Dhami, G. Kunapuli, K. Kersting and S. Natarajan. Fast Relational Probabilistic Inference and Learning: Approximate Counting via Hypergraphs. Proc. Thirty-third AAAI Conference on Artificial Intelligence (2019). http://gkunapuli.github.io/files/19hypergraphAAAI.pdf

Counting the number of true instances of a clause is arguably a major bottleneck in relational probabilistic inference and learning. We approximate counts in two steps: (1) transform the fully grounded relational model into a large hypergraph, and partially instantiated clauses into hypergraph motifs; (2) since the expected counts of the motifs are provably the clause counts, approximate them using summary statistics (in/out-degrees, edge counts, etc.). Our experimental results demonstrate the efficiency of these approximations, which apply to many complex statistical relational models and can be significantly faster than the state of the art, for both inference and learning, without sacrificing effectiveness.
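To give a flavor of step (2), here is a minimal toy sketch of the general idea: estimating the number of true groundings of a conjunctive clause from summary statistics rather than by enumeration. The relations, domain, and independence assumption below are illustrative inventions, not the paper's actual estimator.

```python
# Toy domain of 5 people; hypothetical relations (not from the paper).
people = range(5)
friends = {(0, 1), (0, 2), (1, 2), (3, 4), (2, 3)}   # Friends(x, y)
smokes = {1, 2, 4}                                   # Smokes(y)

# Exact count of true groundings of Friends(x, y) AND Smokes(y),
# obtained by iterating over all groundings (the expensive route):
exact = sum(1 for (x, y) in friends if y in smokes)

# Approximate count from summary statistics alone (edge count and
# relation size), under a naive independence assumption:
approx = len(friends) * len(smokes) / len(people)

print(exact, approx)
```

With the toy data the exhaustive count is 4 while the summary-statistic estimate is 3.0; the estimate trades a small error for avoiding a pass over the grounded clause, which is the general flavor of the speedups reported in the paper.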
