Learning Parameters for Relational Probabilistic Models with Noisy-Or Combining Rule
Published in Eighth International Conference on Machine Learning and Applications (ICMLA'09), Miami Beach, FL, 2009
Recommended citation: S. Natarajan, P. Tadepalli, G. Kunapuli and J. W. Shavlik. Learning Parameters for Relational Probabilistic Models with Noisy-Or Combining Rule. Eighth International Conference on Machine Learning and Applications (ICMLA'09), Miami Beach, FL, December 13-15, 2009. http://gkunapuli.github.io/files/09NoisyORICMLA.pdf
Languages that combine predicate logic with probabilities are needed to succinctly represent knowledge in many real-world domains. We consider a formalism based on universally quantified conditional influence statements that capture local interactions between object attributes. The effects of different conditional influence statements can be combined using combining rules such as Noisy-OR. To combine multiple instantiations of the same rule, we need other combining rules at a lower level. In this paper, we derive and implement algorithms based on gradient descent and EM for learning the parameters of these multi-level combining rules. We compare our approaches to learning in Markov Logic Networks and show superior performance in multiple domains.
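To illustrate the multi-level combining described in the abstract, here is a minimal sketch (not the paper's implementation) of the Noisy-OR combining rule applied at two levels: first over the instantiations of each rule, then over the per-rule results. The function names and the choice of Noisy-OR at both levels are illustrative assumptions.

```python
def noisy_or(probs):
    """Noisy-OR combining rule: the combined effect fires unless
    every individual cause independently fails to produce it."""
    fail = 1.0
    for p in probs:
        fail *= (1.0 - p)  # probability that this cause does not fire
    return 1.0 - fail

def two_level_combine(rule_instantiations):
    """Two-level combining (illustrative): each inner list holds the
    probabilities contributed by the instantiations of one rule.
    Instantiations are combined first; the per-rule results are then
    combined across rules."""
    per_rule = [noisy_or(insts) for insts in rule_instantiations]
    return noisy_or(per_rule)

# Example: one rule with two instantiations (0.5, 0.5) and another
# with a single instantiation (0.2).
print(two_level_combine([[0.5, 0.5], [0.2]]))  # -> 0.8
```

Here the first rule's instantiations combine to 1 - 0.5 * 0.5 = 0.75, and the top level yields 1 - 0.25 * 0.8 = 0.8. The paper's learning algorithms (gradient descent and EM) would fit the parameters that feed these probabilities.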