# Markov Logic Networks

A Markov Logic Network (MLN) combines first-order logic with probabilistic graphical models. In first-order logic, a set of formulas represents hard constraints over a set of instances: if an instance violates any one of them, it has zero probability. The basic idea of an MLN is to generalize first-order logic by softening those hard constraints, attaching a real-valued weight to each formula that indicates how strong a constraint it is; an instance that violates one or more formulas is then no longer impossible, just less probable.

## Definition

A Markov Logic Network $L$ is a set of pairs $(F_{i},w_{i})$, where $F_{i}$ is a formula in first-order logic and $w_{i}$ is a real number. Given a finite set of constants $C=\{c_{1},c_{2},\ldots\}$, $L$ together with $C$ defines a Markov network $M_{L,C}$ in the following way:

• $M_{L,C}$ contains one binary node for each possible grounding of each predicate in $L$ . If the ground atom is true, the value of the node is 1, and 0 otherwise.
• $M_{L,C}$ contains one feature for each possible grounding of each formula $F_{i}$ in $L$ . If the ground formula is true, the value of the feature is 1, and 0 otherwise; and the weight of the feature is $w_{i}$ .
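The grounding step can be sketched concretely. In this minimal example the constants, predicates, formula, and weight are all illustrative (a simplified version of the well-known smoking example):

```python
from itertools import product

# Illustrative domain: two constants, two unary predicates, one binary
# predicate, and one weighted formula Smokes(x) => Cancer(x).
constants = ["Anna", "Bob"]

# One binary node per grounding of each predicate.
ground_atoms = (
    [f"Smokes({c})" for c in constants]
    + [f"Cancer({c})" for c in constants]
    + [f"Friends({a},{b})" for a, b in product(constants, repeat=2)]
)

# One feature per grounding of the formula; all groundings of a formula
# share its weight.
w = 1.5
ground_features = [
    {"formula": f"Smokes({c}) => Cancer({c})", "weight": w}
    for c in constants
]

print(len(ground_atoms))     # 2 + 2 + 4 = 8 ground atoms
print(len(ground_features))  # 2 weighted features
```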

In other words, the Markov network $M_{L,C}$ has an edge between two nodes iff the corresponding ground atoms appear together in at least one grounding of a formula in $L$ . The probability distribution over possible instances $x$ specified by $M_{L,C}$ is then:

$P(X=x)={\frac {1}{Z}}\prod _{i}{\phi _{i}(x_{\{i\}})}$ where $Z$ is the normalization factor (the partition function), $\phi _{i}(x_{\{i\}})$ is the potential function defined on the $i$ -th clique, which corresponds to $F_{i}$ , and $x_{\{i\}}$ is the vector of discrete variables in that clique. Since $\phi _{i}(x_{\{i\}})$ equals $e^{w_{i}}$ when $F_{i}(x_{\{i\}})$ is true and 1 otherwise, the probability can be rewritten as:

$P(X=x)={\frac {1}{Z}}\exp \left\{\sum _{i}{w_{i}n_{i}(x)}\right\}$ where $n_{i}(x)$ is the number of true groundings of $F_{i}$ in $x$ .
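This distribution can be computed exactly for a tiny model by brute-force enumeration. Here is a minimal sketch, assuming two ground atoms $S$ and $C$ and a single formula $S \Rightarrow C$ with an illustrative weight:

```python
from itertools import product
from math import exp

w = 1.5  # illustrative weight for the single formula S => C

def n(s, c):
    # Number of true groundings of S => C in the world (s, c): 0 or 1.
    return 1 if ((not s) or c) else 0

# Enumerate all possible worlds x = (s, c) and their unnormalized weights
# exp(w * n(x)), then normalize by the partition function Z.
worlds = list(product([False, True], repeat=2))
unnorm = {x: exp(w * n(*x)) for x in worlds}
Z = sum(unnorm.values())
P = {x: u / Z for x, u in unnorm.items()}

# The world (S=True, C=False) violates the formula, so it is the least
# probable world, but its probability is still nonzero.
assert min(P, key=P.get) == (True, False)
```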

## Relationship to Markov Networks

Once grounded, an MLN is a Markov network: the vertices of the network graph are the ground atoms, and two vertices are joined by an edge iff the corresponding atoms appear together in at least one ground formula. Each ground formula forms a clique, and the Markov blanket of a ground atom is the set of atoms that share some ground formula with it. A potential function is associated with each ground formula and, combined with the formula's weight, yields the Gibbs measure and partition function of the Markov network.

Therefore, inference in MLNs can be performed with standard Markov network techniques, such as Gibbs sampling, belief propagation, or approximation via pseudolikelihood.
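As one concrete illustration, Gibbs sampling estimates marginals by resampling each ground atom from its conditional distribution given the rest. The sketch below uses an illustrative two-atom model (atoms `S` and `C`, formula $S \Rightarrow C$, weight 1.5), not any particular MLN library's API:

```python
import random
from math import exp

random.seed(0)
w = 1.5  # illustrative weight for the single formula S => C

def n(state):
    # Number of true groundings of S => C in the current world.
    return 1 if ((not state["S"]) or state["C"]) else 0

def gibbs(num_samples=100_000, burn_in=1_000):
    state = {"S": False, "C": False}
    counts = {"S": 0, "C": 0}
    for t in range(burn_in + num_samples):
        for atom in ("S", "C"):
            # P(atom = v | rest) is proportional to exp(w * n(state))
            # with the atom set to v, since P(x) ∝ exp(w * n(x)).
            state[atom] = True
            p_true = exp(w * n(state))
            state[atom] = False
            p_false = exp(w * n(state))
            state[atom] = random.random() < p_true / (p_true + p_false)
        if t >= burn_in:
            for atom in counts:
                counts[atom] += state[atom]
    return {a: counts[a] / num_samples for a in counts}

marginals = gibbs()
# Making C true can only satisfy the formula, while making S true can
# violate it, so the estimated marginal of C exceeds that of S.
```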