Briefly, it is a collection of formulas from first-order logic, to each of which is assigned a real number, the weight. Viewed as a Markov network, the vertices of the network graph are atomic formulas, and the edges are the logical connectives used to construct formulas. Each formula is considered to be a clique, and the Markov blanket of an atom is the set of formulas in which that atom appears. A potential function is associated with each formula; it takes the value one when the formula is true and zero when it is false. The potential function is combined with the weight to form the Gibbs measure and partition function for the Markov network.

The above definition glosses over a subtle point: atomic formulas do not have a truth value unless they are grounded and given an interpretation; that is, until they are ground atoms with a Herbrand interpretation. Thus, a Markov logic network becomes a Markov network only with respect to a specific grounding and interpretation; the resulting Markov network is called the ground Markov network. The vertices of the graph of the ground Markov network are the ground atoms, and the size of the resulting Markov network therefore depends strongly on the number of constants in the domain of discourse.
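For concreteness, the Gibbs measure over a ground Markov network is commonly written in the following standard form from the MLN literature; the notation here (weights w_i and counting functions n_i) is introduced for illustration:

```latex
P(X = x) \;=\; \frac{1}{Z} \exp\!\Big( \sum_i w_i \, n_i(x) \Big),
\qquad
Z \;=\; \sum_{x'} \exp\!\Big( \sum_i w_i \, n_i(x') \Big)
```

Here x is an assignment of truth values to all ground atoms, w_i is the weight of formula F_i, n_i(x) is the number of true groundings of F_i under x, and Z is the partition function summing over all possible worlds x'.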
Inference
The goal of inference in a Markov logic network is to find the stationary distribution of the system, or one that is close to it; that this may be difficult or not always possible is illustrated by the richness of behaviour seen in the Ising model. As in a Markov network, the stationary distribution finds the most likely assignment of probabilities to the vertices of the graph; in this case, the vertices are the ground atoms of an interpretation. That is, the distribution indicates the probability of each ground atom being true or false. Given the stationary distribution, one can then perform inference in the traditional statistical sense of conditional probability: obtain the probability that formula A holds, given that formula B is true. Inference in MLNs can be performed using standard Markov network inference techniques over the minimal subset of the relevant Markov network required for answering the query. These techniques include Gibbs sampling, which is effective but may be excessively slow for large networks, belief propagation, and approximation via pseudolikelihood.
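As a minimal sketch of Gibbs sampling over a ground Markov network, the following assumes a toy MLN with a single weighted formula, Smokes(x) => Cancer(x), grounded over two constants. All names here (CONSTANTS, WEIGHT, gibbs_sample, and so on) are hypothetical and do not correspond to any particular MLN library; the sweep structure and the conditional resampling rule are the generic technique.

```python
import math
import random

# Toy MLN: one formula, Smokes(x) => Cancer(x), with a single weight,
# grounded over two constants. Ground atoms are boolean variables.
CONSTANTS = ["Anna", "Bob"]
WEIGHT = 1.5  # weight attached to Smokes(x) => Cancer(x)

def ground_atoms():
    """Enumerate the ground atoms of the ground Markov network."""
    return [(pred, c) for c in CONSTANTS for pred in ("Smokes", "Cancer")]

def n_true_groundings(world):
    """Count true groundings of Smokes(x) => Cancer(x) in a world.
    An implication is false only when the antecedent is true and the
    consequent is false."""
    return sum(1 for c in CONSTANTS
               if (not world[("Smokes", c)]) or world[("Cancer", c)])

def unnormalized_log_prob(world):
    """log of the unnormalized Gibbs measure: sum_i w_i * n_i(x)."""
    return WEIGHT * n_true_groundings(world)

def gibbs_sample(steps=10000, seed=0):
    """Run Gibbs sweeps over all ground atoms and return estimated
    marginal probabilities P(atom = true)."""
    rng = random.Random(seed)
    atoms = ground_atoms()
    world = {a: rng.random() < 0.5 for a in atoms}  # random initial world
    counts = {a: 0 for a in atoms}
    for _ in range(steps):
        for a in atoms:
            # Resample atom a conditioned on all other ground atoms:
            # P(a=true | rest) = 1 / (1 + exp(log_f - log_t)).
            world[a] = True
            log_t = unnormalized_log_prob(world)
            world[a] = False
            log_f = unnormalized_log_prob(world)
            world[a] = rng.random() < 1.0 / (1.0 + math.exp(log_f - log_t))
        for a in atoms:
            counts[a] += world[a]
    return {a: counts[a] / steps for a in atoms}

if __name__ == "__main__":
    for atom, p in gibbs_sample().items():
        print(atom, round(p, 3))
```

Conditioning on evidence in this sketch amounts to clamping the corresponding atoms to their observed values and skipping them during resampling; the marginals of the remaining atoms then estimate the conditional probabilities described above.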