By Anish Doshi
Andrey Markov was a Russian mathematician who lived from 1856 to 1922 and made significant contributions to statistics and probability theory. He is best known for his namesake chains and processes, but random fields also bear his name.
A Markov random field is a graphical model in which a set of random variables satisfies a Markov property described by an undirected graph.
What this means is that, given a collection of random variables, an undirected graph can be drawn over them (a graph is essentially a set of objects connected by edges; "undirected" means the edges have no orientation). If the variables satisfy the Markov property with respect to that graph, the graph is a Markov random field. In this setting, the Markov property says that each variable is conditionally independent of the variables it is not connected to, given its neighbors: once you know the values of a variable's neighbors, the rest of the graph tells you nothing more about it. The property comes in three forms of increasing strength: the pairwise property, the local property, and the global property, each a statement about which sets of variables in the graph are conditionally independent given others. A major variant of the Markov random field is the conditional random field, in which the variables are modeled conditionally on a set of observed inputs rather than jointly.
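The local Markov property above can be made concrete with a small sketch. The code below is a minimal illustration, not part of any library: the graph, its node names, and the helper functions are all hypothetical. In an undirected graphical model, a variable's "Markov blanket" is exactly its set of neighbors, so any update rule for a node only needs to read its neighbors' values.

```python
# A tiny undirected graph, written as an adjacency dictionary.
# Node names A-D are purely illustrative.
graph = {
    "A": {"B", "C"},
    "B": {"A", "D"},
    "C": {"A", "D"},
    "D": {"B", "C"},
}

def markov_blanket(node):
    """In an undirected graphical model, a node's Markov blanket
    is exactly its set of neighbors in the graph."""
    return graph[node]

def non_neighbours(node):
    """The nodes a variable is conditionally independent of,
    given its neighbors (the local Markov property)."""
    return set(graph) - graph[node] - {node}
```

For example, variable A here is conditionally independent of D given B and C, because D is not a neighbor of A.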
One popular Markov random field is the Ising model, a model from statistical physics in which each node carries a magnetic spin that interacts only with its neighbors. Ising-type models also underlie quantum computing (computing based on quantum mechanics) as pursued by the company D-Wave Systems, Inc., whose hardware performs quantum annealing on Ising problems. Since then, applications of Markov random fields have expanded; one notable example is a 1984 paper by the American mathematicians Donald and Stuart Geman, which introduced the use of Markov random fields for image analysis. Conditional random fields have many applications in gene finding and shallow parsing.
A picture of an undirected graph:
An illustration of the Ising model:
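The Ising model described above can be sketched in a few lines of code. The following is a minimal, illustrative Metropolis simulation on a small two-dimensional lattice, not a production implementation; the lattice size, inverse temperature `beta`, and step count are arbitrary choices. Note how the acceptance test for flipping a spin reads only the four lattice neighbors, which is the Markov (locality) property in action.

```python
import math
import random

def ising_metropolis(n=10, beta=0.6, steps=5000, seed=0):
    """Metropolis sampling of a 2D Ising model on an n x n lattice
    with periodic boundaries. Spins are +1 or -1; beta is the
    inverse temperature. Returns the final spin configuration."""
    rng = random.Random(seed)
    spins = [[rng.choice([-1, 1]) for _ in range(n)] for _ in range(n)]
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        # The energy change from flipping spin (i, j) depends only
        # on its four lattice neighbors -- the Markov property.
        nb = (spins[(i + 1) % n][j] + spins[(i - 1) % n][j]
              + spins[i][(j + 1) % n] + spins[i][(j - 1) % n])
        dE = 2 * spins[i][j] * nb
        # Accept the flip if it lowers the energy, or with
        # probability exp(-beta * dE) otherwise.
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i][j] *= -1
    return spins

grid = ising_metropolis()
magnetization = sum(sum(row) for row in grid) / 100
```

At low temperature (large `beta`) the neighbor coupling makes large aligned regions of spins appear, which is the clustering behavior the illustrations of the Ising model typically show.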