# Undirected Graphical Model Structure

An Undirected Graphical Model Structure is a graphical probability model instance whose structure is an undirected graph (together with a potential function).

**AKA:** Markov Network, MRN, Markov Random Field (MRF).

**Context:**
- It can be associated with an Undirected Conditional Graphical Family.
- It can represent a Random Field that manifests the Markov property.
- It can be associated with an Undirected Graphical Statistical Family.
- It can range from being a Generative Undirected Probabilistic Graphical Model to being a Discriminative Undirected Probabilistic Graphical Model.
- It can range from (typically) being an Undirected Conditional Probability Network to being an Undirected Non-Conditional Probability Network.
- It can be a Traditional Structured Model.
- …

**Example(s):**
**Counter-Example(s):**
**See:** Markov Chain, Conditional Logic; Markov Blanket; Maximum Entropy Markov Model; Graphical Models, Random Field, Bayesian Network.

## References

### 2016

- (Wikipedia, 2016) ⇒ https://en.wikipedia.org/wiki/markov_random_field Retrieved:2016-8-13.
- In the domain of physics and probability, a **Markov random field** (often abbreviated as **MRF**), **Markov network** or **undirected graphical model** is a set of random variables having a Markov property described by an undirected graph. In other words, a random field is said to be a Markov random field if it satisfies Markov properties. A Markov network or MRF is similar to a Bayesian network in its representation of dependencies; the differences being that Bayesian networks are directed and acyclic, whereas Markov networks are undirected and may be cyclic. Thus, a Markov network can represent certain dependencies that a Bayesian network cannot (such as cyclic dependencies); on the other hand, it can't represent certain dependencies that a Bayesian network can (such as induced dependencies). The underlying graph of a Markov random field may be finite or infinite. When the joint probability density of the random variables is strictly positive, it is also referred to as a Gibbs random field, because, according to the Hammersley–Clifford theorem, it can then be represented by a Gibbs measure for an appropriate (locally defined) energy function. The prototypical Markov random field is the Ising model; indeed, the Markov random field was introduced as the general setting for the Ising model. In the domain of artificial intelligence, a Markov random field is used to model various low- to mid-level tasks in image processing and computer vision.
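The Gibbs-measure view mentioned above can be made concrete with a toy Ising model. The sketch below (illustrative, with a made-up 2×2 spin grid and unit coupling, not taken from the quoted text) builds the distribution P(s) ∝ exp(−E(s)) by brute-force enumeration:

```python
import itertools
import math

# Hypothetical 2x2 Ising model: spins s in {-1, +1}; the energy is the
# negative sum of products over adjacent pairs (coupling J = 1).
EDGES = [(0, 1), (2, 3), (0, 2), (1, 3)]  # 2x2 grid, sites numbered 0..3

def energy(s):
    return -sum(s[i] * s[j] for i, j in EDGES)

# Gibbs measure: P(s) = exp(-E(s)) / Z, with Z summing over all 2^4 states.
states = list(itertools.product([-1, 1], repeat=4))
Z = sum(math.exp(-energy(s)) for s in states)

def prob(s):
    return math.exp(-energy(s)) / Z

# Aligned configurations have the lowest energy, hence the highest probability.
assert all(prob((1, 1, 1, 1)) >= prob(s) - 1e-12 for s in states)
assert abs(sum(prob(s) for s in states) - 1.0) < 1e-12
```

Brute-force enumeration is only feasible for tiny grids; real MRF applications rely on sampling (e.g. Gibbs sampling) or approximate inference instead of computing Z directly.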

### 2015

- http://en.wikipedia.org/wiki/Random_element#Random_field
- … Several kinds of random fields exist, among them the Markov random field (MRF), Gibbs random field (GRF), conditional random field (CRF), and Gaussian random field. An MRF exhibits the Markovian property: [math]\displaystyle{ P(X_i=x_i|X_j=x_j, i\neq j) = P(X_i=x_i|\partial_i), \, }[/math] where [math]\displaystyle{ \partial_i }[/math] is a set of neighbours of the random variable *X*_{i}. In other words, the probability that a random variable assumes a value depends on the other random variables only through the ones that are its immediate neighbours. The probability of a random variable in an MRF is given by: [math]\displaystyle{ P(X_i=x_i|\partial_i) = \frac{P(\omega)}{\sum_{\omega'}P(\omega')}, }[/math] where [math]\displaystyle{ \omega' }[/math] is the same realization of [math]\displaystyle{ \omega }[/math], except for random variable *X*_{i}. It is difficult to calculate with this equation without recourse to the relation between MRFs and GRFs proposed by Julian Besag in 1974.
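The quoted Markov property can be checked by brute force on a tiny chain-shaped MRF. The sketch below (a hypothetical three-site binary model with made-up potentials) verifies that conditioning on all other variables gives the same result as conditioning on the neighbours ∂_i alone:

```python
import itertools
import math

# Tiny chain MRF on three binary sites 0-1-2 with pairwise potentials.
# Neighbour sets: site 0 -> {1}, site 1 -> {0, 2}, site 2 -> {1}.
NEIGHBOURS = {0: {1}, 1: {0, 2}, 2: {1}}

def weight(x):
    # exp of an agreement score on the two chain edges (a positive potential)
    return math.exp((x[0] == x[1]) + (x[1] == x[2]))

states = list(itertools.product([0, 1], repeat=3))
Z = sum(weight(w) for w in states)

def cond_on_all_others(i, x):
    """P(X_i = x[i] | all other sites): the quoted formula P(w)/sum_{w'} P(w'),
    where w' ranges over realizations equal to x except possibly at site i."""
    variants = [x[:i] + (v,) + x[i + 1:] for v in (0, 1)]
    return weight(x) / sum(weight(w) for w in variants)

def cond_on_neighbours(i, x):
    # condition only on the neighbour sites, marginalizing everything else
    nb = NEIGHBOURS[i]
    num = sum(weight(w) for w in states
              if w[i] == x[i] and all(w[j] == x[j] for j in nb))
    den = sum(weight(w) for w in states if all(w[j] == x[j] for j in nb))
    return num / den

# Markov property: the two conditionals agree for every site and realization.
for x in states:
    for i in range(3):
        assert abs(cond_on_all_others(i, x) - cond_on_neighbours(i, x)) < 1e-12
```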

### 2011

- (Sammut & Webb, 2011) ⇒ Claude Sammut, Geoffrey I. Webb. (2011). “Markov Network.” In: (Sammut & Webb, 2011) p.646
- QUOTE: A Markov network is a form of undirected graphical model for representing multivariate probability distributions.

- (Sammut & Webb, 2011) ⇒ Claude Sammut, Geoffrey I. Webb. (2011). “Markov Random Field.” In: (Sammut & Webb, 2011) p.646
- QUOTE: Markov Network.

### 2010

- (Wikipedia, 2010) ⇒ http://en.wikipedia.org/wiki/Hidden_Markov_model
- A **hidden Markov model** (HMM) is a statistical Markov model in which the system being modeled is assumed to be a Markov process with unobserved (*hidden*) states. An HMM can be considered as the simplest dynamic Bayesian network.
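As a minimal illustration of inference in such a model, the sketch below implements the standard forward algorithm for a two-state HMM; all parameters (start, transition, and emission probabilities) are made up for illustration:

```python
# Minimal forward-algorithm sketch for an HMM (illustrative numbers only):
# two hidden states emit binary observations; we compute the likelihood of
# an observation sequence by summing over all hidden state paths.
STATES = [0, 1]                      # hypothetical hidden states
START = [0.6, 0.4]                   # initial state distribution
TRANS = [[0.7, 0.3], [0.4, 0.6]]     # TRANS[i][j] = P(next=j | current=i)
EMIT = [[0.9, 0.1], [0.2, 0.8]]      # EMIT[i][o] = P(obs=o | state=i)

def forward(observations):
    """Return P(observations) under the HMM via dynamic programming."""
    alpha = [START[s] * EMIT[s][observations[0]] for s in STATES]
    for obs in observations[1:]:
        alpha = [sum(alpha[i] * TRANS[i][j] for i in STATES) * EMIT[j][obs]
                 for j in STATES]
    return sum(alpha)

# Sanity check: probabilities of all length-2 observation sequences sum to 1.
total = sum(forward([a, b]) for a in (0, 1) for b in (0, 1))
assert abs(total - 1.0) < 1e-12
```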

### 2007

- (Domingos, 2007) ⇒ Pedro Domingos. (2007). “Practical Statistical Relational AI." Tutorial at AAAI 2007 Conference.
- (Sutton & McCallum, 2007) ⇒ Charles Sutton, and Andrew McCallum. (2007). “An Introduction to Conditional Random Fields for Relational Learning.” In: (Getoor & Taskar, 2007).
- A graphical model is a family of probability distributions that factorize according to an underlying graph. The main idea is to represent a distribution over a large number of random variables by a product of local functions that each depend on only a small number of variables. Given a collection of subsets *A* ⊂ *V*, we define an *undirected graphical model* as the set of all distributions that can be written in the form ... <snip> ... (These functions are also called *local functions* or *compatibility functions*.) We will occasionally use the term random field to refer to a particular distribution among those defined by an undirected model. To reiterate, we will consistently use the term model to refer to a family of distributions, and **random field** (or more commonly, **distribution**) to refer to a single one.
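The factorized form Sutton & McCallum describe, a normalized product of local (compatibility) functions over subsets of variables, can be sketched directly. The variable subsets and potential values below are arbitrary illustrations:

```python
import itertools
import math

# Sketch of p(x) = (1/Z) * prod_A Psi_A(x_A): three binary variables
# V = {0, 1, 2}, with subsets A and local functions Psi_A chosen
# arbitrarily for illustration.
FACTORS = [
    ((0, 1), lambda a, b: 2.0 if a == b else 0.5),   # Psi on subset {0, 1}
    ((1, 2), lambda b, c: 3.0 if b != c else 1.0),   # Psi on subset {1, 2}
    ((2,),   lambda c: 1.5 if c == 1 else 1.0),      # Psi on subset {2}
]

def unnormalized(x):
    return math.prod(psi(*(x[i] for i in idx)) for idx, psi in FACTORS)

states = list(itertools.product([0, 1], repeat=3))
Z = sum(unnormalized(x) for x in states)   # partition function

def p(x):
    return unnormalized(x) / Z             # a single distribution (random field)

assert abs(sum(p(x) for x in states) - 1.0) < 1e-12
```

Varying the Psi functions while keeping the subsets fixed traces out the whole family of distributions (the model); each fixed choice gives one random field, matching the terminology in the quote.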

### 2006

- (Awate, 2006) ⇒ Suyash P. Awate. (2006). “Adaptive Nonparametric Markov Models and Information-Theoretic Methods for Image Restoration and Segmentation." PhD Dissertation.
- Markov random fields (MRFs) are stochastic models that characterize the local spatial interactions in data. The last 40 years have seen significant advances in the mathematical analysis of MRFs as well as numerous application areas for MRFs, ranging from physics, pattern recognition, machine learning, and artificial intelligence to image processing and computer vision. This has firmly established MRFs as powerful statistical tools for data analysis. This dissertation proposes an adaptive MRF image model and builds image-processing methods relying on this model. This section gives a brief review of the theory behind MRFs and some relevant MRF-based algorithms.
	The first concept of *MRF theory* came from the physicist Ernst Ising in the 1920s. Ising was trying to devise a mathematical model to explain the experimental results concerning properties of ferromagnetic materials. This dealt with local interactions between a collection of dipoles associated with such materials. He published the model in his doctoral thesis, which later became popular as the Ising model. The name Markov, however, is dedicated to the memory of the mathematician Andrei Markov, who pioneered the work on Markov chains, i.e., ordered sequences of RVs where the conditional PDF of an RV given all previous RVs is exactly the same as the conditional PDF of the RV given only its preceding RV. In other words, the next RV, given the present RV, is conditionally independent of all other previous RVs. This notion of conditional independence concerning chains of RVs generalizes to grids of RVs, or random fields. Such random fields are called MRFs. A random field [47,161] is a family of RVs [math]\displaystyle{ \bf{X} = \lbrace {X_t} \rbrace_{t \in \mathcal{T} } }[/math], for some index set [math]\displaystyle{ \mathcal{T} }[/math]. For each index [math]\displaystyle{ t }[/math], the RV [math]\displaystyle{ X_t }[/math] is defined on some sample-space [math]\displaystyle{ \Omega }[/math]. If we let [math]\displaystyle{ \mathcal{T} }[/math] be a set of points defined on a discrete Cartesian grid and fix [math]\displaystyle{ \Omega = \omega }[/math], we have a realization or an instance of the random field, [math]\displaystyle{ {\bf X} (\omega) = {\bf x} }[/math], called the digital image. In this case, [math]\displaystyle{ \mathcal{T} }[/math] is the set of grid points in the image. For vector-valued images [math]\displaystyle{ X_t }[/math] becomes a vector RV.

### 2005

- (Rue & Held, 2005) ⇒ Havard Rue, and Leonhard Held. (2005). “Gaussian Markov Random Fields: Theory and Applications." CRC Press. ISBN:1584884320
- This monograph considers *Gaussian Markov random fields* (GMRFs), covering both theory and applications. A GMRF is really a simple construct: it is just a (finite-dimensional) random vector following a multivariate normal (or Gaussian) distribution. However, we will be concerned with more restrictive versions where the GMRF satisfies additional *conditional independence* assumptions, hence the term *Markov*. Conditional independence is a powerful concept. Let [math]\displaystyle{ \mathbf{x} = (x_1,x_2,x_3)^T }[/math] be a random vector; then [math]\displaystyle{ x_1 }[/math] and [math]\displaystyle{ x_2 }[/math] are conditionally independent given [math]\displaystyle{ x_3 }[/math] if, for a known value of [math]\displaystyle{ x_3 }[/math], discovering [math]\displaystyle{ x_2 }[/math] tells you nothing new about the distribution of [math]\displaystyle{ x_1 }[/math].
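The conditional-independence idea in Rue and Held's three-variable example can be demonstrated numerically: for a Gaussian vector, a zero entry in the precision matrix Q = Σ⁻¹ corresponds exactly to conditional independence of the two variables given the rest. The matrix values below are made up for illustration:

```python
import numpy as np

# Hypothetical GMRF on (x1, x2, x3): Q[0, 1] == 0 encodes that x1 and x2
# are conditionally independent given x3 (values invented for illustration).
Q = np.array([[2.0, 0.0, -1.0],
              [0.0, 2.0, -1.0],
              [-1.0, -1.0, 3.0]])   # symmetric positive definite precision

Sigma = np.linalg.inv(Q)            # covariance matrix

# Marginally, x1 and x2 are correlated (they share the neighbour x3)...
assert abs(Sigma[0, 1]) > 1e-9

# ...but conditioning on x3 removes the dependence: the conditional
# covariance of (x1, x2) given x3 is the inverse of the corresponding
# 2x2 block of Q, which is diagonal here.
cond_cov = np.linalg.inv(Q[:2, :2])
assert abs(cond_cov[0, 1]) < 1e-12
```

This zero-pattern correspondence is what makes GMRFs computationally attractive: sparse conditional-independence structure means a sparse precision matrix.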

### 2004

- (Taskar et al., 2004) ⇒ Ben Taskar, Carlos Guestrin, and Daphne Koller. (2004). “Max-Margin Markov Networks.” In: Advances in Neural Information Processing Systems.

### 2002

- (Taskar et al., 2002) ⇒ Ben Taskar, Pieter Abbeel, and Daphne Koller. (2002). “Discriminative Probabilistic Models for Relational Data.” In: Proceedings of UAI Conference (UAI 2002).
- We introduce the framework of relational Markov networks (RMNs), which compactly defines a Markov network over a relational data set. The graphical structure of an RMN is based on the relational structure of the domain, and can easily model complex patterns over related entities.

### 1995

- (Li, 1995) ⇒ Stan Z. Li. (1995). “Markov Random Field Modeling in Computer Vision.” Springer-Verlag. ISBN:4431701451