# 1992 ABayesianMethodForTheIndOfProbNets

- (Cooper & Herskovits, 1992) ⇒ Gregory F. Cooper, and Edward Herskovits. (1992). “A Bayesian Method for the Induction of Probabilistic Networks from Data.” In: Machine Learning, 9(4). doi:10.1007/BF00994110

**Subject Headings:** Bayesian Belief Network, Bayesian Network Training.

## Notes

## Cited by

- ~2,298 …

### 1995

- (Heckerman et al., 1995) ⇒ David Heckerman, Dan Geiger, and David M. Chickering. (1995). “Learning Bayesian networks: The combination of knowledge and statistical data.” In: Machine Learning, 20(3). doi:10.1007/BF00994016

## Quotes

### Author Keywords

probabilistic networks, Bayesian belief networks, machine learning, induction

### Abstract

This paper presents a Bayesian method for constructing probabilistic networks from databases. In particular, we focus on constructing Bayesian belief networks. Potential applications include computer-assisted hypothesis testing, automated scientific discovery, and automated construction of probabilistic expert systems. We extend the basic method to handle missing data and hidden (latent) variables. We show how to perform probabilistic inference by averaging over the inferences of multiple belief networks. Results are presented of a preliminary evaluation of an algorithm for constructing a belief network from a database of cases. Finally, we relate the methods in this paper to previous work, and we discuss open problems.

### 1. Introduction

In this paper, we present a Bayesian method for constructing a probabilistic network from a database of records, which we call cases. Once constructed, such a network can provide insight into probabilistic dependencies that exist among the variables in the database. One application is the automated discovery of dependency relationships. The computer program searches for a probabilistic-network structure that has a high posterior probability given the database, and outputs the structure and its probability. A related task is computer-assisted hypothesis testing: The user enters a hypothetical structure of the dependency relationships among a set of variables, and the program calculates the probability of the structure given a database of cases on the variables.

…

A Bayesian belief-network structure, *B*_{S}, is augmented by conditional probabilities, *B*_{P}, to form a Bayesian belief network *B*. Thus, *B* = (*B*_{S}, *B*_{P}). For brevity, we call *B* a belief network. For each node in a belief-network structure, there is a conditional-probability function that relates this node to its immediate predecessors (parents). We shall use π_{i} to denote the parent nodes of variable *x*_{i}. If a node has no parents, then a prior-probability function, P(*x*_{i}), is specified. A set of probabilities is shown in table 2 for the belief-network structure in figure 1. We used the probabilities in table 2 to generate the cases in table 1 by applying Monte Carlo simulation.
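The Monte Carlo case generation described above can be sketched as ancestral sampling: each node is drawn after its parents have been assigned. A minimal sketch for a two-node structure *x*_{1} → *x*_{2}; the probability values are illustrative placeholders, not the actual entries of the paper's table 2:

```python
import random

# Hypothetical two-node belief network x1 -> x2.
# These numbers are illustrative only; the paper's table 2 is not reproduced here.
P_X1_PRESENT = 0.4                        # P(x1 = present)
P_X2_GIVEN_X1 = {True: 0.8, False: 0.1}   # P(x2 = present | x1)

def sample_case(rng=random):
    """Draw one case by sampling each node after its parents (ancestral sampling)."""
    x1 = rng.random() < P_X1_PRESENT        # root node: sampled from its prior
    x2 = rng.random() < P_X2_GIVEN_X1[x1]   # child node: sampled given its parent
    return {"x1": x1, "x2": x2}

# A database of cases, analogous in form to the paper's table 1.
cases = [sample_case() for _ in range(1000)]
```

Because sampling always proceeds from parents to children, this procedure works for any directed acyclic belief-network structure, not just a chain.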

We shall use the term *conditional probability* to refer to a probability statement, such as P(*x*_{2} = *present* | *x*_{1} = *present*). We use the term *conditional-probability assignment* to denote a numerical assignment to a conditional probability, as, for example, the assignment P(*x*_{2} = *present* | *x*_{1} = *present*) = 0.8. The network structure B_{S1} in figure 1 and the probabilities B_{P1} in table 2 together define a belief network which we denote as B1. Belief networks are capable of representing the probabilities over any discrete sample space: The probability of any sample point in that space can be computed from the probabilities in the belief network. The key feature of belief networks is their explicit representation of the conditional independence and dependence among events. In particular, investigators have shown (Kiiveri, Speed, & Carlin, 1984; Pearl, 1988; Shachter, 1986) that the joint probability of any particular instantiation of all [math]\displaystyle{ n }[/math] variables in a belief network can be calculated as follows:

…
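The factorization referenced above is the standard belief-network chain rule, in which the joint probability of an instantiation is the product of each variable's probability given its parents. A minimal sketch for the same hypothetical two-node structure *x*_{1} → *x*_{2}, with illustrative (not the paper's) numbers:

```python
# Chain-rule factorization for x1 -> x2:
#   P(x1, x2) = P(x1) * P(x2 | x1)
# Probability values are illustrative placeholders, not those of table 2.
p_x1 = {"present": 0.4, "absent": 0.6}
p_x2_given_x1 = {                 # keyed by (x2 value, x1 value)
    ("present", "present"): 0.8,
    ("absent", "present"): 0.2,
    ("present", "absent"): 0.1,
    ("absent", "absent"): 0.9,
}

def joint(x1, x2):
    """P(x1, x2) computed via the belief-network factorization."""
    return p_x1[x1] * p_x2_given_x1[(x2, x1)]

# Sanity check: the four joint probabilities form a valid distribution.
total = sum(joint(a, b) for a in p_x1 for b in ("present", "absent"))
```

The same product extends to any number of variables: each factor conditions only on a node's parents, which is exactly how the conditional-independence assertions of the structure enter the computation.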


Author | title | journal | titleUrl | doi | year
---|---|---|---|---|---
Gregory F. Cooper; Edward Herskovits | A Bayesian Method for the Induction of Probabilistic Networks from Data | Machine Learning (ML) | http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.88.4436 | 10.1007/BF00994110 | 1992