2007 PracticalStatisticalAI

From GM-RKB

Subject Headings: Statistical Relational Model, Markov Network

Notes

Cited By

~92 http://scholar.google.com/scholar?cites=8848896538786858875

Quotes

Plan

  • We have the elements:
    • Probability for handling uncertainty
    • Logic for representing types, relations, and complex dependencies between them
    • Learning and inference algorithms for each

Hammersley-Clifford Theorem

  • If the distribution is strictly positive (P(x) > 0)
  • And the graph encodes its conditional independences
  • Then the distribution is a product of potentials over the cliques of the graph
  • The converse is also true.

(“Markov network = Gibbs distribution”; the factorization is written out below.)
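
As a worked statement of this factorization (standard form, written out here for reference rather than quoted from the slides), with C(G) the set of cliques of the graph G and phi_C a nonnegative potential over the variables x_C in clique C:

```latex
% Gibbs factorization guaranteed by the Hammersley-Clifford theorem:
% the joint distribution is a normalized product of clique potentials,
% and the partition function Z sums that product over all assignments.
P(X = x) \;=\; \frac{1}{Z} \prod_{C \in \mathcal{C}(G)} \phi_C(x_C),
\qquad
Z \;=\; \sum_{x'} \prod_{C \in \mathcal{C}(G)} \phi_C(x'_C)
```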

Markov Nets vs. Bayes Nets

| Property | Markov Nets | Bayes Nets |
| --- | --- | --- |
| Form | Product of potentials | Product of potentials |
| Potentials | Arbitrary | Conditional probabilities |
| Cycles | Allowed | Forbidden |
| Partition function | Z = ? | Z = 1 |
| Independence check | Graph separation | D-separation |
| Independence properties | Some | Some |
| Inference | MCMC, BP, etc. | Convert to Markov net |
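
The “Partition function” row is the practical crux of the comparison. The sketch below is illustrative only (not from the tutorial; the variable names and potential values are invented): the Markov net's arbitrary potentials require an explicit sum over all joint states to obtain Z, whereas the Bayes net's conditional probability tables are locally normalized, so Z = 1 by construction.

```python
import itertools

# Markov network over binary variables A, B, C with cliques {A,B} and {B,C}.
# Potentials are arbitrary nonnegative numbers, so the product is unnormalized
# and the partition function Z must be computed explicitly.
def markov_unnormalized(a, b, c):
    phi_ab = [[1.0, 0.5], [0.5, 2.0]]   # arbitrary potential over clique {A, B}
    phi_bc = [[3.0, 1.0], [1.0, 0.2]]   # arbitrary potential over clique {B, C}
    return phi_ab[a][b] * phi_bc[b][c]

Z = sum(markov_unnormalized(a, b, c)
        for a, b, c in itertools.product([0, 1], repeat=3))

def markov_prob(a, b, c):
    return markov_unnormalized(a, b, c) / Z

# Bayes net A -> B -> C: the potentials are conditional probabilities,
# so the product is already a normalized distribution (Z = 1).
p_a = [0.6, 0.4]
p_b_given_a = [[0.7, 0.3], [0.1, 0.9]]
p_c_given_b = [[0.8, 0.2], [0.5, 0.5]]

def bayes_prob(a, b, c):
    return p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]

# Both joint distributions sum to 1, but only the Markov net needed an explicit Z.
states = list(itertools.product([0, 1], repeat=3))
assert abs(sum(markov_prob(*x) for x in states) - 1.0) < 1e-9
assert abs(sum(bayes_prob(*x) for x in states) - 1.0) < 1e-9
print("Partition function Z for the Markov net:", Z)
```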

References



Pedro Domingos. (2007). "Practical Statistical Relational AI." Tutorial at the AAAI 2007 Conference. http://www.cs.washington.edu/homes/pedrod/psrl.ppt