# Statistical Modeling Algorithm


A Statistical Modeling Algorithm is a model-based learning algorithm that uses a statistical model and abides by some statistical theory.

**AKA:** Probabilistic Learning, Statistical Data Analysis, Statistical Decision Algorithm.

**Context:**
- It can range from being a Parametric Statistical Modeling Algorithm (which assumes knowledge of the underlying probability function) to being a Nonparametric Statistical Modeling Algorithm.
- It can range from being an Exploratory Statistical Analysis Algorithm to being a Confirmatory Statistical Analysis Algorithm.
- It can range from being a Frequentist Analysis to being a Bayesian Analysis. (Cox, 2006)
- It can be applied by a Statistical Modeling System (to solve a statistical modeling task).
- It can range from being a Traditional Statistical Algorithm to being a Modern Statistical Algorithm.
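The parametric/nonparametric distinction above can be illustrated with a minimal sketch (synthetic data, not from the source): a parametric algorithm assumes a family such as the Gaussian and estimates its few parameters, while a nonparametric one estimates the density directly from the data.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=1000)  # synthetic sample

# Parametric: assume a Gaussian family, estimate its two parameters by
# maximum likelihood (the MLE of the variance uses ddof=0).
mu_hat = data.mean()
sigma_hat = data.std()

# Nonparametric: make no family assumption; estimate the density directly
# with a normalized histogram.
counts, edges = np.histogram(data, bins=30, density=True)

print(f"parametric fit: N({mu_hat:.2f}, {sigma_hat:.2f}^2)")
print(f"nonparametric fit: {len(counts)}-bin histogram density, "
      f"integrates to {np.sum(counts * np.diff(edges)):.2f}")
```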

**Example(s):**
- Linear Regression Algorithm that induces a Linear Model.
- Logistic Regression Algorithm that induces a Logistic Model.
- Probabilistic Graphical Model Learning Algorithm that induces a Probabilistic Graphical Model.
- Hidden Markov Model Learning Algorithm that induces a Hidden Markov Model.
- Nonparametric Bayesian Learning Algorithm,
- Adaboost Algorithm,
- Lasso Algorithm,
- a Latent Dirichlet Allocation Modeling Algorithm (that produces a Latent Dirichlet Allocation Model),
- Statistical Natural Language Processing Algorithm,
- a Point Estimation Algorithm, such as Maximum-Likelihood, MAP and Least-Squares,
- …

**Counter-Example(s):**
- a Machine Learning Algorithm.
- a Data Mining Algorithm, such as C4.5, which is informally motivated by information theory.
- a k-Nearest Neighbor Algorithm, motivated by Pattern Recognition Theory.
- an Inductive Logic Programming Algorithm, from Inductive Logic Theory.
- a Kernel Machine Learning Algorithm.
- a Statistical Inference Algorithm.

**See:** Statistical Experiment, Statistical Relational Learning, Statistical Analysis Task, Computational Statistics.

## References

### 2017

- (StackExchange, 2017) ⇒ Differences between logistic regression and perceptrons, URL (version: 2017-06-07): https://stats.stackexchange.com/q/284013
- QUOTE: ... Long story short, logistic regression is a GLM which can perform prediction and inference, whereas the linear Perceptron can only achieve prediction (in which case it will perform the same as logistic regression). The difference between the two is also the fundamental difference between statistical modelling and machine learning.
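The quote's point can be sketched with a numpy-only Newton-Raphson fit of the logistic GLM (synthetic data, not from the source): the same fitted model yields both prediction (thresholding fitted probabilities, as a perceptron would) and inference (standard errors and Wald z-statistics from the inverse Fisher information).

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
x = rng.normal(size=n)
p_true = 1 / (1 + np.exp(-(0.5 + 2.0 * x)))   # true model: logit(p) = 0.5 + 2x
y = (rng.uniform(size=n) < p_true).astype(float)

# Fit the GLM by Newton-Raphson on the Bernoulli log-likelihood.
X = np.column_stack([np.ones(n), x])
beta = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    fisher = X.T @ ((p * (1 - p))[:, None] * X)   # Fisher information matrix
    beta += np.linalg.solve(fisher, X.T @ (y - p))

# Inference (what the perceptron lacks): standard errors and Wald
# z-statistics from the inverse Fisher information at the fit.
p = 1 / (1 + np.exp(-X @ beta))
fisher = X.T @ ((p * (1 - p))[:, None] * X)
se = np.sqrt(np.diag(np.linalg.inv(fisher)))
z = beta / se

# Prediction: threshold the fitted probability, as a perceptron would.
def predict(x_new):
    return 1 / (1 + np.exp(-(beta[0] + beta[1] * x_new))) > 0.5

print("coefficients:", beta)
print("std errors:  ", se)
print("z-statistics:", z)
```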

### 2009

- (Lafferty & Wasserman, 2009) ⇒ John D. Lafferty, and Larry Wasserman. (2009). “Statistical Machine Learning - Course: 10-702." Spring 2009, Carnegie Mellon University.
- QUOTE: **Statistical Machine Learning** is a second graduate-level course in machine learning, assuming students have taken Machine Learning (10-701) and Intermediate Statistics (36-705). The term "statistical" in the title reflects the emphasis on statistical analysis and methodology, which is the predominant approach in modern machine learning.

- (Freund, 2009) ⇒ Yoav Freund. (2009). “Statistical Machine Learning (Boosting)." Course: UC San Diego, CSE254, Winter 2009. http://seed.ucsd.edu/mediawiki/index.php/CSE254

### 2008

- (Sarawagi, 2008) ⇒ Sunita Sarawagi. (2008). “Information Extraction.” In: Foundations and Trends in Databases, 1(3).
- … We described Conditional Random Fields, a state-of-the-art method for entity recognition that imposes a joint distribution over the sequence of entity labels assigned to a given sequence of tokens. Although the details of training and inference on statistical models are somewhat involved for someone outside the field of **statistical machine learning**, the models are easy to deploy and customize due to their fairly nonrestrictive feature-based framework.

### 2007

- http://www.stat.berkeley.edu/~statlearning/
- **Statistical machine learning** merges statistics with the computational sciences --- computer science, systems science and optimization. Much of the agenda in **statistical machine learning** is driven by applied problems in science and technology, where data streams are increasingly large-scale, dynamical and heterogeneous, and where mathematical and algorithmic creativity are required to bring statistical methodology to bear. Fields such as bioinformatics, artificial intelligence, signal processing, communications, networking, information management, finance, game theory and control theory are all being heavily influenced by developments in **statistical machine learning**.
- The field of **statistical machine learning** also poses some of the most challenging theoretical problems in modern statistics, chief among them being the general problem of understanding the link between inference and computation.

### 2006

- (Mitchell, 2006) ⇒ Tom M. Mitchell (2006). “The Discipline of Machine Learning." Machine Learning Department technical report CMU-ML-06-108, Carnegie Mellon University.
- QUOTE: Whereas Statistics has focused primarily on what conclusions can be inferred from data, Machine Learning incorporates additional questions about what computational architectures and algorithms can be used to most effectively capture, store, index, retrieve and merge these data, how multiple learning subtasks can be orchestrated in a larger system, and questions of computational tractability.

- (Cox, 2006) ⇒ David R. Cox. (2006). “Principles of Statistical Inference." Cambridge University Press. ISBN:9780521685672
- QUOTE: Key ideas about probability models and the objectives of statistical analysis are introduced. The differences between frequentist and Bayesian analyses are illustrated in a very special case.
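The frequentist/Bayesian contrast that Cox illustrates "in a very special case" can be sketched for the simplest such case, a Bernoulli parameter, using hypothetical counts (not from the source):

```python
import numpy as np

# Hypothetical data: 100 coin flips, 62 heads.
n, heads = 100, 62

# Frequentist: the parameter is a fixed unknown; report the MLE and a
# 95% Wald confidence interval.
p_mle = heads / n
se = np.sqrt(p_mle * (1 - p_mle) / n)
ci = (p_mle - 1.96 * se, p_mle + 1.96 * se)

# Bayesian: the parameter has a distribution; a uniform Beta(1, 1) prior
# gives a Beta(1 + heads, 1 + tails) posterior, summarized by its mean.
a, b = 1 + heads, 1 + (n - heads)
p_post_mean = a / (a + b)

print(f"frequentist: MLE {p_mle:.3f}, 95% CI ({ci[0]:.3f}, {ci[1]:.3f})")
print(f"Bayesian:    posterior mean {p_post_mean:.3f}")
```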

### 2000

- (Gildea & Jurafsky, 2000) ⇒ Daniel Gildea, and Daniel Jurafsky. (2000). “Automatic Labeling of Semantic Roles.” In: Proceedings of ACL 2000.
- QUOTE: We apply statistical techniques that have been successful for these tasks, including probabilistic parsing and statistical classification. Our **statistical algorithms** are trained on a hand-labeled dataset, the FrameNet database (Baker et al., 1998).

### 1992

- (Brown et al., 1992) ⇒ Peter F. Brown, Peter V. deSouza, Robert L. Mercer, Vincent J. Della Pietra, and Jenifer C. Lai. (1992). “Class-based N-gram Models of Natural Language.” In: Computational Linguistics, 18(4).
- QUOTE: ... We also discuss several statistical algorithms for assigning words to classes based on the frequency of their co-occurrence with other words. ...

### 1991

- (Efron & Tibshirani, 1991) ⇒ Bradley Efron, and Robert Tibshirani. (1991). “Statistical Data Analysis in the Computer Age.” In: Science, 253(5018). doi:10.1126/science.253.5018.390
- QUOTE: Most of our familiar statistical methods, such as hypothesis testing, linear regression, analysis of variance, and maximum likelihood estimation, were designed to be implemented on mechanical calculators.