Eager Learning Algorithm
From GM-RKB
An Eager Learning Algorithm is a learning algorithm that explores an entire training record set during a training phase to build a decision structure that it can exploit during the testing phase.
- AKA: Eager Learner, Eager Learning.
- Context:
- It can induce a Total Predictive Function.
- It can range from being an Eager Model-based Learning Algorithm to being an Eager Instance-based Learning Algorithm.
- It can range from being an Online Eager Learning Algorithm to being a Batch Eager Learning Algorithm.
- It can range from being an Eager Classification Algorithm to being an Eager Regression Algorithm.
- The Predictive Function can subsequently be used on all future prediction requests that occur during a Test Phase.
- Example(s):
- a Decision Tree Learning Algorithm, such as: C4.5 Algorithm.
- an Artificial Neural Network Training Algorithm, such as: Backpropagation Algorithm.
- Counter-Example(s):
- a Lazy Learning Algorithm, such as: k-Nearest Neighbor Algorithm.
- See: Supervised Eager Learning Algorithm, Unsupervised Eager Learning Algorithm.
References
1997
- (Mitchell, 1997) ⇒ Tom M. Mitchell. (1997). “Machine Learning." McGraw-Hill. ISBN:0070428077
- QUOTE: Section 8.6 Remarks on Lazy and Eager Learning: In this chapter we considered three lazy learning methods: the k-Nearest Neighbor algorithm, locally weighted regression, and case-based reasoning. We call these methods lazy because they defer the decision of how to generalize beyond the training data until each new query instance is encountered. We also discussed an eager learning method: the method for learning radial basis function networks. We call this method eager because it generalizes beyond the training data before observing the new query, committing at training time to the network structure and weights that define its approximation to the target function. In this same sense, every other algorithm discussed elsewhere in this book (e.g., Backpropagation, C4.5) is an eager learning algorithm. ... Lazy methods may consider the query instance x_{q} when deciding how to generalize beyond the training data D. ... Eager methods cannot. By the time they observe the query instance x_{q} they have already chosen their (global) approximation to the target function. ... The key point in the above paragraph is that a lazy learner has the option of (implicitly) representing the target function by a combination of many local approximations, whereas an eager learner must commit at training time to a single global approximation. The distinction between eager and lazy learning is thus related to the distinction between global and local approximations to the target function.
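The eager/lazy distinction in the quote above can be sketched in code. The following is a minimal illustrative example (not from Mitchell's book): an eager learner that commits at training time to a global approximation — here a hypothetical nearest-centroid classifier — contrasted with a lazy 1-Nearest Neighbor learner that merely stores the training data and defers all computation to query time.

```python
class EagerCentroidClassifier:
    """Eager: training computes per-class centroids (a single global
    approximation); queries only consult the committed centroids."""

    def fit(self, X, y):
        sums, counts = {}, {}
        for x, label in zip(X, y):
            s = sums.setdefault(label, [0.0] * len(x))
            for i, v in enumerate(x):
                s[i] += v
            counts[label] = counts.get(label, 0) + 1
        # Commit at training time to the global model.
        self.centroids_ = {
            label: [v / counts[label] for v in s] for label, s in sums.items()
        }
        return self

    def predict(self, x):
        dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
        return min(self.centroids_, key=lambda c: dist(x, self.centroids_[c]))


class LazyNearestNeighbor:
    """Lazy: training just stores the data; generalization happens
    locally, per query instance x_q."""

    def fit(self, X, y):
        self.X_, self.y_ = list(X), list(y)
        return self

    def predict(self, x):
        dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
        i = min(range(len(self.X_)), key=lambda j: dist(x, self.X_[j]))
        return self.y_[i]


# Toy training record set with two well-separated classes.
X = [[0.0, 0.0], [0.5, 0.2], [5.0, 5.0], [5.5, 4.8]]
y = ["neg", "neg", "pos", "pos"]
eager = EagerCentroidClassifier().fit(X, y)
lazy = LazyNearestNeighbor().fit(X, y)
print(eager.predict([0.1, 0.1]), lazy.predict([0.1, 0.1]))  # neg neg
```

Both learners answer the same queries here, but only the eager one has a Test Phase that never revisits the training records: all of its work is done during the Training Phase.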