# sklearn.neighbors Module


## Latest revision as of 20:45, 23 December 2019

An sklearn.neighbors Module is a nearest neighbors system within sklearn.

**Context:**
- It requires calling/selecting a K-Nearest Neighbor Learning System: `sklearn.neighbors.Model_Name(self, arguments)` or simply `sklearn.neighbors.Model_Name()`, where *Model_Name* is the name of the selected K-Nearest Neighbor System.
- It can contain Unsupervised kNN Learning Systems, Supervised kNN Classification Systems, and Supervised kNN Regression Systems.
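A minimal sketch of this call/select pattern, assuming scikit-learn is installed and using `KNeighborsClassifier` as one concrete `Model_Name`:

```python
# Sketch: selecting a Model_Name from sklearn.neighbors and constructing it
# with keyword arguments (KNeighborsClassifier is one such model).
from sklearn.neighbors import KNeighborsClassifier

model = KNeighborsClassifier(n_neighbors=3)  # Model_Name(arguments)
print(model.n_neighbors)  # 3
```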
**Example(s):**
- Unsupervised kNN Learning Systems:
    - `sklearn.neighbors.BallTree`, for solving a Fast Generalized N-points Task.
    - `sklearn.neighbors.KDTree`, for solving a Fast Generalized N-points Task.
    - `sklearn.neighbors.DistanceMetric`, a Distance Metric Algorithm.
    - `sklearn.neighbors.KernelDensity`, for solving a Kernel Density Estimation Task.
    - `sklearn.neighbors.LocalOutlierFactor`, an Unsupervised Outlier Detection System that uses the Local Outlier Factor (LOF) Algorithm.
    - `sklearn.neighbors.NearestNeighbors`, an Unsupervised Learning System for implementing neighbor searches.
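As an illustrative sketch of the unsupervised systems above (toy 1-D data, invented for illustration), `NearestNeighbors` retrieves the closest training points to a query:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Four 1-D training points; the query 1.1 lies closest to 1.0, then to 2.0.
X = np.array([[0.0], [1.0], [2.0], [10.0]])
nn = NearestNeighbors(n_neighbors=2).fit(X)
dist, idx = nn.kneighbors([[1.1]])
print(idx[0])  # [1 2] -- indices of the two nearest training points
```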

- Supervised kNN Classification Systems:
    - `sklearn.neighbors.KNeighborsClassifier`, a Classification System that implements a K-Nearest Neighbors Voting Algorithm.
    - `sklearn.neighbors.RadiusNeighborsClassifier`, a Classification System that implements a vote among neighbors within a given radius.
    - `sklearn.neighbors.NearestCentroid`, a Nearest Centroid Classification System.
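A small sketch of the classification systems above: `KNeighborsClassifier` assigns each query the majority label among its k nearest training points (the data below is a toy example, not from the source):

```python
from sklearn.neighbors import KNeighborsClassifier

# Two well-separated 1-D clusters, labeled 0 and 1.
X = [[0], [1], [2], [10], [11], [12]]
y = [0, 0, 0, 1, 1, 1]
clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
pred = clf.predict([[1.5], [11.5]])
print(pred)  # [0 1] -- each query takes the majority vote of its 3 neighbors
```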

- Supervised kNN Regression Systems:
    - `sklearn.neighbors.KNeighborsRegressor`, a Regression System based on the K-Nearest Neighbors Algorithm.
    - `sklearn.neighbors.RadiusNeighborsRegressor`, a Regression System based on neighbors within a fixed radius.
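As a sketch of the regression systems above (toy data), `KNeighborsRegressor` predicts the average of the targets of the k nearest training points:

```python
from sklearn.neighbors import KNeighborsRegressor

# Targets equal the inputs, so the interpolation is easy to verify by hand.
X = [[0.0], [1.0], [2.0], [3.0]]
y = [0.0, 1.0, 2.0, 3.0]
reg = KNeighborsRegressor(n_neighbors=2).fit(X, y)
pred = reg.predict([[1.5]])
print(pred[0])  # 1.5 -- the mean of the targets at the 2 nearest points (1.0 and 2.0)
```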

- Weighted Graphs:
    - `sklearn.neighbors.kneighbors_graph`, a weighted graph of k-Neighbors for points in X.
    - `sklearn.neighbors.radius_neighbors_graph`, a weighted graph of Neighbors for points in X.
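A sketch of the graph functions above (toy 1-D points chosen so each has an unambiguous single nearest neighbor): `kneighbors_graph` returns a sparse adjacency matrix connecting each point to its k nearest neighbors:

```python
from sklearn.neighbors import kneighbors_graph

# Four 1-D points; by default a point is not its own neighbor.
X = [[0], [1], [3], [7]]
A = kneighbors_graph(X, n_neighbors=1, mode='connectivity')
print(A.toarray())
# [[0. 1. 0. 0.]
#  [1. 0. 0. 0.]
#  [0. 1. 0. 0.]
#  [0. 0. 1. 0.]]
```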

**Counter-Example(s):**
- `sklearn.manifold`, a collection of Manifold Learning Systems.
- `sklearn.tree`, a collection of Decision Tree Learning Systems.
- `sklearn.ensemble`, a collection of Decision Tree Ensemble Learning Systems.
- `sklearn.metrics`, a collection of Metrics Subroutines.
- `sklearn.covariance`, a collection of Covariance Estimators.
- `sklearn.cluster.bicluster`, a collection of Spectral Biclustering Algorithms.
- `sklearn.linear_model`, a collection of Linear Model Regression Systems.

**See:** kNN System.

## References

### 2017

- (Scikit-Learn, 2017) ⇒ "sklearn.neighbors: Nearest Neighbors" http://scikit-learn.org/stable/modules/classes.html#module-sklearn.neighbors Retrieved: 2017-11-12
  - QUOTE: The sklearn.neighbors module implements the k-nearest neighbors algorithm.
    User guide: See the Nearest Neighbors section for further details.

    | Class/Function | Description |
    |---|---|
    | `neighbors.BallTree` | BallTree for fast generalized N-point problems |
    | `neighbors.DistanceMetric` | DistanceMetric class |
    | `neighbors.KDTree` | KDTree for fast generalized N-point problems |
    | `neighbors.KernelDensity([bandwidth, …])` | Kernel Density Estimation |
    | `neighbors.KNeighborsClassifier([…])` | Classifier implementing the k-nearest neighbors vote. |
    | `neighbors.KNeighborsRegressor([n_neighbors, …])` | Regression based on k-nearest neighbors. |
    | `neighbors.LocalOutlierFactor([n_neighbors, …])` | Unsupervised Outlier Detection using Local Outlier Factor (LOF) |
    | `neighbors.RadiusNeighborsClassifier([…])` | Classifier implementing a vote among neighbors within a given radius |
    | `neighbors.RadiusNeighborsRegressor([radius, …])` | Regression based on neighbors within a fixed radius. |
    | `neighbors.NearestCentroid([metric, …])` | Nearest centroid classifier. |
    | `neighbors.NearestNeighbors([n_neighbors, …])` | Unsupervised learner for implementing neighbor searches. |
    | `neighbors.kneighbors_graph(X, n_neighbors[, …])` | Computes the (weighted) graph of k-Neighbors for points in X |
    | `neighbors.radius_neighbors_graph(X, radius)` | Computes the (weighted) graph of Neighbors for points in X |

### 2016

- (Scikit-Learn, 2016) ⇒ "1.6. Nearest Neighbors" http://scikit-learn.org/stable/modules/neighbors.html
- QUOTE: sklearn.neighbors provides functionality for unsupervised and supervised neighbors-based learning methods. ...
... The classes in sklearn.neighbors can handle either Numpy arrays or scipy.sparse matrices as input. For dense matrices, a large number of possible distance metrics are supported. For sparse matrices, arbitrary Minkowski metrics are supported for searches. There are many learning routines which rely on nearest neighbors at their core. One example is kernel density estimation, discussed in the density estimation section.
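The kernel density estimation mentioned in the quote can be sketched with `KernelDensity` (toy data; the bandwidth is chosen arbitrarily for illustration):

```python
import numpy as np
from sklearn.neighbors import KernelDensity

# Two points cluster near 0.0; one outlier sits at 5.0.
X = np.array([[0.0], [0.1], [5.0]])
kde = KernelDensity(kernel='gaussian', bandwidth=0.5).fit(X)
log_dens = kde.score_samples([[0.05], [2.5]])
# The estimated log-density is higher inside the cluster than midway between clusters.
print(log_dens[0] > log_dens[1])  # True
```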
