sklearn.linear_model.Ridge


A sklearn.linear_model.Ridge is a ridge regression system within the sklearn.linear_model module.

  • Context
    • Usage:
1) Import the Ridge regression model from scikit-learn: from sklearn.linear_model import Ridge
2) Create design matrix X and response vector Y
3) Create Ridge regression object: model = Ridge(alpha=alpha[, fit_intercept=True, normalize=False, ...])
4) Choose method(s):
  • Fit the Ridge regression model: model.fit(X, Y[, sample_weight])
  • Predict Y using the linear model with estimated coefficients: Y_pred = model.predict(X)
  • Return coefficient of determination (R^2) of the prediction: model.score(X,Y[, sample_weight=w])
  • Get estimator parameters: model.get_params([deep])
  • Set estimator parameters: model.set_params(**params)
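The steps above can be sketched end to end; the synthetic dataset and the alpha values below are illustrative choices, not part of the original example:

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge

# Synthetic design matrix X and response vector Y (illustrative stand-in data)
X, Y = make_regression(n_samples=100, n_features=5, noise=1.0, random_state=0)

model = Ridge(alpha=1.0)      # step 3: create the Ridge regression object
model.fit(X, Y)               # fit the Ridge regression model
Y_pred = model.predict(X)     # predict Y using the estimated coefficients
r2 = model.score(X, Y)        # coefficient of determination R^2

params = model.get_params()   # get estimator parameters (returns a dict)
model.set_params(alpha=0.1)   # set an estimator parameter in place
```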
Input:
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_predict
from sklearn.datasets import load_boston
from sklearn.metrics import explained_variance_score, mean_squared_error
import numpy as np
import pylab as pl
boston = load_boston() # Loading the Boston dataset
x = boston.data # Creating the regression design matrix
y = boston.target # Creating the target vector
ridgereg = Ridge(alpha=0.5) # Create Ridge regression object with alpha=0.5
ridgereg.fit(x, y) # Fit the ridge regression model
yp = ridgereg.predict(x) # Predicted values
yp_cv = cross_val_predict(ridgereg, x, y, cv=10) # 10-fold CV predictions
(Figure: ridge boston10fold.png; blue dots correspond to 10-fold CV)

# Calculation of RMSE and explained variance
RMSE = np.sqrt(mean_squared_error(y, yp))
RMSECV = np.sqrt(mean_squared_error(y, yp_cv))
EV = explained_variance_score(y, yp)
EVCV = explained_variance_score(y, yp_cv)
Output:
Method: Ridge Regression
RMSE on the dataset: 4.6857
RMSE on 10-fold CV: 5.8428
Explained Variance Regression Score on the dataset: 0.7399
Explained Variance Regression 10-fold CV: 0.5956
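The evaluation above can no longer be reproduced verbatim, because load_boston was removed from scikit-learn (as of version 1.2). A sketch of the same RMSE and explained-variance comparison on a synthetic stand-in dataset (the numbers will differ from the Boston results):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.metrics import explained_variance_score, mean_squared_error
from sklearn.model_selection import cross_val_predict

# Stand-in dataset, since load_boston is no longer available
x, y = make_regression(n_samples=200, n_features=10, noise=10.0, random_state=1)

ridgereg = Ridge(alpha=0.5)
ridgereg.fit(x, y)
yp = ridgereg.predict(x)                          # in-sample predictions
yp_cv = cross_val_predict(ridgereg, x, y, cv=10)  # 10-fold CV predictions

rmse = np.sqrt(mean_squared_error(y, yp))
rmse_cv = np.sqrt(mean_squared_error(y, yp_cv))
ev = explained_variance_score(y, yp)
ev_cv = explained_variance_score(y, yp_cv)
# As in the Boston results above, the CV figures are typically worse than
# the in-sample ones: in-sample error understates generalization error.
```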



    • QUOTE: class sklearn.linear_model.Ridge(alpha=1.0, fit_intercept=True, normalize=False, copy_X=True, max_iter=None, tol=0.001, solver='auto', random_state=None)

      Linear least squares with l2 regularization.

      This model solves a regression model where the loss function is the linear least squares function and regularization is given by the l2-norm. Also known as Ridge Regression or Tikhonov regularization. This estimator has built-in support for multi-variate regression (i.e., when y is a 2d-array of shape [n_samples, n_targets]).
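Since Ridge minimizes ||y - Xw||^2 + alpha * ||w||^2, its coefficients should match the closed-form Tikhonov solution w = (X^T X + alpha*I)^{-1} X^T y (using fit_intercept=False so no intercept term complicates the comparison). A small sketch checking this, together with the multi-target case mentioned in the quote; the data here is illustrative:

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(50)

alpha = 0.5
# Closed-form Tikhonov solution: w = (X^T X + alpha*I)^{-1} X^T y
w_closed = np.linalg.solve(X.T @ X + alpha * np.eye(3), X.T @ y)

# Ridge with no intercept solves the same regularized normal equations
model = Ridge(alpha=alpha, fit_intercept=False).fit(X, y)

# Multi-target support: y as a 2d array of shape [n_samples, n_targets]
# yields coef_ of shape [n_targets, n_features]
Y2 = np.column_stack([y, 2.0 * y])
multi = Ridge(alpha=alpha, fit_intercept=False).fit(X, Y2)
```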