Flow
sklearn.linear_model.ridge.Ridge

Visibility: public | Uploaded 18-12-2019 by George Volkov | sklearn==0.21.2, numpy>=1.6.1, scipy>=0.9 | 5 runs
0 likes, downloaded by 0 people, 0 issues, 0 downvotes, 0 total downloads
  • openml-python python scikit-learn sklearn sklearn_0.21.2
Linear least squares with l2 regularization. Minimizes the objective function:

    ||y - Xw||^2_2 + alpha * ||w||^2_2

This model solves a regression model where the loss function is the linear least squares function and regularization is given by the l2-norm. Also known as Ridge Regression or Tikhonov regularization. This estimator has built-in support for multi-variate regression (i.e., when y is a 2d-array of shape [n_samples, n_targets]).
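The objective above has a closed-form minimizer, w = (X^T X + alpha * I)^{-1} X^T y, which is essentially what the 'cholesky' solver computes. A minimal numpy-only sketch (ignoring the intercept for simplicity; `ridge_fit` is an illustrative helper, not part of sklearn):

```python
import numpy as np

def ridge_fit(X, y, alpha=1.0):
    # Closed-form ridge solution: w = (X^T X + alpha * I)^{-1} X^T y
    n_features = X.shape[1]
    A = X.T @ X + alpha * np.eye(n_features)
    return np.linalg.solve(A, X.T @ y)

# Small synthetic regression problem
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=20)

w = ridge_fit(X, y, alpha=0.1)
```

With a small alpha the recovered coefficients stay close to the true weights; increasing alpha shrinks them toward zero, which is the variance-reduction trade-off the l2 penalty buys.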

Parameters

alpha — Regularization strength; larger values specify stronger regularization. default: 0.17
copy_X — If True, X will be copied; else, it may be overwritten. default: true
fit_intercept — Whether to calculate the intercept for this model. If set to false, no intercept will be used in calculations (e.g. data is expected to be already centered). default: true
max_iter — Maximum number of iterations for the conjugate gradient solver. For the 'sparse_cg' and 'lsqr' solvers, the default value is determined by scipy.sparse.linalg. For the 'sag' solver, the default value is 1000. default: null
normalize — This parameter is ignored when ``fit_intercept`` is set to False. If True, the regressors X will be normalized before regression by subtracting the mean and dividing by the l2-norm. If you wish to standardize, please use :class:`sklearn.preprocessing.StandardScaler` before calling ``fit`` on an estimator with ``normalize=False``. default: false
random_state — The seed of the pseudo random number generator to use when shuffling the data. If int, random_state is the seed used by the random number generator; if RandomState instance, random_state is the random number generator; if None, the random number generator is the RandomState instance used by `np.random`. Used when ``solver`` == 'sag'. .. versionadded:: 0.17 *random_state* to support Stochastic Average Gradient. default: null
solver — {'auto', 'svd', 'cholesky', 'lsqr', 'sparse_cg', 'sag', 'saga'}. Solver to use in the computational routines:
  • 'auto' chooses the solver automatically based on the type of data.
  • 'svd' uses a Singular Value Decomposition of X to compute the Ridge coefficients. More stable for singular matrices than 'cholesky'.
  • 'cholesky' uses the standard scipy.linalg.solve function to obtain a closed-form solution.
  • 'sparse_cg' uses the conjugate gradient solver as found in scipy.sparse.linalg.cg. As an iterative algorithm, this solver is more appropriate than 'cholesky' for large-scale data (possibility to set `tol` and `max_iter`).
  • 'lsqr' uses the dedicated regularized least-squares routine scipy.sparse.linalg.lsqr. It is the fastest and uses an iterative procedure.
  • 'sag' uses a Stochastic Average Gradient descent, and 'saga' uses its improved, unbiased version named SAGA. Both methods also use ...
  default: "auto"
tol — Precision of the solution. default: 0.001
