RidgeRegressor
A model type for constructing a ridge regressor, based on MLJLinearModels.jl, and implementing the MLJ model interface.
From MLJ, the type can be imported using

```julia
RidgeRegressor = @load RidgeRegressor pkg=MLJLinearModels
```

Do `model = RidgeRegressor()` to construct an instance with default hyper-parameters.
Ridge regression is a linear model with objective function

$|Xθ - y|₂²/2 + n⋅λ|θ|₂²/2$

where $n$ is the number of observations.
If `scale_penalty_with_samples = false`, then the objective function is instead

$|Xθ - y|₂²/2 + λ|θ|₂²/2$.
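As a minimal illustration of the two objectives (a sketch using only the standard `LinearAlgebra` library, not the package's own implementation; `ridge_theta` is a hypothetical helper), the minimizer of the penalized least-squares objective has the closed form $θ = (X'X + cλI)^{-1}X'y$, where $c = n$ when the penalty is scaled with the number of observations and $c = 1$ otherwise:

```julia
using LinearAlgebra, Random

# Closed-form ridge solution: minimizing |Xθ - y|₂²/2 + c⋅λ|θ|₂²/2
# gives θ = (X'X + c⋅λ⋅I) \ X'y, with c = n (scaled penalty) or c = 1.
# Hypothetical helper for illustration only.
function ridge_theta(X, y, lambda; scale_penalty_with_samples=true)
    n, p = size(X)
    c = scale_penalty_with_samples ? n : 1
    return (X'X + c * lambda * I) \ (X'y)
end

Random.seed!(0)
X = randn(50, 3)
y = X * [1.0, -2.0, 0.5] + 0.1 * randn(50)

θ_scaled   = ridge_theta(X, y, 0.1)                                   # c = n
θ_unscaled = ridge_theta(X, y, 0.1; scale_penalty_with_samples=false) # c = 1
```

For the same `lambda`, the sample-scaled penalty applies stronger shrinkage, so `θ_scaled` has smaller norm than `θ_unscaled`.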
Different solver options exist, as indicated under "Hyperparameters" below.
Training data

In MLJ or MLJBase, bind an instance `model` to data with

```julia
mach = machine(model, X, y)
```

where:

- `X`: any table of input features (eg, a `DataFrame`) whose columns have `Continuous` scitype; check column scitypes with `schema(X)`
- `y`: the target, which can be any `AbstractVector` whose element scitype is `Continuous`; check the scitype with `scitype(y)`

Train the machine using `fit!(mach, rows=...)`.
Hyperparameters

- `lambda::Real`: strength of the L2 regularization. Default: 1.0
- `fit_intercept::Bool`: whether to fit the intercept or not. Default: true
- `penalize_intercept::Bool`: whether to penalize the intercept. Default: false
- `scale_penalty_with_samples::Bool`: whether to scale the penalty with the number of observations. Default: true
- `solver::Union{Nothing, MLJLinearModels.Solver}`: any instance of `MLJLinearModels.Analytical`. Use `Analytical()` for Cholesky and `CG()=Analytical(iterative=true)` for conjugate-gradient. If `solver = nothing` (default) then `Analytical()` is used. Default: nothing
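To give a feel for the two analytical strategies (a sketch using only `LinearAlgebra`, not the library's internals; `ridge_cholesky` and `ridge_cg` are hypothetical helper names), both solve the same normal equations $(X'X + nλI)θ = X'y$, one by direct factorization and one iteratively:

```julia
using LinearAlgebra, Random

# Direct solve via Cholesky factorization of the SPD system matrix.
function ridge_cholesky(X, y, lambda)
    n = size(X, 1)
    A = Symmetric(X'X + n * lambda * I)
    return cholesky(A) \ (X'y)   # O(p³) factorization, exact up to round-off
end

# Conjugate-gradient iteration on the same normal equations.
function ridge_cg(X, y, lambda; tol=1e-10, maxiter=1000)
    n, p = size(X)
    A = Symmetric(X'X + n * lambda * I)
    b = X'y
    θ = zeros(p); r = copy(b); d = copy(r)
    rs = dot(r, r)
    for _ in 1:maxiter
        Ad = A * d
        α  = rs / dot(d, Ad)
        θ += α * d
        r -= α * Ad
        rs_new = dot(r, r)
        sqrt(rs_new) < tol && break
        d  = r + (rs_new / rs) * d
        rs = rs_new
    end
    return θ
end

Random.seed!(1)
X = randn(40, 4); y = randn(40)
θ_chol = ridge_cholesky(X, y, 0.5)
θ_cg   = ridge_cg(X, y, 0.5)
```

The two solvers agree to numerical tolerance; the iterative variant avoids forming a dense factorization, which is the usual motivation for choosing `CG()` on larger problems.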
Example
```julia
using MLJ

X, y = make_regression()
mach = fit!(machine(RidgeRegressor(), X, y))
predict(mach, X)
fitted_params(mach)
```
See also `ElasticNetRegressor`.