LassoRegressor
A model type for constructing a lasso regressor, based on MLJLinearModels.jl, and implementing the MLJ model interface.
From MLJ, the type can be imported using

    LassoRegressor = @load LassoRegressor pkg=MLJLinearModels

Do `model = LassoRegressor()` to construct an instance with default hyper-parameters.
Lasso regression is a linear model with objective function
$|Xθ - y|₂²/2 + n⋅λ|θ|₁$

where $n$ is the number of observations. If `scale_penalty_with_samples = false`, the objective function is

$|Xθ - y|₂²/2 + λ|θ|₁$.
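To make the two forms of the objective concrete, here is a minimal sketch that evaluates them directly; `lasso_objective` is an illustrative helper, not part of MLJLinearModels:

```julia
# Evaluate the lasso objective |Xθ - y|₂²/2 + n⋅λ|θ|₁ directly
# (illustrative helper; not part of MLJLinearModels).
function lasso_objective(X, y, θ, λ; scale_penalty_with_samples=true)
    n = size(X, 1)                       # number of observations
    penalty_scale = scale_penalty_with_samples ? n : 1
    return sum(abs2, X * θ - y) / 2 + penalty_scale * λ * sum(abs, θ)
end
```

The keyword mirrors the model's `scale_penalty_with_samples` hyperparameter: with it set to `false`, the penalty term is simply $λ|θ|₁$, independent of the sample size.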
Different solver options exist, as indicated under "Hyperparameters" below.
Training data
In MLJ or MLJBase, bind an instance `model` to data with

    mach = machine(model, X, y)

where:

- `X` is any table of input features (eg, a `DataFrame`) whose columns have `Continuous` scitype; check column scitypes with `schema(X)`
- `y` is the target, which can be any `AbstractVector` whose element scitype is `Continuous`; check the scitype with `scitype(y)`

Train the machine using `fit!(mach, rows=...)`.
Hyperparameters
- `lambda::Real`: strength of the L1 regularization. Default: 1.0
- `fit_intercept::Bool`: whether to fit the intercept or not. Default: true
- `penalize_intercept::Bool`: whether to penalize the intercept. Default: false
- `scale_penalty_with_samples::Bool`: whether to scale the penalty with the number of observations. Default: true
- `solver::Union{Nothing, MLJLinearModels.Solver}`: any instance of `MLJLinearModels.ProxGrad`. If `solver=nothing` (default) then `ProxGrad(accel=true)` (FISTA) is used. Solver aliases: `FISTA(; kwargs...) = ProxGrad(accel=true, kwargs...)`, `ISTA(; kwargs...) = ProxGrad(accel=false, kwargs...)`. Default: nothing
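Proximal gradient methods like those behind the `ISTA`/`FISTA` aliases alternate a gradient step on the smooth least-squares term with soft-thresholding, the proximal operator of the L1 penalty (FISTA additionally applies acceleration). The following is a minimal unaccelerated sketch for the unscaled objective, i.e. `scale_penalty_with_samples = false`; `ista` and `soft_threshold` are illustrative names, not part of the package:

```julia
using LinearAlgebra

# Soft-thresholding: the proximal operator of t|⋅|₁.
soft_threshold(x, t) = sign(x) * max(abs(x) - t, zero(x))

# Minimal (unaccelerated) ISTA for |Xθ - y|₂²/2 + λ|θ|₁;
# illustrative only -- MLJLinearModels' ProxGrad is the production solver.
function ista(X, y, λ; iters=500)
    p = size(X, 2)
    step = 1 / opnorm(X)^2           # 1/L, L = Lipschitz constant of the gradient
    θ = zeros(p)
    for _ in 1:iters
        g = X' * (X * θ - y)         # gradient of the smooth least-squares term
        θ = soft_threshold.(θ .- step .* g, step * λ)
    end
    return θ
end
```

For an orthonormal `X`, the lasso solution is elementwise soft-thresholding of `X'y`, which this iteration reproduces.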
Example
    using MLJ
    X, y = make_regression()
    mach = fit!(machine(LassoRegressor(), X, y))
    predict(mach, X)
    fitted_params(mach)
See also `ElasticNetRegressor`.