PCA
A model type for constructing a PCA, based on MultivariateStats.jl, and implementing the MLJ model interface.

From MLJ, the type can be imported using

```
PCA = @load PCA pkg=MultivariateStats
```

Do `model = PCA()` to construct an instance with default hyper-parameters. Provide keyword arguments to override hyper-parameter defaults, as in `PCA(maxoutdim=...)`.
Principal component analysis learns a linear projection onto a lower dimensional space while preserving most of the initial variance seen in the training data.
Training data
In MLJ or MLJBase, bind an instance `model` to data with

```
mach = machine(model, X)
```

Here:

- `X` is any table of input features (eg, a `DataFrame`) whose columns are of scitype `Continuous`; check column scitypes with `schema(X)`.

Train the machine using `fit!(mach, rows=...)`.
Hyper-parameters
- `maxoutdim=0`: Together with `variance_ratio`, controls the output dimension `outdim` chosen by the model. Specifically, suppose that `k` is the smallest integer such that retaining the `k` most significant principal components accounts for `variance_ratio` of the total variance in the training data. Then `outdim = min(k, maxoutdim)`. If `maxoutdim=0` (default) then the effective `maxoutdim` is `min(n, indim - 1)`, where `n` is the number of observations and `indim` the number of features in the training data.

- `variance_ratio::Float64=0.99`: The ratio of variance preserved after the transformation.

- `method=:auto`: The method to use to solve the problem. Choices are:

  - `:svd`: Singular Value Decomposition of the matrix.
  - `:cov`: Covariance matrix decomposition.
  - `:auto`: Use `:cov` if the matrix's first dimension is smaller than its second dimension, and otherwise use `:svd`.

- `mean=nothing`: If `nothing`, centering will be computed and applied; if set to `0`, no centering is applied (the data is assumed pre-centered); if a vector is passed, centering is done with that vector.
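The interplay between `variance_ratio` and `maxoutdim` can be illustrated with plain linear algebra. The following is a sketch, not MLJ internals: the variable names (`vars`, `ratios`, `effective_max`) are illustrative, and the principal variances are obtained here directly from the covariance eigenvalues.

```julia
# Sketch: how `outdim` follows from `variance_ratio` and `maxoutdim`.
using LinearAlgebra, Statistics

X = randn(100, 5)                       # 100 observations, indim = 5
n, indim = size(X)

# Principal variances = eigenvalues of the covariance matrix, largest first:
vars = sort(eigvals(Symmetric(cov(X))), rev=true)

variance_ratio = 0.99
maxoutdim = 0                           # 0 means "no explicit cap" (the default)

ratios = cumsum(vars) ./ sum(vars)      # cumulative fraction of total variance
k = findfirst(>=(variance_ratio), ratios)  # smallest k reaching variance_ratio

effective_max = maxoutdim == 0 ? min(n, indim - 1) : maxoutdim
outdim = min(k, effective_max)
```

With the default `maxoutdim=0` and five comparable features, `effective_max` here is `min(100, 4) = 4`, so the cap can bind even when `variance_ratio` would call for all five components.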
Operations
- `transform(mach, Xnew)`: Return a lower dimensional projection of the input `Xnew`, which should have the same scitype as `X` above.

- `inverse_transform(mach, Xsmall)`: For a dimension-reduced table `Xsmall`, such as returned by `transform`, reconstruct a table, having the same number of columns as the original training data `X`, that transforms to `Xsmall`. Mathematically, `inverse_transform` is a right-inverse for the PCA projection map, whose image is orthogonal to the kernel of that map. In particular, if `Xsmall = transform(mach, Xnew)`, then `inverse_transform(mach, Xsmall)` is only an approximation to `Xnew`.
Fitted parameters
The fields of `fitted_params(mach)` are:

- `projection`: Returns the projection matrix, which has size `(indim, outdim)`, where `indim` and `outdim` are the number of features of the input and output respectively.
Report
The fields of `report(mach)` are:

- `indim`: Dimension (number of columns) of the training data and new data to be transformed.
- `outdim`: The output dimension, equal to `min(n, indim, maxoutdim)`, where `n` is the number of observations.
- `tprincipalvar`: Total variance of the principal components.
- `tresidualvar`: Total residual variance.
- `tvar`: Total observation variance (principal + residual variance).
- `mean`: The mean of the untransformed training data, of length `indim`.
- `principalvars`: The variance of the principal components. An `AbstractVector` of length `outdim`.
- `loadings`: The model's loadings: weights for each variable used when calculating principal components. A matrix of size (`indim`, `outdim`), where `indim` and `outdim` are as defined above.
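The variance bookkeeping among these fields can be sketched from the covariance eigenvalues alone (this illustrates the quantities' definitions, not MLJ's implementation; `vars` and `outdim` are illustrative names):

```julia
# Sketch: tvar = tprincipalvar + tresidualvar, computed from eigenvalues.
using LinearAlgebra, Statistics

X = randn(200, 6)
vars = sort(eigvals(Symmetric(cov(X))), rev=true)  # all principal variances

outdim = 3
principalvars = vars[1:outdim]                 # variances of retained components
tprincipalvar = sum(principalvars)
tresidualvar  = sum(vars[outdim+1:end])        # variance in discarded components
tvar          = tprincipalvar + tresidualvar   # total observation variance

# The total equals the sum of the per-feature variances of X
# (trace of the covariance matrix):
tvar ≈ sum(var(X, dims=1))
```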
Examples
```
using MLJ

PCA = @load PCA pkg=MultivariateStats

X, y = @load_iris # a table and a vector
model = PCA(maxoutdim=2)
mach = machine(model, X) |> fit!

Xproj = transform(mach, X)
```
See also `KernelPCA`, `ICA`, `FactorAnalysis`, `PPCA`.