# KNNClassifier

A model type for constructing a K-nearest neighbor classifier, based on NearestNeighborModels.jl, and implementing the MLJ model interface.
From MLJ, the type can be imported using
```julia
KNNClassifier = @load KNNClassifier pkg=NearestNeighborModels
```

Do `model = KNNClassifier()` to construct an instance with default hyper-parameters. Provide keyword arguments to override hyper-parameter defaults, as in `KNNClassifier(K=...)`.
`KNNClassifier` implements the K-nearest neighbors classifier, a non-parametric algorithm that predicts a discrete class distribution for a new point by taking a vote over the classes of its k nearest training points. Each neighbor's vote is assigned a weight based on the proximity of that neighbor to the test point, according to a specified distance metric.

For more information about the weighting kernels, see Geler et al., "Comparison of different weighting schemes for the kNN classifier on time-series data".
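To make the voting scheme concrete, here is a minimal, illustrative sketch of a distance-weighted kNN vote in plain Julia. It is not the package's implementation; the function name `weighted_knn_vote` and the inverse-distance weighting are assumptions chosen for clarity (the package's `Inverse` kernel is similar in spirit).

```julia
using Distances

## Illustrative only -- not NearestNeighborModels.jl's implementation.
function weighted_knn_vote(Xtrain::AbstractMatrix, ytrain::AbstractVector,
                           xnew::AbstractVector; k::Int=5)
    ## distance from xnew to every training column
    dists = [euclidean(view(Xtrain, :, j), xnew) for j in axes(Xtrain, 2)]
    nearest = partialsortperm(dists, 1:k)      ## indices of the k closest points
    scores = Dict{eltype(ytrain),Float64}()
    for j in nearest
        w = 1 / (dists[j] + eps())             ## inverse-distance weight
        scores[ytrain[j]] = get(scores, ytrain[j], 0.0) + w
    end
    total = sum(values(scores))
    return Dict(c => s / total for (c, s) in scores)  ## normalized vote shares
end

## Hypothetical toy data: 2 features, 6 points, 2 classes
Xtrain = [0.0 0.1 0.2 1.0 1.1 1.2;
          0.0 0.1 0.2 1.0 1.1 1.2]
ytrain = ["a", "a", "a", "b", "b", "b"]
weighted_knn_vote(Xtrain, ytrain, [0.15, 0.15]; k=3)  ## class "a" dominates
```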
## Training data
In MLJ or MLJBase, bind an instance `model` to data with

```julia
mach = machine(model, X, y)
```

or

```julia
mach = machine(model, X, y, w)
```

Here:
- `X` is any table of input features (eg, a `DataFrame`) whose columns are of scitype `Continuous`; check column scitypes with `schema(X)`.

- `y` is the target, which can be any `AbstractVector` whose element scitype is `<:Finite` (`<:Multiclass` or `<:OrderedFactor` will do); check the scitype with `scitype(y)`.

- `w` is the vector of observation weights, either `nothing` (default) or an `AbstractVector` whose element scitype is `Count` or `Continuous`. These are distinct from the kernel `weights`, a model hyper-parameter described below.
Train the machine using `fit!(mach, rows=...)`.
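For instance, the following sketch binds a model to the crabs data using the three-argument form of `machine` with per-observation weights; the weights here are synthetic, chosen only for illustration:

```julia
using MLJ
KNNClassifier = @load KNNClassifier pkg=NearestNeighborModels

X, y = @load_crabs
w = rand(length(y))            ## synthetic observation weights, for illustration
mach = machine(KNNClassifier(), X, y, w)
fit!(mach, rows=1:150)         ## train on the first 150 rows only
```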
## Hyper-parameters
- `K::Int=5`: number of neighbors

- `algorithm::Symbol = :kdtree`: one of `(:kdtree, :brutetree, :balltree)`

- `metric::Metric = Euclidean()`: any `Metric` from Distances.jl for the distance between points. For `algorithm = :kdtree`, only metrics which are instances of `Distances.UnionMinkowskiMetric` are supported.

- `leafsize::Int = 10`: determines the number of points at which to stop splitting the tree. This option is ignored and always taken as `0` for `algorithm = :brutetree`, since `brutetree` isn't actually a tree.

- `reorder::Bool = true`: if `true`, then points which are close in distance are placed close in memory. In this case, a copy of the original data will be made so that the original data is left unmodified. Setting this to `true` can significantly improve performance of the specified `algorithm` (except `:brutetree`). This option is ignored and always taken as `false` for `algorithm = :brutetree`.

- `weights::KNNKernel=Uniform()`: kernel used in assigning weights to the k-nearest neighbors for each observation. An instance of one of the types in `list_kernels()`. User-defined weighting functions can be passed by wrapping the function in a `UserDefinedKernel` kernel (do `?NearestNeighborModels.UserDefinedKernel` for more info). If observation weights `w` are passed during machine construction, then the weight assigned to each neighbor vote is the product of the kernel-generated weight for that neighbor and the corresponding observation weight. A sketch of overriding several of these defaults follows this list.
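As an illustration, here is a hedged sketch of overriding several hyper-parameters at once; the specific values (`K = 7`, `leafsize = 20`, the `Cityblock` metric) are arbitrary choices for demonstration, not recommendations:

```julia
using MLJ
import NearestNeighborModels
import Distances

KNNClassifier = @load KNNClassifier pkg=NearestNeighborModels

model = KNNClassifier(
    K = 7,                                        ## vote over 7 neighbors
    algorithm = :balltree,                        ## ball trees accept general metrics
    metric = Distances.Cityblock(),               ## L1 distance
    leafsize = 20,                                ## stop splitting at 20 points
    weights = NearestNeighborModels.Inverse(),    ## inverse-distance kernel
)
```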
## Operations

- `predict(mach, Xnew)`: Return predictions of the target given features `Xnew`, which should have the same scitype as `X` above. Predictions are probabilistic but uncalibrated (see the sketch after this list).

- `predict_mode(mach, Xnew)`: Return the modes of the probabilistic predictions returned above.
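Because `predict` returns a vector of `UnivariateFinite` distributions, individual class probabilities can be extracted with `pdf`. A short sketch, assuming a fitted machine `mach` and that the data is the crabs dataset, whose class labels include `"B"`:

```julia
y_hat = predict(mach, Xnew)      ## vector of UnivariateFinite distributions
pdf.(y_hat, "B")                 ## probability of class "B" for each row
mode.(y_hat)                     ## same labels as predict_mode(mach, Xnew)
```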
## Fitted parameters

The fields of `fitted_params(mach)` are:
- `tree`: An instance of either `KDTree`, `BruteTree` or `BallTree`, depending on the value of the `algorithm` hyper-parameter (see the hyper-parameters section above). These are data structures that store the training data in a form that speeds up nearest-neighbor searches on test data points.
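A brief sketch of inspecting this field, assuming a fitted machine `mach` as above:

```julia
tree = fitted_params(mach).tree
typeof(tree)    ## e.g. NearestNeighbors.KDTree under the default :kdtree algorithm
```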
## Examples
```julia
using MLJ
KNNClassifier = @load KNNClassifier pkg=NearestNeighborModels
X, y = @load_crabs; ## a table and a vector from the crabs dataset

## view possible kernels
NearestNeighborModels.list_kernels()

## KNNClassifier instantiation
model = KNNClassifier(weights = NearestNeighborModels.Inverse())

## wrap model and required data in an MLJ machine and fit
mach = machine(model, X, y) |> fit!

y_hat = predict(mach, X)
labels = predict_mode(mach, X)
```
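As a follow-up, one might estimate out-of-sample performance rather than scoring on the training data; a hedged sketch using MLJ's `evaluate!`:

```julia
## 5-fold cross-validation; log_loss scores the probabilistic predictions,
## accuracy scores the modal predictions
evaluate!(mach, resampling=CV(nfolds=5, shuffle=true, rng=123),
          measure=[log_loss, accuracy])
```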
See also `MultitargetKNNClassifier`.