Partial Least Squares (PLS), Kernel-based Orthogonal Projections to Latent Structures (K-OPLS) and NIPALS based OPLS
- PLS regression algorithm based on the Yi Cao implementation
- K-OPLS regression algorithm based on this paper
- OPLS implementation based on the R package Metabomate, using a NIPALS factorization loop
## Installation

```console
$ npm i ml-pls
```
## Usage

### PLS

```js
import { PLS } from 'ml-pls';

var X = [
  [0.1, 0.02],
  [0.25, 1.01],
  [0.95, 0.01],
  [1.01, 0.96],
];
var Y = [
  [1, 0],
  [1, 0],
  [1, 0],
  [0, 1],
];
var options = {
  latentVectors: 10,
  tolerance: 1e-4,
};

var pls = new PLS(options);
pls.train(X, Y);
```
### K-OPLS

```js
import { KOPLS } from 'ml-pls';
import Kernel from 'ml-kernel';

// assuming that you created Xtrain, Xtest, Ytrain, Ytest

var kernel = new Kernel('gaussian', { sigma: 25 });
var cls = new KOPLS({
  orthogonalComponents: 10,
  predictiveComponents: 1,
  kernel: kernel,
});

cls.train(Xtrain, Ytrain);

var {
  prediction, // prediction
  predScoreMat, // Score matrix over prediction
  predYOrthVectors, // Y-Orthogonal vectors over prediction
} = cls.predict(Xtest);
```
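Since the `'gaussian'` kernel does the heavy lifting in K-OPLS, the following sketch shows what such a kernel computes. This is plain JavaScript for illustration only, not `ml-kernel`'s actual implementation; it assumes the common definition k(x, y) = exp(−‖x − y‖² / (2σ²)):

```js
// Gaussian (RBF) kernel between two feature vectors, with bandwidth sigma.
// Similar points map to values near 1, distant points decay toward 0.
function gaussianKernel(x, y, sigma) {
  const squaredDistance = x.reduce((sum, xi, i) => sum + (xi - y[i]) ** 2, 0);
  return Math.exp(-squaredDistance / (2 * sigma * sigma));
}

console.log(gaussianKernel([1, 2], [1, 2], 25)); // identical vectors give 1
```

A large `sigma` (like 25 above) makes the kernel very flat, so most points look similar to each other; tuning it controls how local the resulting model is.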
### OPLS

```js
import { OPLS } from 'ml-pls';
import { METADATA } from 'ml-dataset-metadata';
import { getNumbers, getClasses, getCrossValidationSets } from 'ml-dataset-iris';

// get the famous iris dataset
let x = getNumbers();
// get dataset-metadata
let metadata = getClasses();
// get frozen folds for testing purposes
let cvFolds = getCrossValidationSets(7, { idx: 0, by: 'trainTest' });

let oplsOptions = { cvFolds, nComp: 1 };

// get labels as factor (for regression)
let labels = new METADATA([metadata], { headers: ['iris'] });
let y = labels.get('iris', { format: 'factor' }).values;

// get model
let model = new OPLS(x, y, oplsOptions);
```
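The NIPALS factorization loop mentioned above can be sketched in a few lines of plain JavaScript. This is only an illustration of the idea (alternately regressing scores and loadings until the scores converge), not the package's actual code:

```js
// One NIPALS component for a data matrix X (array of rows):
// iterate p = X't / t't (normalized), t = Xp, until t stops changing.
function nipalsComponent(X, tolerance = 1e-10, maxIterations = 100) {
  const cols = X[0].length;
  let t = X.map((row) => row[0]); // initialize scores with the first column
  let p = new Array(cols).fill(0);
  for (let iter = 0; iter < maxIterations; iter++) {
    // loadings: p = X' t / (t' t), then normalize to unit length
    const tt = t.reduce((sum, v) => sum + v * v, 0);
    for (let j = 0; j < cols; j++) {
      p[j] = X.reduce((sum, row, i) => sum + row[j] * t[i], 0) / tt;
    }
    const pNorm = Math.sqrt(p.reduce((sum, v) => sum + v * v, 0));
    p = p.map((v) => v / pNorm);
    // scores: t = X p
    const tNew = X.map((row) => row.reduce((sum, v, j) => sum + v * p[j], 0));
    const change = tNew.reduce((sum, v, i) => sum + (v - t[i]) ** 2, 0);
    t = tNew;
    if (change < tolerance) break;
  }
  return { scores: t, loadings: p };
}
```

After a component is extracted, the rank-one product of scores and loadings is subtracted from X (deflation) and the loop is repeated for the next component; OPLS additionally splits off the part of X that is orthogonal to Y.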
The OPLS class is intended for exploratory modeling, that is, not for the creation of predictors. Therefore there is a built-in k-fold cross-validation loop, and Q2y is an average over the folds.
For the iris example above, the averaged Q2y should give `0.9209227614652857`.
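As a rough illustration of how a fold-averaged Q2y can be computed (Q2y = 1 − PRESS/TSS per fold, then averaged over folds), here is a plain-JavaScript sketch. The `folds` data are made up for the example; this is not the package's internal code:

```js
// Q2y for one fold: 1 - PRESS / TSS, where PRESS is the squared error of the
// out-of-fold predictions and TSS the total sum of squares of the true y.
function q2y(yTrue, yPred) {
  const mean = yTrue.reduce((sum, v) => sum + v, 0) / yTrue.length;
  const press = yTrue.reduce((sum, v, i) => sum + (v - yPred[i]) ** 2, 0);
  const tss = yTrue.reduce((sum, v) => sum + (v - mean) ** 2, 0);
  return 1 - press / tss;
}

// average over the cross-validation folds, as reported by the CV loop
function averageQ2y(folds) {
  return folds.reduce((sum, f) => sum + q2y(f.yTrue, f.yPred), 0) / folds.length;
}

// toy example with two folds of out-of-fold predictions
const folds = [
  { yTrue: [0, 1, 2], yPred: [0.1, 0.9, 2.1] },
  { yTrue: [1, 2, 3], yPred: [1.2, 1.8, 3.0] },
];
console.log(averageQ2y(folds));
```

A Q2y near 1 means the model predicts held-out samples almost perfectly; values near 0 (or negative) mean it predicts no better than the mean of y.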
If for some reason a predictor is necessary, the following code may serve as an example:
```js
import { OPLS } from 'ml-pls';
import { Matrix } from 'ml-matrix';
import { METADATA } from 'ml-dataset-metadata';
import { getNumbers, getClasses, getCrossValidationSets } from 'ml-dataset-iris';

const cvFolds = getCrossValidationSets(7, { idx: 0, by: 'trainTest' });
let testIndex = cvFolds[0].testIndex;
let trainIndex = cvFolds[0].trainIndex;

// get data
let data = getNumbers();
// set test and training set
let testX = data.filter((el, idx) => testIndex.includes(idx));
let trainingX = data.filter((el, idx) => trainIndex.includes(idx));
// convert to matrix
trainingX = new Matrix(trainingX);
testX = new Matrix(testX);

// get metadata
let labels = getClasses();
let testLabels = labels.filter((el, idx) => testIndex.includes(idx));
let trainingLabels = labels.filter((el, idx) => trainIndex.includes(idx));
let trainingY = new METADATA([trainingLabels], { headers: ['iris'] });
let testY = new METADATA([testLabels], { headers: ['iris'] }).get('iris', {
  format: 'factor',
}).values;

let model = new OPLS(
  trainingX,
  trainingY.get('iris', { format: 'factor' }).values,
);
let prediction = model.predict(testX, { trueLabels: testY });
console.log(prediction);
```