Abstract
Linear regression is a widely used method for modelling the relationship between a response (effect) variable and one or more explanatory (concomitant) variables. However, multicollinearity, i.e. high correlation among the explanatory variables, undermines the stability and accuracy of parameter estimates and is often reflected in small eigenvalues (significant roots) of the design matrix. This study investigates the impact of these small eigenvalues on parameter estimation and explores Partial Least Squares (PLS) regression as a remedy. PLS addresses multicollinearity by constructing latent variables that maximize covariance with the response, offering improved prediction and stability. Through a simulation study across various sample sizes (20–250) and eleven multicollinearity levels (0–0.999), the performance of the PLS and Ordinary Least Squares (OLS) estimators is compared using the Mean Squared Error (MSE). Results show that OLS performs best at low multicollinearity, while PLS, particularly at the 1% significance level, excels when multicollinearity is high. The findings highlight the diagnostic role of eigenvalues and confirm PLS as a robust alternative to OLS, providing more stable and accurate parameter estimates in the presence of high multicollinearity.
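
The simulation design described above can be sketched in Python. This is an illustrative snippet, not the authors' code: the sample size, number of replications, PLS component count, and true coefficient vector are all assumptions chosen for demonstration. It generates equicorrelated predictors at a chosen multicollinearity level and compares the MSE of the OLS and PLS coefficient estimates.

```python
# Illustrative sketch (assumed parameters, not the study's actual code):
# compare MSE of OLS and PLS coefficient estimates under multicollinearity.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
beta = np.array([3.0, 1.5, -2.0, 0.5])  # assumed true coefficients
p = len(beta)

def coef_mse(rho, n=50, reps=500, n_components=2):
    """Average squared error of estimated coefficients at correlation rho."""
    # Equicorrelated design: off-diagonal entries equal rho.
    cov = np.full((p, p), rho) + (1 - rho) * np.eye(p)
    mse_ols = mse_pls = 0.0
    for _ in range(reps):
        X = rng.multivariate_normal(np.zeros(p), cov, size=n)
        y = X @ beta + rng.normal(size=n)
        b_ols = np.ravel(LinearRegression().fit(X, y).coef_)
        b_pls = np.ravel(PLSRegression(n_components=n_components).fit(X, y).coef_)
        mse_ols += np.mean((b_ols - beta) ** 2)
        mse_pls += np.mean((b_pls - beta) ** 2)
    return mse_ols / reps, mse_pls / reps

for rho in (0.0, 0.9, 0.999):
    ols, pls = coef_mse(rho)
    print(f"rho={rho:5.3f}  MSE(OLS)={ols:8.4f}  MSE(PLS)={pls:8.4f}")
```

With this toy setup, OLS tends to win at rho near 0 (PLS with fewer components than predictors is biased), while at rho near 0.999 the OLS coefficient variance inflates and PLS gives the smaller MSE, mirroring the pattern the abstract reports.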
