The infrared (IR) spectra of EDTA whole-blood samples, in the range between 1500 and 750 cm<sup>−1</sup>, obtained from the patient population of a general hospital, were used to compare different multivariate calibration techniques for quantitative glucose determination. Ninety-six spectra of whole undiluted blood samples, with glucose concentrations ranging between 44 and 291 mg/dL, were used to create calibration models based on a combination of partial least-squares (PLS) and artificial neural network (ANN) methods. The prediction capabilities of these calibration models were evaluated by comparing their standard errors of prediction (SEP) with those obtained with PLS and principal component regression (PCR) calibration models on an independent prediction set of 31 blood samples. The optimal combined PLS-ANN model produced a smaller SEP (15.6 mg/dL) than either the PLS (21.5 mg/dL) or the PCR (24.0 mg/dL) models. Our results reveal that combined PLS-ANN models can better approximate deviations from linearity in the relationship between spectral data and concentration than either PLS or PCR models.
