A method is introduced for predicting the true spectral absorbance at a given wavelength from an absorbance measurement that is affected by various deficiencies of the spectral measurement channel, such as finite bandwidth, cross-over at unwanted wavelengths, and signal bypassing the sample. These unwanted effects invalidate the Beer-Lambert law by introducing a nonlinear dependence between concentration and measured absorbance. The method is based on a simple physical model of the spectral measurement channel and may be used even in applications where only one or a few spectral channels are available, e.g., in instruments based on optical filters. The method is applied to a wastewater NO<sub><i>x</i></sub> sensor consisting of two filter-based spectral channels for absorption measurement in the UV region. Simulations show that a dramatic improvement in accuracy is obtained by introducing only one additional parameter for nonlinear correction, compared with the linear model. Calibration of the model is also discussed.
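One of the nonidealities named above, signal bypassing the sample (stray light), has a well-known nonlinear signature: a fraction of the detected light never passes through the sample, so the measured transmittance is biased upward and high absorbances are compressed. The sketch below illustrates this mechanism and its inversion with a single-parameter stray-light model; the parameter `s` and both function names are illustrative assumptions, not the authors' specific channel model.

```python
import math

def measured_absorbance(a_true, s):
    """Simulate the absorbance reported by a nonideal channel.

    Assumed model: a fraction s of the detected signal bypasses the
    sample, so the measured transmittance is (T_true + s) / (1 + s).
    This compresses high absorbances and breaks Beer-Lambert linearity.
    """
    t_true = 10.0 ** (-a_true)
    t_meas = (t_true + s) / (1.0 + s)
    return -math.log10(t_meas)

def corrected_absorbance(a_meas, s):
    """Invert the one-parameter stray-light model to estimate the
    true absorbance from the measured one.

    Valid only while the inverted transmittance stays positive,
    i.e., for absorbances the channel can actually resolve.
    """
    t_meas = 10.0 ** (-a_meas)
    t_true = t_meas * (1.0 + s) - s
    return -math.log10(t_true)

# With 1% stray light, a true absorbance of 2.0 reads low...
a_meas = measured_absorbance(2.0, 0.01)
# ...and the one-parameter correction recovers it.
a_corr = corrected_absorbance(a_meas, 0.01)
```

This illustrates the general point of the abstract: a single extra physical parameter can restore a linear concentration-absorbance relationship where the raw measurement is nonlinear.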