This term has at least two distinct meanings in spectroscopic methodology. In the first, normalization refers to division of each data point in a spectrum by a correction factor (for example, pathlength, area, standard deviation, reference band intensity, or a multiplicative signal correction) so that each spectrum is given the same weight when used for calibration. The second meaning arises when spectra are used for discriminant analysis, where normalization is the method used to make the multidimensional size of the data from each sample type the same. The simplest normalization divides the spectrum by its maximum y-axis value, so that all spectra show relative intensity values from 0 to 1.
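As a minimal sketch of the procedures described above, the following Python functions (hypothetical names, using NumPy) implement two of the correction factors mentioned: division by the maximum y-axis value, and division by the spectrum's standard deviation.

```python
import numpy as np

def normalize_max(spectrum):
    # Divide by the maximum y-axis value so relative intensities
    # span 0 to 1 (assumes a non-negative spectrum with a positive peak).
    return spectrum / np.max(spectrum)

def normalize_std(spectrum):
    # Divide each data point by the spectrum's standard deviation,
    # one of the correction factors listed above; the result has
    # unit standard deviation.
    return spectrum / np.std(spectrum)

# Illustrative example with made-up intensity values.
spec = np.array([0.0, 0.2, 1.6, 0.8, 0.4])
print(normalize_max(spec))  # peak intensity becomes 1.0
print(normalize_std(spec))
```

In practice the choice of correction factor depends on the application: max normalization is sensitive to a single outlying peak, whereas dividing by the standard deviation weights the whole spectrum.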