Experiments were first conducted to measure the spectral normal emissivity of a variety of aluminum alloys at 600, 700, and 800 K. Multispectral radiation thermometry (MRT) with linear emissivity models (LEM) and log-linear emissivity models (LLE) was then applied to predict surface temperature. Results show that the spectral emissivity decreases with increasing wavelength and increases with increasing temperature, and that the effect of alloy composition becomes evident at higher temperatures. Surface oxidation becomes fully developed after the first hour of heating, after which the emissivity remains constant. Half of the temperature predictions by the MRT emissivity models have an absolute temperature error under 10%, and a quarter are under 5%. The more faithfully an emissivity model represents the real surface emissivity behavior, the more accurate the temperature inferred by MRT. Increasing the order of the emissivity model or the number of wavelengths does not improve measurement accuracy, whereas more accurate MRT measurements are obtained at higher temperatures. Overall, three emissivity models most frequently give good results and provide the best compensation across different alloys, numbers of wavelengths, and temperatures.
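As an illustration of the MRT inversion described above, the sketch below shows one common formulation: under Wien's approximation, a log-linear emissivity model ln ε(λ) = a0 + a1·λ makes the logarithm of the measured spectral radiance linear in the unknowns (a0, a1, C2/T), so temperature can be recovered by least squares from radiances at several wavelengths. The wavelengths, emissivity coefficients, and 700 K target here are hypothetical values chosen for the example, not data from the experiments; the actual emissivity models and fitting procedure used in the study may differ.

```python
import numpy as np

# Radiation constants (first in SI, second in um*K for conditioning)
C1 = 3.7418e-16    # W m^2
C2_UM = 1.4388e4   # um K

def wien_radiance(lam_um, T, eps):
    """Spectral radiance under Wien's approximation (wavelength in um)."""
    lam_m = lam_um * 1e-6
    return eps * C1 / lam_m**5 * np.exp(-C2_UM / (lam_um * T))

def infer_temperature_lle(lam_um, L):
    """MRT inversion assuming the log-linear emissivity (LLE) model
    ln(eps) = a0 + a1*lam. Taking logs of Wien's law gives
        ln(L*lam^5/C1) = a0 + a1*lam - (C2/T)*(1/lam),
    which is linear in (a0, a1, C2/T) and solvable by least squares."""
    lam_m = lam_um * 1e-6
    y = np.log(L * lam_m**5 / C1)
    A = np.column_stack([np.ones_like(lam_um), lam_um, -1.0 / lam_um])
    a0, a1, c2_over_T = np.linalg.lstsq(A, y, rcond=None)[0]
    return C2_UM / c2_over_T

# Synthetic measurement: hypothetical 700 K surface whose emissivity
# decreases with wavelength, consistent with the trend reported above
lam = np.array([2.0, 2.5, 3.0, 3.5, 4.0, 5.0])   # um
T_true = 700.0
eps = np.exp(-0.2 - 0.1 * lam)                    # hypothetical LLE emissivity
L = wien_radiance(lam, T_true, eps)

T_hat = infer_temperature_lle(lam, L)
```

Because the synthetic emissivity obeys the LLE model exactly, the fit recovers the true temperature; for a real oxidized alloy surface, the residual mismatch between model and actual emissivity is what drives the percentage errors discussed above.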