A method of temperature measurement behind reflected shock waves in argon is investigated, which utilizes the emission integrated over a large number of rotational lines in the ²Σ → ²Π band system of OH. In the majority of experiments the OH emission is generated by the addition of 0·5% (H₂ + O₂) to the argon, and the intensity in two spectral regions is monitored by two monochromators. The gas temperature is calculated from the ratio of the observed intensities, using the known spectroscopic data for OH. Theoretically derived intensity ratios for several bands, each compared with that of the (0,0) band, are given as a function of temperature.
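The inversion step of this two-channel method can be illustrated with a minimal sketch. The tabulated ratio values below are placeholders, not the theoretical curves derived in the paper from the OH spectroscopic data; only the interpolation of a measured band-intensity ratio back to temperature is intended to be representative.

```python
import numpy as np

# Hypothetical theoretical intensity ratio R(T) of a second OH band to the
# (0,0) band, tabulated against temperature.  These numbers are placeholders
# for illustration; the paper derives the actual curves from the known
# spectroscopic data for OH.
T_table = np.array([3000.0, 3500.0, 4000.0, 4500.0, 5000.0, 5500.0, 6000.0])  # K
R_table = np.array([0.10, 0.14, 0.19, 0.25, 0.31, 0.38, 0.45])  # dimensionless

def temperature_from_ratio(ratio_measured: float) -> float:
    """Invert the tabulated R(T) curve by linear interpolation.

    ratio_measured is the observed ratio of the two monochromator signals
    (second band / (0,0) band).  The table must be monotonic in R for
    np.interp to return a unique temperature.
    """
    return float(np.interp(ratio_measured, R_table, T_table))

# Example: a measured channel ratio of 0.22 maps to ~4250 K on the
# placeholder curve above.
print(temperature_from_ratio(0.22))
```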
Spectroscopic temperatures over the range 3000–6000 K agree, to within the predicted experimental error, with the gas-dynamic values derived from incident shock-wave velocities at Mach numbers from 3·5 to 5·2. The method is shown to be inapplicable at initial pressures greater than about 80 Torr because of self-absorption, while at pressures below about 20 Torr radiative disequilibrium may equally limit its use. Moreover, the pre-association reaction of oxygen and hydrogen atoms is found to enhance the intensity of most bands except the (0,0) and (1,1); only these two bands may therefore be used to obtain the time variation of temperature behind the shock front.
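For orientation, the gas-dynamic reflected-shock temperature can be estimated from the incident Mach number with the standard ideal-gas shock relations for a monatomic driven gas (γ = 5/3). The sketch below is only such an ideal estimate: it assumes an initial temperature of 300 K (not stated here) and neglects any real-gas or chemical effects of the 0·5% (H₂ + O₂) additive, so the values quoted in the paper may differ slightly.

```python
# Ideal-gas normal- and reflected-shock temperature relations for a
# monatomic driven gas (gamma = 5/3).  T1 = 300 K is an assumed initial
# temperature; the abstract does not state it.
GAMMA = 5.0 / 3.0

def incident_temperature_ratio(M1: float, g: float = GAMMA) -> float:
    """T2/T1 behind the incident shock for incident Mach number M1."""
    return ((2.0 * g * M1**2 - (g - 1.0)) * ((g - 1.0) * M1**2 + 2.0)) \
        / ((g + 1.0) ** 2 * M1**2)

def reflected_temperature_ratio(M1: float, g: float = GAMMA) -> float:
    """T5/T1 behind the reflected shock (ideal reflected-shock relation)."""
    return ((2.0 * (g - 1.0) * M1**2 + (3.0 - g))
            * ((3.0 * g - 1.0) * M1**2 - 2.0 * (g - 1.0))) \
        / ((g + 1.0) ** 2 * M1**2)

T1 = 300.0  # K, assumed initial temperature
for M1 in (3.5, 5.2):
    print(f"M1 = {M1}: T5 ~ {T1 * reflected_temperature_ratio(M1):.0f} K")

# With these assumptions the end points of the quoted Mach-number range give
# roughly 2900 K and 6200 K, consistent with the 3000-6000 K range above.
```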