This thesis examines the use of gas indicators to monitor the stages of self-heating in a high-volatile bituminous coal from the Hunter Valley. Three tests were conducted on the same coal: HV1, HV2 and HV3. HV1 and HV3 were standard self-heating tests in which a hot spot developed within the coal. In HV2, however, the entire mass of coal was heated to specified temperatures in order to determine whether gas evolution differs between a uniformly heated coal mass and a coal that naturally develops a hot spot.
When the data were analysed it was found that, as the previous literature suggests, moisture plays a significant part in the self-heating of the coal. Between successive tests the Graham's Ratio (GR) increased, which was attributed to the progressive removal of moisture.
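Since the discussion that follows turns on the behaviour of the Graham's Ratio, a minimal sketch of its standard definition may be useful. The formula below uses the conventional estimate of oxygen deficiency from the nitrogen reading (the 0.2648 constant is the atmospheric O2/N2 ratio, 20.93/79.04); the function name and inputs are illustrative, not taken from the thesis.

```python
def grahams_ratio(co, o2, n2):
    """Graham's Ratio: CO produced relative to oxygen consumed, x100.

    All gas concentrations are in volume percent of the sampled gas.
    Oxygen deficiency is estimated from the nitrogen reading, assuming
    fresh air contains O2 and N2 in the ratio 20.93 : 79.04.
    """
    o2_deficiency = 0.2648 * n2 - o2
    if o2_deficiency <= 0:
        # GR is undefined when no oxygen has been consumed.
        raise ValueError("no measurable oxygen deficiency")
    return 100.0 * co / o2_deficiency
```

A rising GR between tests, as reported above, reflects more CO being produced per unit of oxygen consumed by the coal.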
A roll-over effect was also found in the GR at higher temperatures. This is due to the dilution that occurs during the 'hot-spot' mode of testing: the GR is 'diluted down' by the bulk of the coal, which remains at a lower temperature. The effect was observed above 160°C. Unfortunately, the HV2 test did not reach this temperature, so the effect could not be demonstrated in that test. However, when calculations were completed using results from the SIMTARS small-scale test, the theory was confirmed and a corrected GR could be calculated based on the temperature profile of the HV1 test.
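The dilution argument can be sketched with a simple two-zone mixing model: the sampled gas is treated as a volume-weighted blend of gas from the hot spot and gas from the cooler bulk coal, so the hot-spot GR can be back-calculated from the observed readings. This is an illustrative model only; the fraction `f_hot` and the bulk-gas values are assumptions, not the thesis's actual correction based on the HV1 temperature profile.

```python
def undiluted_gr(co_obs, o2def_obs, co_bulk, o2def_bulk, f_hot):
    """Back-calculate the hot-spot GR from observed (diluted) readings.

    The observed CO and O2 deficiency are modelled as volume-weighted
    mixes of the hot spot (fraction f_hot) and the cooler bulk coal
    (fraction 1 - f_hot). Gas quantities are in volume percent.
    """
    co_hot = (co_obs - (1.0 - f_hot) * co_bulk) / f_hot
    o2def_hot = (o2def_obs - (1.0 - f_hot) * o2def_bulk) / f_hot
    return 100.0 * co_hot / o2def_hot
```

Because the cool bulk consumes oxygen while making little CO, the observed GR sits below the hot-spot GR, which is the roll-over described above.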
Hydrogen was also detected at temperatures as low as 24°C. This is a significant finding, as previous authors have suggested that H2 is not generated until temperatures exceed 100°C.
Overall, it was found that no single gas indicator can be used on its own to estimate temperature; a combination of indicators should be used to estimate the actual temperature. Of particular use for this coal are the Graham's, Young's, Jones-Trickett and Morris ratios, which can be applied at lower temperatures, while ethane and ethylene production values are useful at elevated temperatures.