Spoiler alert: The answer is NO. But read on for technical details.
Let’s say you invent a new super-duper moisture detection technology and want to prove it works. How do you check its accuracy? If you take the new device to NIST (the National Institute of Standards and Technology) and ask them to test it, they will check it against a chilled-mirror or a dew point generation system. Why is that?
Now suppose there is a dispute over a moisture measurement between a supplier and a pipeline operator. How is the dispute resolved? By measuring the moisture of the gas with a chilled-mirror. Why are chilled-mirrors the reference method for moisture measurement in gas? The reason is simple: a chilled-mirror measurement is the reference “first-principle” measurement. A “first-principle” measurement simply means measuring a property as directly as possible, based on a core physical phenomenon. (Click here for a brief discussion of “first-principle” measurement.)
Then why aren’t more chilled-mirrors used to detect moisture in natural gas, when they are widely used in other industries? There is a historical reason, but that reason is no longer valid. In the old days, it was difficult to distinguish the water dew point from the hydrocarbon dew point, and natural gas has both. That is why other technologies such as TDLs, QCMs (quartz crystal microbalances), and ceramic (aluminum oxide) sensors were used instead. But the new generation of automated chilled-mirrors based on CEIRS™ (Chilled-mirror Evanescent IR Spectroscopy) overcomes this problem. So now one can make a moisture measurement based on the universally accepted chilled-mirror reference method without worrying about condensation of other contaminants.
A closer look…
Let’s compare measuring moisture with an automated chilled-mirror to measuring it with a TDL analyzer. With a chilled-mirror, you cool a surface until you detect condensation on it. The temperature at which condensation occurs is the dew point. No ifs or buts about it. The only thing your accuracy relies on is the temperature sensor, which these days is very accurate and repeatable. You can make the measurement at process pressure; there is no need to reduce the pressure first. And thanks to the spectral-recognition feature of CEIRS™ technology, you know it is the water dew point.
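As a rough illustration of how direct that measurement is, here is a minimal sketch (in Python) that turns a measured water dew point plus the line pressure into a moisture concentration. The Magnus coefficients and the ideal-gas ppmv conversion are standard textbook approximations chosen only for illustration, not ZEGAZ’s conversion method; real natural-gas conversions (e.g. ISO 18453) add real-gas corrections on top of this.

```python
import math

def water_vapor_pressure_kpa(dew_point_c: float) -> float:
    """Saturation vapor pressure of water (kPa) at the dew point,
    using the Magnus approximation over liquid water (~ -45..60 degC)."""
    return 0.61094 * math.exp(17.625 * dew_point_c / (dew_point_c + 243.04))

def moisture_ppmv(dew_point_c: float, line_pressure_kpa: float) -> float:
    """Ideal-gas estimate of water content in ppmv:
    mole fraction = water partial pressure / total pressure."""
    return 1e6 * water_vapor_pressure_kpa(dew_point_c) / line_pressure_kpa

# Example: a -10 degC water dew point measured at a line pressure of 7000 kPa (~70 bar)
print(round(moisture_ppmv(-10.0, 7000.0), 1), "ppmv (ideal-gas approximation)")
```

Even this crude estimate makes the point: one measured temperature plus the known line pressure fixes the moisture content, with no calibration curve that can drift.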
Now let’s say one tries to make the same measurement with a TDL analyzer. In a TDL analyzer, a laser beam passes through a volume of gas, bouncing off mirrors hundreds of times, and the amount of light loss (hopefully due to absorption by water molecules) is measured. The following assumptions all have to hold true if you expect an accurate measurement:
1- There are no interfering molecules that absorb at the same wavelength as moisture. Depending on the gas mixture and the wavelength of the laser, this may or may not be true.
2- The laser wavelength does not drift; it stays at the wavelength it was designed for. If it drifts, you end up measuring a different absorption line, and perhaps a different molecule. The wavelength of a diode laser is a function of its temperature, so unless the temperature of the laser chip is tightly controlled, there will be drift. Given the small size of the laser chip (less than 0.001 in², or 1 mm²) and the heat density it generates, this is no simple task. (The first sketch after this list shows how sitting on the wrong line biases the reading.)
3- The light detectors and the detection circuitry must remain stable over time. Any drift in detector sensitivity, and the measurement loses accuracy.
4- No water molecules (or anything else, for that matter) stick to the surface of the mirrors; hardly a valid assumption given the composition of natural gas.
5- The temperature of the cell is held constant. Adsorption of water (and other molecules) on the cell mirrors is a function of temperature, and so is mechanical distortion of the cell, so the cell really needs to be maintained at an elevated, constant temperature to avoid these errors.
6- TDL measurements have to be done at very low pressure, typically ~1 bar (~15 psi). Why? Because at higher pressures you get pressure broadening of the absorption lines, which makes the measurement very inaccurate (see the second sketch after this list). And when the pressure is reduced, unless it is done very carefully, in multiple stages and with plenty of heat, the gas composition WILL CHANGE, particularly its moisture content. This problem is very well documented.
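To see why assumptions 1 and 2 matter so much, here is a minimal Beer–Lambert sketch. The path length, cross-sections, and concentration are made-up placeholder values for illustration only, not the parameters of any particular analyzer; the point is simply that the inferred concentration scales directly with the absorption cross-section at whatever wavelength the laser actually sits on.

```python
import math

# Minimal Beer-Lambert sketch: I = I0 * exp(-sigma * N * L).
# The analyzer inverts this to get N (molecules/cm^3) from the measured light loss.
L_CM = 3000.0            # effective path length of the multipass cell, cm (placeholder)
SIGMA_ON_LINE = 1.0e-20  # cross-section at the intended water line, cm^2 (placeholder)
SIGMA_DRIFTED = 4.0e-21  # cross-section after the laser drifts off line center (placeholder)

def inferred_concentration(i_over_i0: float, sigma_assumed: float) -> float:
    """Concentration the analyzer reports, given the cross-section it assumes."""
    return -math.log(i_over_i0) / (sigma_assumed * L_CM)

# True gas: n_true water molecules/cm^3, but the laser has drifted,
# so the real cross-section is the lower, drifted value.
n_true = 2.0e15
transmission = math.exp(-SIGMA_DRIFTED * n_true * L_CM)

# The analyzer still assumes it is sitting on the intended line:
n_reported = inferred_concentration(transmission, SIGMA_ON_LINE)
print(f"true: {n_true:.2e}  reported: {n_reported:.2e}  "
      f"error: {100 * (n_reported / n_true - 1):.0f}%")
```

An unnoticed interfering absorber (assumption 1) works the same way in the opposite direction: any extra light loss gets booked as water.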
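Assumption 6 is about pressure broadening: the collisional (Lorentzian) half-width of an absorption line grows roughly linearly with pressure, so at pipeline pressure the line smears across its neighbors and the sharp spectral structure a TDL relies on disappears. A rough sketch, with a broadening coefficient and line spacing that are round placeholder numbers, not values looked up for any specific water line:

```python
# Rough pressure-broadening sketch: Lorentzian HWHM scales ~linearly with pressure.
GAMMA_PER_ATM = 0.1   # assumed collisional broadening coefficient, cm^-1 per atm (placeholder)
LINE_SPACING = 0.5    # assumed spacing to the nearest neighboring line, cm^-1 (placeholder)

def hwhm_cm1(pressure_atm: float) -> float:
    """Collision-broadened half-width at half-maximum, in cm^-1."""
    return GAMMA_PER_ATM * pressure_atm

for p_atm in (1.0, 10.0, 70.0):   # ~TDL cell pressure vs. typical pipeline pressures
    width = hwhm_cm1(p_atm)
    print(f"{p_atm:5.0f} atm: HWHM ~ {width:4.1f} cm^-1 "
          f"({width / LINE_SPACING:.1f}x the assumed spacing to the next line)")
```

That is why the gas has to be throttled down to about 1 bar before the measurement, and that pressure letdown is exactly the step where the sample’s moisture content can change.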
The BAD news is that, to get an accurate measurement with a TDL, you have to assume all of the above holds true. In reality, one or more of these assumptions is often far from valid. The WORSE news is that aluminum oxide (ceramic) sensors are even less suitable for natural gas applications.
So why do people still use TDLs, or for that matter aluminum oxide sensors? And why are these content-measurement tools called “dew point analyzers” when in reality they are not dew point analyzers? The culprits are probably hanging out in marketing departments somewhere. From a technical point of view, this is extremely misleading and inaccurate.
To summarize… automated chilled-mirror analyzers really are the best method for accurately measuring moisture in natural gas. Recent advances in this technology, such as CEIRS™, have made them the expert users’ choice for this important measurement. For more information on ZEGAZ Instruments’ analyzers for measurement of moisture in natural gas, see the Products section on this website or contact info@zegaz.com.