For decades pH measurement has been the mainstay of water quality testing in cleanrooms and labs. However, Mark Bosley, Business Support Divisional Manager, SUEZ Water Purification Systems, outlines some common pitfalls
It is important to understand the limitations of pH measurement for water purity tests
Many cleanroom applications rely on the availability of water at a known level of purity. Measuring the pH of that water can be one of the quickest ways of determining its purity. This is because many common contaminants, including dissolved air, carbon dioxide and ionic salts, have a direct effect on acidity or alkalinity.
There is, however, a catch. It is actually quite difficult to measure pH directly, especially in production environments as both the measuring instruments and the sampling procedures required to use them are prone to induced errors.
As a result, it is far more common to make use of a side-effect of dissolved ions: their ability to conduct electricity.
Ions dissolved in water can have different charges, and will move at different velocities. Cations (with a positive charge) include H⁺, Na⁺ and Mg²⁺. Anions (with a negative charge) include OH⁻, Cl⁻ and SO₄²⁻.
However, the most mobile ions commonly found in water are hydrogen (H⁺) and hydroxide (OH⁻). As a result, highly acidic or alkaline solutions will usually have the greatest conductivity.
Using a conductivity meter, it is possible to measure the flow of electricity through a fluid. The resulting reading is directly proportional to the concentration of ions in the fluid, their charge and level of activity: the higher the reading, the greater the concentration of ions.
Conductivity is measured in microsiemens per centimetre (µS/cm). Sometimes a measurement device will instead record the reciprocal of conductivity – resistivity – measured in megohm centimetres (MΩ·cm).
Ultrapure, or deionised, water has a natural pH of 6.998 and a conductivity of 0.055 µS/cm, or 18.2 MΩ·cm, at 25°C.
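Because resistivity is simply the reciprocal of conductivity, the two readings can be converted directly when conductivity is expressed in µS/cm and resistivity in MΩ·cm. A minimal sketch in Python (the function name is illustrative):

```python
def resistivity_mohm_cm(conductivity_us_cm: float) -> float:
    """Convert conductivity (µS/cm) to resistivity (MΩ·cm).

    1 µS/cm = 1e-6 S/cm, so resistivity in Ω·cm is 1e6 / (µS/cm),
    which is simply 1 / (µS/cm) when expressed in MΩ·cm.
    """
    return 1.0 / conductivity_us_cm

# Ultrapure water at 25°C: 0.055 µS/cm corresponds to about 18.2 MΩ·cm
print(round(resistivity_mohm_cm(0.055), 1))  # → 18.2
```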
It is common practice to use conductivity to measure water with a high concentration of ions and resistivity to measure water containing fewer ions. It is important to note that those numbers are given for a specific temperature. Changes in temperature will significantly alter the conductivity of a sample.
In water with the purity of typical mains supply, for example, conductivity will change by around 2% for each degree Celsius. As a result, conductivity measurements are always referenced to 25°C, to allow different samples to be compared.
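The referencing described above can be sketched as a simple linear correction. The 2%/°C coefficient is the article's figure for mains-quality water; in practice it varies with water composition, so treat the default below as an assumption:

```python
def compensate_to_25c(measured_us_cm: float, temp_c: float,
                      coeff: float = 0.02) -> float:
    """Reference a raw conductivity reading (µS/cm) to 25°C using a
    linear temperature coefficient (~2%/°C for typical mains water)."""
    return measured_us_cm / (1.0 + coeff * (temp_c - 25.0))

# A 550 µS/cm reading taken at 30°C corresponds to about 500 µS/cm at 25°C
print(round(compensate_to_25c(550.0, 30.0), 1))  # → 500.0
```

This is why two samples measured at different temperatures can only be compared after both have been referenced to 25°C.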
However, temperature doesn’t just affect the performance of measuring equipment.
Since pH is a measure of the activity of hydrogen ions in a solution, and ions are more active at higher temperatures, the actual pH of a solution will vary according to its temperature.
At 0°C the pH of a neutral solution like pure water rises to 7.5, for example, while at 100°C it falls to 6.2.
Commercially available hand-held meters use a pH probe inserted into a liquid sample to measure the activity of hydrogen ions against a reference electrode. Older designs typically use two separate probes, while more modern ones incorporate a single combined sensor.
Usually, the device also takes a temperature reading to allow it to calculate the actual pH value of the sample more accurately.
As well as pH, the meter may be able to display other values, like resistivity, conductivity and total dissolved solids (TDS).
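TDS readings on such meters are usually derived from the conductivity measurement via an empirical conversion factor rather than measured directly. The 0.65 factor below is a common rule of thumb, not a value from this article, and it varies with the mix of dissolved salts:

```python
def tds_ppm(conductivity_us_cm: float, factor: float = 0.65) -> float:
    """Estimate total dissolved solids (ppm) from conductivity (µS/cm).

    The factor is empirical and salt-dependent; values around 0.5-0.7
    are typical, and 0.65 is used here purely as an illustration."""
    return conductivity_us_cm * factor

# A 200 µS/cm sample estimates to roughly 130 ppm TDS with this factor
print(round(tds_ppm(200.0), 1))  # → 130.0
```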
The measurement of highly purified water using hand-held probes can be prone to errors.
The most significant cause of error arises because as soon as a sample is drawn from the system it begins to absorb impurities from a number of sources, including the sampling container itself and the surrounding air.
In particular, carbon dioxide from the air will react with the water to form carbonic acid in solution; this in turn dissociates to release conductive ions. Just a few parts per million (ppm) of CO2 dissolved in a sample of ultrapure water can reduce the pH to around 4.0.
This means that a pure water sample cannot be exposed to air without affecting the accuracy of pH measurement. The rate of sample contamination will depend on the surface area of the sample exposed and the time elapsed.
To achieve consistent measurement using hand-held devices, care should be taken to ensure that instrument probes and sample containers are thoroughly cleaned before use, that the measurement is taken as soon as possible after sample collection, and that consistent procedures are used to maintain the comparability of measurement between samples.
It is usually best to take the measurement instrument to the source, rather than transporting the sample to the instrument. Likewise, the probe should be fully immersed at the bottom of the sample container, with the sample allowed to overflow.
Failure to ensure consistency in measurement practice can lead to changes in pH reading, and the incorrect assumption that water purification or process equipment has malfunctioned.
The problem of sample contamination can be avoided through the use of in-line measurement devices that analyse closed, flowing samples. This approach is the norm in more demanding applications, where the instrumentation is connected directly to higher level automated process control systems.
Even these kinds of sensors, however, are still exposed to the risk of contamination and to the effects of temperature changes.
Fouling, scaling or chemical poisoning can rapidly affect the accuracy of sensors in process equipment and they should be recalibrated at frequent intervals to account for such effects.
Similarly, it is important that the measurement device selected is suitable for the process in question. Many devices are calibrated for samples containing relatively high levels of contamination, for example, making them unsuitable for the monitoring of ultrapure water, unless they are fitted with specialised probes.
While pH measurements will remain a mainstay of water quality analysis in cleanroom applications, any organisation using the technique should take care to ensure that staff understand the limitations of their instrumentation and measurement procedures, and that they have selected the right equipment, calibration and sampling protocols to suit the needs of their application.
The pH Scale
The pH scale provides a standardised method for the measurement of the acidity or alkalinity (basicity) of a solution. It was originally defined by Danish chemist Søren Sørensen, who was head of the Carlsberg laboratory’s Chemical Department at the time.
pH is calculated as the negative logarithm of the activity of hydrogen ions in a solution. That activity depends on both the concentration of hydrogen ions and the temperature of the solution. A lower pH number indicates more hydrogen ion activity and a more acidic solution. Expressed on a scale from 0 to 14, the logarithmic nature of the definition means that a single point's difference in pH indicates a tenfold change in hydrogen ion activity.
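The definition can be written out directly. For dilute solutions, activity is often approximated by the hydrogen ion concentration in mol/L (an assumption made in this sketch):

```python
import math

def ph(h_activity: float) -> float:
    """pH = -log10(a_H+); here concentration stands in for activity."""
    return -math.log10(h_activity)

# Neutral water at 25°C has about 1e-7 mol/L of H+ ions
print(round(ph(1e-7), 1))  # → 7.0
# Three pH points lower means 1000x the hydrogen ion activity
print(round(ph(1e-4), 1))  # → 4.0
```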
pH values of common materials at 25°C:
• Lemon juice 2.5
• Beer 4.5
• Pure water 7
• Household bleach 12.5