One of the earliest pressure-measuring instruments is still in wide use today because of its inherent accuracy and simplicity of operation: the U-tube manometer, a U-shaped glass tube partially filled with liquid. It has no moving parts and requires no calibration. A manometer reading is a function of gravity and the liquid's density, physical properties that make the U-tube manometer a NIST standard for accuracy. Only the scale increments and digits need to be traceable to NIST, so recalibration of the unit itself is unnecessary. This is the same method NIST uses to calibrate and certify other manufactured calibrating equipment, and it remains the most accurate way to measure very low pressures.
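The principle above can be sketched in a few lines: the differential pressure is simply the liquid's density times local gravity times the height difference between the two columns. The figures below (mercury density, standard gravity) are illustrative constants, not values from our lab.

```python
# Sketch of the U-tube manometer principle: differential pressure depends
# only on the liquid's density, local gravity, and the height difference
# between the two columns. Constants are illustrative reference values.

MERCURY_DENSITY = 13_595.1   # kg/m^3, mercury at 0 deg C (reference value)
STANDARD_GRAVITY = 9.80665   # m/s^2, standard gravity
PA_PER_PSI = 6_894.757       # pascals per psi

def differential_pressure_psi(height_diff_m: float,
                              density: float = MERCURY_DENSITY,
                              gravity: float = STANDARD_GRAVITY) -> float:
    """Differential pressure (psi) indicated by a column height difference (m)."""
    return density * gravity * height_diff_m / PA_PER_PSI

if __name__ == "__main__":
    # A 1 mm difference in mercury column height is roughly 0.019 psi:
    print(round(differential_pressure_psi(0.001), 4))
```

Because the reading depends only on these physical constants and a length measurement, nothing in the instrument drifts or wears in a way that would require recalibration.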
We use a mercury manometer as our calibration standard. Water pressure is applied to both sides of the differential gauge, simulating actual field testing. The manometer registers the difference between the applied pressures and is accurate to 0.0455 psi, ten times more accurate than the backflow test gauges themselves.
The manometer is filled with triple-distilled mercury, and the only other factor affecting its accuracy is the local gravitational pull, which depends on the instrument's latitude and elevation. At our location and elevation, the calculated error is +/-0.0000455 psi.
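How latitude and elevation enter the correction can be illustrated with a standard normal-gravity formula. This is a generic sketch using the 1980 International Gravity Formula with a free-air elevation correction; the actual correction values used at our facility are not shown here.

```python
import math

def local_gravity(latitude_deg: float, elevation_m: float) -> float:
    """Approximate local gravity (m/s^2) from the 1980 International
    Gravity Formula plus a free-air elevation correction.
    Illustrative only: real calibration labs use surveyed local values."""
    phi = math.radians(latitude_deg)
    g_sea_level = 9.780327 * (1 + 0.0053024 * math.sin(phi) ** 2
                                - 0.0000058 * math.sin(2 * phi) ** 2)
    # Free-air correction: gravity decreases ~3.086e-6 m/s^2 per meter of height.
    return g_sea_level - 3.086e-6 * elevation_m

if __name__ == "__main__":
    # Gravity at 45 deg latitude, sea level, is about 9.8062 m/s^2;
    # it is slightly weaker at the equator and at higher elevations.
    print(round(local_gravity(45.0, 0.0), 4))
```

Because the difference between local and standard gravity is tiny, the resulting pressure error is correspondingly small, which is consistent with the +/-0.0000455 psi figure quoted above.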
The mercury is contained within glass tubes and never makes direct or indirect contact with the gauge being calibrated. This method allows every component of the gauge, including valves and manifolds, to be tested under water pressure for leaks or malfunctions.