Article ID: 1689911
Journal: Vacuum
Published Year: 2015
Pages: 16 Pages
File Type: PDF
Abstract
High-vacuum pressure measurements and calibrations below ≈1 × 10−6 Torr are problematic: measurement accuracies of vacuum gauges change drastically when system pressures are lowered suddenly. Why do gauges behave this way? A brief system description is needed to answer this question. Calibrations were performed in a vacuum calibration chamber with attached vacuum gauges. Chamber pressure was controlled by balancing vacuum pumps, which decreased it, against nitrogen tanks, which increased it; this balance held the chamber in equilibrium at selected set-point pressures during calibration. When the pressure was suddenly decreased during a set-point adjustment, the rush of gas out of the chamber also drew a surge of gas out of the gauges, lowering the pressures inside them. Because gas molecules are sparsely distributed in the system at these pressures, gauge pressures did not return to equilibrium as quickly as the chamber pressure, and this disparity in the rate of pressure change left different gauges at pressures different from those expected. Experiments confirmed the resulting theory: different gauge designs return to equilibrium at different rates, and gauge accuracies vary by design because of fluid transients in molecular flow.
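The lag the abstract describes can be illustrated with a simple first-order model (a sketch under assumptions, not the paper's method): if a gauge's internal volume V communicates with the chamber through a conductance C, then in molecular flow dP_gauge/dt = (C/V)(P_chamber − P_gauge), giving a time constant τ = V/C. All geometries and numbers below are hypothetical, chosen only to show how two gauge designs can read differently after the same step decrease in chamber pressure.

```python
import math

def gauge_pressure(p0, p_chamber, t, volume, conductance):
    """First-order equilibration toward the chamber pressure:
    dP/dt = (C/V) * (P_chamber - P), so P(t) decays with tau = V/C.
    Pressures in Torr, volume in liters, conductance in L/s, t in s."""
    tau = volume / conductance
    return p_chamber + (p0 - p_chamber) * math.exp(-t / tau)

# Hypothetical step: chamber pressure drops from 1e-5 to 1e-7 Torr.
P0, P_CH = 1e-5, 1e-7

# Two hypothetical gauge geometries (illustrative values only):
# an open design with high conductance vs. a long, narrow
# tubulated design with low conductance to the chamber.
open_gauge = dict(volume=0.1, conductance=10.0)   # tau = 0.01 s
tubulated  = dict(volume=0.1, conductance=0.05)   # tau = 2 s

t = 1.0  # read both gauges 1 s after the step
p_open = gauge_pressure(P0, P_CH, t, **open_gauge)
p_tub  = gauge_pressure(P0, P_CH, t, **tubulated)
print(f"open gauge:      {p_open:.3e} Torr")
print(f"tubulated gauge: {p_tub:.3e} Torr")
```

Under this toy model the open gauge has effectively reached the new chamber pressure after one second, while the tubulated gauge still reads well above it, mirroring the abstract's claim that equilibration rate, and hence apparent accuracy, depends on gauge design.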
Related Topics
Physical Sciences and Engineering Materials Science Surfaces, Coatings and Films