I have a somewhat mundane question that I hope somebody can help me with. I am working with several standards for performing leak testing on hermetically sealed electronics packages using helium bombing, and my question concerns the engineering units used in the leak test specifications.

Both the NASA Leakage Testing Handbook (NASA CR-952) and the Nondestructive Testing Handbook - Leak Testing, 3rd Edition, published by the American Society for Nondestructive Testing, indicate that the units to be used for specifying a gas leak are Pressure (P) x Volume (V) / Time (t). Pressure x Volume reduces to Energy (see, for example, the Ideal Gas Law, PV = nRT, whose right-hand side carries units of energy), and Energy / Time is Power. This being the case, I would expect the units for a gas leak specification to be in terms of Power (Watts or other power units). Instead, in every standard I have seen, the units are left in their unreduced form, such as atmosphere-cubic centimeters per second, Pascal-cubic meters per second, or millibar-liters per second.

To add to the confusion, some standards employ the abbreviation STD to indicate a leak rate at some "standard", sometimes unstated, conditions; an example would be STD cm^3 / sec. In this case they seem to have dropped the units of Pressure since, I am assuming, the Pressure is defined as part of the standard conditions (STD).

Bottom line: I am looking for an explanation of why unreduced engineering units are universally used in gas leak standards.
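To make the dimensional argument concrete, here is a quick sketch (in Python, with helper names of my own invention) showing that the common unreduced leak-rate units are indeed just scaled watts, using the exact definitions 1 atm = 101325 Pa and 1 mbar = 100 Pa:

```python
# Dimensional check: Pressure x Volume / Time reduces to Power,
# since Pa * m^3 = N/m^2 * m^3 = N*m = J, and J/s = W.

PA_PER_ATM = 101325.0   # 1 atm = 101325 Pa (exact by definition)
PA_PER_MBAR = 100.0     # 1 mbar = 100 Pa
M3_PER_CM3 = 1e-6       # 1 cm^3 = 1e-6 m^3
M3_PER_L = 1e-3         # 1 L = 1e-3 m^3

def atm_cc_per_s_to_watts(leak_rate: float) -> float:
    """Convert a leak rate in atm*cm^3/s to Pa*m^3/s, i.e. watts."""
    return leak_rate * PA_PER_ATM * M3_PER_CM3

def mbar_l_per_s_to_watts(leak_rate: float) -> float:
    """Convert a leak rate in mbar*L/s to Pa*m^3/s, i.e. watts."""
    return leak_rate * PA_PER_MBAR * M3_PER_L

# 1 atm*cc/s ~= 0.101325 W; 1 mbar*L/s = 0.1 W
print(atm_cc_per_s_to_watts(1.0))
print(mbar_l_per_s_to_watts(1.0))
```

So a typical fine-leak limit like 1e-8 atm*cc/s would "reduce" to about 1e-9 W, which only sharpens my question about why the standards never write it that way.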