Very new to these matters, so please bear with me if my basic HVAC understanding is imprecise.

To start, I had a system in which airflow through a pipe was (A) filtered for contaminants before flowing on to (B) a compressor, which discharged the flow at an increased pressure. The (A) filter was rated for a "capacity" of up to 100 cfm. The (B) compressor was correspondingly rated for an "inlet capacity" of 100 cfm and a "discharge pressure" of 125 PSIG.

Before long, back-suction acting on the inlet side of the compressor caused some of the seals to pull out. This back-suction was believed to be caused by a "pressure drop" (not sure if that's the correct term) occurring as air flowed between the filter and the compressor, meaning the compressor was straining to draw air at a rate well below its inlet capacity. Does this sound plausible? After the filter was removed, the compressor handled the airflow without issue, but the air filtering was obviously compromised.

Before I simply install a filter with a larger cfm capacity to mitigate the pressure drop between the filter and the compressor, what exactly is the "capacity" of a filter? Is it an inlet capacity, meaning that after the flow understandably decreases through the filter, the outflow can be expected to be noticeably less than at the inlet? Or is it the expected outflow? Second, is combating the reduced airflow into the compressor as simple as increasing the capacity of the air filter, say to 200 cfm?
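To make my question about filter sizing concrete, here is a rough back-of-the-envelope sketch of the reasoning I'm trying to check. It is only an illustration, not a real sizing calculation: the rated pressure drop (`DP_RATED_PSI`) is a made-up placeholder number, and the assumption that a filter's pressure drop scales roughly with the square of flow relative to its rated capacity is just the common turbulent-flow rule of thumb, not something from either component's datasheet.

```python
# Hypothetical sketch: how upsizing a filter might reduce its pressure drop.
# Assumption 1: pressure drop scales ~ (actual flow / rated capacity)^2
#               (a common rule of thumb for turbulent flow, not a datasheet value).
# Assumption 2: DP_RATED_PSI is a placeholder drop at rated flow, chosen arbitrarily.
DP_RATED_PSI = 0.5  # placeholder: drop (psi) when the filter runs at its rated cfm

def filter_dp(flow_cfm, rated_cfm, dp_rated_psi=DP_RATED_PSI):
    """Estimated pressure drop (psi) across a clean filter at a given flow."""
    return dp_rated_psi * (flow_cfm / rated_cfm) ** 2

# Compressor drawing its full 100 cfm inlet capacity through each filter:
dp_small = filter_dp(100, 100)  # 100 cfm filter running at its rated capacity
dp_large = filter_dp(100, 200)  # 200 cfm filter running at half capacity
print(dp_small, dp_large)
```

Under these assumptions, the 200 cfm filter passing the same 100 cfm would drop only about a quarter of the pressure the 100 cfm filter does, which is the intuition behind my "just double the filter capacity?" question. Whether that intuition actually holds for real filter ratings is exactly what I'm asking.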