I reference this forum all the time and you guys are always good at breaking down confusing questions, so I decided I needed to make an account and get some help with a mental struggle I've been having for quite some time!

Two commonly discussed ways of improving a compressor's efficiency in a standard refrigeration cycle are:

1) Increase the suction pressure: less ΔP across the compressor, so less work.
2) Reduce the suction gas temperature: denser gas, so less work.

This is confusing to me, and I'm hoping someone can clarify. My thoughts:

1) When the pressure is increased, the temperature must also increase, right? PV = nRT.
2) And when the temperature of the gas is decreased, the pressure must also decrease, right? (Assuming V, n, and R stay constant in the closed cycle.)

So how can both of these measures be true? You are negatively affecting one by improving the other, so can't only one of them be correct? Or is it a balance of both? I can't wrap my head around it; any help is appreciated.
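To make my confusion concrete, here's a quick sanity check I tried with the textbook isentropic compression work formula for an ideal gas. The R and k values are just rough, illustrative numbers for a refrigerant-like vapor, not real property data, and the pressures/temperatures are made up:

```python
# Sketch: isentropic compressor work per kg of an ideal gas,
#   w = k/(k-1) * R * T1 * ((P2/P1)^((k-1)/k) - 1)
# where (T1, P1) is the suction state and P2 the discharge pressure.
# R (J/kg.K) and k are rough, illustrative values, not real refrigerant data.

def specific_work(T1, P1, P2, R=81.5, k=1.12):
    """Isentropic compression work (J/kg) from suction state T1 [K], P1 [Pa]
    up to discharge pressure P2 [Pa], ideal-gas assumption."""
    return k / (k - 1) * R * T1 * ((P2 / P1) ** ((k - 1) / k) - 1)

P2 = 1000e3  # fixed discharge (condensing) pressure, Pa

base = specific_work(T1=280.0, P1=300e3, P2=P2)
higher_suction_p = specific_work(T1=280.0, P1=350e3, P2=P2)  # measure 1
cooler_suction = specific_work(T1=270.0, P1=300e3, P2=P2)    # measure 2

print(base, higher_suction_p, cooler_suction)
```

In this formula the two measures don't fight each other, because T1 and P1 enter as independent inputs: raising P1 shrinks the pressure ratio, and lowering T1 shrinks the T1 factor, and each change by itself reduces the work. That's exactly what I can't square with PV = nRT telling me pressure and temperature have to move together.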