JMack23 said:
So I can follow the sequence of operations: the refrigerant enters the compression stage as a saturated vapour (a gas?) and is compressed, raising its temperature. It then gives off some of this heat to the surrounding ambient air, causing a decrease in temperature and a corresponding phase change to liquid (does it all change to liquid at this point, or is there a mix of liquid and vapour?)
In an ideal cycle you have a sub-cooled liquid at this point (i.e. not two-phase). In reality, you may have some vapour quality here depending on the environmental conditions, heat load, and other variables.
It's then put through an expansion device, which causes the pressure to drop and the liquid to expand. This is what I don't get: if it's in a liquid state prior to this stage, then why use the expansion device? Is it for further cooling?
Also, it intuitively makes sense to me that there has to be some work input to the system to be able to extract the heat, but it seems strange: the main aim of the cycle is to make the refrigerant as cool as possible so it can extract heat from the inside of the fridge, so why would we want to make the initial starting condition HOTTER?
The purpose of this cycle is to pump heat from a cold zone to a hot zone. Since heat only moves from hot to cold (recall the second law of thermodynamics) the temperature in the condenser must be WARMER than the surrounding air to be able to reject any heat.
Now you have your sub-cooled, high-pressure fluid and it is expanded. When you expand a fluid without work or heat transfer, the temperature will drop. If this doesn't make sense, imagine an ideal gas.
P=\rho R T
As P drops, T must drop as well for the relationship to hold. The same idea follows for the expansion process in a refrigeration cycle, although the relationship between P and T is MUCH more complicated for a refrigerant in the two-phase region: there, temperature and pressure are locked together along the saturation curve, so lowering the pressure directly lowers the boiling temperature of the mixture.
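That saturation-curve coupling can be sketched numerically with the integrated Clausius–Clapeyron relation. The constants below are rough, assumed R-134a-like values chosen for illustration, not property-table data:

```python
import math

# Rough, assumed R-134a-like constants (illustrative only, not table values)
R_SPECIFIC = 81.5  # J/(kg*K), universal gas constant divided by molar mass
H_FG = 200e3       # J/kg, latent heat of vaporisation, treated as constant

def saturation_temp(p, p_ref, t_ref):
    """Integrated Clausius-Clapeyron: ln(p/p_ref) = (h_fg/R)(1/t_ref - 1/T),
    solved for the saturation temperature T at pressure p."""
    return 1.0 / (1.0 / t_ref - (R_SPECIFIC / H_FG) * math.log(p / p_ref))

# Assume the condenser side saturates near 300 K at about 700 kPa,
# then throttle down to roughly evaporator pressure (200 kPa)
t_high = 300.0                                   # K
t_low = saturation_temp(200e3, 700e3, t_high)    # K, well below t_high
print(f"Saturation temperature at 200 kPa: {t_low:.1f} K")
```

Even this crude model shows the point: dropping the pressure across the expansion device forces the two-phase mixture onto a much colder part of the saturation curve, with no heat transfer required.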
Back on point: as the fluid expands, it has to reach a temperature that is lower than the internal space you are trying to cool. Then heat can flow from the cold space into the evaporator.
In addition, compressors are susceptible to damage if they pull two-phase flow at the compressor inlet (called slugging). Slugging can cause massive pressure fluctuations inside the compressor and damage components. For this reason, system designers actually like to superheat the fluid (maybe 5-10 K) at the compressor suction to ensure only gas enters the compressor. This reduces cycle efficiency but increases compressor reliability.
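As a rough sketch of the suction-superheat check described above (the function name and the readings are hypothetical, not from any real controller):

```python
def suction_superheat(t_suction, t_sat_at_suction_pressure):
    """Superheat = measured suction-line temperature minus the saturation
    temperature at the measured suction pressure (both in the same units)."""
    return t_suction - t_sat_at_suction_pressure

# Hypothetical readings: saturation temperature at the suction pressure
# is 263 K, measured suction-line temperature is 270 K -> 7 K of superheat
superheat = suction_superheat(270.0, 263.0)

if 5.0 <= superheat <= 10.0:
    print("superheat in the designer's target band; only gas enters")
elif superheat < 5.0:
    print("risk of two-phase flow at the inlet (slugging)")
else:
    print("excess superheat; cycle efficiency suffers")
```

Positive superheat guarantees the suction state sits above the saturation curve, i.e. pure vapour, which is exactly why designers accept the efficiency penalty.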
aroc91 said:
As the pressurized liquid rapidly expands, it boils, taking heat away from its surroundings.
No, this is not true. An ideal expansion process is isenthalpic (enthalpy is constant); you can check this with the first law. That means there is no work or heat transfer during an ideal expansion process. The temperature change of an expanding fluid comes only from the change in pressure; this is an intrinsic change and does not require heat transfer to occur.
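To spell that check out, the steady-flow energy balance across the expansion valve (per unit mass, neglecting kinetic and potential energy changes) is:

h_1 + q = h_2 + w

With q = 0 (no heat transfer through the valve) and w = 0 (no shaft work), this reduces to h_2 = h_1: the process is isenthalpic. The colder exit temperature is set by the new, lower pressure on the saturation curve, not by heat drawn from the surroundings.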