I looked into this a little more because it's not intuitive (to me, at least) how the temperature of a gas could increase during "decompression." We could use that word to describe a Joule-Thomson experiment (constant and unequal pressures on either side of a porous plug), but it seems more accurate to say that we are pushing the gas irreversibly through the plug, and we are definitely doing work on the system. This, of course, has a different connotation from a free expansion, in which no positive work enters the system (and the gas may end up doing work on the environment, decreasing its temperature).
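The work bookkeeping behind that statement is the standard textbook derivation (not spelled out in the original): the upstream side does $P_1 V_1$ of work pushing a parcel of gas into the plug, the parcel does $P_2 V_2$ of work emerging on the other side, and no heat is exchanged, so the process conserves enthalpy:

```latex
\begin{align}
  \Delta U = U_2 - U_1 &= P_1 V_1 - P_2 V_2 \\
  U_1 + P_1 V_1 &= U_2 + P_2 V_2 \\
  H_1 &= H_2
\end{align}
```

Whether the temperature rises or falls at constant enthalpy then depends on the sign of the Joule-Thomson coefficient $\mu_{\mathrm{JT}} = (\partial T / \partial P)_H$, which vanishes identically for an ideal gas.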
In the Joule-Thomson arrangement, we do work on the gas to push it into the plug, and recover energy as work when the gas expands on the other side. For an ideal gas the two amounts are equal and the temperature doesn't change. For a real gas above its inversion temperature, atomic/molecular repulsion dominates at high pressures, so we must do more work on the upstream side to obtain a given pressure than we recover downstream. This excess work ends up heating the gas.
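The sign change can be made concrete with the low-pressure van der Waals estimate of the Joule-Thomson coefficient, $\mu_{\mathrm{JT}} \approx (2a/RT - b)/C_p$. This is a rough sketch (not from the original discussion); the constants $a$ and $b$ are textbook van der Waals values for nitrogen, and the inversion temperature it predicts ($2a/Rb \approx 850\ \mathrm{K}$) overshoots the measured value for N₂, as this approximation tends to do:

```python
# Low-pressure van der Waals estimate of the Joule-Thomson coefficient:
#   mu_JT = (dT/dP)_H ~ (2a/(R*T) - b) / Cp
# Positive mu_JT: the gas cools on throttling; negative: it warms.
# The sign flips at the inversion temperature T_inv = 2a/(R*b).

R = 8.314        # gas constant, J/(mol*K)
a = 0.1370       # van der Waals a for N2, Pa*m^6/mol^2
b = 3.87e-5      # van der Waals b for N2, m^3/mol
Cp = 7 * R / 2   # ideal diatomic molar heat capacity, J/(mol*K)

def mu_jt(T):
    """Approximate Joule-Thomson coefficient at temperature T, in K/Pa."""
    return (2 * a / (R * T) - b) / Cp

T_inv = 2 * a / (R * b)  # inversion temperature in this approximation

for T in (300.0, T_inv, 1500.0):
    print(f"T = {T:7.1f} K   mu_JT = {mu_jt(T):+.3e} K/Pa")
```

At 300 K the coefficient comes out positive (nitrogen cools when throttled, which is how Linde liquefiers work), while well above the inversion temperature it is negative and the gas warms, matching the excess-work argument above.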