Magnetic entropy vs. magnetic field

1. May 28, 2012

fuzhendong

I measured the heat capacity of a sample under various magnetic fields. When I calculated the magnetic entropy as a function of temperature, I found that the magnetic entropy measured at 0.5 Tesla is higher than that measured at 0 T. As far as I know, a magnetic field will try to align the magnetic moments, causing the magnetic entropy to drop. So why does the magnetic entropy I measured increase as the magnetic field increases?

2. May 29, 2012

M Quack

A very good question.

Usually you get the entropy by integrating C/T, where C is the measured specific heat, from your lowest measured temperature upward. So if there is a phase transition or other funky stuff going on below your lowest measured temperature, you will miss the entropy associated with it. If the 0.5 T field kills that feature, or shifts it to above your lowest measured temperature, it looks as if the in-field state has higher entropy.

The usual way to check is to compare the entropy at high temperature, safely above all phase transitions and other anomalies. Do you get the same value there for all your data sets?
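A minimal sketch of that calculation in Python, in case it helps. The function name and the synthetic C(T) below are made up for illustration; in practice you would feed in the measured specific heat at each field and compare the resulting curves well above any transition.

```python
import numpy as np

def magnetic_entropy(T, C):
    """Entropy relative to the lowest measured point:
    S(T) - S(T_min) = integral of C(T')/T' dT' from T_min to T,
    computed by cumulative trapezoidal integration.

    Note: any entropy released below T_min is missed entirely,
    which is exactly the pitfall discussed above.
    """
    integrand = C / T
    steps = 0.5 * (integrand[1:] + integrand[:-1]) * np.diff(T)
    return np.concatenate(([0.0], np.cumsum(steps)))

# Hypothetical data: a linear C(T) = a*T gives S(T) - S(T_min) = a*(T - T_min)
T = np.linspace(2.0, 300.0, 500)   # temperature grid (K), assumed measured range
C = 0.01 * T                       # made-up specific heat (J / mol K)
S = magnetic_entropy(T, C)
```

If the curves for 0 T and 0.5 T do not converge to the same value at high temperature, you know some entropy is being missed below the lowest measured point in at least one of the data sets.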

BTW, what is the sample, over what temperature range have you measured, etc?

3. Jun 1, 2012