fuzhendong
I measured the heat capacity of a sample under various magnetic fields. When I calculated the magnetic entropy as a function of temperature, I found that the magnetic entropy measured at 0.5 T is higher than that measured at 0 T. As far as I know, a magnetic field tends to align the magnetic moments, which should make the magnetic entropy drop. Why, then, does the magnetic entropy I measured increase with the magnetic field?
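
For context, a minimal sketch of how such an entropy curve is typically computed from heat-capacity data, via S(T) = ∫ C(T')/T' dT'. The array names `T` and `C`, and the assumption that the data start close to T = 0, are mine and not from the original measurement:

```python
import numpy as np

def entropy_from_heat_capacity(T, C):
    """Approximate S(T) = integral of C(T')/T' dT' by the trapezoid rule.

    T : 1-D array of temperatures in K, strictly increasing, all > 0
    C : 1-D array of heat capacities (e.g. J/mol/K) at those temperatures
    Returns S with S[0] = 0, i.e. entropy below T[0] is neglected.
    """
    integrand = C / T
    # Cumulative trapezoid of C/T over each temperature interval
    dS = 0.5 * (integrand[1:] + integrand[:-1]) * np.diff(T)
    return np.concatenate(([0.0], np.cumsum(dS)))

# Hypothetical usage: compare entropy curves measured at two fields
# T, C_0T, C_05T = ...  # data at 0 T and 0.5 T on the same T grid
# delta_S = entropy_from_heat_capacity(T, C_05T) - entropy_from_heat_capacity(T, C_0T)
```

One design choice worth noting: this construction pins S to zero at the lowest measured temperature, so any field-dependent entropy sitting below T[0] is silently dropped, and a comparison between fields is only as reliable as that assumption.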