Magnetic entropy vs. magnetic field

fuzhendong
I measured the heat capacity of a sample under various magnetic fields. When I calculated the magnetic entropy as a function of temperature, I found that the magnetic entropy measured at 0.5 T is higher than that measured at 0 T. As far as I know, the magnetic field will try to align the magnetic moments, causing the magnetic entropy to drop. So why does the magnetic entropy that I measured increase as the magnetic field increases?
 
A very good question.

Usually you get the entropy by integrating the measured specific heat (divided by temperature) from the lowest measured temperature upward. So if there is some phase transition or other funky stuff going on below your lowest measured temperature, you will miss the entropy associated with it. If the 0.5 T field kills that feature, or shifts it to above the lowest measured temperature, it will look as if the in-field state has higher entropy.
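A minimal numerical sketch of that procedure, assuming the data are simply tabulated as temperature and specific-heat columns (the file names in the comments are placeholders, not the actual data sets):

```python
import numpy as np

def entropy_from_cp(T, C):
    """Entropy S(T) from integrating C/T upward from the lowest measured
    temperature (trapezoidal rule).

    T : temperatures in K, ascending; C : specific heat in J/(mol K).
    Any entropy released below T[0] is missed entirely, which is why a
    field that shifts a low-temperature anomaly can make the in-field
    data appear to carry more (or less) entropy.
    """
    integrand = C / T
    dS = 0.5 * (integrand[1:] + integrand[:-1]) * np.diff(T)  # trapezoid areas
    return np.concatenate(([0.0], np.cumsum(dS)))             # S(T[0]) = 0 by construction

# Hypothetical comparison of a zero-field and an in-field data set
# (file names are placeholders):
# T0, C0 = np.loadtxt("cp_0T.dat", unpack=True)
# T5, C5 = np.loadtxt("cp_0p5T.dat", unpack=True)
# S0, S5 = entropy_from_cp(T0, C0), entropy_from_cp(T5, C5)
# print(S0[-1], S5[-1])  # should agree well above all anomalies
```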

The usual way to check is to compare the entropy at high temperature, safely above all phase transitions and other anomalies. Do you get the same value for all of your data sets?

BTW, what is the sample, over what temperature range have you measured, etc?
 
Thank you very much for your reply, M. Quack.
The sample is the molecular magnet {V15}. The difficulty with this sample is that the lattice contribution to the specific heat is hard to subtract, since I have no nonmagnetic reference compound. Therefore the magnetic specific heat cannot be obtained precisely at high temperatures.
The measurements were performed from 60 mK to 300 K, and the magnetic entropy was obtained from 60 mK to 8 K. Only at low temperatures (<8 K), where the lattice contribution can be assumed to be proportional to T^3, was I able to fit the total specific heat with a lattice contribution (~T^3) and a magnetic contribution (two Schottky anomalies). Indeed, part of the Schottky anomaly lies below 60 mK. Although the data were fitted quite well above 60 mK and the magnetic entropy was calculated from the fitted curve starting at zero temperature, some of the magnetic contribution may not be well reproduced below 60 mK.
Now I am considering doing another measurement with a better-calibrated puck to rule out instrumental effects. If it is not an instrumental problem, then I guess some of the magnetic specific heat is shifted to higher temperature under higher fields, which makes this problem unsolvable unless a nonmagnetic reference is available.
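For what it's worth, a fit of the kind described above (lattice ~T^3 plus Schottky terms) can be sketched as follows; the file name, the starting values, and the assumption of simple two-level Schottky anomalies are illustrative only, not the actual analysis:

```python
import numpy as np
from scipy.optimize import curve_fit

R = 8.314  # gas constant, J/(mol K)

def schottky(T, n, delta):
    """Two-level Schottky anomaly: C = n*R*(delta/T)^2 * e^x / (1+e^x)^2,
    rewritten with cosh for numerical safety; delta is the splitting in K."""
    x = delta / T
    return n * R * x**2 * 0.25 / np.cosh(0.5 * x)**2

def c_total(T, beta, n1, d1, n2, d2):
    """Lattice term beta*T^3 plus two Schottky contributions."""
    return beta * T**3 + schottky(T, n1, d1) + schottky(T, n2, d2)

# Hypothetical usage on data below 8 K (placeholder file name and guesses):
# T, C = np.loadtxt("cp_lowT.dat", unpack=True)
# mask = T < 8.0
# p0 = [1e-4, 1.0, 0.1, 1.0, 3.0]  # beta, n1, delta1, n2, delta2
# popt, _ = curve_fit(c_total, T[mask], C[mask], p0=p0)
# C_mag = C[mask] - popt[0] * T[mask]**3  # subtract the fitted lattice term
```

The magnetic entropy can then be obtained by integrating the fitted magnetic part of C/T from zero temperature, and the high-temperature limits compared between fields as suggested above.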
