Error in bandgap measurements of n-type Ge

joel.martens
In an experiment to measure the bandgap of n-type Ge using the 4-probe method, I heated the sample and measured the voltage across the probes as it cooled. To keep the current constant while voltages were recorded at temperature intervals, I had to switch the instrument regularly between its milliammeter and millivoltmeter settings. The act of switching to the ammeter and back caused large anomalies in the recorded voltages: specifically, the readings dropped considerably below the ones taken just before the switch. The voltage should steadily increase as the temperature drops gradually from around 170 degrees to room temperature, levelling off to nearly constant once extrinsic conduction begins to dominate.
Can anyone explain this fault? I was told it may be that the millivoltmeter's impedance doesn't match the milliammeter's impedance, but I'm not familiar enough with impedance, or with how it relates to the operation of the millivoltmeter/milliammeter, to understand how this would cause the error.
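For reference, in the intrinsic regime the conductivity goes roughly as exp(-Eg / 2kBT), so at constant current the probe voltage rises as the sample cools, and Eg comes from the slope of ln V against 1/T. Below is a minimal sketch of that fit; the function name and the example numbers are placeholders, not values from my run:

```python
import numpy as np

K_B = 8.617e-5  # Boltzmann constant, eV/K

def bandgap_from_iv(temperatures_c, voltages_v):
    """Estimate Eg from constant-current 4-probe data in the intrinsic regime.

    There V ~ exp(Eg / (2 kB T)), so ln V vs 1/T is a straight line
    with slope Eg / (2 kB).
    """
    t_kelvin = np.asarray(temperatures_c) + 273.15
    ln_v = np.log(np.asarray(voltages_v))
    slope, _intercept = np.polyfit(1.0 / t_kelvin, ln_v, 1)
    return 2.0 * K_B * slope  # Eg in eV

# Illustrative call only (made-up readings):
# eg = bandgap_from_iv([160, 150, 140, 130], [0.42, 0.47, 0.53, 0.61])
```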
 
Hi Joel-
Use a constant-current regulator for your power source, and two meters, one for current and one for voltage. Make sure the voltmeter is not measuring the voltage drop across the milliammeter. Correct the milliamp reading for the current drawn by the voltmeter. Lastly, understand the instruments you are using.
Bob S
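To illustrate the last correction Bob S mentions: if the voltmeter sits in parallel with the voltage probes, it draws V divided by its input resistance, and that draw should be subtracted from the ammeter reading. A minimal sketch, assuming a parallel voltmeter and a placeholder 10 MΩ input resistance (not values from the actual instruments):

```python
def sample_current(i_ammeter_ma, v_probes_mv, r_voltmeter_ohm=10e6):
    """Correct the milliammeter reading for the current drawn by the voltmeter.

    Assumes the voltmeter is in parallel with the voltage probes and has the
    stated input resistance (10 Mohm is only a placeholder).
    """
    i_voltmeter_ma = (v_probes_mv * 1e-3) / r_voltmeter_ohm * 1e3  # convert to mA
    return i_ammeter_ma - i_voltmeter_ma

# e.g. sample_current(5.0, 120.0) -> current actually through the sample, in mA
```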
 