Every year we do an experiment to find the resistivity of Nichrome wire, and every year the result is the same: about 5 x 10^-7 Ω·m instead of the expected 1 x 10^-6 Ω·m. For the life of me I haven't been able to track down why it's a factor of 2 off.

The setup: we use a Wheatstone bridge with a 1 m length of Nichrome wire stretched along a meter stick. A Heathkit power supply (model SP-2710, IP-2711, or SP-2720) feeds current through a Fluke 75 multimeter set up as an ammeter; a patch cord from the meter's COM terminal is clamped via an alligator clip to a sliding contact that moves along the meter stick, and the supply's negative terminal connects to the plug-in at the zero end of the wire. A second Fluke 75 serves as the voltmeter, with its patch cords plugged into the cords just described. The students start the slider at the 5 cm mark and work out to the 80 cm mark in 5 cm increments, measuring the voltage at each position with a constant current of 0.5 A.

The instructions say the wire's diameter is about 0.5 mm; I got 0.515 mm when I checked it with a micrometer, so that's not the problem. If I measure the resistance of the wire directly with a multimeter, I get about 2 ohms, which is exactly what the students get in the first part of the experiment, where they use a 75 cm length and measure the corresponding voltages for currents from 0.05 A to 0.5 A in 0.05 A increments. I also checked the patch cords and found they do not add any appreciable resistance to the circuit (they all measured 0 ohms on the multimeter when connected together).

I'm out of ideas as to what else to check to track down the discrepancy. Any suggestions?
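For reference, here is a quick sanity check of the arithmetic from the numbers quoted above, using ρ = RA/L (the assumption here is that the ~2 Ω reading applies to the full 1 m of wire; the variable names are mine):

```python
import math

# Numbers from the post (assumed: ~2 ohm measured over the full 1 m of wire)
R = 2.0          # measured resistance, ohms
d = 0.515e-3     # measured wire diameter, m
L = 1.0          # wire length, m

A = math.pi * d**2 / 4   # cross-sectional area, m^2
rho = R * A / L          # resistivity, ohm*m
print(f"rho = {rho:.2e} ohm*m")  # prints rho = 4.17e-07 ohm*m
```

So the ~5 x 10^-7 Ω·m result follows directly from R ≈ 2 Ω and d = 0.515 mm; the area calculation itself can't be the source of the factor of 2.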