Different instruments can report different potentials for the same wire, because every real meter has a finite input impedance and draws some current from the point it measures. Real wires also have non-zero resistance, so when current flows the voltage drops steadily along their length; over long runs the signal must therefore be re-amplified at intervals. The current through a single wire is the same everywhere along it, but the voltage is not, so both where you measure and what you measure with affect the reading. A high-impedance digital voltmeter draws almost no current and so reads close to the undisturbed (open-circuit) potential, whereas a lower-impedance analog meter loads the circuit and reads low. Understanding these loading effects is essential for accurate electrical measurements in practice.
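
A minimal numeric sketch of the two effects, assuming illustrative values throughout: the copper resistivity, wire dimensions, and the source and meter impedances below are hypothetical examples, not figures from the text.

```python
# Two effects from the discussion above: (1) resistive voltage drop along a
# wire carrying current, and (2) a meter's input impedance forming a voltage
# divider with the source resistance (the "loading" effect).
# All component values are illustrative assumptions.

RHO_COPPER = 1.68e-8  # resistivity of copper, ohm-metres (standard value)

def wire_drop(current_a: float, length_m: float, area_m2: float,
              rho: float = RHO_COPPER) -> float:
    """Voltage lost along a wire carrying a steady current: V = I * R,
    with R = rho * L / A."""
    resistance = rho * length_m / area_m2
    return current_a * resistance

def meter_reading(v_open: float, r_source: float, r_meter: float) -> float:
    """Voltage a real meter displays: its input impedance divides the
    open-circuit voltage against the source resistance."""
    return v_open * r_meter / (r_source + r_meter)

if __name__ == "__main__":
    # 100 m of 1.5 mm^2 copper carrying 10 A loses several volts,
    # which is why long runs need thicker wire or re-amplification.
    print(f"drop over 100 m: {wire_drop(10, 100, 1.5e-6):.2f} V")

    # A 5 V node behind 100 kilohm of source resistance: a 10 megohm
    # digital voltmeter barely disturbs it, while a 200 kilohm analog
    # meter pulls the reading well below the true potential.
    print(f"digital meter reads: {meter_reading(5, 100e3, 10e6):.3f} V")
    print(f"analog meter reads:  {meter_reading(5, 100e3, 200e3):.3f} V")
```

Running the sketch, the digital meter reads about 4.950 V while the analog meter reads about 3.333 V from the same node, which is the discrepancy between instruments described above.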