When people say 'signal' they usually mean 'voltage signal' -- i.e. the potential present on a wire. If you apply a battery (referenced to ground) to one end of a very long wire and measure the potential at the other, you'll see that the potential decreases along the wire. Ideal wires have zero resistance, and thus the potential is the same everywhere along their length -- but real wires have small but non-zero resistance. Thus, the voltage at the far end of a wire gets smaller as the wire gets longer. There are a couple of solutions: use amplifiers, or use current signals instead. Current is the same everywhere in a wire, even though voltage is not. If you pump 10 mA of current into a wire at one end, you can rest assured that 10 mA must be coming out the other end.
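Here is a minimal numeric sketch of that trade-off. The 10 mA figure comes from the text above; the wire resistance, length, source voltage, and load resistance are assumed values picked purely for illustration:

```python
# Sketch: voltage signalling vs. current signalling over a resistive wire.
# All component values below are assumptions for illustration only.

WIRE_OHMS_PER_M = 0.05   # assumed wire resistance per metre
LENGTH_M = 200.0         # assumed wire length
LOAD_OHMS = 1_000.0      # assumed receiver input resistance
V_SOURCE = 5.0           # assumed battery voltage at the sending end

r_wire = WIRE_OHMS_PER_M * LENGTH_M

# Voltage signalling: the wire resistance and the load form a divider,
# so the far end sees less than the full source voltage.
v_received = V_SOURCE * LOAD_OHMS / (LOAD_OHMS + r_wire)
print(f"voltage signal: sent {V_SOURCE} V, received {v_received:.3f} V")

# Current signalling: the same current flows everywhere in the loop.
# The wire resistance only costs the driver extra voltage headroom;
# the 10 mA that goes in is the 10 mA that comes out.
I_SIGNAL = 0.010  # 10 mA, as in the text
v_headroom = I_SIGNAL * (r_wire + LOAD_OHMS)
print(f"current signal: {I_SIGNAL * 1000:.0f} mA in, "
      f"{I_SIGNAL * 1000:.0f} mA out; driver supplies {v_headroom:.2f} V")
```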
Would not the POTENTIAL of the wire be constant, as long as there IS NO CURRENT? This would mean that the potential measurement would VARY with the instrument used. If you measured the voltage with respect to ground of your long wire (any wire, for that matter!), you would get a different result with a cheap analog meter than with a high-impedance digital voltmeter. If you could measure the potential of the wire with no current draw, you would measure the source voltage at any point on the wire.
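A quick sketch of the loading effect being described: the meter's own input impedance forms a divider with the wire's series resistance, so the reading depends on the meter. The specific figures here (a 1 kohm total wire resistance, a 200 kohm analog meter, a 10 Mohm DVM) are assumptions, not values from the post:

```python
# Sketch: meter loading on a long resistive wire.
# All resistances below are assumed values for illustration.

V_SOURCE = 5.0    # assumed battery voltage at the near end
R_WIRE = 1_000.0  # assumed total resistance of a very long, thin wire

def reading(meter_ohms: float) -> float:
    """Voltage indicated by a meter of the given input impedance,
    connected at the far end of the wire (simple voltage divider)."""
    return V_SOURCE * meter_ohms / (meter_ohms + R_WIRE)

for name, z in [("cheap analog meter (200 kohm)", 200e3),
                ("high-impedance DVM (10 Mohm)", 10e6)]:
    print(f"{name}: {reading(z):.4f} V")

# As the meter impedance tends to infinity, the current draw tends to
# zero, the divider vanishes, and the reading approaches the source
# voltage at any point along the wire -- the commenter's point.
```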