Total Current Draw Of LED And Resistor Combined

A resistor that gets hot does dissipate power, but the current through the LED is unchanged.
You seem to be using the terms power and current as if they were the same thing.

LEDs are semiconductor devices that draw almost no current until a certain voltage appears across them; beyond that point they can draw enough current to destroy themselves.

So it is necessary to put a resistor in series with them to limit the current they can draw. In your case, this was 25 mA but we settled on a resistor that would let the LED draw a bit less than this.
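As a rough sketch of that sizing step (using the 12 volt supply and the roughly 3.5 volt LED forward drop that come up later in the thread; treat the numbers as assumptions):

```python
# Series resistor sizing for one LED: a rough sketch, not a design spec.
# Assumed figures: 12 V supply, ~3.5 V LED forward drop, 20 mA target
# (a little under the 25 mA maximum mentioned above).
supply_v = 12.0
led_vf = 3.5
target_i = 0.020  # amps

# The resistor has to drop whatever voltage the LED does not.
resistor_ohms = (supply_v - led_vf) / target_i
print(f"{resistor_ohms:.0f} ohms")  # 425 ohms; round up to a standard value
```

In practice you would pick the next standard resistor value at or above the calculated one, which drops the current slightly below the target.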

Sometimes we are stuck with a particular supply voltage, and that fixes how much power has to be dissipated in the resistor.
The resistor is there to protect the LED, not to rob power from it.

You have some choices with your setup. I would have two wires going the full distance down your path and then take a resistor and a LED off at each point where you need a LED.

At each point you would have 20 mA flowing from the 12 volt line, so there would be 0.24 watts being used. (12volts * 0.02 amps = 0.24 watts)
0.07 watts of this would go to the LED, so it isn't very efficient, but there isn't much power involved, so it probably doesn't matter.
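The arithmetic in that paragraph can be laid out as a small sketch (the 3.5 volt LED drop used here is the figure explained further down the thread; treat it as an assumption):

```python
# Power budget at each LED tap point.
# Assumed: 12 V supply, 20 mA per LED, ~3.5 V LED forward drop.
supply_v, led_vf, i = 12.0, 3.5, 0.020

total_w = supply_v * i        # power drawn from the 12 V line: 0.24 W
led_w = led_vf * i            # power reaching the LED itself: 0.07 W
resistor_w = total_w - led_w  # the rest is heat in the resistor: 0.17 W
print(f"total {total_w:.2f} W, LED {led_w:.2f} W, resistor {resistor_w:.2f} W")
```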

One possible choice would be that you could put up to 3 LEDs in series for each (smaller) resistor.
It may be obvious how you would do this, but I will describe it if you like.
This would be more efficient, but the wiring would be messier.
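For what it is worth, the series arrangement works out roughly like this (a sketch with the same assumed figures: 12 volt supply, about 3.5 volts per LED, 20 mA):

```python
# Three LEDs in series sharing one smaller resistor: an illustrative sketch.
# Assumed: 12 V supply, ~3.5 V per LED, 20 mA.
supply_v, led_vf, i, n_leds = 12.0, 3.5, 0.020, 3

chain_drop = n_leds * led_vf                 # 10.5 V across the three LEDs
resistor_ohms = (supply_v - chain_drop) / i  # only 1.5 V left over: 75 ohms
led_w = chain_drop * i                       # 0.21 W reaching the LEDs
total_w = supply_v * i                       # still 0.24 W from the supply
efficiency = led_w / total_w                 # ~88 %, versus ~29 % for one LED per resistor
print(f"{resistor_ohms:.0f} ohms, efficiency {efficiency:.0%}")
```

Most of the supply voltage now drops across the LEDs instead of the resistor, which is where the efficiency gain comes from.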

Another choice that may suit you is to get Christmas lights, which are already designed for your mains voltage, and just string them along your driveway. You would then only need a safe way to get mains power to the string.

 Quote by vk6kro At each point you would have 20 mA flowing from the 12 volt line, so there would be 0.24 watts being used. (12volts * 0.02 amps = 0.24 watts) 0.07 watts of this would be going to the LED, so it isn't very efficient, but there isn't a lot of power being used so it probably doesn't matter.
I like watts and amps, but I was wondering how you came up with 0.07 watts going to the LED; that is what I want to know. Those numbers work out to about a 70% loss, if that is correct?

John

 Quote by John1397 I like watts and amps, but I was wondering how you came up with 0.07 watts going to the LED; that is what I want to know. Those numbers work out to about a 70% loss, if that is correct? John
3.5 volts * 0.02 amps = 0.07 watts.

Yes, roughly 71 % of the power (0.17 watts out of 0.24 watts) is lost in the resistor. Some of this inefficiency is unavoidable if you have to use a 12 volt supply.

If you had a 6 volt supply, the total power used would be 6 volts * 0.02 amps or 0.12 watts.
So the efficiency would be about 58 %. (0.07 watts / 0.12 watts * 100 = 58.333 % efficient.)
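That comparison can be checked with a couple of lines (same assumed figures: 3.5 volt LED drop, 20 mA):

```python
# Efficiency of one LED plus resistor at two supply voltages.
# Assumed: ~3.5 V LED forward drop, 20 mA.
led_vf, i = 3.5, 0.020

for supply_v in (12.0, 6.0):
    # The current cancels out: efficiency is just led_vf / supply_v.
    eff = (led_vf * i) / (supply_v * i) * 100
    print(f"{supply_v:.0f} V supply: {eff:.1f} % efficient")
```

At 12 volts this gives about 29 % (matching the roughly 70 % loss discussed above); at 6 volts it gives the 58.3 % figure.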

These losses are trivial here, but larger LEDs can draw currents of up to an amp, and there are highly efficient regulators that can limit the current to safe levels without wasting much power.
