sailmike said:
I think I understand: I can't make the LEDs 'eat' more current than they want for the given voltage?
That's right. If you repeat the "LED and battery" simulation for more voltages between 0 and 12V, you can plot a graph of current against voltage. It won't be a straight line, but more voltage will always give more current. Whatever circuit you use, the LED will always be operating at some point along that graph.
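Here's a rough sketch in Python of that idea, sweeping a made-up LED model from 0 to 12V and printing the current at each step. The model (an exponential diode plus some internal series resistance) and every parameter value in it are illustrative guesses, not numbers from your data sheet, so the shape of the curve is the point, not the exact figures.

```python
# Rough sketch of the "plot current against voltage" idea. The LED is
# modelled as an exponential diode with some internal series resistance.
# ALL parameter values here are made-up illustration numbers, not figures
# from the data sheet being discussed.
import math

I_S = 1e-12   # diode saturation current, amps (hypothetical)
N_VT = 0.33   # emission coefficient x thermal voltage, volts (hypothetical)
R_S = 20.0    # internal series resistance, ohms (hypothetical)

def led_current(v_applied):
    """Current through the model LED at a given applied voltage,
    found by bisection (current rises, never falls, with voltage)."""
    if v_applied <= 0:
        return 0.0
    lo, hi = 0.0, v_applied / R_S   # the answer must lie between these
    for _ in range(60):
        i = (lo + hi) / 2
        v_model = N_VT * math.log(i / I_S + 1) + i * R_S
        if v_model > v_applied:
            hi = i
        else:
            lo = i
    return (lo + hi) / 2

# Sweep the "battery" from 0 to 12 V: the curve is not a straight line,
# but more voltage always gives more current.
for v in range(0, 13):
    print(f"{v:2d} V -> {led_current(v) * 1000:7.2f} mA")
```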
sailmike said:
I've attached a simulation with the power source at 12V and R2 at 5 ohms. The data sheet gives 12 volts as the maximum voltage for 115mA, so is 12 volts too much?
The data sheet says 12V maximum, 10.7V typical, to get 115mA. Unless you were unlucky and bought a component that was right on the margin of meeting the specification, 12V would give you a current somewhat higher than 115mA (but probably not high enough to damage the LED).
Looking at your results, the voltage at the "bottom" end of the LED is 1.228V, so the voltage across the LED is 12 - 1.228 = 10.772V. Your simulated operating point of 110mA at 10.77V is pretty close to the "typical" specification of 115mA at 10.7V.
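Just to put that arithmetic in one place (the numbers here are simply the ones from your simulation run):

```python
# The operating point implied by the simulation results.
supply = 12.0        # supply voltage, volts
node_bottom = 1.228  # voltage at the "bottom" end of the LED, volts
i_led = 0.110        # simulated LED current, amps

v_led = supply - node_bottom
print(f"Voltage across the LED: {v_led:.3f} V")   # 10.772 V
print(f"Operating point: {i_led * 1000:.0f} mA at {v_led:.2f} V "
      f"(data sheet typical: 115 mA at 10.7 V)")
```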
If you increase the supply from 12V to 13V, the voltage at the bottom of the LED will also increase by about 1V, to about 2.2V. The voltage across the LED, and the current through it, will stay about the same. That's how the circuit is supposed to work.
On the other hand, if you reduce the supply voltage too much, the circuit can't create voltage from nowhere to boost the voltage across the LED back up to 10.7V, so the current will be lower (as in your first simulation).
Try it with a supply of 10, 11, 12, 13, 14V (leave R2 at 5 ohms) and see what happens to the current.
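Here's a very rough sketch of what I'd expect that sweep to show. I don't know exactly how the current-limiting part of your circuit is wired, so this just treats Q2 and R2 as an ideal current sink that holds about 110mA provided it has roughly 1.2V to work with (the voltage your simulation shows at the bottom of the LED); the 10.77V LED drop is also taken from your 12V run.

```python
# Idealised guess at the supply sweep: the Q2/R2 "current sink" regulates
# at ~110 mA whenever enough voltage is left over after the LED drop.
# All three constants come from (or are guessed from) the 12 V simulation.
I_SET = 0.110      # regulated current, amps (from the simulation)
V_LED = 10.77      # LED forward drop at that current, volts (from the simulation)
V_HEADROOM = 1.2   # minimum voltage the sink needs, volts (assumed)

for supply in [10, 11, 12, 13, 14]:
    leftover = supply - V_LED               # voltage left for Q2 and R2
    if leftover >= V_HEADROOM:
        print(f"{supply} V: regulating, about {I_SET * 1000:.0f} mA, "
              f"{leftover:.2f} V across Q2 and R2")
    else:
        print(f"{supply} V: only {leftover:.2f} V left for the sink, "
              f"so the current falls below {I_SET * 1000:.0f} mA")
```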
You could increase the supply voltage a lot higher than 12V without damaging the LED, but that would just waste energy making Q2 hot, so it isn't a very useful thing to do in real life.
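To put a number on the "making Q2 hot" part: if Q2 and R2 carry the same current as the LED (which is how your circuit appears to be arranged, though I'm guessing at the exact wiring), almost all of the extra supply voltage lands on Q2 and turns into heat.

```python
# How much power a higher supply dumps into Q2, assuming Q2, R2 and the LED
# are in series. LED drop and current are the 12 V simulation values.
I_LED = 0.110   # amps
V_LED = 10.77   # volts
R2 = 5.0        # ohms

for supply in [12, 15, 18, 24]:
    v_r2 = I_LED * R2                   # about 0.55 V across R2
    v_q2 = supply - V_LED - v_r2        # everything else lands on Q2
    p_q2 = v_q2 * I_LED
    print(f"{supply} V supply: {v_q2:5.2f} V across Q2, "
          f"about {p_q2 * 1000:4.0f} mW heating Q2")
```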