This is hard to put into words clearly, so bear with me. You always hear that setting your thermostat higher uses more energy. My question is: why?

Let's assume we have two houses of the same size, type, surface area, and insulation (and alike in every other way I'm not thinking of). In one house we set the thermostat to 78°F, and in the other we set it to 68°F. Also assume a swing of 2°F, so the 68° house will drop to, say, 66° before the furnace kicks on, warm up to 70°, drop back down to 66°, and so on. Just for kicks, let's say it's 50°F outside (because it is right now).

Beyond the initial expenditure, meaning the energy required to heat the first house up the extra 10°, what extra energy (besides maintenance) would be used? The way I'm seeing it (which is probably wrong), it shouldn't take any more energy after that point (besides maintenance, of course). Neither of the two houses should lose heat any faster than the other. If that's the case, the only "waste" would be the additional energy required to bring the first house up to 78° in the first place, right?

Am I missing any info here? Or is the post clear and I'm not? Hit me back with anything I'm missing.
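
To make the scenario concrete, here's a minimal toy simulation of the two houses. It assumes heat loss per hour is proportional to the indoor/outdoor temperature difference (Newton's law of cooling, a common simplification), and every number in it, the thermal mass, loss coefficient, and furnace output, is invented purely for illustration; treat it as a sketch of the setup, not a real model of either house.

```python
# Toy thermostat simulation: two identical houses, different setpoints.
# Assumption: heat loss per hour is proportional to (indoor - outdoor) temperature.
# All constants below are made-up illustrative values, not real house data.

def simulate(setpoint_f, outside_f=50.0, swing_f=2.0, hours=24.0):
    """Return the heat energy (arbitrary units) the furnace delivers over `hours`."""
    thermal_mass = 5.0    # energy needed to raise the whole house by 1 deg F (assumed)
    loss_coeff = 0.2      # energy lost per hour per deg F of indoor/outdoor gap (assumed)
    furnace_power = 20.0  # energy the furnace delivers per hour while running (assumed)
    dt = 0.01             # time step in hours

    indoor = outside_f    # start cold, at the outdoor temperature
    furnace_on = False
    energy_used = 0.0

    for _ in range(int(hours / dt)):
        # Thermostat with a +/- swing around the setpoint (66-70 for a 68 setting)
        if indoor <= setpoint_f - swing_f:
            furnace_on = True
        elif indoor >= setpoint_f + swing_f:
            furnace_on = False

        heat_in = furnace_power * dt if furnace_on else 0.0
        heat_out = loss_coeff * (indoor - outside_f) * dt  # proportional-loss assumption

        indoor += (heat_in - heat_out) / thermal_mass
        energy_used += heat_in

    return energy_used

if __name__ == "__main__":
    for setpoint in (68.0, 78.0):
        print(f"setpoint {setpoint:.0f}F -> {simulate(setpoint):.1f} energy units in 24 h")
```

In this sketch, any difference between the two totals beyond the one-time warm-up cost comes entirely from the proportional-loss assumption; if that assumption is wrong and both houses really do lose heat at the same rate, the simulation would show no ongoing difference at all.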