Recently I've been in an online discussion with someone who claimed that a computer's energy efficiency is not a linear relationship. I assumed that if you were to halve the power going into a chip (assuming it was designed to handle that and wouldn't just turn off), it would logically work at only half capacity. They claimed there are fundamental reasons why this isn't the case, and that the actual relationship is computation rate (in bits/sec) ∝ power^(3/4). In other words, if you drop the power to 50%, the computation rate only drops to about 59%.

I can't get my head around this, and unfortunately they heard it from someone else (whom I'm reasonably certain is a credible source, but who is difficult for me to contact). I've done a lot of searching online and can't find any details on this. Is it true? Is there any relationship like this? Thanks!
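For what it's worth, the arithmetic in the claim does check out, even if I can't verify the physics behind it. A quick sanity check (assuming the rate ∝ power^(3/4) rule, which is exactly the claim I'm asking about):

```python
def relative_rate(power_fraction: float, exponent: float = 0.75) -> float:
    """Relative computation rate if rate scales as power**exponent.

    This only verifies the arithmetic of the claimed 3/4-power
    scaling; it says nothing about whether the scaling is real.
    """
    return power_fraction ** exponent

# Halving the power: 0.5**0.75 ~= 0.5946, i.e. ~59% of the rate
print(f"{relative_rate(0.5):.4f}")
```

So the "50% power gives 59% of the rate" figure is just 0.5^(3/4) ≈ 0.5946; my question is about where the 3/4 exponent itself comes from.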