I've often read posts where conservation of energy is questioned in connection with redshift due to expansion. The question arises because a photon's energy is directly proportional to the light's frequency, which suggests that photons "lose" energy as the frequency drops. The answer I've most often read on PF is that conservation of energy only applies locally, or simply doesn't apply to expansion.

I'd like to approach this from a different perspective. Consider a distant light source that oscillates one second on and one second off. Since it's a thought experiment, why not make it monochromatic: 800 THz, in the low UV. Now imagine that the source was at the appropriate distance at the appropriate time in the past so that its light arrives at us today with a 100% redshift (z = 1). The arriving frequency is now 400 THz, so when we detect a single photon it will carry half the energy that a single photon would have carried near the 800 THz source. However, each original one-second pulse now arrives over a period of two seconds: energy is arriving for twice as long. If we measured the total energy of each two-second pulse arriving on an appropriately sized surface, would it not equal the energy that would arrive in a one-second pulse if expansion didn't occur?

The thought experiment isn't perfect, because an "appropriately sized" detector for each setup would be a bit outrageous given the spherical nature of the propagation. I could have proposed an even more outrageous Dyson-sphere-style detector to capture and measure each entire pulse, but in any case I hope the spirit of the question is clear: I want to learn whether energy may actually be conserved when everything is considered.
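For concreteness, here's a quick numeric sketch of the figures in the question. It uses only the numbers already stated (an 800 THz source, a 100% redshift, one-second pulses) together with the standard relations E = hf, f_obs = f_emit / (1 + z), and cosmological time dilation stretching each pulse by a factor of 1 + z:

```python
h = 6.62607015e-34  # Planck constant, J*s

f_emit = 800e12  # Hz, emitted frequency (low UV, as in the question)
z = 1.0          # "100% redshift"

f_obs = f_emit / (1 + z)   # observed frequency
E_emit = h * f_emit        # per-photon energy at the source
E_obs = h * f_obs          # per-photon energy at the detector

t_emit = 1.0               # s, duration of each emitted pulse
t_obs = t_emit * (1 + z)   # s, pulse duration as received (time dilation)

print(f"observed frequency:      {f_obs / 1e12:.0f} THz")
print(f"per-photon energy ratio: {E_obs / E_emit:.2f}")
print(f"received pulse duration: {t_obs:.0f} s")
```

This just confirms the arithmetic behind the setup: each photon arrives with half its emitted energy, while each pulse arrives over twice the emitted duration. Whether those two factors actually cancel in the total received energy is exactly the question.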