I read this in the article "Misconceptions about the Big Bang" that was mentioned in another thread:

"For example, when a star explodes as a supernova, it brightens and then dims — a process that takes about two weeks for the type of supernova that astronomers have been using to map out space. During these two weeks, the supernova emits a train of photons. The tired-light hypothesis predicts that these photons lose energy as they propagate but that the observer always sees a train that lasts two weeks."

The article can also be found at http://astronomy.case.edu/heather/us211.07/misconceptions.pdf [Broken].

I have some doubts about such a simple prediction for tired light. From quantum mechanics we know that photons are not like billiard balls: they have phase, so interference effects can take place. Now, if we increase the wavelength and therefore decrease the frequency, the phase differences inside the photon train will change. As a result, some interference effects should appear that change the photon train as a whole, not only the individual photons. Any comments?
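To make the contrast in the quoted passage concrete, here is a minimal numeric sketch of the two predictions. It is my own illustration, not from the article: the function names, the two-week rest-frame duration, and the sample redshifts are assumptions for the example. Expansion stretches the photon train by a factor of (1 + z); tired light leaves its duration unchanged.

```python
# Sketch of the two competing predictions for an observed light-curve duration.
# dt_emit_days = rest-frame duration of the supernova light curve (~2 weeks),
# z = redshift of the supernova. Both names are chosen for this example.

def observed_duration_expansion(dt_emit_days: float, z: float) -> float:
    """Expanding universe: time dilation stretches the train by (1 + z)."""
    return dt_emit_days * (1.0 + z)

def observed_duration_tired_light(dt_emit_days: float, z: float) -> float:
    """Tired light: photons lose energy en route, but the train's
    duration as seen by the observer is unchanged."""
    return dt_emit_days

dt_emit = 14.0  # roughly two weeks in the supernova's rest frame
for z in (0.1, 0.5, 1.0):
    print(f"z={z}: expansion -> {observed_duration_expansion(dt_emit, z):.1f} d, "
          f"tired light -> {observed_duration_tired_light(dt_emit, z):.1f} d")
```

At z = 0.5, for instance, expansion predicts a 21-day observed light curve while tired light predicts the same 14 days; observations of distant Type Ia supernovae are what the article uses to distinguish the two.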