If the amplitude of a light wave increases, does that mean that the intensity of the wave increases, and thus the number of photons? I'm thinking here of the wave/particle duality of light.
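To put some (purely illustrative, assumed) numbers on the question: classically, intensity scales as amplitude squared, and in the photon picture a beam's power divided by the energy per photon (E = hf) gives a photon arrival rate. A quick sketch:

```python
# Illustrative sketch: amplitude -> intensity -> photon count.
h = 6.626e-34   # Planck's constant (J s)
f = 5.0e14      # frequency of green-ish light (Hz), assumed for illustration

power = 1.0e-3  # a 1 mW beam, assumed for illustration
photon_rate = power / (h * f)   # photons per second hitting a detector
print(f"photons per second: {photon_rate:.3e}")

# Doubling the amplitude quadruples the intensity (and power),
# so the photon rate also quadruples -- same photon energy, more photons.
doubled_amplitude_rate = (2 ** 2) * photon_rate
print(f"rate at doubled amplitude: {doubled_amplitude_rate:.3e}")
```

So on this picture, yes: bigger amplitude means more photons per second, not more energetic photons.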
But if you use de Broglie's wave/particle formula on a large object, say a rock, you get momentum = h/wavelength, so a big rock, at the same speed as a little rock, has bigger momentum yet smaller wavelength - yet it's made up of more than one particle.
At the classical physics level, physically big equates to big mass, but at the sub-atomic level, small seems to equate to big mass, i.e. the short-wavelength/big-mass relationship in momentum = h/wavelength. Any ideas why there is this complete contrast?
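The contrast in the two posts above can be made concrete by plugging numbers into de Broglie's formula. The masses and speeds below are assumed purely for illustration:

```python
h = 6.626e-34  # Planck's constant (J s)

def de_broglie_wavelength(mass_kg, speed_m_s):
    """lambda = h / p, with p = m * v (non-relativistic)."""
    return h / (mass_kg * speed_m_s)

# Assumed illustrative values: two rocks at the same speed, and an electron.
small_rock = de_broglie_wavelength(1.0, 10.0)          # ~6.6e-35 m
big_rock   = de_broglie_wavelength(10.0, 10.0)         # ~6.6e-36 m
electron   = de_broglie_wavelength(9.109e-31, 1.0e6)   # ~7.3e-10 m

print(f"small rock: {small_rock:.2e} m")
print(f"big rock:   {big_rock:.2e} m")
print(f"electron:   {electron:.2e} m")
```

The bigger rock's wavelength is ten times shorter, while the electron's wavelength is around the size of an atom - which is why wave behaviour only shows up for very small masses.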
When looking at, say, water waves, long wavelength means high energy, e.g. a tsunami or waves in a rough sea, compared to ripples on a pond. But when looking at photons and electrons and other "matter waves", short wavelength equals high energy. Why is it completely the opposite?
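For the photon side of the comparison, the relation E = hc/wavelength makes the short-wavelength/high-energy trend explicit. A quick check with two assumed visible-light wavelengths:

```python
h = 6.626e-34   # Planck's constant (J s)
c = 2.998e8     # speed of light (m/s)

def photon_energy(wavelength_m):
    """Energy of a single photon: E = h * c / lambda."""
    return h * c / wavelength_m

# Assumed illustrative wavelengths: blue vs red light.
blue = photon_energy(450e-9)
red  = photon_energy(700e-9)

print(f"blue photon: {blue:.3e} J")
print(f"red photon:  {red:.3e} J")
print(f"ratio blue/red: {blue / red:.3f}")
```

Each blue photon carries more energy than each red one, the inverse of the water-wave intuition.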
Like photons, all particles have a wave/particle duality, so when energy is added to an electron, say in a particle accelerator, why does the "amplitude" of the electron wave never increase (say as an increase in the actual number of electrons) - why is it that the energy added always just comes out as a change in frequency/wavelength?
When light is red-shifted due to the expansion of the universe, it loses energy (E = hf). Doesn't the conservation of energy rule apply in this case? Where does all that energy vanish to?
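Just to quantify how much energy a single photon loses: cosmological redshift stretches the wavelength by a factor (1 + z), so by E = hc/wavelength the photon energy drops by the same factor. The emitted wavelength and redshift below are assumed for illustration:

```python
h = 6.626e-34   # Planck's constant (J s)
c = 2.998e8     # speed of light (m/s)

def redshifted_energy(emit_wavelength_m, z):
    """Observed photon energy after cosmological redshift z.
    lambda_obs = (1 + z) * lambda_emit, and E = h * c / lambda."""
    return h * c / ((1 + z) * emit_wavelength_m)

# Assumed: a 500 nm photon emitted at redshift z = 1.
E_emit = h * c / 500e-9
E_obs  = redshifted_energy(500e-9, z=1.0)

print(f"emitted:  {E_emit:.3e} J")
print(f"observed: {E_obs:.3e} J")   # half the emitted energy at z = 1
```

At z = 1 the photon arrives with half the energy it was emitted with, which is exactly the loss the question is asking about.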
Sorry about the double posting. It's the first time I've posted, so I'm not familiar with the protocol.
Does the amplitude get stretched? And therefore the intensity?
Expansion of spacetime stretches wavelengths and produces the red shift. Does it also stretch the amplitude of the wave, and make distant stars look brighter and therefore nearer?