Sunday 29 May 2022

Redshift as ‘tired light’ does not mean light loses energy over distance

There seems to be a flaw in the assumption that the cosmological redshift of galaxies, first observed by Hubble and others in the 1920s, cannot be explained by tired light because there is no explanation of how light ‘loses’ energy over distance. The argument is that a “photon” of light at 100 nm has more energy than one at 200 nm. But this seems to overlook a fundamental point: a light beam with a wavelength range of 100-200 nm, when redshifted to 200-400 nm, still has the same total energy as the emitted rest-frame range, just spread out across a range double that of the original rest-frame emission.
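For reference, the per-photon energies behind that standard argument can be checked in a few lines of Python (a minimal back-of-envelope sketch; the relation E = hc/λ and the constants are standard, the function name is mine):

```python
# Per-photon energy E = h*c / lambda: halving the wavelength doubles
# the energy, which is the per-photon argument summarised above.
H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electronvolt

def photon_energy_ev(wavelength_nm: float) -> float:
    """Energy of one photon of the given wavelength, in eV."""
    return H * C / (wavelength_nm * 1e-9) / EV

for wl in (100.0, 200.0, 400.0):
    print(f"{wl:5.0f} nm -> {photon_energy_ev(wl):5.2f} eV")
# 100 nm -> 12.40 eV, 200 nm -> 6.20 eV, 400 nm -> 3.10 eV
```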

My question is: a source emits energy at a constant rate as EMR across the range 100-200 nm. Will the total energy of that emission measured by an observer over the same observation time frame be the same for the rest-frame beam of 100-200 nm as for the same beam redshifted to 200-400 nm?
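To make the question concrete, here is a toy model of the per-photon bookkeeping that the question pushes against (illustrative assumptions: one observation window is treated as a fixed batch of photons, and tired light simply stretches each wavelength by the redshift factor, here λ → 2λ):

```python
import random

# Toy model: a fixed batch of photons uniform in wavelength over
# 100-200 nm, then a tired-light stretch lambda -> 2*lambda.
H, C = 6.62607015e-34, 2.99792458e8  # J*s, m/s

def total_energy_joules(wavelengths_nm):
    """Sum of per-photon energies E = h*c/lambda over the batch."""
    return sum(H * C / (wl * 1e-9) for wl in wavelengths_nm)

random.seed(0)
rest_frame = [random.uniform(100.0, 200.0) for _ in range(100_000)]
redshifted = [2.0 * wl for wl in rest_frame]  # 100-200 nm -> 200-400 nm

e_rest = total_energy_joules(rest_frame)
e_shifted = total_energy_joules(redshifted)
print(f"ratio: {e_shifted / e_rest:.3f}")  # 0.500 under this accounting
```

Under that per-photon accounting the batch total comes out halved; the assumption below is that the beam-level accounting works differently.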

My assumption is that when the 100-200 nm range gets redshifted to longer wavelengths, the energy is *still conserved*, just spread out across a larger wavelength range. This is contrary to, and would negate, the argument used by Big Bang theorists that a tired-light, non-expanding universe would have to explain how light “loses” energy.
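One way to formalise that assumption numerically (my own illustrative sketch, assuming a flat rest-frame spectrum in arbitrary units and that the redshift halves the spectral energy density while doubling the wavelength range):

```python
# Numeric check of the conservation assumption as stated above:
# stretch lambda -> 2*lambda and halve the spectral energy density,
# so the integral (total energy) over the band is unchanged.
def u_rest(wl_nm: float) -> float:
    """Assumed flat rest-frame spectral energy density on 100-200 nm."""
    return 1.0 if 100.0 <= wl_nm <= 200.0 else 0.0

def u_shifted(wl_nm: float) -> float:
    """Redshifted density under the assumption: u'(l) = u(l/2) / 2."""
    return u_rest(wl_nm / 2.0) / 2.0

def integrate(f, lo, hi, n=100_000):
    """Midpoint-rule integral of f over [lo, hi]."""
    step = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * step) for i in range(n)) * step

print(integrate(u_rest, 100.0, 200.0))     # ~100.0, total in 100-200 nm
print(integrate(u_shifted, 200.0, 400.0))  # ~100.0, same total in 200-400 nm
```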