The standard explanation of the cosmological redshift is that photons emitted by distant galaxies have their wavelengths stretched as they travel through the expanding Universe.

But perhaps the photons do not lose energy as they travel; rather, the atoms in our detectors are more energetic in comparison with the atoms that emitted those photons long ago, leading to an apparent redshift effect?

**Addition** (having had a comment exchange with @rob, see below): My hypothesis is that the Planck mass $M_{pl} \propto a(t)$, where $a(t)$ is the scale factor of the Universe.

**Addition 2**: Of course, if the Planck mass $M_{pl}$ is changing then $G=1/M^2_{pl}$ (in natural units) is changing, so that we no longer have standard GR!
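For reference, restoring the constants, the Planck mass is defined by

$$M_{pl} = \sqrt{\frac{\hbar c}{G}} \quad \Rightarrow \quad G = \frac{\hbar c}{M^2_{pl}},$$

which reduces to $G=1/M^2_{pl}$ in natural units $\hbar=c=1$; the hypothesis $M_{pl} \propto a(t)$ would then imply $G \propto 1/a^2(t)$.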

I've asked this question before (see Cosmological redshift interpretation), but this time I'm including a little theory to back up my hypothesis.

For simplicity, let us assume a flat FRW metric restricted to radial motion:

$$ds^2=-dt^2 + a^2(t)\ dr^2$$

Consider the null geodesic path of a light beam with $ds=0$ so that we have:

$$dt = a(t)\ dr\ \ \ \ \ \ \ \ \ \ \ \ \ \ \ (1)$$

Now at the present time $t_0$ we define the scale factor $a(t_0)=1$ so that we have:

$$dt_0 = dr\ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ (2)$$

Substituting equation (2) into equation (1) we have:

$$dt = a(t)\ dt_0$$

In order for the interval of cosmological time $dt$ to stay constant as the scale factor $a(t)$ increases, the corresponding interval of present time $dt_0$ must vary inversely with the scale factor:

$$dt_0 \propto \frac{1}{a(t)}$$

Thus as cosmological time $t$ increases, and the Universe expands, equal intervals of cosmological time $dt$ correspond to smaller and smaller intervals of present time $dt_0$.
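As a numerical sanity check of the relation $dt_0 \propto 1/a(t)$, one can propagate two successive wave crests along the null geodesic $dr = dt/a(t)$ and compare their arrival times. The sketch below is illustrative only: the matter-dominated scale factor $a(t)=(t/t_0)^{2/3}$ and all the numerical values are my own assumptions, not part of the argument above.

```python
t0 = 1.0                                # present time (arbitrary units)

def a(t):                               # toy matter-dominated scale factor, a(t0) = 1
    return (t / t0) ** (2 / 3)

def comoving_distance(t_start, t_end, n=20_000):
    """Midpoint-rule integral of dt/a(t), i.e. the comoving distance r covered."""
    h = (t_end - t_start) / n
    return sum(h / a(t_start + (i + 0.5) * h) for i in range(n))

t_e, dt_e = 0.1, 1e-6                   # emission time and crest-to-crest interval
r = comoving_distance(t_e, t0)          # distance covered by the first crest

# Bisect for the arrival time of the second crest (same comoving distance r).
lo, hi = t0, 2.0 * t0
for _ in range(50):
    mid = 0.5 * (lo + hi)
    if comoving_distance(t_e + dt_e, mid) < r:
        lo = mid
    else:
        hi = mid

dt_0 = 0.5 * (lo + hi) - t0             # received crest-to-crest interval
print(dt_0 / dt_e, 1.0 / a(t_e))        # both come out ≈ 4.64: dt_0 = dt_e / a(t_e)
```

The printed ratio $dt_0/dt_e$ agrees with $1/a(t_e)$, which is the standard redshift factor $1+z = 1/a(t_e)$; the argument above re-interprets this factor rather than changing it.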

Now the energy of a system is proportional to its oscillation frequency, which in turn is inversely proportional to its oscillation period, identified here with the interval $dt$:

$$E(t) \propto \frac{1}{dt}$$

The corresponding energy of the system in terms of the present epoch $t_0$ is then

$$E(t_0) \propto \frac{1}{dt_0} \propto a(t)$$

Thus an atom at time $t$ is a factor $a(t)$ times more energetic than the same atom at time $t_0$.

As the energy scale is ultimately set by the Planck mass, the Planck mass itself must be increasing as the Universe expands: $M_{pl} \propto a(t)$.

This effect alone would account for the cosmological redshift of distant galaxies, without the assumption that photons travelling from those galaxies lose energy through wavelength expansion.

**Addition 3**: I believe this hypothesis leads to a linear cosmological expansion, $a(t)\propto t$ (see comments below).
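If the expansion really is linear, the Hubble parameter takes the simple form $H = \dot a/a = 1/t$, so the Hubble time $1/H$ equals the age of the Universe at every epoch. A minimal numerical sketch of this consequence (the linear scale factor is the only assumption):

```python
t0 = 1.0                                 # present time (arbitrary units)

def a(t):                                # hypothesised linear expansion a(t) ∝ t
    return t / t0

def hubble(t, h=1e-6):
    """H = ȧ/a via a central-difference derivative of the scale factor."""
    a_dot = (a(t + h) - a(t - h)) / (2 * h)
    return a_dot / a(t)

for t in (0.2, 0.5, 1.0):
    print(t, hubble(t) * t)              # H·t ≈ 1 at every epoch, i.e. H = 1/t
```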

This post imported from StackExchange Physics at 2014-05-04 11:22 (UCT), posted by SE-user John Eastmond