Photon Decay

Don Herbison-Evans,

written 4 December 2015, updated 27 June 2017

ASSUMPTIONS (approximations, from Wikipedia)

c    velocity of light    3.0 × 10^8 metres/sec
H    Hubble constant    2.3 × 10^-18 Hz (70 km/s/Mpc, 1 Mpc = 3.1 × 10^22 metres)
P    photon half-life, ln(2)/H    3.0 × 10^17 secs (9.5 × 10^9 years)
L    Lyman-α radiation frequency    2.5 × 10^15 Hz (122 nm)
M    peak microwave background frequency    1.6 × 10^11 Hz (160.2 GHz)


Occam's Razor may be interpreted as saying:
Case 1. Initially assume any function of a dependent variable is constant unless proved otherwise.
Case 2. If it is not constant, then assume the function is linear unless proved otherwise.
Case 3. If it is neither constant nor linear, then the function is curved. In this case the function may be assumed, arbitrarily, to take any curved form one considers appropriate: quadratic, hyperbolic, Lorentzian, exponential, or whatever. Occam's Razor gives no guidance here.

The simple linear assumption that the Hubble constant can be applied to all distant objects implies that objects at a distance of c/H metres (1/H light-seconds) will be travelling at the speed of light. This has led to the suggestion that there was a Big Bang 1/H seconds ago.
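These figures can be checked with a few lines of arithmetic; a minimal sketch in Python, using the constants from the assumptions above:

```python
# Linear Hubble law: recession velocity v = H * d.
c = 3.0e8        # speed of light, metres/sec
H = 2.3e-18      # Hubble constant, Hz

d_limit = c / H              # distance at which v = c: ~1.3e26 metres
t_bang = 1.0 / H             # naive Big Bang epoch: ~4.3e17 seconds
years = t_bang / 3.15e7      # ~1.4e10 years, the conventional ~14 billion
```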

Subtle modifications of this conclusion have been made, allowing for relativity and the delay in arrival of the photons, but the arguments still depend on Case 2, the simple linear theory. The arguments about the acceleration of the expansion of the universe appear to be about Case 3 in Occam's Razor: how to introduce curvature. Curvature can instead be introduced by reinterpreting the observed red-shift in a different way.


If the observed red-shift of distant galaxies is interpreted not as a Doppler shift but as a decay of each photon's energy and frequency, one arrives at an equivalent interpretation of the simple linear Hubble theory. The decay in a photon's energy over time is linear in this simple theory, so that after 1/H seconds the photons have zero energy. This seems rather abrupt and non-physical. I think this calls for a Case 3 situation.

My suggestion here is to introduce curvature by drawing an analogy with studies of other particles with limited lifetimes, whose decay has been found to correspond well with an exponential function of time. We might simplistically take the half-life of a photon to be ln(2)/H = P seconds. This explains Olbers' paradox, i.e. why most of the sky is black: photons from stars in the black regions have decayed in frequency to below our visible spectrum.
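The exponential-decay assumption can be sketched as follows (Python; the green-light example frequency is my own illustration, not from the source):

```python
import math

H = 2.3e-18              # Hubble constant, Hz
P = math.log(2) / H      # proposed photon half-life: ~3.0e17 seconds

def decayed_frequency(f0, t):
    """Frequency of a photon of initial frequency f0 after time t,
    assuming exponential decay with half-life P."""
    return f0 * 2.0 ** (-t / P)

# Green light (~5.5e14 Hz) has halved in frequency after one half-life,
# dropping it out of the visible band (~4e14 to 8e14 Hz):
f_after = decayed_frequency(5.5e14, P)
```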

Current models of fundamental particles do not easily accommodate the concept of photons decaying. But then they also do not predict the values of many of the fundamental physical dimensionless constants, so these models are in need of some modification.


Olbers' Paradox is wrong, of course. There is a uniformly bright sky, just not in the visible spectrum: the brightness peaks in the microwave region, around 160 GHz.

Conventionally, the microwave radiation is thought to be red-shifted radiation from the Big Bang. Perhaps this was the collision of two half-universe-sized black holes. Presumably, during the infinite life of the universe, black holes inexorably accrete other black holes; eventually there will be only two, and then this collision has to happen.

If instead we assume the microwaves are red-shifted Lyman-α radiation from very distant stars, then this suggests that, while there is no evidence of the traditional Big Bang in the decaying-photon theory, there was a peak in the number of observable stars in the universe at an epoch P·log2(L/M) seconds ago:

P·log2(L/M) = P·log2(1.6 × 10^4)
= 13.9 P = 4.2 × 10^18 secs ≈ 133 billion years ago.
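A quick Python check of this epoch, using the 122 nm Lyman-α frequency of 2.5 × 10^15 Hz from the assumptions:

```python
import math

H = 2.3e-18              # Hubble constant, Hz
P = math.log(2) / H      # photon half-life, seconds
L = 2.5e15               # Lyman-alpha frequency, Hz (122 nm)
M = 1.6e11               # microwave background peak, Hz

halvings = math.log2(L / M)     # number of half-lives: ~13.9
epoch = halvings * P            # ~4.2e18 seconds
years = epoch / 3.15e7          # ~1.3e11 years
```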

The big problem with this theory is that the "Olbers' Microwave Background" in an infinite universe should continue increasing below 160 GHz all the way down to DC, and it doesn't: it peaks at 160 GHz. We have a "Sub-Microwave Catastrophe". Perhaps, as with the Ultra-Violet Catastrophe, there is a quantum effect that truncates emissions at frequencies below 160 GHz.

The obvious solution to "where does the decaying photon energy go?" is into the microwave background.

Maybe instead Hydrogen Lyman-α photons actually emit 160 GHz photons as they travel, like a sort of Cherenkov radiation. A Hydrogen Lyman-α photon at 2.5 × 10^15 Hz, in travelling from the edge of our universe (4.4 × 10^26 metres), would have to emit 1.5 × 10^4 microwave photons before it turns into one itself. That is an average of one photon every 3 × 10^22 metres, though of course the decay should be exponential rather than linear.
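The photon count follows from energy conservation, since Planck's constant cancels out; a short check of the arithmetic:

```python
f0 = 2.5e15      # Lyman-alpha photon frequency, Hz
fm = 1.6e11      # microwave (160 GHz) photon frequency, Hz
R = 4.4e26       # assumed distance to the edge of our universe, metres

# Energy conservation: the energy lost, h*(f0 - fm), is carried away
# as n microwave photons of energy h*fm each (h cancels).
n = (f0 - fm) / fm       # ~1.5e4 photons
spacing = R / n          # ~3e22 metres per emission, as a linear average
```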


Two photons going around each other in a photonic atom, each attracted by the gravitational effect of the energy of the other, will cancel each other out, because each has a wave with a peak where the other has a trough, so together they equal empty space. Conversely, then, every infinitesimal point of space is composed of an infinite number of pairs of photons of all possible frequencies going around each other.

Maybe it is not every infinitesimal point in space. Maybe all pairs with photons of a given wavelength are spaced by that wavelength into a kind of grid of bubbles across the universe. So there is a lowest frequency, c/R = 3.0 × 10^8 / 4.4 × 10^26 = 6.8 × 10^-19 Hz, with just one pair of photons, and we do not have to go down to DC after all. These two photons represent the zero-point energy of the universe: 2hf = 2 × 6.6 × 10^-34 × 6.8 × 10^-19 = 9.0 × 10^-52 joules.
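A sketch of that lowest-pair calculation in Python, using the radius assumed above:

```python
c = 3.0e8        # speed of light, m/s
R = 4.4e26       # radius of the universe, metres (value used above)
h = 6.6e-34      # Planck constant, J*s

f_min = c / R            # lowest bubble frequency: ~6.8e-19 Hz
E_zero = 2 * h * f_min   # energy of the single lowest pair, 2hf: ~9e-52 J
```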

Maybe the Cherenkov-like radiation from the decaying Lyman-α photons consists of the excitation of successively lower-frequency latent bi-photons, until they get down to 6.8 × 10^-19 Hz.