Reading the article “Dark Matter from Scalar Field Fluctuations” made me think about photons. The actual reasoning path isn’t important here.
I see things through a programmer's glasses; call it déformation professionnelle. I sense computational complexity lurking in problems.
Going back to photons: perhaps they don't exist as particles at all; they are only tensors that transfer energy, and they do not exist in between. Consider a photon's own time: a photon never really "comes into existence", because in its own frame the moment it is emitted coincides with the moment it is scattered. There is no proper time between these two events; only from the outside do we perceive the interval stretched in time. Emission and scattering could then work as paired events, with no wave needed in between.
Going back to computational complexity. The speed of light depends on the medium, and it is highest in vacuum. That is rational from a programmer's point of view: when there are no computations to perform, the code just skips to the next iteration at the highest possible pace, close to the processor's clock rate, like a bare "while true" loop. But when there is a medium, depending on its structure and the probability of a scattering event, the computations are slower.
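Purely as a toy illustration of that analogy (not physics; every name, probability, and cost below is made up), a "photon" crossing vacuum cells costs one tick per cell, while medium cells occasionally trigger extra "scattering" computation, so the effective speed drops:

```python
import random

def traverse(cells, scatter_prob=0.3, scatter_cost=4, seed=0):
    """Return total ticks to cross `cells`, each cell being 'vacuum' or 'medium'."""
    rng = random.Random(seed)
    ticks = 0
    for cell in cells:
        ticks += 1  # base tick: one iteration of the "while true" loop
        if cell == "medium" and rng.random() < scatter_prob:
            ticks += scatter_cost  # extra computation for a scattering event
    return ticks

t_vac = traverse(["vacuum"] * 100)  # exactly 100 ticks: nothing to compute
t_med = traverse(["medium"] * 100)  # more ticks: scattering events cost extra
```

Same distance, more ticks in the medium, hence a lower effective speed; that is the whole analogy in miniature.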
Let's consider the double-slit experiment and delayed choice. Perhaps, for complexity reasons, a waveform is cheaper to compute than an exact position, and an observer forces the wave to collapse. And then we have the quantum eraser, which acts contrary to our expectations but is rational, provided the algorithm managing energy transfers is simple enough to run fast and therefore cannot branch.
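A toy sketch of the "wave is cheaper than position" hunch, under heavy assumptions (idealized far-field two-slit formula, made-up geometry, envelopes ignored): the interference pattern is one closed-form superposition with no branching, while adding which-path information drops the cross term and washes the fringes out.

```python
import math

def interference_intensity(x, slit_sep=1.0, wavelength=0.5, screen_dist=10.0):
    """Intensity at screen position x from two coherent slits (far-field toy model)."""
    phase = math.pi * slit_sep * x / (wavelength * screen_dist)
    return math.cos(phase) ** 2  # one formula, no branching

def which_path_intensity(x):
    """With which-path info the two single-slit contributions just add:
    |A1|^2 + |A2|^2 with the interference term dropped (flat in this toy model)."""
    return 0.5 + 0.5

bright = interference_intensity(0.0)  # fringe maximum
dark = interference_intensity(2.5)    # near a fringe minimum
flat = which_path_intensity(2.5)      # no fringes once paths are known
```

Nothing here is a claim about what nature computes; it only shows that one path through the code is a single expression while the other requires tracking extra state per event.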
I wonder whether, by coupling the double slit with a quantum eraser and with different media that slow the pace of energy transfer (photons) and make it computationally intensive, we could find a regime where the calculations diverge even further from expectations and reveal the base tick rate of the simulation, or specifics of its processing.
The above is a random thought; possibly physicists have already gone down this path, or it makes no sense at all.