Hi Alex,
You're certainly working in an interesting area, and one no doubt made more difficult by pushing against mainstream views.
From reading your paper (and drawing on my own academic research experience, albeit limited), the question really comes down to how you're going to build an experiment with inherent, unquestionable timing accuracy to test the proposed theory.
Straight away, the idea of delaying a signal by 100 ms with enough accuracy to measure, and more importantly prove beyond question, a 100 ns shift is a very tall ask. Even in your paper (playing devil's advocate here), I'm immediately led to question how you ensured the delay paths through the two channels of the oscilloscope were calibrated, along with the delay of the attenuator, the stability of the oscilloscope's clock source, etc.
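To put a rough number on it, here's a quick back-of-envelope sketch in Python (the 100 ms and 100 ns figures are from your paper; the 10x margin is purely my assumption of what "beyond question" might demand):

# Back-of-envelope timing budget: resolving a 100 ns shift in a 100 ms delay.
delay = 100e-3   # nominal delay, seconds
shift = 100e-9   # effect to be resolved, seconds

print(f"required fractional accuracy: {shift / delay:.0e}")   # 1e-06, i.e. 1 ppm

# Assumption: to claim the shift "beyond question", the total error budget
# should be perhaps 10x smaller than the effect itself.
margin = 10
budget = shift / margin
print(f"allowable total timing error: {budget * 1e9:.0f} ns "
      f"({budget / delay:.0e} fractional)")                   # 10 ns, 1e-07

In other words, every element in the chain has to hold its delay to something like parts in 10^7 over the measurement, which frames everything below.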
All electrical instruments have clock jitter, aging drift, short-term instability and so on. Analogue devices like those discussed so far will be many orders of magnitude too inaccurate for what you're asking. Even an A/D, digital delay, D/A chain would need to be clocked by some form of precision oscillator (maybe one of the Rb sources in Dave's video), but you'd also have to deal with amplifying the heavily attenuated signal and any delay that introduces, then any effects of the analogue path from the D/A back to the waveguide.
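Just to illustrate why the clock source dominates, here's a small sketch of the time error each class of oscillator would contribute over the 100 ms interval. The stability figures are typical order-of-magnitude values I'm assuming for illustration, not specs of any particular unit:

# Time error accumulated over the delay interval, roughly:
#   error ~ fractional frequency error x interval
# Stability numbers below are my order-of-magnitude assumptions.
tau = 100e-3  # delay interval, seconds

oscillators = {
    "scope internal crystal (~1e-6)": 1e-6,
    "decent OCXO (~1e-10)":           1e-10,
    "Rb standard (~1e-11)":           1e-11,
}

for name, frac_err in oscillators.items():
    print(f"{name}: ~{frac_err * tau * 1e9:g} ns over 100 ms")

# scope crystal: ~100 ns -> the same size as the effect itself
# OCXO:          ~0.01 ns
# Rb:            ~0.001 ns

If those assumed numbers are anywhere near right, a scope's internal timebase alone can eat the entire effect, which is why something like the Rb route seems to be the only way into the right regime, and then the analogue delays around it become the real problem.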
Hope I'm not going off on a tangent here, but I see this as, at heart, a study of timing accuracy across every element of the experimental setup.
Good luck with your paper!