Just go back to basics.
Imagine an ADC with a 0 dBFS sine wave at its input.
Zoom in on the zero crossing of the sine.
If a sample is taken too early or too late, you sample a voltage that is too low or too high.
If you sketch this on a piece of paper, you instantly see the relation between timing jitter and measured signal noise: the amplitude error is the slope of the signal times the timing error, and the slope is steepest right at the zero crossing.
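To put that paper sketch into symbols (nothing here beyond a full-scale sine of amplitude $A$ and frequency $f$):

$$
v(t) = A \sin(2\pi f t), \qquad \frac{dv}{dt} = 2\pi f A \cos(2\pi f t),
$$

and the slope is largest at the zero crossings, where $\cos(2\pi f t) = \pm 1$. A timing error $\Delta t$ there therefore produces, to first order, an amplitude error of

$$
\Delta v \approx 2\pi f A \, \Delta t.
$$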
I do not care if you want to measure those errors in peaks, RMS, or potatoes.
You should be able to do that yourself if you know what your units mean.
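If you would rather let a computer do the paper exercise, here is a minimal Python sketch. The sample rate, tone frequency, and the 1 ns RMS Gaussian jitter are made-up numbers for illustration; swap in your own and pick whichever error statistic you like.

```python
import numpy as np

fs = 48_000.0    # sample rate, Hz (assumed)
f = 1_000.0      # test tone frequency, Hz (assumed)
n = 1 << 16      # number of samples
sigma_t = 1e-9   # 1 ns RMS timing jitter (assumed)

rng = np.random.default_rng(0)

# Ideal sample instants vs. jittered ones
t_ideal = np.arange(n) / fs
t_jitter = t_ideal + rng.normal(0.0, sigma_t, n)

# 0 dBFS sine (amplitude 1) sampled at both sets of instants
ideal = np.sin(2 * np.pi * f * t_ideal)
actual = np.sin(2 * np.pi * f * t_jitter)

# The difference is the jitter-induced error signal;
# report it in peaks or RMS, as you prefer
err = actual - ideal
rms = np.sqrt(np.mean(err**2))
peak = np.max(np.abs(err))

print(f"error RMS:  {rms:.3e}  ({20*np.log10(rms):.1f} dBFS)")
print(f"error peak: {peak:.3e}")

# First-order prediction from the slope argument above:
# RMS error ~= 2*pi*f*A*sigma_t / sqrt(2), with A = 1
print(f"predicted RMS: {2*np.pi*f*sigma_t/np.sqrt(2):.3e}")
```

The measured RMS should land right on the prediction, which is the whole point: the noise floor falls directly out of the slope-times-jitter argument, no matter which units you dress it up in.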