Also, just in case I'm missing something... what IS the recommended way to measure the time interval between two arbitrary points in a waveform, as accurately as possible?
Cursors are not the tool of choice for accurate measurements – they are largely a thing from the past when DSOs were not powerful enough to do proper waveform analysis but had to resort to tools from the analog scope era instead. Nowadays I would only use them to illustrate measurements, but let the DSO calculate the actual results.
It doesn’t make much sense to demand an accurate time interval measurement between truly arbitrary points – such as two arbitrary positions on a flat region of the waveform. In practice, it comes down to measuring the interval between two transitions.
If it is about a single trace, then we have the Time measurements, like Period and positive/negative Pulse Width.
If we want to measure the interval between two different traces, there are the Delay measurements (Time 1-4 in upper-class instruments with the latest firmware), which cover the interval from the first/last rising/falling edge in one channel to the first/last rising/falling edge in another channel.
If first/last of the whole captured waveform does not cover the relevant interval and changing the timebase doesn’t help, then we can always use the Measurement Gate (or Analysis Gate in case of the SDS3000X HD) to precisely define the measurement.
The big advantage of automated measurements is that we don’t need to zoom in as long as the measurement resolution given by the sample rate is sufficient. With only a single channel active on an SDS800X HD, the sample rate is 2 GSa/s and the time resolution is 500 ps. Measurements can make use of that up to 10 Mpts record length, hence even at a slow timebase of 500 µs/div, we can still measure time intervals with 500 ps tolerance in the main window.
Likewise, with two channels active in half channel-mode on an SDS3000X HD, the sample rate is 4 GSa/s, the time resolution 250 ps and measurements can use up to 40 Mpts. Once again we can go as slow as 1 ms/div and still measure with only 250 ps tolerance.
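The arithmetic behind those figures is simple enough to check – a quick sketch (my own helper, not anything built into the scope):

```python
def max_timebase_per_div(sample_rate_hz, record_len_pts, divisions=10):
    # Longest timebase (s/div) at which a full-rate record still spans
    # the whole screen: record duration spread over the horizontal divisions.
    return record_len_pts / sample_rate_hz / divisions

# SDS800X HD, single channel: 2 GSa/s, 10 Mpts usable for measurements
print(1 / 2e9)                          # sample interval: 500 ps resolution
print(max_timebase_per_div(2e9, 10e6))  # 5e-4 s/div, i.e. 500 us/div

# SDS3000X HD, half-channel mode: 4 GSa/s, 40 Mpts
print(1 / 4e9)                          # 250 ps resolution
print(max_timebase_per_div(4e9, 40e6))  # 1e-3 s/div, i.e. 1 ms/div
```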
If we demand even greater resolution, there are two options. We can restrict ourselves to short time intervals and record lengths below the screen width – <50 ns/div on an SDS800X HD and <20 ns/div on an SDS3000X HD – in order to force internal interpolation for better time resolution; this limits the maximum measurable interval to 200 ns for the SDS800X HD and 100 ns for the SDS3000X HD. Or, better, we use the advanced math to create an interpolated copy of the original waveform and measure that instead. This way we can get up to 25 ps resolution for the SDS800X HD and 12.5 ps for the SDS3000X HD even for longer intervals.
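The principle can be demonstrated offline with NumPy/SciPy – a sketch, not the scope's actual algorithm (the signal, the 20x factor and the crude crossing detector are my own choices for illustration):

```python
import numpy as np
from scipy.signal import resample  # FFT-based (sin x)/x interpolation

fs = 2e9                  # 2 GSa/s, as on the SDS800X HD (single channel)
n = 4000                  # exactly 20 cycles of 10 MHz -> clean FFT resample
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 10e6 * (t - 13.37e-9))  # true rising crossing: 13.37 ns

def first_rising_crossing(sig, rate):
    # Crude detector: time of the first sample at/above zero following a
    # negative sample. Resolution is limited to one sample interval (1/rate).
    idx = np.where((sig[:-1] < 0) & (sig[1:] >= 0))[0][0] + 1
    return idx / rate

coarse = first_rising_crossing(x, fs)

k = 20                          # interpolation factor: 500 ps -> 25 ps steps
y = resample(x, n * k)
fine = first_rising_crossing(y, fs * k)

print(abs(coarse - 13.37e-9))   # error bounded by 1/fs      = 500 ps
print(abs(fine - 13.37e-9))     # error bounded by ~1/(k*fs) =  25 ps
```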
We should still be aware that the math trace cannot exceed the maximum analysis length (10 Mpts for the SDS800X HD, 40 Mpts for the SDS3000X HD), so with maximum interpolation the timebase is limited to 50 µs/div. This still allows maximally accurate measurements for intervals up to 500 µs.
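As a sanity check of that record-length budget, here is the arithmetic for the SDS3000X HD figures (40 Mpts analysis length, half-channel mode, 20x interpolation assumed from the 250 ps to 12.5 ps ratio):

```python
max_analysis_pts = 40e6   # SDS3000X HD analysis length
sample_rate = 4e9         # half-channel mode
interp = 20               # 250 ps -> 12.5 ps

source_pts = max_analysis_pts / interp   # raw points we may interpolate
span = source_pts / sample_rate          # captured time span that still fits
print(span)   # 5e-4 s -> 500 us, i.e. 50 us/div over 10 divisions
```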
EDIT: corrected the resolution for the SDS800X HD; originally, the numbers were given for an SDS1004X-E.