Thanks folks!
What still eludes me is that at different timebases I get different readings, and I don't know which one I should take. Is this because the waveform is not stable?
For instance, with the timebase increased to 1 ms (to capture more pulses) and a 500 MS/s sample rate, I get rise times fluctuating from the high 4 ns range up to 9 ns. There is sometimes an exclamation mark next to the reading, which I assume means the values are bogus.
If I decrease the timebase to 400 µs and increase the sample rate to 1.25 GS/s, the rise-time values still show up with the exclamation mark (between 3.6 ns and 7 ns sometimes).
If I further increase the sample rate to 2.5 GS/s, the rise time fluctuates between 3.8 ns and 4.9 ns.
If I set a single (one-shot) edge trigger, I get a different rise-time value for each acquisition; they can be as high as 6.9 ns or as low as 3.4 ns (at 400 µs and 2.5 GS/s).
I do understand that these are snapshots of the signal at a point in time and that they will vary, but if I need to feed them into some calculation, which value do I use? Do I just average them?
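To make the question concrete: if averaging is the right approach, is computing a mean and standard deviation over a batch of readings like this reasonable? (The values below are just illustrative numbers within the range I observed, not an exact log from the scope.)

```python
import statistics

# Hypothetical rise-time readings (ns) at 400 us / 2.5 GS/s --
# illustrative values within the range I observed, not real captures
readings_ns = [3.8, 4.2, 4.9, 4.4, 4.0, 4.7]

mean_ns = statistics.mean(readings_ns)    # central value of the batch
stdev_ns = statistics.stdev(readings_ns)  # spread between acquisitions
print(f"mean rise time: {mean_ns:.2f} ns, std dev: {stdev_ns:.2f} ns")
```

Or is quoting a mean like this misleading when some of the readings are flagged with the exclamation mark?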