Scopes are really meant to measure the relative change of a signal over time, not an absolute voltage. For absolute voltage measurements, you're looking at one decimal place of accuracy at best, i.e., 12.0 V, 12.1 V, etc.
Modern digital scopes are better at this thanks to their digital readouts. Historically, analog scopes were even less accurate, since you had to count which grid line the trace was on to get your measurement: if you're set at 10 V/div and the trace is on the third division from the bottom, your signal is roughly 30 V (assuming you zeroed it out at the bottom).
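That graticule arithmetic is just divisions times volts per division. A tiny sketch, with a hypothetical helper name of my own choosing:

```python
def volts_from_divisions(divisions_above_reference: float, volts_per_div: float) -> float:
    """Convert a trace position read off the graticule into a voltage.

    Assumes the reference (0 V) line is wherever you zeroed the trace.
    """
    return divisions_above_reference * volts_per_div

# Trace sits on the third division above the zeroed baseline, scope at 10 V/div:
print(volts_from_divisions(3, 10.0))  # 30.0
```

Of course, "third division" is itself an eyeball estimate, which is exactly why the analog method tops out at rough readings.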
To get the most accurate absolute voltage measurement, make sure you're in DC-coupled mode and use the voltage readout function for that channel. (AC-coupled mode places a capacitor in series with the input, which charges up to the incoming voltage, so if you had, say, a pure 15 V DC signal, the scope would indicate a 0 V flat line. But if that 15 V DC signal had a 100 mV RMS AC sine wave riding on top of it, you could set your scope to 50 mV/div sensitivity and see the sine wave clearly. In DC-coupled mode you'd have to be at 5 V/div to keep the trace on screen, and that 100 mV sine wave riding atop the 15 V wouldn't be visible. AC-coupled mode is great for looking at power supply ripple for this very reason!)
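To make the ripple example concrete, here's a toy simulation of that 15 V rail with 100 mV RMS ripple. A real AC-coupled input is a high-pass filter formed by the series capacitor; as a crude stand-in, this sketch just subtracts the DC average, which shows the same separation of DC level from ripple. The sample rate and ripple frequency are arbitrary choices of mine, not anything from the scope:

```python
import math

fs = 100_000       # samples per second (arbitrary for this sketch)
ripple_rms = 0.1   # 100 mV RMS ripple riding on the rail
freq = 120         # ripple frequency in Hz (e.g. full-wave rectified 60 Hz mains)

# One second of a 15 V DC rail with a sine-wave ripple on top.
samples = [15.0 + ripple_rms * math.sqrt(2) * math.sin(2 * math.pi * freq * n / fs)
           for n in range(fs)]

dc = sum(samples) / len(samples)    # what DC coupling centers on: the 15 V level
ac = [v - dc for v in samples]      # crude stand-in for what AC coupling passes
peak = max(abs(v) for v in ac)      # ripple peak, about 141 mV (RMS * sqrt(2))

print(f"DC level: {dc:.2f} V, ripple peak: {peak * 1000:.0f} mV")
```

At 5 V/div that 141 mV peak is under 3% of one division, basically invisible in the trace thickness; AC-coupled at 50 mV/div it spans nearly six divisions.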