As mentioned above, the values are calculated from what is displayed on the screen. This can be confirmed with a few simple tests. Input a 5 V signal, set your scope to 100 mV/division, and use the Vmax function. It will display stars, because the peak is off screen and therefore unknown. If it displayed 5 V it would be accurate, but it wouldn't make sense, because you're not looking at 5 volts. You can try the frequency measurement as well: input a sine wave and turn on both the hardware counter and the software freq function. At a long enough time base, both will display the same value. But shorten the time base until the displayed wave is less than one period, and you will see the stars again. It is more logical (on an oscilloscope) for the value to reflect what is displayed.
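To see why the software freq function goes to stars with less than one period on screen, here's a rough Python sketch of how such a measurement could work. This is my own guess at the method, not the scope's actual firmware, and the function name and sample rate are made up: it counts rising zero crossings in the captured window, and with fewer than two crossings there is simply nothing to measure.

```python
import math

def measured_freq(signal_hz, window_s, sample_rate=1e6):
    """Estimate frequency from rising zero crossings in the displayed
    window, roughly how a software freq measurement might work."""
    n = int(window_s * sample_rate)
    samples = [math.sin(2 * math.pi * signal_hz * i / sample_rate)
               for i in range(n)]
    # indices where the signal crosses zero going upward
    crossings = [i for i in range(1, n) if samples[i - 1] < 0 <= samples[i]]
    if len(crossings) < 2:
        return None  # less than one full period on screen -> "stars"
    # average sample count between crossings = one period
    period = (crossings[-1] - crossings[0]) / (len(crossings) - 1)
    return sample_rate / period

# 1 kHz sine: a 10 ms window shows 10 periods, a 0.5 ms window shows half of one
print(measured_freq(1000, 0.010))   # reads correctly
print(measured_freq(1000, 0.0005))  # None -> stars
```

The hardware counter doesn't have this limit because it counts edges continuously rather than analyzing the displayed record.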
Some settings also give more accurate values than others. Input a 1 V signal and observe it at relatively close settings, like 500 mV/div, 1 V/div, and 2 V/div. The software Vmax function will display 1 V at all of those settings. But go up to 10 V/div or higher (x10 probe setting): even though the signal is still visible on screen, the numerical value will no longer be 1 V. This is most likely a hardware limitation (at coarse settings the signal only spans a few ADC steps) rather than a firmware issue where the hardware isn't communicating correctly with the software.
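Here's a quick sketch of the quantization explanation. I'm assuming an 8-bit ADC spanning 8 vertical divisions, which is typical for low-end DSOs but may not match your scope exactly: at 500 mV/div a 1 V signal lands exactly on an ADC step, while at 10 V/div each step is over 300 mV, so 1 V can't be represented precisely.

```python
def quantized_vmax(v_in, volts_per_div, bits=8, divisions=8):
    """Voltage the scope can actually report after ADC quantization.
    Assumes a bits-wide ADC spread across the full vertical screen."""
    full_scale = volts_per_div * divisions   # total voltage span on screen
    lsb = full_scale / (2 ** bits)           # smallest resolvable voltage step
    code = round(v_in / lsb)                 # nearest ADC code
    return code * lsb

for vdiv in (0.5, 1, 2, 10):
    print(f"{vdiv} V/div -> Vmax reads {quantized_vmax(1.0, vdiv)} V")
```

At the close-in settings the reading comes back as exactly 1.0 V, but at 10 V/div it comes back noticeably low, matching what you see on the real instrument.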
Of course this discrepancy is easy to detect when you know the actual value of the input signal; the harder case is spotting incorrect values when you don't. I've found that most vertical and horizontal settings give the accurate value, and only a few give a wrong one. So if 5 different time bases give you an RMS reading of 500 and 1 gives you 750, the 5 are almost certainly the accurate ones. With a little quick tweaking you can nearly always find the correct value.
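The "majority of settings wins" idea can be written down as a small helper. This is just my own sketch of the heuristic, with a made-up tolerance of 5% for deciding that two readings agree: group the readings taken at different settings, and trust the largest group.

```python
def consensus(readings, rel_tol=0.05):
    """Group readings that agree within rel_tol of each other and
    return the average of the largest group -- the 'most settings
    agree' heuristic for spotting a bad measurement."""
    groups = []
    for r in readings:
        for g in groups:
            if abs(r - g[0]) <= rel_tol * g[0]:
                g.append(r)
                break
        else:
            groups.append([r])     # reading doesn't match any group
    best = max(groups, key=len)    # the value most settings agree on
    return sum(best) / len(best)

# five time bases near 500, one outlier at 750
print(consensus([500, 500, 501, 499, 500, 750]))
```

The outlier at 750 ends up alone in its own group, so the consensus lands on the value the five agreeing settings reported.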