Is there a mathematical way to convert the reading to RMS?

If you know the **shape** of the waveform you're measuring, and you have an accurate model of your multimeter, then yes.

If you assume that the multimeter has a linear response (i.e., doubling the input signal doubles the measured value), then that calculation will be a simple scaling factor.

But then again, if you have the tools required to do all this calibration (i.e., an oscilloscope), then you should just be making your RMS measurements directly with the oscilloscope.

If you assume that your multimeter is not just linear, but a theoretically perfect averaging multimeter (a remarkable assumption to make, given that building one is about as difficult and expensive as building a true RMS meter), then the conversion factors can be calculated.

Finally, I would point out that people are, in general, overly obsessed with getting True RMS readings, as if True RMS were the One True Reading. One specific case where RMS is appropriate is power dissipation in a resistor: the power dissipated in a resistor is equal to Vrms^2 / R, because it's the square of the instantaneous voltage that determines the instantaneous power, and if you write out the integral that computes average power over time, it looks just like the integral that generates the RMS value. If you can't explain, in this kind of detail, why True RMS is the right measurement for your situation, then you're likely wasting your time making the wrong measurement.
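To make the resistor example concrete, here's a quick numerical check (with a made-up 10 V peak sine across an assumed 100 Ω resistor) that averaging the instantaneous power v(t)²/R over a period gives the same number as Vrms²/R:

```python
import numpy as np

R = 100.0                               # ohms -- arbitrary example value
t = np.linspace(0, 1, 100_000, endpoint=False)
v = 10.0 * np.sin(2 * np.pi * t)        # 10 V peak sine, one full period

# Average of the instantaneous power v(t)^2 / R over one period...
p_avg = np.mean(v**2 / R)

# ...equals Vrms^2 / R, directly from the definition of RMS.
v_rms = np.sqrt(np.mean(v**2))

print(p_avg, v_rms**2 / R)              # both come out to 0.5 W
```

The two numbers agree by construction: pull the constant 1/R out of the average and what's left of `p_avg` is exactly the mean-square voltage, i.e. Vrms².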