It's been a while, but I think the general routine is as follows:
1) Measure the positive Test Voltage at 1 V.
2) Press the -TV button to get a negative Test Voltage and measure it.
3) Turn the Zero dial until the + and - voltages match.
4) Connect a 10 MOhm resistance standard.
5) Select the Rate and Test Voltage settings that yield the best accuracy for a 10 MOhm measurement (I think that would be a rate of x1 @ 1 V).
6) Step wayyyy back and let the reading settle.
7) Turn the CAL dial until your reading matches the known resistance value.

8) Select the -TV button.
9) Step wayyyy back again and let the reading settle.
10) Turn the CAL dial until the reading matches the known resistance value.
11) This will change your +TV calibration, so find a nice middle ground between your + and - measurements (see the sketch after this list for one way to split the difference).
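Since step 11 is just splitting the difference by hand, here's a minimal Python sketch of that arithmetic. It isn't part of the instrument or any vendor software; the 10 MOhm standard value and the example readings are assumptions for illustration, and you'd plug in whatever you actually read off the display at each polarity.

```python
# Hand arithmetic for step 11: compare the +TV and -TV readings against the
# standard and see how balanced the errors are at the current CAL position.
# The standard value and the example readings below are made-up illustrations.

STANDARD_OHMS = 10e6  # nominal value of the 10 MOhm resistance standard


def percent_error(reading_ohms: float) -> float:
    """Percent error of one reading against the standard."""
    return 100.0 * (reading_ohms - STANDARD_OHMS) / STANDARD_OHMS


def balance_report(pos_reading_ohms: float, neg_reading_ohms: float) -> None:
    """Print how far the +TV and -TV readings sit from the standard.

    Nudge the CAL dial until the two errors are roughly equal and opposite,
    i.e. the mean error is close to zero.
    """
    pos_err = percent_error(pos_reading_ohms)
    neg_err = percent_error(neg_reading_ohms)
    print(f"+TV reading: {pos_reading_ohms / 1e6:.3f} MOhm ({pos_err:+.2f} %)")
    print(f"-TV reading: {neg_reading_ohms / 1e6:.3f} MOhm ({neg_err:+.2f} %)")
    print(f"mean error:  {(pos_err + neg_err) / 2:+.2f} %")


if __name__ == "__main__":
    # Example numbers (hypothetical): +TV reads a little high, -TV a little low.
    balance_report(10.04e6, 9.97e6)
```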