First, realize that to calibrate only means to get a piece of paper (or PDF file) that states whether the unit was within its specifications; it does not imply any correction. If you also want the unit adjusted, be sure to say so explicitly.
Every decent test instrument comes with accuracy specifications such as 0.9% + 0.5%, meaning the displayed value should be within 0.9% of the reading (e.g. 1.5 V) plus 0.5% of the range (e.g. 6 V). Or 0.9% + 5 digits, which is 0.9% of the reading plus 5 counts of the least significant digit.
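As a concrete worked example of that arithmetic (a minimal sketch using the numbers above; the variable names are mine):

```python
# "% of reading + % of range" spec:
# a 1.500 V reading on the 6 V range with a 0.9% + 0.5% spec.
reading = 1.500          # volts
drange  = 6.0            # volts (selected range)
u = 0.009 * reading + 0.005 * drange
print(u)                 # 0.0435 -> true value in 1.4565 .. 1.5435 V

# "% of reading + N digits" spec:
# same reading with 1 mV resolution and a 0.9% + 5 digit spec.
resolution = 0.001       # volts per least significant digit
u = 0.009 * reading + 5 * resolution
print(u)                 # 0.0185 -> true value in 1.4815 .. 1.5185 V
```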
If you measure the same signal with two meters in parallel and both meters are within their specifications, then the two intervals (each meter's displayed reading plus or minus the uncertainty computed from its accuracy specification) must overlap, and the actual value lies somewhere in the overlapping range.
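A sketch of that comparison (the readings here are hypothetical, and the helper function is mine):

```python
def interval(reading, uncertainty):
    """Min/max true value consistent with one meter's spec."""
    return reading - uncertainty, reading + uncertainty

# Hypothetical example: meter A reads 1.496 V (+/- 0.044 V),
# meter B reads 1.512 V (+/- 0.019 V) on the same signal.
a_lo, a_hi = interval(1.496, 0.044)
b_lo, b_hi = interval(1.512, 0.019)

# The intervals overlap iff each one's low end is below the other's high end.
if max(a_lo, b_lo) <= min(a_hi, b_hi):
    print(f"consistent: true value in {max(a_lo, b_lo):.3f} .. {min(a_hi, b_hi):.3f} V")
else:
    print("inconsistent: at least one meter is out of spec")
```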
Be sure to pay attention to little footnotes like "only valid after zeroing", "only valid from 5%-100% of full scale", "only valid for sinusoidal signals from 40 Hz to 500 Hz" etc.
If after this exercise the intervals do not overlap, then at least one meter needs adjustment. If they do overlap, the meters agree as well as their specifications allow, and you might simply be expecting too much. A brand new Fluke meter should be well within its specifications, so I'd trust that one (within its specified accuracy).
Check the manual for the adjustment procedure. While there are some affordable standards like the DMMCheck Plus that can be used as a sanity check, adjusting a DMM generally requires a range of signals from 100 mV to 1000 V, DC and AC, etc. Some instruments let you adjust just a single range; others require going through all ranges and functions before they will save the adjustments.