The answer to your question is slightly more complicated than just using the manufacturer's specifications. Those specifications are based on the average expected accuracy across all meters; your particular meter may or may not meet them exactly. To get the numbers for your meter, you need to consider the quality of the calibration (i.e. the uncertainty of calibration), and you have to determine your meter's drift by comparing two or more successive calibrations. From that data, you can determine which specification is appropriate.
Example: Our reference 3458A's last two calibrations on the 10 V range each had an uncertainty of 0.3 ppm. According to those calibrations, the meter drifted 2 ppm in 15 months. Worst case, since each of the two calibrations contributes its own uncertainty, that's 2 ppm + 2 x 0.3 ppm = 2.6 ppm of drift in 15 months. According to the datasheet for an option 002 meter, the 90-day spec is 2.65 ppm for a 10 Vdc measurement. So in this case, for our meter, we can use the 90-day spec all year.
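To make that arithmetic explicit, here is a minimal sketch of the worst-case estimate. The numbers are just the ones from our example above; plug in the values from your own calibration certificates.

```python
# Worst-case drift estimate from two successive calibrations.
# Values are illustrative (from the example above); substitute your own.

cal_uncertainty_ppm = 0.3   # uncertainty of each calibration, 10 V range
observed_drift_ppm = 2.0    # difference between the two calibration results
months_between_cals = 15

# Each calibration carries its own uncertainty, so the worst case
# adds the calibration uncertainty twice (once per calibration).
worst_case_drift_ppm = observed_drift_ppm + 2 * cal_uncertainty_ppm

spec_90_day_ppm = 2.65      # 3458A opt. 002, 10 Vdc, 90-day spec (datasheet)

print(f"Worst-case drift over {months_between_cals} months: "
      f"{worst_case_drift_ppm:.2f} ppm")
if worst_case_drift_ppm <= spec_90_day_ppm:
    print("90-day spec can be used for the whole calibration interval")
else:
    print("90-day spec does not hold; use a longer-interval spec")
```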
Of course, the quality of your cables and your environmental conditions can affect the readings significantly when you're talking about extreme-accuracy calibrations and measurements. I'm just trying to give you an idea of how you would estimate your error.
This is our experience. I hope it is useful to you.
TomG.
PS - We used a Fluke 732C to verify the meter's accuracy, and it was within the limits, as expected.