Observations from a rookie with test equipment:
I don't have nearly the knowledge, skill, or experience of many of the veterans here, so take this with a BIG grain of salt. (And this applies to my thinking as an enthusiast, not as a pro who is designing/building/testing/repairing gear for a fee.)
My experience is that some (generally newer and more expensive) equipment will be somewhat more accurate than other (generally older and less expensive) equipment; however, sometimes older and/or less expensive gear can be better than newer and more expensive gear. But if it's worth keeping, it should all be in the ballpark.
How do you know what is accurate? You could spend a fair amount collecting all the gear needed to calibrate all your other gear, or you could send it out periodically to have it calibrated. Or you can spend some $ on products like a DMM check, etc. and use these to check your other equipment (in this case for voltage). In the process you will probably find that some of your DMMs and other equipment disagree with each other, or they might agree at some voltages but not at others. What you will probably come to conclude is that one piece of equipment gives you the most confidence, and it becomes your "ruler" or reference point. You might be able to adjust some of your other equipment to come into alignment with your ruler to some degree, but it's possible that this won't be feasible (or even desirable) with your other gear.
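For what it's worth, here's roughly how I tabulate that kind of comparison. This is just a minimal Python sketch with made-up meter names and made-up readings (and an assumed 5 V reference), not anyone's actual gear, showing each meter's deviation from the reference:

# Compare several meters' readings of the same known reference voltage.
# All names and numbers below are invented for illustration.
REFERENCE_V = 5.000  # e.g., a 5 V DMM-check style voltage reference

readings = {
    "DMM A": 5.002,
    "DMM B": 4.991,
    "Scope (measure fn)": 5.08,
    "Bench meter": 5.0004,
}

for meter, value in readings.items():
    error = value - REFERENCE_V
    percent = 100.0 * error / REFERENCE_V
    print(f"{meter:20s} {value:8.4f} V  error {error:+.4f} V ({percent:+.3f}%)")

Run something like that across a few voltages and the "ruler" usually picks itself.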
So what you wind up with is a bunch of gear that varies to some extent (maybe by hundredths of a volt or thousandths of an amp). IME, voltage measurements (at least on some equipment) are harder to dial in precisely (to more significant digits) than frequency measurements. As mentioned elsewhere in this thread, oscilloscopes don't seem to be very good at measuring voltage (I trust my DMMs more than my scopes on voltage). On the other hand, my scopes, frequency counters, and frequency generators consistently line up with one another to a decent level of precision on frequency. (This could just be a coincidence among my gear.) Someone here will point out that accuracy and precision are not the same, which is true, but when several pieces of my equipment (most of which have not been tweaked to agree with one another) line up on a reading, I begin to have confidence in the accuracy, and as the precision extends in unison I get still more confident. (Even so, this is faith and not science, since not much of my stuff is freshly calibrated, but hey, I'm not launching any mission-critical projects on most days.)
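On the accuracy-vs-precision point, here's how I'd separate the two numerically. Again, a small sketch with invented readings and an assumed known true value: the offset of the mean from the true value is the accuracy problem, the spread across repeated readings is the precision problem.

import statistics

TRUE_V = 5.000  # assume the true value is known, for illustration only

# Invented repeated readings from a single meter
samples = [5.012, 5.011, 5.013, 5.012, 5.010]

mean = statistics.mean(samples)
spread = statistics.stdev(samples)

print(f"mean  = {mean:.4f} V  (accuracy: offset {mean - TRUE_V:+.4f} V)")
print(f"stdev = {spread:.4f} V  (precision: spread of readings)")
# This hypothetical meter is precise (spread ~1 mV) but not especially
# accurate (~12 mV off) - which is exactly the case where several
# independent instruments agreeing gives you extra confidence.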
I'd really like everything to line up precisely all the time, but I've come to the conclusion that I have to work with reasonably accurate values, and often just relative or comparative values and/or trends, and therefore I have to settle for concepts or principles being demonstrated with less accuracy than I would like. To me, one of the beauties of test equipment is that it provides tremendous feedback for learning. And no doubt, if you had complete confidence in all measured values it would be easier to ferret out the principles at work. If budget weren't a consideration, it would be cool to have everything measurable down to micros, nanos, picos, and femtos.
I'm pretty new to test equipment (a couple of years), but this is what I've found so far.
YMMV
PS: Having said all that, the couple of posts just above mine look like pretty good approaches...