AFAIK the expensive Fluke 8858 gives some info on the confidence interval, but that is an exception.
It is mainly a software-only thing, so we may see it more frequently in the future, though many use cases don't really care.
It would make sense at the higher end, where the quality of the calibration standards could make a difference - so no fixed specs, but a confidence range that depends on the CAL grade.
Still, the specs and confidence intervals are only crude estimates made in the initial phase. So don't take the specs too literally or as absolutely accurate. Some instruments later turn out to barely hold their specs, while others usually do much better or hold them for a longer time. Especially the very high resistance ranges, implemented with a parallel 10 M resistor, may not really fit the simple format of a percent of reading plus a percent of range.
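To illustrate the usual two-term spec format, here is a minimal sketch (the numbers are made up for illustration, not any real meter's spec):

```python
# Minimal sketch of the common "% of reading + % of range" spec format.
# Values below are hypothetical, not taken from any real datasheet.

def spec_error(reading, rng, pct_reading, pct_range):
    """Allowed error for one range, per the two-term spec."""
    return reading * pct_reading / 100.0 + rng * pct_range / 100.0

# Example: 12.000 V measured on the 20 V range with a 0.05 % + 0.01 % spec
err = spec_error(12.0, 20.0, 0.05, 0.01)   # -> 0.008 V
print(f"allowed error: +/- {err:.4f} V")
```

A range like the high-ohms one with the parallel 10 M resistor would need a different error model than this simple two-term sum.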
The over-range is rather limited and can come with an increased error. So a warning at more than the nominal range makes sense, though this is also easy for the user to see.
Ideally one would have a database with the specs - so one would enter the meter, not the specs themselves. That step would be the real benefit of a computer program.
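A rough sketch of what such a lookup could look like, including the over-range warning mentioned above - the model name, spec numbers and over-range factor are all hypothetical:

```python
# Rough sketch of a spec database keyed by meter model, so the user only
# picks the meter and range; all names and numbers here are made up.

SPECS = {
    # model: {range_V: (% of reading, % of range, allowed over-range factor)}
    "GenericMeter 1234": {
        2.0:  (0.05, 0.01, 1.1),
        20.0: (0.05, 0.01, 1.1),
    },
}

def lookup_error(model, rng, reading):
    pct_rd, pct_rng, over = SPECS[model][rng]
    if reading > rng * over:
        raise ValueError("reading beyond the allowed over-range")
    if reading > rng:
        print("warning: reading above nominal range, error may be larger")
    return reading * pct_rd / 100.0 + rng * pct_rng / 100.0

print(lookup_error("GenericMeter 1234", 20.0, 12.0))   # -> 0.008
```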