Well, oscilloscopes generally aren't "precision" instruments per se, with 3-5% vertical accuracy being typical. Even if there were some effect on accuracy due to temperature, it probably wouldn't have much impact on your ability to diagnose/debug circuits.
But even for your other instruments, I think the references they use are usually qualified across a pretty wide operating range. Even $50 multimeters built around single-chip multimeter ASICs typically have bandgap references with 75 ppm/°C drift or less. With a 20 °C shift, you might be off by 0.1-0.2% or so once you include other drifts, particularly gain drift in the ADC(s). Even then, you'll probably still be well within the spec of your $50 multimeter. As long as the ambient temperature stays within the operating temperature range of your equipment, it should meet all of the specs on the datasheet.
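If you want to sanity-check that figure, the arithmetic is just tempco times temperature delta. A minimal sketch, using the assumed 75 ppm/°C and 20 °C numbers from above (not a quote from any particular datasheet):

```python
# Back-of-envelope: error contribution from reference tempco alone
# (assumed figures: 75 ppm/°C tempco, 20 °C shift from calibration temperature)
tempco_ppm_per_C = 75
delta_T_C = 20

error_ppm = tempco_ppm_per_C * delta_T_C      # 1500 ppm
error_percent = error_ppm / 1e4               # 0.15 %
print(f"~{error_percent:.2f} % of reading")   # roughly the 0.1-0.2% mentioned above
```

Add some ADC gain drift on top of that and you land in the 0.1-0.2% ballpark, still comfortably inside a typical 0.5-1% handheld-meter spec.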
Similarly, if you have a $1K+ bench multimeter, it will likely have a much more stable reference, <1 ppm/°C or so, and these are often temperature compensated via an on-chip heater (an ovenized reference). Those meters should still meet their specs at elevated operating temperatures; their references and amplifiers are correspondingly better, which is what allows solid performance across a wide operating range.
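The same back-of-envelope estimate shows why the bench-meter case is even less of a concern (again, assumed figures, not a specific meter's spec):

```python
# Same estimate for a bench meter's more stable reference
# (assumed figures: 1 ppm/°C tempco, 20 °C shift)
tempco_ppm_per_C = 1
delta_T_C = 20

error_percent = tempco_ppm_per_C * delta_T_C / 1e4
print(f"~{error_percent:.3f} % of reading")   # ~0.002 %, i.e. 20 ppm
```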
Long term, though, I'm not so sure. I know references exhibit long-term drift related to how long they've been powered up, but I'm not sure whether exposure to elevated temperatures accelerates aging in a similar way.