Hacking the Siglent SDM3055 Bench DMM
klausES:
Hi there,

I think I read somewhere that the factory calibration was done at 23.5 degrees Celsius?!
(I'm not sure anymore and can't find the text again.)

If I remember correctly, that would be the temperature at which the voltage reference and the relevant parts of the board
later agree most closely with the factory calibration during measurements.

Are there measurements or empirical values for how much the readings could differ in practice at, say, 30 or 35 degrees C internal device temperature
(and to what extent the temperature compensation corrects this, or not)?

There is a thought behind the question, but I'd rather wait and see if someone answers first.
Kleinstein:
How much the readings can change with temperature is sometimes noted in the manual. Usually some ±5 K is included in the base accuracy; beyond that I would expect something on the order of 2-15 ppm/K as an upper limit. The lower ranges (200 mV and 2 V), which don't use the input divider, can be better, as the reference and the ADC have a relatively low TC. The higher ranges (20 V, 200 V, ...) are expected to be less accurate, because the TC of the input divider is more in the 5-15 ppm/K range.

I would not expect much of an extra correction in software, as this would require an extra step during calibration.
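
To get a feel for the numbers, a quick back-of-the-envelope sketch in Python (the TC values are just the rough upper limits from above, not Siglent specs, and the 10 K excursion is only an example):

--- Code: ---
# Rough drift estimate from the TC figures above (illustrative values,
# not official Siglent specs).
tc_ppm_per_k = {
    "200 mV / 2 V (no divider)": 2.0,
    "20 V / 200 V (with divider)": 15.0,
}
delta_t_k = 10.0             # example excursion from the calibration temperature
full_scale_counts = 200_000  # 5.5-digit display at full scale

for rng, tc in tc_ppm_per_k.items():
    drift_ppm = tc * delta_t_k
    drift_counts = drift_ppm * full_scale_counts / 1e6
    print(f"{rng}: {drift_ppm:.0f} ppm ~ {drift_counts:.0f} counts at full scale")
--- End code ---

That gives roughly 4 counts on the low ranges but some 30 counts on the divider ranges, which is why the divider dominates the temperature behaviour.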
klausES:
Thank you for your answer.

I had the following considerations.

It is clear that if readings are to be compared over time, it would be advantageous to keep the instrument at the same temperature at all times.
If it is mostly about constancy (much more than about the best absolute values),
wouldn't it then be advantageous to maintain a constant but elevated temperature instead of following the constantly fluctuating ambient temperature?

A precisely controlled temperature that is always x degrees above ambient
is much easier to implement inside the device than cooling it to hold the temperature down.
In winter my room is at e.g. 18 degrees, but in midsummer at 30 degrees (or sometimes >30 degrees; unfortunately no air conditioning).

If the device were kept "always from the start" at, for example, exactly 30 degrees
(controlled by a precisely temperature-regulated heater with fan speeds varied proportionally in a control loop, see the sketch below),
fluctuating room temperatures in the range of 18 to approx. 29 degrees would hardly have any effect on the temperature inside the device.
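
Just to illustrate the kind of loop I mean, here is a toy Python simulation of a purely proportional heater regulation (all numbers and the first-order thermal model are made up; a real implementation would also need the fan side and probably an integral term):

--- Code: ---
# Toy simulation: proportional heater loop holding ~30 C inside the box
# while ambient sits lower. All numbers are invented for illustration.
SETPOINT_C = 30.0
KP = 0.8            # proportional gain (would need tuning on real hardware)
AMBIENT_C = 18.0    # cold-room case
HEATER_GAIN = 20.0  # degrees C of rise per unit heater duty at equilibrium
TAU = 0.1           # how fast the box temperature moves per step

temp_c = AMBIENT_C  # box starts at ambient
for step in range(100):
    error = SETPOINT_C - temp_c
    duty = min(max(KP * error, 0.0), 1.0)  # heater can only heat: clamp to 0..1
    # First-order thermal model: box relaxes toward ambient + heater rise.
    temp_c += TAU * (AMBIENT_C + HEATER_GAIN * duty - temp_c)

print(f"settled near {temp_c:.1f} C with duty {duty:.2f}")
--- End code ---

A pure P controller settles slightly below the setpoint (here about 29.3 degrees); that residual offset is why real reference ovens typically use PI control.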

Hence the question:
How much would the absolute values at a constant 30 degrees differ from those of the factory calibration at 23.5?

Edit:
corrected a text error that led to confusion (see above).  :-\
Kleinstein:
The TC specs for the ADC and reference are at some 1 ppm/K. The actual performance may be a little better than this at moderate/common temperatures. So the 6.5 K higher temperature may change the reading by some 5-10 ppm, which is hardly detectable with a 5-digit meter.
The ranges with the input divider and the current ranges are likely less accurate, so more like 10 times that change.
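
Put into numbers (a small Python sketch, taking the 23.5 degree factory calibration temperature assumed earlier in the thread):

--- Code: ---
# Why ~6.5 K above the calibration temperature is hardly visible on a
# 5.5-digit meter; 1 ppm/K is the rough ADC/reference TC quoted above.
tc_ppm_per_k = 1.0
delta_t_k = 30.0 - 23.5   # proposed setpoint minus calibration temperature
reading_v = 2.0           # full scale on the 2 V range

drift_ppm = tc_ppm_per_k * delta_t_k      # 6.5 ppm
drift_v = reading_v * drift_ppm / 1e6     # 13 uV
lsb_v = 10e-6                             # 10 uV per count at 2.00000 V
print(f"{drift_ppm:.1f} ppm -> {drift_v * 1e6:.0f} uV ~ {drift_v / lsb_v:.1f} counts")
--- End code ---

So on the 2 V range the whole effect is on the order of one count.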

At 5-digit resolution there is not much need for a constant temperature. This would be something for a 7-digit meter, and at that level an oven for the reference (e.g. LM399 or LTZ1000) is the normal solution.
iMo:

--- Quote ---hence the question:
How much would the absolute values at a constant 30 degrees differ from those of the factory calibration at 23.5?
--- End quote ---
If you have a temperature sensor mounted in the box, you can simply compensate the readings based on that internal temperature. I did it with my meter on the 10 V range (34401A; I had to mount the sensor myself).
The absolute accuracy of the temperature sensor is not important here. A resolution of 0.1 °C would be enough for your meter, imho.
Then you calibrate against that internal temperature. The DMM starts cold, and you log the readings of a stable external reference voltage against the internal temperature; as the internal temperature of the DMM rises, it sweeps the temperature for you. You get a "Vref" voltage vs. internal temperature dependency.
The rest is math like y=ax+b (or whatever your dependency looks like) within the internal temperature range of interest.
The internal temperature of your meter (after it stabilizes, which may take hours) is always Tdmm = Tx + Tambient, where Tx is a constant (the assumption here is that your meter is not internally thermostated).
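
A minimal sketch of that fit-and-compensate step (Python with numpy; the logged numbers are invented, and the 23.5 °C calibration temperature is the assumption from earlier in the thread):

--- Code: ---
import numpy as np

# Logged pairs of (internal temperature in C, DMM reading of a stable
# external 10 V reference) taken while the meter warms up from cold.
# Illustrative numbers only.
temps = np.array([24.1, 26.3, 28.0, 29.5, 31.2, 33.0])
readings = np.array([9.999987, 9.999984, 9.999981,
                     9.999979, 9.999976, 9.999973])

# Fit the linear dependency y = a*x + b described above.
a, b = np.polyfit(temps, readings, 1)
print(f"slope a = {a * 1e6:.2f} uV/C, intercept b = {b:.6f} V")

def compensate(reading_v, t_internal_c, t_cal_c=23.5):
    """Refer a raw reading back to the calibration temperature by
    subtracting the drift predicted by the linear fit. The 23.5 C
    default is the assumed factory calibration temperature."""
    return reading_v - a * (t_internal_c - t_cal_c)

# Example: correct a raw reading taken at 31.2 C internal temperature.
print(compensate(9.999976, 31.2))
--- End code ---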