That temperature drift from -10 to 60C is very concerning, and what the hell is causing it.
...
I just don't get it.

I allow it to sit at the two extremes for a half hour to stabilize. There's not a lot of room in that box and there is a decent-sized fan. At -10 we had 1.003mV, at 20-ish 0.997mV, and at 60C 0.928mV.
It's difficult to tell exactly as this part of your video is so time-compressed, and perhaps the overlaid chamber camera is not really in sync, but:
121GW showing 0.899mV at -8.9C, control program (chamber temp) is showing about 20C:
121GW showing 0.871mV at 20.9C, control program (chamber temp) is showing about 43C:
121GW showing 0.997mV after 2h at 20.2C "room temperature"
So I get 0.899mV when the chamber temp is showing about 20C and then 0.997mV at 20C "room temperature". Why the big difference?
Different parts inside the meter are changing at different rates when I ramp the box, even at this slow rate. With this meter, it causes a fair bit of error.
But if you leave it in your car overnight in the dead of winter (much colder here than -10C) and bring it inside to use it, expect there to be a fair bit of error until it settles. Normally we would test something like this with a shock chamber, but in my home lab you have to settle for my cardboard box.
Probably the voltage reference is shitty ... pretty obvious
Yes, it's VERY compressed. There's about 6 hours of data, maybe more. It takes several hours for that Peltier setup to cool down to -10. You get to see the last half hour, the ramp, and the hold at high temp. About 2 hours compressed into a few seconds.
I have a sinking feeling that you think if you took a 1lb metal block, placed it in your freezer, left it overnight, took it out the next day and held it in your hand, it would be warm because it was no longer in the freezer. You don't seem to understand that there is a lag and it will take time for it to settle. We need to consider the thermal mass of the meter; the meter's case will provide some insulation.
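That lag can be sketched with a minimal first-order thermal model: treat the meter as a single thermal mass approaching ambient exponentially. The 15-minute time constant here is a pure assumption for illustration; a real meter has several masses (board, case, battery) settling at different rates, which is exactly why different parts drift by different amounts during a ramp.

```python
import math

def internal_temp(t_min, t_start, t_ambient, tau_min=15.0):
    """First-order thermal lag: internals approach ambient exponentially
    with time constant tau_min (minutes, assumed for illustration)."""
    return t_ambient + (t_start - t_ambient) * math.exp(-t_min / tau_min)

# Meter cold-soaked overnight at -20C, then brought into a 20C room:
for t in (0, 15, 30, 60):
    print(f"{t:3d} min: {internal_temp(t, -20.0, 20.0):5.1f} C")
```

Even after half an hour the core is still a few degrees off ambient, which matches the "half hour to stabilize" figure above.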
I realize that just because the 121GW internal sensor is showing a certain temperature doesn't mean the rest of the meter's internals are at that same temp (yet). But 0.870mV at 20.2C in the thermal chamber seems very far from 0.997mV at "room temperature" 20.2C.
As you can see I already mentioned there's a lag.
I still think it's strange that your chamber camera could catch 121gw showing 0.871mV at 20.9C on its internal sensor.
The thermistor measurement inside the 121GW is not that good ... I see some self-heating at room temperature, so at low temperature it could read way higher and then heat up faster than the rest of the board.
It even seems like the CAT rating could be bogus.

Fungus, just keep in mind the CAT rating requires that the operator survives, not the meter.
True, but a loud bang and a jolt in your hand can easily make you fall off a ladder with surprise (or take a step back, put your foot in a bucket, fall over and grab a busbar on your way down...)
If this meter has sparks jumping around inside and tracks being vaporized at 2000V, then it's not going to be pleasant to hold in your hand when 6000V @ 3000A hits it (i.e. CAT III 600V). I don't know about you, but I want a lack of surprises in my tools when I'm around dangerous things.
Why? Do you feel that the meter is homogeneous in regards to temperature? There is going to be a slight temperature gradient from the top of the meter near the LCD to the bottom near the leads. While the chamber is small and offers a fair amount of airflow, as we get closer to the Peltiers' output we are going to see a different temperature than what's going to their input. Again, one way to remove this effect is to allow everything to stabilize. You can't take a snapshot while things are changing and expect much more than to note that things are changing...
Looks like it takes about a half hour to warm up. Even after a 10-minute warmup I was seeing a bit of drift in the resistance measurements. Again, pretty typical. The oddball is really just how stable that UNI-T UT181A is.
So it's good on the volt scale; the millivolt range uses that transil diode D13. If the leakage current is significant and variable with temperature, a small variable voltage divider to ground is formed with the input thermistor + resistor, 1K2 + 1K. Of course, just a theory.
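If that theory holds, the numbers from the video are at least in the right ballpark. A rough sketch, using the 1K2 + 1K values from the post (the leakage figure itself is computed from the observed readings, not measured):

```python
# Component values from the post; leakage through D13 is the unknown.
R_SERIES = 1200 + 1000  # ohms: the 1K2 + 1K input resistors in series

def reading_error_uV(leak_nA):
    """Offset (microvolts) a leakage current through D13 would introduce
    by dropping voltage across the series input resistance."""
    return leak_nA * R_SERIES * 1e-3  # 1 nA into 2.2k = 2.2 uV

def leak_for_error_nA(error_uV):
    """Leakage change (nA) needed to explain a given reading shift."""
    return error_uV / (R_SERIES * 1e-3)

# Drift seen in the video: 0.997 mV at room temp down to 0.928 mV at 60C,
# a shift of roughly 69 uV.
observed_shift_uV = (0.997 - 0.928) * 1000
print(f"implied leakage change: {leak_for_error_nA(observed_shift_uV):.0f} nA")
```

A leakage change of a few tens of nA over 40C is plausible for a TVS diode warming up, so the theory isn't ruled out by the magnitudes.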
I can't recalibrate ... for some reason, using a 10nF like in the instructions results in 0.000nF readings in this range for any cap, so the only option is to reload the saved calibration. Maybe it's a bug. I tried many times, even with different values; I don't think I did something wrong.
Also pay attention if you try to re-calibrate the capacitance ranges.
There is no entry for zero value calibration in the small table inside the chapter ZERO OFFSET CALIBRATION.
In the big colored table, though, there is a scrambled entry, 'R1 : ', which might indicate that the first calibration point with open leads sets the offset, and the second one with the nominal reference capacitor sets the gain.
That might explain CDaniel's problem, that 0.000 is displayed for a 10nF capacitor after calibration.
Frank
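The two-point scheme Frank describes can be sketched generically: point one with open leads captures the offset (stray capacitance), point two with a known reference capacitor sets the gain. The class name and raw-count values below are illustrative, not the meter's actual firmware:

```python
class TwoPointCal:
    """Generic two-point (offset + gain) calibration sketch."""

    def __init__(self, raw_open, raw_ref, ref_value_nF):
        self.offset = raw_open  # open-leads reading: stray capacitance etc.
        self.gain = ref_value_nF / (raw_ref - raw_open)  # known reference cap

    def reading_nF(self, raw):
        return (raw - self.offset) * self.gain

# Illustrative raw counts, not real firmware values:
cal = TwoPointCal(raw_open=12, raw_ref=512, ref_value_nF=10.0)
print(cal.reading_nF(512))  # the reference itself reads 10.0
print(cal.reading_nF(12))   # open leads read 0.0
```

If the two points were taken in the wrong order, or the reference point were stored as the offset, everything near the reference would collapse toward zero, which would at least be consistent with the 0.000nF symptom.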
Yes, it's working that way. Did you discover that by chance? Weird anyway ... the offset calibration is for zeroing the meter with nothing connected, for stray capacitance or whatever.
Now I have the meter without D13 and recalibrated for caps ... so far it's not drifting.