Author Topic: Automated DP832 Calibration  (Read 4788 times)


Offline sequoia

  • Supporter
  • ****
  • Posts: 120
  • Country: us
Re: Automated DP832 Calibration
« Reply #100 on: July 19, 2020, 11:25:19 pm »
Is there a way to read the calibration points?

Also, how many points maximum can the calibration handle?


You can use the manual calibration menu, but it's a bit tedious... (you can just step through the calibration process without saving at the end...)

It might be possible to read these via SCPI (though it may require that "magic" USB drive in the USB port):

338 :PROJect:CALIbration:DATA:VOLTage:WRITe
339 :PROJect:CALIbration:DATA:VOLTage:READ?
340 :PROJect:CALIbration:DATA:CURRent:WRITe
341 :PROJect:CALIbration:DATA:CURRent:READ?
342 :PROJect:CALIbration:DATA:CURRent:ADDRess?
343 :PROJect:CALIbration:INFO:WRITe
344 :PROJect:CALIbration:INFO:READ?
345 :PROJect:CALIbration:INFO:ADDRess?

(these are from: dp800_all_commands.txt found in https://www.eevblog.com/forum/testgear/need-help-hacking-dp832-for-multicolour-option/msg2325633/#msg2325633)
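As a sketch of how those commands might be driven from a script (everything here is an assumption: the :PROJect:CALIbration commands are undocumented, the reply format is a guess, and the "magic" USB key may still be required), a raw-socket SCPI query plus a parser for a comma-separated numeric reply:

```python
import socket

def scpi_query(host, cmd, port=5555, timeout=2.0):
    """Send one SCPI command over the DP832's raw-socket LAN interface
    and return the (single-line) reply.  Port 5555 is the usual Rigol
    raw SCPI port, but verify for your firmware."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall((cmd + "\n").encode("ascii"))
        return s.makefile("r").readline().strip()

def parse_points(reply):
    """Parse a comma-separated numeric reply into floats.  The actual
    reply format of :PROJect:CALIbration:DATA:VOLTage:READ? is unknown;
    this assumes plain comma-separated numbers."""
    return [float(field) for field in reply.split(",") if field.strip()]

# Usage (needs the instrument on the LAN, and possibly the USB key):
# raw = scpi_query("192.168.1.50", ":PROJect:CALIbration:DATA:VOLTage:READ?")
# print(parse_points(raw))
```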


 

Offline sequoia

Re: Automated DP832 Calibration
« Reply #101 on: July 19, 2020, 11:49:35 pm »
Could the ADC (meter accuracy) be improved by adding more steps if we knew its maximum?

Adding more calibration points doesn't directly translate into better (meter) accuracy, but in general more points will yield better results. However, you might need only a
handful of calibration points if the error is mostly linear...  With a low/limited number of points (as is likely the case with the DP800 series), carefully selecting the calibration points could yield significantly better calibration results. The key is to place the calibration points where the ratio between set and measured values changes significantly.

Ideally a calibration script could sweep the entire range (in very small steps, 10-100mV / mA), recording the difference at each point.
This would yield a graph of the error over the entire range; curve fitting could then be used to fit a continuous piecewise-linear function (with a segment count below the maximum number of points the DP800 supports) to the data.
The endpoints of the segments in this piecewise-linear function should also be near-optimal calibration points...
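A sketch of that curve-fitting step (pure Python, synthetic data; the greedy knee-finding heuristic is mine, not anything the DP800 firmware does): repeatedly add the sweep sample that deviates most from the current piecewise-linear fit, so the chosen knots land where the error curve bends.

```python
def pick_cal_points(xs, errs, max_points):
    """Greedy calibration-point selection: start with the two range
    endpoints, then keep adding the sweep sample whose measured error
    deviates most from the current piecewise-linear interpolation."""
    def interp(x, knots):
        # knots: sorted list of (x, err); linear interpolation between them
        for (x0, y0), (x1, y1) in zip(knots, knots[1:]):
            if x0 <= x <= x1:
                return y0 if x1 == x0 else y0 + (y1 - y0) * (x - x0) / (x1 - x0)
        return knots[-1][1]

    pts = sorted(zip(xs, errs))
    knots = [pts[0], pts[-1]]
    while len(knots) < max_points:
        knot_xs = {x for x, _ in knots}
        best = max((p for p in pts if p[0] not in knot_xs),
                   key=lambda p: abs(p[1] - interp(p[0], knots)),
                   default=None)
        if best is None:          # no free samples left to promote
            break
        knots = sorted(knots + [best])
    return [x for x, _ in knots]
```

With a synthetic error curve that rises to a knee at x=5 and then falls, a 3-point budget puts the middle knot right at the knee.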
 

Offline bson

  • Supporter
  • ****
  • Posts: 1707
  • Country: us
Re: Automated DP832 Calibration
« Reply #102 on: July 20, 2020, 09:01:02 pm »
At some point calibration requires so many sample points (for example, to pick the best 80) that it would take forever - and on the typical non-climate-controlled lab bench, humidity and temperature will vary over the day, to the point that the resulting drift hurts more than grinding out extra points helps.

One useful feature might be a quick check of the supply after calibration (and after restart): in particular, make sure it can output the full voltage scale, and maybe check a few points to validate the installed calibration data.  With a big negative offset like alank2's, for example, my concern would be whether the supply can still output the full 30V post-cal.  This quick check could then be run standalone, by itself and not necessarily only after a full cal.  (Does Rigol provide a check procedure?)
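A minimal sketch of such a check (the SCPI commands and tolerance numbers are assumptions - verify them against the DP800 programming guide and datasheet; `scpi` is any callable that sends one command and returns the reply, so it can be a socket wrapper or a pyvisa query):

```python
import time

def verify_channel(scpi, ch, points, tol_pct=0.0005, tol_off=0.010):
    """Quick post-cal sanity check: set each voltage in `points` on
    channel `ch`, read it back, and collect any out-of-tolerance
    (set, measured) pairs.  Tolerance = tol_pct * setting + tol_off."""
    failures = []
    scpi(f":INST:NSEL {ch}")       # select channel (assumed syntax)
    scpi(":OUTP ON")               # enable output on selected channel
    for v in points:
        scpi(f":VOLT {v}")
        time.sleep(0.3)            # let the output settle before measuring
        meas = float(scpi(":MEAS:VOLT?"))
        if abs(meas - v) > tol_pct * v + tol_off:
            failures.append((v, meas))
    scpi(":OUTP OFF")
    return failures                # empty list means the check passed
```

Running it over a handful of points per channel (including full scale, e.g. 30V) after a cal - or standalone after a restart - is cheap insurance.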
 

Offline bson

Re: Automated DP832 Calibration
« Reply #103 on: July 20, 2020, 09:04:54 pm »
I can see the indenting, but how does python know that "self._psu._write" is not part of the if?  I am used to brackets or an endif or something.
It looks at the indent. :)
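For anyone coming from brace languages, a tiny illustration: the indented lines belong to the `if`, and the first line back at the outer indent level ends the block (no brackets or endif needed; Python 3 refuses to run if tabs and spaces are mixed inconsistently).

```python
def clamp_voltage(voltage):
    if voltage > 30.0:
        print("over range")   # indented: inside the if
        voltage = 30.0        # still inside the if
    return voltage            # dedented: runs whether or not the if fired
```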
 

Offline aristarchus

  • Regular Contributor
  • *
  • Posts: 84
  • Country: 00
Re: Automated DP832 Calibration
« Reply #104 on: July 20, 2020, 09:53:06 pm »
I can see the indenting, but how does python know that "self._psu._write" is not part of the if?  I am used to brackets or an endif or something.
It looks at the indent. :)

The fathers of the guys who created Python were most likely good old COBOL devs..   :-))))
 

Offline sequoia

Re: Automated DP832 Calibration
« Reply #105 on: July 21, 2020, 08:15:06 pm »

Calibration points can be viewed manually using the calibration menu - no need to actually calibrate anything.

Go to the calibration menu and choose "Cal Item"; then, to view a calibration point, select "Cal Point" and press "Meas Val" - this displays the calibration point.
Press "Cal Point" again to back out (then just select a new cal point and repeat...)


I recorded the factory calibration points for my unit, since I wanted to test automating the calibration but want to be sure I can put things back the way they were if needed...

[attachimg=1]

Looks like the factory calibration dynamically selects calibration points for each channel/unit. This makes sense, since calibration points chosen based on individual unit/channel characteristics likely yield a significantly better calibration than fixed points would...

 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2100
Re: Automated DP832 Calibration
« Reply #106 on: July 21, 2020, 08:28:14 pm »
Interesting.  What is the story with the calibration password?  I've seen it said to be 2012, but the script garrettm used has 11111.  Will the remote command take any password?
 

Offline alank2

Re: Automated DP832 Calibration
« Reply #107 on: July 21, 2020, 08:43:28 pm »
sequoia - if you enter cal mode and do a set command, what is the measured output for the factory default first points (CH1 0.73, CH2 1, CH3 0.115)?  Just below zero?  Just above zero?  Further above zero?  If it is further above zero, part of me wonders how it interpolates without a lower value.  I wonder if it looks at cal points 1 and 2 (using your graph example) and assumes a cal 0 based on those, if that makes sense.  I am assuming that VDAC 0.730V has some conversion to an integer DAC value that is handled in their firmware.

 

Offline sequoia

Re: Automated DP832 Calibration
« Reply #108 on: July 21, 2020, 09:43:07 pm »

Seems as if the factory calibration has chosen the first calibration points so that the measured value is near 0V....

V DAC:

(channel: 1st cal point / measured value)
CH1: 0.730 / -0.05206
CH2: 1.000 /  0.32402
CH3: 0.115 /  0.08842


 

Offline Wolfgang

  • Super Contributor
  • ***
  • Posts: 1477
  • Country: de
  • Its great if it finally works !
    • Electronic Projects for Fun
Re: Automated DP832 Calibration
« Reply #109 on: July 21, 2020, 09:59:11 pm »

Seems as if factory calibration has chosen the first calibration points so that measurement is near 0V....

V DAC:

(channel: 1st cal point / measured value)
CH1: 0.730 / -0.05206
CH2: 1.000 /  0.32402
CH3: 0.115 /  0.08842

Plausible. Cal with negative ADC values makes no sense. What I did is
- use a manual voltage/current list with high resolution at very low voltages (0.1V steps)
- ignore negative calibration points

Result:
- All voltages/currents *above* the ignored points are accurate
- The points below (i.e. very low voltages/currents) are so inaccurate anyway that they should not be used
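That procedure can be sketched in a few lines (step sizes are just the ones described above; `drop_negative` discards pairs whose measured value came back negative):

```python
def cal_point_list(v_max=32.0, fine_top=1.0, fine_step=0.1, coarse_step=1.0):
    """Setpoint list with high resolution at the bottom of the range:
    fine_step up to fine_top, then coarse_step up to v_max."""
    pts, v = [], 0.0
    while v < fine_top - 1e-9:
        pts.append(round(v, 3))
        v += fine_step
    while v <= v_max + 1e-9:
        pts.append(round(v, 3))
        v += coarse_step
    return pts

def drop_negative(readings):
    """Keep only (setpoint, measured) pairs with a non-negative reading."""
    return [(s, m) for s, m in readings if m >= 0.0]
```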
 

Offline alank2

Re: Automated DP832 Calibration
« Reply #110 on: July 21, 2020, 10:35:58 pm »
On your CH2 sequoia if you set it to 100mV, 50mV, 20mV, 10mV, 5mV, 2mV, 1mV - does it do a good job of outputting those?
 

Offline sequoia

Re: Automated DP832 Calibration
« Reply #111 on: July 21, 2020, 10:50:33 pm »
On your CH2 sequoia if you set it to 100mV, 50mV, 20mV, 10mV, 5mV, 2mV, 1mV - does it do a good job of outputting those?

Error is below 0.5mV, which seems well within the specifications.
 

Offline alank2

Re: Automated DP832 Calibration
« Reply #112 on: July 21, 2020, 11:13:10 pm »
Superb - I have a feeling they must have a way to look at the 1st point (1.00V) and the next higher one up and use that to extrapolate even below the 1st point.
 

Offline sequoia

Re: Automated DP832 Calibration
« Reply #113 on: July 21, 2020, 11:56:37 pm »
Superb - I have a feeling they must have a way to look at the 1st point (1.00V) and the next higher one up and use that to extrapolate even below the 1st point.

Maybe they just use the offset from the first calibration point for any value below it... i.e. there is effectively always a "0" calibration point
at 0.000 with the same correction as the first calibration point, so there is nothing to extrapolate, as long as the last calibration point is set to the maximum value of the range (which seems to be the case with the factory calibrations).
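That guess can be written down as an interpolation rule (a sketch of what the firmware *might* do, not anything confirmed): hold the first point's correction constant below it, interpolate linearly between points, and hold the last point's correction above it.

```python
def error_at(x, cal):
    """cal: sorted list of (setpoint, correction) pairs.  Constant
    extrapolation below the first point (equivalent to an implicit
    (0, first_correction) point) and above the last; linear in between."""
    if x <= cal[0][0]:
        return cal[0][1]
    if x >= cal[-1][0]:
        return cal[-1][1]
    for (x0, e0), (x1, e1) in zip(cal, cal[1:]):
        if x0 <= x <= x1:
            return e0 + (e1 - e0) * (x - x0) / (x1 - x0)
```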


« Last Edit: July 22, 2020, 12:02:35 am by sequoia »
 

Offline alank2

Re: Automated DP832 Calibration
« Reply #114 on: July 22, 2020, 03:41:21 pm »
I am not sure I am following you exactly.

I'm thinking that perhaps the thing to do on the cal is to use a binary search to find the 0.000 point (start at 0.5V and move up/down depending on the reading).  Once that point is established, split the remaining points (36, 46, ?) evenly up to the maximum value of 32V.  Let's say 0.52V is my 0 point; then (32-0.52V)/35 gives the increase for the other steps.  I'll attach an Excel screenshot.  This spacing is even, but does it give enough points at the low end, or should they be denser there?
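That plan sketched in code (pure logic only; `measure(v)` stands in for "set the DAC and read the DMM", the window and step arithmetic mirror the numbers above, and the bisection assumes the output is monotonic in that small window):

```python
def find_zero_dac(measure, lo=0.0, hi=1.0, tol=1e-3):
    """Bisect for the DAC setpoint whose measured output crosses 0 V.
    Assumes measure(lo) < 0 <= measure(hi) and monotonic output."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if measure(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def spread_points(zero, v_max=32.0, n=36):
    """The zero point plus (n-1) evenly spaced points up to v_max,
    i.e. the (32 - zero)/35 arithmetic above."""
    step = (v_max - zero) / (n - 1)
    return [round(zero + i * step, 4) for i in range(n)]
```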
 

Offline alank2

Re: Automated DP832 Calibration
« Reply #115 on: July 30, 2020, 12:49:02 am »
Anything new on this?

Any comments on the DACs being more linear and the ADCs being more logarithmic?

Looking at the standard values, it seems the DAC for the 32V channels ranges from a 0.3V increase between steps to a final 1.8V increase.  If I determine the value that gives 0, is there any downside to just using uniform steps between it and 32V instead of the default ones?

Then the ADC is: 0, 0.05, 0.1, 0.5, 1, 5, 10, 12.8, 20, 30, 32 - should those steps just be left as is?

 

Offline bson

Re: Automated DP832 Calibration
« Reply #116 on: July 30, 2020, 03:20:52 am »
I'm thinking that perhaps the thing to do on the cal is to use a binary search to find the 0.000 point (start at 0.5V and move up/down depending on the reading).
I don't think that will work.  The values aren't necessarily monotonically increasing below 0, which eliminates all bisection algorithms (of which binary search is one).

(Actually, I take that back; binary search will work, while many other bisection algorithms won't.)
« Last Edit: July 30, 2020, 03:27:16 am by bson »
 

Offline alank2

Re: Automated DP832 Calibration
« Reply #117 on: July 30, 2020, 04:24:02 pm »
Here is my plan:

CH1/CH2/CH3 volt dac - try to determine the zero point by doing a binary search between 0V-1V.  Once I find it, then do a linear range with the other 35 points up to 32V.
CH1/CH2/CH3 volt adc - use Rigol's points.
CH1/CH2/CH3 amp dac - use Rigol's points.
<Delay between these steps at least 15 minutes>
CH1/CH2/CH3 amp adc - use Rigol's points.

Has anyone ever seen the amp DAC show the same problem we've seen with the volt DAC - a higher point having a lower value - or is that only in the volt range?
 

