Author Topic: Automated DP832 Calibration  (Read 16507 times)


Offline Gandalf_SrTopic starter

  • Super Contributor
  • ***
  • Posts: 1729
  • Country: us
Automated DP832 Calibration
« on: March 17, 2020, 10:35:35 am »
This thread is planned to be a step-by-step guide to calibrating a Rigol DP832(A) power supply using different languages such as Python and Java.  I managed to do this successfully using my Keysight 34461A meter, but my Python environment is not set up right; I hope to figure out the correct way to make it all work and share that with the forum.

This is the latest proven method, using Python & PyCharm; it worked for me.

There are problems caused by Windows fighting the Python environment on security grounds. It has been suggested that Python can be run in the (free) Microsoft Visual Studio; I will try this out.

This first post will get updated periodically to hold the latest, proven instructions on how to set it all up properly.
« Last Edit: March 19, 2020, 03:36:26 pm by Gandalf_Sr »
If at first you don't succeed, get a bigger hammer
 
The following users thanked this post: tangram, Daruosha, Simon_RL

Offline Gandalf_SrTopic starter

  • Super Contributor
  • ***
  • Posts: 1729
  • Country: us
Re: DP832 Calibration using Python & PyCharm Running on Windows
« Reply #1 on: March 17, 2020, 11:04:06 am »
So the Rigol DP832 - Firmware List and Bugs thread was getting clogged up with info on calibration, with competition between a Java solution (which erased my DP832 cal and then crashed) and a Python solution.  In the end I made the Python solution work, but there are many pitfalls, and I ended up with a badly installed Python setup that needed configuration changes every time I opened it - I'm trying to fix this and will then put the results in the first post of the thread.  My setup specifically uses a Keysight 34461A 6.5-digit multimeter as the accurate reference meter but maybe we can figure out how to adapt it to other meters too?  All of what follows is my understanding so, if I'm wrong, please chip in and correct me and I will edit the posts to be as accurate as possible - in the end, I want to make the first post the go-to, tested instructions on how to set things up.

So let's look at the basics:

1. Python is an interpreted language, which means it isn't compiled to a .exe file that runs natively on a PC; instead it needs an environment that takes a line at a time and translates it into actions, such as sending and receiving signals to/from remote devices.

2. We need a user-friendly interface, and the de facto choice is PyCharm, an Integrated Development Environment (front end) for Python - but PyCharm doesn't include the interpreter.

This is the first gotcha and the reason my environment is dodgy: there's a version of Python offered by Microsoft through the search menu, but it doesn't work properly with PyCharm - it seems MS, in their bid to protect us, stop programs from doing simple stuff like letting us make changes to the code.  To get around this, PyCharm goes to great lengths to create a virtual environment by copying the Python interpreter into it, so we can add extra packages without upsetting the MS security bot that lives inside every Windows system.

So here are the first steps to get the environment set up properly:

1. Go to www.python.org and download Python (3.8.2 at time of writing) taking all the default settings.
2. Go to www.jetbrains.com/pycharm/ and download the (free) Community version of PyCharm, taking all the default settings.

To be continued
« Last Edit: March 17, 2020, 11:13:24 am by Gandalf_Sr »
If at first you don't succeed, get a bigger hammer
 
The following users thanked this post: klamath

Offline Gandalf_SrTopic starter

  • Super Contributor
  • ***
  • Posts: 1729
  • Country: us
Re: DP832 Calibration using Python & PyCharm Running on Windows
« Reply #2 on: March 17, 2020, 11:18:59 am »
It seems clear that what is wrong with my Python project is that it isn't running in a virtual environment created and maintained by PyCharm.  If I create a new project from PyCharm's opening window, the virtual environment is created correctly, and I can close PyCharm, reopen it, and find everything the same as I left it.

So the thing I need help on right now is how to move the files supplied by JDubU to the newly created environment.
If at first you don't succeed, get a bigger hammer
 

Offline JDubU

  • Frequent Contributor
  • **
  • Posts: 441
  • Country: us
Re: DP832 Calibration using Python & PyCharm Running on Windows
« Reply #3 on: March 17, 2020, 12:55:06 pm »
To establish a default environment for projects in PyCharm:

In PyCharm, select menu item:      File > Other Settings > Settings for New Projects...

You can make whatever changes or additions you want there but, at a minimum, click on Project Interpreter in the left column and select the (already installed) default version of Python that you want to use.
When you do that, it should show all of the add-on packages that have been installed for that version of Python (e.g. python-ivi, python-vxi11, numpy).  If not, you can add them here using the '+' button at the top of the right-hand column.

When opening an existing project, you need to select the folder that contains all of the project files (not the top level .py file).  This will let you open and view all of the files and any documentation that is associated with that project from within PyCharm.  They will appear in the left column of the PyCharm IDE.
 

Offline Gandalf_SrTopic starter

  • Super Contributor
  • ***
  • Posts: 1729
  • Country: us
Re: DP832 Calibration using Python & PyCharm Running on Windows
« Reply #4 on: March 17, 2020, 02:27:44 pm »
I'm working through this process and here are the notes I have so far...

PyCharm setup
   1. Download the ivi project folder and place it in the C:\Users\ted\PycharmProjects folder
   2. I renamed the long folder name of "Python 3.x DP832 calibration - 34461A version" to "MyDP832Cal" so the full path was…
   C:\Users\ted\PycharmProjects\MyDP832Cal\venv\Lib\site-packages\ivi\agilent
   3. Run up PyCharm and, from the front screen, click the folder icon in the center
   4. Navigate to C:\Users\ted\PycharmProjects\MyDP832Cal, select the folder, and click Open
   5. It opens with what looks like a web page, but you can click on the left side to open up the project explorer view (picture)
   6. Exited PyCharm, ran it up again, and got the edited project back (this is promising)
   7. Went to File > Settings and added packages, which all took
   except numpy, which gives an error saying it needs Visual C++ version xxx? But I think I already have that???
   8. So I exited PyCharm and went to a PowerShell, where I was able to pip install numpy successfully
   9. At this point I updated the files at C:\Users\ted\PycharmProjects\MyDP832Cal\venv\Lib\site-packages\ivi\agilent
   10. Back into PyCharm, and numpy is not one of the packages in my project, and it still won't install

So I have a problem: it looks like the virtual environment can't see the Visual Studio tooling needed to install numpy. Any ideas?

[Edit] Maybe a reboot will fix this, I have to stop tinkering and do some work now.  Thanks for the help.
« Last Edit: March 17, 2020, 02:43:26 pm by Gandalf_Sr »
If at first you don't succeed, get a bigger hammer
 

Offline JDubU

  • Frequent Contributor
  • **
  • Posts: 441
  • Country: us
Re: DP832 Calibration using Python & PyCharm Running on Windows
« Reply #5 on: March 17, 2020, 03:32:24 pm »
I think there is a misunderstanding about the add-on library packages. They are associated with the Python interpreter, not with an individual project.  Think of them as modifications of the Python interpreter.

At this point, I would recommend uninstalling everything and starting over with a clean installation of Python, PyCharm and project folders.

Installation sequence:

1) Install Python 3.8.  Do not change any default installation settings. 
    It should automatically be installed at: C:\Users\ted\AppData\Local\Programs\Python\Python38

2) Install PyCharm with its default installation settings.

3) Start PyCharm and set the default Python interpreter ( File > Other Settings > Settings for New Projects...) to Python 3.8 and verify that it shows the above installation path in the drop down box.

4) Use the '+' button to add python-ivi, python-vxi11 and numpy to the Python 3.8 interpreter.

5)In Windows, navigate to:    C:\Users\ted\AppData\Local\Programs\Python\Python38\Lib\site-packages\ivi\agilent
and copy/replace the three Keysight/Agilent specific DMM files

6) In PyCharm, open a clean version of the DP832 Calibration project folder.  It should contain only:

  calibrate.py
  README.md
  cal folder
       __init__.py
       DP832Cal.py

Anything else in the project folder or cal folder should be deleted.

7) Run calibrate.py the first time by right-clicking on its name in the Project tree (top left column) and selecting 'Run calibrate' from the pop-up menu.
After that, you can run it with the menu item: Run > Run 'calibrate'
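Before the first full run it's worth confirming that both instruments answer over the network. A minimal check with python-vxi11 (the IP addresses are placeholders for your own):

Code:
import vxi11

psu = vxi11.Instrument("192.168.1.10")   # DP832 (placeholder IP)
dmm = vxi11.Instrument("192.168.1.20")   # 34461A (placeholder IP)
print(psu.ask("*IDN?"))                  # should name the Rigol
print(dmm.ask("*IDN?"))                  # should name the Keysight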

Edit: Modified step 7 to simplify first run (auto creation of run/debug configuration parameters)
« Last Edit: March 18, 2020, 03:54:02 am by JDubU »
 

Offline Gandalf_SrTopic starter

  • Super Contributor
  • ***
  • Posts: 1729
  • Country: us
Re: DP832 Calibration using Python & PyCharm Running on Windows
« Reply #6 on: March 17, 2020, 05:30:38 pm »
Thanks, I'll try that later.
If at first you don't succeed, get a bigger hammer
 

Offline skander36

  • Frequent Contributor
  • **
  • Posts: 722
  • Country: ro
Re: DP832 Calibration using Python & PyCharm Running on Windows
« Reply #7 on: March 17, 2020, 07:54:21 pm »
@GandalfSr - Maybe you should try MS Visual Studio Code, as I suggested before. It is more straightforward than PyCharm. You don't need to reinstall packages for every project.
https://code.visualstudio.com/
« Last Edit: March 17, 2020, 08:20:51 pm by skander36 »
 

Offline skander36

  • Frequent Contributor
  • **
  • Posts: 722
  • Country: ro
Re: DP832 Calibration using Python & PyCharm Running on Windows
« Reply #8 on: March 17, 2020, 08:20:03 pm »
Siglent SDM3065X owners:
Attached is a fork of @JDubU's DP832 Automated Calibration script for Python 3, with added snippets for switching the DMM to DC current and back to DC voltage. It was tested with a Siglent SDM3065X, but it should also work with the SDM3045X and SDM3055 series.

You need a Python IDE (MS Visual Studio Code - recommended - or PyCharm) and the following packages:
python-ivi
python-vxi11
numpy

For testing, make sure that line 68 of calibrate.py reads: update_calibration = False. Once you are sure the simulated calibration has completed successfully, edit that line to update_calibration = True and do a real calibration.
Remember: you do this at your own risk!
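For clarity, the flag in question looks like this in the script (the exact line number may differ between forks):

Code:
# calibrate.py -- dry-run first: readings are taken but nothing is
# written to the PSU's non-volatile calibration memory
update_calibration = False
# only after a clean simulated pass, switch to:
# update_calibration = True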

 
« Last Edit: March 19, 2020, 07:32:36 pm by skander36 »
 
The following users thanked this post: tautech, klamath

Offline thm_w

  • Super Contributor
  • ***
  • Posts: 6349
  • Country: ca
  • Non-expert
Re: DP832 Calibration using Python & PyCharm Running on Windows
« Reply #9 on: March 17, 2020, 09:08:20 pm »
@GandalfSr - Maybe you should try MS Visual Studio Code, as I suggested before. It is more straightforward than PyCharm. You don't need to reinstall packages for every project.
https://code.visualstudio.com/

https://www.jetbrains.com/help/pycharm/installing-uninstalling-and-upgrading-packages.html

Quote
Reuse installed packages

Create a new virtual environment and install packages that you want to be used in other projects. Then you can specify this virtual environment as a project interpreter for the target project and all the needed packages will be available.
Profile -> Modify profile -> Look and Layout ->  Don't show users' signatures
 

Offline Gandalf_SrTopic starter

  • Super Contributor
  • ***
  • Posts: 1729
  • Country: us
Re: DP832 Calibration using Python & PyCharm Running on Windows
« Reply #10 on: March 17, 2020, 09:13:34 pm »
Thanks for all the suggestions, I'll try to get to this tomorrow.
If at first you don't succeed, get a bigger hammer
 

Offline Gandalf_SrTopic starter

  • Super Contributor
  • ***
  • Posts: 1729
  • Country: us
Re: DP832 Calibration using Python & PyCharm Running on Windows
« Reply #11 on: March 18, 2020, 12:07:54 am »
@GandalfSr - Maybe you should try MS Visual Studio Code, as I suggested before. It is more straightforward than PyCharm. You don't need to reinstall packages for every project.
https://code.visualstudio.com/
So do you run Python inside MS Visual Studio?
If at first you don't succeed, get a bigger hammer
 

Offline typoknig

  • Regular Contributor
  • *
  • Posts: 103
Re: DP832 Calibration using Python & PyCharm Running on Windows
« Reply #12 on: March 18, 2020, 02:06:13 am »
1. Python is an interpreted language, which means it isn't compiled to a .exe file that runs natively on a PC; instead it needs an environment that takes a line at a time and translates it into actions, such as sending and receiving signals to/from remote devices.

Python scripts can be bundled into an EXE.  There are several modules that can do that.  The one I use is PyInstaller.
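For example, once PyInstaller has been installed with pip, it can even be driven from Python itself; a minimal sketch (the script name is just an example):

Code:
import PyInstaller.__main__

# bundle calibrate.py into a single self-contained executable (dist/calibrate.exe)
PyInstaller.__main__.run(["calibrate.py", "--onefile"])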
 

Offline tautech

  • Super Contributor
  • ***
  • Posts: 28327
  • Country: nz
  • Taupaki Technologies Ltd. Siglent Distributor NZ.
    • Taupaki Technologies Ltd.
Re: DP832 Calibration using Python & PyCharm Running on Windows
« Reply #13 on: March 18, 2020, 03:11:43 am »
Siglent SDM3065X owners:
Attached is a fork of @JDubU's DP832 Automated Calibration script for Python 3, with added snippets for switching the DMM to DC current and back to DC voltage. It was tested with a Siglent SDM3065X, but it should also work with the SDM3045X and SDM3055 series.

You need a Python IDE (MS Visual Studio Code - recommended - or PyCharm) and the following packages:
python-ivi
python-vxi11
numpy

For testing, make sure that line 68 reads: update_calibration = False. Once you are sure the simulated calibration has completed successfully, edit that line to update_calibration = True and do a real calibration.
Remember: you do this at your own risk!
Cool, thanks for this.
Something has been developed for SDM user adjustment at the factory, and one of my customers is currently trialling it with his SDM3065X; however, at this time Siglent are nervous about releasing anything into the public domain in case users don't have accurate enough references for manual calibration.

Could I trouble you to post about this in the SDM3000 thread:
https://www.eevblog.com/forum/testgear/siglent-new-bench-dmm-sdm3055/
TIA
Avid Rabid Hobbyist
Siglent Youtube channel: https://www.youtube.com/@SiglentVideo/videos
 

Offline skander36

  • Frequent Contributor
  • **
  • Posts: 722
  • Country: ro
Re: DP832 Calibration using Python & PyCharm Running on Windows
« Reply #14 on: March 18, 2020, 12:54:29 pm »
Hi tautech,
The target device is the Rigol DP832. The Siglent DMM is only the source of the calibration values, not the device being adjusted.
I found that Siglent's SCPI implementation is about the same as Keysight's or Keithley's, so I was able to run the scripts by changing only the ports used for SCPI over Telnet.
You can post this info to the Siglent thread.
I posted it here because it is about the Rigol DP832 (using various DMMs).
Thank you!
 
The following users thanked this post: tautech

Offline skander36

  • Frequent Contributor
  • **
  • Posts: 722
  • Country: ro
Re: DP832 Calibration using Python & PyCharm Running on Windows
« Reply #15 on: March 19, 2020, 09:29:04 am »
@GandalfSr - Maybe you should try MS Visual Studio Code, as I suggested before. It is more straightforward than PyCharm. You don't need to reinstall packages for every project.
https://code.visualstudio.com/
So do you run Python inside MS Visual Studio?

Yes. You can try it for yourself.
 
The following users thanked this post: Gandalf_Sr

Offline TurboTom

  • Super Contributor
  • ***
  • Posts: 1389
  • Country: de
Re: DP832 Calibration using Python & PyCharm Running on Windows
« Reply #16 on: March 19, 2020, 12:34:13 pm »
About two or three years ago, when I calibrated my DP832(A) with one of the "old scripts" (sorry, I don't remember which one it actually was), I also noticed the discrepancy at the low-voltage cal points, especially on one of the channels (at that time it didn't matter to me, since the voltage settings I normally use are far from this critical range and those were more than accurate enough for me).

Comparing the recently published cal lists, it seems the offset of the ADC in the low range is around negative 300-something millivolts. The problem is that the output stage can't reach those negative voltages during calibration of the 0mV and 200mV points (at least), resulting in wrong calibration-value feedback and the observed errors.

So what about trying a little "cheat"? First, just test the 0.5V and 1.2V cal points and evaluate the offset, and then (linearly) calculate the readback value for the 200mV (and possibly 0V) cal point to feed back during the "real" calibration pass. Anybody want to give it a try? Unfortunately, my coding skills are close to none and, moreover, I currently haven't got a cal setup running, otherwise I'd give it a try. Using this approach, we may get a much better low-voltage calibration result than is currently possible.
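A minimal sketch of that extrapolation (the names are just illustrative):

Code:
def extrapolate_readback(set_a, meas_a, set_b, meas_b, set_low):
    # linearly extrapolate the expected readback at a low cal point
    # (e.g. 0V or 200mV) from two points that measure cleanly
    slope = (meas_b - meas_a) / (set_b - set_a)
    return meas_a + slope * (set_low - set_a)

# e.g. 0.310V measured at the 0.5V point and 1.001V at the 1.2V point:
print(extrapolate_readback(0.5, 0.310, 1.2, 1.001, 0.2))  # ~0.014V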

Cheers,
Thomas
 

Offline skander36

  • Frequent Contributor
  • **
  • Posts: 722
  • Country: ro
Re: DP832 Calibration using Python & PyCharm Running on Windows
« Reply #17 on: March 19, 2020, 01:30:57 pm »
This is a good idea; maybe JDubU can fork the Python script to do that.
But you could post this in the other thread (https://www.eevblog.com/forum/testgear/rigol-dp832-firmware-updates-and-bug-list/new/?topicseen#new), where maybe garrettm can adapt his Java script to do this offset evaluation.
 

Offline Gandalf_SrTopic starter

  • Super Contributor
  • ***
  • Posts: 1729
  • Country: us
Re: DP832 Calibration using Python & PyCharm Running on Windows
« Reply #18 on: March 19, 2020, 01:43:03 pm »
I started this thread to stop clogging up the bug list thread and to have one on PC-based calibration - I don't mind changing the title so it no longer mentions PyCharm, but sending Tom over to the firmware updates & bug list thread seems wrong IMHO.

Just trying to keep some semblance of organization on these threads.
If at first you don't succeed, get a bigger hammer
 

Offline skander36

  • Frequent Contributor
  • **
  • Posts: 722
  • Country: ro
Re: DP832 Calibration using Python & PyCharm Running on Windows
« Reply #19 on: March 19, 2020, 02:16:05 pm »
Then you may want to change the title of the thread. It limits the language to Python, and the other script is in Java. Otherwise, if we discuss that one here, you may send us off to create yet another thread ... :D
Later edit - suggested title: Automatic calibration of DP832/A
« Last Edit: March 19, 2020, 02:18:08 pm by skander36 »
 

Offline Gandalf_SrTopic starter

  • Super Contributor
  • ***
  • Posts: 1729
  • Country: us
Re: Automated DP832 Calibration
« Reply #20 on: March 19, 2020, 02:24:29 pm »
I've changed it to "Automated DP832 Calibration".
If at first you don't succeed, get a bigger hammer
 
The following users thanked this post: skander36

Offline skander36

  • Frequent Contributor
  • **
  • Posts: 722
  • Country: ro
Re: Automated DP832 Calibration
« Reply #21 on: March 19, 2020, 03:00:42 pm »
TurboTom's approach is very good, especially for cases like mine, where I have a negative offset on CH2 by default. If I use different sets of cal data for each channel, I can solve the problem. I have measured the offset and created different data for CH2 (cal_dacv1/cal_dacv2 for my PSU) but, as Tom said, for more comfort we need a loop that measures the offset and sets an appropriate calibration value.
 

Offline garrettm

  • Frequent Contributor
  • **
  • Posts: 267
  • Country: us
Re: DP832 Calibration using Python & PyCharm Running on Windows
« Reply #22 on: March 21, 2020, 03:03:58 am »
Comparing the recently published cal lists, it seems the offset of the ADC in the low range is around negative 300-something millivolts. The problem is that the output stage can't reach those negative voltages during calibration of the 0mV and 200mV points (at least), resulting in wrong calibration-value feedback and the observed errors.

So what about trying a little "cheat"? First, just test the 0.5V and 1.2V cal points and evaluate the offset, and then (linearly) calculate the readback value for the 200mV (and possibly 0V) cal point to feed back during the "real" calibration pass. Anybody want to give it a try? Unfortunately, my coding skills are close to none and, moreover, I currently haven't got a cal setup running, otherwise I'd give it a try. Using this approach, we may get a much better low-voltage calibration result than is currently possible.

That would be very easy to do.

But to clarify: do you want to avoid using cal points that return a negative readback value? Or are you suggesting returning a calculated value (rather than a measured value) for those lower cal points that measure negative?

I've been busy with work lately, but I've been meaning to write a new script to read back the entire output range in 10 mV increments to see what the raw output looks like in calibration mode. This data might help in figuring out where more cal points would actually help with linearizing the output.
« Last Edit: March 21, 2020, 03:18:45 am by garrettm »
 

Offline TurboTom

  • Super Contributor
  • ***
  • Posts: 1389
  • Country: de
Re: Automated DP832 Calibration
« Reply #23 on: March 21, 2020, 05:28:31 pm »
@Garrettm: Initially, I meant to use calculated, extrapolated readback voltages for the lowest calibration points. My suggestion is more or less an educated guess that may explain the peculiar behavior of the PSU at low output voltages. It is based on the evidence from the recently published cal output lists and also on my own findings, which suggest that there is a (more or less) constant negative offset of the instrument output that gets truncated by the hardware at the lowest cal points. So for these points, a proper calibration isn't possible. If the instrument's internal calibration routine actually accepts negative readback values, this approach may lead to a more accurate calibration.

But further reasoning may even suggest that the internal calibration routine of the DP832 ignores negative readback values altogether and simply uses the offset of the first positive readback to correct all the lower output voltages. If no slope approximation is used, this may actually explain why one channel can be fairly accurate and another off by several tens of millivolts at the low end.

I guess my first suggestion is worth a try anyway. But if the cal routine really ignores negative voltages, and it's possible to use arbitrary cal points, it may be a good idea to run a test cal to find the lowest voltage preset that results in a positive output and use this as the first calibration point. This way, the calibration for all the lower voltages may be more accurate.  ...once again, an educated guess...  ;)

Cheers,
Thomas
 

Offline JDubU

  • Frequent Contributor
  • **
  • Posts: 441
  • Country: us
Re: Automated DP832 Calibration
« Reply #24 on: March 22, 2020, 02:28:32 am »
I've modified the Python calibration script to help optimize the near-zero calibration.
It finds the lowest-value calibration point that produces a non-negative DMM readback and uses it as the first calibration point in the sequence.  Any calibration points in the tables that would produce a negative DMM readback are skipped.

The file DP832Cal.py is in the attached zip file.  Just replace the one that is already in the cal folder.  Be sure to save the old file; this one has had very limited testing and may have bugs.
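For anyone curious, the skipping logic amounts to something like this (a simplified sketch, not the actual code from the zip):

Code:
def usable_cal_points(points, set_output, read_dmm):
    # keep only the cal points whose DMM readback is non-negative;
    # the first survivor becomes the first point in the sequence
    kept = []
    for p in points:
        set_output(p)
        if read_dmm() >= 0.0:
            kept.append(p)
    return kept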
 

Offline garrettm

  • Frequent Contributor
  • **
  • Posts: 267
  • Country: us
Re: Automated DP832 Calibration
« Reply #25 on: March 22, 2020, 04:29:16 am »
You've given me some great ideas Thomas. I'll play around with both interpolated "measured" values for cal points that measure negative and also try skipping negative cal points as JDubU has done. If either of these actually make a difference using the standard point list from TooOldForThis, I'll upload a new version of the Java cal script using the improved routine.

It'll probably be a few days until I can work on this, but it shouldn't take much effort to do. I only need to run voltage calibrations for ch 1 and 2 a couple of times and run another script to save measurements of the output as a CSV file for plotting and comparison.
 

Offline Gandalf_SrTopic starter

  • Super Contributor
  • ***
  • Posts: 1729
  • Country: us
Re: Automated DP832 Calibration
« Reply #26 on: March 22, 2020, 09:22:43 am »
I've modified the Python calibration script to help optimize the near-zero calibration.
It finds the lowest-value calibration point that produces a non-negative DMM readback and uses it as the first calibration point in the sequence.  Any calibration points in the tables that would produce a negative DMM readback are skipped.

The file DP832Cal.py is in the attached zip file.  Just replace the one that is already in the cal folder.  Be sure to save the old file; this one has had very limited testing and may have bugs.
Thanks, is this one 34461A-compatible?
If at first you don't succeed, get a bigger hammer
 

Offline JDubU

  • Frequent Contributor
  • **
  • Posts: 441
  • Country: us
Re: Automated DP832 Calibration
« Reply #27 on: March 22, 2020, 01:01:04 pm »
Thanks, is this one 34461A-compatible?

Yes, there are no changes to the way it communicates with the dmm.
 

Offline skander36

  • Frequent Contributor
  • **
  • Posts: 722
  • Country: ro
Re: Automated DP832 Calibration
« Reply #28 on: March 22, 2020, 02:46:34 pm »
I ran the calibration using your latest version. Output attached.
The negative offset on CH2 is no longer present. Channels 1 & 2 are now outputting positive values of about 2 mV, but they are a little unstable, slowly varying between 0.4 and 2 mV. I think the source must be warmed up longer (over two hours) before calibration, but this will affect the values at start-up.
CH3 is the most precise and stable.
I think this approach is a good one, as it eliminates the negative offset.
Thank you for your work.
 

Offline JDubU

  • Frequent Contributor
  • **
  • Posts: 441
  • Country: us
Re: Automated DP832 Calibration
« Reply #29 on: March 22, 2020, 03:50:57 pm »
The negative offset on CH2 is no longer present. Channels 1 & 2 are now outputting positive values of about 2 mV, but they are a little unstable, slowly varying between 0.4 and 2 mV. I think the source must be warmed up longer (over two hours) before calibration, but this will affect the values at start-up.

Thanks for testing the new version.
I've also found that the DP832 definitely must warm up and stabilize in temperature before doing a calibration.
I see from your calibration log that your DP832 has a 0.356V negative DAC-V offset on CH1 and a 0.41V negative DAC-V offset on CH2.  As an experiment, it might help to remove the 0.5V calibration point from those calibration tables so that the next point is farther away at 0.7V.
 

Offline skander36

  • Frequent Contributor
  • **
  • Posts: 722
  • Country: ro
Re: Automated DP832 Calibration
« Reply #30 on: March 22, 2020, 04:42:07 pm »
Thanks for testing the new version.
I've also found that the DP832 definitely must warm up and stabilize in temperature before doing a calibration.
I see from your calibration log that your DP832 has a 0.356V negative DAC-V offset on CH1 and a 0.41V negative DAC-V offset on CH2.  As an experiment, it might help to remove the 0.5V calibration point from those calibration tables so that the next point is farther away at 0.7V.
Hi,
Attached is the output without the 0.5V value.
Unfortunately, I have found that running the script with "update_calibration = False" does not preserve the previous values. It deletes the calibrated values and leaves the defaults.
When I said the negative offset on the output was gone: before, CH2 always gave -40 mV for a 0V setting.
 

Offline JDubU

  • Frequent Contributor
  • **
  • Posts: 441
  • Country: us
Re: Automated DP832 Calibration
« Reply #31 on: March 22, 2020, 05:17:06 pm »
If "update_calibration = False", the previous calibration data is temporarily cleared but should return if you power cycle the DP832. 
Setting "update_calibration = True" records the new calibration data into non-volatile memory just before the script finishes. 
If you stop the script part way through the calibration, power cycling should also return the previous calibration. 

Did removing the 0.5V calibration point make any improvement in the accuracy near zero?
 

Offline skander36

  • Frequent Contributor
  • **
  • Posts: 722
  • Country: ro
Re: Automated DP832 Calibration
« Reply #32 on: March 22, 2020, 05:34:01 pm »
If "update_calibration = False", the previous calibration data is temporarily cleared but should return if you power cycle the DP832. 
Setting "update_calibration = True" records the new calibration data into non-volatile memory just before the script finishes. 
If you stop the script part way through the calibration, power cycling should also return the previous calibration. 

Did removing the 0.5V calibration point make any improvement in the accuracy near zero?

Hi JDubU,
As you can see from the log, the script ran to the end, so no Ctrl+C was pressed. It worked in previous versions; maybe something has changed that affects this function. The values were not restored after power cycling.
Removing the 0.5V step did not improve accuracy; it was slightly worse. I got better results after two consecutive calibrations with the 0.5V value.
 

Offline JDubU

  • Frequent Contributor
  • **
  • Posts: 441
  • Country: us
Re: Automated DP832 Calibration
« Reply #33 on: March 22, 2020, 06:38:45 pm »
As you can see from the log, the script ran to the end, so no Ctrl+C was pressed. It worked in previous versions; maybe something has changed that affects this function. The values were not restored after power cycling.
Removing the 0.5V step did not improve accuracy; it was slightly worse. I got better results after two consecutive calibrations with the 0.5V value.

Nothing was changed that would save the new calibration data when "update_calibration = False". 
It would print "Updating calibration data for channel CH<channel number>" in the log if the command that saves the calibration was called, so I cannot explain why it happened.

I just noticed that it does show "Updating calibration data for channel CH<channel number>" in your first log (22032020_calib.txt) but not in your second log (22.032020 without 05.txt). 
Did you use the same "calibrate.py" file for both or maybe it was a problem with the Python file cache not being cleared?
 

Offline skander36

  • Frequent Contributor
  • **
  • Posts: 722
  • Country: ro
Re: Automated DP832 Calibration
« Reply #34 on: March 22, 2020, 07:06:51 pm »
I think it did not print that in the last run because I used update_calibration = False - which is how I found that it is no longer working  :)
I am using MS Visual Studio Code; I am not aware of a cache-clearing method.
This is not a big problem, just a secondary issue; I only wanted to use simulate mode so as not to lose the good calibration data I acquired today. But it was not to be ...  :)
 

Offline JDubU

  • Frequent Contributor
  • **
  • Posts: 441
  • Country: us
Re: Automated DP832 Calibration
« Reply #35 on: March 24, 2020, 02:13:01 am »
I did an experiment to read back the output voltages corresponding to a section of one of the DP832's internal calibration tables after its calibration points had been cleared.
The idea is to try to discover native non-linearities between the table and the output voltage, which would help in optimizing the location of the calibration points.
For this initial test, I only read back a portion of DAC-V on channel 2, between 1V and 2V in 10mV increments.  The captured data is in the attached spreadsheet.
 
Also in the spreadsheet are two plots. 
The first is actual output voltage (with no calibration) vs. expected voltage (if it had been perfectly calibrated). 
The second plot is the difference between expected (calibrated) voltage and actual (uncalibrated) voltage vs expected voltage.  If it were perfectly calibrated this would be a horizontal line at zero volts.
As expected, there is a linear gain and offset difference in the uncalibrated output, but there are also some non-linearities that are more obvious in the second plot.
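For anyone who wants to reproduce the sweep, it is only a few lines with python-vxi11 (a sketch: the IP addresses are placeholders, and this uses the normal :APPL set command rather than the cal-mode commands):

Code:
import time
import vxi11

psu = vxi11.Instrument("192.168.1.10")   # DP832 (placeholder IP)
dmm = vxi11.Instrument("192.168.1.20")   # 6.5-digit DMM (placeholder IP)

psu.write(":OUTP CH2,ON")
data = []
for mv in range(1000, 2001, 10):         # 1V to 2V in 10mV steps
    psu.write(":APPL CH2,%.3f" % (mv / 1000.0))
    time.sleep(0.5)                      # let the output settle
    data.append((mv / 1000.0, float(dmm.ask("READ?"))))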

Comments?
« Last Edit: March 24, 2020, 03:16:57 am by JDubU »
 
The following users thanked this post: thm_w, TurboTom, garrettm

Offline TurboTom

  • Super Contributor
  • ***
  • Posts: 1389
  • Country: de
Re: Automated DP832 Calibration
« Reply #36 on: March 24, 2020, 02:28:10 am »
That's what is to be expected if the test increments (i.e. millivolt steps) don't coincide with a whole-number multiple of the DAC steps. The "ripples" in the error are just the result of the "interference" between the two slightly offset "frequencies" (simply speaking...  ;)).


Edit: meant DAC but typed ADC... I hope this eliminates the confusion, sorry for that.
« Last Edit: March 24, 2020, 12:29:07 pm by TurboTom »
 

Offline JDubU

  • Frequent Contributor
  • **
  • Posts: 441
  • Country: us
Re: Automated DP832 Calibration
« Reply #37 on: March 24, 2020, 02:39:26 am »
That's what is to be expected if the test increments (i.e. millivolt steps) don't coincide with a whole-number multiple of the ADC steps. The "ripples" in the error are just the result of the "interference" between the two slightly offset "frequencies" (simply speaking...  ;)).

When you say "ADC steps" are you talking about the external 6 1/2 digit dmm and not the DP832's readback ADC?  I would think that the DP832's readback ADC was not involved in the voltage output unless it is part of a voltage output feedback loop.
 

Offline garrettm

  • Frequent Contributor
  • **
  • Posts: 267
  • Country: us
Re: Automated DP832 Calibration
« Reply #38 on: March 24, 2020, 03:30:22 am »
Thanks for uploading that Excel sheet, JDubU. The error plot looks like a 2nd-order polynomial, though we would need to see the whole output, probably in larger 100mV increments, to really know. Hopefully I'll have time this weekend to contribute some data on this.

The dips are where the output remains at the same value. You can see there is an alternating pattern in how many increments pass before the next digit isn't expressible. I observed the same thing when printing custom characters as a volume bar on a display for my ES9018 DAC (the attached Excel doc shows this graphically). This suggests the folks at Rigol likely used a 14-bit DAC, as 1 mV adjustment at 32V full scale requires a minimum of 15 bits to express every possible output value.
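The minimum bit count follows directly from the step count:

Code:
import math

levels = 32.0 / 0.001                 # 32V span in 1mV steps = 32000 levels
print(math.ceil(math.log2(levels)))   # 15 bits minimum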
« Last Edit: March 24, 2020, 03:47:40 am by garrettm »
 

Offline Gandalf_SrTopic starter

  • Super Contributor
  • ***
  • Posts: 1729
  • Country: us
Re: Automated DP832 Calibration
« Reply #39 on: March 24, 2020, 08:27:25 am »
All very interesting guys, thanks for sharing.

After PyCharm Cal with my 34461A, when I dial up 5.000 V from my DP832A, I get 5.000 V.

So my question is, is this discrepancy only there at <2V levels?
If at first you don't succeed, get a bigger hammer
 

Offline TurboTom

  • Super Contributor
  • ***
  • Posts: 1389
  • Country: de
Re: Automated DP832 Calibration
« Reply #40 on: March 24, 2020, 11:42:52 am »
I'd rather say Rigol uses at least 16-bit DACs in the DP832 PSU, which are sufficient for 1mV resolution over a 32V output range. The problem is that - due to tolerances of the analog circuitry, and some DAC range "wasted" at the bottom and top ends of the adjustable range for calibration purposes - it cannot be guaranteed that every millivolt preset step is represented by an integer number of DAC LSBs. On my DP832 I measured the 1mV increments to be more like 1.05mV, with every tenth or so being only 0.5mV to stay within range (i.e. to compensate for the cumulative error). I assume I'm mostly seeing 2-LSB increments, and 1 LSB in the case of the "small" step. CH3 is likely equipped with the same DAC but, since its voltage range is much smaller, the problem isn't as visible there.
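A back-of-envelope check of the 16-bit hypothesis (the internal span is an assumption; the cal headroom makes it somewhat wider than 32V):

Code:
span = 34.0             # V, assumed DAC span including cal headroom
lsb = span / 2**16      # ~0.52mV per LSB
print(2 * lsb, lsb)     # 2 LSB ~1.04mV, 1 LSB ~0.52mV -- matching the
                        # observed ~1.05mV steps with occasional 0.5mV ones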

At least every individual preset change is accompanied by an output voltage change in the correct direction. What else to ask for?  8)
« Last Edit: March 24, 2020, 12:33:30 pm by TurboTom »
 
The following users thanked this post: JDubU, thm_w, Gandalf_Sr

Offline JDubU

  • Frequent Contributor
  • **
  • Posts: 441
  • Country: us
Re: Automated DP832 Calibration
« Reply #41 on: March 25, 2020, 12:15:05 am »
Using the Python program that eliminates the negative calibration points, I did a full calibration of my DP832.  I then ran sweeps of each of the channels and plotted the difference between the set and actual output voltages vs the set voltages.  The results are attached.   A spreadsheet of the calibration points used for each channel is also attached.

On channel 3, you can see that the zero crossovers in the difference sweep are close to the small number of calibration points. 
Not so much for channels 1 and 2 (much wider voltage range and many more calibration points).



Edit:  Fixed plot axis ranges for channels 1 and 2
« Last Edit: March 25, 2020, 01:13:06 am by JDubU »
 

Offline Gandalf_SrTopic starter

  • Super Contributor
  • ***
  • Posts: 1729
  • Country: us
Re: Automated DP832 Calibration
« Reply #42 on: March 25, 2020, 11:38:35 am »
I decided to try the route of running Python in Visual Studio, as things were weird in my PyCharm setup.  I am, however, worried about a recurrence of the issue that hosed my DP832 cal.

Any chance of a few pointers on how to proceed now that I've installed MS Visual Studio Community Edition with Python 3.x 64-bit?  Can I get the calibration code from GitHub?  VS is offering to fetch a project from GitHub.
If at first you don't succeed, get a bigger hammer
 

Offline skander36

  • Frequent Contributor
  • **
  • Posts: 722
  • Country: ro
Re: Automated DP832 Calibration
« Reply #43 on: March 25, 2020, 10:01:40 pm »
I don't think you need to use GitHub unless you want to share a project ...
Just open the file and press Ctrl+F5 to run it (F5 to run it under the debugger).
Of course, you have to install the needed packages first.
But what kind of problems do you have with PyCharm?
« Last Edit: March 26, 2020, 08:35:52 am by skander36 »
 

Offline Gandalf_SrTopic starter

  • Super Contributor
  • ***
  • Posts: 1729
  • Country: us
Re: Automated DP832 Calibration
« Reply #44 on: March 25, 2020, 10:38:33 pm »
....But what kind of problems do you have with PyCharm ?
Every time I open the project, the packages are missing and the changes I made are gone. I'm pretty sure it's because Windows is fighting the installation from a security standpoint.
If at first you don't succeed, get a bigger hammer
 

Offline skander36

  • Frequent Contributor
  • **
  • Posts: 722
  • Country: ro
Re: Automated DP832 Calibration
« Reply #45 on: March 26, 2020, 08:35:35 am »
It seems that you have set the interpreter to a temp folder. The base interpreter's location must not include TEMP in its path.
Anyway, with Visual Studio Code you need to install the packages (python-ivi, python-vxi11, numpy) only once, using pip3, and then they will be shared with every other project you work on in the future.
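For reference, that one-time install is a single command (assuming pip3 is on your PATH):

Code:
pip3 install python-ivi python-vxi11 numpy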
 

Offline Neuromodulator

  • Regular Contributor
  • *
  • Posts: 67
  • Country: cl
Re: Automated DP832 Calibration
« Reply #46 on: April 17, 2020, 02:08:07 am »
Hello,

I coded a script from scratch to perform the calibration, but I have some questions about the cal points.
When I try to enter calibration measurements manually, I get some default measured values that differ from the cal-point voltage list. The chan 1 DAC-V sequence begins with "0.2, 0.6, 1.2, 2.0, 3.2, 4.1, ...", while the list I found in many scripts begins with "0.2, 0.5, 1.2, 2, 3.2, 4.1, ...". There are 38 items in the manual input list, while the scripts contain just 36.

Can I use just any arbitrary value?
 

Offline garrettm

  • Frequent Contributor
  • **
  • Posts: 267
  • Country: us
Re: Automated DP832 Calibration
« Reply #47 on: April 17, 2020, 09:57:47 am »
Can I use just any arbitrary value?

Basically, yes. You can make your own cal-point list of up to 80 or 52 points. Note that adding more points can improve accuracy, but it doesn't guarantee it. From my experience, channels 1 and 2 are tricky to calibrate better than the default point list does, while channel 3 is very easily improved with extra points. I've been meaning to play around with this further, but work has derailed me at the moment, along with other projects I've been working on.
 

Offline garrettm

  • Frequent Contributor
  • **
  • Posts: 267
  • Country: us
Re: Automated DP832 Calibration
« Reply #48 on: April 17, 2020, 10:02:04 am »
 

Offline Wolfgang

  • Super Contributor
  • ***
  • Posts: 1775
  • Country: de
  • Its great if it finally works !
    • Electronic Projects for Fun
Re: Automated DP832 Calibration
« Reply #49 on: April 28, 2020, 04:34:38 pm »
Hi to all,

first, thanks to all for the very useful links and scripts I found here. I have quite a few DP832s, and they *are* a bit drifty and *long* for a calibration after some time.
Neither sending them in nor doing a manual cal was much help, so I decided to build a SCPI-based appliance that does all the signal routing from the PSU to the DMM (Keysight 34461A or better), plus a Python script that does all the settings, so the procedure is now just one click.

If interested, I'll put some more info on the web.

https://electronicprojectsforfun.wordpress.com/an-automatic-calibrator-for-rigol-dp832a-power-supplies/

 
The following users thanked this post: tv84, Simon_RL

Offline JDubU

  • Frequent Contributor
  • **
  • Posts: 441
  • Country: us
Re: Automated DP832 Calibration
« Reply #50 on: April 28, 2020, 06:05:17 pm »
Wolfgang:

Will you be sharing your Arduino SCPI-based appliance code?
 

Offline Wolfgang

  • Super Contributor
  • ***
  • Posts: 1775
  • Country: de
  • Its great if it finally works !
    • Electronic Projects for Fun
Re: Automated DP832 Calibration
« Reply #51 on: April 28, 2020, 06:11:09 pm »
Hi Peter,

thanks for the tip! I'll try it on my DP832 and let you know if it works.
 

Offline Wolfgang

  • Super Contributor
  • ***
  • Posts: 1775
  • Country: de
  • Its great if it finally works !
    • Electronic Projects for Fun
Re: Automated DP832 Calibration
« Reply #52 on: April 28, 2020, 06:20:45 pm »
Hi Peter,

works perfectly. I also tried PNG, but that does not seem to work, so I stayed with BMP. Thanks again!
 

Offline Wolfgang

  • Super Contributor
  • ***
  • Posts: 1775
  • Country: de
  • Its great if it finally works !
    • Electronic Projects for Fun
Re: Automated DP832 Calibration
« Reply #53 on: April 28, 2020, 06:21:47 pm »
Wolfgang:

Will you be sharing your Arduino SCPI-based appliance code?


Yes, under GPL V3. Just cleaning it up a little bit.
 
The following users thanked this post: JDubU

Offline Wolfgang

  • Super Contributor
  • ***
  • Posts: 1775
  • Country: de
  • Its great if it finally works !
    • Electronic Projects for Fun
Re: Automated DP832 Calibration
« Reply #54 on: April 28, 2020, 06:46:04 pm »
Wolfgang:

Will you be sharing your Arduino SCPI-based appliance code?


I just uploaded it onto the webpage. Have fun !
 
The following users thanked this post: JDubU

Offline Gandalf_SrTopic starter

  • Super Contributor
  • ***
  • Posts: 1729
  • Country: us
Re: Automated DP832 Calibration
« Reply #55 on: April 30, 2020, 04:51:39 pm »
I just bought a second DP832 from Tequipment.net.  I have 2 main workbench areas and I unburdened myself of a Siglent SPD3303X-E a few months back.  Just sold some stuff on eBay and decided I would try and boost the economy so the deal is done  :)
If at first you don't succeed, get a bigger hammer
 

Offline tv84

  • Super Contributor
  • ***
  • Posts: 3217
  • Country: pt
Re: Automated DP832 Calibration
« Reply #56 on: April 30, 2020, 09:38:55 pm »
Hi Peter,

works perfectly. I also tried PNG, but that does not seem to work, so I stayed with BMP. Thanks again !

Wolfgang,

Look here. Maybe you'll find others that interest you.

BTW, it only saves in .BMP mode.
« Last Edit: April 30, 2020, 09:44:47 pm by tv84 »
 
The following users thanked this post: Wolfgang

Offline Wolfgang

  • Super Contributor
  • ***
  • Posts: 1775
  • Country: de
  • Its great if it finally works !
    • Electronic Projects for Fun
Re: Automated DP832 Calibration
« Reply #57 on: April 30, 2020, 11:10:25 pm »
Thanks!

I'll add some when I find time and have collected some feedback.
I plan to put all the instruments in a DB instead of in the program as now.

Of what I have (DP832(A), DG1000Z series, DL3021A, DS1000Z series, DSA800 series, M300),
everything worked except the M300, but you don't have that either.

The better parts in your list (3000/4000/5000/7000/8000) I don't have, but they could be added to the list for other users.

Please report errors if you find any.

regards
  Wolfgang DL1DWG
 

Offline czecht

  • Newbie
  • Posts: 9
  • Country: us
Re: Automated DP832 Calibration
« Reply #58 on: June 20, 2020, 08:23:07 pm »
I have just the DP831, so I don't know if the DP832 hack also works on the DP831.
I don't understand the commands you are talking about here - since I'm a newbie and most of the time I don't know what I'm doing, sorry, I need any help I can get.
Because of financial limitations I can't get a better one, so I need to upgrade mine as much as I can, but on YouTube they are all talking about the DP832, not much about my DP831.
Thank you very much!
Tony
 

Offline Gandalf_SrTopic starter

  • Super Contributor
  • ***
  • Posts: 1729
  • Country: us
Re: Automated DP832 Calibration
« Reply #59 on: June 20, 2020, 08:29:47 pm »
I don't know for sure, but I'm guessing that the 'magic' USB drive will allow the DP831 to be converted to a higher hardware-compatible model.  But what does the 831 do that the 832 doesn't?
If at first you don't succeed, get a bigger hammer
 

Offline mjkuwp

  • Supporter
  • ****
  • Posts: 259
  • Country: us
  • mechanical engineering defector
    • The Mz Lab
Re: Automated DP832 Calibration
« Reply #60 on: June 21, 2020, 01:50:34 pm »
Regarding Python and the IDE: this is what I have done on several machines, and it works:
1. download Python from the Python site python.org and install.
2. do not use Virtual Environments.

I've never tried using Python from the Windows Store.

I've done a lot with Python and Pycharm and do my development for in-house engineering tools on Windows and then most often the project is eventually run on a Raspberry pi.

In PyCharm you may need to point each project to the location of your Python interpreter:
[File][Settings...][Project: yourproject][Project Interpreter], then browse to your python.exe file.

PyCharm is an awesome tool; I think it is a really good aid to productivity.

After a Raspberry Pi computer is running I sometimes use Visual Studio Code to edit over ssh via the tools that software provides.


 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #61 on: June 26, 2020, 07:14:01 pm »
Does anyone know if any of the calibration scripts for the DP832 support the Fluke 8845A?
 

Offline garrettm

  • Frequent Contributor
  • **
  • Posts: 267
  • Country: us
Re: Automated DP832 Calibration
« Reply #62 on: July 10, 2020, 11:03:42 pm »
Does anyone know if any of the calibration scripts for the DP832 support the Fluke 8845A?

My Telnet script, which uses Ethernet, works with the Fluke 8845A and 8846A, as it was written and tested using a Tektronix DMM4050 (a rebranded Fluke 8846A). The script is written in Java and is pretty simple to use - less to configure and set up than the Python script, in my opinion.

My script, as well as the Python scripts, uses standard SCPI commands to control the instruments. So I'd wager your Fluke will work with the Python script if you prefer to go that route.

I've attached my script if you would like to use it. If you do, run "test_run.bat" in the compiled directory first to check your setup. Then, if everything worked okay, run "calibrate.bat". Read "Example Output of Cal Routine.txt" to see what a successful calibration looks like and what options to select.

 
The following users thanked this post: tv84

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #63 on: July 11, 2020, 01:30:27 am »
That looks awesome - thanks garrettm I'll give it a try.
 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #64 on: July 11, 2020, 08:21:44 pm »
I installed Java for windows 10, but it says:

Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.UnsupportedClassVersionError: TelnetTest has been compiled by a more recent version of the Java Runtime (class file version 57.0), this version of the Java Runtime only recognizes class file versions up to 52.0

I also tried to download the JDK 13 and install it as well.

Any ideas?
 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #65 on: July 11, 2020, 09:07:24 pm »
OK, I got the Java worked out.

Ran the test; here are the results:

Should these values be this far off?  When each step was selected, the display of the power supply (DP832) showed a number much closer to the meter than to the nominal step values...

Code:
---------------------------- START OF TEST ---------------------------

Please disconnect all cables from the DMM

Enter 1 to begin automated voltage readback tests
1

Testing remote voltage measurement
meas val:  0.0015v
meas val:  0.0015v
meas val:  0.0015v
meas val:  0.0015v
meas val:  0.0015v

Testing local voltage measurement

Please check that instrument is in DC volts, 10npcl, local mode

Enter 1 to begin automated current readback tests
1

Testing remote 1A to 10A current measurement
meas val: -0.0001A
meas val: -0.0001A
meas val: -0.0001A
meas val: -0.0001A
meas val: -0.0001A

Testing local current measurement

Please check that instrument is in DC current, 10npcl, local mode

Enter y to test reading output of PSU cal points or n to skip
y

---------------------- SELECT READBACK TEST TYPE ----------------------

Note 1: If you're having trouble with voltage or current calibrations
performing a full voltage + current calibration will fix this

Note 2: Automated current calibration requires DMM with >= 3A range

Enter value of desired test type for ch1

0: skip ch1 test
1: voltage + manual current test
2: voltage + automated current test
3: voltage test (automated entry)
4: current test (manual entry)
5: current test (automated entry)
2

-------------------- VOLTAGE READBACK ROUTINE --------------------

Please connect DMM voltage input to ch1 output of PSU

Enter 1 to begin automated voltage readback
1

ch1 DAC-V readback
step  0, cal point:  0.2v, meas val:  0.0119v
step  1, cal point:  0.5v, meas val:  0.3104v
step  2, cal point:  1.2v, meas val:  1.0007v
step  3, cal point:    2v, meas val:  1.7894v
step  4, cal point:  3.2v, meas val:  2.9779v
step  5, cal point:  4.1v, meas val:  3.8682v
step  6, cal point:  5.2v, meas val:  4.9582v
step  7, cal point:  6.9v, meas val:  6.6497v
step  8, cal point:  7.5v, meas val:  7.2426v
step  9, cal point:  8.7v, meas val:  8.4378v
step 10, cal point: 10.1v, meas val:  9.8239v
step 11, cal point: 11.8v, meas val: 11.5075v
step 12, cal point: 12.6v, meas val: 12.3022v
step 13, cal point: 13.5v, meas val: 13.1974v
step 14, cal point:   15v, meas val: 14.6870v
step 15, cal point: 15.8v, meas val: 15.4879v
step 16, cal point: 16.5v, meas val: 16.1856v
step 17, cal point: 17.3v, meas val: 16.9773v
step 18, cal point: 18.5v, meas val: 18.1661v
step 19, cal point: 19.1v, meas val: 18.7651v
step 20, cal point: 19.9v, meas val: 19.5579v
step 21, cal point: 20.2v, meas val: 19.8532v
step 22, cal point: 20.8v, meas val: 20.4471v
step 23, cal point: 21.8v, meas val: 21.4467v
step 24, cal point: 22.4v, meas val: 22.0422v
step 25, cal point: 22.7v, meas val: 22.3420v
step 26, cal point: 23.9v, meas val: 23.5388v
step 27, cal point: 24.3v, meas val: 23.9386v
step 28, cal point: 25.7v, meas val: 25.3328v
step 29, cal point: 26.9v, meas val: 26.5311v
step 30, cal point: 27.9v, meas val: 27.5246v
step 31, cal point: 28.5v, meas val: 28.1186v
step 32, cal point: 28.9v, meas val: 28.5138v
step 33, cal point: 29.8v, meas val: 29.4050v
step 34, cal point: 30.2v, meas val: 29.8022v
step 35, cal point:   32v, meas val: 31.5926v

ch1 ADC-V readback
step  0, cal point:    0v, meas val: -0.1861v
step  1, cal point: 0.05v, meas val: -0.1374v
step  2, cal point:  0.1v, meas val: -0.0878v
step  3, cal point:  0.5v, meas val:  0.3101v
step  4, cal point:    1v, meas val:  0.8039v
step  5, cal point:    5v, meas val:  4.7599v
step  6, cal point:   10v, meas val:  9.7236v
step  7, cal point: 12.8v, meas val: 12.5013v
step  8, cal point:   20v, meas val: 19.6557v
step  9, cal point:   30v, meas val: 29.6043v
step 10, cal point:   32v, meas val: 31.5924v

-------------------- CURRENT READBACK ROUTINE --------------------

Connect ch1 output to DMM 10A current input

Enter 1 to begin automated current readback
1

ch1 DAC-I readback
step  0, cal point:  0.1A, meas val:  0.0862v
step  1, cal point: 0.25A, meas val:  0.2358v
step  2, cal point:  0.5A, meas val:  0.4855v
step  3, cal point:  0.8A, meas val:  0.7837v
step  4, cal point:    1A, meas val:  0.9828v
step  5, cal point: 1.25A, meas val:  1.2307v
step  6, cal point:  1.5A, meas val:  1.4785v
step  7, cal point: 1.75A, meas val:  1.7257v
step  8, cal point:  1.9A, meas val:  1.8756v
step  9, cal point: 2.15A, meas val:  2.1228v
step 10, cal point: 2.35A, meas val:  2.3225v
step 11, cal point:  2.5A, meas val:  2.4712v
step 12, cal point: 2.75A, meas val:  2.7187v
step 13, cal point:    3A, meas val:  2.9663v
step 14, cal point:  3.2A, meas val:  3.1632v

ch1 ADC-I readback
step  0, cal point:    0A, meas val: -0.0129v
step  1, cal point: 0.01A, meas val: -0.0028v
step  2, cal point:  0.1A, meas val:  0.0861v
step  3, cal point:    1A, meas val:  0.9827v
step  4, cal point:    2A, meas val:  1.9741v
step  5, cal point:    3A, meas val:  2.9662v
step  6, cal point:  3.2A, meas val:  3.1631v

---------------------------- END OF TEST ----------------------------
 

Offline garrettm

  • Frequent Contributor
  • **
  • Posts: 267
  • Country: us
Re: Automated DP832 Calibration
« Reply #66 on: July 11, 2020, 09:38:02 pm »
Those values look fine. These are the raw values used internally, so don't worry if they don't match the cal point exactly. Negative readback values are also not uncommon near zero for the ADC and DAC cal points.

From what I see, you are good to go if you want to calibrate using the Java script.

If you want to compare your test output with another PSU see the "Example Output of Test Routine.txt". It's the output of the test with my DMM and PSU.

P.S.

It also looks like there is a typo for the unit symbol for the current readback (V instead of A). I could also simplify the test routine: some of the prompts are carried over from the calibration script but aren't really needed for running the test.
« Last Edit: July 11, 2020, 09:50:19 pm by garrettm »
 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #67 on: July 11, 2020, 09:41:07 pm »
Thank you - I will move forward to calibration!
 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #68 on: July 11, 2020, 10:21:27 pm »
Wow, garrettm, that did an amazing job.  I checked all 3 channels in 1V steps and there are only a place or two where the output deviates by more than 1mV, and when it does, the meter on the DP832 matches the DMM.  Thanks so much for writing and posting the script - you are awesome!
 
The following users thanked this post: garrettm

Offline garrettm

  • Frequent Contributor
  • **
  • Posts: 267
  • Country: us
Re: Automated DP832 Calibration
« Reply #69 on: July 11, 2020, 10:31:15 pm »
Glad to help and thanks for trying out the script.
 
The following users thanked this post: alank2

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #70 on: July 17, 2020, 10:45:09 pm »
garrettm - I've got another question - I got a second DP832 and it seems to have an odd issue where one of the calibration steps has a LOWER voltage than the one before it!

Any thoughts on this?  Shouldn't step 1 be around 0.3V higher than step 0?

ch2 DAC-V calibration
step  0, cal point:  0.2v, meas val: -0.0705v
step  1, cal point:  0.5v, meas val: -0.0720v
step  2, cal point:  1.2v, meas val:  0.4447v
step  3, cal point:    2v, meas val:  1.2543v

ch2 ADC-V calibration
step  0, cal point:    0v, meas val: -0.1565v
step  1, cal point: 0.05v, meas val: -0.0899v
step  2, cal point:  0.1v, meas val: -0.0228v
step  3, cal point:  0.5v, meas val:  0.4982v
step  4, cal point:    1v, meas val:  1.0005v
 

Offline bson

  • Supporter
  • ****
  • Posts: 2269
  • Country: us
Re: Automated DP832 Calibration
« Reply #71 on: July 18, 2020, 03:48:11 am »
If I recall, we originally made the calibration skip the first point since it might be negative.  But clearly there can be more than one negative point, so the calibrator should keep stepping the output by the negative readback plus a few mV until it gets positive.  So if it sees -70mV, step up the output by perhaps 75mV and check.  Repeat until it's positive.  The reason is that negative outputs are almost certainly clamped and not part of the linear output range.  It might be a good idea to shift the entire calibration scale by the offset.  If 80.0mV produces 3.5mV, the offset is 80 - 3.5 = 76.5mV.
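A minimal Python sketch of that step-up search (set_output() and dmm_read() are hypothetical helpers for the PSU and DMM, not part of any posted script):

Code: [Select]

def find_zero_crossing(set_output, dmm_read, start=0.08, limit=1.0):
    """Step the output up by |negative readback| plus a few mV until positive."""
    v = start
    while v < limit:
        set_output(v)
        meas = dmm_read()
        if meas > 0:
            return v, meas            # first setting with a positive readback
        v += abs(meas) + 0.005        # e.g. -70mV readback -> step up ~75mV
    raise RuntimeError("no positive readback below %gV" % limit)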
« Last Edit: July 18, 2020, 03:55:45 am by bson »
 

Offline garrettm

  • Frequent Contributor
  • **
  • Posts: 267
  • Country: us
Re: Automated DP832 Calibration
« Reply #72 on: July 18, 2020, 06:04:59 am »
@alank2 the routine is written verbatim as described by tooOldForThis, though from what bson points out we could certainly improve the script.

As is--negative readback values and all--the script still works and calibrates just fine. Maybe not perfect 1 mV or mA accuracy across the entire output, but it seems okay from my testing.

That said, I do like the idea of finding the zero output point and using this value as an offset to add to the default cal point values. Should be pretty easy to do: First check the min cal point readback value. If non-negative do nothing. Else if negative, step up from the minimum cal point in 1 mV or mA increments until a readback value >=0 is reached, set this as the offset value and add it to all cal points.

@bson, do you think the above would work okay? Also, should the default maximum cal point (i.e., the full-scale cal point) not be exceeded? Or should we clamp at the full-scale cal point, i.e., cal point + offset <= default max cal point?
 

Offline garrettm

  • Frequent Contributor
  • **
  • Posts: 267
  • Country: us
Re: Automated DP832 Calibration
« Reply #73 on: July 18, 2020, 06:40:29 am »
Thinking about this some more, I believe a uniform offset isn't really needed. We probably only need to shift the lower couple of points.

Find offset:
  If readback of min cal point is < 0, step up using absolute value of initial readback, then smallest increment as needed until a readback >= 0 is reached. Store this value as the offset.

Apply offset:
  For each cal point, test readback value. If < 0, add offset to default cal point, else use default cal point.

This assumes that the calibration process is linear near zero; otherwise we should probably choose the offset such that some minimum positive value is achieved for the lower cal points, say 10mV for ch1 DAC-V. Of course the ADC and DAC don't need to use the same adjustment process, and they may well need to be different.
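A rough Python sketch of the two steps above (helper names hypothetical; this is a sketch of the idea, not the actual script):

Code: [Select]

def find_offset(min_point, set_output, dmm_read, inc=0.001):
    """Shift needed for the minimum cal point to read back >= 0."""
    v = min_point
    set_output(v)
    if dmm_read() >= 0:
        return 0.0                    # non-negative already: do nothing
    while True:
        v += inc                      # step up in 1mV (or 1mA) increments
        set_output(v)
        if dmm_read() >= 0:
            return v - min_point

def apply_offset(cal_points, readbacks, offset):
    """Shift only the points whose readback was negative."""
    return [p + offset if r < 0 else p for p, r in zip(cal_points, readbacks)]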
« Last Edit: July 18, 2020, 05:40:43 pm by garrettm »
 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #74 on: July 18, 2020, 01:01:55 pm »
Well, I'm going to try to see what I can do on this today.  I'll figure out how to compile java at the very least!

What I find odd is that in CH1 a similar thing happens where step 1 is even less than step 0 and it seems to work, but not so with CH2.

Are the steps arbitrarily chosen?  Do 0.2V and 0.5V equate to some DAC value?  When you guys talk about finding an offset, if the offset was 0.6V, are you saying that the calibration points should shift to 0.2V+0.6V, 0.5V+0.6V, and so on?
 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #75 on: July 18, 2020, 01:24:52 pm »
Also, if the first step is positive, how does it extrapolate below that?  Shouldn't the first point be less than zero, even if barely, so that it can be used to find the range that includes zero?
 

Offline Mecanix

  • Frequent Contributor
  • **
  • Posts: 269
  • Country: cc
Re: Automated DP832 Calibration
« Reply #76 on: July 18, 2020, 02:02:49 pm »
Quick side note to say thanks for all the helpful tips & tricks and the SCPI commands others have posted! Good info. I've calibrated mine and long gone are those wicked low/negative mV/mA readings, finally. Nothing broke ;)

Background: I had a DM3058 professionally re-calibrated two weeks ago, and I'm happy it's finally talking the same language as the DP832, and within surprisingly tight tolerance. Particularly glad to see that what I manually enter on the keypad not only matches the digits on the LCD but also matches what's displayed on the DM3058. Before, I used to set 3.30V and get 3.269V on the DP832's LCD and 3.33354V on the DMM lol. It was all over the place, and almost useless...

So... thanks again guys! PS: cool kit @Rigol, btw. Not bad at all!

PS: Ended up writing my own in C# for several reasons, using a simple condition to skip the negative readings. Can't get any simpler, and it works too!

 

Offline aristarchus

  • Regular Contributor
  • *
  • Posts: 107
  • Country: 00
Re: Automated DP832 Calibration
« Reply #77 on: July 18, 2020, 02:13:06 pm »
..
PS: Ended up writing my own in C# for several reasons, using a simple condition to skip the negative readings. Can't get any simpler, and it works too!

(Attachment Link)

Congrats! Good job!

Now you know the next question..   :)   Any chance you could upload this VC project to GitHub, or even as an attachment here?
 

Offline garrettm

  • Frequent Contributor
  • **
  • Posts: 267
  • Country: us
Re: Automated DP832 Calibration
« Reply #78 on: July 18, 2020, 02:18:57 pm »
Well, I'm going to try to see what I can do on this today.  I'll figure out how to compile java at the very least!

What I find odd is that in CH1 a similar thing happens where step 1 is even less than step 0 and it seems to work, but not so with CH2.

Are the steps arbitrarily chosen?  Do 0.2V and 0.5V equate to some DAC value?  When you guys talk about finding an offset, if the offset was 0.6V, are you saying that the calibration points should shift to 0.2V+0.6V, 0.5V+0.6V, and so on?

The cal points are the default from Rigol, but they can be arbitrarily chosen with up to a maximum of 80 points per array. I assume they picked these particular points based on some sort of best-fit data for linearity, but who knows.

The readback for 0.2V should be a non-negative value distinct from and lower than that for 0.5V; it's just that the lower end of the series regulator may have some offset/clamping that throws off the lower default cal points. This is where we can improve the calibration routine.

The "offset" is simply the value added to the initial default cal point (say 0.2V+offset = 0.28V) needed to reach zero (or near zero) on readback. This then forces the series regulator into a linear region such that the cal point has significance. We don't want to drop cal points if we can avoid it; lowering the number of cal points can only worsen linearity. But we also don't want to use "bad" cal points either, i.e. negative readback values that don't linearly increase, as you've seen.

I can add this modification to the script if you want. But feel free to modify the script, and let me know if you need any help if you do.
« Last Edit: July 18, 2020, 02:38:43 pm by garrettm »
 

Offline garrettm

  • Frequent Contributor
  • **
  • Posts: 267
  • Country: us
Re: Automated DP832 Calibration
« Reply #79 on: July 18, 2020, 02:32:43 pm »
I do want to point out that adjustments for the DAC-V and DAC-I routines are probably not needed for the ADC-V and ADC-I routines. Readback on the PSU display seems to work fine with the script as is.
 

Offline Mecanix

  • Frequent Contributor
  • **
  • Posts: 269
  • Country: cc
Re: Automated DP832 Calibration
« Reply #80 on: July 18, 2020, 02:37:56 pm »
Now you know the next question..   :)   Any chance you could upload this VC project to GitHub, or even as an attachment here?

Thanks! I'm all up for sharing; the problem is it's part of a form project linked up with so much other stuff (LCR sweep & bin, oscilloscope, DMM data acquisition, arb waves, scales, etc. etc.). I'd need a day to pull a standalone/functional app out of this lot. Got the DMM and an LCR over a serial COM port, the rest over LAN. Pretty messy...

The primary function (method?) you see in the screenshot is what drives it all really, and it's pretty much the same as what others have done already. PM me if you need any particular code or help with your own code and I'll happily help out; I'll certainly try my best anyway.

EDIT: here goes the full method. Good enough to get an idea, I'm guessing. Nothing as elaborate and high-quality as what Garrett already did; the code below just puts the cal data into the DP832, and that's all. Used Microsoft Visual Studio 2019, free Community edition (.NET Windows Forms, buttons and all that good stuff).

Code: [Select]

using System;
using System.IO;
using System.IO.Ports;
using System.Net.Sockets;
using System.Threading;


        string DP832IP = "192.168.100.111";
        int DP832port = 5555;
        string cmd_runtime;


        // Array example CH1 (DAC values are plain volts; ADC values carry a "v" suffix)
        string[] VDAC = new string[] { "0.842", "0.843", "0.845", "0.85", "0.9", "1", "1.2", "1.8", "2", "2.4", "2.7", "3", "3.3", "3.6", "4", "4.2", "4.5", "4.7",
                                       "5", "5.5", "6", "6.5", "7", "7.5", "8", "8.5", "9", "9.5", "10", "11", "12", "13", "14", "15", "16", "17", "18", "19",
                                       "20", "21", "22", "23", "24", "25", "26", "27", "28", "29", "30", "31", "32" };
        string[] VADC = new string[] { "0v", "0.001v", "0.005v", "0.05v", "0.1v", "0.5v", "1v", "5v", "10v", "12.8v", "20v", "30v", "32v" };

        // CH1, CH2 & CH3 (iDAC & iADC)
        string[] iDAC = new string[] { "0.001A", "0.002A", "0.005A", "0.01A", "0.02A", "0.03A", "0.04A", "0.05A", "0.06A", "0.07A", "0.08A", "0.09A", "0.1A",
                                       "0.2A", "0.3A", "0.4A", "0.5A", "0.6A", "0.7A", "0.8A", "0.9A", "1A", "1.2A", "1.5A", "1.7A", "2A", "2.2A", "2.5A",
                                       "2.7A", "3A", "3.2A" };
        string[] iADC = new string[] { "0A", "0.1A", "0.5A", "1A", "2A", "3A", "3.2A" };


        private void btn_v_Click(object sender, EventArgs e)
        {
            Calibrate("CH3", "V");
        }
        private void btn_c_Click(object sender, EventArgs e)
        {
            Calibrate("CH3", "C");
        }

        private void Calibrate(string ch, string type)
        {
            int SCPIdelay = 1000;   // ms between SCPI commands
            int DMMdelay = 3000;    // ms for the output and DMM to settle
            double DMMval = 0.00;

            using (var client = new TcpClient(DP832IP, DP832port))
            using (var networkStream = client.GetStream())
            using (var writer = new StreamWriter(networkStream) { AutoFlush = true })  // flush each command straight to the socket
            {
                writer.Write(":CAL:Start 11111," + ch + "\r\n");    // 11111 = cal password
                Thread.Sleep(SCPIdelay);
                writer.Write(":CALibration:Clear " + ch + "," + type + "\r\n");  // clear the table being calibrated (was hard-coded to ",v")
                Thread.Sleep(SCPIdelay);
                writer.Write("*RST\r\n");
                Thread.Sleep(SCPIdelay);
                writer.Write(":OUTPUT " + ch + ",ON\r\n");
                Thread.Sleep(SCPIdelay);

                if (type == "V")
                {
                    cmd_runtime = "MEAS:VOLT:DC?";

                    Console.WriteLine("VDAC:");
                    for (int i = 0; i < VDAC.Length; i++)
                    {
                        // trailing 1 = DAC (output) cal table
                        writer.Write(":CAL:Set " + ch + "," + type + "," + i + "," + VDAC[i] + ",1\r\n");
                        Thread.Sleep(DMMdelay);
                        DMMval = MEAS.Measure(cmd_runtime); // <-- DMM reading taken here
                        if (DMMval > 0)                     // simple condition: skip negative readings
                        {
                            Console.WriteLine(VDAC[i]);
                            writer.Write(":CAL:MEAS " + ch + "," + type + "," + i + "," + DMMval + ",1\r\n");
                        }
                        Thread.Sleep(SCPIdelay);
                    }
                    Console.WriteLine("VADC:");
                    for (int i = 0; i < VADC.Length; i++)
                    {
                        // trailing 0 = ADC (readback) cal table
                        writer.Write(":CAL:Set " + ch + "," + type + "," + i + "," + VADC[i] + ",0\r\n");
                        Thread.Sleep(DMMdelay);
                        DMMval = MEAS.Measure(cmd_runtime); // <-- DMM reading taken here
                        if (DMMval > 0)
                        {
                            writer.Write(":CAL:MEAS " + ch + "," + type + "," + i + "," + DMMval + ",0\r\n");
                            Console.WriteLine(VADC[i]);
                        }
                        Thread.Sleep(SCPIdelay);
                    }
                }
                else
                {
                    cmd_runtime = "MEAS:CURR:DC?";

                    Console.WriteLine("iDAC:");
                    for (int i = 0; i < iDAC.Length; i++)
                    {
                        writer.Write(":CAL:Set " + ch + "," + type + "," + i + "," + iDAC[i] + ",1\r\n");
                        Thread.Sleep(DMMdelay);
                        DMMval = MEAS.Measure(cmd_runtime); //+0.0003; //CH2 add 0.0003 offset to comp mA
                        if (DMMval > 0)
                        {
                            Console.WriteLine(iDAC[i]);
                            writer.Write(":CAL:MEAS " + ch + "," + type + "," + i + "," + DMMval + ",1\r\n");
                        }
                        Thread.Sleep(SCPIdelay);
                    }
                    Console.WriteLine("iADC:");
                    for (int i = 0; i < iADC.Length; i++)
                    {
                        writer.Write(":CAL:Set " + ch + "," + type + "," + i + "," + iADC[i] + ",0\r\n");
                        Thread.Sleep(DMMdelay);
                        DMMval = MEAS.Measure(cmd_runtime);
                        if (DMMval > 0)
                        {
                            writer.Write(":CAL:MEAS " + ch + "," + type + "," + i + "," + DMMval + ",0\r\n");
                            Console.WriteLine(iADC[i]);
                        }
                        Thread.Sleep(SCPIdelay);
                    }
                }
                writer.Write(":OUTPUT " + ch + ",OFF\r\n"); Console.WriteLine(":OUTPUT " + ch + ",OFF");
                Thread.Sleep(SCPIdelay);
                writer.Write(":CAL:End 07/16/2020," + ch + "\r\n"); Console.WriteLine(":CAL:End 07/16/2020," + ch + "\n");
                Thread.Sleep(SCPIdelay);
            }
        }




///// Public class returning the DMM's reading as a double (over serial, though)

    class MEAS
    {
        public static double Measure(string message)
        {
            double data = 0.0;
            try
            {
                Form1.port.Write(message + "\r\n"); // Form1.port: the app's open SerialPort
                data = Convert.ToDouble(Form1.port.ReadLine());
            }
            catch (Exception ex)
            {
                Console.WriteLine("Error in MEAS.cs (Measure() method): " + ex.Message);
            }
            return data;
        }
    }



« Last Edit: July 18, 2020, 03:05:03 pm by Mecanix »
 
The following users thanked this post: garrettm

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #81 on: July 18, 2020, 03:49:55 pm »
I'm going to try my hand at modifying garret's java today, but in the meantime, I'm looking through the python script and have a simple question:

Code: [Select]
            if value <= first_positive:
                continue
            if unit == 'A' and value > self._manual_current_limit and manual == False:
                manual = True
                print()
                print("WARNING: CURRENT BEYOND DMM LIMIT, MANUAL INPUT REQUIRED")
                self._wait_for_enter("Connect alternative DMM 10A CURRENT inputs to PSU channel %d" % (channel))

            self._psu._write("CALibration:Set CH%d,%s,%d,%g%s,%d" % (channel, ident, step, value, unit, index));


I can see the indenting, but how does python know that "self._psu._write" is not part of the if?  I am used to brackets or an endif or something.
 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #82 on: July 18, 2020, 05:06:38 pm »
Okay, I've modified the script to locate the last negative result and begin calibrating there.

So far:

ch2 DAC-V determine last negative value

ch2 DAC-V starting at 0.5v

ch2 DAC-V calibration
step  0, cal point:  0.5v, meas val: -0.2530v
step  1, cal point:  1.2v, meas val:  0.4428v

Any thoughts on why my 0.5V measurement is different today from yesterday?

ch2 DAC-V calibration
step  0, cal point:  0.2v, meas val: -0.0689v
step  1, cal point:  0.5v, meas val: -0.0707v
step  2, cal point:  1.2v, meas val:  0.4444v
 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #83 on: July 18, 2020, 05:17:05 pm »
The good news is that it seemed to work; CH2 is much better now. I can ask for voltages less than 500mV and they come out right.

garrettm - do you mind if I post the modified script?

My technique was to determine the last negative step and start the calibration there, which may skip the steps before it.
 
The following users thanked this post: garrettm

Offline garrettm

  • Frequent Contributor
  • **
  • Posts: 267
  • Country: us
Re: Automated DP832 Calibration
« Reply #84 on: July 18, 2020, 05:22:05 pm »
@alank2 that's fine with me, just add that you modified it somewhere so people know the difference between the two (in case someone downloads both and becomes confused as to which one they are using).

I'll post a new revision by Monday that adjusts the lower cal points as described earlier, to preserve the total number of cal points and stay in the linear region of the series regulator.
« Last Edit: July 18, 2020, 05:24:43 pm by garrettm »
 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #85 on: July 18, 2020, 05:42:54 pm »
@alank2 that's fine with me, just add that you modified it somewhere so people know the difference between the two (in case someone downloads both and becomes confused as to which one they are using).

Will do.

I'll post a new revision by Monday that adjusts the lower cal points as described earlier, to preserve the total number of cal points and stay in the linear region of the series regulator.

Sounds great; please reply in this thread and I'll check it out.  Your approach then is to try to find the first positive value and instead of calibrating the first step at say 0.2V, you'll calibrate it at wherever you find the first positive value?  Sounds good.
 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #86 on: July 18, 2020, 05:46:06 pm »
Enclosed is my mod of garrettm's Java script.

It has two changes.

#1 - When calibrating DAC-V and DAC-I, it will look to see if there is more than one negative region.  On my PS step 1 was even more negative than step 0!  It will begin calibrating at the last negative region if one exists or at step 0 if there are no negative steps.

#2 - It will make you type "1" and press Enter before it ends/saves the calibration.  This gives you time to abort and power cycle the unit if you don't like the ADC-V or ADC-I results you are seeing.

This is my first time writing anything in Java, so hopefully I did all right.  Mostly I used garrettm's code as an example and tweaked it.
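For reference, a simplified Python sketch of change #1 (helper names hypothetical; the attachment is the real Java implementation):

Code: [Select]

def start_at_last_negative(cal_points, set_output, dmm_read, send_cal_point):
    """Probe each default point once, then begin cal at the LAST negative step."""
    readbacks = []
    for point in cal_points:
        set_output(point)
        readbacks.append(dmm_read())
    start = 0
    for i, r in enumerate(readbacks):
        if r < 0:
            start = i                 # keeps updating: remembers the last one
    for step, point in enumerate(cal_points[start:]):
        send_cal_point(step, point)   # steps before 'start' are skipped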
« Last Edit: July 18, 2020, 05:47:53 pm by alank2 »
 
The following users thanked this post: garrettm, Mecanix

Offline garrettm

  • Frequent Contributor
  • **
  • Posts: 267
  • Country: us
Re: Automated DP832 Calibration
« Reply #87 on: July 18, 2020, 05:56:18 pm »
Your approach then is to try to find the first positive value and instead of calibrating the first step at say 0.2V, you'll calibrate it at wherever you find the first positive value?  Sounds good.

Correct. The idea is to increase the lower cal points by some fixed value so the readback values are strictly increasing. I'd be curious to see if this improves the lower ranges any over skipping points. It could be that both are equally effective.
 

Offline garrettm

  • Frequent Contributor
  • **
  • Posts: 267
  • Country: us
Re: Automated DP832 Calibration
« Reply #88 on: July 18, 2020, 05:58:59 pm »
#2 - It will make you type "1" and press Enter before it ends/saves the calibration.  This gives you time to abort and power cycle the unit if you don't like the ADC-V or ADC-I results you are seeing.

Good idea! I'll add that to the new revision as well.
 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #89 on: July 18, 2020, 06:19:52 pm »
garrettm - do you have any thoughts about the difference in the 0.5V step values today vs. yesterday in post 83?
 

Offline Mecanix

  • Frequent Contributor
  • **
  • Posts: 269
  • Country: cc
Re: Automated DP832 Calibration
« Reply #90 on: July 18, 2020, 06:38:18 pm »
Correct. The idea is to increase the lower cal points by some fixed value so the readback values are strictly increasing. I'd be curious to see if this improves the lower ranges any over skipping points. It could be that both are equally effective.

It does. The lowest positive DAC value found allows the ADC to set a very low reference later in the cal... if you are into mV/mA projects that helps a lot. What also improved the tolerance (I've just found out) is to have the DMM sampling at 20~50Hz and capturing an *averaged value, as opposed to the single one the DMM captures after the delay. **Requires the DMM to be set to medium/fast speed plus a Sum()/x or Average() function in the code.

* The DMM fluctuates quite a lot in the mV & mA ranges; that's what gave me the idea of averaging
** Extra bells & whistles for those with low-end DM3058 (5.5-digit) DMMs, like the one I have ;-)
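Something like this helper, as a rough Python sketch (n and interval would be tuned to the DMM's sample rate; names hypothetical):

Code: [Select]

import time

def averaged_reading(dmm_read, n=20, interval=0.05):
    """Average n samples (~20Hz here) to tame mV/mA-range jitter."""
    total = 0.0
    for _ in range(n):
        total += dmm_read()
        time.sleep(interval)
    return total / n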
 
The following users thanked this post: alank2

Offline TurboTom

  • Super Contributor
  • ***
  • Posts: 1389
  • Country: de
Re: Automated DP832 Calibration
« Reply #91 on: July 18, 2020, 06:40:08 pm »
I think it doesn't make any sense to use negative calibration points on these single-quadrant supplies. The ability to supply slightly negative output voltages at low currents is, IMO, a side-effect of the active discharge circuitry that Rigol included in order to discharge the PSU's internal output capacitance, as well as any input smoothing caps inside the attached DUT, so it doesn't take ages for the output voltage to drop when programmed to a lower value or turned off. To compensate for semiconductor thresholds, the PSU needs a slightly negative internal supply voltage, maybe just a diode drop below the output ground.

Since it's reported that arbitrary calibration points are possible, I'd recommend increasing the output voltage in "Test Cal Mode" in 1mV steps until 0V is reached/crossed, using this as the first calibration value, and starting the CAL sequence with the recommended intervals (i.e. 0V offset plus CAL interval) up to 1 or 2V; above this, use the "table values". The strange, non-monotonic behaviour at negative output voltages is just a side-effect of saturating the discharge circuitry and can be disregarded for calibration purposes.
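As a sketch of that recipe in Python (the offset and interval values here are made-up examples, not measured ones):

Code: [Select]

offset = 0.076                        # e.g. found by stepping up in 1mV steps
fine = [round(offset + i * 0.1, 3) for i in range(20)]  # offset..~2V, 0.1V apart
table = [5, 10, 12.8, 20, 30, 32]                       # "table values" above 2V
cal_points = fine + [p for p in table if p > fine[-1]]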
 
The following users thanked this post: alank2, Wolfgang

Offline Wolfgang

  • Super Contributor
  • ***
  • Posts: 1775
  • Country: de
  • Its great if it finally works !
    • Electronic Projects for Fun
Re: Automated DP832 Calibration
« Reply #92 on: July 18, 2020, 06:51:19 pm »
I think it doesn't make any sense to use negative calibration points on these single-quadrant supplies. The ability to supply slightly negative output voltages at low currents is, IMO, a side-effect of the active discharge circuitry that Rigol included in order to discharge the PSU's internal output capacitance, as well as any input smoothing caps inside the attached DUT, so it doesn't take ages for the output voltage to drop when programmed to a lower value or turned off. To compensate for semiconductor thresholds, the PSU needs a slightly negative internal supply voltage, maybe just a diode drop below the output ground.

Since it's reported that arbitrary calibration points are possible, I'd recommend increasing the output voltage in "Test Cal Mode" in 1mV steps until 0V is reached/crossed, using this as the first calibration value, and starting the CAL sequence with the recommended intervals (i.e. 0V offset plus CAL interval) up to 1 or 2V; above this, use the "table values". The strange, non-monotonic behaviour at negative output voltages is just a side-effect of saturating the discharge circuitry and can be disregarded for calibration purposes.

Calibrating a DP832 in the few-mA / less-than-1V region of the 30V outputs is rather futile. Looking at the specs, the accuracy is so low that this does not really make sense below 1V/10mA. Two ways to fix that:
- Use a good DMM to measure the *real* voltage and current (6 1/2 digits is OK, the same as you would need for calibration anyway).
- Use an SMU (OK, that's real money, but then low voltages/currents are reliable).
 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #93 on: July 18, 2020, 07:21:04 pm »
Their steps do not seem so consistent, but perhaps there is a reason in them that I don't see:

 

Offline sequoia

  • Supporter
  • ****
  • Posts: 154
  • Country: us
Re: Automated DP832 Calibration
« Reply #94 on: July 18, 2020, 09:11:35 pm »
Their steps do not seem so consistent, but perhaps there is a reason I don't see in it:


Where do these 35 (DAC-V) calibration points originate from?  I checked my supply and the factory calibration seems to be using 43 points....

Seems like calibration can use a variable number of points (up to some limit)?  So I guess an optimal calibration could be achieved by first scanning the whole range (100mV steps, etc.), recording the difference between set/measured values. Then, based on this data, determine optimal calibration points and do a second "pass", calibrating using the points determined from the first pass...

Finding optimal calibration points "manually" should be rather easy based on the curve generated from the initial pass; finding them programmatically would require a little bit of work...

 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #95 on: July 18, 2020, 09:46:04 pm »
Is there a way to read the calibration points?

Also, how many points maximum can the calibration handle?
 

Offline Mecanix

  • Frequent Contributor
  • **
  • Posts: 269
  • Country: cc
Re: Automated DP832 Calibration
« Reply #96 on: July 19, 2020, 12:00:24 pm »
Boy oh boy did I wake up on the wrong side today. Yesterday's successful DP832 calibration celebration really just got smashed: I am getting totally different current readings (as much as 7% lower than in yesterday's tests). Just found out it's really (too?) sensitive to temperature. That being said, the current calibration needs to be done over a long period of time for it to remain consistent/accurate, i.e. you can't calibrate this unit when it's hot from, let's say, previous cal runs.

e.g. (see the sketch at the end of this post):
Step 0 > Send cal value > Wait +10min for the unit to reach its op temp for this value > Average DMM's readings over a 1min period > SAVE
Step 1 > Send cal value > Wait +10min for the unit to reach its op temp for this value > Average DMM's readings over a 1min period > SAVE
Step 2 > Send cal value > Wait +10min for the unit to reach its op temp for this value > Average DMM's readings over a 1min period > SAVE
.......
Step 32 > Send value > Wait +1min for the unit to reach its op temp for this value > Average DMM reading over a 1min period > SAVE

I'm mostly interested in the low 0~300 mA range, so I'm guessing I'd need to do this for the first 10~12 steps, or does the higher-amp stepping also affect the ADC?

Voltage is still spot on though; very happy with that part, and it's what matters most for me anyway. One would measure current with a precision DMM when accuracy is required anyway. Still, I'd like this bad boy to put up correct current figures (in mA, that is) on its LCD when I want a quick check without the DMM hooked up.
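A sketch of that settle-then-average loop in Python (delays hypothetical; averaged_reading as in the helper sketched a few posts up):

Code: [Select]

import time

def settle_then_average(cal_points, send_cal_point, send_cal_meas, dmm_read):
    for step, point in enumerate(cal_points):
        send_cal_point(step, point)        # send cal value
        time.sleep(10 * 60)                # wait for the unit to reach op temp
        value = averaged_reading(dmm_read, n=60, interval=1.0)  # ~1min average
        send_cal_meas(step, value)         # save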

 
 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #97 on: July 19, 2020, 12:24:43 pm »
I was thinking, probably the best way to find the first positive point is a binary search.  Start at 0.5V, then add or subtract 0.25V depending on whether it was positive or negative, then 0.125V, and so on.  That should be able to pinpoint the first positive point in 8 tests.  Then why use their scale at all?  Divide the range by the maximum number of calibration entries you have and use those steps.
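A quick Python sketch of that bisection (helpers hypothetical):

Code: [Select]

def bisect_zero(set_output, dmm_read, lo=0.0, hi=1.0, iters=8):
    """Binary-search the setting where readback first turns positive."""
    for _ in range(iters):            # 8 halvings resolve ~4mV over a 1V span
        mid = (lo + hi) / 2
        set_output(mid)
        if dmm_read() > 0:
            hi = mid                  # positive: zero point at or below mid
        else:
            lo = mid                  # negative: zero point above mid
    return hi                         # smallest known-positive setting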

Is it known how many entries maximum for each range?

Could the ADC (meter accuracy) be improved by adding more steps if we knew its maximum?

Is there a way to read the current calibration steps somehow?  (This would be nice so one could back them up and restore them if necessary).
 

Offline Mecanix

  • Frequent Contributor
  • **
  • Posts: 269
  • Country: cc
Re: Automated DP832 Calibration
« Reply #98 on: July 19, 2020, 08:22:18 pm »
PS: The trick to getting accurate current readings in the low 0~300mA range is to calibrate iDAC and iADC with a good 30min delay in between. So:

1 - Run iDAC steps
2 - Pause 30min
3 - Run iADC steps

Another way is to take note of the unit temperature before the calibration (Utility > Sys Info > key sequence 1-3-2) and run the iADC steps once it has come back down to about the same as when iDAC started. It took a good 30-ish minutes for mine to return to normal operating temp... which is about the same as Rigol recommends.

Using those steps for all channels (current):
Code: [Select]

        // CH1, CH2 & CH3 (iDAC & iADC)
        string[] iDAC = new string[] { "0.001A", "0.002A", "0.005A", "0.01A", "0.02A", "0.03A", "0.04A", "0.05A", "0.06A", "0.07A", "0.08A", "0.09A", "0.1A",
                                       "0.2A", "0.3A", "0.4A", "0.5A", "0.6A", "0.7A", "0.8A", "0.9A", "1A", "1.2A", "1.5A", "1.7A", "2A", "2.2A", "2.5A",
                                       "2.7A", "3A", "3.2A" };
        string[] iADC = new string[] { "0A", "0.1A", "0.5A", "1A", "2A", "3A", "3.2A" };

« Last Edit: July 19, 2020, 08:34:20 pm by Mecanix »
 

Offline sequoia

  • Supporter
  • ****
  • Posts: 154
  • Country: us
Re: Automated DP832 Calibration
« Reply #99 on: July 19, 2020, 11:25:19 pm »
Is there a way to read the calibration points?

Also, how many points maximum can the calibration handle?


You can use the manual calibration (menu), but it's a bit tedious... (you can just follow the calibration process, but not save at the end...)

It might be possible to read these via SCPI (but it may require that "magic" USB drive in the USB port):

338 :PROJect:CALIbration:DATA:VOLTage:WRITe
339 :PROJect:CALIbration:DATA:VOLTage:READ?
340 :PROJect:CALIbration:DATA:CURRent:WRITe
341 :PROJect:CALIbration:DATA:CURRent:READ?
342 :PROJect:CALIbration:DATA:CURRent:ADDRess?
343 :PROJect:CALIbration:INFO:WRITe
344 :PROJect:CALIbration:INFO:READ?
345 :PROJect:CALIbration:INFO:ADDRess?

(these are from: dp800_all_commands.txt found in https://www.eevblog.com/forum/testgear/need-help-hacking-dp832-for-multicolour-option/msg2325633/#msg2325633)


 

Offline sequoia

  • Supporter
  • ****
  • Posts: 154
  • Country: us
Re: Automated DP832 Calibration
« Reply #100 on: July 19, 2020, 11:49:35 pm »
Could the ADC (meter accuracy) be improved by adding more steps if we knew its maximum?

Adding more calibration points doesn't directly correlate with (meter) accuracy, but in general more points will yield better results. However, you might need only a handful of calibration points if the "difference" is mostly linear...  With a low/limited number of points (as is likely the case with the DP800 series), carefully selecting the calibration points could yield significantly better calibration results. The key is to choose calibration points at places where the ratio between set/measured values changes significantly...

Ideally a calibration script could run a "sweep" through the entire range (in very small steps, 10-100mV / mA), recording the difference at each point. This would yield a graph of the entire range showing the "error"; curve fitting could then be used to (mathematically) fit a continuous piecewise-linear function (with a section/line count less than the max number of points supported by the DP800) to match the data. The start and end points of the sections in this piecewise-linear function should also be near-optimal calibration points...
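A very rough sketch of that idea in Python/numpy (the sweep data here is a stand-in; a real run would record actual set/measured pairs):

Code: [Select]

import numpy as np

set_v = np.arange(0.1, 32.0, 0.1)                  # first-pass sweep settings
meas_v = set_v + 0.002 * np.sin(set_v)             # stand-in for real readings
err = meas_v - set_v                               # the "error" curve

# Put breakpoints where the error curve bends most, i.e. where one straight
# segment stops fitting well; these are near-optimal piecewise-linear knots.
curvature = np.abs(np.diff(err, 2))                # discrete second difference
n_points = 33                                      # stay under the DP800 limit
idx = np.sort(np.argsort(curvature)[-n_points:]) + 1
cal_points = set_v[idx]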
 

Offline bson

  • Supporter
  • ****
  • Posts: 2269
  • Country: us
Re: Automated DP832 Calibration
« Reply #101 on: July 20, 2020, 09:01:02 pm »
At some point I think calibration requires so many sample points - for example, to pick the best 80 calibration points - that it would take forever; and when sitting on the typical non-climate-controlled lab bench, conditions like humidity and temperature will vary over the day, to the point that the resulting drift has a greater negative impact than the benefit of grinding out more points.

One useful feature might be to do a quick check of the supply after calibration (and after restart).  In particular, to make sure it can output the full voltage scale, and maybe check a few points to validate the calibration data installed.  With a big negative offset like Alank's for example, my concern would be whether the supply can still output the full 30V post-cal.  This quick check could then be run standalone, by itself, and not necessarily only after a full cal.  (Does Rigol provide a check procedure?)
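Such a standalone spot check could look like this in Python (setpoints, tolerance and helpers are made up for illustration):

Code: [Select]

import time

def spot_check(set_output, dmm_read,
               points=(0.1, 1.0, 5.0, 15.0, 30.0), tol=0.005):
    """Post-cal sanity check, including a near-full-scale point."""
    for v in points:
        set_output(v)
        time.sleep(2)                  # let the output settle
        meas = dmm_read()
        status = "OK" if abs(meas - v) < tol else "FAIL"
        print("set %7.3fV  read %8.4fV  %s" % (v, meas, status))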
 

Offline bson

  • Supporter
  • ****
  • Posts: 2269
  • Country: us
Re: Automated DP832 Calibration
« Reply #102 on: July 20, 2020, 09:04:54 pm »
I can see the indenting, but how does python know that "self._psu._write" is not part of the if?  I am used to brackets or an endif or something.
It looks at the indent. :)
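In other words, the indentation *is* the block structure. For example:

Code: [Select]

for value in (1, -2, 3):
    if value < 0:
        continue              # indented under the if: runs only when value < 0
    print(value)              # back at loop depth: part of the for, not the if
print("done")                 # dedented past the loop: runs once, after it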
 

Offline aristarchus

  • Regular Contributor
  • *
  • Posts: 107
  • Country: 00
Re: Automated DP832 Calibration
« Reply #103 on: July 20, 2020, 09:53:06 pm »
I can see the indenting, but how does python know that "self._psu._write" is not part of the if?  I am used to brackets or an endif or something.
It looks at the indent. :)

The fathers of the guys that created Python were most likely good old COBOL devs..   :-))))
 

Offline sequoia

  • Supporter
  • ****
  • Posts: 154
  • Country: us
Re: Automated DP832 Calibration
« Reply #104 on: July 21, 2020, 08:15:06 pm »

Calibration points can be viewed manually using the (manual) calibration menu. No need to calibrate anything.

Go to the calibration menu and choose "Cal Item"; then, to view the calibration points, select "Cal Point" and press "Meas Val", which will display the calibration point.
Now you can press "Cal Point" again to back out (then just select a new cal point and repeat...)


I recorded the factory calibration points for my unit, since I wanted to test automating calibration but want to be sure I can put things back as they were if needed...



Looks like the factory calibration dynamically selects calibration points for each channel/unit. This makes sense, since calibration points chosen based on individual unit/channel characteristics likely yield significantly better "calibration" than fixed calibration points...

 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #105 on: July 21, 2020, 08:28:14 pm »
Interesting.  What is the story with the calibration password?  I've seen it said to be 2012, but garrettm's Java script used 11111.  Will it take any password in the remote command?
 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #106 on: July 21, 2020, 08:43:28 pm »
sequoia - if you enter cal mode and do a set command to see what it will output, what is the measured output for the factory defaults (ch1 0.73, ch2 1, ch3 0.115)?  Just below zero?  Just above zero?  Further above zero?  Part of me wonders, if it is further above zero, how it interpolates without a lower value.  I wonder if it looks at cal points 1 and 2 (using your graph example) and extrapolates a cal point 0 from those two, if that makes sense.  I am assuming that VDAC 0.730V has some conversion to an integer DAC value that is handled in their firmware.

 

Offline sequoia

  • Supporter
  • ****
  • Posts: 154
  • Country: us
Re: Automated DP832 Calibration
« Reply #107 on: July 21, 2020, 09:43:07 pm »

Seems as if factory calibration has chosen the first calibration points so that measurement is near 0V....

V DAC:

(channel: 1st cal point / measured value)
CH1: 0.730 / -0.05206
CH2: 1.000 /  0.32402
CH3: 0.115 /  0.08842


 

Offline Wolfgang

  • Super Contributor
  • ***
  • Posts: 1775
  • Country: de
  • Its great if it finally works !
    • Electronic Projects for Fun
Re: Automated DP832 Calibration
« Reply #108 on: July 21, 2020, 09:59:11 pm »

Seems as if factory calibration has chosen the first calibration points so that measurement is near 0V....

V DAC:

(channel: 1st cal point / measured value)
CH1: 0.730 / -0.05206
CH2: 1.000 /  0.32402
CH3: 0.115 /  0.08842

Plausible. Cal with negative ADC values makes no sense. What I did is:
- use a manual voltage/current list with a high resolution at very low voltages (0.1V steps)
- ignore negative calibration points

Result:
- All voltages/currents *above* the ignored points are accurate
- The points below (i.e. very low voltages/currents) are so inaccurate anyway that they should not be used.
 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #109 on: July 21, 2020, 10:35:58 pm »
On your CH2, sequoia, if you set it to 100mV, 50mV, 20mV, 10mV, 5mV, 2mV, 1mV - does it do a good job of outputting those?
 

Offline sequoia

  • Supporter
  • ****
  • Posts: 154
  • Country: us
Re: Automated DP832 Calibration
« Reply #110 on: July 21, 2020, 10:50:33 pm »
On your CH2, sequoia, if you set it to 100mV, 50mV, 20mV, 10mV, 5mV, 2mV, 1mV - does it do a good job of outputting those?

The error is below 0.5mV, which seems well within the specifications.
 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #111 on: July 21, 2020, 11:13:10 pm »
Superb - I have a feeling they must have a way to look at the 1st point (1.00V) and the next higher one up and use that to extrapolate even below the 1st point.
 

Offline sequoia

  • Supporter
  • ****
  • Posts: 154
  • Country: us
Re: Automated DP832 Calibration
« Reply #112 on: July 21, 2020, 11:56:37 pm »
Superb - I have a feeling they must have a way to look at the 1st point (1.00V) and the next higher one up and use that to extrapolate even below the 1st point.

Maybe they just use the "difference" from the first calibration point for any values below it... I.e. there will always be a "0" calibration point for 0.000 that has the same value as the first calibration point, so there's nothing to extrapolate as long as the last calibration point is set to the maximum value of the range (which seems to be the case with factory calibrations).


« Last Edit: July 22, 2020, 12:02:35 am by sequoia »
 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #113 on: July 22, 2020, 03:41:21 pm »
I am not sure I am following you exactly.

I'm thinking that perhaps the thing to do on the cal is to use a binary search to find the 0.000 point (start at 0.5V and move up/down depending on the reading).  Once that point is established, split the remaining points (36, 46, ?) evenly up to the maximum value of 32V.  Let's say that 0.52V is my 0 point.  Then I take (32-0.52)/35 to calculate the increase for the other steps.  I'll attach an Excel screenshot.  This spacing is even, but does it give enough points at the lower end, or should they be denser?
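The arithmetic, as a quick Python sketch (with 0.52V as the example zero point):

Code: [Select]

zero_point = 0.52                     # hypothetical measured 0V crossing
full_scale = 32.0
n_steps = 36                          # the zero point plus 35 more points
inc = (full_scale - zero_point) / (n_steps - 1)   # ~0.8994V per step
points = [round(zero_point + i * inc, 3) for i in range(n_steps)]
# points[0] == 0.52 ... points[-1] == 32.0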
 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #114 on: July 30, 2020, 12:49:02 am »
Anything new on this?

Any comments on the DAC points being more linear and the ADC points being more logarithmic?

Looking at the standard values, it seems the DAC for the 32V channels ranges from a 0.3V increase between steps to a final 1.8V increase.  If I determine which value gives 0, is there any downside to just using uniform steps between that and 32V instead of the default ones?

Then the ADC is: 0, 0.05, 0.1, 0.5, 1, 5, 10, 12.8, 20, 30, 32 - should these steps just be left as is?

 

Offline bson

  • Supporter
  • ****
  • Posts: 2269
  • Country: us
Re: Automated DP832 Calibration
« Reply #115 on: July 30, 2020, 03:20:52 am »
I'm thinking that perhaps the thing to do on the cal is to use a binary search to find the 0.000 point (start at 0.5V and move up/down depending on the reading).
I don't think that will work.  The values aren't necessarily monotonically increasing below 0, which eliminates all bisection algorithms (of which binary search is one).

(Actually, I take that back; binary search will work, while many other bisection algorithms won't.)
« Last Edit: July 30, 2020, 03:27:16 am by bson »
 

Offline alank2

  • Super Contributor
  • ***
  • Posts: 2185
Re: Automated DP832 Calibration
« Reply #116 on: July 30, 2020, 04:24:02 pm »
Here is my plan:

CH1/CH2/CH3 volt dac - try to determine the zero point by doing a binary search between 0V-1V.  Once I find it, then do a linear range with the other 35 points up to 32V.
CH1/CH2/CH3 volt adc - use Rigol's points.
CH1/CH2/CH3 amp dac - use Rigol's points.
<Delay between these steps at least 15 minutes>
CH1/CH2/CH3 amp adc - use Rigol's points.

Has anyone ever seen the same problem with the amp DAC that we've seen with the volt DAC, where a higher point read back a lower value, or is that only in the volt range?
 

Offline tangram

  • Contributor
  • Posts: 10
  • Country: gb
Re: Automated DP832 Calibration
« Reply #117 on: November 03, 2020, 07:30:41 pm »
Just wanted to offer you (Gandalf_Sr) a massive thanks for your thread here and the undoubtedly large amount of effort you've spent experimenting with this and putting the info together.

Having followed your instructions, thrashed around for some IVI info for my 34465A (which I found here: https://github.com/phsdv/python-ivi), and made a few tweaks, I've run through the DP832 calibration and it's worked a treat!

I've fumbled through the 'Manual' process a few times and it is INCREDIBLY painful.  The automated DP832 calibration is a joy!

I've also learned a little about the possibilities inherent in using Python and IVI techniques, which has been great fun.

Thank you again for all your efforts, sir.  Very much appreciated.

Tom  :-+ :-+ :-+ :-+ :-+ :-+ :-+
"All specifications are subject to change!"
 

Offline Gandalf_SrTopic starter

  • Super Contributor
  • ***
  • Posts: 1729
  • Country: us
Re: Automated DP832 Calibration
« Reply #118 on: November 07, 2020, 01:16:51 am »
I did not do it alone but thanks anyway.

How did you end up running the Python code and what were your results like?
If at first you don't succeed, get a bigger hammer
 

Offline Trident900fi

  • Contributor
  • Posts: 15
  • Country: fr
Re: Automated DP832 Calibration
« Reply #119 on: April 08, 2021, 12:35:08 pm »
hi folks,

I've just calibrated channel 1 without problems.
Then I tried to calibrate channel 2, but now there is no voltage coming out of channels 2 and 3!
You can ask for whatever you want, you always get 0V...
Any ideas!?

Thierry
 

