Poll

Using a github repository with submodules

Fine, it makes it easy to upgrade the script to newer dependencies
7 (87.5%)
A bit of a hassle, but I can get it to work
0 (0%)
I don't plan to use git, so I would like a single download
0 (0%)
No need for these scripts, I use other tools
1 (12.5%)
No need for these scripts, automating is not my thing
0 (0%)

Total Members Voted: 8

Author Topic: Automated battery charging using a calibrated SDS1104X and SPD3303X - done  (Read 1817 times)

0 Members and 1 Guest are viewing this topic.

Offline HendriXML

  • Frequent Contributor
  • **
  • Posts: 718
  • Country: nl
    • KiCad-BOM-reporter
This post is about how to use expensive gear to replace a cheap battery charger!

That and to showcase what can be done with some simple scripts. (I'll post them on GitHub.)

The idea is that later on only the SPD3303X will be used to charge a NiMH battery and to stop when full. That the battery is full can be detected by watching for the voltage drop that occurs near the end of charge in CC mode. How much it drops depends on the charging current, but at 1000 mA, 5 mV should do the trick. This difference should be measurable in a reliable way using only the PSU.
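Sketched as code, the -dV rule above looks roughly like this; the 5 mV threshold is from the text, while the function names and the synthetic curve are purely illustrative:

```python
# Sketch of the negative delta-V end-of-charge rule described above.
# The values below are a made-up charging curve for illustration only.

def should_stop_charging(voltages, drop_threshold_v=0.005):
    """Stop when the latest reading is `drop_threshold_v` below the
    peak seen so far (classic -dV detection for NiMH at ~1 A)."""
    if not voltages:
        return False
    peak = max(voltages)
    return peak - voltages[-1] >= drop_threshold_v

# Synthetic curve: rises to 1.45 V, then sags past full charge.
curve = [1.30, 1.38, 1.43, 1.45, 1.449, 1.447, 1.444]
history = []
stopped_at = None
for v in curve:
    history.append(v)
    if should_stop_charging(history):
        stopped_at = v
        break
```

In the real script the readings would come from the PSU readback, at its 1 mV resolution.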

But first I did some precise scope measurements. The charging graph is created from averages of 140 kpt acquisitions, captured as fast as possible (using AC-line triggering).

The script starts at 500 mV/div, but steps down through 50 mV, 5 mV and 500 uV when there is no overflowed ADC value. Sample points were only reported at the most sensitive 500 uV vertical division with no overflows.

When there is an overflow, the V/div will step up. Because the voltages change slowly this should be very rare (and the 2nd measurement after that will probably be a step down again). The last successfully measured voltage is taken as the offset (negated).

This way it is possible to measure with better than 100 uV resolution, with great noise filtering on top.
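The stepping behaviour described above can be sketched like this; `acquire` stands in for the real scope interaction (set V/div, grab an averaged capture, report ADC overflow) and is not the tool's actual API:

```python
# Sketch of the auto-ranging described above: one acquisition per call,
# stepping down the sensitivity ladder while the ADC does not overflow,
# stepping back up when it does. Values are only reported when taken at
# the most sensitive range (500 uV/div) without overflow.

class AutoRanger:
    LADDER = [0.5, 0.05, 0.005, 0.0005]  # V/div: 500 mV .. 500 uV

    def __init__(self):
        self.i = 0  # start at 500 mV/div

    def step(self, acquire):
        """acquire(vdiv) -> (value, overflowed); returns a reportable
        sample, or None when the range is still being adjusted."""
        value, overflow = acquire(self.LADDER[self.i])
        if overflow:
            self.i = max(self.i - 1, 0)    # step up (less sensitive)
            return None
        if self.i < len(self.LADDER) - 1:
            self.i += 1                    # step down for the next one
            return None
        return value
```

With a slowly changing battery voltage the ranger settles at 500 uV/div and only occasionally steps up for a single acquisition.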

The absolute accuracy is mostly determined by the offset DAC of the scope.

The first image shows a zoom where the vertical grid lines are only 100 uV apart.
« Last Edit: January 09, 2020, 09:23:44 pm by HendriXML »
“I ‘d like to reincarnate as a dung beetle, ‘cause there’s nothing wrong with a shitty life, real misery comes from high expectations”
 
The following users thanked this post: rf-loop, Performa01, tautech, tv84

Offline HendriXML

Re: Automated battery charging using a Siglent SDS1104X and SPD3303X(-E)
« Reply #1 on: October 27, 2019, 04:31:27 pm »
I changed the script to use a slower H-div, acquiring and averaging 1.4 Mpts. This makes the scope zoom even less noisy.

The script/scripting tool has no problem processing this many samples; most of the time is spent by the scope and the data transfer.

The voltage measured by the PSU differs a few mV, but follows the scope line very nicely. Too bad they limited the voltage resolution of the PSU to whole mV.

But there seems to be no problem in stopping at a 5 mV decline, which means only the PSU is needed to charge the battery and stop when full!
« Last Edit: October 27, 2019, 05:25:45 pm by HendriXML »
 

Online nctnico

  • Super Contributor
  • ***
  • Posts: 19428
  • Country: nl
    • NCT Developments
Re: Automated battery charging using a Siglent SDS1104X and SPD3303X(-E)
« Reply #2 on: October 27, 2019, 05:28:46 pm »
Why are you using an oscilloscope to measure the voltage? Just reading back the PSU voltage is simpler and way more accurate. The DC error in an oscilloscope can be as much as 3% and there can be non-linearities as well. The PSU OTOH is likely accurate to less than 1%. mV resolution is good enough for battery charging anyway.
« Last Edit: October 27, 2019, 05:30:22 pm by nctnico »
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline HendriXML

  • Frequent Contributor
  • **
  • Posts: 718
  • Country: nl
    • KiCad-BOM-reporter
Re: Automated battery charging using a Siglent SDS1104X and SPD3303X(-E)
« Reply #3 on: October 27, 2019, 06:00:46 pm »
Why are you using an oscilloscope to measure the voltage? Just reading back the PSU voltage is simpler and way more accurate. The DC error in an oscilloscope can be as much as 3% and there can be non-linearities as well. The PSU OTOH is likely accurate to less than 1%. mV resolution is good enough for battery charging anyway.
Thanks for the response!
If it is about relative voltages, the graph shows that the scope is a clear winner in capturing the charging curve. That makes it a good candidate to check the voltage response of the PSU. With 1 mV resolution the PSU is a lot coarser, but still very usable and probably better in absolute terms. The differences are small.

Regarding the scope: is it technically hard to make an accurate offset DAC? That one doesn't need to be fast..
But I don't know which are generally more precise, DACs or ADCs.
« Last Edit: October 27, 2019, 06:19:40 pm by HendriXML »
 

Online nctnico

Re: Automated battery charging using a Siglent SDS1104X and SPD3303X(-E)
« Reply #4 on: October 27, 2019, 06:15:48 pm »
But I don't know which are generally more precise, DACs or ADCs.
Doesn't matter. It depends entirely on the specifications (ENOB, linearity).
 

Offline HendriXML

I've placed the scripts on GitHub.
https://github.com/HendriXML/XMLScripts-Project-BatteryCharging
Can be downloaded including the submodules via:
Code: [Select]
git clone --recurse-submodules https://github.com/HendriXML/XMLScripts-Project-BatteryCharging.git C:\BatteryCharging

The script "NiMH Battery charging using a SPD3303X.xml" uses only a PSU. The other script "Battery voltage measurement.xml" uses a scope to do the measurements.

The scripts are just examples and will act as templates for other scripts in the future. But they show/explore the following principles:
  • Report sufficient information; the directory in which reports are saved can be set in the tool
  • Make important parameters configurable in an ini-file and report the used values back, so tests can be redone
  • Parameters are formatted with scales and units, like mV and ms, and can also be supplied this way, so mistakes are less likely
  • The current status is frequently updated during the measurements
  • The measurements are reported in a separate report, which can easily be copied and pasted into Excel
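As an illustration of the scaled-parameter bullet above, a minimal parser for values like "5 mV" could look like this (a sketch, not the tool's actual ini handling):

```python
# Minimal sketch of parsing parameters written with an SI scale and unit,
# e.g. "5 mV" or "1.4 Mpts", in the spirit of the ini-file parameters
# described above. Not the tool's actual code.

SCALES = {"p": 1e-12, "n": 1e-9, "u": 1e-6, "m": 1e-3,
          "": 1.0, "k": 1e3, "M": 1e6, "G": 1e9}

def parse_scaled(text, unit):
    """Parse '5 mV' with unit='V' into 0.005."""
    body = text.strip()
    assert body.endswith(unit), "unexpected unit"
    body = body[: -len(unit)].strip()
    prefix = body[-1] if body and body[-1] in SCALES else ""
    number = body[: len(body) - len(prefix)]
    return float(number) * SCALES[prefix]
```

Supplying parameters this way keeps exponents out of the config file, which is where most mistakes happen.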

The "Battery voltage measurement.xml" script is more advanced and shows how to program measurement ranges using objects, sort them and use some state handling.

Both scripts use scaled integers (these start with conv-), implemented in script libraries that were "borrowed" from my BOM reporter scripts. Floating point versions of the properties are available as well. The reason for this is that scaled integers are very well suited to transferring decimal-based numbers without any loss or rounding errors (unless pico resolution is not enough...), and I already had the logic to convert them from/to text.
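The scaled-integer idea can be sketched as follows: values are carried as integer picounits, so text round-trips without binary floating-point rounding. The helper names are illustrative, not the conv- library's actual API:

```python
# Sketch of the "scaled integer" idea described above: decimal values are
# carried as integer picounits, so text <-> value conversion is exact.
# Names are illustrative only.
from decimal import Decimal

PICO = 10**12  # 1 unit = 1e12 picounits

def text_to_picounits(text):
    return int(Decimal(text) * PICO)

def picounits_to_text(pico):
    return str(Decimal(pico) / PICO)

v = text_to_picounits("1.234567")   # 1234567000000 picounits
```

A plain float would already lose "0.1" on the way in; the integer representation keeps every decimal digit down to pico resolution.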
 

Offline HendriXML

Some screenshots to give an idea what it looks like.
« Last Edit: November 08, 2019, 10:18:35 pm by HendriXML »
 

Offline HendriXML

A lot of the script's lines consist of reading config, reporting and showing the current status, which in a way makes them more complicated to read. But the more I use these scripts, the more I see this logging and documenting as very useful when doing structured experiments.
The idea is to change the reporting directory on every run. Because the text written in the input tab is reported, it can be used to comment on the extra conditions of the test run. With a proper script, experiments can be repeated more quickly and be documented meanwhile.

Even if these kinds of scripts are only used to do the initial setup of devices and report which values were used, they still offer a lot of benefits.

Also, the scripts mostly use the object-oriented wrappers around the SCPI communication, which makes it easier to parameterize things. But sometimes outputting raw SCPI commands works just as well, especially when the wrappers lack the implementation of a command.
« Last Edit: November 09, 2019, 08:48:28 pm by HendriXML »
 

Offline HendriXML

The script that uses the scope has been updated (and one of its dependencies).

One important parameter was not configurable yet: the bandwidth limit of the channel. Without it turned on, measuring at 500 uV sensitivity is undoable. By the way, which sensitivities are used can be changed in the ini-file section. Although that section is normally loaded from the script file, it can also be imported via a command-line option. This could "automate" switching between different configurations.
The script also now only enables the channel that is used. This way only a single value in the configuration needs to change when swapping physical connectors. That's a great benefit of parameterizing the VISA interaction.
« Last Edit: November 09, 2019, 08:49:52 pm by HendriXML »
 

Offline HendriXML

Now that I own a 6.5 digit bench multimeter (Keithley DMM6500) it was time to check which device gave a more accurate voltage reading, the PSU or the scope.

As can be seen it is the scope that is the most accurate.

The Keithley DMM6500 can run what is called a trigger model. In this case it is a loop that in each iteration waits for a software trigger from the X-script.
When it receives one, the DMM fires its Ext Out trigger, to which the scope is connected.
Then the DMM directly takes a measurement over 12 power line cycles.
After that it sends a service request, which resumes the X-script.
Then it continues with the next iteration, waiting for a software trigger.
Meanwhile the X-script waits for the scope to finish its acquisition and takes an average over the same 12 power line cycles.

So the scope and the DMM measure exactly the same thing.
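As a control-flow sketch, the handshake above looks like this; the classes are toy stand-ins and no real VISA or DMM6500 trigger-model commands are shown:

```python
# Control-flow sketch of the trigger-model handshake described above.
# FakeDmm/FakeScope are stand-ins; a real setup would use VISA and the
# DMM6500 trigger model, none of which is shown here.

class FakeDmm:
    def __init__(self):
        self.log = []

    def software_trigger(self):
        # the X-script releases one iteration of the trigger model
        self.log.append("trigger")
        self.log.append("ext-out")        # DMM fires Ext Out; scope starts
        self.log.append("measure-12plc")  # DMM integrates over 12 PLC
        self.log.append("srq")            # service request resumes the script
        return 1.000123                   # stand-in DMM reading

class FakeScope:
    def read_average(self):
        return 1.000150                   # stand-in 12-PLC scope average

def one_iteration(dmm, scope):
    dmm_v = dmm.software_trigger()   # blocks until the SRQ in the real tool
    scope_v = scope.read_average()   # same trigger, same 12-PLC window
    return dmm_v, scope_v
```

Because both instruments integrate over the same triggered window, each iteration yields one directly comparable DMM/scope pair.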

The next step is to create calibration tables for the scope; this way the curve will get very close to the DMM's. I'm targeting 0.1 mV accuracy in the 0..2 V range!
The big question is mainly how stable/durable the calibration (data) will be.
« Last Edit: December 25, 2019, 01:14:57 am by HendriXML »
 
The following users thanked this post: Performa01, tv84

Offline HendriXML

When building the script to create calibration/interpolation tables, I needed a programmable voltage source that can give stable voltages. (It does not need to be very precise, but high precision does have benefits.)
For that I used a schematic which was proposed by Performa01.
https://www.eevblog.com/forum/testgear/sag1021-vs-sdg1032-what-to-expect/msg2445294/#msg2445294
On the PCB I used a ground spring / turned pin to get a good probe connection. The wires that hold the banana plugs are doubled up and soldered on the backside as well, which makes it sturdy enough to connect to the DMM with ease. I also added a 10 uF capacitor for noise control.

With this and the script I can produce voltages from -3 V to 3 V. The graphs show how stable those voltages are over a longer period while targeting 2.5 V. The voltage is "PID"-controlled about every second using the fine channel.
To get to this point the script goes through the following phases:
  • Create coarse output-input mapping (for each coarse output we need to know the channel 1 input)
  • Create close to zero voltage
  • Create fine output-input mapping (for each fine adjustment correction we need to know the channel 2 input)
After that the mappings can be used to quickly set a precise output. (Even then, the output keeps being error-corrected using real measurements.)
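The phases above, plus the correction loop, can be sketched like this; the `Plant` class is a toy stand-in for the AWG + combiner + DMM chain, with made-up gain and offset errors:

```python
# Sketch of the phases above: map coarse settings to measured outputs,
# pick the nearest coarse setting for a target, then trim with the fine
# channel in a correction loop. Plant is a toy stand-in for the real
# AWG + combiner + DMM chain (made-up gain/offset errors).

class Plant:
    def __init__(self):
        self.coarse = 0.0   # coarse channel setting, V
        self.fine = 0.0     # fine channel setting, arbitrary units

    def measure(self):
        # toy response: coarse has gain and offset error, fine acts at 1/1000
        return 0.98 * self.coarse + 0.012 + 0.001 * self.fine

def set_voltage(plant, target, tol=0.0001, steps=20):
    # phase 1: coarse output -> input mapping
    mapping = {}
    for c in [i * 0.1 for i in range(31)]:      # 0.0 .. 3.0 V
        plant.coarse = c
        mapping[c] = plant.measure()
    # use the mapping to pick the coarse setting closest to the target
    plant.coarse = min(mapping, key=lambda c: abs(mapping[c] - target))
    # phase 3 + correction loop: trim with the fine channel
    for _ in range(steps):
        error = plant.measure() - target
        if abs(error) <= tol:
            return True
        plant.fine -= error / 0.001             # fine gain assumed known
    return False
```

In the real script the fine gain itself depends on the coarse output and load, which is exactly why the mapping is measured rather than assumed.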

The following project step is to generate a range of voltages and determine which offset on the scope negates each of them, and also to measure the ADC response of the 500 uV/div range.

For this I would like the scope to see less noise than is now the case. Somehow the capacitor does not dampen the noise (about 1 mV) as much as I would like.
Maybe I have to "can" (shield) the circuit to see whether that makes a difference?  (The coax cables used might also be of non-optimal quality.)

The planned scope measurements are actually quite robust against noise as far as accuracy is concerned; the largest problem is that the signal goes off screen much earlier when using the 500 uV range. But I don't know how robust the DMM measurements are on a noisy signal.
So suggestions are welcome!
« Last Edit: December 29, 2019, 02:32:57 pm by HendriXML »
 

Offline Rerouter

  • Super Contributor
  • ***
  • Posts: 4579
  • Country: au
  • Question Everything... Except This Statement
Other things you can do with a setup like this: map out capacity per 10 mV, and, if you alter the charging current by a small but measurable amount every, say, 50 mV, take a real-time ESR measurement of the cell and map ESR vs charge level.

Tie in temperature, and from 2 different runs you can begin to work out ESR vs temperature.

And I have also confirmed on the same model scope that the offset DAC is the basis of the calibration on these scopes; the VGA is calibrated off that DAC, generally to better than 1% FS, though there is an offset of about ~100 uV in the AC and GND input modes.
« Last Edit: December 29, 2019, 02:51:55 pm by Rerouter »
 

Offline HendriXML

 :-+ The idea is indeed to have such measurement possibilities whenever they're needed. But in the end the setup/calibration time also needs to be kept doable. Also, this way of measuring is limited by the slow tracking speed. This could be improved by using a less sensitive scope range, but that is a precision tradeoff.
 

Offline Rerouter

you could also use the scope's averaging to filter down the noise: set it to a reasonably fast timebase and use, say, a 64-sample average. You can use this to set the "cutoff frequency" in a way; e.g. for the capture to behave like a 10 Hz filter (a 0.1 s window): 0.1 s / 14 divisions / 64 averages ≈ 112 us/div.
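In code, the same rule of thumb (assuming 14 horizontal divisions, as on these scopes):

```python
# Rough numeric check of the rule of thumb above: pick the timebase so
# that `averages` captures together span the averaging window implied by
# the desired cutoff.

def timebase_per_div(window_s, divisions=14, averages=64):
    """Seconds per division so `averages` captures span `window_s`."""
    return window_s / (divisions * averages)

# a ~10 Hz-like filter needs a ~0.1 s window:
tb = timebase_per_div(0.1)   # ~112 us/div
```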

Also, the SAG1021 offset DAC's range can be higher if set via SCPI, as it can be set up to ~±4.2 V, so technically you can use the same approach to measure lithium cells :)

Hmm, the filtering I think would need to be built externally, but you could use the 20 MHz bandwidth limit option to help things out, possibly building an LC filter with a few kHz bandwidth at the point where the scope measures.
« Last Edit: December 29, 2019, 03:58:35 pm by Rerouter »
 

Offline HendriXML

The SAG1021, as I remember, had too many issues to be used in this way...

For voltages higher than 3.3 V, another circuit should be used with the -10..10 V SDG1032 to get to 4.2 V.
 

Offline Rerouter

That, or use the scope's ±2 V offset range, seeing as most lithium chemistries' charge range fits within 3 V.
 

Offline Performa01

  • Frequent Contributor
  • **
  • Posts: 901
  • Country: at
For voltages higher than 3.3 V, another circuit should be used with the -10..10 V SDG1032 to get to 4.2 V.
Sorry, my initial combiner was unnecessarily complicated. Back then for some reason I figured that you don't need more than 3.3V and if that's true, then the 1/3 division ratio is beneficial indeed. It increases the resolution, reduces the potential inaccuracy of the generator's output impedance as well as the noise and provides a means for full trimming on top of that.

If you want the full output range with a fine adjust range of 1/1000 (+/-10mV), then all you need is this:

899130-0
2-Way_Combiner_02

The output impedance is still the original 50 ohms (minus 0.1%) and you get the full output voltage. V2 is the fine control for 1/1000 of the range, i.e. +/-10mV.

In general, the circuit provides full output of +10V when both V1 and V2 are set to 10V. It can be adjusted down to 9.98V by V2.
The output is 0V if both V1 and V2 are 0V and can be adjusted to +/-0.01V by V2.
As an example for an arbitrary voltage, the Output is 4V if both V1 and V2 are set to 4V. It can be adjusted between 3.986V and 4.006V by V2.

When loaded with 50 ohms, the output voltage drops by one half as usual.
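As a quick numeric check: all the example values above are consistent with an unloaded output of Vo = (999*V1 + V2)/1000 (a closed form inferred from the quoted numbers, not read off the schematic):

```python
# Quick check: every example value above follows from an unloaded output
# of Vo = (999*V1 + V2)/1000. This closed form is inferred from the
# quoted numbers, not taken from the schematic itself.

def vo(v1, v2):
    return (999 * v1 + v2) / 1000

checks = [
    (vo(10, 10), 10.0),    # full output with both channels at 10 V
    (vo(10, -10), 9.98),   # trimmed all the way down by V2
    (vo(0, 10), 0.01),     # +/-0.01 V around zero
    (vo(4, -10), 3.986),   # the 4 V example, low end
    (vo(4, 10), 4.006),    # the 4 V example, high end
]
```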
« Last Edit: December 30, 2019, 09:39:44 am by Performa01 »
 

Offline HendriXML

The original combiner suits me well.

These circuits will never be mathematically exact. The way I see them, the fine control runs a bit of current through the coarse output impedance. That current depends on the coarse output, the fine output and the load, so the voltage drop ("fine control") depends on them as well.

In my HiZ case I should let the fine channel follow the coarse channel when no fine control is needed and only create a voltage difference when needed. Or "compensate" the coarse channel for each output.

At the moment the latter is done automatically when creating the mapping. However, how large the relative effect of the fine control is will still depend on the coarse output and load. But the differences are small and my script corrects the error voltage anyway.

So good enough!

About the 50 ohm output of the original circuit, I have my doubts though...
« Last Edit: December 30, 2019, 03:12:23 pm by HendriXML »
 

Offline Performa01

About the 50 ohm output of the original circuit, I have my doubts though...
Oh? Are you saying that 150 ohm // 75 ohm is not 50 ohm? ;)

When loaded with 50 ohms, the output voltage drops by exactly one half, as expected.
 

Offline HendriXML

I did a run to get calibration data for the offset DAC.

The table shows the voltage measured by the multimeter (supplied by the AWG) and the offset (times -1) that it would take to get average ADC values of nearly zero.

The difference between those 2 are plotted.

This is still an early attempt; I hope to get an even smoother graph out of it. It may also be that the few bumps simply match the scope's real performance, which already looks very fine to me.
 

Offline HendriXML

I ran the whole process again and it is surprisingly repeatable. I will do a run with smaller steps tomorrow; the 0.1 V steps take about 30 minutes, and 0.01 V steps will take about 10 times as long..

But there's still room for optimisation.
« Last Edit: January 03, 2020, 01:29:35 am by HendriXML »
 

Offline HendriXML

I ran the script with smaller input steps (100 mV). Runs 3 and 4 differ at the beginning; that can most likely be explained by the warm-up time the scope actually needs (much more than half an hour). I know from experience that the offset is quite sensitive to that.

The next step will be to create calibration data for the ADC. But that's not trivial, because of the relatively high noise: in every measurement many ADC values (different steps) have their influence. It would be nice to isolate only a few to determine their mapping. In the meantime the offset can also drift.. I'm not yet sure how to attack these issues.

But looking at these graphs I think around 0.1 mV accuracy is indeed possible if the ADC calibration can be done accurately.
« Last Edit: January 04, 2020, 04:48:39 pm by HendriXML »
 

Offline HendriXML

I gathered some test data to see how "stable" the offset / ADC readings are.

The input (target) voltage was 1.000000 V; this was mapped to a matching offset, which did not change anymore during the testing.

In the graph the blue dots are the difference between the target voltage and the DMM-measured one. The orange dots are the difference between what the scope's ADC measured and what the DMM measured.

Only measurements that were less than 10 µV from the target voltage were reported.

Can this data be used to calibrate ADC values? The full range of usable ADC values in the 500 µV setting is 8 x 500 µV = 4 mV. The uncertainty of the measurement seems to be about 60 µV, so only a few calibration "rows" would make sense. If the points were more correlated with the previous measurement, the script could ensure the offset was zeroed (ADC reading of around 0) and then immediately do an actual measurement (with some extra input voltage to check the ADC response). But that's not really the case.  :'(

The graph says drift, because it was created to check the offset drift. But as can be seen, it fluctuates rather than really drifting. This means we could get more precise calibration tables using averaging, but that would also mean an even more time-consuming procedure.
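The averaging trade-off follows the usual square-root law for uncorrelated noise; a quick sketch using the ~60 µV figure from above (the helper is illustrative):

```python
# Sketch of the averaging trade-off mentioned above: single-shot scatter
# of ~60 uV shrinks roughly with the square root of the number of
# averaged measurements (assuming uncorrelated noise).
import math

def averages_needed(single_shot_uv, target_uv):
    """Smallest N with single_shot/sqrt(N) <= target."""
    return math.ceil((single_shot_uv / target_uv) ** 2)

n = averages_needed(60, 20)   # a 3x improvement needs ~9 averages
```

So pushing the 60 µV scatter down to 20 µV means roughly 9 times the measurement time, which is where the procedure gets painful.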
« Last Edit: January 04, 2020, 04:53:24 pm by HendriXML »
 

Offline HendriXML

I don't think the ADC can be calibrated with sufficient accuracy using the 500 μV/div setting. I'll revert to the 200 mV/div setting instead and scale the measurements. It won't be calibrating the same measurement path, but it will use shared elements.
Whether scope measurements as a whole will be accurate using calibration data generated this way remains to be checked.
 

Offline HendriXML

There's too much time spent on setting the fine level of the output. This is mostly because the circuit's fine output isn't predicted well (the script uses interpolation, but disregards the dependency on the coarse input). To optimize this I think I came up with a better combiner circuit. I unleashed Kirchhoff's laws on it using Maxima, and it seems the output can easily be calculated and thus predicted.
In a way I think I might have messed up the maths, but I cannot find any problem, so here it goes:
« Last Edit: January 07, 2020, 01:05:49 am by HendriXML »
 

Offline HendriXML

I built the attached circuit. I eventually chose different values. It now has an input impedance very close to 50 ohm on each channel.
The coarse voltage division is now 1/2, which is quite standard.
The fine voltage division is now 1/2000, so on my AWG it can swing from -5 to 5 mV.
I've tested this by hand and it indeed seems to work even when the coarse input is at its extremes. But some automated testing (with averaging) is needed to see how precise it is.

The 47 ohm resistors may dissipate more than 0.5 W, so it's best to use ones that can handle that without the resistance drifting a lot.
« Last Edit: January 07, 2020, 12:42:59 am by HendriXML »
 

Offline HendriXML

The actual 47 ohm and 3 ohm resistors (when handpicked) should theoretically total 49.9 ohm to get the mentioned divisions. See Vo, and the x (coarse) and y (fine) fractions.
« Last Edit: January 07, 2020, 12:55:59 am by HendriXML »
 

Offline HendriXML

Using the new setup, I ran into the issue that the coarse setting could not reach the required accuracy (error < 2 mV) before going to fine control (-5 mV to 5 mV) at the 1.5 V test voltage. This is because the AWG switches ranges at 3 V, which comes with some extra error, resulting in a small voltage range that cannot be reached.
I solved this by going for a somewhat coarser fine tuning (-10 mV to 10 mV), so I could relax the required accuracy (error < 5 mV) before going into fine control. (R5: 100 -> 200 mOhm)

Because of the much better "set voltage -> output voltage" prediction, the script can now execute in 1h40, including ADC error measurements.

Those measurements are shown below for the 200 mV/div range. The errors are at most 1.25% of full range. Also, the errors seem to have an offset of +2.5 mV, which could also be attributed to the offset DAC. It could well be that the offset in the 200 mV/div setting behaves differently from the 500 uV/div setting (for which precise offsets can be generated).
Maybe the correction should be scaled?

However, the errors are so small that, when scaled as if measured in the 500 uV range, they would be small relative to the 60 uV vertical offset precision measured before. Taking both together I think the 0.1 mV precision/accuracy can already be achieved. So the next step will be to write the calibration table to a file and load it into the battery charging script.
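Applying such a calibration table could look like the following piecewise-linear interpolation; this is a sketch with made-up table values, not the actual script:

```python
# Sketch of applying a calibration table like the one described above:
# piecewise-linear interpolation from raw reading to true voltage.
# The table values below are illustrative only.
import bisect

def apply_calibration(table, raw):
    """table: sorted list of (raw_reading, true_voltage) pairs."""
    xs = [r for r, _ in table]
    i = bisect.bisect_right(xs, raw)
    i = min(max(i, 1), len(table) - 1)   # clamp to an interior segment
    (x0, y0), (x1, y1) = table[i - 1], table[i]
    return y0 + (y1 - y0) * (raw - x0) / (x1 - x0)

table = [(0.0, 0.0000), (0.5, 0.5012), (1.0, 1.0021), (1.5, 1.5018)]
v = apply_calibration(table, 0.75)   # midway in the 0.5..1.0 segment
```

Outside the table range the same formula extrapolates along the nearest segment, which is usually fine close to the edges.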
« Last Edit: January 07, 2020, 10:23:31 pm by HendriXML »
 

Offline HendriXML

Re: Automated battery charging using a calibrated SDS1104X and SPD3303X - done
« Reply #28 on: January 09, 2020, 09:47:17 pm »
An updated charging script was run, and the results are well within 0.1 mV precision and accuracy!  :box:

I also found out that the previous measurements had a little voltage drop (1.5 mV) due to the 1 A current draw, which mostly influenced the scope measurements. Uncalibrated, those were actually better than previously shown. The question of how much they would differ can be answered using the calibration graphs..

I also used channel 2 instead of 1, just to see whether the calibration curve would be different. But as can be seen they're very alike.

Because of a narrower voltage range (-100 mV .. 1.8 V), the calibration used here was done in 50 minutes.

To conclude: for signals that don't change fast, this can be a worthy use of a scope. I think it can get even more precise; it really would have helped if the AWG voltage source had a bigger bipolar buffer capacitor added to it. The battery charging voltage is noticeably more stable.
« Last Edit: January 09, 2020, 09:52:24 pm by HendriXML »
 
The following users thanked this post: tv84

Offline HendriXML

Re: Automated battery charging using a calibrated SDS1104X and SPD3303X - done
« Reply #29 on: January 20, 2020, 06:19:24 pm »
I updated the scripts, but some changes were not completely tested.

I also added a script file for viewing purposes which uses the calibration and the Keithley DMM6500.
https://github.com/HendriXML/XMLScripts-Project-BatteryCharging/blob/master/Script.Project/Battery%20voltage%20measurements%20DSO%20and%20DMM.xml
 

