Author Topic: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...  (Read 162620 times)


Offline Rick Law (Topic starter)

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
PREFACE

So, this little low cost ($15 USD) MingHe B3603 DC/DC buck unit has been getting some interest here.  Initially, I merely wanted to write a post about how careless use of the SET key can be fatal to stuff connected to this unit.  However, this board is pretty good and I didn't want to just say something negative and leave.  So, the post grew into this "mini review".  The SET key issue is in Section III.3.

130389-0

The B3603 is a digitally controlled CC/CV board providing constant voltage and a current limit.  It accepts 6V to 40V input and outputs from 0V up to Vin minus 2V.  By spec, it can support up to 3A, but driving it at the max would be somewhat wishful thinking.  I have done 2A sustained without problem, and 2.7A for a brief period (15 minutes-ish).  It may be able to run 2.7A sustained but I have not been brave enough to try.

In a nutshell, if you want to power some stuff to within 5%, this is a great little unit.

I. How well is it built?
II. How well does it work?
III. Calibration
IV. User Interface
V. The Serial Port
VI. How noisy is it?
VII. Wish list
VIII. Last words

Typical first and second questions are: 1. How well does it work? and 2. How well is it built?   So let's get into the shorter of the two first.

I. How well is it built?

I will address the construction quality alone and leave the quality of the design to someone with more knowledge of electronic design and the schematic.

I.1 The good

The quality matches that of typical low-cost Chinese boards.  The unit consists of two boards of different quality.  The main board is almost perfect visually; the bad and the ugly are on the daughter card with the LED displays and MCU, which is below par.  The spread of solder is about 90% consistent and parts are well aligned on both boards.

130391-1


130393-2


I.2 The bad

I found two problems, both on parts that were likely hand-soldered.  The four tact switches have poor solder distribution.  However, even looking very carefully, I was able to find only one visually questionable joint.  The component side of that joint has solid contact, but the underside looks dry with insufficient solder.  The daughter card underside picture "top-board-side-2" has a yellow arrow pointing at the problem joint.  The second problem is with the pin header and is relatively minor: 1 of the 16 pins has visibly less solder.  Less, but still adequate, and it is certainly making good contact.  It doesn't look good but it works.  (The next section will discuss the red arrows.)

130395-3


130397-4


The SMD joints appear to have much better consistency.  They all have good shiny surfaces and all appear to have adequate solder and joint quality.  I can't find any questionable SMD joint to comment on.

I.3 The ugly

This problem appears to be common for this model.  I see the same issue in at least three different pictures, each from a different eBay vendor - and it appears to be on the daughter board only.  You can see it best in the attached picture top-board-side-2 (red arrows).  On top of the solder mask, there is a reflective, transparent, enamel-like protective layer.  The application of this layer is inconsistent: in some areas it lumps up, and in others it doesn't cover the board fully and the edge is dry and flaking.  The quality of the joints is good, so with or without this layer I do not expect problems there.  It just looks messy.

130399-5


II. How well does it work?

With this digital controller, the quality of the regulation is determined by the buck regulator, whereas the MCU's ADC and the shunt dominate the accuracy of the settings and of the display.  So, first, the component choices:
- MCU is an STM8S003F3 (10-bit ADC, +-1 LSB, from page 1 of the spec),
- Buck is an LM2596S (+-4%, from page 1 of the spec),
- two MCP6002 rail-to-rail op-amps to do its work,
- two 74HCT595 shift registers to drive the two 4-digit 7-segment LED displays,
- a 0.05 ohm shunt for current measurement (probably 1%).

From the most up-to-date specs (2014 in both cases), the ADC has a +-1 LSB error and the LM2596S (TI version) has load and line regulation within 4%.  Not knowing exactly how the op-amps are used, I cannot factor them in.

With a 10-bit ADC, 1 LSB is about 0.1% of range, which is small compared to the 4% of the LM2596.  So assuming a perfect design, up to 4% error can be expected from these two critical components.  Unless you have really bad luck, the 1% error of the other components will fit inside that worst case of +-4%.

II.1 How well does it regulate?

The voltage regulation is about what one would expect out of an LM2596S.  It is very similar to other Chinese-made CC/CV boards I have.  The line regulation is good and the load regulation not so good.  That said, this board does better with noise than the other boards I have.

I use my UT61E as the benchmark.  Where necessary, I rounded the UT61E's reading to 2 digits after the decimal to compare against the board's reading.  The UT61E was calibrated against the DMM-Check-Plus at 5V and 1mA.

At no or low (0-50mA) load, the UT61E-measured voltage is within +-3 digits of the displayed voltage.  For example, if the unit displays 05.00V, my UT61E will read between 4.97V and 5.03V; at 20.00V shown, my UT61E (with rounding) will read between 19.97V and 20.03V.  I have yet to see it exceed +-3 digits (with rounding) at low load, and mostly it is +-2 digits or less.

For a 10-bit ADC, they are doing really well getting it to within +-0.03V.  Mathematically, 1023 counts over 36V translates to 0.0352V per count.  The observed deviation of <+-0.03V is below the quantization step alone, even ignoring the +-1 LSB and op-amp errors.  It is reasonable to assume they use averaging and/or other techniques to improve the accuracy some.

The picture changes at higher current.
@0mA     displayed 8.67V reads 8.696V (on my UT61E)
@500mA   displayed 8.67V reads 8.63V  (0.5%)
@1000mA  displayed 8.67V reads 8.577V (1%)
@1500mA  displayed 8.67V reads 8.579V (1%)
@2000mA  displayed 8.67V reads 8.494V (2%)
(During tests, the UT61E was connected to the board directly so any drop along the power-carrying cable is excluded.)

This is what I've seen with other LM2596 boards when the load is heavy.  If I understand the LM2596S spec correctly, load and line regulation is only within 4%, so the readings above are within the component specs.

II.2 Current

Current reading and regulation are considerably affected by the temperature of the shunt.  At higher current, the heat-induced resistance change comes into play more and more significantly.  It needs a shunt with a better temp-co and/or better cooling.

Initially confused by the readings, I added an ADS1115 to read the shunt directly, displaying the shunt's mV and using another MCU to translate the mV into the corresponding mA.  The actual shunt voltage drop is accurately measured by the board when compared to my ADS1115 (PGA set to 250mV) and my UT61E's mV measurement.  However, the temperature change alters the shunt resistance, making the mA reading drift further and further.

I use 4 readings for each comparison:
- Two from the internal shunt: the B3603's reading, and the ADS1115 reading the shunt's mV (also translated to mA for display).
- Two "actual current output" readings with two different DMMs: my UT61E's mA/10A range as a direct current meter, and a second DMM measuring current via an external 0.1 ohm shunt with a cooling fan.

The B3603 and ADS1115 measurements match (since both are measuring the hot shunt).  The two DMMs' current measurements also match each other.  But the two DMMs' matching measurements differ from the ADS1115's and B3603's matching measurements.  That means the resistance of the shunt has changed.

The internal shunt current measurement starts out matching the external DMM/shunt measurement (at low current with a cool system).  As current increases, by about 1A the reading becomes noticeably different (>5mA delta); by 1.5A the shunt gets hot enough that the delta approaches and exceeds 10mA.  The reading continues to change for a few minutes until the shunt fully warms up at that level of current.  If I aim a fan at the internal shunt, the readings slowly converge with the DMM mA measurement, confirming that the temperature of the shunt was causing the error.

While I have played with this quite a bit, I did not collect temperature vs. current data.  It took me a while to realize how much the temperature was affecting the reading, so the early readings without temperature logged are basically useless.  Thus, I am left with my observations only.

My observation is that the reading at low current is within 1-2% of actual (using the UT61E as the mA/10A meter).  At high current (1A+), it is within 2-3% of actual, and the error increases as the shunt gets warmer at still higher current.  With a fan on the B3603's shunt it got better, but not better than 1-2%.  If the unit is calibrated warm, it will be accurate at high current and the error increases as current decreases, since lower current does not create enough heat to warm the shunt.

Total observation time is about 15-30 hours spread over a few weeks, with ranges from 0mA to 2.5A and from 0V to 30V+.  I am comfortable estimating the current reading variation at 2-4%.

III. Calibration

The unit came well calibrated (meaning: I fiddled and fiddled and couldn't get it much better).  One can do further calibration; however, accuracy is limited by the 10-bit ADC.

III.1 Factory reset to get factory calibration

As a wrong calibration can really mess up your day, when in doubt the "reset to factory" procedure I found in the Chinese user manual is probably the best bet:
- Press and hold SET till you see F1.
- UP arrow to F6 and press OK to enter
- At r--n display, UP arrow to change n to y and press OK.

III.2 How to calibrate

A long press of the SET key takes the unit into calibration mode.  UP and DOWN let you choose from F1 to F6, and the OK key enters that particular calibration.  While not inside an F1-F6 function, pressing SET again exits.

The F functions are:
F1/F3: F1 for voltage reading, F3 for voltage regulation
F2/F4: F2 for current reading, F4 for current regulation
F5 saves your calibration (and F6 is the factory reset described above).

The calibration is done by repeatedly telling the unit how far off it is at a low point and a high point.  The low and high points are 2.00V and 30.00V for voltage, and 200mA and 1.2A for current.

So in each loop, you UP/DOWN the 2.00V point to the value closest to what the DMM actually measures (then SET, then OK), continue to the high point to change-SET-OK, and go back to the low point.  You continue this loop until both the low and high points are as close to actual as you can get.  When both are closest, you end up pressing OK without SET, since you accept the displayed values without needing SET to store new ones.  This "OK only" is the signal to the unit that you are done with that particular calibration.  Once done, you use the UP key to move to the next F number until F1, F2, F3, and F4 are all done.

The calibration is a frustrating exercise with F2 and F4.  The system is so busy with something unknown that it will most likely miss your UP/DOWN press.  Holding UP/DOWN too long, however, kicks it into REPEAT, resulting in a massive increase or decrease.  When that happens, you have to start over: slowly getting it back to the correct value a bit at a time while it seems to try its best to ignore your key presses.

Credit: this info originally came from a forum post at http://forum.fonarevka.ru/archive/index.php/t-15496-p-2.html
The author "SAV" wrote a brief description on which I based my experiments, and SAV credited the original info to Jonathan at AtomicWorkshop.

III.3 Why is SET dangerous to the connected device?

Before I understood the undocumented F1 to F6 display, I mistakenly thought F1 was related to the extended functions 0, 1, and 2 described in the manual under "Fully Functional Usage".  Instead, F1 is an undocumented calibration function.
 
I had started the unit for normal use, to power some 5V stuff.  After pressing SET a bit too long, F1 was displayed, and this is what I thought:
- Extended function option 1 to enable load/store settings is on, fine,
- press OK to start
- 02.00(V) is now displayed, fine, the unit was set to 2V from last use.
- I need 5V, so I set it to 05.00, that looks good,
- press OK to go at 5V… Yikes!

This is what the system did when F1 was displayed:
- OK is pressed, go into calibration for F1
- the low point is 02.00 (Volts) so start sending out 2V for calibration
- User changed it to 0.500, fine
- OK is pressed, so user is done for the low point calibration
- Now start the high point, so send out 30V and show 30.00

Shooting 30V into 5V components is not pretty.  In the first incident I blew just a cheap TP4056 board and didn't even realize what I had done - as I watched the blue LED dim, I just thought it was a bad TP4056 board.  Days later, the second time I saw the F1 and 02.00, I set it to the 5V I needed and then pressed OK; it showed 30.00, and the loud pop that followed prompted me to find out what F1..F6 actually mean.  A partial list of casualties:
- 1st accident: TP4056 charger board (just this one victim the first time)
- My LCD (20x40) popped and smoked (the pop caused me to research)
- The I2C adaptor for the LCD
- The MCU
- The DS3231
- Adafruit ADS1115 breakout board

So, be careful with that SET key.  Don't press SET too long and if F1 shows up, power off and back on.

IV. User Interface

Awful until you get used to it.  It doesn't do much, but it can still be confusing.  Read the manual carefully.  Errors can be fatal to your equipment.

Be particularly careful not to hold the keys too long: holding SET too long kicks the unit into calibration mode, where a bounce of your finger on OK will make it shoot out as much as it can to try to reach 30V - enough to fry most 5V stuff.

If F1 or F(any-number) shows up, it is best to turn the unit off and start over, at least until you are familiar with the unit.

V. The Serial Port

The unit has a serial port.  Some units are labeled and some, like mine, are not.  The leftmost 4-pin set on the daughter card is the serial port:
Top: VDD (square hole)
2nd: Tx
3rd: Gnd
4th: Rx (bottom)
By raw experimentation, I found that 38400 7,N,1 shows "pppx" on power down, but I can't get it to respond.  I suspect it is for initial testing/loading.  I can get as far as receiving "ppp" as a power-down message, but beyond that I cannot get it to do anything.
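
For anyone who wants to repeat the serial experiment, the sketch below is roughly the kind of bridge I used for listening.  It is only a sketch: the Arduino Mega, the pin wiring and even the 38400 7N1 framing are assumptions from my fiddling, and since the board's logic runs at about 3.15V you should not drive its Rx straight from a 5V pin without a level shifter or at least a series resistor.

Code:
// Forward whatever the B3603 prints on its Tx pin to the PC serial monitor.
// Assumed wiring (Arduino Mega): B3603 Tx -> Mega RX1 (pin 19), B3603 Gnd -> Mega GND.
// Only connect Mega TX1 (pin 18) to the B3603 Rx through a level shifter / resistor.

void setup() {
  Serial.begin(115200);              // USB link to the PC
  Serial1.begin(38400, SERIAL_7N1);  // B3603 header: 38400 baud, 7 data bits, no parity
}

void loop() {
  if (Serial1.available()) {         // B3603 -> PC
    Serial.write(Serial1.read());
  }
  if (Serial.available()) {          // PC -> B3603 (in case it ever answers)
    Serial1.write(Serial.read());
  }
}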


VI. How noisy is it?

It depends on your power source.  The unit will clean up your power source's noise some, and add some of its own.  For me, this one is better than the other LM2596 boards I have.

I found that when the source is closest to the needed voltage (least bucking), the noise is lowest.  That may just be my laptop power bricks.  The second-worst case is when I buck a lot, and the worst is when I overdraw the power brick.  In general, the unit adds some noise and reduces some power-source noise.  Mostly, I get less noise than from the original power source.

First, my Hantek 6022BE has a ground noise of about 20mV, so the first 20mV may be from the scope's imperfection.  I used four power sources, outputting 5V at 100mA and 5V via a resistor to draw 995mA.  For each output, I took two pictures: one at a faster time base to show switching noise, the other at a slower time base to show lower-frequency noise.  I used a 56uF capacitor to block the DC so I could look at the noise alone.

At times, I am not sure I can call it noise.  In the first group of tests using battery power, since the SLA has no switching components, the "source noise" comes from the battery voltage dropping under the load drawn by the B3603, not from switching or component noise in the power source.  However, the B3603's load will see it as noise coming from the B3603, so noise it is.
 
1. 12V SLA battery (and the laptop on battery as well) to ensure I am not passing noise from the power source down to the B3603.  This one also shows the 0V output that the others don't.
2. 15V Toshiba laptop supply
3. 19.5V Sony laptop supply - this shows how the B3603 performs with a noisier power source
4. 15V Toshiba + 19.5V Sony - this shows a power source with even more noise.

The GREEN trace is the source and the YELLOW trace is the B3603 output.  With that said, I will let the scope pictures do the further talking about noise.

So, scroll down to look at the attached pictures...
130401-6
130403-7
130405-8
130407-9
130409-10
130411-11
130413-12
130389-13

VII. Wish list

- I wish calibration were entered by a simultaneous two-key long press, or only during power-up.
- I wish option settings did not require a power cycle.
- I wish both volts and amps were displayed at the same time.
- I wish it used a rotary encoder to adjust the value, with a key to change the step per click - perhaps 1, 10, or 100 counts per click.  Holding a button to go from 00.00 to 36.00 takes a very long time.
- A heat sink on the shunt would be good - or at least put the shunt where there is more airflow.  As it is, you can't adjust it enough to get good accuracy at both low and high current.  I am thinking about doing that as a project, and making my ADS1115 an external display for this unit.  My ADS1115 can already show current from the shunt; I can add volts and watts as well.

VIII. Last words

Not great, and not lab grade, but for the money it is very good - well worth the roughly $15 (USD).  Now that I know what F1 is for, I do not mind this unit at all and would buy it again if I needed another one.

Hope you find this info useful
Rick

EDIT: Typo in the MCU part number.  Also added temperature info as a reply.
« Last Edit: February 11, 2015, 05:07:57 pm by Rick Law »
 

Offline poida_pie

  • Regular Contributor
  • *
  • Posts: 119
  • Country: au
Thanks for this good and informative review.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I'm interested in seeing if I can make it work over the serial interface; so far, not much success.  I did find that the four bottom holes are for the STM8 programmer (SWIM protocol): there are Vdd, Gnd, SWIM and RST, which is all that is needed for this protocol.  I don't have an STM8 programmer (ST-Link) handy, but maybe someone does and can try it out.

I'm hoping to be able to read the firmware and figure out from the code how the serial works.  This also assumes that the flash is not locked against reading.
 

Offline rob77

  • Super Contributor
  • ***
  • Posts: 2085
  • Country: sk
nice review !  :-+ :-+ thanks for your time invested into this !  :-+

just one piece of advice ;) never ever have anything connected to a power supply (except a voltmeter and a dummy load of course) when calibrating/adjusting it ;)
 

Offline Rick Law (Topic starter)

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
nice review !  :-+ :-+ thanks for your time invested into this !  :-+

just one piece of advice ;) never ever have anything connected to a power supply (except a voltmeter and a dummy load of course) when calibrating/adjusting it ;)

Well...  I was not really trying to calibrate it!  It was the pressing-SET-too-long mistake that sent me into calibration mode.  (I edited the original post just now to make it clearer that I was not trying to calibrate.)  That is why I originally wanted to write the post: to warn other B3603 users.  F1 = fatal = fried components = ... and has nothing to do with "full function" option 1.

Stupid UI - they could have chosen anything other than F.  F implies it has something to do with the functions of the "fully functional" mode in the user manual...  Still, it is a nice little unit.

Good general advice (on not connecting anything when calibrating) however.

(For this unit you do need to connect a power drain when calibrating current ... something that can eat 1.2A)
« Last Edit: January 19, 2015, 08:56:59 pm by Rick Law »
 

Offline amyk

  • Super Contributor
  • ***
  • Posts: 8232
CAL1~CAL6 would probably be a good message instead... and a CAL0 option to "exit without saving changes"?

Somewhat reminds me of this:
https://www.eevblog.com/forum/reviews/hakko-fx-888d-decalibrated-(doh!)/
 

Offline bdivi

  • Regular Contributor
  • *
  • Posts: 108
  • Country: bg
I bought myself one B3603 and ran some tests.

The unit seems well engineered, including the software (which is not very common for a Chinese design).  Given one LED display and four buttons, the interface is intuitive and easy to learn.

Performance-wise, the accuracy is pretty good out of the box, and I even went through one calibration cycle that brought the voltages and currents to within one count of my meters.  Surely the current shunt has some temperature coefficient, but it stays within 0.3% of cold at 1A.

I am planning to put it in a box with a 20V laptop supply as an addition to my collection.

Overall very pleased with the price/performance.
 

Offline macboy

  • Super Contributor
  • ***
  • Posts: 2250
  • Country: ca
That "inconsistent" "enamel like protective layer" is no such thing. It is, in fact, just leftover flux from the hand-soldering of the through hole parts: switches and connector/header pins.
 

Offline Rick Law (Topic starter)

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
That "inconsistent" "enamel like protective layer" is no such thing. It is, in fact, just leftover flux from the hand-soldering of the through hole parts: switches and connector/header pins.

I couldn't tell what it was...  So I took a guess based on my less-than-adequate experience.

I bought myself one B3603 and ran some tests.
...
Performance-wise, the accuracy is pretty good out of the box, and I even went through one calibration cycle that brought the voltages and currents to within one count of my meters.  Surely the current shunt has some temperature coefficient, but it stays within 0.3% of cold at 1A.

I am planning to put it in a box with a 20V laptop supply as an addition to my collection.

Overall very pleased with the price/performance.

By the way, a fan does wonders in keeping the shunt cool.  I've been planning/assessing a shunt replacement, so I have been collecting some baseline data.  Based on the initial assessment, I see a delta (with/without fan) of 2.7% @ 1.9A and 2.2% @ 1.6A.  I will post more details after I analyze it further.
 

Offline bdivi

  • Regular Contributor
  • *
  • Posts: 108
  • Country: bg
Running further tests on the unit, I noticed oscillation in current-limiting mode with currents above 1.5A.

This oscillation causes another weird problem: when increasing the voltage with a 4 ohm load resistor, the current can go all the way to 3A before dropping to 1.7A; however, when the voltage is already high (say 15V), connecting the load resistor immediately brings the current down to 1.7A.

I will play with the capacitors to see if there is an improvement - the original ones are far from the high-quality, low-ESR types required by the datasheet.
 

Offline Rick Law (Topic starter)

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
What follows is data on temperature of the shunt.

Preface

Because of my initial expectations, I did not collect shunt-reading data versus warming time.  So, up to this point I had no hard data other than recollections of observations.  I am considering a shunt upgrade, so I want to collect some hard data to help me make an assessment.

The data I collected is to see, at a fixed current, how much the shunt reading changes as it warms up or cools down, and how long the heat-up and cool-down take.  The actual shunt mA versus the "real" measurement is not important in itself, as calibration can make sure they match at a particular temperature.

The setup is to load the B3603 with a fixed load, and use a fan to cool it down (or not) in fixed cycles.  Fan on/off makes the shunt cold/hot, giving me an idea of how big the temperature-induced error is and how long it takes to settle.

I selected 1.96A for my first run based on my initial work.  From experimentation, the best cycle time is 8 minutes FAN ON (cool to equilibrium) and 35 minutes FAN OFF (warm to equilibrium).  Equilibrium is when the reading stops changing: airflow cooling equals heat gained from current flow.  A longer 8-hour run was made at a lower 1.6A.

Summary of Results:

Doing the calibration at a different temperature cannot reduce the error; it merely changes the nature of the error.  Calibration done at a high shunt temperature means the error starts high and reduces as the shunt heats up.  Calibration done at a low shunt temperature means the error starts low and increases as the shunt heats up.

Assuming the system is calibrated at room temperature without a fan, if the current jumps from <500mA (cool) to around 2A, you can expect a current reading error almost immediately.  The error continues to increase as the shunt warms over the next 20-30 minutes or so without a fan.  The swing is about 2.7% at 1.96A and 2.2% at 1.6A.

With a good-sized fan (a 2.5" PC case fan 1-2 inches from the shunt), the heat-related error is not eliminated but is greatly reduced, to 0.8% at 1.96A and 0.6% at 1.6A.  Without the fan, the error is 3.5% (2.7% + 0.8%) at 1.96A and 2.8% (2.2% + 0.6%) at 1.6A.
 
In the initial 1.96A test, the regulator was too hot to touch.  In the 1.6A second run, the regulator was warm but touchable.  Thus, regardless of whether current reading accuracy is important, above 1.6A a fan is a good idea.

Details of test:

The set-up

The B3603 supplies a fixed load of a pair of 1 ohm 5W resistors in series.  The B3603 voltage is adjusted to achieve the desired test current.  A 0.1 ohm current-sense resistor is also in series but is not used for current sensing here.  A UT61E is connected in series with the load to measure the actual current using the 10A range.

An Arduino reads the B3603's shunt voltage using an ADS1115.  It also converts the mV reading to mA and displays (logs) both every 0.4 seconds.  The mA reading from the ADS1115 matches the B3603's displayed mA to within 1 digit in all observations.

The Arduino also controls a relay that turns the 2.5" PC-case fan on and off.  The Arduino reports the data to the PC at a 0.4 second interval and the UT61E reports data at a 0.5 second interval.  Every second, the PC logs the latest UT61E current reading (10A range) and the latest Arduino reading.
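
For reference, the Arduino side boils down to something like the sketch below.  Treat it as a sketch only: the relay pin, the fan timing, the nominal 0.05 ohm shunt value and the Adafruit ADS1X15 library are simply what my setup happens to use.

Code:
#include <Wire.h>
#include <Adafruit_ADS1X15.h>

Adafruit_ADS1115 ads;                 // ADS1115 wired differentially across the B3603 shunt (A0/A1)

const int   FAN_RELAY_PIN = 7;        // relay driving the 2.5" PC case fan (my wiring)
const float SHUNT_OHMS    = 0.05f;    // nominal internal shunt value
const unsigned long FAN_ON_MS  = 8UL  * 60UL * 1000UL;   // 8 min: cool to equilibrium
const unsigned long FAN_OFF_MS = 35UL * 60UL * 1000UL;   // 35 min: warm to equilibrium

bool fanOn = true;
unsigned long phaseStart;

void setup() {
  Serial.begin(115200);
  pinMode(FAN_RELAY_PIN, OUTPUT);
  digitalWrite(FAN_RELAY_PIN, HIGH);  // start the test with the fan running
  ads.begin();
  ads.setGain(GAIN_SIXTEEN);          // +/-256 mV full scale, 0.0078125 mV per count
  phaseStart = millis();
}

void loop() {
  // Toggle the fan at the end of each phase to cycle the shunt cold/hot
  unsigned long phaseLen = fanOn ? FAN_ON_MS : FAN_OFF_MS;
  if (millis() - phaseStart >= phaseLen) {
    fanOn = !fanOn;
    digitalWrite(FAN_RELAY_PIN, fanOn ? HIGH : LOW);
    phaseStart = millis();
  }

  // Read the shunt drop, convert to mA, and log both every 0.4 s
  int16_t raw = ads.readADC_Differential_0_1();
  float shunt_mv = raw * 0.0078125;          // LSB size at gain 16
  float current_ma = shunt_mv / SHUNT_OHMS;  // I = V/R; mV over ohms gives mA
  Serial.print(fanOn ? "FAN_ON," : "FAN_OFF,");
  Serial.print(shunt_mv, 3);
  Serial.print(",");
  Serial.println(current_ma, 1);
  delay(400);
}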

134160-0


The B3603's load resistors sit a foot below and a foot away, so the load's cooling fan does not increase airflow at the shunt beyond natural convection.  The 0.1 ohm current-sense resistor at the load plays no role in the measurement, as the UT61E measures the current using the 10A range.

134162-1


Resulting data:

At 1.96A, cooling takes about 3 minutes and warming up takes about 30 minutes to reach equilibrium.  The reading swings by 2.7% between equilibrium-cool and equilibrium-hot.

At equilibrium-cool (i.e. with the fan) the reading is about 16mA (0.8%) higher than when the shunt is at room temperature (i.e. when first powered on).  Thus the total change from heat is 3.5%.


134164-2


At 1.6A, the cooling and warming times don't change significantly: about 15 seconds faster (from 3 minutes) to cool to equilibrium-cool and about 3 minutes faster (from 30 minutes) to heat to equilibrium-hot.  The reading swings by 2.2% between equilibrium-cool and equilibrium-hot.

At equilibrium-cool (i.e. with the fan) the reading is about 10mA (0.6%) higher than when the shunt is at room temperature (i.e. when first powered on).  Thus the total change from heat is 2.8%.

I added an LM35 to measure the air temperature near the shunt.  It sits at, and almost touches, the far end of the shunt - far meaning the side away from the fan, so the fan still blows over the entire shunt.

134166-3




My thoughts:
I collected this data to assess whether I want to upgrade the shunt.  I hope it is of use to you as well.

My goal is to be fan-less below 1.5A and keep the error as low as possible.  I am not sure of the temp-co of the stock shunt, and since I also don't know the exact temperature swing, it is hard to judge.  So I did a back-of-the-envelope estimate.  At 1.6A, fan/no-fan changes the resistance by 2.2%, that is 22,000ppm.  I see that many if not most shunts are only rated up to 175C.  So even assuming an impossibly high 220C swing, the stock shunt would still have to be a 100ppm/C part to produce that 22,000ppm change - so a real, smaller swing implies a much worse temp-co.  If I get a 75ppm/C shunt, it has to help; as to how much, I have no idea.  I suspect (guesswork) the stock part may be a low-price 500-1000ppm/C type - a 44C change at 500ppm/C would change it by 2.2%.  If that is the case, a 75ppm/C shunt would help greatly.  Only doing it will tell me if it really helps…
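
For what it's worth, the back-of-the-envelope above is just this arithmetic (the 500ppm/C guess for the stock shunt and the 44C swing are pure assumptions, not measurements):

Code:
#include <cstdio>

int main() {
  // Observed fan/no-fan resistance change at 1.6A: about 2.2%
  const double deltaR_ppm = 0.022 * 1e6;                    // 22,000 ppm total

  // If the stock shunt were roughly 500 ppm/C (a guess), the implied temperature swing:
  const double guessed_tempco_ppm_per_C = 500.0;
  printf("implied swing: %.0f C\n", deltaR_ppm / guessed_tempco_ppm_per_C);   // ~44 C

  // A 75 ppm/C replacement over that same ~44 C swing:
  const double swing_C = 44.0;
  printf("75 ppm/C shunt error: %.2f %%\n", 75.0 * swing_C / 1e6 * 100.0);    // ~0.33%
  return 0;
}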
« Last Edit: February 04, 2015, 01:56:44 am by Rick Law »
 

Offline Rick Law (Topic starter)

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Running further tests on the unit, I noticed oscillation in current-limiting mode with currents above 1.5A.

This oscillation causes another weird problem: when increasing the voltage with a 4 ohm load resistor, the current can go all the way to 3A before dropping to 1.7A; however, when the voltage is already high (say 15V), connecting the load resistor immediately brings the current down to 1.7A.

I will play with the capacitors to see if there is an improvement - the original ones are far from the high-quality, low-ESR types required by the datasheet.

Interesting.  I just tried it with my shunt-test setup - I already have a 2 ohm fan-cooled load.

I current-limited it to 1.8A and I see some noise - a sawtooth, 200mV peak at 50kHz.  Over a few minutes, it did start to show a 300mV-peak oscillation with a wildly changing frequency.  Is that what you see?

Can you post a scope picture?  The oscillation I just saw is the kind I've seen before with other CV buck boards, and with a home-made CV buck under high current draw.  I am interested in this as I could not solve the problem with my home-made CV buck.

Please keep us posted.
 

Offline bdivi

  • Regular Contributor
  • *
  • Posts: 108
  • Country: bg
First attachment is current limiting at 1.6A - 2kHz oscillation around 90mV pp
Second is 2.9A limit  - 4kHz oscillation again 90mV pp
The third screen is when the major oscillation kicks in - it is almost 300mV pp and 30kHz.

I think the small oscillations are normal behaviour of the LM2596 at high currents.
The large oscillation is probably a problem with the op-amp current-limiting control loop.

This is with new low-ESR capacitors, so I cannot do much more without the schematics.

After all, I can safely use it with currents below 1A.
 

Offline Rick Law (Topic starter)

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
First attachment is current limiting at 1.6A - 2kHz oscillation around 90mV pp
Second is 2.9A limit  - 4kHz oscillation again 90mV pp
The third screen is when the major oscillation kicks in - it is almost 300mV pp and 30kHz.

I think the small oscillations are normal behaviour of the LM2596 at high currents.
The large oscillation is probably a problem with the op-amp current-limiting control loop.

This is with new low-ESR capacitors, so I cannot do much more without the schematics.

After all, I can safely use it with currents below 1A.

What you posted confirms that we are talking about the same thing.  I've seen this with this board and other LM2596-based boards, including a home-built one using the Chinese-made XL4015, which is functionally compatible with the LM2596.

With the larger oscillation, I might have wrongly chalked it up to overdrawing the power brick, since I typically saw it in the 12-13V range at high current with a Toshiba 16V 4A power source.  Reproducing what you described at 1.8A, I was merely in the 4-5V range.  Clearly my initial thought that it was over-taxing the power source was wrong.

On other CV boards, at higher voltage (and current), I have seen that oscillation reach well over 1V.

After I wrap up what I am doing with the temperature testing, I will get into this to see what I can learn.  This is a problem worth learning more about.  Good thing you pointed it out!
 

Offline bdivi

  • Regular Contributor
  • *
  • Posts: 108
  • Country: bg
The same issue was found by DadHav on YouTube.

Once the current goes above 2.6A it drops down to 1.5A, and the voltage drops correspondingly.  I am not sure the author visits our forum, but if he does he can share the details.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I hooked the unit up to an ST-Link hoping to peek into the firmware, and it seems to be locked: all I can read are 0x71 bytes, which seems to indicate the unit is read-protected (as could be expected, but I was hoping).
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Just found a nice reference for the pinout between the main board and the controller board: http://forum.banggood.com/forum-topic-20302.html

Now I need to read up on STM8 programming and find the mapping from the external pins to the STM8 pins to figure out how to write my own firmware.  Wish me luck!

(Luckily I ordered two of these units just for such an occasion, so I can use one and hack the other.)
 

Offline Rick Law (Topic starter)

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Just found a nice reference for the pinout between the main board and the controller board: http://forum.banggood.com/forum-topic-20302.html

Now I need to read up on STM8 programming and find the mapping from the external pins to the STM8 pins to figure out how to write my own firmware.  Wish me luck!

(Luckily I ordered two of these units just for such an occasion, so I can use one and hack the other.)

Great find!

I was about to spend time mapping the pins, so this sure saves me effort.  In that short thread, the person who posted the pinout was talking about redoing the control board with an Arduino Nano.  I was kicking around that idea as a project.  It would be interesting to do, and I could implement many of the enhancements I wish it had.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Using an Arduino or any other board would be simpler than reverse engineering the STM8 board, but if I can pull it off I also get to use the screen and buttons.  I'm looking at working with the STM8 over the serial port and controlling the pins, and then figuring out how to drive the 7-segment displays with the '595.
 

Offline Rick Law (Topic starter)

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Using an Arduino or any other board would be simpler than reverse engineering the STM8 board, but if I can pull it off I also get to use the screen and buttons.  I'm looking at working with the STM8 over the serial port and controlling the pins, and then figuring out how to drive the 7-segment displays with the '595.

Yeah, but there is still a lot of reverse engineering to do before one can sit down and write a program.

The pinout in the link you posted is a good starting point - it confirms some of my suspicions/expectations about how it may work - but one still needs to find out how it actually works, and what exactly the slope and intercept are for sense-voltage vs. actual and sense-current vs. actual.  The numbers in the pinout diagram do not evaluate to the actual values (I tried immediately after seeing your posted link; I get a different a and b for my y = ax + b than in the pinout photo).  So calibration may affect those constants as well.

Until the reverse engineering is done to the point where we have a "guide book" / "user manual" level description, we are not ready to make an "Arduino replacement control board".  By user-manual level, I mean things such as "to control the output voltage, set PinX to Vout/456 + 123V".  And this level of detail is needed for all the different functions of the board, not just Vout.

There are also 3 unlabelled pins that need to be identified.  In the end, it could be more work than building from scratch - but it could be a fun thing to do.

Rick
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
There are still things to document, like the pinout from the MCU to the control points, and the unknown pins are indeed of interest.  I did verify that the two bottom pins on either side always show 3.15V and that this comes from the top board.  The last pin is a mystery, but as far as I can see it is always 0V.

The slopes should be calibrated per device in any case, so I'm not too worried about them.  I'll use some sensible defaults, starting with whatever checks out on my device(s), and then add calibration for the slope and offset in the software.  These calibrations would be needed even if you hooked up an Arduino to control it; that would need to be calibrated anyway.  I can see maybe using an Arduino to control it initially, to handle the bottom-board control and to learn from it, since reverse engineering the top board would be quite a bit more work.  But I also assume that driving the external pins from the STM8 should not be much harder than from an Arduino, since I only need to figure out how the wiring to them works.  I can leave the buttons and displays unused for now, do everything through the serial port, and then add the buttons, LEDs and displays later on.

A large part of the fun is the mystery itself :-)
 

Offline Rick Law (Topic starter)

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
There are still things to document, like the pinout from the MCU to the control points, and the unknown pins are indeed of interest.  I did verify that the two bottom pins on either side always show 3.15V and that this comes from the top board.  The last pin is a mystery, but as far as I can see it is always 0V.
...
A large part of the fun is the mystery itself :-)

The two bottom pins on either side?  I am confused.  Let's use pin numbers.
Attached is a picture with pin numbers.



Pins 8, 15, and 16 are the unidentified ones.  8 and 16 are the bottom pins.  Do you mean those are 3.15V?
From the board traces, I have:
Pin 8 goes to pin 13 of the MCU, the Port C3 pin.
Pin 15 connects to the TxD pin header pin and the MCU
Pin 16 connects to the RxD pin header pin and the MCU

Pin 8 and 16 (RxD) are the bottom pins.  [Struck out in a later edit: they cannot possibly be 3.15V, since RxD connects only to the RxD pin header and nowhere else; there is no visible connection between the MCU and TxD/RxD!]  No wonder I can't get any TTL-RS232 signal except on power off - I see a few p's coming out.  'p' is 0x70 = 0111 0000b.  I wonder how that could come out.  I am confused.  I have to dig further.

EDIT - Corrected by striking out the words above about there being no visible connection from pins 15 and 16 to the MCU.  I found the trace that connects the TxD and RxD header pins to the MCU at UART1, MCU pins 2 and 3.

Note that the board I have doesn't have silkscreen printed TxD and RxD.  Perhaps mine is a different revision.

Oh, an additional note - on the base board (with the voltage regulator and all) I see no visible trace going to pin 8, 15, or 16.  There is a possibility that the female pin header itself is covering up a thin trace, but peeking at the base of the pin header, I find that unlikely, because I can also see the edge of the hole there.
« Last Edit: February 19, 2015, 11:23:06 pm by Rick Law »
 

Offline neslekkim

  • Super Contributor
  • ***
  • Posts: 1305
  • Country: no
I found this picture somewhere

and a user manual and calibration guide; there is also some user-manual material here: http://henrysbench.capnfatz.com/henrys-bench/minghe-b3603-user-manual-table-of-contents/

Edit: Oh, that was the picture linked above :)
« Last Edit: February 11, 2015, 06:09:53 pm by neslekkim »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I just checked with a logic analyzer and it seems that pins 16 and 15 have some serial data going to them.  It generally seems that the decoding provided in the link I pointed to is incorrect for some (if not most) of pins 9-16.  I believe pin 14 is connected to the serial output of the 74HC595.  I still need to verify that, and it confuses me why it seems to be connected directly to pin 13.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
My logic analyzer seems to pick up something on almost all of the right-hand pins (9-16), and pin 16, which should be UART TX, doesn't seem to have any meaningful serial data on it.  I'm quite baffled there.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I've created a project on github to attempt documenting what I (we) found and maybe get some code going, either for the STM8 or even if just for an external Arduino: https://github.com/baruch/b3603
 

Offline neslekkim

  • Super Contributor
  • ***
  • Posts: 1305
  • Country: no
Great!
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
My logic analyzer seems to pick up something on almost all of the right-hand pins (9-16), and pin 16, which should be UART TX, doesn't seem to have any meaningful serial data on it.  I'm quite baffled there.

My logic analyzer wasn't connected to the ground of the system, it was random noise I was seeing  |O
 

Offline Rick Law (Topic starter)

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
My logic analyzer seems to pick up something on almost all of the right-hand pins (9-16), and pin 16, which should be UART TX, doesn't seem to have any meaningful serial data on it.  I'm quite baffled there.

My logic analyzer wasn't connected to the ground of the system, it was random noise I was seeing  |O

Haha...  That is funny.  I thought I was the only one who would do something like that.  Oh, well...

I looked at pins 15 and 16 on the controller board very closely and I can't see them going anywhere else [edit - added ELSE, because they do go to the Rx/Tx headers].  But I do consistently pick up a "pppx" or "ppp" on power down.  So there is some consistent noise there that looks a little like a TTL UART at 38400 baud.

The thing I don't like about the control voltage is the negative part.  It will complicate ADC conversion for the read, and the DAC for the write.

I was going to decode the pins later, but that link got me started sooner than expected.  What does surprise me is the lack of a pin for power-source V-, and/or not using that V- as the controller board ground.  The PowerOut V- sits 0.00x to 0.3xx volts (or perhaps more) above the PowerSource V-.  Using the latter as ground would have kept all the control voltages positive, making ADC/DAC conversions a lot easier.
« Last Edit: February 12, 2015, 09:36:28 am by Rick Law »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
A curious thing I just read is that if the offset on the measured value is the same as the offset on the Vref of the ADC, we should be fine.  So if the ground of the MCU is the same as that of Vout, then the offset we are seeing externally will not be seen by the STM8 MCU.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I erased the firmware on the STM8 now (I have another one and bought a third now, I managed to blow the 7-segment display by feeding it 17V input instead of 5V).

I managed to program the STM8 and got it to blink the red LED by toggling pin 11 (PB5) of the MCU.  Once I have full UART communication, it will be possible to probe all the pins to check the behavior and effect of the simple features (LEDs and output pins).
 

Offline ticpu

  • Newbie
  • Posts: 5
  • Country: 00
From Github I could read:
Pin 4: Iout control, 970mV/A + 140mV
Pin 5: Vout control, 72mV/V + 42mV

In fact, the STM8 chip outputs 2 square waves for control:


Minimum ~20/1000 µsec for pin 4
Minimum ~7/1000 µsec for pin 5

Maximum ~800/1000 µsec for pin 4
Maximum ~840/1000 µsec for pin 5

Oscilloscope: https://drive.google.com/open?id=0B8iQKMsmBSa1d1J6aVFPNHVOOHM&authuser=0

This is most likely what is adjusted in the calibration menus.

Edit: What I said for maximum was completely wrong.
« Last Edit: February 14, 2015, 06:32:00 pm by ticpu »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I would assume that they use PWM and feed it to a capacitor for integration, to generate the actual voltage fed to the op-amp network.

Thanks for looking into this!
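
If the filtered-PWM theory is right, setting the output voltage would reduce to something like the helper below.  Everything in it is an assumption pulled from this thread - the ~3.3V logic level, the 72mV/V + 42mV scaling from the pinout photo, and a 0-1000 count timer period - not anything read out of the real firmware.

Code:
#include <cstdint>

// Hypothetical helper: timer compare value for a requested output voltage.
// Assumes the PWM is RC-filtered to a DC control voltage, so Vctrl = duty * Vdd,
// and that the opamp network expects Vctrl = 0.072 * Vout + 0.042 (per the pinout notes).
uint16_t vout_to_pwm(double vout_volts, uint16_t timer_period, double vdd = 3.3) {
  double vctrl = 0.072 * vout_volts + 0.042;   // control voltage for the requested output
  double duty  = vctrl / vdd;                  // fraction of the PWM period
  if (duty < 0.0) duty = 0.0;                  // clamp to the timer's range
  if (duty > 1.0) duty = 1.0;
  return static_cast<uint16_t>(duty * timer_period + 0.5);
}

// Example: vout_to_pwm(5.0, 1000) gives roughly 122 counts for a 5.00V output.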
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Now I'm truly baffled...  The initial mapping of MCU pins to left-connector pins was done by probing with a continuity tester.  I'm now trying to validate that with some code running on the STM8S003F3 chip, and it doesn't check out.  The strange thing is that I toggle PB6 and it turns on pin 7 of the left connector - the odd part being that, according to the STM8 specs, there is no PB6 pin on this chip in the first place.

EDIT: I had an off-by-one on the ports. Add this to my account of bugs on the road. But at least things seem to check out now.
« Last Edit: February 14, 2015, 09:46:50 pm by baruch »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
ticpu, can you see what happens on pin 8 of the left connector?  We have no decoding for it and it is wired to the MCU, so I wonder what can be picked up from it - whether it is a PWM or some other signal.  Especially helpful would be to know if it changes in any normal mode of operation.
 

Offline bal00

  • Newbie
  • Posts: 2
Hi. I'm the person who did the basic pin mapping that was posted here earlier. Happy to see that someone is doing something with this module. Seems like a useful little board.

Just to clear up a possible point of confusion:

The thing I don't like about the control voltage is the negative part.  It will complicate ADC conversion for the read, and the DAC for the write.

I was going to decode the pins later, but with that link, it got me started sooner than expected.  What does surprise me is the lack of a pin for Power Source V- and/or not using that V- as controller board ground.  The PowerOut V- seats 0.00x to 0.3xxxVolts (or perhaps more) above PowerSource V-.  Using that as ground would have kept all the control voltage positive making ADC/DAC conversions a lot easier.

The control and sense voltages are all positive. Might be hard to read on the image, but it says ~0.97V/A, not -0.97V/A. Wasn't able to nail down a more precise constant, possibly because there's some non-linearity involved (due to the shunt temperature perhaps?).

Pin 15 and 16 are indeed only connected to RxD and TxD as far as I can tell, and pin 8 is a mystery to me as well. Note that the manufacturer uses the same controller board on different DC-DC converters (B3603, B3008, BST400), so maybe pin 8 is only used on the boost converter module, or maybe they just had an extra pin and wanted to future-proof the design for other products.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I found that two of the buttons are connected to PC7 and PD1.  I believe the other two are connected to the UART pins PD6 and PD7, which makes them useless for my needs, since I want the serial port more than the buttons.

Ideas for UX with only two buttons would be appreciated :-)

Also, ideas for the power supply itself are welcome as well. I'm starting to work on the code for the alternative STM8 firmware and have put a serial protocol definition at https://github.com/baruch/b3603/blob/master/stm8/PROTOCOL.md and my feature wishlist in the wiki at https://github.com/baruch/b3603/wiki

I still haven't done the actual power supply control, as I need to learn exactly how to do the PWM controls on the STM8.  There are examples around, so it shouldn't be hard to get it working; making it work well enough, with the power supply stable enough, will be the real challenge.
 

Offline ticpu

  • Newbie
  • Posts: 5
  • Country: 00
ticpu, can you see what happens on pin 8 of the left connector?  We have no decoding for it and it is wired to the MCU, so I wonder what can be picked up from it - whether it is a PWM or some other signal.  Especially helpful would be to know if it changes in any normal mode of operation.

After trying many operations, it just acts as a "Power Good" always-on pin. It does not seem to be mapped anywhere on the bottom board, am I right on this?

It is always on except when power goes below the CPU threshold and the display shows "0     " with output disabled.
Operations tried:
- Bootloader menu
- Switching all 3 bootloaders options on or off
- Switching the output on or off
- Switching the output from CV to CC and vice-versa
- Going in calibration menu
 

Offline ticpu

  • Newbie
  • Posts: 5
  • Country: 00
I found that two of the buttons are connected to PC7 and PD1.  I believe the other two are connected to the UART pins PD6 and PD7, which makes them useless for my needs, since I want the serial port more than the buttons.

I know that this will be a completely new program, but let's imagine we were modifying the current program for this example.

I'd suggest having the 4 buttons operational like in the current program.  I would add an option to the boot menu (the one accessible by holding OK on boot) to start automatically in serial-controlled mode.  In this mode, pressing the OK or Set button (or whichever ones are not mapped to serial) would return to button-controlled mode, and to return to serial-controlled mode you would press and hold Set and OK.

In this way, it would be possible to modify the settings both from serial mode and from button-controlled mode.

Also, ideas for the power supply itself are welcome as well. I'm starting to work on the code for the alternative STM8 firmware and have put a serial protocol definition at https://github.com/baruch/b3603/blob/master/stm8/PROTOCOL.md and my feature wishlist in the wiki at https://github.com/baruch/b3603/wiki

What would you think of having the protocol support GET/SET/SHOW in parallel with action keywords?  I think it would make interfacing with other devices a bit easier, since you would not have to parse the output as much.

Then having aliases/shortcuts for case-insensitive, fast-to-type commands like "seti 0.33" would be a piece of cake.  Plus, having a GET/SET interface could allow a COMMIT command if someone implements it.  SHOW would list all available settings, and HELP all available actions.

Example session, all lines terminated by \r\n, < and > represent serial data direction.

> SET max_voltage=10.1
> SET max_amperage=0.41
> SET output_enabled=1
> GET max_voltage
< 10.10
> GET max_amperage
< 0.410
> GET output_voltage
< 9.231
> SHOW
< max_voltage=10.10
< max_amperage=0.410
< output_enabled=1
< output_voltage=9.231
< other_settings=...
< \r\n (empty line)
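
A minimal sketch of what the parsing could look like on the firmware side is below.  The variable names and the fixed-point millivolt/milliamp storage are only illustrative (the real protocol is whatever ends up in baruch's PROTOCOL.md); it is written in plain C style so it would port to sdcc with the printf calls swapped for the UART routines.

Code:
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

// Illustrative settings store; a real firmware would keep these in fixed point.
static struct {
  uint16_t max_voltage_mv;    // e.g. 10100 for 10.1 V
  uint16_t max_amperage_ma;   // e.g. 410 for 0.41 A
  uint8_t  output_enabled;
} cfg = {5000, 1000, 0};

static void print_milli(uint16_t m) {
  printf("%u.%03u\r\n", (unsigned)(m / 1000), (unsigned)(m % 1000));
}

// Handle one received line such as "SET max_voltage=10.1" or "GET output_enabled".
static void handle_line(char *line) {
  if (strncmp(line, "SET ", 4) == 0) {
    char *eq = strchr(line + 4, '=');
    if (!eq) { printf("ERR\r\n"); return; }
    *eq = '\0';
    double val = atof(eq + 1);
    if      (strcmp(line + 4, "max_voltage") == 0)    cfg.max_voltage_mv  = (uint16_t)(val * 1000 + 0.5);
    else if (strcmp(line + 4, "max_amperage") == 0)   cfg.max_amperage_ma = (uint16_t)(val * 1000 + 0.5);
    else if (strcmp(line + 4, "output_enabled") == 0) cfg.output_enabled  = (val != 0);
    else { printf("ERR unknown setting\r\n"); return; }
    printf("OK\r\n");
  } else if (strncmp(line, "GET ", 4) == 0) {
    if      (strcmp(line + 4, "max_voltage") == 0)    print_milli(cfg.max_voltage_mv);
    else if (strcmp(line + 4, "max_amperage") == 0)   print_milli(cfg.max_amperage_ma);
    else if (strcmp(line + 4, "output_enabled") == 0) printf("%u\r\n", (unsigned)cfg.output_enabled);
    else printf("ERR unknown setting\r\n");
  } else if (strcmp(line, "SHOW") == 0) {
    printf("max_voltage=");  print_milli(cfg.max_voltage_mv);
    printf("max_amperage="); print_milli(cfg.max_amperage_ma);
    printf("output_enabled=%u\r\n\r\n", (unsigned)cfg.output_enabled);
  } else {
    printf("ERR\r\n");
  }
}

int main(void) {
  char a[] = "SET max_voltage=10.1";
  char b[] = "GET max_voltage";
  handle_line(a);   // prints OK
  handle_line(b);   // prints 10.100
  return 0;
}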


I'm willing to write some code to implement this, if it makes sense! :)

Maybe I missed it, but how do you upload code to the STM8 device?  I don't have a dev environment set up for those devices at the moment.

Edit: I forgot about read-only variables; they could just be prefixed with something so the user knows they are read-only/status data.
« Last Edit: February 15, 2015, 09:54:34 pm by ticpu »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
After trying many operations, it just acts as a "Power Good" always-on pin. It does not seem to be mapped anywhere on the bottom board, am I right on this?

I've so far failed to trace the bottom board, so I can't tell much.  It does seem to be generated by something on one of the boards, as it is there even when the STM8 MCU doesn't drive anything onto it.  I tried to use it as a power-good signal in the code, but I can't manage to output even one character before the board loses power.  I will have to check this when I can actually output power from the board.

It does sound like a good option for it to be a power good signal.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I found that two of the buttons are connected to PC7 and PD1.  I believe the other two are connected to the UART pins PD6 and PD7, which makes them useless for my needs, since I want the serial port more than the buttons.

I know that this will be a completely new program, but let's imagine we were modifying the current program for this example.

I'd suggest having the 4 buttons operational like in the current program.  I would add an option to the boot menu (the one accessible by holding OK on boot) to start automatically in serial-controlled mode.  In this mode, pressing the OK or Set button (or whichever ones are not mapped to serial) would return to button-controlled mode, and to return to serial-controlled mode you would press and hold Set and OK.

In this way, it would be possible to modify the settings both from serial mode and from button-controlled mode.

Sounds likely; switching between serial enabled/disabled with the buttons could work, I guess, and makes more sense than a mostly-serial device.

Also, ideas for the power supply itself are welcome as well. I'm starting to work on the code for the alternative STM8 firmware and have put a serial protocol definition at https://github.com/baruch/b3603/blob/master/stm8/PROTOCOL.md and my feature wishlist in the wiki at https://github.com/baruch/b3603/wiki

What would you think of having the protocol support GET/SET/SHOW in parallel with action keywords?  I think it would make interfacing with other devices a bit easier, since you would not have to parse the output as much.

Then having aliases/shortcuts for case-insensitive, fast-to-type commands like "seti 0.33" would be a piece of cake.  Plus, having a GET/SET interface could allow a COMMIT command if someone implements it.  SHOW would list all available settings, and HELP all available actions.

I'm trying to get something useful and was planning to do most of the serial control through an application (most likely sigrok) or custom scripts. I'd like to have something that is rather simple to parse and I quickly implemented most of that protocol today. I do like the idea of a COMMIT command.

Of course, if you want to implement the parsing code, I don't really mind about the protocol itself.  I would even appreciate support for the arrow keys for completely serial control, but I don't find it important enough to implement myself.  I also hope the flash and RAM will be sufficient, but for now I don't see space as a real constraint.


Maybe I missed it, but how do you upload code to the STM8 device?  I don't have a dev environment set up for those devices at the moment.

I got myself an ST-Link V2 device from AliExpress for $3.50 and a CP2302 serial adapter to do the serial communications.  I tried using stm8flash, but it stopped working after I used the official ST programmer once; now I can only use the official programmer software, which is a hassle but not too bad since it runs in a Windows VM on my Linux machine.

I do my compilation with sdcc and makefiles.
 

Offline flex

  • Contributor
  • Posts: 25
Just thought I'd leave this here. (Can't guarantee correctness ;) )

Edit: There is a mistake because of the 50mOhm resistor.  I will post the full schematic (of the bottom board) later.
« Last Edit: February 17, 2015, 01:35:19 pm by flex »
 

Offline plazma

  • Frequent Contributor
  • **
  • Posts: 472
  • Country: fi
    • Homepage
Is the original firmware protected? Can it be read and disassembled?
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Is the original firmware protected? Can it be read and disassembled?

It is protected; that was the first thing I tried, but the read-out protection is enabled and nothing can be read by normal means.  That's why I resorted to reverse engineering and writing my own firmware.

 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Just thought I'd leave this here. (Can't guarantee correctness ;) )

Nice!

I can see from that that the on/off signal goes directly to the LM2596.  I can't find anything for pin 8 on the left connector, which is still somewhat of a mystery.  As noted above, we suspect it is a power-good signal, but in my attempts to read it to detect power failure, the MCU doesn't notice it soon enough to do anything with it.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I got PWM to work on the STM8 now.

I'm somewhat fumbling on this topic, but what would be the best way to tie together the PWM, the voltage/current settings and the output sense?

Should I run a feedback loop between the outputs and the PWM?  Or find some calibration and let it be?

Anyway, I've got all the roadblocks removed for serial control over the B3603.  I still have no idea how to drive the display through the 74HC595s; if someone can figure out the pinout there, I'd appreciate it a lot.
 

Offline DuckDaffy

  • Newbie
  • Posts: 1
Hi all! I'm new here.
I was thinking about new software for B3603 too.
Great to find this topic. :)

I dream about:
* rotary encoder for setting U/I,
* auto-off function when overcurrent.

I hope I can help somehow; for example, I have 2x B3606, 1x B3603 and also 1x BST400, so I can test SW on this HW. ;)
 

Offline neslekkim

  • Super Contributor
  • ***
  • Posts: 1305
  • Country: no
Maybe some of those use the pins that seem to be unused on the B3603?
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I dream about:
* rotary encoder for setting U/I,
* auto-off function when overcurrent.

I hope I can help somehow, for ex. I have 2x B3606, 1x B3603 and also 1x BST400, so I can test SW with these HW. ;)

I'm currently doing only software work for this; I can think about hardware modifications for the user interface later. Some people like the buttons and some prefer a rotary encoder, and I can also imagine an LCD, but that's going to be a later modification. For now I want to get the software to work, and I'm getting close :-)

Do you have an STLinkV2 programmer? If you want to play with the software you'll need one to program the devices.

I would appreciate it a lot if you could test the measurements we have for the B3603 on the other devices; presumably you have the B3606 and the BST400, which I don't have. What voltages do they show on the left connector pins for different settings? What are the PWM values for the control pins?
 

Offline neslekkim

  • Super Contributor
  • ***
  • Posts: 1305
  • Country: no
How does one read the firmware from these chips?

I know that it was locked, but are we sure all of them are locked? Maybe some slipped through, or have been updated in various versions? There are lots of sellers of these, and I have some of these versions as well.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
To know if it is locked, you try to read it. You hook it up via the SWIM connector (I soldered a header onto the bottom pins) and use an STLinkV2 device to read it (either stm8flash on Linux or STVP on Windows). So far I have only tried one unit and it was locked out. I have another one, but I need to solder the header to it first. I suspect they are all going to be locked out; it seems like such a trivial step in the manufacturing process that I can't see it being missed in some batches.

But feel free to try it out and report if you succeed! It would be really nice to see what they do in there and be able to improve on it rather than start from scratch as I currently do.
 

Offline neslekkim

  • Super Contributor
  • ***
  • Posts: 1305
  • Country: no
The four pins next to the buttons?
Cool, I will try that. I have both the clone STLinkV2 and the original STLinkV2. I'm going to try to find all my units; I have at least two of them, and one more in the post, ordered from DX, Banggood, and some eBay sellers.
 

Offline flex

  • Contributor
  • Posts: 25
I just drew the schematic of the bottom board.  ;D
(Please report any error...)

Update 1: forgot C8 (attachment updated)
Update 2: Added IN- to GND wire
Update 3: The resistor values no longer use the SMD codes
Update 4: Clarify schematic. Use VCC instead of pin 14
Update 5: Reorder and add some more labels to clarify
Update 6: Enhanced readability, corrected some groups
Update 7: Top board. See newer post.
« Last Edit: February 19, 2015, 01:01:41 am by flex »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I just drew the circuit of the bottom board.  ;D
(Please report any error...)

Amazing!

I wonder how the voltage to the top board stays constant when OUT+ changes; the schematic shows a voltage divider, from what I understand, and nothing more. The top board always gets about 5V.

Any ideas about the resistor values? What about the voltage divider for the input? We can find it empirically but getting a sense for the voltage divider values would be a nice validation.

I'm collating all of the information on the B3603 in my GitHub repository; I hope it is OK that I use it like that. I'll be sure to give credit for all the work you folks are doing; I couldn't have progressed so fast without all that information! If you don't want the files/documents added to my git repo, just let me know and I'll remove them.
« Last Edit: February 17, 2015, 03:15:42 pm by baruch »
 

Offline flex

  • Contributor
  • Posts: 25
Quote
Any ideas about the resistor values?
I think all resistor values are in my schematic. Or what do you mean?

Quote
I wonder how the voltage to the top board stays constant when the out+ changes
Not sure if I got the question right, but the 5V is provided by the XL1509 DC-DC converter.
« Last Edit: February 17, 2015, 03:16:54 pm by flex »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I'm not experienced with electronics, so the numbers on the resistors don't mean much to me. I expected to see values like 10K and 15K, but the numbers on the resistors are in the hundreds. Are these just codes for the actual value?
 

Offline flex

  • Contributor
  • Posts: 25
Quote
Are these just codes for the actual value?
Yes  ;), you might want to google for "smd resistor code"

I just updated the values. They might be easier to read this way. (see orig post)
« Last Edit: February 17, 2015, 03:40:16 pm by flex »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
The input sense voltage divider is made of a 15K and a 1K resistor, so the factor is 0.0625, which is pretty much exactly what bal00 figured out from his work. It's nice to see things get validated between different parts.

One thing though: this doesn't seem to work out when I read the values from the ADC. The factor for my device is 53.67 rather than 62.5, and I'm mostly wondering what mistake in my code brought this about...
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Quote
I wonder how the voltage to the top board stays constant when the out+ changes
Not sure if I got the question right, but the 5V are provided by the XL1509 DC DC converter.

I saw that part, and it feeds VCC into pins 13 and 14, but there is also an OUT+ net that connects to pin 14 and then to the opamp on the current sense side.
 

Offline flex

  • Contributor
  • Posts: 25
I just modified the schematic to use the power label "VCC" instead of pin 14 to clarify that. But anyway the constant VCC voltage is regulated by XL1509.

Further, it looks like the 50mΩ shunt is used to measure the current and is biased by R7 and R10. Don't ask me what R18 is doing there; I can't see any reason for it.

Quote
One thing though, this doesn't seem to work out when I read the values from the ADC, my factor for my device is 53.67 rather than 62.5, I'm mostly wondering what mistake I made in my code that brought this...
I have yet to take a look at the code, but be aware that VCC is only approximately 5V; the real value is 1.23*(1+3/1)=4.92V (just in case you use VCC as the reference). I know this won't explain your difference.
« Last Edit: February 17, 2015, 04:17:53 pm by flex »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
The MCU indeed uses VCC as the reference, as far as I know. I believe there is a way to measure Vref with the ADC; I haven't tried that yet. Not sure what to expect from it either.

 

Offline flex

  • Contributor
  • Posts: 25
Just took a look at your ADC problem. I can't see how you derived these formulas in read_state. Maybe you could explain them?

However I found the following in the Reference manual for STM8S and STM8AF microcontroller families:
Code: [Select]
VREF-  This input is bonded to VSSA in devices that have no external VREF- pin
VREF+  This input is bonded to VDDA in devices that have no external VREF+ pin
VSSA   This input is bonded to VSS in devices that have no external VSSA pin
VDDA   This input is bonded to VDD in devices that have no external VDDA pin

That way I would expect 0x00 to be 0V and 0x3ff=1023 be VCC and would do the following calculation to obtain the voltage from the adc value (but I never used the STM8 adc before):
state_vin = val * 62.5 * VCC / 1023

But I can't test this at the moment, because I only have the preprogrammed STM8 and don't want to erase it (I should order an IC for testing).

Btw. I assume you don't use the "stm8 standard peripheral library" because of the restrictive licensing? Too bad that it isn't BSD licensed as the STM32 HAL and CMSIS libraries.
« Last Edit: February 17, 2015, 06:10:52 pm by flex »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Just took a look at your ADC problem. I can't see how you derived these formulars in read_state. Maybe you could explain them?

I've used a linear approximation: I measured the input voltage at two points and used the ADC output and the measured input voltage to find the A and B values of the equation Y=A*X+B. I believe I initially tried to use the 62 multiplier and that resulted in wrong numbers.
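In code, that two-point fit is roughly the following (a sketch; the struct and function names are made up for illustration, and it assumes the two ADC readings differ):
Code: [Select]
/* Two-point linear fit, mv = A*adc + B, in fixed point (A scaled by 1024). */
#include <stdint.h>

typedef struct {
    int32_t a_q10;   /* slope, scaled by 1024 */
    int32_t b_mv;    /* offset in millivolts */
} linear_cal;

static void cal_from_two_points(linear_cal *c, uint16_t adc1, int32_t mv1,
                                uint16_t adc2, int32_t mv2)
{
    c->a_q10 = ((mv2 - mv1) * 1024) / ((int32_t)adc2 - (int32_t)adc1);
    c->b_mv  = mv1 - (c->a_q10 * adc1) / 1024;
}

static int32_t cal_apply(const linear_cal *c, uint16_t adc)
{
    return (c->a_q10 * adc) / 1024 + c->b_mv;
}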

Quote
That way I would expect 0x00 to be 0V and 0x3ff=1023 be VCC and would do the following calculation to obtain the voltage from the adc value (but I never used the STM8 adc before):
state_vin = val * 62.5 * VCC / 1023

I get the value 167 from the ADC when I have the input at 8.91V; if I use your formula I get:
167 * 62.5 * 4.92  / 1023 = 50.19

If I drop the 4.92 value I get 10.2

Both of these are wrong (should be 8.91)
 

Offline flex

  • Contributor
  • Posts: 25
 |O I just realized, that my formula has the wrong multiplier (but that won't solve the problem :-DD)

So we have 9.81V Vin => 9.81V/16=0.613V as the output of the voltage divider (can you verify that by measuring pin 3 to GND? And could you also measure VCC?)

Now I would calculate adc=0.613V/4.92V*1023=127 (but you got 167, so I'm off the track)

However the formula I meant is:
adc * VCC  / 1023 * 16
(which would give the wrong result: 167 * 4.92 / 1023 * 16 = 12.85V)

Would you mind posting some more samples (input voltage Vin, voltage at pin 3 (voltage divider), measured ADC value, VCC)?
« Last Edit: February 17, 2015, 09:02:28 pm by flex »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
It is 8.91V in the numbers above and not 9.81V...

I've switched to trying to get the PWM working, so the code is currently not safe to run on the full module; I'm using the top board only for this part of the development. I'll get back to the ADC once I get the PWM working well enough to produce even a semi-fixed voltage output.
 

Offline flex

  • Contributor
  • Posts: 25
I think I found my mistake: the 4.92V VCC of the lower board is not the supply voltage of the STM8 on the top board.
I hadn't taken a look at the top board and assumed it would run at 5V, but now I see the 3.3V LDO.

So I get:
adc * 3.3  / 1023 * 16 = 167 * 3.3  / 1023 * 16 = 8.619V
That is much closer to 8.91V :D (since the devices will be calibrated, this could be in spec ;))
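In integer math (millivolts) that conversion would be something like this, still assuming the 3.3V reference and the 1:16 divider before any per-unit calibration:
Code: [Select]
/* Corrected Vin readback in millivolts; 3.3V reference and 1:16 divider assumed. */
#include <stdint.h>

static uint16_t vin_mv_from_adc(uint16_t adc)   /* adc: 0..1023 */
{
    /* 3300mV * 16 = 52800; adc=167 gives 8619mV, matching the calculation above */
    return (uint16_t)(((uint32_t)adc * 52800uL) / 1023u);
}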

btw: My STM8 is also protected. I'll wait until I get another µC (don't want to lose the original program).

btw2: I just did another schematic update.
« Last Edit: February 18, 2015, 02:37:32 am by flex »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I'm not quite sure how the current loop works. Its input comes from VCC and OUT-; from the very little I know, I've only seen a shunt placed on OUT+ and the voltage drop across it measured. I can't figure out how it gets done here.

Would appreciate if someone could point me to some reference on the method used here.

EDIT: It looks like VCC is divided down to almost nothing; I still can't understand what it does there. And I still don't understand why the opamp input for the current sensor isn't taken between OUT- and the shunt to measure the voltage drop across the shunt.
« Last Edit: February 18, 2015, 06:59:06 am by baruch »
 

Offline eas

  • Frequent Contributor
  • **
  • Posts: 601
  • Country: us
    • Tech Obsessed
Don't want to distract from this excellent reverse engineering discussion, but I thought people in this thread might be interested in something I stumbled on. I thought I spotted a familiar logo in a fuzzy product listing on eBay. Another photo had a clearer view. It was the MingHe logo, on a 10A switching supply that can do up to 120V. A rather big step up from the 6A sibling of the B3603.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
It is a big brother to the small ones. It's also priced much higher. The one thing I like about the B3603 is that it's dirt cheap at $12, and if I can get serial communication with it I'll have the cheapest programmable power supply there can possibly be :-)
 

Offline flex

  • Contributor
  • Posts: 25
@eas and it is 10 times more expensive ;) but interesting

@baruch
I think I got an idea what is going on with the current regulation:

Obviously there is a voltage Vshunt across the shunt (5mV per 100mA). Let's assume this voltage is constant (R10 is 10k, so this isn't too unreasonable).
So the voltage divider divides the voltage (VCC-Vshunt). With VCC=5V we get:
Code: [Select]
I      Vshunt   Vout to Vshunt
100mA  0.005V   0.0099700599V
1A     0.05V    0.0098802395V
This is a more or less constant 10mV. That way (Vout to GND)=10mV+Vshunt.

That means Vshunt gets an offset of 10mV. This value gets fed into the non-inverting amplifier (ignore R18; the input impedance is much higher) and is amplified by a factor of 16.

That means we can measure at pin 1 (ignoring R25, R28 because we assume infinite input impedance again):
16*(10mV+Vshunt)=160mV+16*(I*0.05)=160mV+0.8*I
That is reasonably close to bal00's formula: 140mV+0.97*I (there are real-world tolerances and I did some idealization in the calculation).

Now that output goes to the PI controller.

Update: I just did an LTspice simulation to verify the calculation. Looks pretty good (see attachment).
Update: Did a more exact calculation later in the thread
« Last Edit: February 20, 2015, 05:03:49 pm by flex »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
@flex, Thanks for the analysis!

I'm getting some progress on the PWM front.

I found that the total flash area is 2KB, so it's going to be a very tight fit, and when I switched to a more verbose protocol the additional strings bloated the code. I think I'll need to go back to a slimmed-down protocol to save the space for real code, and assume an external utility to communicate with the unit and provide the verbose interface.
 

Offline flex

  • Contributor
  • Posts: 25
2KB is quite limiting. For me a binary protocol would be enough. One could use an interface tool.
Have you tried to use code optimization (--opt-code-size and increasing --max-allocs-per-node)?

Btw, I just updated the schematic again (nothing too major).
« Last Edit: February 18, 2015, 04:13:59 pm by flex »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Optimizing for code size helped a bit. I also use some floating-point calculations to save the time of working out fixed-point arithmetic, and that is costly in CPU time and flash size, so I will need to get rid of it soon. But for now I want to see results more than I want more features.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I got some basic PWM working, but I think I burned my unit somehow: it is not putting out any voltage, and when I use another unit to feed it power, that unit says it is current limited even though it didn't reach the current limit. Still no idea which of the units is busted.

I guess I'll try tomorrow with a fresher mind.


EDIT: I couldn't hold off; I swapped the heads of my two units, and the top board is working while the bottom board seems to malfunction. I got the PWM to generate a voltage; I still didn't try to put a load on it.
« Last Edit: February 18, 2015, 07:47:50 pm by baruch »
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
...
The thing I don't like about the control voltage is the negative part.  It will complicate ADC conversion for the read, and the DAC for the write.
...

The control and sense voltages are all positive. Might be hard to read on the image, but it says ~0.97V/A, not -0.97V/A. Wasn't able to nail down a more precise constant, possibly because there's some non-linearity involved (due to the shunt temperature perhaps?).


Wow, so much has been done in the little time I was busy finishing up another project.  This is good stuff!

Glad someone read this right:  ~0.97 and not -0.97.  I need to upgrade my glasses.

The shunt temperature makes it very difficult.  When I tested its reaction to heat, even with a 2.5" PC-case fan blowing (at approx 68-70F), the cold reading (first power on) is still measurably different from the fan-cooled reading.  Fan cooling does keep it more or less constant, just at a higher number than cold.  Regardless, I decided to wait on the slope/intercept constants until I get my 75ppm replacement shunt.  I will report back on how that works out, hopefully fairly soon.
 

Offline ticpu

  • Newbie
  • Posts: 5
  • Country: 00
Ouch, that 2KB limit will really impair the plan I had to put a CLI on these :-/   I'm still waiting for a second board + programmer to check this out, but you're right, a binary or very-short-commands protocol will be the only way to get something going on these.

Edit: and on my side, the fact that this is mostly programmable on Windows is another problem for me.
« Last Edit: February 19, 2015, 12:52:07 am by ticpu »
 

Offline flex

  • Contributor
  • Posts: 25
Just a quick update: I reverse engineered the top board. The buttons are really strange :-//. Maybe someone could confirm (or refute) that part of the schematic? I don't have an oscilloscope right now, but I would suppose that the GPIO pins are toggled to read all the buttons...

Another thing: I just checked the datasheet, because 2K is very limiting, and it says the STM8S003F3 has 8K flash. (See the STM8S003K3/STM8S003F3 datasheet, page 1 or page 9.)
Actually, the only 2K-flash STM8 is the STM8L101F1 (see http://www.st.com/web/en/catalog/mmc/FM141/SC1244).

@baruch: how did you determine the 2K flash you were talking about?

btw: I was able to use stm8flash under Linux with my STLinkV2 clone. (But the readout failed due to the protection.)

Update: the bottom board had lost its labels. I added them again.
Update: see newer post for updated schematic
« Last Edit: February 20, 2015, 05:02:46 pm by flex »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Edit: and on my side, the fact that this is mostly programmable on Windows is another problem for me.

I could program it from Linux with stm8flash. If you can help with figuring out how to remove the ROP bit with stm8flash, your road should be clear to work in Linux/Mac as you please. I believe that the first use of the Windows program upgraded the firmware such that stm8flash no longer works with the new firmware. I plan on ordering another programmer to check that theory.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Just a quick update: I reversed engineered the top board. The buttons are really strange :-//. Maybe someone could (dis)confirm that part of the schematic? I don't have an oscilloscope right now, but I would suppose that the gpio pins are toggling to read all buttons...

At least the two left buttons (SET, DOWN) triggered separate pins on the STM8. One of the others (I forgot already) triggered one of the pins of the UART. They were going from high to low when I pressed the button.

Quote
Another thing: I just checked the Datasheet, because 2K is very limiting and it says: STM8S003F3 has 8K flash. (See STM8S003K3 STM8S003F3 Datasheet page 1 or page 9)
Actually the only 2K Flash STM8 is the STM8L101F1 (see http://www.st.com/web/en/catalog/mmc/FM141/SC1244).

@baruch: how did you determined the 2K flash you were talking about?

My bad, the flash is 8K, I somehow misread the code size when it failed.

Quote
btw: I was able to use stm8flash under linux with my stlinv2 clone. (But the readout failed due to the protection).

There is a report that stm8flash can't clear the ROP, which is why I used Windows, and since then stm8flash doesn't work for me. I would suggest finding a way to erase the ROP with stm8flash if possible. I'm just not up to reverse engineering the USB protocol of the new firmware.
« Last Edit: February 19, 2015, 05:36:21 am by baruch »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Just a quick update: I reversed engineered the top board. The buttons are really strange :-//. Maybe someone could (dis)confirm that part of the schematic? I don't have an oscilloscope right now, but I would suppose that the gpio pins are toggling to read all buttons...

At least the two left buttons (SET, DOWN) triggered separate pins on the STM8. One of the others (I forgot already) triggered one of the pins of the UART. They were going from high to low when I pressed the button.

A few questions:

1. Pin 10 of the MCU goes to the two LEDs; how are they controlled separately? We only ever see one of them turned on at a time.

2. Pins 11 & 12 of the MCU both come from VCC and are driven from outside, but from what I've found so far, pin 11 seems to be the CV/CC indication and pin 12 is output enable.

3. I'd have to check out that complex button setup. It seems like it will require constantly changing the pin settings from input to output to probe each pin in turn; if done fast enough that can work. It is also possible that what I saw on the UART lines was just noise and not a real signal, but I didn't probe that yet, since I use the UART for debug and don't control the LEDs or LCD yet, which would allow a better probe into that.

 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
@flex, one more thing.

That's amazing work you did with the reverse engineering; if you do a video of that work in Dave's reverse-engineering style, I'd be sure to watch it!

Some of the traces go under a chip, others are completely obscured, and I couldn't for the life of me keep track of where each one goes. The best I could hope for was to use my cheap multimeter to buzz points for continuity.
 

Offline plazma

  • Frequent Contributor
  • **
  • Posts: 472
  • Country: fi
    • Homepage
A few questions:

1. Pin 10 of the MCU goes to the two leds, how are they going to be controlled separately? We do see only one of them turning on at a time.
Drive the pin low to enable the high side led. Drive the pin high to enable the low side led.

Nice switch and led tricks to reduce the needed pin count.
 

Offline flex

  • Contributor
  • Posts: 25
Quote
One of the others (I forgot already) triggered one of the pins of the UART. They were going from high to low when I pressed the button.
Quote
3. I'd have to check out that complex button setup, seems like it will require changing the pinout settings constantly from input to output to probe each pin at a time, if done fast enough it can work and I can think that it is possible that what I saw on the UART lines was just noise and not a real signal but I didnt probe that yet since I use the UART for debug and didn't control leds or lcd yet to enable a better probe into that.
I just checked the buttons again and couldn't find any connection to RX/TX. The button solution looks a little crazy. I also think that changing the pinout settings is the only option to read them.

Quote
2. Pins 11 & 12 of the MCU, both come from VCC and driven by outside but from what I found so far, pin 11 seems to be CV/CC indication and pin 12 is output enable.
I think you forgot the question here?
Anyway, I think pin 11 of the MCU (pin 7 of the pin header) is driven by the bottom board. There is no reason for R18 with this bottom board; maybe there would be with an open-collector output. To prove my point, I just desoldered R18 and it still works.
Pin 12 of the MCU (pin 6 of the pin header) is driven by the top board to enable the output. Since it is active low, the R16/R17 pull-up ensures that the device is off even without the MCU. As soon as you pull it down, the output and the LED are enabled.

Quote
That's an amazing work you did with the reverse engineering, if you do a video of that work in Dave's reverse engineering style I'd be sure to watch it!

Some of the lines go under a chip, others are completely obscured and I couldn't for the life of me keep a straight track of where each line goes at any time. The best I could hope for was to use my cheap multimeter to buzz points for connectivity.
There is no secret in my reverse engineering. All I did was combine a scan of both PCB sides. After that I tried to draw all the traces of the bottom layer onto the top layer. Since some parts weren't visible, I made some educated guesses and tested them with my multimeter (beep). Then I transferred that to a CAD program (KiCad). You just need some patience, and the better your guesses the faster you are.  >:D E.g. the buttons took forever, because I didn't expect that strange circuit for them.
« Last Edit: February 19, 2015, 10:24:29 am by flex »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il

Quote
2. Pins 11 & 12 of the MCU, both come from VCC and driven by outside but from what I found so far, pin 11 seems to be CV/CC indication and pin 12 is output enable.
I think you forgot the question here?
Anyway, I think pin 11 of the mcu (pin7 of the pin header) is driven by the bottom board. There is no reason for R18 with this bottom board, maybe there is one with an open collector output. To proof my point, I just desoldered the R18 and it still works.
Pin 12 of the mcu (pin6 of the pin header) is driven by the top board to enable the output. Since it is active low the R16, R17 pull up ensures that the device is off even without the MCU. As soon as you pull it down, the out and the led is enabled.

About pin 12: if it is driven by the MCU, how come it is connected to VCC? (At least in the version I have.)

For pin 11, you mark it as coming from VCC as well, but it's not always on; I can sometimes catch it going low. Initially there was a thought it might be a power-good signal, but I don't think that's the case.

I'm mostly baffled by the connection you draw to VCC; to me it seems to imply that the pin is driven from outside. I might also be misreading it.
 

Offline flex

  • Contributor
  • Posts: 25
About pin12, if it is driven from the MCU, how comes it is connected to VCC? (at least in the version I have)
Are we talking about the same pin 12? I'm talking about PB4 of the STM8 (pin 12 of the TSSOP package, directly connected to pin 6 of the pin header). This pin has a pull-up to VCC and also an LED. This is the "not output enable" (~OE) pin and is driven by the MCU.
Quote
For pin 11, you mark it as coming from VCC as well but it's not always on, I can get sometimes when it goes down. Initially there was a thought it might be a power good signal but I don't think it's the case.
Same with pin11, are we talking about the same pin? PB5 of the STM8 (pin11 of the TSSOP package) is directly connected to pin 7 of the pin header and has a 10k pull-up resistor (that is not really needed). This pin indicates CC/CV and is driven by an opamp on the lower board and can be read by the MCU.
Quote
I'm mostly baffled by the connection you make to VCC it seems to me as it implies that it gets driven from outside. I might also be misreading it.
I'm not sure if we are talking about the same thing  ::). If you think your PCB is different from my schematic, please give the exact position (like "I think pin 12 of the STM8 is directly connected to VCC and doesn't have a pull-up"). That way I can easily verify the schematic.
« Last Edit: February 19, 2015, 12:26:42 pm by flex »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
We are talking about exactly the same pins. I don't think the PCB is different. I may not understand how the pull-up works, then. It should be noted that I'm a software guy and hardware is merely a recent side hobby, so even the most obvious things in hardware may escape me. I've learned a lot in a short time, but I still know a lot less than I need :-)
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
As for the double LEDs, I found that I can drive the three states we care about:
Green ON: PA3 OUTPUT=LOW CR1=1
Orange ON: PA3 OUTPUT=HIGH CR1=1
Both OFF: PA3 OUTPUT=HIGH CR1=0 or INPUT

CR1 means 1=push-pull 0=open-drain
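In register terms that is just the following (a sketch using the raw STM8S port A register addresses from the datasheet; PA3 and the three states as listed above):
Code: [Select]
/* The three LED states above, via the standard STM8S port A registers. */
#include <stdint.h>

#define PA_ODR (*(volatile uint8_t *)0x5000)
#define PA_DDR (*(volatile uint8_t *)0x5002)
#define PA_CR1 (*(volatile uint8_t *)0x5003)
#define PIN3   (1u << 3)

static void led_green(void)  { PA_DDR |= PIN3; PA_CR1 |= PIN3; PA_ODR &= (uint8_t)~PIN3; } /* output low, push-pull   */
static void led_orange(void) { PA_DDR |= PIN3; PA_CR1 |= PIN3; PA_ODR |= PIN3;           } /* output high, push-pull  */
static void led_off(void)    { PA_DDR |= PIN3; PA_CR1 &= (uint8_t)~PIN3; PA_ODR |= PIN3; } /* output high, open-drain */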
 

Offline flex

  • Contributor
  • Posts: 25
Let's take a look at pin 12.
a) pin 12 in "high impedance" mode (as it is at startup). The two resistors R16, R17 pull the voltage of pin 12 (and the ~OE pin) up to VCC (we can assume infinite resistance for the ~OE input pin and pin 12).
b) pin 12 output low. Now there is about 0V at pin 12 (imagine a voltage divider with the lower resistor at 0R). That means the potential of the ~OE pin is 0V. It also means that a small current is sunk from VCC through the OUT LED into pin 12.

I put an attachment to explain the idea.

btw. The output pin uses tri-state logic, so it has "low", "high", and "high impedance". There usually is also a pull-up/down that can be enabled. The drawing is a simplification; the chip has MOSFETs integrated to do the switching. I didn't draw "high", "pull-up", etc.
« Last Edit: February 19, 2015, 01:45:24 pm by flex »
 

Offline flex

  • Contributor
  • Posts: 25
As for the double leds, I found that I can drive the three states we care for:
Green ON: PA3 OUTPUT=LOW CR1=1
Orange ON: PA3 OUTPUT=HIGH CR1=1
Both OFF: PA3 OUTPUT=HIGH CR1=0 or INPUT

CR1 means 1=push-pull 0=open-drain
This was expected, as plazma already said.  ;)
« Last Edit: February 19, 2015, 01:44:23 pm by flex »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
This was expected, as plazma already said.  ;)

Yes, of course. I just had to translate it to actual hands-on terms for myself.

As for pins 11/12, the thing that bothers me in the schematic is that the top-board schematic shows the pins connected to VCC and does not show that they are also connected to the bottom-board parts.
 

Offline flex

  • Contributor
  • Posts: 25
I think you are reading the schematic wrong  ;)

All signals with the same label are connected.

That means pin 12 of the STM8 is connected by the label 6 to pin 6 of the pin header, and this pin header is connected to the lower board.

This way there is no need to draw all the wires (this can get quite confusing). This is the same concept as with the power signals "GND", "VCC", etc.
« Last Edit: February 19, 2015, 02:08:03 pm by flex »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I'm using my pin-testing firmware (https://github.com/baruch/STM8_reverse_engineering_aid), and when I press the down button I see PA1 go from 1 to 0, and back when I release it.

In the published schematic it should be connected to PD1.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I think you are reading the schematic wrong  ;)

All signals with the same label are connected.

That means pin 12 oft the stm8 is connected by the label 6 to the pin 6 of the pin header and this pin header is connected to the lower part.

This way there is no need to draw all the wires (this can get quite confusing). This is the same concept as with the power signals "GND", "VCC", etc.

I'm fast to understand when I get repeated and slow explanations :-)

Does it also mean that the connection to pin 6 of the header from pin 12 is near the mcu?
 

Offline flex

  • Contributor
  • Posts: 25
Quote
Does it also mean that the connection to pin 6 of the header from pin 12 is near the mcu?
I'm sorry, but I have no idea what you are asking. I'm not a native English speaker. What do you mean by "is near the mcu"? I suppose you can't mean it in the "short distance" sense.

However, as I said, pin 12 (STM8) is directly connected to pin 6 (pin header), even if there is no drawn wire. The label does the trick ;)

Quote
I'm using my pin testing firmware (https://github.com/baruch/STM8_reverse_engineering_aid) and when I press the down button I get that PA1 goes from 1 to 0 and back when I release it.

In the schematics published it should be connected to PD1.
I can't see that it is connected to PA1. Are you sure about your code? Please check it yourself with the multimeter.

Update: just added an image to illustrate the down button connections
Update2: just enhanced the image
« Last Edit: February 19, 2015, 02:59:19 pm by flex »
 

Offline jaxbird

  • Frequent Contributor
  • **
  • Posts: 778
  • Country: 00
Did anyone evaluate the dynamic performance of this device?

E.g. set the output to 5V 1A, then hook up a scope, short it and capture the behavior, followed by releasing the short and capturing the behavior again.

And also capture switching on and off between, e.g., a 100mA load and a 1A load, with the CC limit set above 1A.

 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Quote
Does it also mean that the connection to pin 6 of the header from pin 12 is near the mcu?
I'm sorry, but I have not idea what you are asking. I'm not a native English speaker. What do you want to say with "is near the mcu"? I suppose you can't mean that it in the "short distance" sense.

However as I said pin 12 (stm8) is directly connected to pin 6 (pin header). Even if there is no wire. The label does the trick ;)

I'm not a native English speaker either but we don't have another common language, unless you speak Hebrew :-)

Let me attempt to explain my misunderstanding differently. In the diagram you showed me, there are three connections in the pin 12 circuit: VCC, pin 12 and ~OE. How do I tell from the schematic where each of them is connected?

EDIT: Looking now at the schematic, the small drawing for pin 12, and my question, it is obvious where I was mistaken: the small drawing shows a straight line with no component between pin 12 and ~OE. Now I understand what you meant above, that pin 12 is directly connected to pin 6 on the connector.

Quote
Quote
I'm using my pin testing firmware (https://github.com/baruch/STM8_reverse_engineering_aid) and when I press the down button I get that PA1 goes from 1 to 0 and back when I release it.

In the schematics published it should be connected to PD1.
I can't see that it is connected to PA1. Are you sure about your code? Please try yourself by using the multimeter.

Update: just added an image to illustrate the down button connections
Update2: just enhanced the image

You were right, my program had a bug.
« Last Edit: February 19, 2015, 08:02:45 pm by baruch »
 

Offline flex

  • Contributor
  • Posts: 25
I just did a small hardware mod  >:D

One problem with the B3603 is that if you connect power, the output will be on for a short period of time (even if the unit is turned off). This is a hardware bug, and it happens because the 5V switching regulator starts up too slowly to disable the LM2596 from the beginning.

Solution: solder an additional pull-up resistor between pin 6 (easier access than ~OE) and Vin+ on the bottom board to ensure that the output is off, and remove R16 on the top board. This resistor has to be at least 10k, because otherwise you could exceed the absolute maximum injected current of the STM8 with 40V Vin. I used a 100k resistor. R16 has to be removed, because otherwise we'd get an unintended voltage divider.

Quote
Did anyone evaluate the dynamic performance of this device?
If no one does it, I'll test it at the end of next week. Btw, we should be able to tweak the performance by changing R34, R35, C14 (current) and R20, R19, C10 (voltage).

@baruch I think you got the idea behind my drawing

I updated the schematic again, because the bottom board had lost its component labels. ::)
« Last Edit: February 19, 2015, 08:51:28 pm by flex »
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Just a quick update: I reversed engineered the top board. The buttons are really strange :-//. Maybe someone could (dis)confirm that part of the schematic? I don't have an oscilloscope right now, but I would suppose that the gpio pins are toggling to read all buttons...
...

I double-checked twice.  Your schematic looks right regarding the button connections!

- With D5 pin 3 (double Schottky), pin 3 goes top-side but under the button, then comes back down to the bottom side under the same button.  The top-side trace under the button is not visible, but the DMM confirmed that the two holes are connected, which subsequently connects to the OK button.

- D5 pin 2 going to the SET button is a visible trace.  It subsequently goes top-side on a visible trace that comes back to the bottom side under the MCU chip.  Under the MCU the trace's path is not visible, but the position suggests MCU pin 17, and the DMM confirms it connects to MCU pin 17 exactly as your schematic shows.

So, insofar as I can see, your schematic for the buttons is confirmed; AND they are indeed strange.

By-the-way

In an earlier post (where I proposed using header pin numbers) I said no visible TxD/RxD traces connect header pins 15 and 16 (Rx/Tx) to the MCU's Rx/Tx.  I have since found the traces connecting them to UART1 RxD and TxD.  I re-edited that reply to avoid confusion.

Rick
 

Offline flex

  • Contributor
  • Posts: 25
@Rick Law or anyone else with a scope: could you check the PWM frequency at pins 4 and 5? I don't have any scope nearby.

Quote
In an earlier post (where I purposed using header pin numebrs) I said no visible TxD/RxD connects HeaderPin 15 and 16 (Rx/Tx) to the MCU's Rx/Rx.  I found the traces connecting them to UART1 RxD and TxD.  I reedited that reply also to avoid confusion.
They are also in my schematic ;)

Quote
So, in so far as I can see, your schematic for the buttons is confirmed; AND they are indeed strange.
thx for checking. This is an interesting design choice.
« Last Edit: February 20, 2015, 12:23:23 am by flex »
 

Offline Asim

  • Regular Contributor
  • *
  • Posts: 171
@Rick Law or anyone else with an scope: Could you check the PWM frequency at pin 4 and 5? I don't have any scope nearby.



1.293K Hz based on my Rigol DS1052E

I have been wanting to replace the interface on that module. With the pinout mapping provided, I think I can do it.

My ideal interface would include an OLED display to show Vset, Iset, Vout and Iout; one rotary encoder with a built-in switch to vary the set voltage and current (the switch toggling between voltage and current); one push button to switch the output on and off; and two LEDs for CC and output ON.
 

Offline neslekkim

  • Super Contributor
  • ***
  • Posts: 1305
  • Country: no
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #100 on: February 20, 2015, 09:17:12 am »
Nothing new, but I found the product pages from the manufacturer, or at least it looks that way:
B3603
http://www.mhinstek.com/product/html/?106.html
B3008:
http://www.mhinstek.com/product/html/?108.html
400w:
http://www.mhinstek.com/product/html/?110.html

No support pages, nothing... At least I expected download links for the manuals there, but the link from this page still points to mhinstek.com for the PDF:
http://www.banggood.com/B3603-Precision-CNC-DC-DC-Digital-Buck-Module-Constant-Voltage-Current-p-946751.html
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #101 on: February 20, 2015, 10:16:37 am »
@neslekkim, I found a download section with manuals on that website, even English translations for the B3603 and B3606. They don't have any new information there though.
 

Offline flex

  • Contributor
  • Posts: 25
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #102 on: February 20, 2015, 05:01:18 pm »
I just calculated all the output pin values. They all line up quite well with bal00's measurements:

Iout_set=Iout_sense=16*(0.01V+Iout*0.05)
Vout_set=Vout_sense=16/15*(31mV+0.068*Vout)
Vin_sense=Vin/16
CC=2.91V and CV = 0.45V

In case someone is wondering why there is an offset: even though the MCP6002 is a rail-to-rail input/output opamp, the output can't quite reach 0V and VCC (see the datasheet). The offset is just a workaround for this problem.

I also attached an updated schematic with more annotations and a scan of the calculation for Iout_sense and Vout_sense (don't have time for LaTeX at the moment).

btw: For simplification I left the small voltage drop of the shunt (which mainly depends on the current, but also temperature) out of the sense voltage calculation.
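For the firmware side, these formulas translate into integer millivolt/milliamp form roughly like this (a sketch using the theoretical values only, before any per-unit calibration):
Code: [Select]
/* Theoretical (uncalibrated) conversions from the formulas above. */
#include <stdint.h>

/* Iout_sense = 16*(10mV + Iout*50mV/A)  =  160mV + 0.8mV per mA */
static uint16_t iout_ma_to_sense_mv(uint16_t iout_ma)
{
    return (uint16_t)(160u + ((uint32_t)iout_ma * 4u) / 5u);
}

/* Vout_sense = 16/15 * (31mV + 0.068*Vout) */
static uint16_t vout_mv_to_sense_mv(uint16_t vout_mv)
{
    return (uint16_t)((16u * (31u + ((uint32_t)vout_mv * 68u) / 1000u)) / 15u);
}

/* Vin_sense = Vin/16 */
static uint16_t vin_mv_to_sense_mv(uint16_t vin_mv)
{
    return vin_mv / 16u;
}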

Update: I forgot resistor R18 of the top board. With it, the CC voltage increases to 2.91V, and unlike I claimed some posts ago the resistor has a use: it raises the voltage to fit better inside the high-level input range. (Not strictly necessary, because VIH = 0.7*VDD to VDD+0.3V.)
Update2: just added the inductor values and fixed the IC numbers of the opamps.
(I think I should just upload them with a version control system. That way I wouldn't need to update these forum posts  ::) )
Update3: forgot to recalc the CV voltage for the first update.

Quote
I'm somewhat fumbling on this topic but what would be the best way to control the PWM vs the voltage/current setting vs the output sense?

Should I do a feedback loop between the outputs and the pwm? Find some calibration and let it be?
I wouldn't create a digital feedback loop. That loop would be rather slow. Rather trust the calibration. (I think that is also what the orig firmware does)


@Rick Law or anyone else with an scope: Could you check the PWM frequency at pin 4 and 5? I don't have any scope nearby.

1.293K Hz based on my Rigol DS1052E
That is interesting. I think we can assume that the PWM timer of the STM8 is running at the maximum frequency of 16MHz. That would mean the PWM timer counts to 16MHz/1.293kHz=12374.
« Last Edit: February 20, 2015, 07:46:05 pm by flex »
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #103 on: February 21, 2015, 07:03:20 am »
@Rick Law or anyone else with an scope: Could you check the PWM frequency at pin 4 and 5? I don't have any scope nearby.

Quote
In an earlier post (where I purposed using header pin numebrs) I said no visible TxD/RxD connects HeaderPin 15 and 16 (Rx/Tx) to the MCU's Rx/Rx.  I found the traces connecting them to UART1 RxD and TxD.  I reedited that reply also to avoid confusion.
They are also in my schematic ;)

Quote
So, in so far as I can see, your schematic for the buttons is confirmed; AND they are indeed strange.
thx for checking. This is an interesting design choice.

Sorry, didn't mean to ignore you!  Been busy.  Good that Asim got that taken care of with his scope.

I removed the shunt today and upgraded it to a Vishay/Dale 75ppm shunt.
I will have some hard data to share in a few days.  (Doing 8-hour runs, so just one pass per night: an 8-minute fan-on, 35-minute fan-off cycle to log how the shunt readings change as it heats up at a particular current.  I do not have equipment to measure the shunt temperature.)

In setting up for the run, during the pre-run tests, I noticed the swing (from fan-cooled to no-fan max) was half what I saw with the stock shunt at 1.6A.  This is less improvement than I had hoped, so the stock shunt is not as bad as I had thought.  I'll post the data when I have it.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #104 on: February 21, 2015, 07:23:27 am »
@flex, in the schematics the formula seems to involve a mA value and what seems to be fractions of a volt. Am I reading it right?

For example the Iout says: Iout_sense = 16 * (10mV + 0.05 * Iout),
The code translation for me would be: Iout_sense = 160 + 0.05 * Iout
I hope I'm not mixing up units...

I built some code to drive the 7-segment display and hit the 8K limit again; the floating-point code I currently use takes about 4KB of that, so I'm now re-educating myself on fixed-point arithmetic and converting the code from float to fixed point. I'm also already incorporating the defaults as you calculated them; we'll add actual calibration as well, so the parameters above (16, 10) are values that can be changed online.
 

Offline flex

  • Contributor
  • Posts: 25
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #105 on: February 21, 2015, 12:43:54 pm »
@flex, in the schematics the formula seems to involve a mA value and what seems to be fractions of a volt. Am I reading it right?

For example the Iout says: Iout_sense = 16 * (10mV + 0.05 * Iout),
The code translation for me would be: Iout_sense = 160 + 0.05 * Iout
I hope I'm not mixing up units...
You got me there. I was sloppy and left the units out. I'll fix that; see the attachment for the units. I'll also add the units to the schematic.


Quote
I built some code to drive the 7-segment display and hit again the 8K limit, the floating point code I currently use takes about 4KB of that so I'm now reeducating myself on fixed point arithmetic and converting the code from float to fixed point. I'm also incorporating already the defaults as you calculated, we'll add actual calibration as well so the parameters above (16, 10) are values that can be changed online.
Sounds good to me. I hope you can get it into 8K flash. Not sure if you should use the theoretical values as default. Maybe we should average over the units in this forum. But with calibration, this isn't too important.

@Rick Law, that sounds interesting. The position of the shunt is unfortunate; otherwise we could try to put in something like the MP915-0.050-1% as the shunt.
« Last Edit: February 21, 2015, 01:15:36 pm by flex »
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #106 on: February 21, 2015, 08:50:30 pm »

@Rick Law that sounds interesting. The position of the shunt is too bad. Otherwise we could try to put something like the MP915-0.050-1% as shunt.

Indeed!   To achieve better stability, it would be better to move the shunt than to just upgrade it.

Interesting to note that (as shown in flex's schematic) pins 9,10 versus 11,12 do not connect to the same ground.    The initial pin decoding on banggood.com has header pins 9,10,11,12 as ground, but they are not the same.

(@flex, what does the extra D stand for in the "other ground" GNDD vs GND?)

With the shunt removed, you can see the trace that connects the low end of the shunt to GNDD at header pins 11,12.



I took a picture of it because, with these "under the component" traces, it's easy to forget once the component goes back on and covers them again.

Rick
« Last Edit: February 21, 2015, 08:52:58 pm by Rick Law »
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #107 on: February 21, 2015, 09:21:27 pm »
Ok, I upgraded my shunt to the Vishay WSL2R0500FEA, a 75ppm shunt.  The improvement is measurable but far below what I had expected.  This points to two possible explanations: (1) the stock shunt is better than I had thought, and/or (2) the location of the shunt is too awful for it to cool.

I think the stock shunt is probably 150ppm.

Before we look at the new graphs, let's refresh our memory on the two baseline graph/data with the stock shunt posted on 2/4/15.





I used the same setup: let the shunt heat up at 1900 or 1600mA, which takes about 20 minutes to reach equilibrium, then cool it with a fan, which takes just over 5 minutes to reach equilibrium.  I set the no-fan heat-up to 35 minutes, and the fan cool-down to 8 minutes.

Below are the new data with the Vishay shunt.  You can see that the max/min swing is about half.
@1900mA-2000mA
Vishay shunt swings 2006 to 1973 = +-33
Stock shunt swings 2009 to 1954 = +-55
@1600mA
Vishay shunt swings 1613 to 1592 = +-21
Stock shunt swings 1628 to 1591 = +-37

[new 1900 picture]


[new 1600 picture]

Note also, at near 2.0A, one should run a fan on the voltage regulator.  It gets very uncomfortably hot near 2A.

The low improvement is disappointing.

Rick
« Last Edit: February 21, 2015, 09:26:52 pm by Rick Law »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #108 on: February 21, 2015, 09:43:45 pm »
I switched the code over to fixed point. I needed 32-bit arithmetic, which takes some space of its own, but I still dropped the code size to 5KB and I can now fit the driver for the 7-segment display. It works OK; I currently update the digit data too often, so a flicker is visible at the lowest digit when it's on the edge. That should be easy to fix, I guess, by not updating the display too often.
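One way to tame that flicker is to rate-limit the value copied into the display buffer while the multiplexing keeps running at full speed; here ms_ticks and display_set_millivolts are placeholders, not the actual code:
Code: [Select]
/* Rate-limit the displayed value; multiplexing itself still runs at full speed. */
#include <stdint.h>

extern volatile uint16_t ms_ticks;                /* hypothetical 1ms tick counter */
extern void display_set_millivolts(uint16_t mv);  /* hypothetical display buffer update */

void display_update(uint16_t measured_mv)
{
    static uint16_t last_update;
    if ((uint16_t)(ms_ticks - last_update) >= 250u) {  /* refresh ~4 times per second */
        last_update = ms_ticks;
        display_set_millivolts(measured_mv);
    }
}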

I didn't manage to use the buttons properly so far; I now suspect it is because the pin is also a SWIM pin and there is a flag needed to make it usable by the software. I'll give it a try tomorrow.

I also didn't manage to control the on/off LED, and as such can't really enable/disable the output properly. I tried all the pin configurations and it hardly changes. There is some change, so I'm affecting something, but it's not a 0V-to-5V swing, more like 0.3V. The LED stays on all the time for me anyway. While I can't properly affect the LED, if I set the Iout and Vout set pins on (no PWM) I get voltage out of the unit; I still didn't try to draw current from it though.
 

Offline flex

  • Contributor
  • Posts: 25
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #109 on: February 22, 2015, 06:10:25 pm »
(@Flex, what is extra D stands for in the "other ground" GNDD vs GND)
It's for digital ground. I was looking for another GND symbol and thought digital ground would suit, since it is only used for the (digital) display. Feel free to suggest any other name.

I also didn't manage to control the on/off led button and as such can't really enable/disable the output properly.
In theory, all you have to do to enable the on/off LED and the regulator is set PB4 to output low (GND).
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #110 on: February 22, 2015, 06:14:19 pm »
I also didn't manage to control the on/off led button and as such can't really enable/disable the output properly.
In theory, all you have to do to enable the on/off led and the regulator is to set PB4 to output GND.

I tried all the possible combinations I could find: I set it to output HIGH, output LOW, input, and also played with open-drain/push-pull. The red LED is always on no matter what I do. Output control works by setting the Iout and Vout pins high, but the OE pin seems uncontrollable. Obviously I'm doing something wrong, since the original firmware does control it somehow.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #111 on: February 22, 2015, 09:55:18 pm »
Current state of things:

  • I do not control the Output Enable pin, but it doesn't seem to matter. No idea why it happens. It is always at 0V no matter what I do on the pin.
  • I can control the 7-segment display and it shows data consistently and nicely
  • I can control the CV/CC led, still didn't get to test CC condition to be sure of the full loop
  • I can measure with pretty good accuracy the Vin, Vout and Iout based on the theoretical numbers. I didn't attempt to calibrate my own unit yet.
  • I can control the PWM for the voltage and the current; this is not calibrated at all, so the values I use and the output are off by a factor or so, but it seems to work and only needs tuning

This is now pretty close to being a minimally functional serially controlled power supply.

EDIT: I have two units, one supposedly damaged in parts but where OE is at 5V, and one supposedly good where OE is always at 0V. I'll try the new firmware on the old, supposedly damaged unit and see what happens; this will happen tomorrow. The damaged unit had 17V injected into its top board and was then once plugged in in reverse. Its 7-segment display doesn't seem to work properly, and there was some problem with output regulation after it was plugged in in reverse.
« Last Edit: February 22, 2015, 10:08:07 pm by baruch »
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #112 on: February 23, 2015, 03:10:27 am »
@Rick Law or anyone else with an scope: Could you check the PWM frequency at pin 4 and 5? I don't have any scope nearby.



1.293K Hz based on my Rigol DS1052E

I have been wanting to replace the interface on that module. with the pinout mapping provided I think I can do it.

my Ideal interface would include OLED display to display Vset,Iset,Vout,Iout. one rotary encoder with built in switch to vary the set voltage and current ( switch to change from voltage to current and vice versa) . one push button to switch the output on & off. 2 leds for CC & output ON.

I think 1.293kHz is wrong!

I hooked up my scope (6022BE) and took a snapshot of pins 4 and 5.  I get a frequency of around 1.946kHz (with the typical occasional 6022BE jitter from 1.937 to 1.947, but mostly 1.946).

To double check, I connect my UT61E to it, and it confirms a rock solid 1.945KHz.

To triple check, I connect my UDB1308S function gen and use the frequency counter, a rock solid 1.945KHz.

So, if I am a betting man, I would place my bet on 1.945KHz.

Asim, would you double check?

Thanks
Rick

Edit:  Just an interesting FYI... a coincidence - like the B3603, the cheap UDB13xx function generator is also a MingHe product.  I guess they specialize in economy stuff.
« Last Edit: February 23, 2015, 03:20:03 am by Rick Law »
 

Offline flex

  • Contributor
  • Posts: 25
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #113 on: February 23, 2015, 06:15:00 pm »
@Rick Law I think you are correct. I also have a UT61E (but forgot about the frequency function until you mentioned it). It shows 1.937kHz for my unit.

That is also more reasonable, since ld(16MHz/1.945kHz)≈13, which would mean a 13-bit PWM.
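With a 13-bit period, turning a desired set-pin voltage into a compare value would then be roughly this (a sketch assuming an 8192-count period and the 3.3V top-board supply measured earlier):
Code: [Select]
/* Desired set-pin voltage (mV) -> timer compare value; period and VDD assumed. */
#include <stdint.h>

#define PWM_PERIOD 8192u   /* 2^13 counts, about 1.95kHz from a 16MHz timer clock */
#define VDD_MV     3300u   /* top-board supply, assumed 3.3V */

static uint16_t pwm_compare_for_mv(uint16_t target_mv)
{
    uint32_t ccr = ((uint32_t)target_mv * PWM_PERIOD) / VDD_MV;
    if (ccr > PWM_PERIOD)
        ccr = PWM_PERIOD;
    return (uint16_t)ccr;
}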

EDIT: I have two units, one supposedly damaged in parts but where OE is 5V and one supposedly good where OE is always at 0V. I'll try the new firmware on the old and supposedly damaged unit and see what happens, this will happen tomorrow. The damaged unit had been injected 17V to the top board and then once plugged in reverse. the 7-segment display doesn't seem to work properly and there was some problem with output regulation after it was plugged in reverse.

OE is supposed to be at 5V until the MCU program drives the output low. Great to hear that your firmware is making progress.
« Last Edit: February 23, 2015, 06:32:06 pm by flex »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #114 on: February 23, 2015, 09:10:55 pm »
I have one top board with a faulty 7-segment display and one top board where OE is always low. I have PWM control now, but the calculation is off by what seems like a factor of 5 or so. The code is on GitHub if someone wants to review my math. It's been a while since I needed to worry about integer overflows and tread carefully in numerical analysis.

I have ordered another B3603 unit, so I will hopefully have another fully working one, but due to the Chinese New Year the unit hasn't shipped yet, and it will be at least three weeks after the holiday is over before I have a chance to get it. So for now I can either develop the display or the PWM control.

EDIT: Direct line to review, the functions are control_voltage() and control_current(), can be found at https://github.com/baruch/b3603/blob/master/stm8/main.c#L532
« Last Edit: February 23, 2015, 09:47:15 pm by baruch »
 

Offline Asim

  • Regular Contributor
  • *
  • Posts: 171
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #115 on: February 23, 2015, 09:40:36 pm »
HAHAHA,
I don't know how I typed that frequency. I meant to say 1.923kHz; this is what my unit shows.

The frequency doesn't matter that much as far as I know, as it will be filtered to an average voltage that controls Iset & Vset.
 

Offline flex

  • Contributor
  • Posts: 25
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #116 on: February 24, 2015, 12:35:28 am »
The frequency doesn't matter that much as far as I know, as it will be filtered to an average voltage that controls Iset & Vset.
Yeah, there is a low pass filter. But for 1.293kHz (T=773us) the filter isn't as good as for 1.923kHz (T=520us); it has significantly more ripple. I attached my simulation results.

@baruch I think the problem is:
Code: [Select]
tmp *= 73<<10
Since int is only 16 bits here, this will overflow. Try this instead:
Code: [Select]
tmp *= 74752; //73<<10
The same goes for
Code: [Select]
tmp += (33<<10)/1000;
it also overflows a 16-bit signed int. Instead use:
Code: [Select]
tmp += 34; //(33<<10)/1000
I haven't checked control_current, but I suspect the same mistake.
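
To make the wrap-around concrete, here is a tiny stand-alone sketch (my own illustration, not code from the b3603 firmware) that can be compiled on a PC; it simulates the 16-bit int arithmetic SDCC uses on the STM8 and shows why spelling the constant out fixes the result.
Code: [Select]
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    uint32_t tmp = 100;

    /* 73<<10 is 74752, which does not fit in a 16-bit signed int;
       on the STM8 it wraps to 9216 before the multiply ever happens. */
    uint32_t wrong = tmp * (int16_t)(73 << 10);   /* cast simulates 16-bit int */

    /* Writing the constant out (or suffixing it with L/UL) keeps the
       arithmetic in a 32-bit type. */
    uint32_t right = tmp * 74752UL;               /* 73<<10 done by hand */

    printf("wrong=%lu right=%lu\n", (unsigned long)wrong, (unsigned long)right);
    return 0;
}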
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #117 on: February 24, 2015, 05:05:39 am »
@flex, it works now!

I can control the voltage and it is pretty good: even without calibration I get accuracy to one digit after the decimal point. I still haven't tested the current part; I'll need to find an LED or something that I can use in current-limited mode.
 

Offline flex

  • Contributor
  • Posts: 25
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #118 on: February 24, 2015, 09:35:20 am »
Great to hear. Just a note on your commit message:
Quote
Fix integer overflow issues
The default type is uint16_t and some of the fixed numbers were
overflowing that, giving the explicit number makes the compiler use a
larger integer type and corrects the calculation.
That isn't technically correct. The default type of literal integer constants in C is int. Furthermore, the size of int is compiler (platform) dependent (2 bytes in this case) and it is signed. If a constant is too big to be represented as an int, the compiler uses the next bigger type, such as long int. You can influence this behavior with the suffixes u and l.
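
For illustration, a few hedged examples of how this plays out on a compiler with a 16-bit int (such as SDCC for the STM8); the variable names are mine and not from the firmware:
Code: [Select]
#include <stdint.h>

uint32_t a = 73 << 10;   /* both operands are int; the shift overflows a
                            16-bit signed int (wraps in practice) */
uint32_t b = 73L << 10;  /* 73L is long (32-bit), so the shift is done in
                            32 bits and yields 74752 */
uint32_t c = 74752;      /* 74752 does not fit in a 16-bit int, so the
                            constant itself already has type long */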
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #119 on: February 27, 2015, 06:26:55 pm »
I am kind of "out of the game" now...  I am not familiar with how to program this MCU.

Any suggestions on what you think is best for an Arduino guy to get started with programming the STM?

Thanks
Rick
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #120 on: February 27, 2015, 06:58:27 pm »
I'm not an Arduino guy nor an STM guy. For my day job I develop C for PCs for clustered systems so MCUs are a new game to me :-)

Unlike the Arduino there is no fancy library that does all the hard work for you, actually there is but due to space limitations and license uncertainty I prefer to avoid it. So I'm developing in raw C with the raw registers.

There are two main documents that I refer to almost constantly: one is the overall reference, found in the git repo at doc/CD00190271_RM0016_STM8_Reference_manual.pdf, and the other is the STM8S003 datasheet at doc/DM00024550_STM8S003F3_Datasheet.pdf. I also consult the STM8 application notes where relevant and google for how to do specific things, but there are maybe only one or two sites that have the information I need.

The other part to know is how to build and install. I use sdcc to build, so you should have that installed; the makefile will do the rest for you in that regard. For some reason stm8flash, the Linux flash utility, doesn't work for me anymore, so I resort to a Windows VM on my Linux box which runs STVP (the Windows programmer software), and I use that to program. I needed to expose the STLinkV2 to Windows and used google to find out how to do it.

If you have questions just ask (here or on email), I tend to do these things in small bites. I suggest that you decide what you want to work on and I can help guide you along the way. I try to make the code fairly well documented and hopefully easy to read. You can find it all at https://github.com/baruch/b3603/

 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #121 on: February 27, 2015, 07:00:56 pm »
BTW, there are also requests/ideas to create another top-board with more features. If you're interested in hooking up an Arduino and writing code for it, that would be interesting and useful as well. It may also provide more validation around the sense and control circuits.
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #122 on: February 27, 2015, 11:11:14 pm »
I am a C guy who started with C on machines with 48K of RAM (Atari 800), so a bare-bones library and small-space coding are not an issue for me.  I am not sure about handling the MCU, since the Arduino has been doing that for me, and I also don't have the tools to flash the STM.  So, I am not sure if messing with the B3603 control board is a good "first step".

Perhaps I will dive into the alternative Arduino based control board.  Off hand, I think the 10bit ADC and 8bit PWM is going to be too limiting...

Thanks for the pointers however!  Perhaps I will gain some courage and dig into the STM...
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #123 on: February 28, 2015, 07:39:56 pm »
I don't think it's such a big issue to program the STM8 compared to an Arduino. The ideas are the same, and they are the harder part to grasp IMNSHO. The "how" is only slightly harder: instead of calling set_pin_output(1) you need to do PC_DDR |= (1<<3), but the idea is still that you need to configure a pin as input/output and set some parameters for it (pull up/down and such). Try to review my C code to get an idea of it.
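
As a minimal sketch of what that raw-register style looks like (register addresses taken from the STM8S003 datasheet; the pin choice and function names are just an example, not the b3603 firmware):
Code: [Select]
/* Configure PC3 as a push-pull output and drive it high, SDCC style. */
#define PC_ODR (*(volatile unsigned char *)0x500A)
#define PC_DDR (*(volatile unsigned char *)0x500C)
#define PC_CR1 (*(volatile unsigned char *)0x500D)

void pin_init(void)
{
    PC_DDR |= (1 << 3);   /* PC3 as output */
    PC_CR1 |= (1 << 3);   /* push-pull rather than open drain */
}

void pin_high(void)
{
    PC_ODR |= (1 << 3);   /* drive the pin high */
}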

The STM8 has a 10-bit ADC as well, but its PWM is much better. Even an 8-bit PWM would be useful to get something working, though, and I have had a few contacts saying they wouldn't reflash the STM8 board of the original unit, so an Arduino firmware may be a good thing for many people.

 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #124 on: February 28, 2015, 08:28:23 pm »
I started to implement calibration, so far I did it for Vin and Vout ADC. I have to say, the formulas for Vout were close enough that both the PWM control and the ADC require very little calibration, at least for my needs. I still don't get the 0.01V accuracy, even with my calibration but it's within the 0.1V level so for my personal needs it's fine.

I'm still lacking control of the buttons. Need to figure it out yet.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I no longer have a unit with a stock firmware, can anyone check if the accuracy claims of the product are indeed met?

I'd be interested to know both the ability to control the voltage and to read its state so I'd like a test where a voltage output is tried at fine steps (0.01V) for some consecutive values (say 5.00 to 5.1 or even higher, as much as patience enables) and to measure externally if the voltage is really regulated at this accuracy.

I just tried to switch from 6.10 fixed point to 16.16 and this required 64bit temporaries and bumped the code to 10K. The firmware size at this time is already close to 8K even at 6.10.
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
I no longer have a unit with a stock firmware, can anyone check if the accuracy claims of the product are indeed met?

I'd be interested to know both the ability to control the voltage and to read its state so I'd like a test where a voltage output is tried at fine steps (0.01V) for some consecutive values (say 5.00 to 5.1 or even higher, as much as patience enables) and to measure externally if the voltage is really regulated at this accuracy.

I just tried to switch from 6.10 fixed point to 16.16 and this required 64bit temporaries and bumped the code to 10K. The firmware size at this time is already close to 8K even at 6.10.

Stay tuned, I will have some numbers for you before the end of the day.  (Prepping for a snow storm here...)
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
I no longer have a unit with a stock firmware, can anyone check if the accuracy claims of the product are indeed met?

I'd be interested to know both the ability to control the voltage and to read its state so I'd like a test where a voltage output is tried at fine steps (0.01V) for some consecutive values (say 5.00 to 5.1 or even higher, as much as patience enables) and to measure externally if the voltage is really regulated at this accuracy.

I just tried to switch from 6.10 fixed point to 16.16 and this required 64bit temporaries and bumped the code to 10K. The firmware size at this time is already close to 8K even at 6.10.

Stay tuned, I will have some numbers for you before the end of the day.  (Prepping for a snow storm here...)

Got some numbers now:
first column = selected voltage
second column = what the B3603 shows as output
third column = UT61E measurement of the output
delta = evaluated as nextVoltage - thisVoltage

Set to   B3603   UT61E      Delta
1.93      1.92      1.9428      0.011
1.94      1.94      1.9536      0.011
1.95      1.96      1.9647      0.000
1.96      1.96      1.9647      0.017
1.97      1.96      1.9812      0.011
1.98      1.97      1.9922      0.011
1.99      2.00      2.0034      0.011
2.00      2.00      2.0145      0.011
2.01      2.00      2.0259      0.006
2.02      2.01      2.0315      0.011
2.03      2.04      2.0426      0.011
2.04      2.04      2.0538      0.011
2.05      2.05      2.0651      0.011
2.06      2.06      2.0761      
                  
4.93      4.93      4.944      0.011
4.94      4.93      4.955      0.006
4.95      4.94      4.961      0.011
4.96      4.95      4.972      0.011
4.97      4.98      4.983      0.011
4.98      4.98      4.994      0.010
4.99      4.98      5.004      0.006
5.00      4.98      5.010      0.011
5.01      4.99      5.021      0.011
5.02      5.02      5.032      0.011
5.03      5.02      5.043      0.011
5.04      5.03      5.054      0.006
5.05      5.04      5.060      0.011
5.06      5.06      5.071      
                  
10.93      10.93      10.943      0.012
10.94      10.93      10.955      0.006
10.95      10.94      10.961      0.011
10.96      10.96      10.972      0.011
10.97      10.96      10.983      0.011
10.98      10.97      10.994      0.012
10.99      10.98      11.006      0.005
11.00      10.99      11.011      0.011
11.01      11.00      11.022      0.012
11.02      11.02      11.034      0.011
11.03      11.02      11.045      0.011
11.04      11.04      11.056      0.005
11.05      11.05      11.061      0.011
11.06      11.07      11.072      
                  
15.93      15.91      15.948      0.011
15.94      15.94      15.959      0.006
15.95      15.93      15.965      0.012
15.96      15.95      15.977      0.010
15.97      15.96      15.987      0.012
15.98      15.97      15.999      0.011
15.99      16.00      16.010      0.006
16.00      16.00      16.016      0.011
16.01      16.00      16.027      0.011
16.02      16.01      16.038      0.011
16.03      16.04      16.049      0.011
16.04      16.04      16.060      0.006
16.05      16.04      16.066      0.010
16.06      16.05      16.076      
                  
17.93      17.92      17.948      0.011
17.94      17.95      17.959      0.012
17.75      17.95      17.971      

-----

You may notice that the actual regulated output voltage is finer-grained than the displayed voltage.  Each 0.01V step up results in the actual output going up by about 0.011V, even when the displayed voltage does not change.

Flex evaluated the PWM to be 13 bits.  With the ADC being only 10 bits, it follows that the displayed voltage is not as accurate as the PWM output voltage setting.

The 0.011V step is still puzzling, though.  A 13-bit PWM has 8192 steps, and the math makes 12 bits seem more reasonable: 0.011V*4096 ≈ 45V, which is close to the unit's range, whereas 0.011V*8192 ≈ 90V is not.

I was interrupted.

For some unknown reason (static discharge?), when I resumed, my unit's LED display had gone dark!
  On power up, a few random segments came on and then went off, but the unit still seems to be controlling the voltage: I can press the UP arrow and see the voltage going up.  I think I did a factory reset (based on memorized key-strokes) and the voltage went back to 5V -- but without the LED display, I don't know what is going on.

So, I was going to see if it stays at that increment at higher voltages, but with the unit dead I will not be able to do much for now.

I have to figure out what to do yet.  Ideas?
On power up, a few (random?) segments of the LED do come on - a quick sub-second blink - then it goes dark.

Rick
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I've got one unit with the display like that but it was one that I tortured mistakenly. I'm using it now with serial only control. I am thinking about getting a replacement led display and replace it but for now it's more hassle than it's worth.

I'm also using 12-bit PWM, and the frequency matches the one you all measured previously in the thread. Their accuracy reflects, to me, not just the PWM itself but also the accuracy of the calculations that they do. From my simulation, if I use a 6.10 split I can only get 800 out of the 1000 values between 0.00V and 10.00V, and if I use 5.11 I can get all 1000 values, but then it overflows during some other calculations.

What I did to check is to take the 1000 strings of input, convert each to a number, generate the PWM counter value, and see whether I get different values; with the current 6.10 I only get 800 different values for the PWM, so the accuracy of the output will be lower than the original firmware's.
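
A PC-side sketch of that kind of coverage check (my own harness, not baruch's actual test code; the voltage-to-count mapping is a made-up placeholder, so it only demonstrates the method, not the 800-value result):
Code: [Select]
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    static uint8_t seen[8193];   /* one flag per possible PWM count, 0..8192 */
    int distinct = 0;

    for (int centivolts = 0; centivolts <= 1000; centivolts++) {
        /* 6.10 fixed point: 1V == 1024 */
        uint16_t v_fixed = (uint16_t)((uint32_t)centivolts * 1024 / 100);
        /* placeholder transfer function: full PWM span (8192) == 10.00V */
        uint16_t pwm = (uint16_t)((uint32_t)v_fixed * 8192 / (10UL * 1024));
        if (!seen[pwm]) { seen[pwm] = 1; distinct++; }
    }

    printf("%d distinct PWM values out of 1001 setpoints\n", distinct);
    return 0;
}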
« Last Edit: March 02, 2015, 05:30:11 am by baruch »
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
I've got one unit with the display like that but it was one that I tortured mistakenly. I'm using it now with serial only control. I am thinking about getting a replacement led display and replace it but for now it's more hassle than it's worth.
...

Yeah...  Oh well, time to hunt for another.

I think most likely it was ESD.  When I first get started I am usually well prepared, doing things like discharging myself first.  After the interruption, I was preoccupied with trying to remember where I was and getting back to it.  I think I got careless.
 

Offline bal00

  • Newbie
  • Posts: 2
Perhaps I will dive into the alternative Arduino based control board.  Off hand, I think the 10bit ADC and 8bit PWM is going to be too limiting...

I ran into this issue when I replaced the top board with an Arduino. To get around the problem with the ADC precision, I use averaging in software to increase the precision. Works pretty well.

The PWM precision can be increased by combining two PWM outputs using a resistor divider network. Basically you have 'coarse' and a 'fine' PWM output with 256 steps each, effectively providing 16 bits of resolution, which is more than good enough in terms of precision.
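
A rough Arduino-style sketch of that coarse/fine idea (my own illustration; the pin numbers and the 256:1 resistor weighting are assumptions, not bal00's actual circuit):
Code: [Select]
// Two 8-bit PWM outputs summed through a resistor network (e.g. R into the
// summing node for "coarse" and 256*R for "fine"), then low-pass filtered.
const uint8_t COARSE_PIN = 5;
const uint8_t FINE_PIN   = 6;

void setDac16(uint16_t value)
{
  analogWrite(COARSE_PIN, value >> 8);    // upper 8 bits
  analogWrite(FINE_PIN,   value & 0xFF);  // lower 8 bits
}

void setup()
{
  pinMode(COARSE_PIN, OUTPUT);
  pinMode(FINE_PIN, OUTPUT);
  setDac16(0x8000);                       // mid-scale as an example
}

void loop() {}
In practice the resistor matching and the output filter determine how many of those 16 bits are really usable, but it shows the principle.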
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Perhaps I will dive into the alternative Arduino based control board.  Off hand, I think the 10bit ADC and 8bit PWM is going to be too limiting...

I ran into this issue when I replaced the top board with an Arduino. To get around the problem with the ADC precision, I use averaging in software to increase the precision. Works pretty well.

The PWM precision can be increased by combining two PWM outputs using a resistor divider network. Basically you have 'coarse' and a 'fine' PWM output with 256 steps each, effectively providing 16 bits of resolution, which is more than good enough in terms of precision.

I did a "voltmeter" with the Arduino and I used ADC averaging to get about 11 and almost 12 bit like resolution.  Particularly with a "look up" technique I developed: using the averaged 10 bit as an index into a look up table (about 256x16-bit entries per "range" interpolated to the 1024 points) of measured voltage.  That took care of the op-amp imperfection very well since it was looking up a prior measured voltage in the table.  It uses a lot of ram however.

For DAC, I tried voltage divide and add before and that worked reasonably well.  Stability is a problem however...

All these add-ons to gain resolution each bring in more and more stability problems.  I have not acquired the skill to solve them well yet.

I'm going to make/replace the controller with Arduino, but with so many things I like to try (>8bit PWM, 15bit ADC, etc)... Even the replacement is going to take a while to get here.

Mean time, I have to connect my other "PSU" just to keep going for the short term.  I already miss my B3603...
« Last Edit: March 02, 2015, 06:36:14 pm by Rick Law »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I figured a way to get the full 13 bit resolution of the PWM and now I get the full value range. I figured that my timer counter being 8192 is just a shift of 13 so I can avoid overflow by not shifting up 13 and down 10 but rather just do the entire calculation with an additional shift of 3.
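
A tiny sketch of that shift trick (my own paraphrase; the variable names are not from the firmware): with a 6.10 value and a PWM period of 8192 counts, multiplying by 8192 and dividing by 1024 collapses into a single left shift by 3, so the intermediate never needs more than 16 bits.
Code: [Select]
#include <stdint.h>

/* ratio_6_10: 0..1024 represents 0..100% duty in 6.10 fixed point */
uint16_t pwm_from_ratio(uint16_t ratio_6_10)
{
    /* (ratio * 8192) >> 10 would overflow 16 bits for ratio >= 8;
       ratio << 3 gives the same result without the overflow. */
    return ratio_6_10 << 3;
}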

The ADC on the face of it is a lost cause, it is only 10bit accuracy so I only get a change every 8 values of the PWM. Rick, I'd be interested to hear about your ADC averaging, I started to think about it and wonder if it's worth spending time to do it. I currently take a snapshot of the ADC and need to average it out anyway but originally wasn't planning to use fractions in that average.

I did some code size reductions and the code is now at 7860 bytes. It was at 8100 beforehand and got down to 7600 and then the new PWM accuracy took more space. I kinda like the serial text interface but not sure how long I can hold on to it in its current verbose state.
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
I figured a way to get the full 13 bit resolution of the PWM and now I get the full value range. I figured that my timer counter being 8192 is just a shift of 13 so I can avoid overflow by not shifting up 13 and down 10 but rather just do the entire calculation with an additional shift of 3.

The ADC on the face of it is a lost cause, it is only 10bit accuracy so I only get a change every 8 values of the PWM. Rick, I'd be interested to hear about your ADC averaging, I started to think about it and wonder if it's worth spending time to do it.
...
...

Hey, I am actually glad you asked.  This may be re-inventing the wheel, but I was so pleased with myself, as a newbie, figuring out a good way to do this...

First, here is a percent-error graph so you can decide if this is worth the effort - and it will be a good amount of work!!!  The graph measures a capacitor discharging from 5V, first with an UNO using the ATMEGA uncompensated, then measured again using my board with my compensation on the ranges of 5V and below.   %Error is relative to the UT61E reading of the voltage.  The black trace is the uncompensated ATMEGA; the different colored plots are the different ranges I have (at different op-amp multipliers or resistor dividers).  By the way, I call it my "DinoMeter" because I was using a dinosaur-era laptop for data logging.

139733-0

This was first posted a year ago when I was even more of a newbie than today:
https://www.eevblog.com/forum/microcontrollers/an-implementation-of-atmega328-volt-logger-the-dinometer-a-learning-project/msg316233/#msg316233

Since the STM has only 8K of flash, I am not sure it has enough room to implement this method.

In a nutshell, the method is to store the real-world (DMM-measured) voltage and use the ADC count as an index into a lookup table.  So, apart from LSB jitter, once the program gets the ADC reading, it looks up in the table the stored DMM-measured voltage for that ADC reading.  In theory, as long as the ADC is consistent, you get the right voltage to within math accuracy.

Implementation is of course limited by RAM.  To reduce the RAM usage, rather than storing 1024 readings for the 10 bit ADC, I store about 400 readings at 16 bits each.

At low ADC counts, the response is least linear, so compensation entries have to be frequent.  At higher ADC counts, the reading is more linear, so compensation entries can be less frequent.  The table entries are continuous (1 entry per ADC count) up to 250, and then 1 entry per 5 ADC counts above that, totaling 250+156 readings.

At ADC>1020, I have 2 ADC counts per entry to make it easier to deal with overflow.  Reducing 400 entries to about 250 is feasible and (if memory serves) still gives rather decent accuracy.

That totals 406 entries per range - each range being a different voltage divider and/or a different op-amp multiplier.  In my implementation I had 5 divider and op-amp multiplier choices, so my "DinoMeter" has 5 voltage ranges of collection for each ADC.

With that many ranges, I have to reduce the RAM need further.  I use 16 bit entries instead of a long or a float giving me ~ 4 digit math-accuracy.  I make use of the fact that (say for 5V range)

ADC_Translated voltage = ADC*5V/1023
should not be magnitudes off from DMM reading, the factor should be near 1.

If DMM reading is twice the ADC_translated voltage, the factor is 2.
If DMM reading is half the ADC_translated voltage, the factor is 0.5.

Now I have a much smaller magnitude number to deal with than if I stored the exact DMM reading, which could range from mV to 30V over my ranges.  I use a 16-bit unsigned int with an implied decimal:
65535 => 6.5535, DMM reading is 6.5535 times the translated voltage
20000 => 2.0000, DMM reading is twice ADC_translated voltage
05000 => 0.5000, DMM reading is half the ADC_translated voltage

So, if the ADC is perfectly consistent, I get back exactly the DMM reading I stored when I calibrated the reading for that ADC - within math-accuracy.

This arithmetic method gives me 4-digit math accuracy with 16-bit numbers, and any op-amp/voltage-divider non-linearity issues are already taken into account, since the correction makes the result exactly what the DMM was reading, within math accuracy.

To account for LSB +- digit jitter, I collect multiple readings and average them.  If it averages to ADC=123.4, I interpolate the reading between ADC=123 and ADC=124 for ADC=123.4.

I was using a time-based average (average over 250ms) versus a count-based average.  I don't recall how many ADC conversions the ATMEGA can do in 250ms; I think it was around 70 for most of my readings.

The collection of the calibration data is not difficult after I wrote the data logger.  I even made sure it skips a reading when the UT61E changes range as it has a habit of giving wild reading during range change.

I use a capacitor discharge and let it sweep the entire range slowly.  The discharge must be slow enough that I get at least 20 readings for each ADC count.  A JAVA program reads the ATMEGA and the UT61E.  I let it run overnight (and longer), the JAVA program fills my JAVA array [1024] for that range.   Now I know the average UTE reading for each ADC count.  An Excel spreadsheet takes the export of the array in text and automatically converts the five arrays of data into code which I insert after the PROGMEM:

// the low is for the low adc reading from 0 to 248
extern PROGMEM   prog_uint16_t CalDataLow[4][5][249] = {
{  // Start of A0
   // Start of A0 x-24
// Dataset Series number A5C1  (-24X check sum 1015.67648024476 TableSum 4009641)
 {  0,// A0@-24:1  (691231.190000)
  0,// A0@-24:2  (691231.190000)
  0,// A0@-24:3  (691231.190000)
  5612,// A0@-24:4  (131123.094922)
  9513,// A0@-24:5  (131123.094922)

  9962,// A0@-24:246  (131123.042149)
  9961,// A0@-24:247  (131123.042040)
  9961,// A0@-24:248  (131123.041933)
  9961},// A0@-24:249  (131123.041828)




And so on.  You can see that for my Arduino A0, with the op-amp/voltage divider at -24x, when ADC0=246 the correction factor is 9962.
So, the real-world voltage from the DMM was  (246*(-24V)/1023) * (9962/10000)


//
// the high is for ADC reading 250 and up, 1 entry per 5 adc reading
extern PROGMEM   prog_uint16_t CalDataHigh[4][5][156] = {
{  // Start of A0
   // Start of A0 x-24
// A5C1  (-24X check sum 1015.67648024476 TableSum 4009641)
{  9961,// A0@-24:250  (131123.041721)
  9961,// A0

  9222},// A3@-1x:1022  (131125.015115)
   // Start of A3 x1
// A5C1 (with 5*675K discharge)  (1X check sum 1036.06042214359 TableSum 4173638)
{  10065,// A3@1x :250  (131128.140130)
  10064,// A3@1x :255  (131128.135438)
  10063,// A3@1x :260  (131128.134753)
  10060,// A3@1x :265  (131128.134113)
  10058,// A3@1x :270  (131128.133444)
  10059,// A3@1x :275  (131128.132815)
  …
For Arduino A3 when op-amp/voltage divider is at 1x with average ADC3= 272.15, my DMM voltage would be:
DMM_volt at 270 = (270*5V/1023) * (10058/10000)   call this V270
DMM_volt at 275 = (275*5V/1023) * (10059/10000)   call this V275
DMM_volt at 272.15 is 2.15 above the entry at 270 and below the next entry at 275

So, V 272.15 = V270+(V275-V270)*2.15/5
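
Here is a simplified sketch of that lookup-and-interpolate step (my own condensation of Rick's description, not his actual code); it assumes a single 5V range, a table holding the x10000 correction factors for ADC counts 250, 255, 260, ..., and an averaged ADC reading expressed in hundredths of a count (e.g. 27215 for 272.15):
Code: [Select]
#include <stdint.h>

extern const uint16_t cal[];   /* cal[0] is the factor for ADC count 250 */

/* Real-world millivolts for one exact ADC count (>= 250), per the formula
   above: mV = (adc * 5000mV / 1023) * (factor / 10000).  A 64-bit
   intermediate keeps the precision without overflow. */
static uint32_t mv_at(uint16_t adc)
{
    uint16_t factor = cal[(adc - 250) / 5];
    return (uint32_t)((uint64_t)adc * 5000u * factor / 1023u / 10000u);
}

/* Interpolate between the two surrounding table entries, assuming the
   voltage rises monotonically with the ADC count. */
uint32_t mv_interpolated(uint32_t adc_x100)       /* e.g. 27215 == 272.15 */
{
    uint16_t lo = (uint16_t)(adc_x100 / 100) / 5 * 5;   /* 272.15 -> 270 */
    uint16_t hi = lo + 5;                               /*        -> 275 */
    uint32_t v_lo = mv_at(lo);
    uint32_t v_hi = mv_at(hi);
    uint32_t frac_x100 = adc_x100 - (uint32_t)lo * 100; /* 215 == 2.15 */
    return v_lo + (v_hi - v_lo) * frac_x100 / 500;      /* divide by 5.00 */
}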

As to how well (or not) the compensation does, you can see the data I collected about a year ago.   I actually doubt you have enough RAM to do it at the same level.  I can dig up my Excel sheet and do an analysis of what happens if you cut the table down to half:

Say every other ADC entry up to 248, then one entry for every 10 ADC counts after that.

If I recall, going down to 256 total entries per range was still giving me good numbers in the Excel analysis - I did the analysis by taking the real value for the skipped entries and comparing it against the interpolated value for those entries; the delta is the error penalty for skipping.  But when I was done with the program I still had room left, so I expanded the table size to use up as much of the remaining flash space as possible.

Rick
« Last Edit: March 03, 2015, 03:25:51 am by Rick Law »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
@Rick, that's very interesting. You are able to get a very high accuracy with that system but it will be unmanageable for this project since any user will need to do such calibration and not everyone has a serial logging multimeter.

I just checked and if I take 64 samples and divide the sum of the samples by 8 I get pretty good results with about 0.5% error and I get what seems to be 13 bit accuracy for the ADC which matches the PWM accuracy without a measurable impact on latency of the measurement (it is well below one second to measure all three different values: vin, vout, cout).

The accuracy above is for a linear approximation with full accuracy of the PC, what will be the accuracy for the fixed point calculation I do not know.
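
A minimal sketch of that averaging scheme (my own illustration; adc_read_raw() is a generic placeholder, not a b3603 function): sum 64 raw 10-bit conversions and divide by 8 to get a 13-bit-scaled result. Whether the extra bits are meaningful depends on there being enough noise to dither the LSB.
Code: [Select]
#include <stdint.h>

extern uint16_t adc_read_raw(void);   /* one 10-bit conversion, 0..1023 */

uint16_t adc_read_13bit(void)
{
    uint32_t sum = 0;
    uint8_t i;
    for (i = 0; i < 64; i++)
        sum += adc_read_raw();
    return (uint16_t)(sum >> 3);       /* 64 samples / 8 -> 0..8184 */
}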
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
@Rick, that's very interesting. You are able to get a very high accuracy with that system but it will be unmanageable for this project since any user will need to do such calibration and not everyone has a serial logging multimeter.

I just checked and if I take 64 samples and divide the sum of the samples by 8 I get pretty good results with about 0.5% error and I get what seems to be 13 bit accuracy for the ADC which matches the PWM accuracy without a measurable impact on latency of the measurement (it is well below one second to measure all three different values: vin, vout, cout).

The accuracy above is for a linear approximation with full accuracy of the PC, what will be the accuracy for the fixed point calculation I do not know.

Yeah, my method is essentially an attempt to remember, and then reproduce, the readings of a second, presumably higher-accuracy device.  It requires a lot of work and, of course, a "higher accuracy device" to calibrate against.

As to: "The accuracy above is for a linear approximation with full accuracy of the PC, what will be the accuracy for the fixed point calculation I do not know."

You may know this already, but just in case: change it to mV first to get better math accuracy.

One thing I found with various ATMEGA experiments is that I am better off using LONG, and perhaps even LONGLONG (64-bit), integers: multiply by an implied-decimal factor, do the math, round, then scale the implied decimal back out and cast back down.  It will be faster and more accurate.

Let me try to show exactly what I mean:
//
// mV instead of V: now I have 3 more digits of accuracy than with volts.
// For 4 digits, 0.01V is really a centivolt.
//
unsigned long sumOfAdc;
uint8_t       sampleCount;     // up to 100 samples without overflow
//
// This whole thing could be one statement; it is broken out to show the steps.
int8_t rangeV = 40;  // here I am pretending 40V is the range
unsigned long milliVolt = (1000UL * rangeV * sumOfAdc / 1023UL) / sampleCount;
unsigned long centiVolt = (milliVolt + 5) / 10;    // now scale it to centivolts with rounding
unsigned int  uintCentiVolt = (unsigned int) centiVolt;
//
// here is the one-line version
// uintCentiVolt = (unsigned int)(((1000UL * 40UL * sumOfAdc / 1023UL) / sampleCount + 5UL) / 10UL);

//
// note: unsigned long max = 4,294,967,295
// in the line: (1000 * 40 * sumOfAdc / 1023) / sampleCount
// 100 ADC samples is 100*1023 max, so the max value before the divide is 1000*40*(1023*100);
// lucky for us, this max = 4,092,000,000, which just fits in an unsigned long.

In one of my experiments, I did the implied decimal not by 1000 but by 100000, and used LONGLONG for the last multiply and from the first division onward; I switched to LONG as soon as I knew the number would fit in a LONG for the rest of the divisions.  Even with LONGLONG it was still faster than floating point, and with less loss of accuracy.
« Last Edit: March 03, 2015, 10:33:54 pm by Rick Law »
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Since my controller board LED went dark, I am forced into action.

I got an Arduino to do the controlling function.  I reprogrammed the PWM timers to do 13 bits, matching the stock controller's 1.94-ish kHz PWM, but at 5V, which I have to scale down to the stock controller's 3.3V  (good thing I did the measurement for @Flex - now I find that data useful too with my unit gone dark).

For now I am using the ADS1115 (15-bit ADC).  I am reusing code fragments from my other projects to set/display values, so the Arduino doesn't understand the B3603 functions.  Instead of setting voltage or current, I set the PWM count directly via a 5-tac-switch keypad.  Bottom line: it works, but not well.

It is controlling the voltage at 13-bit resolution, and the 15-bit ADC displays it correctly with V=aX+b, where the slope and intercept are hard-coded right now.  The voltage looks good... but:

The "not well" part is that it has an oscillation I have to hunt down.  Small (+-6mV) but measurable: my DMM cycles through values such as 12.345V to 12.351V.  This oscillation is way smaller than the typical noise level, but both the DMM and the ADS1115 measurements show a clear oscillation that was not seen with the stock controller.

This could be because I am not doing the PWM right.  I am trying to learn the ins and outs and figure things out myself, so I am not posting what I did (yet), to avoid suggestions for now.

I have some ideas about where my oscillation may have come from.  As to whether I can solve it, and how soon...  Also, I think I have to wait for a working unit to compare against - I am seeing more noise than I remember, but until I have a working unit to cross-check...

Rick

EDIT:

My suspicion was proven correct.  My LCD screen is updated every 300ms, and during the update serial data is also sent.  This is what caused the oscillation - either the LCD, the serial, or both.  Change the update timing and the frequency of the oscillation changes.  I can set the update to 10 seconds, and it stays stable for most of the 10 seconds until the end, when the refresh occurs.

I must say, I feel a little silly tracking down a small (+-6mV) oscillation on top of noise that is ten times the size; it just bugs me to see the DMM doing a count-down when measuring something that is supposed to be fixed.
« Last Edit: March 05, 2015, 12:50:05 am by Rick Law »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
The nice thing about the STM8 is that it does the PWM completely in hardware.  It has specific pins it can control like that, and all I needed to do was tell it how much to count (8192) and what to compare against (the calculated value); it goes off and does that, and then it doesn't matter what I do in my main loop - the PWM is not affected.
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
The nice thing about the STM8 is that it does the PWM completely in hardware.  It has specific pins it can control like that, and all I needed to do was tell it how much to count (8192) and what to compare against (the calculated value); it goes off and does that, and then it doesn't matter what I do in my main loop - the PWM is not affected.

The ATMEGA also has one 16 bit timer doing the PWM - not sure if it is direct hardware or micro-code inside the MCU.  Two of the Arduino PWM pins use that 16 bit timer, the rest has 8 bit timer.  I am using the pins with the 16 bit timer to create the 13 bit PWM.  So in theory, after I set the registers and tell it to go, it should also be set and forget.

I think the PWM is counting fine but the wave form is affected.  There could be jitters that I cannot observe with my scope.  It could also be because of the other activity (power draw), it is not forming the wave well - such as a slight drop in peak or a slight increased rise time.

Then again, it may be I am totally wrong with how I handle the PWM at the register level.  As I said, I am trying to learn how to modify the PWM beyond Arduino's 8bit pwm with AnalogWrite.

I suspect the Serial.print more because I have seen Serial activity having a negative impact on ADC with poor USB power.  Even while I am using external power, the added activity there could have power draw that made the impact.  It could also be the I2C activities with the LCD screen.  It could be either or both.

Instead of using a resistor divider to drop the 5V PWM to 3.3V for the B3603, I switched the resistor divider to a resistor+zener.  I was hoping it was simply a peak issue; had it been that, the zener should kill it by setting a lower peak for everything.  The zener approach cuts my delta a bit - I think.  (I'm not sure if it is purely due to power conditions changing from yesterday, such as the washer not running.)

I am not sure if there is anything I can do yet, I will have to chew on that.  May be altering the timing of how I do the serial and the i2c.

This really is silly, however.  The regular noise level is at least 10x larger, and that 6mV is now down to 3mV or 4mV.  I feel silly worrying about it.  But it is so annoying to see what is supposed to be a fixed voltage just constantly counting 1.232, 1.233, 1.234, 1.235, 1.232, 1.233, 1.234, 1.235, 1.232 .....

It occurs to me that this would bore anyone to death...  I am going into details that I don't think anyone cares about...  But since I already typed it, I might as well post it.
« Last Edit: March 05, 2015, 07:13:57 am by Rick Law »
 

Offline Asim

  • Regular Contributor
  • *
  • Posts: 171
I am so tempted to rework my portable rechargeable power supply based on B3606 and do my own top board now.

https://www.eevblog.com/forum/projects/my-portable-rechargeable-power-supply-(mod)/msg562574/

above is a post I created sometime ago showing off the power supply
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
I am so tempted to rework my portable rechargeable power supply based on B3606 and do my own top board now.

https://www.eevblog.com/forum/projects/my-portable-rechargeable-power-supply-(mod)/msg562574/

above is a post I created sometime ago showing off the power supply

re: "I am so tempted to rework my portable rechargeable power supply based on B3606 and do my own top board now."

Are you planning on using the stock STM or an ATMEGA/Arduino?
 

Offline Asim

  • Regular Contributor
  • *
  • Posts: 171
I am so tempted to rework my portable rechargeable power supply based on B3606 and do my own top board now.

https://www.eevblog.com/forum/projects/my-portable-rechargeable-power-supply-(mod)/msg562574/

above is a post I created sometime ago showing off the power supply

re: "I am so tempted to rework my portable rechargeable power supply based on B3606 and do my own top board now."

Are you planning on using the stock STM or an ATMEGA/Arduino?

An Arduino.  What I am thinking about is an OLED display to show Iset, Vset, Vout, Iout and, most importantly to me, the battery charge %, as I want this to be a portable rechargeable device, so the charge % of the lithium battery pack is very important.  My problem is software, though (not that great at coding; I consider myself a hacker, not a coder).
I would use push buttons for the controls.
I am planning on using a 16-bit DAC & ADC to get results as accurate as possible, on a double-sided PCB with the SMD components on the bottom (ATmega328, DAC, ADC); the top side would only be for the user interface.

That's the plan anyway.  I am busy at the moment, but I will get into it.
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
...I got an Arduino to do the controlling function...
...The "not well" part is: it has an oscillation that I have to hunt down.  Small (+-6mV) but measurable that my DMM is cycling such as 12.345V to 12.351V.  This oscillation is way smaller than the typical noise level, but both the DMM and the ADS 1115 measurements show a clear oscillation there that was not seen with the stock controller...
...I must say, I feel a little silly tracking down a small (+-6mV) oscillation on top of noise that is ten times the size; it just bugs me to see the DMM doing a count down when measuring something that is suppose to be fixed.

Finally, got that licked...  My controller (on breadboard) is as stable as stock. 

After confirming my initial suspicion that the oscillation might be screen-refresh related, I narrowed it down and then identified the problem.  All this time I thought it would be the Serial.print, but it was the 20x4 LCD's power consumption.

During a screen refresh, the many lcd.print calls were affecting the power level enough to make the PWM ever so slightly different than normal.  Consequently, the V-Set-In is affected, resulting in a V-out change.  A mere 12mV-ish (+ and - 6mV), but it was bugging the daylights out of me...

Now that narrows my choice of display if I am going to make a new top-board.  But I am pleased that if I do decide to make one, as I likely will, it can be as stable as the stock one.
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
I am so tempted to rework my portable rechargeable power supply based on B3606 and do my own top board now.
...
...Are you planning on using the stock STM or an ATMEGA/Arduino?

An Arduino.
...My problem is software, though (not that great at coding; I consider myself a hacker, not a coder).
...I am planning on using a 16-bit DAC & ADC to get results as accurate as possible, on a double-sided PCB with the SMD components on the bottom (ATmega328, DAC, ADC); the top side would only be for the user interface.
...

A 16-bit ADC/DAC would be awesome.  I was considering using my ADS1115, which is really 15-bit but advertised as 16: it really provides 15 bits of counts plus a sign bit for differential measurements.

Keep us posted.  It sounds very interesting.
 

Offline Asim

  • Regular Contributor
  • *
  • Posts: 171
@ rick, what is your setup? Which Arduino are you using? Are you using any external DAC or ADC?
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
@ rick, what is your setup? Which Arduino are you using? Are you using any external DAC or ADC?

Right now it is still on breadboard.  Initially it was on a bare ATMEGA made to behave like an UNO.  As of today, I switched to a NANO V3 (CH340G, not V3 with FT232RL which I also have but with connectors not suitable for breadboard).  The switch is because I wanted to get things physically closer to avoid noise as I hunt for the oscillation.  NANO is compact enough to allow me to cut out a lot of wire length at least for now.

I am using ATMEGA for everything except ADC.  I want to see how far I can go before I finalize the components.  For ADC I am using the Adafruit ADS1115 to test how far I can go on the sense-side.  I know I wont be happy with the ATMEGA ADC.  I would use ATMEGA ADC if other things are less well than expected - in that case I would not waste a good ADC on something unsatisfactory.

For PWM, I am bypassing the Arduino standard PWM and manipulating the ATMEGA's PWM registers directly so I can use the 16-bit Timer1.  So, I am using Arduino pins 9 and 10 on Timer1; a sketch of that setup is below.
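
For reference, a minimal sketch of the kind of Timer1 setup being described (my own reconstruction, not Rick's actual code): fast PWM with TOP = 8191 gives a 13-bit duty range at 16MHz / 8192 ≈ 1.953kHz on pins 9 and 10.
Code: [Select]
#include <Arduino.h>

void setupPwm13(void)
{
  pinMode(9, OUTPUT);    // OC1A
  pinMode(10, OUTPUT);   // OC1B

  // Fast PWM, mode 14 (TOP = ICR1), non-inverting outputs on OC1A/OC1B
  TCCR1A = _BV(COM1A1) | _BV(COM1B1) | _BV(WGM11);
  TCCR1B = _BV(WGM13) | _BV(WGM12) | _BV(CS10);   // prescaler 1
  ICR1 = 8191;                                    // 13-bit TOP -> ~1.953kHz
}

void setPwm13(uint16_t duty)   // 0..8191
{
  OCR1A = duty;                // pin 9; use OCR1B for pin 10
}

void setup()
{
  setupPwm13();
  setPwm13(4096);              // ~50% duty as an example
}

void loop() {}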

Might as well touch on this: with the change away from the 2004 LCD, I am using a Nokia 84x48 monochrome LCD - graphical, but doing text only.  Very fast, but it uses more RAM than I'd like.

My software is code fragments from different prior projects - not design for this at all.  For example, to set CC or CV, my code merely allows me to enter integer(s) via a 5-key tac-switch keypad.  So I enter selected voltage as PWM timer count - awkward but works during this feasibility study phase.

In the final implementation, I may switch back to "home-made-UNO" with optical isolator for the TX/RX and an off-board USB on the other side of the isolator.  When I was collecting data for current shunt temperature, I observed what may be a problem caused by this set up:
- Two ATMega (or Arduino) connected to one PC to collect combine data
- one ATMega connected to an external device powered by the B3603
- another ATMega tied to the ground via the B3603 internal pins
When I was pulling around 1-2A, the difference between the external ATMega's ground and the internal-pin-grounded ATMega's ground was large enough for me to see some negative volts where there shouldn't be any.  I am not sure this is a real problem yet, but I know to prepare for it.

This is my environment for now, but as you can imagine, I am still experimenting so it will change again.

Rick
« Last Edit: March 06, 2015, 05:20:23 am by Rick Law »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
@Rick, I know I'm interested in these thoughts of yours, feel free to share. I ramble on anyway :-)

I've switched to using mV and mA for measurement and it definitely improves the accuracy and also my grasp on things. I still use fixed point for the calculations. I got the PWM and ADC working again after this change and now need to figure out the calibration. There is some code to test the calibration with values I measured, for now I don't know what to set the expected values to. Still need to work on that.

Help and advice on the code is always appreciated!
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
...
 and now need to figure out the calibration. There is some code to test the calibration with values I measured, for now I don't know what to set the expected values to. Still need to work on that.

Help and advice on the code is always appreciated!

You are ahead of me.  I am still at the (last step of) feasibility study part - confirming that I can control with my own controller board.  Now that I think it is working on a breadboard, I need to transfer it to a proto-board (soldered) so it is stable enough for software development.

I have not put much thought into calibration yet.  My gut feeling is that while the stock software's method seems good (with an improved UI), the thing is not going to be so linear that calibration at two points will do it.  Two points assume perfect linearity, which is doubtful.  But I have not yet collected the data to tell me how many points would be adequate - perhaps the thing is linear enough for a straight-line approximation.

Once I get my ATmega based controller board, I will join you in the quest for a good calibration method.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I've just taken data I collected previously; it includes the ADC value and what the multimeter read (a Mastech MS8250B).  The data is in fact almost perfectly linear.  There are obviously some errors, but the linear approximation is very strong and continuous throughout the range.  I attached it here so you can play with the same data as well.

I also attach a graph of the ADC value at the X axis, the measured voltage in Y axis and a green line of the linear function: ((0.0056395173454*x)-0.593524886878)
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
One thing to note is that different point pairs will generate different linear functions, and some may have more error than others. The right way to solve this is to use more than two pairs - probably at least five, and probably an optimal ten - and use a least-squares estimate to find the line with the least error over all collected points.

The problem with this is that it is time consuming and takes quite a bit of effort, both from the operator and in the code. I would easily get into overflows and may even trigger watchdogs. I think I'll leave that to the user to perform, so I will support the two-point calibration for those who don't mind the accuracy too much, and if someone wants full accuracy I'll provide an external script that takes all the different measurements, does the right thing, and feeds the final answer to the device.

In fact, it should be possible to write yet another script that communicates with the B3603 alternative firmware and with a serial-capable multimeter and performs the calibration completely automatically. It could even measure the full range and use all data points to get the lowest possible error.
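
For reference, the least-squares fit itself is only a handful of lines; here is a PC-side sketch (my own illustration, not the external script baruch mentions, with made-up example numbers that roughly follow the linear function he posted earlier):
Code: [Select]
#include <stdio.h>

/* Fit V = a*adc + b minimising the squared error over n (adc, volt) pairs. */
void fit_line(const double *x, const double *y, int n, double *a, double *b)
{
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (int i = 0; i < n; i++) {
        sx  += x[i];
        sy  += y[i];
        sxx += x[i] * x[i];
        sxy += x[i] * y[i];
    }
    *a = (n * sxy - sx * sy) / (n * sxx - sx * sx);   /* slope */
    *b = (sy - *a * sx) / n;                          /* intercept */
}

int main(void)
{
    double adc[]  = { 200, 400, 600, 800, 1000 };        /* example points */
    double volt[] = { 0.53, 1.66, 2.79, 3.92, 5.05 };
    double a, b;
    fit_line(adc, volt, 5, &a, &b);
    printf("V = %f * adc + %f\n", a, b);
    return 0;
}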
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
One thing to note is that different point pairs will generate different linear functions, and some may have more error than others. The right way to solve this is to use more than two pairs - probably at least five, and probably an optimal ten - and use a least-squares estimate to find the line with the least error over all collected points.

The problem with this is that it is time consuming and takes quite a bit of effort, both from the operator and in the code. I would easily get into overflows and may even trigger watchdogs. I think I'll leave that to the user to perform, so I will support the two-point calibration for those who don't mind the accuracy too much, and if someone wants full accuracy I'll provide an external script that takes all the different measurements, does the right thing, and feeds the final answer to the device.

In fact, it should be possible to write yet another script that communicates with the B3603 alternative firmware and with a serial-capable multimeter and performs the calibration completely automatically. It could even measure the full range and use all data points to get the lowest possible error.

You are describing exactly what my "gut feeling" is: it needs more than 2 points.  I will do a more detailed analysis with your data.  I am guessing (we will see after the analysis) that at the low end I probably need more points.  I am already biased (from having worked on a 4-channel ATMEGA volt logger), so I am thinking 8 data points with these separations:
- 4 lines for bottom 1/4 of ADC 0, 1/16, 2/16, 3/16, and 4/16
- 2 lines for 1/4 to 1/2 of ADC   4/16, 6/16, and 8/16
- 1 line for the top half of ADC   8/16 to 16/16
8 data points total (data points at 1/4 and 1/2 are used twice)

If the math is fast enough, I would borrow the o'scope's well-tested sin(x)/x formula; I will be testing that out for speed first.  I also suspect another good way is to estimate the slope at adc=x as:
slope_at_x = n+(m/x)    [where n and m are constants, and x is the ADC count]
Someone who does ADC design work would probably know this off the back of his head, but I will have to test it experimentally by doing some analysis with the data.

As to actual calibration, it should be something that is done rarely.  So I would trade troublesome for accuracy any day.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
We are not talking about the same thing.

You are talking about splitting the range into several regions and letting each one use a different function.  I showed in the attached graph that a single function is a very close approximation over the entire range.

My point about the multiple points is intended to help find that one perfect line that is closest to all points at once.

EDIT: I seem to have initially misread your comment.  It does talk about different points on the line; I somehow misread it to be like the ADC work you wrote about previously, where you used different functions for different regions.
« Last Edit: March 07, 2015, 08:05:59 pm by baruch »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I've just run an experiment where I took the full data, calculated the optimal a & b values from it with LSE, and then chose, 20 times, a set of 10 points out of that data (600 points) and calculated the LSE result for each.  The results are pretty good, so 10 points would be plenty for good accuracy.  Whether or not you spread them evenly around, you already get the main benefit of eliminating the noise from the equation.

Of course, the more points you take the lower the error, but unless you have it all automated, having more than 10 points is going to be very cumbersome and will gain you at most a 10mV improvement.
 

Offline Asim

  • Regular Contributor
  • *
  • Posts: 171
I wasn't planning to post my progress, but I thought let's revive the post (wouldn't hurt).

The main reason why I want to make another top board is to replace the user interface, which I am not liking.  So I decided to tackle the project from a different angle: figure out the user interface first, then concentrate on the ADC, DAC and calibration (the angle that Rick Law & baruch are tackling).

My user interface contains:
1) 128x64 OLED display (SPI)
2) ONE rotary encoder with a built in push button
3) Push button for Power on/off

Currently I have a smaller OLED display, so I am squeezing everything in as seen in the picture; this will be solved with the bigger display.
The ONE rotary encoder is used to control the set voltage and current.  If the built-in push button is pressed for 1 second (a long press), it switches to controlling the other parameter (voltage or current).  If the push button is pressed briefly (a short press), it changes the multiplier used to increment/decrement the set parameter.
 10^-3 V in the picture means each movement of the encoder will increment/decrement the set voltage by 0.001 V.
If 10^-1 A is showing, each movement of the encoder will increment/decrement the set current by 0.1 A.

So this number shows me two things: 1. what I am controlling, and
2. what increment/decrement each movement of the encoder equals (1, 0.1, 0.01, 0.001).

The system will have a push button for switching the output on/off, and I found the perfect push button for the job (shown below):
it has a built-in LED, so when the output is on the LED will be on, and vice versa.  So the button will serve as an indicator too!
 
The % shown in the top right corner tells you the charge% ( as I said in another post. this is to be a portable power supply).

There are still more things to figure out, I tackled the user interface because it motivates me + I am waiting for the ADC+DAC to arrive :-DD

As always, open to suggestions  ^-^

---------

Any news from the other guys?
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Is your OLED dot addressable?  Does it use RAM for the display buffer?

I am stalled by two things I want to figure out:

1. I am using the Nokia 5110 type display.  Uses RAM for display buffer and SPI.  SPI is what I want but the RAM buffer is squeezing my RAM needs.

2. Physical config - I want to use the Nano (at least for now) and the Adafruit ADS1115 (16 bit ADC).  But I just can't get that
    onto a single top board.  I am going to have too much overhang, making the air flow even worse.

So, I am playing with that.  I have a first try which I am (will be) soldering up shortly.

By the way, I decided to use push buttons only - I tried adjusting numbers with just four tac buttons, using OK and SET to double-duty as FAST and SLOW.   When I hold UP or DOWN, it auto-repeats at a 1-second interval.  Each FAST/SLOW press while UP/DOWN is held doubles or halves the speed.  I found that to work rather well.  This allows me to skip the encoder, for physical-space reasons.

 

Offline Asim

  • Regular Contributor
  • *
  • Posts: 171
Is your OLED dot addressable?  Does it use RAM for the display buffer?

I am stalled by two things I want to figure out:

1. I am using the Nokia 5110 type display.  Uses RAM for display buffer and SPI.  SPI is what I want but the RAM buffer is squeezing my RAM needs.

2. Physical config - I want to use the Nano (at least for now) and the Adafruit ADS1115 (16 bit ADC).  But I just can't get that
    onto a single top board.  I am going to have too much overhang, making the air flow even worse.

So, I am playing with that.  I have a first try which I am (will be) soldering up shortly.

By the way, I decided to use push buttons only - I tried adjusting numbers with just four tac buttons, using OK and SET to double-duty as FAST and SLOW.   When I hold UP or DOWN, it auto-repeats at a 1-second interval.  Each FAST/SLOW press while UP/DOWN is held doubles or halves the speed.  I found that to work rather well.  This allows me to skip the encoder, for physical-space reasons.

It does:  http://www.adafruit.com/product/326.
An Arduino Pro Mini would be my solution for the space; it is a NANO minus the FTDI.  The rotary encoder will hang at 90 degrees off the top board, so it doesn't take space and won't block the OLED view.  The encoder is a must for me; it caused me a lot of problems, but all are solved now.  Plus, in a single movement I can increment 0.001 V or 1 V (faster & less hassle).

BTW, are you using a library for the Nokia display?  If yes, some libraries are faster/take less space than others.
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Is your OLED dot addressable?  Does it use RAM for the display buffer?

I am stalled by two things I want to figure out:

1. I am using the Nokia 5110 type display.  Uses RAM for the display buffer and SPI.  SPI is what I want but the RAM buffer is squeezing my RAM needs.

2. Physical config - I want to use the Nano (at least for now) and the Adafruit ADS1115 (16 bit ADC).  But I just can't get that
    onto a single top board.  I am going to have too much overhang, making the air flow even worse.

So, I am playing with that.  I have a first try which I am (will be) soldering up shortly.

By the way, I decided to use push buttons only - I tried adjusting numbers with just four tac buttons, using OK and SET to double-duty as FAST and SLOW.   When I hold UP or DOWN, it auto-repeats at a 1-second interval.  Each FAST/SLOW press while UP/DOWN is held doubles or halves the speed.  I found that to work rather well.  This allows me to skip the encoder, for physical-space reasons.

It does: http://www.adafruit.com/product/326.
An Arduino Pro Mini would be my solution for the space; it is essentially a Nano without the FTDI.  The rotary encoder will hang at 90 degrees from the top board, so it doesn't take space and won't block the OLED view.  The encoder is a must for me; it caused me a lot of problems but all are solved now.  Plus, in a single movement I can increment 0.001 V or 1 V (faster and less hassle).

BTW, are you using a library for the Nokia display?  If yes, some libraries are faster / take less space than others.

Yeah, I am using a modified version of the GFX library.  The modification was to support those little tools I have that let me do quick tests, like an "integer input" utility which I am using for testing.

I am running out of RAM now, and I expect to need to strip things down as much as I can just to fit into the 32K of flash as well.

Once my prototype works well, I may turn the 2x8 pins upside down, so the bottom board will have pins sticking downward to plug into a bottom-bottom board with my stuff.
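For anyone wondering where the RAM goes: the 5110 is 84x48 at 1 bit per pixel, so a full frame buffer is 504 bytes - roughly a quarter of the Nano's 2K of SRAM before any of my own code runs.  A bare-bones example with the stock Adafruit PCD8544/GFX combination (pin numbers are placeholders; my own library is a modified version of this):

Code: [Select]
#include <SPI.h>
#include <Adafruit_GFX.h>
#include <Adafruit_PCD8544.h>

// 84 x 48 / 8 = 504 bytes of SRAM just for the frame buffer.
// Software-SPI constructor: CLK, DIN, DC, CS, RST (example pins only).
Adafruit_PCD8544 display(13, 11, 5, 4, 3);

void setup() {
  Serial.begin(9600);
  display.begin();
  display.clearDisplay();
  display.print(F("Vout"));              // F() keeps constant strings in flash, not SRAM
  display.display();
  Serial.println(F("504-byte display buffer is the big SRAM cost"));
}

void loop() {}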
 

Offline Asim

  • Regular Contributor
  • *
  • Posts: 171
I see, I have a question though, is using the display(SPI) + ADC(I2C) causing any speed issues?
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
I see, I have a question though, is using the display(SPI) + ADC(I2C) causing any speed issues?

Since replacing the 20x4 I2C LCD with the 5110 on SPI, the oscillation noise of +-6mV is gone.  That extra +6mV at the peaks and -6mV at the valleys is removed even with the I2C ADC running.

The I2C ADC could still be generating some noise.  If it is, it is buried in the base noise and not enough to create an extra peak/valley.  This base noise is current-dependent and seems higher than I recall.  I have a new B3603 coming; at that point, I can check whether anything else went wrong when the LED died.


Classic example of not reading carefully - I answered about noise instead of about speed which is what you asked.

Depends on what you consider a speed issue.  The speed of the 5110 display on SPI or bit-bang beats the LCD2004 on I2C.  Recalling from prior detailed tests, the update speed for 4 lines of text is around 30ms for SPI and about 38ms for bit-bang, whereas the I2C LCD2004 was in the 40-50ms range.  So the SPI 5110 is a speed improvement for sure.

I am not too happy with the speed of the I2C ADS1115, however.  That is not unexpected, as I have been using this ADC for a while and know it is slower than I like.  I use the ADC in one-shot mode, expecting to implement auto-ranging for the PGA.  It takes about 10ms per ADC read on average, so reading all 4 channels takes about 40ms.  Since the actual time may depend on the value being read, and my numbers are all low (counts of about 300-2000 out of 32767), I expect it could be slower still.
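For reference, this is roughly how I am timing the one-shot reads; a minimal sketch using the Adafruit ADS1X15 library with its defaults (0x48 address, 128 samples/s data rate), not my exact code:

Code: [Select]
#include <Wire.h>
#include <Adafruit_ADS1X15.h>

Adafruit_ADS1115 ads;                    // 16-bit I2C ADC, default address 0x48

void setup() {
  Serial.begin(115200);
  ads.begin();
  ads.setGain(GAIN_ONE);                 // +/-4.096 V range; auto-ranging would change this per channel
}

void loop() {
  unsigned long t0 = millis();
  int16_t ch[4];
  for (uint8_t i = 0; i < 4; i++) {
    // One-shot conversion; the call blocks until the result is ready.
    // At the default 128 samples/s that is ~8ms per channel plus I2C overhead,
    // which lines up with the ~10ms/read (~40ms for all four) mentioned above.
    ch[i] = ads.readADC_SingleEnded(i);
  }
  Serial.print(F("4-channel scan: "));
  Serial.print(millis() - t0);
  Serial.println(F(" ms"));
  delay(500);
}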

« Last Edit: March 14, 2015, 12:53:28 am by Rick Law »
 

Offline cnc4less

  • Newbie
  • Posts: 3
Thank you Baruch for the great spirit you have shown in going through such a project and sharing your ideas.  Also Asim, I see very interesting efforts here - will you share your Arduino code with us?
regards
salim
 
« Last Edit: March 18, 2015, 07:17:06 am by cnc4less »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I battled with the calibration inside the STM8 for too long and decided to abandon it. Instead I now created a calibrate.py script that can be used with a serially connected multimeter to do automatic calibration. For now I implemented the Vout ADC and PWM control and it works very nicely. The Cout controls should follow once I get some 0.1R resistors to be able to generate meaningful load currents. Right now I only have a few 100R and the currents I can sustain with them are too low to make a meaningful calibration.

The calibration is within 0.01 volt of the requested voltage, and I guess I could get better if I added a few more decimal points to the calculation.  I'll try that later on.  The accuracy is better at the higher voltages (>4.5V) and not so great (within 0.02V) below that.

I was planning to add more features to the firmware, but I really have to get back to another project I'm doing with a friend, so I won't spend much more time on this; the unit now does exactly what I want from it, i.e. a serially controlled PSU.

Of course, if someone is up to adding features I'll happily take patches and merge requests, and I will continue to frequent this forum post.
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
I battled with the calibration inside the STM8 for too long and decided to abandon it. Instead I now created a calibrate.py script that can be used with a serially connected multimeter to do automatic calibration. ...

Baruch, it is interesting comparing notes with you since you are going the native STM route whereas I am trying for an Arduino based control board.

Yeah, serial-connected automation is the way to go...  I was just about to do that to update my hardcoded calibration when an old demon came back - I am reminded that when I collected data for the current sense resistor, I had some ground-level issues with the serial.

If I recall, RS232 is one of your primary objectives, so Part II below may interest you.

I. I have my Arduino-based control board working (on a breadboard and not yet ready to commit to solder).  My "calibration" is hardcoded.  I am still setting output voltage/current with the tac buttons by setting the PWM count directly.  I have a couple of issues to solve before I can think of committing it to a soldered proto-board.

I also had good luck increasing the PWM resolution - to 15-bit PWM.  I have Vout adjustment down to +-1mV with +-2mV stability.  For example, I set Vout to 11.010V and get Vout at 11.008 to 11.012V.  Of course there is the much bigger (60mV+) high-frequency noise that the DMM doesn't see.  I get this +-2mV stability down to Vout=20mV.  I have not tested it at >16V out yet.

I suppose if I wanted to, I could spend more time trying to reach the max PWM (16-bit).  Since +-1mV is good enough for me, I went on to fry bigger fish.

Now part II, USB/RS232 issues

II. I am left with two interesting issues; Bluetooth looks like a solution that may solve both.

II.1.  The built-in XL1509 outputs (around) +5V to the control board.  @Flex shows it as only 4.92V on his schematic; mine is about the same at 4.91V.  If I use that low voltage as +V for the Arduino, it is a bit low even when I bypass the VIN voltage regulator.  With such a low voltage, the presence or absence of USB affects my calibration.  I solved it with another power source of +7V to the Arduino VIN for now.

For example, at a particular PWM setting:
Without USB, Vout=11.061V
With normal USB, Vout=11.081V
(I made a custom 3-pin USB cable to test whether, with USB's +5V taken out, the voltage would go back to the no-USB value)
With USB but USB's +5V not connected, just GND, D+, D-: Vout=11.088V
This 0.02V delta holds at all the Vout values I managed to try.

With my +7V external source, the Arduino is supplied by its onboard regulator and I don't see this issue any more.  This could be related to II.2 or merely an independent problem due to low voltage.  Either way, Bluetooth may be a good solution for this as well.

II.2. Ground levels

Assume you have a device with RS232 powered by the B3603.  You cannot do RS232 with this device and with the control board at the same time.  In other words, you cannot communicate over RS232 with the B3603 and with the devices it powers at the same time; it does not work without isolation.

The ground for the B3603 load is on the high side of the current sense resistor, but the control board ground is on the low side of the current sense resistor.  If RS232 is connected to both the control board AND the load, the current sense resistor is effectively shorted.

So, I am thinking about getting a Bluetooth card for the control board's Arduino.


In truth, being reminded of the ground level issue again, I am not so committed with making my experimental/temporary Arduino Control board a permanent thing.  Learning enough to have the Arduino-based control board working on breadboard may be enough for me.

My big thing is to enable logging; the issues with the UI are smaller issues to me.  To do logging I will have to get a Bluetooth RS232 module on that Arduino.  Besides, I am not sure I want to assign my Adafruit ADS1115 to this job.  The Adafruit ADS1115 by itself is more expensive than the whole B3603!

I like +-1mV Vout adjustment, and I like being able to modify the voltage more easily than with the slow stock 4-button interface.  But doing those things more than doubles the original cost of the B3603…  I will get a Bluetooth module to resolve the ground-level issue just to benefit from the learning.  After I have solved all the known problems, I may end up stopping at a working breadboard control board instead of committing the whole thing to solder.  I will have a replacement B3603 installed soon and I may use that as-is, but now with a much better understanding of what is going on under the hood.

Either way, I will get it working.  I just am not sure I want to commit so many of my modules to this job permanently.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I have now tested the output with and without USB-serial and I get the same value.  I use a CP2102 dongle and only connect the Tx/Rx/Gnd lines from it to the B3603; it works and the output is the same either way.

I can't really understand your comment about two different devices connected with serial.  There is a shared ground to them all, but current from the B3603 doesn't flow directly on the Rx/Tx/Gnd lines of the serial; that is controlled by the MCU of the board that terminates the serial on its side.  I haven't yet tried anything like that but will probably test it soon, as I plan to use the PSU to power my other projects.

I'm only getting 10mV accuracy and I don't need more than that; my multimeter can't measure better than that anyway, so I wouldn't be able to calibrate it better.
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
I have now tested the output with and without USB-serial and I get the same value.  I use a CP2102 dongle and only connect the Tx/Rx/Gnd lines from it to the B3603; it works and the output is the same either way.

I can't really understand your comment about two different devices connected with serial.  There is a shared ground to them all, but current from the B3603 doesn't flow directly on the Rx/Tx/Gnd lines of the serial; that is controlled by the MCU of the board that terminates the serial on its side.  I haven't yet tried anything like that but will probably test it soon, as I plan to use the PSU to power my other projects.

I'm only getting 10mV accuracy and I don't need more than that; my multimeter can't measure better than that anyway, so I wouldn't be able to calibrate it better.

This is what I figured out, I am not absolutely positive but I am fairly sure I am right.

Regarding RS232 on both the controller and the load at the same time:

The current sense resistor is R050, so I will use R050 to make this more readable.

The B3603 load's ground is connected to the R050 top (higher voltage) end.  The bottom end of R050 is connected to the controller's ground.  If the load is 1Amp and R050 is 0.05ohm, the top end should be +50mV over the bottom end.  So the load's ground differs from the controller's ground by 50mV.

The LM2596 ground actually sits a little lower than the controller's ground due to the resistance of the PCB trace; the Vin ground connector is the lowest point on the whole board.

If you have USB/RS232 (say COM1) connected to the controller
  - COM1's ground is the controller's ground, which is the bottom end of R050.
If you have USB/RS232 (say COM2) connected to the B3603 load
  - COM2's ground is the B3603 load's ground, which is the top end of R050.

At the PC end, COM1 ground and COM2 ground are connected.  So the TOP end of R050 connects to the BOTTOM end of R050 via the common ground between COM1 and COM2.  This creates a short across R050, which renders the current measurement useless.  The change in ground level also renders other voltage measurements by the ADC inaccurate, and the PWM levels (referenced to a changed ground) shift as well.

This issue affects me even with 2 machines connected via KVM.  The KVM creates a common ground between the machines.

Regarding with/without USB calibration issue:

That is because the Arduino is running below 5V, from the 4.92V that the XL1509 gives.  The STM8, running at 3.3V, is not affected by that.

With the ATmega at 4.8/4.9V-ish, when USB is connected, the USB's D+ from the PC somehow causes a very small increase in the ATmega's PWM voltage (or a decreased rise time).  The integrated output of the PWM is used as V-out-set, so this small increase in V-out-set produced the 0.02V increase I saw in Vout.  The external 7V supply for the Arduino solves that problem, but I don't like having to provide another power source to the B3603, so I will have to work on an alternate solution.

For the Arduino, the faster PWM (even at 14-bit) is very touchy.  With fast PWM and a low duty cycle (say below 1%), the pulse is so short that it has trouble reaching the top (5V).  At very low duty cycle, I saw on the scope some random cycles of the "square wave" not rising fast enough to reach the top of the square before it is time to fall again, so this changing rise time distorts the square wave.  I had to come up with a solution to stabilize the resulting 100mV+ jitter.  My solution is to use a zener to clamp the 5V square wave at 3.3V; it has a much easier time reaching 3.3V, which reduces the distortion of the square wave.  With this solution, I managed +-2mV jitter on Vout.

This works up to 15-bit PWM.  At 16-bit PWM, the pulse (even with a moderate duty cycle like 5% or 10%) is so short that any small change in rise time distorts the wave too much to create a stable voltage.  The jitter I see at 16-bit PWM, even with the zener, exceeds 100mV at Vout.  There is no point in +-0.5mV adjustment if it introduces an instability of >100mV.  That is why I don't want to waste time with 16-bit when 15-bit already gives me fine enough control (stable to +-2mV from the DMM's view, even though the scope shows far more high-frequency noise than that).  Frankly, I think +-1mV adjustment is finer than the B3603 can hold anyhow.
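For anyone wanting to replicate the 15-bit PWM part on a 328-based Arduino, the standard Timer1 fast-PWM setup looks roughly like this (this is the general technique, not necessarily my exact code; pin 9 and the example duty value are placeholders):

Code: [Select]
// 15-bit Fast PWM on an ATmega328 (Uno/Nano), Timer1, output on pin 9 (OC1A).
// Mode 14 (Fast PWM, TOP = ICR1); 16 MHz / 32768 counts ~= 488 Hz carrier.
void setupPwm15() {
  pinMode(9, OUTPUT);
  TCCR1A = _BV(COM1A1) | _BV(WGM11);               // non-inverting output on OC1A
  TCCR1B = _BV(WGM13) | _BV(WGM12) | _BV(CS10);    // fast PWM, no prescaler
  ICR1   = 0x7FFF;                                 // TOP = 32767 -> 15-bit resolution
  OCR1A  = 0;                                      // start at 0% duty
}

void setPwm15(uint16_t counts) {                   // counts: 0..32767
  OCR1A = counts;
}

void setup() {
  setupPwm15();
  setPwm15(16384);                                 // example: ~50% duty
}

void loop() {}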
« Last Edit: March 19, 2015, 06:58:19 am by Rick Law »
 

Offline Asim

  • Regular Contributor
  • *
  • Posts: 171
@Rick, you can get the input voltage of the device (the terminal input voltage) to the top board.  This will eliminate the need for another external supply.  If you check pin 3 of the top module, this pin is connected to Vin through a voltage divider.  Desoldering and removing the divider resistors, then shorting the top resistor's SMD pads, will give you the input voltage on pin 3.  Do whatever you like with that voltage (create a stable 5V).
 

Offline Asim

  • Regular Contributor
  • *
  • Posts: 171
@Salim, thank you.  My code is in no way, shape or form ready.  It is only a user interface.  I don't mind posting the code, but it is ugly (work in progress).
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
@Rick, I think you can use a CP2102 or FT232 and connect it to the hardware serial pins instead of the direct USB connection; then you skip the added voltage input and just have the communication path.  This should avoid your extra power supply need, as you won't mix two power supplies (B3603 and PC-via-USB).

I currently have only one CP2102 and will soon order a few more so I can test the issue of multiple serially connected devices.  I guess the effect will be mostly on the ADC, but I'll be mindful of that when I try it.  If it becomes a real issue I'll also take a look at radio.

My other projects use the nRF24LE1 so I could use that as my radio control, I have a few spares there.
 

Offline Asim

  • Regular Contributor
  • *
  • Posts: 171
@Rick, you can get the input voltage of the device (the terminal input voltage) to the top board.  This will eliminate the need for another external supply.  If you check pin 3 of the top module, this pin is connected to Vin through a voltage divider.  Desoldering and removing the divider resistors, then shorting the top resistor's SMD pads, will give you the input voltage on pin 3.  Do whatever you like with that voltage (create a stable 5V).

I just came back home.  The voltage divider resistors are R2 & R1; desolder both and short the R2 pads, and this will provide Vin at pin 3.  You can reuse the voltage divider at the top board if you care about the input voltage.
 

Offline cnc4less

  • Newbie
  • Posts: 3
@Asim, I understand it is not finished code like Baruch's (he is just great, very kind, and helped me a lot), but I am making a hobby project that needs some of your Arduino code in it, at least the C++ code for the volt in/out and current in/out calculations.  The rest I can wrap up, and I might help close this code for you in a week or so.  Regards,
Salim
« Last Edit: March 19, 2015, 03:50:06 pm by cnc4less »
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
@Rick, you can get the input voltage of the device (the terminal input voltage) to the top board.  This will eliminate the need for another external supply.  If you check pin 3 of the top module, this pin is connected to Vin through a voltage divider.  Desoldering and removing the divider resistors, then shorting the top resistor's SMD pads, will give you the input voltage on pin 3.  Do whatever you like with that voltage (create a stable 5V).

I just came back home.  The voltage divider resistors are R2 & R1; desolder both and short the R2 pads, and this will provide Vin at pin 3.  You can reuse the voltage divider at the top board if you care about the input voltage.

Hey, thanks for pointing that out!  I will keep that in mind.

I just got my Bluetooth module.  It was not initially intended for this project, so it is an HC-05 instead of an HC-06.  The HC-05 should work as a client also, so I am going to try that.  BT should allow me to use serial on the control board AND serial on the B3603-powered device.  If that works, I am in good shape with the (less than) +5V from pin 13/pin 14.  If not, I will have to figure out what to use to power the Arduino-based control board.

So far, I have found these to be possible with my limited skill.  I am sure others with more experience can do better:
- Improve the resolution of the voltage setting (yeah, I can go to 15-bit PWM)
- Improve the displayed V/I resolution
- Display Vout, Iout, Vset, Iset simultaneously (the Nokia displays Vout/Iout in size 2 and Vset/Iset in size 1)
- Calibrate better with the higher resolution
- Faster/Fast/Medium/Slow/Slower speeds in the auto-increment of the V/I setting
- Logging (right now only from an isolated laptop, if a load powered by the B3603 is grounded to something external such as my PC ground)

So far, I know these should be possible but I have not actually tried them:
- Do both on the same PC: serial at the control board and serial on a device powered by the B3603
  (Bluetooth is my target solution right now.  The controller's onboard USB will be for program loading only.  BT lets me isolate the control board's ground from the load's ground.)
- Single power source.  Bluetooth may just make it OK to let the Arduino run at 4.9V.  If not, I know how to get a 7V source to the board somehow.
- Oversample to get about 13 or 14 bits from the onboard ADC, to free up the 16-bit ADS1115 (a rough sketch follows this list)
- Auto calibration
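The oversampling idea is the standard trick of trading samples for bits: each extra bit costs 4x the reads.  A rough sketch for 14 bits from the 10-bit AVR ADC (assuming there is enough noise present for the averaging to actually help):

Code: [Select]
// Oversample-and-decimate: 4 extra bits -> 4^4 = 256 reads, sum shifted right by 4.
uint16_t readOversampled14(uint8_t pin) {
  uint32_t sum = 0;
  for (uint16_t i = 0; i < 256; i++) {
    sum += analogRead(pin);          // 256 x 10-bit reads; max sum 261,888 fits easily in 32 bits
  }
  return (uint16_t)(sum >> 4);       // 14-bit result, 0..16368
}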

So far, I have these concerns and like-to-dos, but will deal with them later:
- The noise is heavier right now than with the stock controller.  I suspect it is because my controller is on a breadboard, so things are too spread out and the layout is poor.  I would like to see if I can get the noise closer to the stock controller by moving this to a soldered prototype board.
- I would like to have the Arduino auto-compensate for load regulation.  At high load, I see Vout drop; with the Arduino, I may be able to compensate for this.
- What other enhancements are feasible?  I have to experiment more.

This has been fun and educational at the same time.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I changed my code to use a higher precision fixed point for the constants and calculations, now I use a 32bit fixed point fraction with 16 bits for the whole part and 16 bits for the fraction, up from 3.13. This gets the accuracy of the millivolt measurement in the 0.01mV range and the output is very stable as well.
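To illustrate the format (the general Q16.16 idea, not the actual STM8 code): values are stored as value x 65536, so one count of the fraction is about 0.015mV when the unit is volts.

Code: [Select]
#include <stdint.h>

typedef int32_t q16_16;              // 16 integer bits + 16 fraction bits

static inline q16_16 toQ(double x)   { return (q16_16)(x * 65536.0); }
static inline double fromQ(q16_16 x) { return x / 65536.0; }

// Multiply two Q16.16 values: widen to 64 bits, then shift back down.
static inline q16_16 qmul(q16_16 a, q16_16 b) {
  return (q16_16)(((int64_t)a * b) >> 16);
}

// Example use: volts = counts * gain + offset, everything in Q16.16.
q16_16 countsToVolts(q16_16 counts, q16_16 gain, q16_16 offset) {
  return qmul(counts, gain) + offset;
}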

I've ordered 0.1R resistors and a 10W led so I can play with the current limit control, it will take a month or so for the parts to arrive. For now I don't have the ability to sink 1A to be able to get any reasonable calibration. I can barely sink 100mA.

I started generating releases for the software and the latest one is 1.0.1, with the increased accuracy.  I also switched from NumPy for the LSE calibration to a more specific implementation for the single-variable optimization I need here, so there are fewer dependencies.  LSE is Least Squares Estimate; it is used to set the linear formula parameters for the PWM and ADC, and it finds the parameters that give the least amount of error across the entire range.
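The fit itself is just the closed-form ordinary least squares for y = a*x + b.  A sketch of the calculation (shown in C++ for illustration; the actual implementation lives in the Python calibration script):

Code: [Select]
#include <cstddef>

struct Fit { double a, b; };         // y = a*x + b

// Ordinary least-squares fit over n (x, y) pairs,
// e.g. requested voltage vs. raw PWM or ADC count.
Fit leastSquares(const double x[], const double y[], size_t n) {
  double sx = 0, sy = 0, sxx = 0, sxy = 0;
  for (size_t i = 0; i < n; i++) {
    sx  += x[i];
    sy  += y[i];
    sxx += x[i] * x[i];
    sxy += x[i] * y[i];
  }
  double a = (n * sxy - sx * sy) / (n * sxx - sx * sx);   // slope
  double b = (sy - a * sx) / n;                           // intercept
  return { a, b };
}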

A sample output of my calibration script:
Code: [Select]
OPEN "MODEL: B3603"
PSU Input voltage is 8.77, will use 20 steps between 1.0 and 7.77
Setting voltage to 1.0
Multimeter samples vary too much, stddev=0.527693, data: [4.23, 3.44, 2.949]
Failed to read stable value, trying again, maybe
Multimeter samples vary too much, stddev=0.237628, data: [1.783, 1.462, 1.202]
Failed to read stable value, trying again, maybe
Step 0 Set voltage 1.000000 Read voltage 0.996333 PWM 278.0 ADC 279.0 (1.0)
Setting voltage to 1.33
Step 1 Set voltage 1.330000 Read voltage 1.328000 PWM 337.0 ADC 337.0 (1.32)
Setting voltage to 1.66
Step 2 Set voltage 1.660000 Read voltage 1.659000 PWM 396.0 ADC 396.0 (1.66)
Setting voltage to 1.99
Step 3 Set voltage 1.990000 Read voltage 1.990667 PWM 455.0 ADC 455.0 (1.99)
Setting voltage to 2.32
Step 4 Set voltage 2.320000 Read voltage 2.316000 PWM 513.0 ADC 513.0 (2.32)
Setting voltage to 2.65
Step 5 Set voltage 2.650000 Read voltage 2.647333 PWM 572.0 ADC 570.0 (2.64)
Setting voltage to 2.98
Step 6 Set voltage 2.980000 Read voltage 2.979000 PWM 631.0 ADC 629.0 (2.98)
Setting voltage to 3.31
Step 7 Set voltage 3.310000 Read voltage 3.310000 PWM 690.0 ADC 688.0 (3.31)
Setting voltage to 3.64
Step 8 Set voltage 3.640000 Read voltage 3.636000 PWM 748.0 ADC 745.0 (3.63)
Setting voltage to 3.97
Step 9 Set voltage 3.970000 Read voltage 3.967000 PWM 807.0 ADC 802.0 (3.96)
Setting voltage to 4.3
Step 10 Set voltage 4.300000 Read voltage 4.300000 PWM 866.0 ADC 863.0 (4.3)
Setting voltage to 4.63
Step 11 Set voltage 4.630000 Read voltage 4.630000 PWM 925.0 ADC 920.0 (4.62)
Setting voltage to 4.96
Step 12 Set voltage 4.960000 Read voltage 4.960000 PWM 983.0 ADC 978.0 (4.95)
Setting voltage to 5.29
Step 13 Set voltage 5.290000 Read voltage 5.290000 PWM 1042.0 ADC 1036.0 (5.28)
Setting voltage to 5.62
Step 14 Set voltage 5.620000 Read voltage 5.620000 PWM 1101.0 ADC 1096.0 (5.62)
Setting voltage to 5.95
Step 15 Set voltage 5.950000 Read voltage 5.950000 PWM 1160.0 ADC 1154.0 (5.95)
Setting voltage to 6.28
Step 16 Set voltage 6.280000 Read voltage 6.280000 PWM 1218.0 ADC 1210.0 (6.26)
Setting voltage to 6.61
Step 17 Set voltage 6.610000 Read voltage 6.610000 PWM 1277.0 ADC 1270.0 (6.6)
Setting voltage to 6.94
Step 18 Set voltage 6.940000 Read voltage 6.940000 PWM 1336.0 ADC 1329.0 (6.94)
Setting voltage to 7.27
Step 19 Set voltage 7.270000 Read voltage 7.270000 PWM 1395.0 ADC 1386.0 (7.26)
['OUTPUT: OFF', '']
ADC
(5.666595409010677, -585.2406566240919) 371365 38354331
['CALIBRATION SET ADC VOUT A', '']
['CALIBRATION SET ADC VOUT B', '']

PWM
(0.17797037089326911, 100.78828376431484) 11663 6605260
['CALIBRATION SET PWM VOUT A', '']
['CALIBRATION SET PWM VOUT B', '']

The interesting part, at each step, is the set voltage (the voltage we want out), the read voltage (what the multimeter, a MASTECH MS8250B, shows), and in brackets the ADC calculation, which is what the unit will show as its measured output.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I'm wondering, is there anyone else who plans to flash their unit (i.e. not just hook up another MCU in place of the current top board)?  I'd be interested to know about the calibration results, and also whether someone would try the same firmware on another board with the same top board.
 

Offline cnc4less

  • Newbie
  • Posts: 3
@baruch Yes, I am working on it now and will hook it up to Arduino and Energia.  My application is not a power supply but integrating PID with it, so I can have it working in a different way.  I need a few weeks and will post my code.  I will also hook up the USB meter as you see it in http://www.ebay.com/itm/OLED-USB-Charger-Current-Voltage-Capacity-Power-Detector-Tester-Meter-Voltmeter-/261670045269?pt=LH_DefaultDomain_0&hash=item3cecc00655

Thank you again; you have excelled in your work.
Salim Safran
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Your B3603 could be giving out more noise than it should...

The XL1509 supplies 5V (4.92V) to the bottom board and the control board.  This 5V supply could be damaged and cause the noise.  If you shorted it even for a very brief time, your unit may appear to be working, but it will generate a lot more noise and may eventually fail.

I shorted mine by mistake for a very brief time (it could happen any number of ways - mine was a screwdriver blade rolling under the bottom board, shorting pin 12 and pin 13 where the top board gets its 5V).

Something smoked, but the unit appeared to survive it.  Prior to this mishap, I was working on my Arduino-based control board.  I was getting +-1mV Vout adjustments with +-2mV Vout stability (DMM reading Vout with slow +-2mV jitter).  The scope showed <100mV high-frequency noise, but the DMM/ADS1115 readings were solid; it was rare to see more than +-3mV jitter except when purposely changing the load rapidly.

After the mishap, while the unit appeared to have survived, I noticed it was not holding voltage stability like before - the DMM now shows +-200mV Vout jitter, and the scope was showing Vout noise a lot higher than I recall.
I put a scope on the 5V, and this is what I see - frequent ~1V spikes on top of the 5V.
144183-0

I tested my original B3603 (with the dead LED).  It has much worse Vout stability.  I know I shorted its 5V once also and thought it survived.  Its 5V has frequent 2-3V spikes on it.  This may be the cause of the dead LED: the 74HC595's electrical max is 7V, and 5V with an over-2V spike exceeds that 7V limit.  I will order a couple of 74HC595s to see if I can revive the dead LED.



To confirm the 5V was the problem, I removed the XL1509 and supplied 5V via another LM2596.  With that, I am back to +-2mV stability, and Vout noise returns to what I recall (100mV range) versus at times over 800mV with the bad XL1509 circuit.

So, both my B3603s have a damaged 5V circuit.  One spikes to just below 1V and the other spikes to 2-3V.  Looking at that part of the circuit, when the 5V is shorted the only things that could go are the XL1509 and the 100uH inductor.  I have both parts coming and will see if I can repair them.  The smoke could be from the 100uH inductor, with the wire turning into a heater coil.  Since the XL1509 is already removed and ready to receive a new chip, I would prefer that to be the problem rather than the 100uH.  But the XL1509 may not be at fault, as it is supposed to auto-cutoff at 2A.  So, we will see.

So, if you seem to have a bad noise problem - or a dead LED - put a scope on the 5V and see if that is your problem also.

I will keep you guys posted on whether I can successfully repair the two units with bad 5V, and let you know if it is the inductor or the XL1509.

Rick
« Last Edit: March 29, 2015, 05:23:32 am by Rick Law »
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Your B3603 could be giving out more noise than it should...

The XL1509 supplies 5V (4.92V) to the bottom board and the control board.  This 5V supply could be damaged and cause the noise.  If you shorted it even for a very brief time, your unit may appear to be working, but it will generate a lot more noise and may eventually fail.

I shorted mine by mistake for a very brief time (it could happen any number of ways - mine was a screwdriver blade rolling under the bottom board, shorting pin 12 and pin 13 where the top board gets its 5V).
...
...
Keep you guys posted if I can successfully repair the two units with bad 5V, and let you know if it is the inductor or the XL1509.

Rick

It is the inductor.  The XL1509 I removed from unit 1 was lost, so I had to wait for a new XL1509 before I could experiment - I got the parts today.

The new XL1509 on Unit 1 did nothing - the spikes were as they were before I removed the XL1509 (and lost it).

I only have a 200uH through-hole at hand, and the inductor shipment is way overdue.  Replacing the inductor with the through-hole 200uH I have at hand, both unit 1 and unit 2 came back to life.  Unit 1 was spiking to over 7V and unit 2 was spiking to around 6V.  Unit 2 had a screwdriver slide under it and short the 5V; I noticed it immediately, so the time for additional damage was minimal.

With the new inductor, Unit 2's performance matches its pre-damage performance - as compared using the scope captures I had of the noise display at various voltage and current settings.

Unit 1's performance is 20-30% better than before but still worse than unit 2 - which leads me to conclude unit 1 must have been damaged already when I took the original scope displays used to start this thread.  It was not until the LED on unit 1 died that I began hunting the issue.

Unit 1 has 1/3 more noise than unit 2 at the same settings.  I must have run unit 1 with the 7V+ spikes for a while, so I think more than the LED suffered.  The MCP op-amps are rated to 6V only; I think they might have been damaged as well.  Even with the better performance, unit 1 has about 1/3 more noise than unit 2: where unit 2 has 60mV of noise, unit 1 has 80-85mV.

So, once I get the SMD inductors (100uH), I can repair them properly.  For now, I can use them with a through-hole inductor sticking up/out.  Unit 1, with more noise, will be used for less demanding things (like feeding my TP4056 USB charger).

By the way, Baruch, how is your program coming along?  I would like to compare notes with you.  While I got the Arduino to control the voltage/current well, it has more noise than the stock controller.  I also have a big noise problem when serial is active...
« Last Edit: April 11, 2015, 12:42:04 am by Rick Law »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I haven't attempted to progress at all on this project; I've switched back to another project of mine with a friend.  I'm following the progress here, and when it's done I will be happy to compare notes if someone wants to discuss something on the topic or try my firmware on their device.

I never bothered to implement the button control at all.  I have since learned that I was missing some option flag to take the SWIM port away for the button, and I might get back to making the buttons work as well, but that will have to wait a bit.

I also never really made the screen work, since both of the units I have had their screens go bust - or at least one has the screen busted and one has the regulation busted.  I might try to replace the inductor and see if that helps the regulation-busted device, or take the screen from that unit and put it into the screen-busted device.

I also now got the 0.1 ohm 3W resistors I ordered and can probably test and calibrate the current control.  For some reason my multimeter shows these as 1.6 ohm, and I have no idea if the meter is wrong or I got the wrong resistors.
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
...
I also got now the 0.1Ohm 3W resistors I ordered and can probably test and calibrate the current control. For some reason my multimeter shows these as 1.6Ohm and I have no idea if the meter is wrong or I got wrong resistors.
...

Just so you have a gauge to see if your DMM has gone to the nut house:

Best would be if you could desolder the 0.05 ohm and use that as a guide.  But I did some measurements of things in-circuit so you can compare without desoldering.  The B3603 is disconnected, without any external wiring.  The Vin/Vout ground screws are screwed in firm without any wires in them.

These are REL numbers - i.e. the probe resistance has already been subtracted off.
Of course, with such small resistances it is hard to be precise.

(1)
I measured the resistance between Vin ground and Vout ground at the screws of the connector block - DMM probes against the screws at the top, with the screws in firm and no cables.  The 0.05 ohm is still in circuit.  That resistance between the two grounds at the screws (top) should be about 0.07 to 0.09 ohm.  Note: I first shorted the two grounds (Vin ground and Vout ground) briefly to eliminate any possible remaining charge in capacitors, since this is an in-circuit measurement.  Between the two ends of the 0.05 ohm in-circuit I got 0.05 to 0.06 ohm.

(2)
Between pin 11 and the Vin ground top-of-screw is about 0.01 to 0.03 ohm.  Pin 11 connects to the low side of the 0.05 ohm via a short trace (about 1-2cm) which then connects to the Vin ground plate.  So that 0.01 ohm is really the trace + pin-header connection resistance.

(3)
R13 is a 100 ohm resistor; in-circuit it measures 99.3 ohm on my DMM.  R13 is just next to pins 13 and 14, to the right of the female header pins.

Those are the "low ohm numbers" I can see on something we both have.  If you find another thing on the board you want to measure and compare, just post it here or PM me and I will measure it so you can compare.

Hope this helps in determining the sanity of your DMM.

...
I also never really made the screen work since both of the units that I have had their screen go bust, or at least one has the screen busted and one has the regulation busted. I might try to replace the inductor and see if that would help on the regulation busted device or take the screen from that unit into the screen busted device.
...

If you have a scope, do put the internal 5V on the scope and see if you observe spikes, and how high.  Per the datasheet, the op-amps in there are rated to only 6V, and the shift registers are rated to 7V.  I had spikes as high as 7.6V, and I suspect that is the reason my LED display doesn't work anymore.  The individual segments of the LED work when I apply power to them individually (well, kind of - 2 segments light up), but the display doesn't work.  I think the 7.6V spikes I had on Unit 1 killed the shift registers and did some damage to the op-amps.  Even with the controller from Unit 2, it has more noise than Unit 2, so I think the shift registers are dead.  I just can't get a good connection from my scope to it when the board is plugged in.  I may just replace the shift registers and see what happens.

If you do have spikes nearing 6V, take out the inductor and use an external supply.  Up until yesterday, when I got my replacement XL1509, I was using an external regulated 5V power supply, and that works very well.  In my case I took out the XL1509, but taking out the inductor should also cut the 5V, so you can connect the external 5V using pins 9 to 12 as ground and pin 13 or 14 as +5V.

I have ordered a pair of shift registers and I will make an attempt to repair my unit 1 (original one with dead LED).

I haven't attempted to progress at all on this project. I've switched back to another project of mine with a friend. I'm following the progress here when it's done and will be happy to compare notes if someone wants to discuss something on the topic or try my firmware on his device.

I have never bothered to implement the button control at all, I since learned that I was missing some option flag to take away the SWIM port for the button and I might get back to make the buttons work as well but that will have to wait a bit.
...

I made some progress, but the "control buttons" on my Arduino-based controller are just a routine to set an integer value into specific variables.  That allows me to control Vout to +-1mV.  My slope/intercept is hard-coded.

The stumbling block I came across is noise.  Without serial active, I can get my DMM/ADC to read within +-2mV to +-3mV of the voltage I set (both the DMM and the ADC average their samples).  Looking at the output on the scope, though, it has 80+ mV of noise.  That is in general about 20-30% higher than the stock controller; at some settings, the stock noise would be 130mV and my controller would be 150mV-ish.

So that begs the question, what is the point of having +-1mV (15 bit PWM) setting resolution when the darn thing is sitting on top of 100mV noise?  That kind of "cooled my heels."

...I might get back to make the buttons work as well but that will have to wait a bit...

My big thing was serial - both monitoring and perhaps receiving control signals via serial.  Once serial is active, I get another 100mV+ of instability - a pulse of higher voltage at the frequency of the Serial.print() I do at each screen update every 500ms.  By eliminating the LCD, I took that down to just the Serial.print(), but Serial.print() causes much higher instability than the LCD did.

USB serial presents a ground-level problem: the controller sits at a ground that is Iout*Rsense below the external ground.  To avoid that, I got the Nano working with Bluetooth.  Bluetooth adds 100mV+ of instability to the system even when not serial-printing, so I am at 200mV instability without printing.  With Serial.print() connected and printing every 500ms, I get a Vout delta of 300mV or more.

I could probably solve all those problems.  I could separate the controllers: the volt-controlling function in one subsystem and the I/O in another.  Even if that works, I would still be at around the stock-controller level of noise - +-100mV-ish.  The increased resolution (15-bit PWM) would have been worth it if it could be achieved with a single, simple system.  But when it needs multiple subsystems to keep them from interfering with each other (16-bit ADC separate, 15-bit PWM separate, I/O subsystem separate, screen management separate due to RAM limitations...), all that trouble makes it difficult to be worthwhile.

I started this Arduino controller so I could get a quick replacement for my dead LED.  That worked, but it is not so user-friendly and not so noise-free.  With none of the other things I want being easy, I too am taking it slow - doing it for fun alone instead of hoping to make it useful soon.  Fun and learning don't need a return; doing it for use means the usefulness has to exceed the trouble of making it work.

Rick
« Last Edit: April 12, 2015, 04:00:28 am by Rick Law »
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
Thanks for the test points; they measure out quite the same (1 was equal, 3 was 88.8).  I re-measured my resistor; it is hard to keep a steady hand, but sometimes I can get it to show numbers around 0.1 to 0.4 and other times it fluctuates.  When I tried last time I used another probe with alligator clips and that was constant but measured 1.6; maybe the clips are the problem.

I don't have a scope to measure the noise and am not that concerned with the noise level for now to worry about it.

I stopped at the buttons stage since I mostly wanted the serial for automation.  If the serial is causing noise and the shared ground is causing yet more problems, you could consider using opto-isolators for the communication part and disconnect the laptop from the B3603.
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Thanks for the test points, they measure out quite the same (1 was equal, 3 was 88.8), I re-measured my resistor and it is hard to get a steady hand but sometimes I can get it to show numbers at around 0.1 to 0.4 and other times it fluctuates. When I tried last time I used another probe with aligator clips and that was constant but measured 1.6, maybe the clips are the problem.

I don't have a scope to measure the noise and am not that concerned with the noise level for now to worry about it.

I stopped at the buttons stage since I mostly wanted the serial for automation. If the serial is causing noise and the shared ground is yet more problems you can consider using opto-isolators for the communication part and disconnect the laptop from the b3603.

You are welcome!  You now know it was not your DMM going nutso...  #1 being the same is good.  I purposely went as far from the low side of the 0.05 as I could to get close to 0.1 ohm, so that being the same is a good thing.

Testing the internal 5V is less about reducing the noise and more about ensuring no further damage.  I think my Unit 1 was damaged by the 7.6V spikes; I didn't know I had an issue until the LED display died.  I am almost sure (but not positive) that the death of the display and the extra noise were caused by the 7.6V spikes on the internal 5V.

I already ordered (and received) the optos.  The thing is, it gets too complicated to be worthwhile as a "do for use" project, so I am treating it as a "do for fun & learn" project.  Any of these "extras" would make it even harder to fit the "top board" form factor.  So I am a lot less excited about it now that I have a replacement to work with instead of just the one with the dead display.

I have an idea about separating the PWM generation.  I will give that a shot later today.
 

Offline baruch

  • Regular Contributor
  • *
  • Posts: 78
  • Country: il
I'm now playing with an STM8S103F3 unit I received (ordered 2, got 5) and looking to maybe make it into the next control board by using a rotary encoder and a two-line LCD screen. I got the rotary encoder working so now I have up/down and a switch, if I can get the I2C working with the LCD I'll have an improved unit in terms of control and state.

I needed to use a variable power supply for some testing and found that while the serial is great for automated testing I need manual finger based control for simple cases like that.
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
I suppose we should not be surprised that there are revisions of a design.  Since I blew the display on my first unit, I now have a second, which I also blew, but less badly.

I just noticed these two units are slightly different.  These are the differences I noticed so far.

Top (component side) view:

1. The inductors and the capacitors are of different color.  The buttons are different.  Those may be a good indicator of vintage.

2. The inductors for the internal 5V are removed for repair; note that the capacitors there are of different values.  If I read them right, Unit 1 (left) uses 100uF and Unit 2 uses 220uF.  There is a through-hole inductor I am using instead (waiting on Mouser for the part).  The same inductor gives 1/3 more noise on Unit 1 than on Unit 2.  Note that both units' internal 5V was shorted - Unit 1 one or more times, Unit 2 once.  Unit 1 went through stress tests and more abuse, so the noise delta may be due to other reasons.

3.  Look at R11/R13 at the lower-right corner of the bottom board.  There is an extra trace there on unit 2.  I have not yet discerned where it goes.

4.  On the LED/MCU board, note that Unit 2 has Rx Tx Vcc Gnd printed on the PCB and unit 1 has nothing printed.

Bottom view:

1. Unit 1 serial starts 20130 (left), unit 2 serial starts 20135 (right), so I assume unit 1 is older than unit 2.  Unit 1 is ordered from a supplier in the USA.  Unit 2 came from China.  But I don't know if unit 1 is the older design.  (I got unit 1 first).

2. The logos are different

3.  The traces to the left of the logo (to pins 9 through 16) are different, and further up around Vout the traces are different.  Unit 1 has pins 9, 10, 11, and 12 all connected to the ground plate right where the pins are soldered.  Unit 2 has pins 9 and 10 connected to the ground plate, while pins 11 and 12 are connected to R050 (low side) at the same point where the trace comes down from the top layer to that same ground plate on the bottom.

4. The traces under Vout are different (top left side of the board bottom)

5. Shift registers on Unit 1 are removed for repair of the LED display.  Waiting on parts.

I still have to follow the identified trace differences to see whether they are merely layout changes or actually changed connections/components.  There may be more differences.

 

Offline Asim

  • Regular Contributor
  • *
  • Posts: 171
I wonder if there is a noise difference between the two versions.

Another project is consuming my time, but I will get back to this one.
« Last Edit: April 19, 2015, 08:19:24 pm by Asim »
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
I wonder if there is noise difference between the two versions .

Another project is consuming my time, but will get back to this project

I can tell you for sure there is a noise difference, but I don't think I can quantify it.  It may not be significant.  I will know more once I repair both units - unit 2 bottom board and unit 1 top and bottom.  I am waiting on parts for permanent repair.

I know I cannot quantify it because I unknowingly ran unit 1 in a damaged state for a while, so there may be other unknown damage, and thus the results may not reflect an undamaged board with the new layout.

Let's call the "internal 5V" supplied by the XL1509 circuit I5V for short.

Having temporarily repaired the I5V with through-hole parts instead of SMD, I am able to match unit 1's and unit 2's I5V performance (noise profile).  I can only match them if I change unit 1's 100uF output capacitor back to 220uF like the older-design unit 2.  By measurement, the 100uF gives 10-30mV more I5V noise but better transient response (better according to theory, not tested; that might have been the reason for the switch).

Intellectually, I know the I5V noise will work its way to Vout.  While working on the Arduino-based control board, I could actually see the I5V noise directly affecting the Vout noise.  But I5V noise is damped at Vout - i.e. 30mV of I5V noise has less than a 30mV impact on Vout.

I am waiting on the real SMD parts to finalize the repair.  I know I can repair the bottom boards.  I am hoping unit 1's top board repair works as well so I can test them as a package (instead of using unit 2's top board on unit 1 which is only half the picture.)  I'll know more once the next round of repairs are done.

Stay tuned, I will share with you what I am about to learn.

Rick
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
OK!  Got the inductors from Mouser & Shift Registers from Tayda at the same time.

Both my B3603s are back!  They were damaged by shorting the internal 5V.  Immediately after shorting Unit 2's 5V, I noticed a sudden increase in noise and decrease in Vout stability while working on the Arduino-based control board, and I traced the problem to the internal 5V.

Unit 1: dead display, sick internal 5V
This unit ran for a long time before I knew of the damage.  Even when the LED display burned out, I did not realize the internal 5V was damaged until Unit 2 was damaged.  The internal 5V had sharp 7V to 7.6V spikes at the XL1509 switching frequency.  After repairing the internal 5V, I attacked the dead display.  The shift register was what died and killed the display - it is now back!  I am pleased my diagnosis was right.

Comparing to the oscilloscope printouts I used to begin this thread, it now has less noise than those showed - that could be due to a modification made, or I might have taken those printouts after I had already shorted the 5V the first time.

Unit 1 is better than before but not as good as unit 2.  Unit 1 and unit 2 have the same XL1509 noise profile, but unit 1 has more Vout noise than unit 2; I assume this is possibly due to the op-amps, which didn't die from the 7.6V spikes but are now less than perfect.

Unit 2: sick internal 5V
Unit 2 is now comparable to before the damage.  On this unit I noticed the short right away, and it was brief; the 5V was spiking to just around 6V and the display kept working.  Unit 2 has about 10% less noise than unit 1.

* Modification made

The internal 5V supplied by the XL1509 powers the internal op-amps and controls.  Noise on the XL1509 output will work its way to Vout.

Unit 1 and Unit 2 are different revisions.  On the XL1509, unit 1's revision changed the capacitor from 220uF to 100uF, perhaps to increase stability.  I experimented with increasing the inductor and the capacitor to decrease the XL1509 noise and thus reduce Vout noise.  I found that 200uH with 220uF works great for noise, but the transient response may be too slow - it could, and occasionally did, go into oscillation.  100uH with 220uF seems marginal, as its stability depends on which 220uF capacitor I used.  Figuring the 20% tolerance was at play, I used a known below-nominal capacitor: a 220uF that measures at just 180-190uF.  Using this "185uF" and a 100uH, I found it very stable, and the XL1509 noise profile matches that of unit 2.  Unit 2 has the other revision with 220uF.

If one really wants to trim the noise a bit, one can get 10x 100uH and use the highest, or get 10x 220uF and use the smallest.  Experimenting with 120-150uH is also a possibility.  Using a known below-nominal capacitor as I am doing now may be ill-advised, so I think I may want to try 150uF and/or other caps next time I order parts.
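For a rough feel of the trade-off (back-of-the-envelope numbers, not measurements), the corner frequency of the XL1509's LC output filter is

\[ f_c = \frac{1}{2\pi\sqrt{LC}} \]

which works out to roughly 1.6 kHz for 100uH/100uF, about 1.1 kHz for 100uH/220uF, and about 760 Hz for 200uH/220uF.  A lower corner knocks down the ~150 kHz switching ripple more, which fits the lower noise seen with the larger L and C, but it also slows the transient response and can leave the regulator's control loop marginal - which would fit the occasional oscillation seen with 200uH/220uF.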
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
I just did a small hardware mod  >:D

One problem with the b3603 is, if you connect power, the output will be on for a small period of time (even if the unit is turned off). This is a hardware bug, and happens because the 5V switching reg is starting up too slowly to disable the LM2596 from the beginning.

Solution: solder an additional pull-up resistor between pin 6 (easier access than ~OE) and Vin+ on the bottom board to ensure that the output is off, and remove R16 on the top board.  This resistor has to be at least 10k, because otherwise you could exceed the absolute max injected current of the STM8 with 40V Vin.  I used a 100k resistor.  R16 has to be removed, because otherwise we'd get an unintended voltage divider.

Quote
Did anyone evaluate the dynamic performance of this device?
If no one does it, I'll test it at the end of next week.  BTW, we should be able to tweak the performance by changing R34, R35, C14 (current) and R20, R19, C10 (voltage).

@baruch I think you got the idea behind my drawing

I updated the schematic again, because the bottom board had lost its component labels. ::)

@Flex, I am about to implement this hardware mod.  Have you noticed any ill effects since you made the change?

Thanks
Rick

EDIT (2015.04.28)  Anyone else tried this mod?
I just did the mod and it does not seem to work.  It only works if the system has been off for a while.  When the system was off for just a second or a few seconds, this mod doesn't work.

- I have a push button Vin switch to turn ON/OFF
- an LED + resistor at Vout - no switch at Vout

If the system has been off for a while (minutes), it works.  If the system was off for a mere few seconds, then when switching Vin on I can see the LED blink on for a second or so.  It seems muted (not staying on as long) with the mod, but the pulse of power comes just the same.
« Last Edit: April 28, 2015, 10:40:15 pm by Rick Law »
 

Offline icpart

  • Regular Contributor
  • *
  • Posts: 65
  • Country: bg
Hi guys.  Very interesting thread.  Too bad that I saw it too late.
I drew most of the circuit some time ago, but at the moment I only have handwritten drawings on A4 paper.  If someone is interested I can scan them and post them here.

For now I will start to read the whole thread from the beginning.  There seem to be very interesting postings here  :).
 

Offline rr100

  • Frequent Contributor
  • **
  • Posts: 339
I just want to say this is a fantastic little bugger!
I was always fascinated by those universal power supplies that could deliver (very, VERY roughly) 3-4.5-6-9-etc volts, but this is just a dream.  I tried to get by recently with a "USB" power pack that also had 9 and 12V outputs, but it just isn't the same.  Beyond all the details about precision, noise levels and so on, this is getting the job done.  I've had a stupidly old Microsoft mouse not recognizing its NiMH battery after a long vacation... feed it 1.5A for 20-40 seconds and we're back in business.  I've got a "female Dell power connector" in the mail and this small thing is coming with me!  It'll charge/power anything up to 15V or so while providing more info than any other "normal" charger/power supply.
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
I just want to say this is a fantastic little bugger!
I was always fascinated by those universal power supplies that could deliver (very, VERY roughly) 3-4.5-6-9-etc volts, but this is just a dream.  I tried to get by recently with a "USB" power pack that also had 9 and 12V outputs, but it just isn't the same.  Beyond all the details about precision, noise levels and so on, this is getting the job done.  I've had a stupidly old Microsoft mouse not recognizing its NiMH battery after a long vacation... feed it 1.5A for 20-40 seconds and we're back in business.  I've got a "female Dell power connector" in the mail and this small thing is coming with me!  It'll charge/power anything up to 15V or so while providing more info than any other "normal" charger/power supply.

Welcome to the club (of B3603 owners)...

Actually, it will do well beyond 15V.  It will go all the way to 34V (I've tested it) and by spec should go to 36V.  Just make sure your Vin is at least 3V above Vout and Vin < 40V.

The noise level is better than in my initial post - typically about 10-30% less noise than in the scope pictures I posted.  My first unit had a problem that I was not aware of and have since fixed; it was the LED dying that alerted me to the problem and got me to track it down.
 

Offline rr100

  • Frequent Contributor
  • **
  • Posts: 339
Thank you for the welcome!

And thank you for the work you put into this, I would say at least one order of magnitude above the people who actually wrote the manual... not that I'm complaining about them, not at all - I'm sure having a decent technical writer on payroll would mean doubling the price...

I said 15V because I plan to use it with my (work) laptop's power supply; this is my dream portable "lab" CV/CC power supply.  If needed I can probably use it with two 15V laptop PSUs (I don't know how well they take it, but it shouldn't be so bad, especially since we're talking much lower currents than what they're designed for).

I'll report with more thorough tests and real numbers but I can't complain about mine. It does pretty well voltage-wise under 1V (well, not really down to 0) and for lower values is about 1 (at most 2) counts out; for larger voltages it is about 1% or better. Can't complain.
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Thank you for the welcome!
...
I said 15V because I plan to use it with (work) laptop's power supply, this is my dream portable "lab" CV/CC power supply. If needed probably I can use it with two 15V laptop power PSes (I don't know how well they take it but shouldn't be so bad, especially that we're talking much lower currents compared with what they're designed to do).
...

I use two laptop power bricks: a 15V (low noise) which I connect normally, and a 19.5V when I need a bit higher voltage.  When I need a really high voltage, I connect the 15V and 19.5V in series; this is what I use to get the 30V out required for calibration.

On one occasion (just to see if it works), I connected my two 19.5V bricks in series, giving it 39V - just below the 40V max.  The 39V in got me the max 36V out.

A word of suggestion if you intend to gang up laptop power bricks to make a higher-voltage Vin: series-connect the bricks first and measure the combined output before you plug it into the B3603 as Vin.  At no load or low load, some of them give a much higher voltage - I have heard of some laptop power bricks having a 3-5V swing.  So if you have a pair of 20V bricks each with a 5V swing, you could end up shooting 50V into the B3603.  That will cook it.
 

Offline Asim

  • Regular Contributor
  • *
  • Posts: 171
It has been three months since my last post. During that time I worked out some problems with the code, received the ADC & DAC, and designed a PCB.

But for some reason the DAC didn't work for me, so I ditched the project for a month or so (I was frustrated with the DAC).

I came back to it and have now settled on using the 16-bit PWM as my "alternative DAC".

I got it working, and as of today I have a semi-functioning, uncalibrated unit. I will post some photos tomorrow or the day after when I get the chance; for now I will just sleep 😁
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
It has been three months since my last post. During that time I worked out some problems with the code, received the ADC & DAC, and designed a PCB.

But for some reason the DAC didn't work for me, so I ditched the project for a month or so (I was frustrated with the DAC).

I came back to it and have now settled on using the 16-bit PWM as my "alternative DAC".

I got it working, and as of today I have a semi-functioning, uncalibrated unit. I will post some photos tomorrow or the day after when I get the chance; for now I will just sleep

I had some issues with the DAC as well (a 12-bit MCP4725).  The low-pass filters for Vout-adj and Iout-adj and the ADC seemed to be interfering with each other, causing some mild oscillation.  I too switched over to PWM on the ATmega.  I use 15-bit PWM instead of 16, matching the resolution of my ADC (ADS1115).
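
For anyone heading down the same path, here is a minimal sketch of the 15-bit PWM setup (assuming an ATmega328-class part at 16MHz with the output on OC1A/PB1 - those details are my assumption, not necessarily how the actual controller is wired):

  #include <avr/io.h>

  /* 15-bit PWM on Timer1: fast PWM, mode 14, with ICR1 as TOP.
     TOP = 32767 gives 2^15 steps; at 16MHz with no prescaler the PWM
     frequency is ~488Hz, so the RC low-pass after the pin must suit that. */
  static void pwm15_init(void)
  {
      DDRB  |= (1 << PB1);                                 /* OC1A as output         */
      ICR1   = 0x7FFF;                                     /* TOP = 32767 (15 bits)  */
      TCCR1A = (1 << COM1A1) | (1 << WGM11);               /* non-inverting, mode 14 */
      TCCR1B = (1 << WGM13) | (1 << WGM12) | (1 << CS10);  /* no prescaler           */
  }

  static void pwm15_set(uint16_t duty)                     /* duty: 0..32767 */
  {
      OCR1A = duty;
  }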

Mine got to a state where the controller works well at 15-bit resolution, but there is no RS232 and the UI is merely suitable for "test mode".  I kind of abandoned the project because it was like upgrading a laptop's 1024x800 display to 1600x1200: it could be done, but so much has to be done that it isn't worthwhile.

My big thing was logging and RS232 control.  I had problems with noise whenever Serial.print() was running.  I had hoped Bluetooth would change the picture since it cuts the common ground with the PC, but it didn't help.  Serial.print() was changing the V+ on the controller enough that the integrated PWM changed ever so slightly, and that got magnified by the time it reached Vout.  Mostly it impacted the rise time of the PWM; sometimes it wouldn't fully rise before the fall should begin.  I considered separately powered PWM generation circuitry, but that is just too much to be worthwhile, at least for the time being.

For now, this Arduino-based controller is my "when nothing more interesting is found" project.
 

Offline Asim

  • Regular Contributor
  • *
  • Posts: 171
For me it is worth it to continue this project. The portable B3603 mod I did before is my only power supply at home; it has served me well and it lasts for two months on one charge. That's why I want to improve it so it has a better user interface while keeping the small size of the unit.

Attached is the original PCB I made that uses the DAC. I am still using the same board; nothing a bodge wire and some resistors and capacitors can't fix.
 

Offline rr100

  • Frequent Contributor
  • **
  • Posts: 339
What controller did you use to charge/protect your Li-ions?

Asking just out of curiosity. Having a portable "lab" power supply was quite a sexy idea, though I couldn't imagine when I would use it in "real life". I do have plenty of "naked cells", e-bike power packs and so on, but there is just something special about this 50g thing you can use together with a laptop power supply to do any job under 15V or so: any voltage, charging Li-ion, trickle charging a car battery overnight, etc.

Before this, my ultra-portable kit had a Bus Pirate and a TTL-PC serial voltage converter. The Bus Pirate could actually feed 3.3V and 5V quite safely (you could even charge a couple of AAs in a pinch, or do many things with a bit of wire and scotch tape), but this, this is taking it to a totally new level, both in terms of power and usability.
 

Offline Asim

  • Regular Contributor
  • *
  • Posts: 171
What controller did you use to charge/protect your Li-ions?

Asking just out of curiosity. Having a portable "lab" power supply was quite a sexy idea, though I couldn't imagine when I would use it in "real life". I do have plenty of "naked cells", e-bike power packs and so on, but there is just something special about this 50g thing you can use together with a laptop power supply to do any job under 15V or so: any voltage, charging Li-ion, trickle charging a car battery overnight, etc.

Before this, my ultra-portable kit had a Bus Pirate and a TTL-PC serial voltage converter. The Bus Pirate could actually feed 3.3V and 5V quite safely (you could even charge a couple of AAs in a pinch, or do many things with a bit of wire and scotch tape), but this, this is taking it to a totally new level, both in terms of power and usability.


Here is a post I put up some time ago; it shows the power supply unit and the charging circuit (it is a module I bought from eBay).
 

Offline PeterFW

  • Frequent Contributor
  • **
  • Posts: 577
  • Country: de
    • Stuff that goes boom
Hello!
Do you have to save the calibration?

I have run through the calibration routine and the unit works like it should, but after power cycling the unit resets and the factory calibration is loaded again (which is out of spec).

Greetings,
Peter
 

Offline PeterFW

  • Frequent Contributor
  • **
  • Posts: 577
  • Country: de
    • Stuff that goes boom
OK... after a bit of dicking around, once I figured I had nothing to lose, I found it.

Select "F5" and then change to "Y" with the up/down keys and hit "OK" to save it.

Edit:
The INA226 I put on the output makes this thing nearly "self-calibrating", if you can call a cheap Chinese board that.
« Last Edit: April 21, 2016, 03:29:23 am by PeterFW »
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Glad you found the problem.  The UI is indeed lacking.  Coupled with the small 4-digit 7-segment LED display, which doesn't give a lot of feedback, it is hell to use.

I found that you often have to calibrate the HI (30V), then go back to the LOW (2V), for a few cycles to get the closest match.

Bear in mind that you should adjust the display accuracy first (F1/F2) so it displays the voltage as accurately as possible, then do the regulation part (F3/F4).  Otherwise, for example, if your display is off by 10%, adjusting the regulated voltage to match the display will result in the output being 10% off.
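
A quick worked number (made up, just to illustrate why the order matters): if the display reads 10% high and you tweak the output until the display shows your target, the real output lands roughly 10% low.

  #include <stdio.h>

  /* Illustration only: why the display (F1/F2) calibration must come
     before the regulation (F3/F4) calibration.  Values are invented. */
  int main(void)
  {
      double display_gain = 1.10;   /* display reads 10% high           */
      double target       = 5.00;   /* volts you dial in on the display */
      double real_output  = target / display_gain;

      printf("display shows %.2fV, real output is %.2fV (%.1f%% low)\n",
             target, real_output, 100.0 * (1.0 - real_output / target));
      return 0;
  }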

Rick
 

Offline PeterFW

  • Frequent Contributor
  • **
  • Posts: 577
  • Country: de
    • Stuff that goes boom
Glad you found the problem.  The UI is indeed lacking.  Coupled with the small 4-digit 7-segment LED display, which doesn't give a lot of feedback, it is hell to use.

Indeed, when I first got the unit I was happy to see it had a UART port and thought that maybe I could control it remotely.
That would solve so many problems.

Quote
Bear in mind that you should adjust the display accuracy first (F1/F2) so it displays the voltage as accurately as possible, then do the regulation part (F3/F4).  Otherwise, for example, if your display is off by 10%, adjusting the regulated voltage to match the display will result in the output being 10% off.

Thanks!
I shorted the output to calibrate the current reading and put a resistor on the output to calibrate the current regulation.
Is this the way it should be done? As far as I can tell, the unit seems to work reasonably well now.
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
As far as I can tell, the UART is connected to the MCU.  But it is either ignoring it, or the commands are so cryptic that I cannot figure out how to get it to communicate.  I managed to read a character from it during power up/down at 38400,N,1 (if I recall the setting correctly) - that could have been just noise in the system; I don't recall exactly.  I tried sending it 'B' for begin, and continued through the entire alphabet.  I tried some guesses like "V=1.234".  It was fruitless.  I have abandoned the effort.

- - -

As to current reading calibration, you just need to make sure you have something to chew up the power it puts out.  You do need to "match" the reading with the real current, so you need a current meter or a DMM with current measurement.  Since you are doing it without a DMM, you may be doing it wrong.

You match the "real current" (DMM reading) with the B3603 display by adjusting the display value to the real one.

Say you are doing the LO (200mA).  The B3603 sets its output to 200mA; say your DMM says it is actually 250mA, so you adjust the display value to the "real" current, which is 250.  Once you stop fiddling with the value, the B3603 adjusts its output again to what it thinks is 200mA.  Say your DMM now says it is 195mA; you adjust the B3603 to read 195mA.  Now the B3603 increases it a bit again to what it thinks is 200mA.  Each cycle the difference reduces.  You stop when it starts to increase - do one more cycle, which should bring the delta back down.  Typically, you will bounce around an error of something like 2 digits high or 1 digit low.  Once you get the closest match to the "real" reading from your DMM, you move on to adjusting the HI.  (Note: on the first cycle I would leave it at +/-2mA or more; read on.)  For now, let's say you matched it to exactly 200mA.

Now you adjust the HI value, which is 2A.  Once that matches, you will find your LO (200mA) no longer matching.  You do the LO again, but then you will find the HI off a bit again.

Each cycle, the delta decreases.  Eventually, you get both HI and LO as close to real as possible.  Once you are done with the display (ADC reading) part, you calibrate the output (regulation) part.  Adjusting that may have a small effect on the display part, so you may want to do it again until the deltas are within your tolerance.

Do not expect both LO and HI to match the real world exactly.  It is a 10-bit ADC, with the firmware doing software oversampling to increase the resolution a bit (I think probably to 12 bits).  If it is indeed 12 bits, a perfect world would give you 1/4096 resolution, which means you could in theory get all 4 digits to match.  But it really is not that stable, so do not expect it to match exactly.  As you adjust, your current-sense resistor (and your load resistor) will both heat up quickly enough to mess up the reading anyhow.

I typically calibrate with a PC fan cooling it, so when I need a better current reading or am running over 1.5A, I put the fan on.  Even with that, adjusting 2A/200mA is difficult.  It heats up during the 2A adjustment and cools down during the 200mA adjustment, making it very taxing on your patience.

I upgraded the current-sense resistor to one with a better temp-co.  It helps, but not by much.  Search back a few replies for a report on the results of the current-sense resistor replacement.
« Last Edit: April 21, 2016, 06:24:05 pm by Rick Law »
 
The following users thanked this post: PeterFW

Offline PeterFW

  • Frequent Contributor
  • **
  • Posts: 577
  • Country: de
    • Stuff that goes boom
Since you are doing it without a DMM, you may be doing it wrong.

No worries, I have all the wires in the world hooked up to the thing :-)


Thank you for your reply about the calibration!
The part (F2) about the current reading I have figured out; I have a problem understanding the current regulation (F4).

I calibrated the unit (F2) as you explained: I put the DMM, set to DC current, across the output and ran through the loops.
The DMM practically "shorts" the output in this case, and the power supply runs in constant current mode, in which I calibrate it.

But the F4 function (current regulation) has me a bit stumped. I mean, it worked, but I do not know why, since I cannot adjust it; I just hit the "SET" button and the unit does... something.
For the F4 calibration I put a resistor on the output, 6.4 ohm / 10W in this case - it was the next best one I found.
It shows a current reading, and all I can do is hit the "SET" key and the value changes; the UP/DOWN keys have no function.
This way I run through the loops with the OK key for the HIGH and LOW readings.

Greetings,
Peter
 

Offline JonyBC

  • Newbie
  • Posts: 2
  • Country: pt
Hi guys. I also have some problems with the calibration - most probably with the unit.

I was powering it with a step-up buck converter from 5V to 40V, and maybe I overvolted it to >40V - not really sure. Around the same time, I shorted the converter pins, and now it is getting hot and all - probably dead.

But I tested the voltage regulator on the side and it was fine.

The next day it woke up all crazy. The regulation itself, if I use a DMM, is good: be it voltage or current, I externally measure the exact set values.

But the readings on the unit are way off: always 10V and 0 amps (even after a factory reset).

But when I go to calibrate the readings, they read OK - maybe 0.1 off, in V and A - but it never accepts the calibration. (It is always the same offset.)

When I calibrate the voltage regulation and save it, it starts to read the voltage almost OK (0.5 to 1V off).

But the current calibration never changes. It always reads my last input when calibrating, but 0 when in use.

The LEDs however function OK, and it trips the current and voltage at the set values (measuring with an external DMM).

Has this ever happened to anyone?

Thanks

 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Hi guys. I also have some problems with the calibration - most probably with the unit.
...


JonyBC,

I am no EE expert, but I am experienced with the B3603, so let's see if I can help here...

First, make sure you are calibrating it right.  If your read calibration is not correct (the first volt/current calibration), the regulation calibration (the second calibration) is meaningless.  Focus on the read calibration for both V and I first and make sure it is within 2% or so.  If the B3603 is healthy, it should reach 2% accuracy without too much headache - do it at no load or a very, very light load (200mA).  If you are 0.5V off after careful calibration, you may have a hardware problem, so read on.


re: "I was powering it with a step up Buck converter from 5v to 40v, and maybe i overvoltage it >40v, not really sure"

Let me understand this, your input to the B3603 is a Buck converter, did I understand you right?  And you might have had input to the B3603>40V, right?

40V is the B3603 max input voltage limit.  Depending on how long and how much over, the B3603 might have been damaged.    The caps are only 50V cheap caps, so 40V is on the high side of the margin already.  Over 40V is not healthy at all!  The operating limit is 36V.

Powering it via a Buck converter is ok, but you are adding noise.  Also you need to watch the power drawn and oscillation.  I have power source that when above a certain current, it (when working with another boost/buck board) really really oscillates!


re: "I shorted the converter pins and now its getting hot and all, problaly dead"

What converter pins?  If you mean the B3603 output, the 3603 should handle it ok if it is brief.  We need to know what pins you mean, and what part got hot.

The B3603 should take the shorting for sub-seconds.  Within sub-second, the B3603 will sense the over-current and drive it down to the current limit.  More than sub-second, and depends on the current-limit setting, something bad might have happened.  It also depends on how you feed the B3603.  Your setup (buck feeding buck), your buck that feeds B3603 input might not have worked well.  When you shorted it, the B3603 draws a lot from your feeder buck circuit.  I may draw so much the feeder voltage collapsed - to a point that the 3603 is not getting enough voltage to operate the controlling electronics.  If so, it is no longer regulating and it would be outputting as much as the supply can give.  That means a lots of current and likely damage might have occurred.


Re: "But the readings on the unit are way off. always 10v and 0 amps (Even after factory reset" & "The Leds however  function OK and it trips the current and voltage at the set value(Measuring with external DMM)"
I understand as "The B3603's LED reads 10V 0mA which is not what you set, but your DMM get around the set voltage/current" Do I understand correctly?  If so, and you have redone your calibration carefully, you have a Hardware problem somewhere... Read on, check the stuff explained below.  You may have killed the XL1509 part and it went on to do some damage.


Some info that may help you debug and see if something is wrong:


Before we start, a quick overview:  (First) the input caps are rated at only 50V; much over 40V and you are toast.  (Second) the B3603 has two voltage regulators connected to Vin: the LM2596S for the B3603's output, and an XL1509-ADJ powering the internal electronics.  One or both could be damaged.  If you have a scope (and DMM), look at both the B3603 output and the internal voltage.  To look at the internal voltage, connect to pin 12 as negative and pin 13 as positive (for the pin number definitions, see reply #21).  The XL1509 output should be 5V.  I would look at that both with and without the top board connected.  A third regulator is on the bottom side of the top board.  This one does not connect to Vin; instead it draws from the XL1509.

Since you said the LEDs are not showing the right values, let's see if the display is lying or if something is passing it bad info.  If it is getting bad info, the ADC part of your circuit may be shot.  The voltages to be measured (which translate to readings on the LEDs) are amplified by an MCP6002 rail-to-rail op-amp.  That op-amp will blow above 6V.  The MCU that does the ADC gets its power from another voltage regulator; that part should also be checked.  The ADC input (i.e. the MCP6002 output) is a mapped voltage (the B3603's output voltage/current translates to a certain voltage at the ADC input).  It should be fairly linear.  Check against that: get a few data points and see if it really is linear.  It should approximate:
  Real output voltage = ADCinputVolt * slope + intercept
  Real output current = ADCinputVolt * slope + intercept
(each with its own slope and intercept)

It doesn't matter what the slope and intercept are; you can still check whether it is linear.  An output increase of 1V, say from 5V to 6V, should show a certain delta in the ADC input voltage.  That ADC-input change for a 1V output change should be very, very close to the delta you get when you increase the output from 6V to 7V, and very close again when you go from 7V to 8V.
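
If you prefer to let the computer do the comparison, here is a minimal sketch of that check (the sample numbers below are placeholders, not real readings - plug in your own output-volt / ADC-input-volt pairs):

  #include <math.h>
  #include <stdio.h>

  /* Linearity sanity check: for equal steps in output voltage, the
     ADC-input voltage should change by (nearly) the same amount.
     The data points are placeholders - replace them with your readings. */
  int main(void)
  {
      double vout[] = {5.0, 6.0, 7.0, 8.0};        /* measured output, V            */
      double vadc[] = {0.62, 0.75, 0.88, 1.01};    /* voltage at the MCU ADC pin, V */
      int n = sizeof vout / sizeof vout[0];

      for (int i = 1; i < n - 1; i++) {
          double d1 = (vadc[i]     - vadc[i - 1]) / (vout[i]     - vout[i - 1]);
          double d2 = (vadc[i + 1] - vadc[i])     / (vout[i + 1] - vout[i]);
          printf("slope %.0fV->%.0fV: %.4f   %.0fV->%.0fV: %.4f   mismatch: %.1f%%\n",
                 vout[i - 1], vout[i], d1, vout[i], vout[i + 1], d2,
                 100.0 * fabs(d2 - d1) / d1);
      }
      return 0;   /* big mismatches point at the op-amp/ADC path */
  }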


So, test 1:

Look up the pin definitions (described above) and test whether the op-amp is feeding something linear to the MCU's ADC.  Test the translated voltages for both the voltage and the current.  For the right pins to test, look up the reply with the pin definitions mentioned above.

If that linear relationship (described above) is not holding, the op-amp part of your circuit is not well.  If your XL1509 is feeding it properly (i.e. a good, clean output at pin 12 & pin 13 - read on for more details), your op-amp needs replacement.


Test 2: is the XL1509 (part of the circuit) healthy?

If you see the B3603 output with a noise level like the one I posted in the OP, you are marginal at best.  If you see the internal 5V with noise like reply #173, your XL1509 (part of the circuit) is not well.  If you don't have a scope, maybe you can catch it from the DMM reading jittering; a scope would show it much better.  When I did the OP, I did not realize my XL1509 was probably marginal.  It later blew the display LED, so I began debugging; it was then that I found the bad XL1509 (part of the circuit).  Post repair, the Vout noise dropped by a third.  If you see an internal electronic noise level that high, and the internal voltage is not within 5% of 5V (preferably within 1-2%), you need to look at that part of the board.

Note: I say "XL1509 (part of the circuit)" because it may not be the XL1509 itself.  In my case, described in reply #173, it was the inductor!

Let's see what we get so far.  Make sure the calibration is proper, then see whether both the B3603's output and the XL1509's output are healthy.

Good luck.  Keep us posted.

Rick
« Last Edit: April 29, 2016, 10:36:13 pm by Rick Law »
 

Offline JonyBC

  • Newbie
  • Posts: 2
  • Country: pt
Hi Rick Law.

First of all, thanks for your detailed reply  :D


- - Let me make sure I understand this: your input to the B3603 is a buck converter, did I understand you right?  And you might have had the input to the B3603 >40V, right?

Yes, exactly that. The more I think about it, the more I am almost sure I overvolted it. Somehow I had the idea of 40V tops and not 36V, and then I gave it even more... |O


- Powering it via a buck converter is OK, but you are adding noise.  Also, you need to watch the power drawn and watch for oscillation.  I have a power source that, above a certain current (when working with another boost/buck board), really, really oscillates!

I had been doing that for some time as well, and it was perfectly fine for me. I was mostly using it to charge a car battery from a power bank, and to trigger all sorts of automotive stuff, with great success - until I got greedy, I mean.

- What converter pins?  If you mean the B3603 output, the 3603 should handle it OK if it is brief.  We need to know what pins you mean, and what part got hot.


Sorry for the confusion. I shorted the output of the step-up converter while it was connected to the input of the B3603. Now the step-up converter works OK for a minute, but then the temperature and current consumption start jumping until it stops working.

I checked and found no fault in the B3603 at that time; all seemed normal. Only a day later did it go crazy.

The B3603 should take a short for sub-second durations.  Within a sub-second, the B3603 will sense the over-current and drive the output down to the current limit.  For more than a sub-second, depending on the current-limit setting, something bad might have happened.  It also depends on how you feed the B3603.  In your setup (buck feeding buck), the buck that feeds the B3603's input might not have coped well.  When you shorted it, the B3603 drew a lot from your feeder buck circuit.  It may have drawn so much that the feeder voltage collapsed - to the point that the 3603 was not getting enough voltage to operate the controlling electronics.  If so, it is no longer regulating, and it would output as much as the supply can give.  That means a lot of current, and damage might well have occurred.



So it's not really a good idea to power it with only 5V? It may work, but if I short it and the supply comes down to 3 volts, it may get into that condition, right? I think the power bank circuit cuts off before that, but it's always good to know - thanks. I have used this setup before for fooling a control unit into thinking the B3603 was a sensor output, and it worked great; no considerable current in that situation.

- I understand this as "The B3603's LED reads 10V / 0mA, which is not what you set, but your DMM reads around the set voltage/current."  Do I understand correctly?  If so, and you have redone your calibration carefully, you have a hardware problem somewhere... Read on and check the stuff explained below.  You may have killed the XL1509 part, and it went on to do some damage.

Yes, exactly. Every time I do a factory reset, the readings are always the same - 10V, 0mA - no matter the settings or output current.

But if I forget about the readings and use external measurement, everything works fine.

No matter how many times I try to do the read (V and A) calibration, it's always off a little and it never changes. But the strange thing is that while calibrating the read function, the B3603 manages to read - just not during normal operation.

Anyway, if I do the voltage regulation calibration (which, by the way, is way off - like 20V off), I have to do it maybe 4 times. Then the voltage reading starts working almost normally (0.5V off).

Now the current regulation - it's strange. Let me try to explain. I start the calibration.

- It starts with the low setting, 200mA. On the external DMM it's OK, but on the B3606 it says 2 AMPS,
  and then I press down like a million times (normal for this unit, I guess), get to 200mA, and set it.

- After that it starts the high setting (not sure now if it's 2 amps or 1, but I will say 2). On the external DMM it reads OK,
  but now the B3606 says 200mA, or whatever value I entered before for the low setting. So, a million clicks later, I get to 2 amps and set it.

- It comes back to the low setting. Once again the external DMM reads 200mA, but the B3606 shows 2 amps on screen.

And I think this cycle never stops. I have tried 3 cycles and it is always the same.

I will try to do the tests that you suggested.

Once again, Rick, thanks for your reply  :-+


 

Offline iglesigu

  • Newbie
  • Posts: 7
  • Country: es
Hi, very good review of the B3603.
I received one last week, but when I tried to adjust the output voltage below 1 volt, it returned to 2.16 volts.
Does anybody have the same problem? I need a voltage below 1 volt but I can't get it.
Thank you for any advice.
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Hi Rick Law.

First of all, thanks for your detailed reply  :D
...

(1)  The input is rated for 6V to 40V.  Going over 40V could be the cause of your problems.

(2)  Supplying the B3603 with 5V is marginal.  The XL1509 cannot put out 5V when its input is 5V, so your on-board electronics are under-powered.  Best to follow the designed minimum of >=6V.

(3)  The "condition" is load-caused Vin drop is not related to 5V in per-se.  It could happen even if Vin is >6V.  Some power supply just drop the voltage if the current draw is exceeded, and it doesn't care that the B3603 needs 6V.  Once when say it drops to well below the B3603's needs to operate, it is "temporary brain dead" and just let whatever can flow through the system go through.

(4)  You said "Now the current Regulation, its strange, ... and then i press down like a milion times(Normal for this unit i guess) and i get to 200 mA and set it. ..."

Yeah, it is hard to operate.  During regulation calibration, it displays the "real" current.  Once you press up/down to change it, it displays your "modified" current.  Then, a short time later, it goes back to showing the real current.  It also seems to purposely ignore your key presses; I think there is a timed ignore in there to stabilize the regulation/display.  It is very frustrating.  You have to press up/down long enough for the B3603 to see it, but not so long that it goes into "auto repeat" and jumps up tens of clicks instead of the one click you intended.

(5) You said "...the low setting, 200mA. On the external DMM it's OK, but on the B3606 it says 2 AMPS..."
The calibration current for LOW is 0.200A and for HIGH is 1.200A.  If you see 2A, either your calibration for the current display is not right, or the ADC/op-amp is not well.

(6) You said "Yes Exactly Everytime i do a Factory Reset, The readings are always the same 10v 0mA it doenst matter the settings or output current."

Make sure you save (F5) your new (hopefully correct) calibrated reading.  Once you saved it, it will start with your calibrated reading.  That it doesn't have a good factory-reset reading is annoying, but the factory-reset value really doesn't matter if you saved your better calibration.
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Hi, very good review of the B3603.
I received one last week, but when I tried to adjust the output voltage below 1 volt, it returned to 2.16 volts.
Does anybody have the same problem? I need a voltage below 1 volt but I can't get it.
Thank you for any advice.

I can regulate down to 0.09V.  At that low a count, the accuracy is not great (the next click up is not 0.10 but 0.12), but it does regulate down that low.

Since you just received your unit, perhaps it is operator error.

There are two ways to set the voltage regulation:
1. HOT - while voltage is already being output
   While the voltage is being output and displayed on the LED,
   pressing UP/DOWN changes the regulated output voltage.
   This method does not save the setting.  It lasts only until you power off.
2. COLD - voltage is not ON (no volts out)
   While voltage is on, press SET; now it goes COLD with no voltage coming out.
   Press UP/DOWN to set a voltage, [edit: I missed a step here]
   press SET to accept this voltage; the display should show 4 dashes. [end edit]

   Press OK to save and begin output.
   This method saves the setting.  This voltage setting survives power-off.

Try both and see if that is the source of the confusion.  By the way, HOT and COLD are my words; you won't find them in the manual.  I just find HOT/COLD good words to describe and highlight the differences between the two modes.  The amperage setting has the same HOT/COLD modes.
« Last Edit: May 04, 2016, 04:40:30 pm by Rick Law »
 

Offline iglesigu

  • Newbie
  • Posts: 7
  • Country: es
Thank you Rick,
I tried what you said, but when I set it in "HOT" the display goes down and then, after a second, it increases to a minimum of 2.7 volts. In "COLD" I did the same, but when the voltage comes out, it increases again.
I have already bought two of these, but I have the same problem with both. Could it be another problem?
thank you again
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Thank you Rick,
I tried what you said, but when I set it in "HOT" the display goes down and then, after a second, it increases to a minimum of 2.7 volts. In "COLD" I did the same, but when the voltage comes out, it increases again.
I have already bought two of these, but I have the same problem with both. Could it be another problem?
thank you again

Sorry, I missed a step.  (I also edited the prior post so it doesn't confuse others.)

Try COLD again:

2. COLD - voltage is not ON (no volts out)
   While voltage is on, press SET; now it goes COLD with no voltage coming out.
   Press UP/DOWN to set a voltage, [edit: I missed a step here]
   press SET to accept this voltage; the display should show 4 dashes. [end edit]

   Press OK to save and begin output.
   This method saves the setting.  This voltage setting survives power-off.
 

Offline iglesigu

  • Newbie
  • Posts: 7
  • Country: es
Thank you again Rick,

I did as you said, on both devices (I have bought two). I saved the voltage (e.g. 1 volt), but when the voltage comes out (after pressing OK), the display goes up to 2.71V (and so does the actual voltage - I could measure it with a voltmeter). Then, when I press SET (so the voltage output is off), the display shows the initial value of 1.00V again.
It happened with both devices, and that is my doubt: are both devices faulty, or am I making a mistake? Maybe the microcontroller does not get feedback from the output?
One thing: when I connect a resistor, e.g. 1000 ohms, the voltage and the display show the preset voltage.
Thanks in advance.
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Thank you again Rick,

I did as you said, on both devices (I have bought two). I saved the voltage (e.g. 1 volt), but when the voltage comes out (after pressing OK), the display goes up to 2.71V (and so does the actual voltage - I could measure it with a voltmeter). Then, when I press SET (so the voltage output is off), the display shows the initial value of 1.00V again.
It happened with both devices, and that is my doubt: are both devices faulty, or am I making a mistake? Maybe the microcontroller does not get feedback from the output?
One thing: when I connect a resistor, e.g. 1000 ohms, the voltage and the display show the preset voltage.
Thanks in advance.

I wonder if it is a different board version.  I have another B3603 that has a different board layout.

Stay tuned.  I will check all my B3603's - I got two of one board layout and one of another layout.

I am busy with another project at the moment, so I can't mess around with my "bench" setup for now.  I will try tomorrow and update you.
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
OK, I can confirm that all three of my B3603 units regulate down to 0.1 volts quite well.  Two of my units are of the same board design, while my first one (unit 3) is of a different design.

For a look at how those units looks different, refer to reply #180 here:
https://www.eevblog.com/forum/reviews/b3603-dcdc-buck-converter-mini-review-and-how-the-set-key-could-be-fatal/msg654541/#msg654541

Units 1 and 2 have presumably later (numerically larger) serial numbers, within 10 of each other.  They have a black inductor and green capacitors.  Unit 3 is the one with a white inductor and black capacitors.  It presumably has an older (numerically smaller) serial number, but it was my first purchase.

iglesigu, try this exact process and see what you get.  If you do not have a 0.1 ohm resistor and a 500 ohm pot, just use a single resistor of 100 ohm to 500 ohm, anywhere around there.  See if the output voltage is around 100mV.  Also, look at my pictures in reply #180 and see if your board looks different.  Let me know...

Test Details:

I connected the B3603 to a 0.1 ohm current-sense resistor in series with a 500 ohm VR.  The 500 ohm pot allows me to alter the current.  A DMM is connected across the 0.1 ohm resistor (1mV = 10mA) to monitor the current, so I have a reading outside the B3603's own sensor, and my UT61E measures the output voltage.

Load:

Omitting wire and connection-point resistance, at minimum load it is 500 ohm + 0.1 ohm, and at maximum load it is 0.1 ohm.

B3603:
All three units are set up the same way.
- Connect the unit to the power supply and turn it on without the output connected.
- Press SET to change the voltage, press UP/DOWN to 00.10 (volts), and
   then press SET again to save the voltage setting at 00.10.
- After a brief four-dash [----] display, the voltage is now saved (it survives power off)
  and the unit displays 00.10.
- Press SET again and it displays X.XXX (note the decimal position).  X.XXX is my current limit.
  If it is between 500mA and 1.000A, it is low enough; I just press OK and begin.
- If it is > 1A, I lower it to between 0.500 and 1.000 and press SET again to accept.
- Now press OK for it to go HOT (i.e. output the voltage), and connect that to my resistor + VR in series.

B3603 - unit 1 (Serial ending 72)

Minimum load 0.1V unit display: 0.09V/0mA, DMM displays: 94mV/0mA
Maximum load 0.1V unit display: 0.09V/47mA, DMM displays: 86mV/47mA
** Power off and restart
0.1V is saved properly, unit with 0.09V display
Vout is +- 2mV Current is +- 2mA of DMM.

B3603 - unit 2 (Serial ending 60)

Minimum load 0.1V unit display: 0.10V/0mA, DMM displays: 97mV/0mA
Maximum load 0.1V unit display: 0.10V/52mA, DMM displays: 96mV/55mA
** Power off and restart
0.1V is saved properly, unit with 0.10V display
Vout is +- 1mV
Current is +- 1mA of DMM.

B3603 - unit 3 (Serial ending 58)

Minimum load 0.1V unit display: 0.09V/0mA, DMM displays: 104mV/0mA
Maximum load 0.1V unit display: 0.09V/53mA, DMM displays: 97mV/54mA
** Power off and restart
0.1V is saved properly, unit with 0.10V display
Vout is +- 1mV
Current is +- 1mA of DMM.
« Last Edit: May 06, 2016, 09:31:51 pm by Rick Law »
 

Offline iglesigu

  • Newbie
  • Posts: 7
  • Country: es
Thank you,
I did all the steps. The two B3603s have the white inductor, as you can see in the picture.
I have tested with only two resistors, 470 ohm and 3.3 ohm.

B3603 - unit 1

Minimum load (470 ohm), 0.1V set - unit display: 0.12V / 0A, DMM displays: 71 to 79 mV (oscillating) / 0.15 mA
Maximum load (3.3 ohm), 0.1V set - unit display: 0.14V / 0.011A, DMM displays: 130 to 139 mV (oscillating) / 12.33 to 12.54 mA (oscillating)

B3603 - unit 2

Minimum load (470 ohm), 0.1V set - unit display: 0.13V / 0A, DMM displays: 65.2 to 70 mV (oscillating) / 0.14 mA
Maximum load (3.3 ohm), 0.1V set - unit display: 0.14V / 0.012A, DMM displays: 60 to 103 mV (oscillating) / 11.90 to 13.87 mA (oscillating)


thank you
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Great!  It looks like the procedure got your units past the 2.71V you were stuck at, and they regulate down to 0.1V.

I would not worry about that small oscillation when the voltage is that low.  The noise you pick up from the environment is a larger factor when you are measuring that low.

How is the DMM versus B3603 readout at higher voltages?  Good enough?  Looking at your readings, it is hard to tell at that very low voltage (hence the expected high error), but it does look like you could get some benefit from carefully re-calibrating the darn things.

Even though the 4-button, 4-digit LED UI is awful, the B3603 is a nice little unit!

 

Offline iglesigu

  • Newbie
  • Posts: 7
  • Country: es
Thank you Rick,

I just tested on a friend's device (a B3608, with more current). But when I set the voltage to 0.1V (100mV) and press output (OK) without a load (resistor), the display of the unit keeps the voltage at 0.1 and not 2.7V like the other B3603s I have. Is that OK?
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Thank you Rick,

I just tested on a friend's device (a B3608, with more current). But when I set the voltage to 0.1V (100mV) and press output (OK) without a load (resistor), the display of the unit keeps the voltage at 0.1 and not 2.7V like the other B3603s I have. Is that OK?
(Bold added to pinpoint specific words in the quote.)

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
Wait...  In your post quoted above [May 10, the second reply after my test], you are saying that your unit displays 2.7V, whereas in the May 9 post quoted below [the first reply after my test], you have the voltage display in the 0.1V range.  (I added the bold + underline.)

Is it displaying 2.7V or 0.1V (about)?

Your first (May 9) reply, quoted below, is in the 0.1V range and agrees with your DMM, so it indicates all is well.  If it is like your second reply above - the DMM around 0.1V and your unit showing 2.7V - your unit needs re-calibration.
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Thank you,
I did all the steps. The two B3603s have the white inductor, as you can see in the picture.
I have tested with only two resistors, 470 ohm and 3.3 ohm.

B3603 - unit 1

Minimum load (470 ohm), 0.1V set - unit display: 0.12V / 0A, DMM displays: 71 to 79 mV (oscillating) / 0.15 mA
Maximum load (3.3 ohm), 0.1V set - unit display: 0.14V / 0.011A, DMM displays: 130 to 139 mV (oscillating) / 12.33 to 12.54 mA (oscillating)

B3603 - unit 2

Minimum load (470 ohm), 0.1V set - unit display: 0.13V / 0A, DMM displays: 65.2 to 70 mV (oscillating) / 0.14 mA
Maximum load (3.3 ohm), 0.1V set - unit display: 0.14V / 0.012A, DMM displays: 60 to 103 mV (oscillating) / 11.90 to 13.87 mA (oscillating)


thank you
 

Offline iglesigu

  • Newbie
  • Posts: 7
  • Country: es
Thanks,
To summarize - I was a little muddled.

Without OK pressed (no voltage output), the unit display shows whatever value I set (e.g. 1V). But when I press the OK button (voltage output on) without any load, it goes from 1V to 2.7V and stays there (a problem only if the set voltage is below 2.7V). The DMM displays the real final voltage (2.7V).
This does not happen when I put on a load, as I said in the previous test with R = 480 ohm and lower.
The unit and DMM then display the correct value.
Thanks

 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Thanks,
To summarize - I was a little muddled.

Without OK pressed (no voltage output), the unit display shows whatever value I set (e.g. 1V). But when I press the OK button (voltage output on) without any load, it goes from 1V to 2.7V and stays there (a problem only if the set voltage is below 2.7V). The DMM displays the real final voltage (2.7V).
This does not happen when I put on a load, as I said in the previous test with R = 480 ohm and lower.
The unit and DMM then display the correct value.
Thanks

Since you tried it on your friend's B3603 and it did 1V properly, your unit is failing in some way.  It is interesting that it only fails without an external load and appears to work with an external load.  I would not, however, trust it to regulate properly without thoroughly testing it first.

It might be interesting to know at what external load it begins failing, but that would be just for curiosity, as fixing it would not be a simple task.
 

Offline iglesigu

  • Newbie
  • Posts: 7
  • Country: es
Thanks Rick,
By the way, my units have a black button while my friend's unit has a red one. Maybe it has a different chip?
I will try to do a scan over a range of load values and post it.
Thanks again.
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Thanks Rick,
By the way, my units have a black button while my friend's unit has a red one. Maybe it has a different chip?
I will try to do a scan over a range of load values and post it.
Thanks again.

New thought: your regulation calibration is way, way off.  It is possible that the offset is so far from what it should be that when you ask for 1V, it actually outputs 2.7V.  It looks like your sensing/display calibration is OK, since it matches your DMM.  See what it generates when you regulate it to 30V, 20V, 10V, 7.5V, 5V, and 3V.  If your regulation is way off, you should see it particularly as you go from 7.5V down to 3V.

If so, try recalibrating.  You need a 36V power source. 33V-ish will do, but nearer to 35V-36V is better.  40V is max, but I would not go that near the limit.

Good luck!

By the way, a new chip is possible, but I rather doubt it.
 

Offline hobby16

  • Newbie
  • Posts: 1
  • Country: fr
Hi all,
Thank you all for the thread and the good read that helped my work a lot.
Coming a bit late to this thread, but I have finished a new user interface for the B3603. I've been able to squeeze all the necessary functionality into the module's STM8S003F3:
- use of a rotary encoder as the input interface (no pushbuttons), with acceleration sensing
- floating-point calculation
- conversion ADC -> volts, amps using degree-3 polynomials (of course, an affine conversion is also possible with the x^3 & x^2 coefficients = 0)
- floating-point first-order filters
- calculation & display of power (V x I), mAh & Wh
- UART interface for data logging, with Tx at 9600 baud
- sampling period for logging from 1s to 100s
- datalogging of the input voltage as well
- a console on Rx to accept basic commands: voltage & current setpoint setting, polynomial coefficient changes for calibration, output enable/disable
- the unused pin 8/CON1 is used as an output for controlling an external fan when the current is above a (changeable) threshold
- setpoint values, Wh & mAh and parameters are stored in EEPROM and restored automatically at power-up

With datalogging, the B3603 module is not only a PC-controllable power supply; it can also be used to datalog the voltage and current of an external supply.
The performance is, as stated by other users above, amazingly good (when you set the output voltage to 10.00V, the real voltage will be 10.00!). With a good user interface (which the original version does NOT have), the module is now really usable. I've used it to test LEDs and motors, to charge batteries...
The nice thing is that since it can be recalibrated by the end user using an ASCII console (a process that can be automated by scripting), you can essentially get the best accuracy in the interval you wish.

I will post a video to show how the rotary encoder has made the user interface really really user-friendly. More documentation and pictures will come. I'll keep the firmware proprietary for the moment.

About the weird wiring of the pushbuttons in the original schematics: it is a multiplexing scheme to scan 4 buttons using only 2 I/O pins. Here is how it works, for forum members who want to get back to programming the STM8S:
button OK : PC7 = in pullup, PD1= out 0V, read PC7=0 => pressed
button UP : PC7 = out 0V,    PD1 = in pullup, read PD1=0 => pressed
button SET: PC7 = in pullup, PD1 = in pullup, read PC7=0 => pressed
button DN : PC7 = in pullup, PD1 = in pullup, read PD1=0 => pressed

I have connected the rotary encoder to the button pins and successfully used the same scanning scheme to read the encoder (see the wiring diagram). The scanning is fast enough to handle encoder acceleration (if it is turned fast enough, each step will be +/-10 instead of +/-1). With acceleration sensing, it's much easier and faster (and IMHO more fun) to make big changes, for example from 3.30V to 25.00V.
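
For anyone who wants to play with this, here is a bare sketch of that 2-pin scan in C (register names as in ST's stm8s.h header; no debouncing, and this is only an illustration of the scheme above, not the actual firmware):

  #include "stm8s.h"

  /* Scan the 4 multiplexed buttons on PC7/PD1 as described above.
     Returns a bitmask: bit0=OK, bit1=UP, bit2=SET, bit3=DOWN. */
  uint8_t scan_buttons(void)
  {
      uint8_t keys = 0;

      /* OK: PC7 input with pull-up, PD1 driven low */
      GPIOC->DDR &= (uint8_t)~(1 << 7);  GPIOC->CR1 |= (1 << 7);
      GPIOD->DDR |= (1 << 1);            GPIOD->ODR &= (uint8_t)~(1 << 1);
      if (!(GPIOC->IDR & (1 << 7))) keys |= 0x01;

      /* UP: PD1 input with pull-up, PC7 driven low */
      GPIOD->DDR &= (uint8_t)~(1 << 1);  GPIOD->CR1 |= (1 << 1);
      GPIOC->DDR |= (1 << 7);            GPIOC->ODR &= (uint8_t)~(1 << 7);
      if (!(GPIOD->IDR & (1 << 1))) keys |= 0x02;

      /* SET and DOWN: both pins input with pull-up, the buttons pull them low */
      GPIOC->DDR &= (uint8_t)~(1 << 7);  GPIOC->CR1 |= (1 << 7);
      GPIOD->DDR &= (uint8_t)~(1 << 1);  GPIOD->CR1 |= (1 << 1);
      if (!(GPIOC->IDR & (1 << 7))) keys |= 0x04;   /* SET  */
      if (!(GPIOD->IDR & (1 << 1))) keys |= 0x08;   /* DOWN */

      return keys;
  }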

About the voltage surge at power-on mentioned above: I was very concerned, but I personally have NOT seen it. I've ordered 5 more modules; I'll check again to be sure it won't ever be a problem.




« Last Edit: July 11, 2016, 05:25:09 pm by hobby16 »
 
The following users thanked this post: dc2, ProfessorGT

Offline dc2

  • Newbie
  • Posts: 1
  • Country: de
Hi hobby16,

that's really great news  :-+
I have been following this thread for a while now, hoping that somebody could complete a new, fully functional firmware.

Are you planning to release the firmware somewhere? I'm really keen on trying it out myself  ;D
 

Offline flywheelz

  • Regular Contributor
  • *
  • Posts: 148
  • Country: us
Great review and amazing detail about the product, and a schematic  :-+.  I went with the bigger brother, the B3606, and it's the same design.  What is different is that it uses an XL4016 instead of the LM2596, and a TI 272C op-amp in place of one MCP6002.

A few days ago I needed to test a 30V zener.  Stupidly  :palm: I supplied 48V to the device, saw the LCD flash for a second, and that was the last of it.

What I've found so far:
bottom board
  • XL4016 - DEAD - VIN shorted to SW OUT - cost $1.71 for 1
  • XL1509 - DEAD - SW OUT shorted to GND - cost $1.68 for 5
  • MCP6002 - DEAD - VCC shorted to GND - cost $2.13 for 10
  • TI 272C - no shorts, unknown if works - cost $1.08 for 1

top board
  • 2 x 74HC595 - DEAD - VDD shorted to GNDD - cost $0.50 for 10

Surprisingly, I think the 3.3V regulator and the STM8S003F3 on the top board survived.  I see 1.93kHz signals on pins 4 and 5 of the connector, and the duty cycle changes on both.  The OK and CC LEDs turn on as well.

This mistake has cost me $7.10 so far.  Hopefully the parts from the various AliExpress sellers arrive and are functional.  I will give an update in time.

Update:
  • 09/05/16 - Changed both 74HC595's on the top board and the LCD started working now.  Waiting for more parts to arrive for the bottom board.
     
  • 09/21/16 - Replaced the XL4016. I ordered the wrong part, an XL1509-5.0E1 |O instead of an XL1509-ADJ.  I did end up using the XL1509-5.0E1 (had to remove R16/R17 and short the R17 pads) to get 4.98 volts.  The TI 272C hasn't shown up yet, so I installed an MCP6002 in its place and replaced the other MCP6002.  The unit is back to life :phew:.  However, the display is showing ~10mA more than actual.  Perhaps it needs calibration, but I will wait for the TI 272C first.  I was lucky the MCU survived the overvoltage, since the original firmware is not available.
     
« Last Edit: September 21, 2016, 10:13:13 pm by flywheelz »
 

Offline dan_bitsy

  • Newbie
  • Posts: 1
  • Country: us
Never mind, got it.
« Last Edit: October 22, 2016, 04:31:43 am by dan_bitsy »
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Great review and amazing detail about the product, and a schematic  :-+.  I went with the bigger brother, the B3606, and it's the same design.  What is different is that it uses an XL4016 instead of the LM2596, and a TI 272C op-amp in place of one MCP6002.
...

(Something must be wrong with notifications.)  I only got to read this today!

The 10mA extra seems like something re-calibration can easily deal with.  If you have to play with that part of the circuit, perhaps it is a good idea to swap out the current-sense resistor for a better one?  I changed mine for one with a better temp-co.  It's not worth doing just for the slight improvement, but if you are already messing around there...

Did you check the noise?

That darn thing is pretty robust.  I too had to change out the guts of one of my 3603s (due to shorting): the XL1509, the inductor and capacitor for the XL1509, a pair of 74HC595s, and a pair of MCP6002s.  Initially, I did not change the caps along with the XL1509.  It fired back up and needed a re-calibration before the readings were correct.  I suppose variation between MCP6002s is to blame.

It also came back to life with a bit more noise.  After changing the caps and inductor, the noise went back down to a level similar to my other two units.
 

Offline flywheelz

  • Regular Contributor
  • *
  • Posts: 148
  • Country: us
Right after I blew up the B3606, I ordered the B3603 as my 2nd unit so that I had a working unit while fixing the B3606.  I have the B3606 working now, but it's not properly calibrated yet, as I don't know how to dissipate 2.8 amps during the F4 step.

The other day I pulled out the 12V-80V boost converter that I had ordered a while back.  I plugged a 9V battery into it, and it was outputting a similar voltage when tested with a  :-DMM.  I figured it was safe for the B3603.  I tried turning the boost Vpot a little, but with the voltage set to 30V on the B3603, the output was still stuck under 9V.  I got the idea to plug in a 12V adapter instead of the 9V battery.  The next thing I hear is BAMM! and I smell the magic smoke  :palm:  The LM2596 actually blew apart and was dangling by one leg.  The XL1509 and both MCP6002s packed up as well on the bottom board.  The top board lost both 74HC595s and the ST MCU  :scared:  I repaired the bottom board of the B3603, but that's it.  I could fix the top board, but there is no firmware with button control as far as I know.

Quote
The 10mA extra seems like something re-calibration can easily deal with.  If you have to play with that part of the circuit, perhaps it is a good idea to swap out the current-sense resistor for a better one?  I changed mine for one with a better temp-co.  It's not worth doing just for the slight improvement, but if you are already messing around there...
  Once I figure out a way to dump 2.8A, I will try calibrating.


Quote
Did you check the noise?
I checked Vout: at 5V I see about 25mV, at 10V ~95mV, and at 12V ~30mV.

Quote
That darn thing is pretty robust. I too had to change out the guts (due to shorting) of one of my 3603s: the XL1905, the inductor and capacitor for the XL1905, the pair of 74HC595s, and the pair of MCP6002s. Initially, I did not change the caps with the XL1905. It fired back up and needed a re-calibration before the readings were correct. I suppose variation in the MCP6002s is to blame.

It also came back to life with a bit more noise. After changing the caps and inductor, the noise went back down to a level similar to my other two units.
I will need to check XL1905 output for noise.

I am getting good experience fixing these things with my hot air station  :-DD

P.S. I've subjected the little B3603 to 90 volts  :palm:  The boost converter goes past what's stated.
« Last Edit: October 22, 2016, 04:44:47 am by flywheelz »
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us

Quote
The 10mA extra seems like something re-calibration can easily deal with. If you have to play with that part of the circuit, perhaps it is a good idea to swap out the current-sense resistor for a better one? I changed mine for one with a better temp-co. Not worth doing it just for the slight improvement, but if you are already messing around there...
  Once I figure out a way to dump 2.8A, I will try calibrating.
..
P.S. I've subjected the little B3603 to 90 volts  :palm:  The boost converter goes past what's stated.

Try an older incandescent automobile light bulb?

I use a pair of 12V dual-filament brake+parking light bulbs.  Each bulb handles up to 24W with its two filaments.  They are soldered to a proto-board with 18AWG wires; soldering them to a board is mainly to ensure the hot bulbs are not wiggling around, since they can get dangerously hot.  When I want to draw the most (48W = 4A at 12V), I use all 4 filaments. (No fancy switches; I just solder the filaments via the 18AWG leads to the load-in/load-out wires.)

I have gone over 3A with that setup (the max my other power supply was able to put out).  The bulbs can dissipate the power without problem, but they are hand-burners if you're not careful.
« Last Edit: October 22, 2016, 05:12:30 am by Rick Law »
 

Offline ProfessorGT

  • Newbie
  • Posts: 1
  • Country: us
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #227 on: November 28, 2016, 04:18:26 am »
Hi all,
Thank you all for the thread and the good read; it helped my work a lot.
Coming a bit late to this thread, but I have finished a new user interface for the B3603. I've been able to squeeze all the necessary functionality into the module's STM8S003F3:
- use of a rotary encoder as the input interface (no pushbuttons), with acceleration sensing
- floating-point calculation
- ADC -> volt/amp conversion using degree-3 polynomials (of course, an affine conversion is also possible with the x^3 & x^2 coefficients set to 0); see the sketch after this list
- floating-point first-order filters
- calculation & display of power (U×I), mAh & Wh
- UART interface for data logging, with TX at 9600 baud
- sampling period for logging from 1s to 100s
- datalogging of the input voltage as well
- a console on RX to accept basic commands: voltage & current setpoint setting, polynomial coefficient changes for calibration, output enable/disable
- the unused pin 8/CON1 is used as an output for controlling an external fan when the current is above a (changeable) threshold
- setpoint values, Wh & mAh, and parameters are stored in EEPROM and restored automatically at power-up
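As a concrete illustration of the polynomial conversion and first-order filtering mentioned in the list above, here is a minimal C sketch. The coefficients are placeholders; the actual calibration data and number formats used in the firmware are not published.
Code: [Select]
/* Degree-3 polynomial ADC-to-volts conversion (Horner form) and a single-pole
 * low-pass filter.  Coefficients here are placeholders, not real calibration data. */
#include <stdint.h>

/* volts = c3*x^3 + c2*x^2 + c1*x + c0; with c3 = c2 = 0 this degenerates
 * to the affine (straight-line) case mentioned above. */
float adc_to_volts(uint16_t raw, const float c[4])
{
    float x = (float)raw;
    return ((c[3] * x + c[2]) * x + c[1]) * x + c[0];
}

/* first-order filter: y += alpha * (x - y); alpha in (0,1], smaller = smoother */
float filter_step(float y_prev, float x, float alpha)
{
    return y_prev + alpha * (x - y_prev);
}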

With datalogging, the B3603 module is not only a PC-controllable power supply; it can also be used to log the voltage and current of an external supply.
The performance is, as stated by other users above, amazingly good (when you set the output voltage to 10.00V, the real voltage will be 10.00V!). With a good user interface (which the original version does NOT have), the module is now really usable. I've used it to test LEDs and motors, charge batteries...
The nice thing is that since it can be recalibrated by the end user using an ASCII console (a process that can be automated by scripting), you can essentially get the best accuracy in the interval you wish.

I will post a video to show how the rotary encoder has made the user interface really really user-friendly. More documentation and pictures will come. I'll keep the firmware proprietary for the moment.

About the weird wiring of the pushbuttons in the original schematics: it is a multiplexing scheme to scan 4 buttons using only 2 I/O pins. Here is how it works, for forum members who want to get back to programming the STM8S:
button OK : PC7 = in pullup, PD1= out 0V, read PC7=0 => pressed
button UP : PC7 = out 0V,    PD1 = in pullup, read PD1=0 => pressed
button SET: PC7 = in pullup, PD1 = in pullup, read PC7=0 => pressed
button DN : PC7 = in pullup, PD1 = in pullup, read PD1=0 => pressed
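To make the scan concrete, here is a minimal C sketch of the table above. The register addresses follow the STM8S003 datasheet GPIO map and the settle() delay length is a guess (see Arjan's later note in this thread about the slow rise time when a pin returns from output-low to input-pull-up); it is an illustration, not anyone's actual firmware.
Code: [Select]
/* Minimal sketch of the 4-buttons-on-2-pins scan described above.
 * Check the register addresses against your own STM8S header. */
#include <stdint.h>

#define PC_ODR (*(volatile uint8_t *)0x500A)
#define PC_IDR (*(volatile uint8_t *)0x500B)
#define PC_DDR (*(volatile uint8_t *)0x500C)
#define PC_CR1 (*(volatile uint8_t *)0x500D)
#define PD_ODR (*(volatile uint8_t *)0x500F)
#define PD_IDR (*(volatile uint8_t *)0x5010)
#define PD_DDR (*(volatile uint8_t *)0x5011)
#define PD_CR1 (*(volatile uint8_t *)0x5012)

#define BTN_OK   0x01
#define BTN_UP   0x02
#define BTN_SET  0x04
#define BTN_DOWN 0x08

/* crude settling delay: the pull-up rise time is slow, so wait before sampling */
static void settle(void) { volatile uint8_t i; for (i = 0; i < 200; i++) ; }

uint8_t scan_buttons(void)
{
    uint8_t pressed = 0;

    /* both pins input with pull-up: SET pulls PC7 low, DN pulls PD1 low */
    PC_DDR &= ~(1u << 7); PC_CR1 |= (1u << 7);
    PD_DDR &= ~(1u << 1); PD_CR1 |= (1u << 1);
    settle();
    if (!(PC_IDR & (1u << 7))) pressed |= BTN_SET;
    if (!(PD_IDR & (1u << 1))) pressed |= BTN_DOWN;

    /* PD1 driven low, PC7 input pull-up: OK pulls PC7 low */
    PD_ODR &= ~(1u << 1); PD_DDR |= (1u << 1);
    settle();
    if (!(PC_IDR & (1u << 7))) pressed |= BTN_OK;
    PD_DDR &= ~(1u << 1);                /* PD1 back to input pull-up */

    /* PC7 driven low, PD1 input pull-up: UP pulls PD1 low */
    PC_ODR &= ~(1u << 7); PC_DDR |= (1u << 7);
    settle();
    if (!(PD_IDR & (1u << 1))) pressed |= BTN_UP;
    PC_DDR &= ~(1u << 7);                /* PC7 back to input pull-up */

    return pressed;
}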

I have connected the rotary encoder to the button lines and successfully used the same scanning scheme to read the encoder (see wiring diagram). The scanning is fast enough to manage the encoder acceleration (if it is turned fast enough, each step will be +-10 instead of +-1). With acceleration sensing, it's much easier and faster (and IMHO more fun) to make big changes, for example from 3.30V to 25.00V.
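The acceleration logic itself can be as small as the sketch below; the 100ms window and the millis() tick source are assumptions, not the values used in the actual firmware.
Code: [Select]
/* Toy version of the acceleration idea: two detents arriving close together
 * get a x10 step.  The 100 ms window and millis() tick source are assumptions. */
#include <stdint.h>

extern uint32_t millis(void);           /* 1 ms tick counter, assumed available */

#define ACCEL_WINDOW_MS 100u

int16_t encoder_step(int8_t direction)  /* +1 or -1 per detent from the encoder */
{
    static uint32_t last_ms = 0;
    uint32_t now = millis();
    int16_t step = ((now - last_ms) < ACCEL_WINDOW_MS) ? 10 : 1;
    last_ms = now;
    return (int16_t)(direction * step);
}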

About the voltage surge at power-on mentioned above: I was very concerned, but I personally have NOT seen it. I've ordered 5 more modules; I'll check again to be sure it won't ever be a problem.

So, Hobby16,

Haven't heard any more from you after your post, and being a complete neophyte to the electronics world, I'd certainly like to play around with your modification. Can you make the firmware and all details of completing your mod available? (rotary encoder part number, etc)

Sure would be nice.

Thanks!

So many posts of very valuable benefit to all; I just want to say thanks to everyone.  I've just been watching in the background.  I have one of these B3603 boards and put it in an enclosure to use as a tester.  Works great, but I need to buy or build one that's capable of putting out, or controlling, a 20A DC output.  I love the CV/CC qualities of this unit, and they are the whole reason I purchased one and use it.

Thanks Again!
 

Offline flywheelz

  • Regular Contributor
  • *
  • Posts: 148
  • Country: us
Anyone know if the open-source alternative firmware for the B3603 has the display and buttons working?  Are any functions missing that are in the original firmware?
 

Offline ass20

  • Contributor
  • Posts: 31
firmware with source and schematic
https://github.com/baruch/b3603
 

Offline RailWar

  • Newbie
  • Posts: 3
  • Country: ru
I recompiled the sources with Cosmic STM8 and did a deep recoding. The firmware occupies 5800 bytes. Now I am making a simple menu for the buttons. Then I'll publish the sources. There will be no new functions here; I hope others will add those.
 

Offline golub2017

  • Newbie
  • Posts: 2
  • Country: cs
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #231 on: February 21, 2017, 07:55:47 pm »
Hi, I have a dead B3606 (MCU burnt) and I bought a new one from China. I have an ST-Link V2 programmer and want to try the alternative version of the firmware.
What software do I need to compile the firmware and burn it into the chip?

Thanks
 

Offline jwasys

  • Newbie
  • Posts: 1
  • Country: nl
Hello Folks,

Using Baruch's firmware, I could not save the settings to EEPROM with my unit. I fixed this in the source.

I plan to create a battery charger / discharger with a coulomb counter, and eventually an SLA desulfator.

I really appreciate the work done in this thread!

Arian
 

Offline RailWar

  • Newbie
  • Posts: 3
  • Country: ru
My firmware is not ready yet, but it has extra commands in the terminal, a menu driven by the buttons, and a usable calibration routine (with a multimeter).
Instructions for flashing are at https://hackaday.io/project/4362-power-supply-b3603-alternative-firmware
 

Offline jcastle83

  • Newbie
  • Posts: 1
  • Country: us
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #234 on: September 18, 2017, 03:23:14 pm »
Hi Railwar, thanks for making improvements.

Do you know if this is written to be usable with all the different power supplies that have this board as a "controller" daughtercard?

E.g. BST400 / BST900?

If you're able to make a simple addition, it would be a "stop at current/10" or "low current" monitor menu: while monitoring the output current, when the unit sees that CC mode has finished, it is in constant voltage, and the current has dropped to a tenth of the original value, it turns the output off (see the sketch below).
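For what it's worth, the requested behaviour boils down to a check like the sketch below; the names are placeholders and nothing like it exists in the firmwares discussed here, as far as this thread shows.
Code: [Select]
/* Sketch of the requested "stop at I/10" termination: once the unit has left
 * CC mode (it is regulating voltage) and the measured current has dropped to
 * a tenth of the programmed limit, switch the output off.  All names are
 * placeholders, not taken from any existing B3603 firmware. */
#include <stdbool.h>
#include <stdint.h>

bool charge_done(bool in_cc_mode, uint16_t i_meas_ma, uint16_t i_limit_ma)
{
    return !in_cc_mode && (i_meas_ma <= i_limit_ma / 10u);
}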

Thanks
 

Offline RailWar

  • Newbie
  • Posts: 3
  • Country: ru
Unfortunately, I do not have time for this project now. Maybe next year.
 

Offline iafilius

  • Newbie
  • Posts: 1
  • Country: nl
Re: B3603 DC/DC Buck Converter mini review and how the SET key could be fatal...
« Reply #236 on: December 02, 2017, 11:55:57 am »
Hi,

I'm just new to the forum and new to this thread; Arjan is my name.
I own a B3603 and a BST400 and I'm interested in a full-functionality device:
serially controllable and/or standalone using the buttons and display, with the
option to program some additional customized features.

a few topics

buttons:

Regarding the buttons, which Hobby16 described correctly above, I basically
rediscovered/reverse engineered the scheme and published working code for it.

See Baruch's git repo issue #11 "Use the buttons" (https://github.com/baruch/b3603/issues/11).

A key detail not mentioned before: you need some delay when reading the secondary
keys, because switching a pin from output/low back to input/pull-up (high) has
a rather long rise time (checked with a scope).

I noticed several implementations afterwards, but no code. Well, here it is for
the public.

display:

I started with Baruch's git repo, where the display is not working properly; it shows only the last digit.
About to dive into that, I noticed RailWar's post of a binary firmware with a
brief description in whatsnew.txt.

Flashing that firmware gives working digits, buttons, serial, and a
button-controlled menu.
That really looks great!!
I hope the source will be published soon so we/I can work on it and see if it
can be made to work on the BST400 family as well.

I'm curious whether the working display is down to the compiler or whether actual
fixes have been applied, and what those fixes were.


menu:
RailWar's 2.0.0 firmware has made a usable menu using the buttons, great!


calibration, RailWar's version:
Calibration on RailWar's version 2.0.0cosmic isn't clear to me. Perhaps
someone can give me a hint on this (source not available yet).
What I tried:
Initial values:
CALIBRATION
CALIBRATE Uin ADC: 6.4601/87.3947
CALIBRATE Uout ADC: 5.6507/452.0000
CALIBRATE Iout ADC: 0.5156/200.0000
CALIBRATE Uout PWM: 0.1820/109.9180
CALIBRATE Iout PWM: 1.9394/160.0000
OK


RCALIBRATION
CALIBRATE Uin ADC: 423366/5727497
CALIBRATE Uout ADC: 370323/29622272
CALIBRATE Iout ADC: 33792/13107200
CALIBRATE Uout PWM: 11928/7203586
CALIBRATE Iout PWM: 127100/10485760
OK
I wanted to start with the PWM output voltage calibration and the ADC voltage
calibration.
I thought I would just re-apply the given values to discover how it works:
>PWM_UOUT 0.1820/109.9180
CALIBRATE a/b: 0.0000/1020.0000
OK
but the a and b values get converted to .. well, I don't know.
Any ideas, please?


Voltage peak when powering on:

Described on:
https://github.com/baruch/b3603/issues/3

The solution suggested by user flex in this thread (2015) is removing R16 and adding a resistor (min 10K, 100K
advised) from Vin to the LM2596 ~OE pin.

That works properly as long as the top/control board is attached.
The maximum voltage on the LM2596 ~OE pin is 25V, which might get exceeded, depending on the input voltage, the
(unspecified) ~OE input impedance, and the resistor you chose.
I recommend using a voltage divider of 2x 47K (Vin - 47K - ~OE - 47K - GND), as
is documented in the LM2596 product pages, and still removing R16. Since ~OE then sits at roughly Vin/2, even at the 40V maximum input it only sees about 20V, under the 25V limit.
This results in:
- making the fundamental power-on voltage peak problem go away
- being able to safely remove the top board without exceeding 25V on ~OE
- not exceeding the maximum current injection of 4mA into the STM8 (output not enabled)
- not exceeding the maximum current of the STM8 when the output is enabled

I did not test it _yet_; I'm waiting for a few 47K SMD resistors.

Did anyone investigate this issue on the BST400? (I guess it has the same issue, but no
LM2596 is used, so the solution might be a little bit different.)


usage on BST400:
I swapped the top board of a BST400 with one from a B3603 running RailWar's 2.0.0 firmware.
It drove the output voltage: tested with set voltages of 15V & 20V, but it output 60V and 80
volts, and the CC LED stayed lit (which I didn't expect).
So my first impression is that it is going to work, but it needs (heavy) recalibration, a new
definition of the maximum values, and maybe some port swaps in software.
Did anyone investigate this further?


Regards,

Arjan
 

Offline CapnBry

  • Contributor
  • Posts: 24
  • Country: us
Quote
Unfortunately, I do not have time for this project now. Maybe next year.
Sorry to be digging up such an old thread, but I have also flashed one of my B3603 units with @RailWar's 2.0.0cosmic. The menuing system works and I can turn the output on and off and set parameters. However, I can't figure out the ADC_UIN, ADC_UOUT, ADC_IOUT, PWM_UOUT, PWM_IOUT functions either. I can set the output voltage and when it turns on it is pretty close (5.000V set, 5.034V output), but the display is way off, reading 6.170V. Current is also about 10mA but reads as 0.393A. The same is true at 9V out: the set values are pretty close. I assume this means PWM_UOUT is at least pretty close, but I need to get the ADC_xxx set to get it to read properly?

It seems like PWM_UOUT expects to be run twice with at least one value. The first time it says NEXT and the second time it "completes" and sets a coefficient?
Code: [Select]
> PWM_UOUT 5
NEXT
OK
> PWM_UOUT 5
CALIBRATE a/b: 0.0000/1020.0000
OK

It is interesting to note that if the number is 63.000 or higher I get "NUMBER TOO BIG" returned. My UOut PWM becomes 0/1020 no matter what I try to put in there. The second value is always the current PWM (Uout PWM from RSTATUS). I've also tried going back to the original UOut PWM values with "UOUTPWMA 0.1820" and "UOUTPWMA 11928" both of which report "UNKNOWN COMMAND". I also tried "CALVOUTPWMA n" and got the same response so the original commands don't seem to be present or they are implemented in an incompatible command format.

Any chance of getting that source code so I can understand how to do my calibration, RailWar?

EDIT: I think I figured it out. ADC_UIN, ADC_UOUT, ADC_IOUT, PWM_UOUT, PWM_IOUT all expect to receive the current measured voltage in volts (for UIN/UOUT) or measured current in amps (for *_IOUT). To calibrate PWM_UOUT (output must be on)
Code: [Select]
VOLTAGE 1.000
COMMIT
PWM_UOUT 1.012       <-- 1.012 is the measured output voltage
VOLTAGE 10.000
COMMIT
PWM_UOUT 10.060     <-- again, measured output voltage
# CALIBRATE a/b: 0.1810/108.9439
In short, you need to be outputting at 2 different points and it appears to save the raw PWM or ADC for the first point, then when you tell it the voltage at the second point, it calculates the slope (a) and offset (b) using the two values to create the new calibration coefficient for that (PWM Vout in this case). Note that for PWM_UOUT, you need to be in constant voltage output mode, PWM_IOUT needs to be in constant current mode, and the ADC_xxx values need to just be measured.
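To make the slope/offset step concrete, the arithmetic is just a straight line through two points, as in the sketch below. Which way round the firmware stores the mapping, and what internal scaling it uses (the RCALIBRATION dump looks like the same numbers multiplied by 65536), is guesswork here, and the numbers in main() are made up.
Code: [Select]
/* Two-point linear fit: given (measured, raw) pairs at a low and a high
 * operating point, compute slope a and offset b for raw = a*measured + b.
 * The direction of the mapping and any internal fixed-point scaling in the
 * firmware are assumptions; the numbers below are illustrative only. */
#include <stdio.h>

typedef struct { double measured; double raw; } cal_point_t;

static void two_point_fit(cal_point_t lo, cal_point_t hi, double *a, double *b)
{
    *a = (hi.raw - lo.raw) / (hi.measured - lo.measured);
    *b = lo.raw - (*a) * lo.measured;
}

int main(void)
{
    cal_point_t lo = { 1.012, 300.0 };    /* made-up raw values */
    cal_point_t hi = { 10.060, 1950.0 };
    double a, b;
    two_point_fit(lo, hi, &a, &b);
    printf("CALIBRATE a/b: %.4f/%.4f\n", a, b);
    return 0;
}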

So the calibration procedure would be to first set the PWM_UOUT
-- Set VOLTAGE 1.000
-- Measure voltage with a multimeter, set `PWM_UOUT v` where v is the measured voltage, e.g. 1.012
-- Set VOLTAGE 10.000
-- Measure voltage with a multimeter, set `PWM_UOUT v` where v is the measured voltage, e.g. 10.090

Now make the displayed voltage match actual with ADC_UOUT
-- Set VOLTAGE 1.000
-- Measure voltage with a multimeter, set `ADC_UOUT v` where v is the new measured voltage, e.g. 1.001
-- Set VOLTAGE 10.000
-- Measure voltage with a multimeter, set `ADC_UOUT v` where v is the measured voltage, e.g. 9.997

To make the input voltage accurate use ADC_UIN
-- Use a power supply to set the voltage to something lowish, like 6V
-- Measure voltage with a multimeter, set `ADC_UIN v` where v is the measured input voltage, e.g. 6.000
-- Change the input power supply to a higher value, like 24V
-- Measure voltage with a multimeter, set `ADC_UIN v` where v is the measured input voltage, e.g. 24.000
-- Note that you must leave the input power on while changing the voltage!

Finally, do the same thing you did for PWM_UOUT and ADC_UOUT except now for current. I just inserted my two multimeter probes directly into the output, shorting it. The range on the multimeter was set for 10A.
-- CURRENT 0.200
-- Note the current on the multimeter and set `PWM_IOUT a` where a is the current, e.g. 0.234
-- CURRENT 1.000
-- Note the current on the multimeter and set `PWM_IOUT a` where a is the current, e.g. 1.050
-- Repeat, except this time we're adjusting the displayed current so use ADC_IOUT

-- SAVE and you're done and calibrated. This is only a 2-point calibration, so pick a low and a high value that are somewhat far apart to minimize error. (A host-side sketch that automates the voltage part of this sequence follows below.)
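Since it was mentioned earlier that the console calibration can be scripted, here is a hedged host-side sketch that walks the voltage part of the procedure above (PWM_UOUT, then ADC_UOUT) over a serial port. It assumes a POSIX machine, a USB-TTL adapter on /dev/ttyUSB0 at 9600 8N1, that the output is already enabled, and that the multimeter readings are typed in by hand; it does not parse the unit's replies, so treat it as a starting point only.
Code: [Select]
/* Sketch: drive the B3603 console to do the 2-point voltage calibration.
 * Device path, line ending and the fixed 1 s pause are assumptions. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <fcntl.h>
#include <unistd.h>
#include <termios.h>

static int open_port(const char *dev)
{
    int fd = open(dev, O_RDWR | O_NOCTTY);
    if (fd < 0) { perror(dev); exit(1); }
    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);                     /* 8N1, no echo/translation */
    cfsetispeed(&tio, B9600);
    cfsetospeed(&tio, B9600);
    tcsetattr(fd, TCSANOW, &tio);
    return fd;
}

static void send_cmd(int fd, const char *cmd)
{
    char line[64];
    snprintf(line, sizeof line, "%s\r\n", cmd);
    (void)write(fd, line, strlen(line));
    sleep(1);                            /* crude: give the unit time to answer */
}

static double ask_dmm(const char *prompt)
{
    char buf[32];
    printf("%s", prompt);
    fflush(stdout);
    if (!fgets(buf, sizeof buf, stdin)) exit(1);
    return atof(buf);
}

int main(void)
{
    int fd = open_port("/dev/ttyUSB0");  /* adjust to your adapter */
    const double setpoints[2] = { 1.0, 10.0 };
    char cmd[64];

    /* first pass: output-side (PWM) calibration, output must be on and in CV */
    for (int i = 0; i < 2; i++) {
        snprintf(cmd, sizeof cmd, "VOLTAGE %.3f", setpoints[i]);
        send_cmd(fd, cmd);
        send_cmd(fd, "COMMIT");
        snprintf(cmd, sizeof cmd, "PWM_UOUT %.3f", ask_dmm("Multimeter reading (V): "));
        send_cmd(fd, cmd);
    }
    /* second pass: displayed-voltage (ADC) calibration at the same points */
    for (int i = 0; i < 2; i++) {
        snprintf(cmd, sizeof cmd, "VOLTAGE %.3f", setpoints[i]);
        send_cmd(fd, cmd);
        send_cmd(fd, "COMMIT");
        snprintf(cmd, sizeof cmd, "ADC_UOUT %.3f", ask_dmm("Multimeter reading (V): "));
        send_cmd(fd, cmd);
    }
    send_cmd(fd, "SAVE");
    close(fd);
    return 0;
}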

And for my final thing of note, you can power an ESP8266 from the 5V/GND on the UART header on the left side and make a WiFi-connected CC/CV power supply! It seems to work OK with both a NodeMCU 1.0 board and a WeMos D1 Mini, although the latter is much smaller. A bare ESP-12 module could also be used, but I wouldn't try to power it from 5V directly (the module will eventually overheat and die) or from the 3.3V on the SWIM header (because the B3603 LDO can't put out enough power), so at the very least you need an ESP module and a 3.3V step-down or LDO. At that point you might as well just get a WeMos, so you have that and a USB-TTL serial cable. I bet a very clever person could write ESP firmware to flash the B3603 too, which would make it a one-stop shop for getting a WiFi-connected power supply with no ST-Link V2 or anything.

« Last Edit: July 17, 2018, 06:46:20 pm by CapnBry »
 

Offline Waterman

  • Newbie
  • Posts: 2
  • Country: us
Glad you brought it back, as I just finished reading all 10 pages. I currently have two 3603s running as a charging system for a solar power setup. Interesting interactions of the two of them seen so far:
Unit 1 powers up at 3.85 volts. Unit 2 waits till over 4.25 volts.
Unit 1 not only acts as a buck unit, it also goes into boost mode when the voltage is in the 9.6V range and raises the output to about 14.1 volts. That voltage then runs into a 6A 50V diode on both the + and - sides. Unit 2 does the same but goes up to the set point of 14.84V. Current goes up slowly from about 0.008A, while #2 starts from about 0.012A. Both are set at 14.84 volts and 2.00A as their limits. 14.84V minus the blocking diode drop yields 14.14V at the batteries.

Had been looking for the board connections, which is what brought me here. So old threads are still useful. ;)

 

Offline rin67630

  • Newbie
  • Posts: 4
  • Country: de
Hi Rick,
I did not waste my time trying to reprogram the control board; I replaced it with an ESP32 TTGO.
Please take a look at my github dev for a couple of DC-DC converters of the Juntek-Drok family:
https://github.com/rin67630/Drok-Juntek-on-steroids
 


Offline neslekkim

  • Super Contributor
  • ***
  • Posts: 1305
  • Country: no
oh, awesome!
Is it still in the works? Seems like a project I need to follow, if I can find my PSUs again :)
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
Quote
Hi Rick,
I did not waste my time trying to reprogram the control board; I replaced it with an ESP32 TTGO.
...

I hope I am not the Rick you are replying to...  If so, I did not imply or suggest it was a waste of time.

On a different note...

One of my three B3603s failed - it won't store the calibration anymore, but otherwise works fine.  Considering it has been my main "bench supply" since 2015 and it worked till earlier this year, I have to say it was well worth the money.
 

Offline TheFrunk

  • Newbie
  • Posts: 1
  • Country: nl
Would anyone perhaps have advice on how to repair these B3603 units? I have two of them; both died due to me being silly and using them at 12V 3A output for too long (2A is the limit for continuous operation, IIRC). Since I have two with the same issue I thought I would have a look. The D1 diode seems to be faulty on both; replacing it doesn't fix it. The screen and settings menu and all that are working. The output is somehow capped at about 12.8 volts, and when a load is applied the voltage drops quickly at almost no current. I'm guessing the output smoothing capacitor is charged by some tiny current somewhere and the LM2596S-ADJ may be broken, so maybe replacing those might work? Any other ideas for where to look? There are no obvious signs visible like blackened resistors or bulging capacitors, and that is about as far as my knowledge goes.
Thanks in advance, Frank.
 

Offline Rick LawTopic starter

  • Super Contributor
  • ***
  • Posts: 3419
  • Country: us
By spec, the B3603 should do 3A continuous.  The problem is cooling.  Some parts get rather hot without a cooling fan.  When I do >1A, I usually have a PC fan blowing cool air over it.

I could be wrong here, but the control circuitry (op amp and voltage/current sense) is far from the heat, plus those parts are not the ones carrying sustained high current, other than the current-sense resistor.  So I doubt they are the ones damaged.

I don't know if this will work, but I would try swapping out the switching regulator.  The tiny heat sink on it is heart-warming but not confidence-inspiring, so targeting that seems most logical.  I'm not sure whether the chip has a thermal shutdown, and it may also be a fake part, hence suspect.  The next "likely to cook" part is the inductor next to the regulator.

Good luck, if you could share the outcome, I am sure many would appreciate it.
« Last Edit: January 19, 2022, 09:04:42 pm by Rick Law »
 

Offline val22

  • Newbie
  • Posts: 1
  • Country: de
Hello. Please tell me if there is alternative firmware for the BST900 with the Nuvoton N76E003AT20 MCU.
Thanks for the info.

https://www.mediafire.com/view/5a0tndg5fdup8vb/IMG_1.jpg/file
https://www.mediafire.com/view/fzhx3llzv4dunwv/IMG_2.jpg/file



« Last Edit: March 05, 2022, 02:51:53 pm by val22 »
 

