I watched this video:
https://youtu.be/yIx_VWWgKUI?t=1521 (I have set a timestamp). In the first test, the presenter checks the power supply's voltage overshoot when the output is switched on: he sets the voltage to 10 V, connects a 4.7 Ω resistor (which would draw about 2.1 A with no limit), and sets a current limit of 1 A. He finds that constant current mode kicks in and the PSU lowers the output voltage to keep the current at the 1 A limit.
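Just to lay out the numbers as I understand them (my own back-of-the-envelope check, not figures from the video):

I = V / R = 10 V / 4.7 Ω ≈ 2.13 A (what the resistor would draw with no current limit)
V = I_limit × R = 1 A × 4.7 Ω = 4.7 V (roughly where the output should settle once constant current mode takes over)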
What I don't understand is the test he does right after this one. He tests current overshoot by making a simple current shunt and measuring the voltage drop across it. He finds that there is a pretty significant current overshoot at the moment a load is connected and the PSU output is switched on (same setup as the first test).
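My understanding of the shunt measurement (again, my own summary, not his exact words): with a shunt of resistance R_shunt in series with the load, the current is I = V_shunt / R_shunt, so putting a scope across the shunt shows the current waveform during the switch-on transient, including any momentary spike above the 1 A setting.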
What confuses me is that in the first test he measures the output voltage and finds that it settles at a lower value so the current never exceeds the 1 A limit he set, yet in the second test he seems to contradict this by showing the current briefly going well above that same limit. How can both be true?
Could someone please explain what he means by the second test?