One thing I don't think was mentioned in the video (and could go into one of the "common beginner mistakes" compilations) is that many (most?) CC PSU implementations do the limiting
before any output capacitors. This is also standard for current-measuring PMICs, which typically have a few small MLCCs near the chip (presumably for fast transient response on the feedback pin), then a current-sense resistor, and only then the big electrolytics.
In practice, if you want to test a ~3V LED at 20mA, you short the terminals, dial in 20mA, open the circuit, then ramp the voltage up to a "good enough" 5V because CC will save it. Then,
when you connect the LED, you get a big spark and a dead LED: the output caps are already charged to the full 5V and dump into the LED before the current limit can react.
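Rough numbers on that spark (a back-of-envelope sketch; the cap size and parasitic resistance are assumptions for illustration, not measured values):

    # Charged output caps dumping into a ~3 V LED. All values assumed.
    C_out = 470e-6   # output electrolytics, farads
    V_cap = 5.0      # voltage the caps were charged to, volts
    V_f   = 3.0      # approximate LED forward voltage, volts
    R_par = 0.1      # wiring + cap ESR + LED dynamic resistance, ohms

    I_peak = (V_cap - V_f) / R_par              # ~20 A, vs. the 20 mA dialed in
    E_dump = 0.5 * C_out * (V_cap**2 - V_f**2)  # ~3.8 mJ released as the caps
                                                # fall from 5 V toward Vf
    print(f"peak current ~{I_peak:.0f} A, energy ~{E_dump * 1e3:.1f} mJ")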
...
OTOH you can drive LEDs from constant-voltage (CV) PSUs because, as the total DC voltage of the string goes up, any parasitic effects (like ripple) become proportionally less relevant. I've built a custom 17-LED lamp based on careful V/I measurements of the specific LEDs I wanted to use (~2.74V @ 110mA) and am running it on a CV 48V Meanwell PSU with the trimmer pot adjusted down to a voltage low enough that the LED heatsink stays below ~50°C.
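The sizing works out neatly; a quick sanity check (the per-LED numbers are from my measurements, the rest is arithmetic):

    # 17 LEDs in series against a trimmed-down 48 V CV supply.
    n_leds = 17
    v_f    = 2.74    # measured forward voltage at the target current, volts
    i_op   = 0.110   # target string current, amps

    v_string = n_leds * v_f     # 46.58 V total forward drop
    p_lamp   = v_string * i_op  # ~5.1 W into the LEDs
    print(f"string: {v_string:.2f} V, power: {p_lamp:.1f} W")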
Yes, an LED's Vf drops at higher temperatures (and they get brighter / draw more current as they approach 50°C), but it's still surprising how little effect ±1V has. Even if the PSU were wildly inaccurate, the LEDs would not burn up.
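Part of the reason is the long string: a 1V supply error spread over 17 LEDs is only ~59mV per LED. A small-signal sketch, assuming a per-LED dynamic resistance of ~1 ohm at this current (a guess, not a datasheet value):

    # Sensitivity of string current to supply error: near the operating
    # point the string looks like n_leds * r_d. r_d is assumed, not measured.
    n_leds = 17
    r_d    = 1.0    # dynamic resistance per LED at ~110 mA, ohms (assumed)
    i_op   = 0.110  # nominal operating current, amps

    dV = 1.0                  # supply error under consideration, volts
    dI = dV / (n_leds * r_d)  # ~59 mA of current shift per volt of error
    print(f"+/-{dV:.0f} V -> {i_op - dI:.3f} .. {i_op + dI:.3f} A "
          f"(nominal {i_op:.3f} A)")
    # Visibly brighter or dimmer, but nowhere near instant-burnout territory.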
Is it advisable? No. But I can have a switch near the lamp head and the lamp doesn't self-destruct when I turn it on.

(Yes, the best solution is a CC PSU with the voltage limit set just a few volts above the string voltage, and the LEDs don't mind an inrush current that's 2-3x higher than the operating one. Most "LED driver" PSUs have a TL431 or zeners clamping the output voltage, and changing them is easy, as long as the PSU is not potted in a brick of silicone.

Oh well, in-line resistors to the rescue.)
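
And the resistor math for that fallback, reusing the numbers above (the 48V figure assumes the trimmer at nominal):

    # Ballast resistor: drop the supply/string headroom at the target current.
    v_supply = 48.0       # nominal supply, volts (trimmer untouched)
    v_string = 17 * 2.74  # 46.58 V total forward drop, from above
    i_op     = 0.110      # target current, amps

    r_ballast = (v_supply - v_string) / i_op  # ~12.9 ohms; pick the next
                                              # standard value
    p_r       = (v_supply - v_string) * i_op  # ~0.16 W dissipated
    print(f"R ~= {r_ballast:.1f} ohm, P ~= {p_r:.2f} W")
    # More headroom gives stiffer current regulation, at the cost of more
    # power wasted in the resistor.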