The boiling-water analogy is a good one.
Water evaporates at any temperature above absolute zero. But very, very little evaporation happens until you start approaching some threshold, say, 90 degC. Then the rate of evaporation starts going up really, really quickly. And you can't regulate the amount of evaporation by regulating temperature - there's a massive difference between 99 and 100 degC. To control the rate of evaporation, you limit the input power instead: a 1kW kettle and a 10kW kettle both keep the water at approx. 100 degC, but the latter evaporates ten times as much water in the same time.
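To put rough numbers on the kettle comparison, here's a back-of-the-envelope sketch (assuming essentially all input power goes into vaporization, with water's latent heat of ~2.26 MJ/kg):

```python
# Once the water sits at ~100 degC, the evaporation rate is just
# input power divided by the latent heat of vaporization.
LATENT_HEAT_J_PER_KG = 2.26e6   # water, approximate

for power_w in (1_000, 10_000):
    rate_g_per_s = power_w / LATENT_HEAT_J_PER_KG * 1000
    print(f"{power_w / 1000:.0f} kW kettle: ~{rate_g_per_s:.1f} g of steam per second")
```

Same temperature in both kettles; ten times the power means ten times the steam.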
LEDs have a similar curve, but they're even trickier. Say you have an LED that produces just the right amount of light (drawing the appropriate current) when you feed it 3.0V. It's very dim at 2.8V, and blows up at 3.2V. So you need to control the voltage carefully. This wouldn't be too difficult with precision regulators, but with LEDs, the "right" voltage isn't fixed - it changes! Specifically, it changes when the temperature changes, and worse, it changes in a bad direction.
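To see how steep that curve is, here's a toy sketch using the usual exponential diode law, I = I_s * exp(V / (n*V_t)). The parameter values are made up to roughly match the 2.8/3.0/3.2V story, not taken from any real datasheet:

```python
import math

# Toy Shockley-style model: I = I_s * exp(V / (n * V_t)).
# I_S and N_VT are illustrative values, not from a datasheet.
I_S = 1e-19    # saturation current, A
N_VT = 0.075   # n * thermal voltage, V

for v in (2.8, 3.0, 3.2):
    i = I_S * math.exp(v / N_VT)
    print(f"{v:.1f} V -> {i * 1000:6.1f} mA")
```

A 0.2V step in either direction swings the current by more than an order of magnitude - that's the "very dim" and "blows up" part.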
So the LED is happy with 3.0V, but it slowly heats up, and now the optimal voltage shifts to, say, 2.9V. You are still applying 3.0V, i.e., too much voltage, so the LED gets brighter - and consumes more current, produces more heat, heats up even more, the optimal voltage shifts even lower, the excess grows, and you get more and more heating: a thermal runaway.
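Here's that runaway loop as a toy simulation, reusing the exponential model above plus a made-up temperature coefficient and thermal resistance (again, illustrative numbers only):

```python
import math

# Fixed-voltage LED: hotter junction -> curve shifts left -> more current
# at the same voltage -> more heat. All values are illustrative.
I_S = 1e-19        # saturation current, A
N_VT = 0.075       # n * thermal voltage, V
TEMPCO = 0.002     # curve shifts ~2 mV left per degC of heating
R_TH = 400.0       # thermal resistance to ambient, degC/W
T_AMB = 25.0       # degC

v_applied = 3.05   # a touch too much voltage
temp = T_AMB
for step in range(10):
    i = I_S * math.exp((v_applied + TEMPCO * (temp - T_AMB)) / N_VT)
    power = v_applied * i
    temp += 0.3 * (T_AMB + R_TH * power - temp)   # crude thermal inertia
    print(f"step {step}: I = {i * 1000:6.1f} mA, junction = {temp:6.1f} degC")
    if temp > 150.0:
        print("thermal runaway - junction past its limit, LED dies")
        break
```

Each pass through the loop, the current and the temperature feed each other, and within a few steps the LED cooks itself.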
Another issue is unit variation. One LED is happy with 3.0V, while another wants 3.2V. So you'd need to manually adjust every unit - a nightmare for manufacturing.
This is why you need to control current, not voltage.
Voltage and current are both parameters that can be controlled. Depending on the circuit, you typically choose one of them to control, and the other will be whatever it will be: one parameter is controlled, the other is a result of the control.
For a microprocessor core power supply, you would regulate the voltage, say, to 1.2V, and then the CPU takes whatever current it needs, depending on the computation it's doing.
But because the LED behaves as I explained, it makes much more sense to choose the current as the controlled parameter. When you control the current, the voltage is whatever it is. And once the LED heats up, you can measure the voltage dropping, but the current stays the same, because that's what's being controlled.
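Running the same toy model with the current held fixed shows the difference: the voltage drifts down as the junction warms, the power stays bounded, and the temperature settles instead of running away:

```python
import math

# Same illustrative LED model, but now the current is the controlled
# parameter; the voltage is whatever the diode equation demands.
I_S = 1e-19
N_VT = 0.075
TEMPCO = 0.002
R_TH = 400.0
T_AMB = 25.0

i_set = 0.020      # regulated current: 20 mA
temp = T_AMB
for step in range(6):
    # Invert the diode equation: a hotter junction needs less voltage
    # for the same current.
    v = N_VT * math.log(i_set / I_S) - TEMPCO * (temp - T_AMB)
    power = v * i_set
    temp += 0.5 * (T_AMB + R_TH * power - temp)   # crude thermal inertia
    print(f"step {step}: V = {v:.3f} V, I = {i_set * 1000:.0f} mA, junction = {temp:5.1f} degC")
```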
Now, how to control current is a different topic altogether. It's not too different from controlling voltage. There are exact methods using active feedback and comparison to a known reference, called regulation - the difference between voltage regulation and current regulation is in small details, namely, how the feedback signal is taken.
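As a sketch of what that feedback looks like for current (the values and the control loop are illustrative; a real driver does this in analog hardware or in a switching controller): sense the current as a voltage across a shunt resistor, compare it to a reference, and nudge the output until the error is zero. Take the feedback from the output voltage instead, and the exact same loop becomes a voltage regulator.

```python
import math

# Illustrative closed-loop current regulator: feedback is the voltage
# across a 1-ohm shunt, compared against a 20 mV reference (= 20 mA).
# The shunt's own voltage drop is ignored for simplicity.
R_SENSE = 1.0      # ohms
V_REF = 0.020      # volts -> 20 mA target
GAIN = 2.0         # integrator gain per control step

def led_current(v_out: float) -> float:
    """Stand-in for the LED: same toy exponential I-V curve as above."""
    return 1e-19 * math.exp(v_out / 0.075)

v_out = 2.8        # start low and let the loop find the operating point
for _ in range(30):
    v_sense = led_current(v_out) * R_SENSE   # feedback signal
    error = V_REF - v_sense                  # positive -> too little current
    v_out += GAIN * error                    # integrate the error
print(f"settled: V = {v_out:.3f} V, I = {led_current(v_out) * 1000:.1f} mA")
```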
Then there are "good enough" solutions that don't offer full control, but approximate it. A series resistor from a fixed voltage source does not regulate the current, but it makes the changes less steep.
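For example, with a 5V supply and a resistor sized for 20mA at a 3.0V drop (numbers made up for illustration), a 0.1V shift in the LED's forward voltage only moves the current by about 5%, instead of the order-of-magnitude swings above:

```python
# Series resistor from a fixed supply: I = (V_supply - Vf) / R.
V_SUPPLY = 5.0                    # illustrative
R = (V_SUPPLY - 3.0) / 0.020      # 100 ohms for 20 mA at Vf = 3.0 V

for vf in (2.9, 3.0, 3.1):
    i = (V_SUPPLY - vf) / R
    print(f"Vf = {vf:.1f} V -> {i * 1000:.0f} mA")
```

The more voltage headroom you leave across the resistor, the flatter the current - which is exactly why a resistor from a much higher supply starts to look like a crude current source.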