And design the thing for a constant voltage with a series resistor, with an appreciable voltage drop across that resistor. Don't stack 10 V worth of LEDs in series on a 12 V supply. There is no hard rule and, mathematically, the nominal operating point is the same either way, but consider that a common 2 V LED at 20 mA on a 5 V supply uses a 150 Ω resistor, which drops 3 of the 5 V available. In a long series stack, the resistor should probably do something similar: drop about half of the supply. So, on a 12 V supply, use no more than three 2 V LEDs, maybe four. Yes, a lot of power is wasted in the resistor. That's the cost of doing business.
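A minimal sketch of that sizing rule (the 2 V drop, 20 mA target, and half-the-supply rule of thumb are from above; the function and variable names are my own):

```python
def series_resistor(v_supply, v_led, n_leds, i_target):
    """Return the series resistor value (ohms) for n_leds LEDs in series."""
    v_resistor = v_supply - n_leds * v_led
    if v_resistor <= 0:
        raise ValueError("LED stack voltage meets or exceeds the supply")
    return v_resistor / i_target

# Single 2 V LED at 20 mA on 5 V: (5 - 2) / 0.020 = 150 ohms
print(series_resistor(5.0, 2.0, 1, 0.020))   # 150.0

# Three 2 V LEDs on 12 V: the resistor drops 6 V, about half the supply
print(series_resistor(12.0, 2.0, 3, 0.020))  # 300.0
```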
If you are running all the LEDs in parallel, you need one dropping resistor per LED. Take that 150 Ω resistor above. If the LED shorts out, the maximum current in the loop is 5 V / 150 Ω, or 33 mA. There is no realistic condition under which the LED can see anywhere near its 90 mA maximum rating. Even if the supply went to 10 V, given a 2 V LED, the maximum current would be (10 - 2)/150, or 53 mA. Not ideal, but still within the rating of the LED.
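The same worst-case numbers, worked out for the single-LED-per-resistor case (the 150 Ω and 2 V values are from the text; the helper is hypothetical):

```python
R = 150.0      # dropping resistor, ohms
V_LED = 2.0    # nominal forward drop, volts

def fault_current(v_supply, led_shorted):
    """Loop current if the LED shorts (0 V drop) or conducts normally."""
    v_drop = 0.0 if led_shorted else V_LED
    return (v_supply - v_drop) / R

print(fault_current(5.0, led_shorted=True))    # 0.0333 A: LED shorted on 5 V
print(fault_current(10.0, led_shorted=False))  # 0.0533 A: supply doubled to 10 V
```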
It gets even better if you design for 10 mA. In that case, for the 2 V LED and the 5 V source, you would use a 300 Ω resistor. If the supply spiked to 15 V, the maximum current would be (15 - 2)/300, or 43 mA, well under the 90 mA for which the LED is characterized. Heck, the supply could spike to 27 V and, even with the LED dead shorted, the resistor would only pass 27 V / 300 Ω = 90 mA.
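A quick margin check for the 10 mA design (the 90 mA rating and 300 Ω value come from the text; treating the shorted and conducting cases separately is my own breakdown):

```python
R = 300.0       # dropping resistor for the 10 mA design, ohms
I_MAX = 0.090   # LED's characterized maximum current, amps

print((15.0 - 2.0) / R)   # 0.0433 A at a 15 V spike, LED conducting
print(I_MAX * R)          # 27.0 V: spike that reaches 90 mA with the LED shorted
print(2.0 + I_MAX * R)    # 29.0 V: same limit with the 2 V LED drop included
```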
This just isn't a problem!