Firstly, we are talking about a simple course; what you refer to is, as I understand it, somewhat advanced for such a course and seldom used in practice. Take a look at your current source driving an LED: what is being used to sample the current for making adjustments? Oh, let me guess... a resistor.
"So why don't we simply run all red LEDs with a regulated 2 V supply? It's because of variations between LEDs due to manufacturing tolerances, design variances, ambient temperature, and so on. So how do we allow for that? We use a series resistor to compensate for the differences."
I'm sorry, but that's just plain silly!
We use series resistors because they are a far more economical alternative when the LEDs are incidental to the operation of the circuit, and the operating voltage of that circuit is higher than what could directly power an LED.
Apart from anything else, when LEDs first made an appearance, a regulated 2 V supply was as rare as hen's teeth.
I don't know anyone who selects series resistors so the LED has exactly the voltage required for that individual device.
They look at what voltage normally appears across a generic LED of that colour, subtract it from the supply voltage, and, using Ohm's Law, determine what resistor is needed for around 20 mA or so through the LED.
Or, if it is a 12 V supply, they try 1 kΩ, and if that isn't bright enough, go to 560 Ω or 470 Ω.
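That rule-of-thumb calculation can be sketched in a few lines of Python. The forward voltage and target current here are assumptions (a generic red LED around 2 V, aiming for roughly 20 mA), not values from any particular datasheet:

```python
# Series resistor for an LED via Ohm's Law: R = (V_supply - V_led) / I_led.
# v_led and i_led defaults are assumptions for a generic red LED.
def led_series_resistor(v_supply, v_led=2.0, i_led=0.020):
    """Return the ideal series resistance in ohms."""
    return (v_supply - v_led) / i_led

r = led_series_resistor(12.0)  # 12 V supply, generic red LED
print(round(r))                # 500 -- so pick the nearest standard value, e.g. 470 or 560 ohms
```

You then round to the nearest preferred (E12/E24) resistor value, which is exactly why 470 Ω and 560 Ω turn up so often on a 12 V rail.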