Ok... let's look at the datasheet:
http://www.farnell.com/datasheets/2047884.pdf
You have there:
Forward voltage at If = 50mA: typical 2.4v, maximum 3v
Luminous intensity at 50mA: minimum 1120 mcd, typical 1600 mcd
The magic words are TYPICAL and MAXIMUM... Basically they're saying that no matter what, the LEDs will turn on fully with 3v.
The majority of them will turn on fully with a forward voltage as low as 2.4v, but that's not a guarantee, so you really shouldn't design around that value.
Also, a couple of important things:
1. LEDs may turn on partially at forward voltages lower than the specified typical forward voltage, behaving sort of like they have built-in resistors. So for example, an LED may need 2.4v to be fully on and let 50mA go through it, but at only 2.2v it may allow only some current, let's say 20mA... in turn it will limit all the other LEDs in series with it to 20mA, making that whole strip of LEDs less bright.
2. The forward voltage of LEDs goes down as they warm up. A particular LED may be fully on at a typical forward voltage of 2.4v, but as it heats up to maybe 40-50 degrees Celsius, its forward voltage may drop to 2.3v. Considering this together with point [1], imagine an LED initially limiting the whole strip to less than 50mA; as it heats up, its forward voltage drops a bit, it turns on completely and allows the full 50mA through the series of LEDs, and the whole strip becomes brighter.
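To make point [2] concrete, here's a tiny sketch of that temperature behavior. The -4 mV per degree coefficient is purely an assumption picked to reproduce the 2.4v -> 2.3v drop over a ~25 degree rise described above; real parts vary, so check your datasheet.

```python
# Illustrative model of forward voltage dropping as the LED warms up.
# ASSUMPTION: a linear tempco of -4 mV/degC, chosen only to match the
# example numbers above (2.4v at 25 degC, 2.3v at 50 degC).
def forward_voltage(t_celsius, vf_at_25=2.4, tempco=-0.004):
    """Estimated forward voltage (volts) at a given junction temperature."""
    return vf_at_25 + tempco * (t_celsius - 25.0)

vf_cold = forward_voltage(25.0)  # 2.4v: LED may limit the strip's current
vf_hot = forward_voltage(50.0)   # 2.3v: LED now turns on fully, strip brightens
print(vf_cold, vf_hot)
```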
So in your particular example, considering you already have the LEDs on circuit boards and everything in place... it would probably be cheaper to just replace the 12v power supply with an 18-20v power supply, like a laptop adapter.
With 18v in and 5 LEDs in series, assuming you want to limit them to 50mA... let's say you'll have an average forward voltage of around 2.5v, so 5 LEDs would drop a total of 5 x 2.5v = 12.5v. Now you can calculate your resistor value:
V = I x R => 18v - 12.5v = 0.05 x R => R = 5.5 / 0.05 = 110 ohm. To keep things simple, I'd probably just round that down to 100 ohm. The resistor would have to dissipate P = I x I x R = 0.05 x 0.05 x 100 = 0.25 watts, so you should use at least a 0.5w rated resistor.
So in the worst case scenario, when the LEDs all have a forward voltage of 3v, you'd have V = I x R => 18v - 15v = I x 100 => I = 3/100 = 30mA, and in the best case scenario of 2.4v, you'd have I = (18v - 12v)/100 = 6/100 = 60mA.
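That worst-case/best-case spread is just the same formula evaluated at the datasheet's two forward-voltage limits. A quick sketch, reusing the example's 18v supply and chosen 100 ohm resistor:

```python
# Strip current vs. per-LED forward voltage, with the example's values.
V_SUPPLY = 18.0
N_LEDS = 5
R = 100.0  # the resistor chosen above

def strip_current(vf_per_led):
    """Current (amps) through 5 series LEDs plus the resistor."""
    return (V_SUPPLY - N_LEDS * vf_per_led) / R

i_worst = strip_current(3.0)  # all LEDs at the 3v maximum -> 30 mA
i_best = strip_current(2.4)   # all LEDs at the 2.4v typical -> 60 mA
print(i_worst, i_best)
```

The wide 30-60mA spread is exactly why the next paragraph suggests a driver chip instead of a plain resistor.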
Anyway... rather than using resistors, it would make more sense to use LED driver chips, which regulate the current much better.