LEDs are diodes that produce light.
Each LED has a forward voltage: the voltage at which it starts producing light. This value differs from LED to LED because it depends on the manufacturing process.
There is a small region between the point where an LED starts producing light and the point where it's fully "turned on", where it behaves as if it has a resistor inside, letting only a small amount of current flow through.
For example, say one particular LED doesn't turn on at all at 2.9v, starts turning on at exactly 3.0v, and up to around 3.1v it only lets through 1mA or some similarly low amount of current; from 3.1v and higher, the LED lets practically any amount of current go through.
If the voltage is high enough, LEDs let so much current go through them that they'll burn up (they get too hot and destroy themselves).
If you put more than one LED in series, you need at least the sum of the forward voltages to make the LEDs light up, and you still have to limit the current going through them, otherwise they'll be damaged.
Very rarely, it just happens that the sum of those forward voltages (say, of three LEDs) falls in that very narrow range where the LEDs act as if they have a resistance, not quite "fully turned on".
But most often, the current has to be limited by a resistor in series with the LEDs, or by an LED driver, which monitors in some way how much current the LEDs consume and lowers the voltage to reduce the current if they draw too much.
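A minimal sketch of sizing that series resistor. The supply voltage, per-LED forward voltage, and target current below are made-up example values, not from any datasheet:

```python
# Sizing a current-limiting resistor for 3 LEDs in series.
# All numbers here are example assumptions.
v_supply = 12.0
vf_per_led = 3.0              # assumed forward voltage per LED, volts
n_leds = 3
i_target = 0.020              # 20 mA target current

v_across_resistor = v_supply - n_leds * vf_per_led   # 3.0 V left to drop
r = v_across_resistor / i_target                     # 150 ohm
p = v_across_resistor * i_target                     # 0.06 W in the resistor
print(f"R = {r:.0f} ohm, dissipating {p * 1000:.0f} mW")
```

The resistor eats whatever voltage the LEDs don't, and its value sets the current.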
Some circuits don't have a resistor in series with the LEDs simply because the power supply behaves as if it has a resistor inside. For example, you may see an LED connected directly to a 3v button cell battery without being damaged: the battery itself can't send more than a few mA to the LED, because it behaves as if it has an internal resistance of a few ohms.
Now... temperature. As LEDs heat up, the forward voltage usually decreases; it drifts a bit. This means relying on that narrow region where LEDs limit themselves to a few mA is strongly discouraged, because as they heat up, that region drifts downward and the LED may turn fully on and burn itself out. For example, if cold you had 2.9v off, 3.0v starting to turn partially on, and 3.1v fully on, then when hot the same LED may be fully off at 2.8v, starting to turn on at 2.9v, and fully on at 3.0v. So if you originally assumed the LEDs would only be partially on at 3.0v, you're screwed.
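That drift can be sketched with a simple model. The temperature coefficient here is a made-up but plausible value (around -2mV/degC), and the 0.1v width of the partial-on window comes from the example above:

```python
# Sketch: forward-voltage drift with temperature pushing an LED from
# "partial on" to "full on". Tempco and window width are assumptions.
def vf_at(temp_c, vf_25=3.0, tempco=-0.002):
    """Start of the partial-on region at a given temperature, volts."""
    return vf_25 + tempco * (temp_c - 25)

supply = 3.05  # fixed supply chosen inside the *cold* partial-on window
for t in (25, 75):
    v_on = vf_at(t)            # start of partial-on region
    v_full = v_on + 0.1        # full-on roughly 0.1 V higher (example)
    if supply >= v_full:
        state = "full on"
    elif supply >= v_on:
        state = "partial"
    else:
        state = "off"
    print(f"{t} degC: partial-on {v_on:.2f}-{v_full:.2f} V -> LED is {state}")
```

At 25degC the 3.05v supply sits safely in the partial-on window; at 75degC the window has slid down and the same supply drives the LED fully on.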
If one LED in the string is better built than the others, it could go fully on once it's hot enough and burn out first (because of the extra current going through it - still limited somewhat by the other LEDs in series). If it fails as a short, the remaining two LEDs see more voltage, which in turn may push them out of that narrow region, and so on... a chain reaction.
So as time goes by and the LEDs heat up, their forward voltage decreases and they draw slightly more current.
When you measure the current, you usually do it with a multimeter, and the multimeter measures current by placing a low-value resistor (a shunt) in series with the string of LEDs. That shunt is usually a high-quality resistor with a low temperature coefficient, meaning its resistance won't vary much as it heats up, but it will still heat up slowly nevertheless.
The multimeter measures the voltage drop across that resistor, which is directly proportional to the current; but if the resistance changes slowly with heat, naturally the measured current will also change very slightly.
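The shunt idea in numbers. The shunt value and the 1% thermal drift below are example figures, not any particular meter's spec:

```python
# Sketch: how an ammeter infers current from the drop across its shunt.
# Shunt value and drift percentage are assumed example numbers.
r_shunt = 0.1                  # nominal shunt resistance, ohms (assumed)
v_drop = 0.0045                # measured voltage across the shunt, volts
i = v_drop / r_shunt           # inferred current: 0.045 A (45 mA)
print(f"inferred current: {i * 1000:.1f} mA")

# If heating raises the shunt's resistance by 1%, the same real 45 mA
# produces a bigger drop, but the meter still divides by the nominal
# 0.1 ohm, so the reading comes out about 1% high.
r_hot = r_shunt * 1.01
v_drop_hot = 0.045 * r_hot
print(f"reading with warm shunt: {v_drop_hot / r_shunt * 1000:.2f} mA")
```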
This internal resistor in multimeters is also one of the reasons people get confused about Ohm's law (voltage = current x resistance) and why they don't see the amount of current they expect on the meter.
For example, they take a 5v power supply, put a 100 ohm resistor between + and -, and expect a current of 5v / 100 ohm = 0.05A or 50mA. But then they put the multimeter on the mA range in series with that resistor, not realizing that on the mA range the meter's internal resistor could be as high as 10 ohm. So in reality the resistance is 110 ohm, and that's why they see 5v / 110 ohm = 0.04545A, or about 45mA.
Also, they forget that the resistors they use behave much worse with temperature. The power wasted in the resistor is P = I^2 x R, so we have P = 0.045 x 0.045 x 100 = 0.2025 watts of heat that will slowly warm up the resistor and change its actual resistance, which in turn slowly affects the current consumed as well.
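The two worked examples above, put together. The 10 ohm burden resistance is the assumed worst case from the example, not a real meter's spec:

```python
# Sketch: expected vs measured current with a 10-ohm mA-range meter
# shunt in the loop, plus the heat dissipated in the 100-ohm resistor.
v = 5.0
r_load = 100.0
r_meter = 10.0                 # assumed mA-range burden resistance, ohms

i_expected = v / r_load                # 0.050 A: Ohm's law on the load alone
i_measured = v / (r_load + r_meter)    # ~0.0455 A: what the meter shows
p_load = i_measured ** 2 * r_load      # ~0.21 W of heat in the resistor
print(f"expected {i_expected * 1000:.1f} mA, "
      f"measured {i_measured * 1000:.1f} mA, "
      f"load heat {p_load * 1000:.0f} mW")
```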
ps Your Fluke 87V has a burden voltage on the 400mA range of about 1.8mV per mA - the more current you have, the more it will affect the reading. On the A range, the burden voltage drops to 0.03v per A, which is 30mV per A or 0.03mV per mA, almost low enough to ignore its effect on measurements.
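To see what those burden-voltage figures mean in practice, here's a quick sketch using the numbers quoted above (the 100mA test current and the helper function are just for illustration):

```python
# Sketch: voltage the meter itself drops ("burden voltage") at a given
# current, using the per-range figures quoted above for a Fluke 87V.
def burden_drop(i_amps, mv_per_ma):
    """Voltage lost across the meter, in volts (illustrative helper)."""
    return i_amps * 1000 * mv_per_ma / 1000

i = 0.100  # 100 mA flowing through the meter (example)
print(f"400 mA range: {burden_drop(i, 1.8):.3f} V lost in the meter")
print(f"A range:      {burden_drop(i, 0.03):.4f} V lost in the meter")
```

At 100mA the mA range eats 0.18v out of your circuit, while the A range eats only 3mV - which is why the range choice changes the reading.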
ps2 Also, in your case you may have to take into account the contact resistance of the metal strips inside the breadboard and how oxidized those strips are; and with a 10-digit multimeter you may also have to account for the resistance of the strips themselves, whether the capacitance between them affects measurements, etc etc.