I have only found articles saying that they just need a constant-current source (C/10 of the rated capacity or less, to be safe) and a proper cutoff voltage to stop the charge.
Is that really all there is to it? It seems way too easy, and as I said before, I want to be 100% sure.
I don’t know where you were looking, because that’s definitely not correct. You could design a very naive charger around that method, but it has no way to actually detect “fullness”. To prevent overcharging, you’d have to set the cutoff voltage well below the typical full-charge voltage of a new cell, so that worn cells (with lower capacity) don’t get overcharged, which would wear them out much faster.
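For contrast, here is roughly what that naive scheme looks like as a C sketch. The hook functions and the 1400 mV cutoff are made-up placeholders; the point is that a fixed threshold is not a fullness detector:

```c
#include <stdbool.h>
#include <stdio.h>

#define CUTOFF_MV 1400.0f   /* arbitrary illustrative cutoff, not a recommended value */

/* Hypothetical hardware hooks, stubbed so the sketch runs standalone. */
static float read_cell_voltage_mv(void) { static float mv = 1200.0f; return mv += 1.0f; }
static void  charger_enable(bool on)    { printf("charger %s\n", on ? "ON" : "OFF"); }

int main(void)
{
    charger_enable(true);
    /* No notion of "full" -- just a fixed threshold. A worn cell can be
       overcharged long before it ever reaches CUTOFF_MV. */
    while (read_cell_voltage_mv() < CUTOFF_MV)
        ;
    charger_enable(false);
    return 0;
}
```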
The gold-standard way to charge them is constant-current charging with negative delta-V (-ΔV) termination, backed by a temperature-rise (dT/dt) check. The first part means that you monitor the cell voltage, which slowly rises during the charge, and as soon as you notice it start to drop, you stop charging. Simultaneously, you monitor the cell temperature, and once its rate of rise increases, you terminate. Whichever comes first wins. (The two detection methods are actually related: they’re both measuring the same thing, by proxy. Heat is the result of the cell being full, so temperature is a proxy for overcharging. But the increased temperature alters the cell’s internal resistance, which in turn alters the terminal voltage. Thus, the voltage drop is also an indicator of being full.)
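To make that concrete, here is a minimal C sketch of a constant-current charge loop with -ΔV and dT/dt termination. The hardware hooks are stubbed with a toy cell model so it runs standalone, and the thresholds (5 mV, 1 °C/min) and sample period are illustrative placeholders, not datasheet values:

```c
#include <stdbool.h>
#include <stdio.h>

#define NDV_THRESHOLD_MV  5.0f   /* full: voltage fell this far below its peak */
#define DTDT_THRESHOLD_C  1.0f   /* full: temperature rising faster than 1 C/min */
#define SAMPLE_MIN        0.5f   /* sample period, in minutes */

/* Hypothetical hardware hooks, stubbed with a toy cell model: voltage peaks
   at the 60-minute mark and then sags, while temperature starts climbing fast. */
static float t_min = 0.0f;   /* simulated elapsed minutes */

static float read_cell_voltage_mv(void)
{
    return (t_min < 60.0f) ? 1300.0f + t_min : 1360.0f - 2.0f * (t_min - 60.0f);
}

static float read_cell_temp_c(void)
{
    return (t_min < 60.0f) ? 25.0f + 0.05f * t_min : 28.0f + 2.0f * (t_min - 60.0f);
}

static void charger_enable(bool on) { printf("charger %s\n", on ? "ON" : "OFF"); }

int main(void)
{
    float peak_mv = 0.0f;
    float prev_c  = read_cell_temp_c();

    charger_enable(true);                  /* start the constant-current charge */

    for (;;) {
        t_min += SAMPLE_MIN;               /* stand-in for a real delay/timer */

        float mv = read_cell_voltage_mv();
        float c  = read_cell_temp_c();

        if (mv > peak_mv)
            peak_mv = mv;                  /* track the highest voltage seen so far */

        /* -dV check: voltage has sagged measurably below its peak. */
        if (peak_mv - mv >= NDV_THRESHOLD_MV) {
            printf("-dV termination at %.1f min\n", t_min);
            break;
        }
        /* dT/dt check: temperature is climbing abnormally fast. */
        if ((c - prev_c) / SAMPLE_MIN >= DTDT_THRESHOLD_C) {
            printf("dT/dt termination at %.1f min\n", t_min);
            break;
        }
        prev_c = c;
    }

    charger_enable(false);                 /* whichever check trips first ends it */
    return 0;
}
```

A real charger would add backstops this sketch omits, such as an absolute temperature limit and a safety timer, so a missed -ΔV event can’t turn into an endless charge.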
The drop in voltage is tiny (a few millivolts per cell), so fairly sensitive electronics are needed to terminate the charge reliably.
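To put “a few millivolts” in perspective, here is a quick back-of-envelope calculation of the step size (LSB) of some typical ADC configurations; the bit depths and reference voltages are generic examples, not a recommendation:

```c
#include <stdio.h>

int main(void)
{
    /* Example ADC configurations: bit depth and reference voltage in mV. */
    const struct { int bits; float vref_mv; } adc[] = {
        { 10, 5000.0f },   /* classic 10-bit ADC with a 5 V reference */
        { 12, 3300.0f },   /* common 12-bit ADC with a 3.3 V reference */
        { 16, 2500.0f },   /* 16-bit ADC with a 2.5 V reference */
    };

    for (int i = 0; i < 3; i++) {
        float lsb_mv = adc[i].vref_mv / (float)(1 << adc[i].bits);
        printf("%2d-bit @ %6.1f mV ref: 1 LSB = %.3f mV\n",
               adc[i].bits, adc[i].vref_mv, lsb_mv);
    }
    /* A ~5 mV sag is about one count on the 10-bit ADC (4.9 mV/LSB), so in
       practice you need oversampling/averaging or a finer ADC to see it. */
    return 0;
}
```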
In a nutshell, NiMH is easier to live with (in terms of shipping, handling, and the lack of catastrophic failure if a cell is abused), but it is actually quite challenging to charge properly.
Unlike NiCd, NiMH doesn’t tolerate trickle charging. The exceptions are special NiMH types designed specifically for trickle charging, which are sold as “cordless phone” (as in, landline) batteries.