I don't think wanting a soldering iron that actually heats to the set temperature is being obsessed with anything.
A soldering iron is a hot stick, and you're pretty crap at hot-stick-making if you can't make yours go to the agreed-upon temperature. You-had-one-job and such memes.
The TS100 does work quite well in practice: good thermal mass, recovery, and temperature stability. But I still want accurate temperature setting and readout, even if it's for no other purpose than making my experience transferable between different pieces of equipment.
For instance, when I do SMD work I set a very low temperature. Soldering 0603 components and removing solder bridges on a QFP requires very little heat, which keeps the risk of damaging components and the rate of burning off flux to a minimum. The same temperature will be utterly useless when trying to desolder a THT connector on some board with lead-free solder and a gigantic ground plane. But better not keep that high temperature when wicking solder off small pads, as the heat can cause the epoxy to melt. I found that SMD aluminium electrolytics require a bit more heat: they're often very inconvenient to solder, and there's a risk of producing bad joints when it's hard to make contact with the pad.
So I think I'm doing what you're suggesting: getting some experience and a gut feeling for what tip/temperature to use. But I hate the idea of building up all that experience and then using a different piece of equipment where all my numbers are different: 300C = 270C, 370C = 320C, etc.
Displaying and maintaining an accurate temperature does not seem like an exotic requirement, more like the bare minimum I'd expect of a decent tool. And it seems like some people are getting that result with their TS100; I just want to understand what the difference is.