Undershoot seems to be the natural side effect of having the heater and sensor tightly thermally coupled. This is the behavior I notice with my T12 clones, too.
There has to be some electrical insulation between the heater and the copper in the tip. In a T12 tip, the heater is powered through the thermocouple, so that connection must be solid. So when the power is interrupted to take a reading, the thermocouple is still affected by the recent, direct heating of the sensor. Not only does this cause a more gradual ramp to the set temperature, but if the tip is under load, it creates more differential between set temp and actual tip temp; at least that is what I reckon causes the worse performance in this regard with my clones. It is like slop in a lead screw: when the tip is under no load, that slop eventually near-disappears; when the tip is under load and the duty cycle is higher, that slop grows. I am curious how well this can be negated by software, but I suspect it can't be corrected fully without some compromises.
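To make the software-compensation idea concrete, here is a minimal sketch of one naive approach: subtract a correction proportional to recent heater duty cycle from the raw reading, since the bias described above grows with duty. Everything here is hypothetical for illustration; the function name, the linear model, and the `k_bias` constant are all assumptions, and a real controller would need calibration and probably a nonlinear or time-decaying model.

```python
def compensated_temp(raw_reading_c, recent_duty, k_bias=25.0):
    """Crude compensation for heater-induced thermocouple bias.

    raw_reading_c: thermocouple reading taken just after heater-off (deg C)
    recent_duty:   average heater duty over the last few cycles, 0.0 to 1.0
    k_bias:        assumed bias in deg C at 100% duty (made-up number;
                   would have to be measured per tip design)
    """
    return raw_reading_c - k_bias * recent_duty

# At idle the correction vanishes; under load it grows, mirroring the
# "slop" analogy: same raw reading, different estimated tip temp.
print(compensated_temp(350.0, 0.0))  # no load, no correction
print(compensated_temp(350.0, 0.8))  # heavy load, large correction
```

The obvious compromise is visible even in this toy version: the correction is an open-loop guess, so an error in `k_bias` trades one kind of temperature error for another, which matches my suspicion that it can't be fixed fully in software.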
JBC could be built differently. They have separate circuits for the thermocouple and the heater, but the two designs are probably more similar than different in this regard.
In the classic 936, the sensor can be insulated fairly well from the heater and coupled relatively much better to the tip (it is not as closely fitted to the copper, but that is overcome by surface area). It is actually superior for temperature control in this way: the sensor measures almost purely the actual tip temperature whether the heater is on or off. At least I assume the sensor in a 936 is intentionally insulated from the heater; that is how I would think to do it. I have never taken one apart.