The main reason for temperature-controlled irons is to prevent damage to the PCBs and the parts you are soldering to them. From what I have seen in the shops where I worked, many such irons are left set to the maximum temperature by users who don't know better. I have had to repair many PCBs with traces that were overheated to the point where they de-laminated from the board.
An iron is only as good as the person using it, and many users have NO IDEA how to use one properly.
When I purchased irons for common use in a shop, I ALWAYS looked for ones whose temperature was controlled by the choice of tip: you select a tip rated for the temperature you need. The tips came in 600, 700, and 800 F ratings, so +/-50 F of accuracy was fine. And I never purchased tips rated over 700 F, so the dummies could not overheat the boards and components.
One-degree accuracy is absurd. Ten-degree accuracy is more than you will ever need, twenty-five-degree accuracy covers the most demanding work, and +/-50 F is perfectly OK for general use.
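For readers who think in Celsius, the tip ratings above convert as follows. This is just a quick sketch; the `f_to_c` helper name is mine, not from any particular tool.

```python
def f_to_c(deg_f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (deg_f - 32) * 5 / 9

# The 600/700/800 F tip ratings discussed above, in Celsius:
for tip_f in (600, 700, 800):
    print(f"{tip_f} F is about {f_to_c(tip_f):.0f} C")

# The +/-50 F tolerance band works out to roughly +/-28 C:
print(f"+/-50 F is about +/-{50 * 5 / 9:.0f} C")
```

So 600 F is roughly 316 C and 700 F roughly 371 C, which lines up with the 300-370 C range most adjustable irons ship with.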
I have two electronics benches at home, and the irons on them are set to 600 F. I almost never need them any hotter than that.
Don’t obsess over temperature control unless you are manufacturing for critical uses, such as aerospace.
Does everyone here agree with this?