I was able to tame it a little.
The blue annotation lines, from L to R:
"Disable low latency mode" - drops 50W, probably by lowering the DSP clock.
"Lower brightness to 0, then raise to 10 (from 50)" - drops 100W!
"Disable Smart controls for brightness and contrast" - basically, with these on, the monitor overrides whatever brightness you set.
So as I'm only dealing with an aggregate power value, and not the monitor specifically, these are all ballpark figures.
If it was pulling 180W before, those two changes (50W + 100W) should mean it's now only pulling 30-40W.
2x work laptops = 60W
1x Gaming PC @ idle = 100W
LED lights and auxiliary gubbins = 30W or so.
That leaves about 60W for the monitor, which is... acceptable.
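As a sanity check, here's the back-of-envelope sum I'm doing in my head. The ~250W aggregate is just what the figures above imply, not a logged meter reading, so treat it as illustrative:

```python
# Rough power budget for the desk, using the estimates above.
# aggregate_w is an assumed/implied wall-meter total, not a measured value.

aggregate_w = 250  # assumed total at the wall (implied by the figures above)

other_loads_w = {
    "2x work laptops": 60,
    "gaming PC @ idle": 100,
    "LEDs + auxiliary gubbins": 30,
}

# Whatever is left over gets attributed to the monitor.
monitor_w = aggregate_w - sum(other_loads_w.values())
print(f"Estimated monitor draw: ~{monitor_w}W")  # -> ~60W
```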
Your post had a tone that set me off trying to prove you wrong.
It did the opposite: it proved you were right, and the brightness settings had a massive impact. Maybe I should have realised I was dealing with a "nice" monitor, which comes out of the box with everything maxed out for "Wow!" factor.
It's ironic that it gives you a speech about USB standby power every time you turn it on, but defaults to pulling 180W!