I was looking at some reviews of GPUs from this generation and the last, and I was surprised how much power these top-end cards use just to play video. I basically understand why and how cards like the RTX 3080/4080 or RX 6900/7900 draw 300-500 W and more under load, running tens of billions of transistors at GHz clocks.
But for video playback, they still draw around 70-90 W. I don't know exactly what kind of playback the reviewers tested, maybe 4K streaming or something. But that's a huge fraction of total board power. So do they just not care? Cards from a few years ago, rated for 200-300 W, only used 20-30 W playing the same content, sometimes even less.
Is it just that no one at Nvidia or AMD bothered to optimize the drivers or hardware for low-power video playback? Shouldn't they be able to use only a small portion of the GPU (the dedicated video decode block) for playback?
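For anyone who wants to check what their own card actually does, here's a rough sketch of how to sample power draw and decoder load while a video plays. This assumes an NVIDIA card with the standard driver tools installed (`nvidia-smi` ships with the driver); AMD exposes similar readings through sysfs/hwmon instead, so the fallback path is just a guess at one common location:

```shell
#!/bin/sh
# Sample GPU board power, dedicated-decoder utilization, and video-engine
# clock. Run this while a video is playing to see whether the decode block
# (NVDEC) is doing the work or the whole chip is clocking up.
if command -v nvidia-smi >/dev/null 2>&1; then
    # All three query fields are standard nvidia-smi fields.
    nvidia-smi --query-gpu=power.draw,utilization.decoder,clocks.video \
               --format=csv,noheader
elif [ -d /sys/class/drm/card0/device/hwmon ]; then
    # AMD (amdgpu): power1_average is reported in microwatts.
    # The exact hwmon index varies per system -- this path is an assumption.
    cat /sys/class/drm/card0/device/hwmon/hwmon*/power1_average 2>/dev/null
else
    echo "no supported GPU telemetry found"
fi
```

On cards that behave well you'd expect the decoder utilization to be nonzero while the reported power stays far below the board's rated TDP; if the video clock and power both spike toward gaming levels during simple playback, that's the symptom being asked about.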