In the days of tube circuitry, 1/2 W and even 1 W resistors were the default. Then transistors came along, and the norm dropped to 1/4 W to 1/2 W. With ICs it went down again, to 1/8 W to 1/4 W, and to even lower levels in modern circuits.
Why?
Well, tubes tended to need voltages in the hundreds to operate: 150 V, 200 V, 250 V, or even more. Power is calculated as voltage times current (P = V × I). So a tube circuit running from a 200 V supply could, in theory, have current flowing from that supply to ground with only a single resistor to limit it. 200 V across 10K ohms gives 20 mA and a power of 4 watts! Tube circuits had to be carefully designed so that most resistors never carried that much current; where they could, the resistors needed either higher resistances to limit the current or higher power ratings. At 200 V, 100K ohms would pass 2 mA and dissipate 0.4 W, so a 1/2 W resistor would suffice, BUT BARELY. Not much safety margin. I was once told to calculate the expected power, double it, and then select the next larger standard rating. That is a good rule: the 0.4 W calculated power would be doubled to 0.8 W, so a 1 W resistor would be used.
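That "calculate, double, round up" rule is easy to mechanize. A minimal sketch, assuming the full supply voltage appears across the resistor and using one common (but not exhaustive) list of standard power ratings:

```python
# Common standard resistor power ratings in watts. This particular list is
# an assumption for illustration; real catalogs offer more sizes.
STANDARD_RATINGS_W = [0.125, 0.25, 0.5, 1.0, 2.0, 5.0, 10.0]

def required_rating(volts, ohms, margin=2.0):
    """Return (dissipated power, chosen standard rating), applying the
    double-then-round-up rule from the text."""
    power = volts ** 2 / ohms          # P = V^2 / R (full supply across R)
    target = power * margin            # apply the 2x safety margin
    for rating in STANDARD_RATINGS_W:
        if rating >= target:
            return power, rating
    raise ValueError("exceeds the largest listed rating")

# The tube examples above: 200 V across 100K, then 200 V across 10K.
print(required_rating(200, 100_000))   # 0.4 W dissipated -> 1 W part
print(required_rating(200, 10_000))    # 4 W dissipated -> 10 W part
```

Note that the 10K case, which the text flags as dangerous, lands at a bulky 10 W part; that is exactly why such paths had to be designed out or given higher resistances.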
Transistors worked with supply voltages of roughly 9 to 24 V, so the power expected to be dissipated in the resistors was at least ten times smaller (and for a given resistance as much as a hundred times smaller, since power scales with the square of the voltage). I guess at that point in the history of electronics there was no real price incentive to go smaller than 1/4 W, and automated assembly equipment probably came into play, so the 1/4 W size became common.
IC circuits started with 10 or 12 volt supplies and quickly moved to lower voltages. Miniaturization was also a desirable objective, so 1/8 W or 1/10 W parts started to look like a good idea.
All of this is based on the idea that the power supply is the highest voltage that can be encountered in a given circuit and that most of the resistors will be of high enough values to limit the current to a sensible level. BUT, even with transistors, ICs, and low voltage power supplies, it is still possible for a low value resistor to conduct enough current to dissipate more power and generate more heat than a small part can handle. And some circuits multiply the voltage or current levels. So some resistors need to have their power levels carefully calculated. A good design engineer will usually know when these situations are likely. And, of course, circuits should be breadboarded and tested, or simulated in software; sometimes both are needed to fully know the power ratings required.
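To see why low supply voltage alone is not enough protection, here is a rough worst-case check. It pessimistically assumes the full supply can appear across each resistor (a real design review would refine that), with a hypothetical 5 V logic supply and a few illustrative resistance values:

```python
def worst_case_power(supply_volts, ohms):
    """Worst-case dissipation if the whole supply appears across R."""
    return supply_volts ** 2 / ohms    # P = V^2 / R

# Hypothetical 5 V supply; resistor values chosen for illustration.
for r in (10, 100, 1_000, 10_000):
    p = worst_case_power(5.0, r)
    verdict = "1/4 W with 2x margin" if 2 * p <= 0.25 else "needs a bigger part"
    print(f"{r:>6} ohm: {p:.4f} W  ({verdict})")
```

Even at 5 V, a 10 ohm resistor could see 2.5 W, far beyond any 1/8 W or 1/4 W part, while the higher values pass comfortably.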