For anything that generates heat to stop rising in temperature, it must reach thermal equilibrium with its environment.
Thermal equilibrium is reached when the heating power equals the cooling power. In your case, cooling is mostly by convection, with a bit of conduction into the table.
Cooling only happens when there is a temperature difference. So, in order to dump any heat at all through conduction or convection, the resistor
has to be hotter than the environment. It doesn't matter how much heat it has to dump - it will always be hotter than its surroundings so that heat can flow out of it. The more heat it has to dump, the hotter it will be, everything else being equal.
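To put a rough number on it (the thermal resistance used here is just an assumed example, not a measured property of your block): at equilibrium the dissipated power equals the heat flowing out through some overall thermal resistance R_th to ambient, so

```latex
P = \frac{T_{\text{final}} - T_{\text{amb}}}{R_{\text{th}}}
\quad\Longrightarrow\quad
\Delta T = T_{\text{final}} - T_{\text{amb}} = P \, R_{\text{th}}
% e.g. with P = 1.7 W and an assumed R_th of about 30 K/W,
% the block would settle roughly 50 K above ambient.
```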
Since the cooling rate is primarily controlled by the temperature difference, the heating rate is fixed, and the temperature rise is driven by the heating/cooling imbalance acting on the thermal mass of the resistor, you inherently get asymptotic behavior.
That resistor will be heating up essentially forever, just ever slower and slower. In practice it only depends on how accurate your temperature measurements are. In a precision temperature-controlled chamber, with precision temperature sensing, you'd observe that resistor heating up for a day or more. Eventually the rise would get buried in measurement drift and noise, but it never truly stops - not unless the heat sink (here: the air and the table) temperature drops far enough.
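Here is a minimal lumped-model sketch of that behavior, assuming a thermal resistance of 30 K/W and a thermal mass of 20 J/K for the block (both numbers are guesses for illustration, not measurements of your setup):

```python
import math

# Lumped thermal model: dT/dt = (P - (T - T_amb) / R_th) / C_th
# All numbers below are assumed for illustration, not measured.
P = 1.7        # dissipated power, W
T_amb = 25.0   # ambient temperature, degrees C
R_th = 30.0    # thermal resistance block-to-ambient, K/W (guess)
C_th = 20.0    # thermal mass of the block, J/K (guess)

tau = R_th * C_th              # time constant, seconds
T_final = T_amb + P * R_th     # equilibrium temperature

for minutes in (1, 5, 10, 30, 60, 120):
    t = minutes * 60
    # Analytic solution of the first-order model above:
    T = T_amb + P * R_th * (1 - math.exp(-t / tau))
    print(f"{minutes:4d} min: {T:6.2f} C (still {T_final - T:5.2f} C below final)")
```

The gap to the final temperature never actually reaches zero; it just shrinks until it is below what your thermometer can resolve, which is exactly the "heats up forever, ever slower" behavior described above.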
So, what you observe is normal. It would be super strange if it were any other way!
> I was expecting an Al block the size of a thumb to dissipate 1.7W in convection without much heating (my guess was 30-40°C)
See how big a pencil soldering iron's hot section is? It will be dissipating say 5-15W while the tip sits at 250°C or thereabouts. Your resistor dissipates a few times less heat, but it isn't much larger than that hot section.
Get a 2W-rated through-hole axial resistor and see what it does when it is dissipating just 1W.
