The datasheet for the AT30TS74 temperature sensor (https://ww1.microchip.com/downloads/en/DeviceDoc/Web_AT30TS74_0217.pdf) has a comprehensive table of register value / temperature pairs.
I wrote unit tests for my driver based on them, and judging from some of them failing, either my decoding is wrong or the table is incorrect.
Here are some examples, hand-decoded:
-55°C:
Register value 0xC900. The last 4 bits are irrelevant. The value is negative; since it uses 2's complement, as a 16-bit int it is 0xFC90, which is -880 in decimal. The resolution is 0.0625 °C per LSB with the 0 °C point being 0, so -880 * 0.0625 = -55. Great.
-50.5°C:
Register value 0xCE80. The last 4 bits are irrelevant. The value is negative; since it uses 2's complement, as a 16-bit int it is 0xFCE8, which is -792 in decimal. -792 * 0.0625 = -49.5. What?
The sensor has ±1 °C accuracy over most of its range, but factoring that into such a table seems way too weird.
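For reference, here is a minimal sketch of the decoding described above (drop the 4 don't-care LSBs, sign-extend the 12-bit two's-complement value, scale by 0.0625 °C/LSB). The function name `decode_temp` is just for illustration, not from my driver:

```python
def decode_temp(reg: int) -> float:
    """Decode a 16-bit AT30TS74 temperature register value (12-bit mode)."""
    raw = reg >> 4            # discard the 4 irrelevant LSBs -> 12-bit value
    if raw & 0x800:           # sign bit of the 12-bit two's-complement value
        raw -= 0x1000         # sign-extend to a negative Python int
    return raw * 0.0625       # 0.0625 degC per LSB

print(decode_temp(0xC900))    # -55.0, matches the table
print(decode_temp(0xCE80))    # -49.5, not the -50.5 the table claims
```

Note that under this scheme -50.5 °C would encode as 0xCD80 (-808 * 16 masked to 12 bits is 0xCD8), which is what makes the 0xCE80 entry look suspect.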