The question above about temperature in situations involving phase change (or change of state) prompted me to think about thermometers.
Now, a thermometer measures its own temperature (excluding "thermal radiation thermometers" such as IR guns or cameras).
In classical thermodynamics (before statistical mechanics), the ungainly-named "Zeroth Law" says that if system A is in thermal equilibrium with system B, and system B is in equilibrium with system C, then system A is in equilibrium with system C. This is what lets us assign a "temperature" to such systems. Before statistical mechanics, investigators assigned reference values (such as 0 and 100, or 32 and 212) to reproducible physical situations (the freezing and boiling of water) to put numerical scales on their thermometers.
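As a concrete illustration of that two-fixed-point scheme, here is a minimal Python sketch of linear calibration between two reference readings; the function name and the mercury-column numbers are invented for the example:

```python
def two_point_calibrate(raw, raw_lo, raw_hi, t_lo=0.0, t_hi=100.0):
    """Map a raw sensor reading (e.g., mercury column height in mm) to
    temperature, assuming the reading varies linearly between two fixed
    points (here the ice and steam points on a 0/100 scale)."""
    return t_lo + (raw - raw_lo) * (t_hi - t_lo) / (raw_hi - raw_lo)

# Hypothetical example: a column reading of 41.3 mm, where ice water
# gave 12.0 mm and boiling water gave 112.0 mm:
print(two_point_calibrate(41.3, 12.0, 112.0))  # about 29.3 degrees C
```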
To use a thermometer, one puts it in thermal equilibrium with the system being measured, and then reads it out according to its calibration.
This is obvious with a kitchen thermometer: it starts out at room temperature in the drawer, then its reading rises after insertion into the meat until it reaches equilibrium.
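To put a number on "reaches equilibrium": a probe with a first-order thermal response approaches the meat temperature exponentially (Newton's law of cooling). A small sketch, where the 8-second time constant is an assumption for illustration:

```python
import math

def probe_reading(t, t_start, t_meat, tau):
    """First-order probe response: the reading approaches the meat
    temperature exponentially with time constant tau (seconds)."""
    return t_meat + (t_start - t_meat) * math.exp(-t / tau)

# Probe starts at 20 C (room temperature), meat is at 75 C, tau = 8 s:
for t in (0, 8, 16, 24, 40):
    print(f"t = {t:2d} s  reading = {probe_reading(t, 20.0, 75.0, 8.0):.1f} C")
# After about 3 time constants the reading is within ~5% of equilibrium.
```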
A thermometer calibrated in this manner is useful over a range with no phase transitions in the temperature sensor: a mercury thermometer is used above -38 °C and below +350 °C to avoid freezing and boiling, respectively.
Other thermometers (such as RTDs, diodes, thermistors, and thermocouples) do not use liquids and avoid phase transitions at low temperatures.
I wondered about extremely low temperatures, below 4 K (the boiling point of liquid helium).
The concept of absolute zero arose from the behavior of "ideal gases" (far above condensation) in Charles's Law, which became part of the Ideal Gas Law

pV = nkNT

where p is the gas pressure, V is the volume, n is the quantity of gas in moles, k is Boltzmann's constant, N is Avogadro's number, and T is the absolute temperature.
Extrapolating this linear relation from "normal" temperatures down to zero pressure or volume allowed the estimation of absolute zero.
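That extrapolation is easy to reproduce. A sketch with hypothetical constant-pressure volume data (the numbers below are constructed to follow Charles's Law, not measured):

```python
import numpy as np

# Hypothetical gas volumes (L) at several Celsius temperatures, at
# constant pressure. V is linear in T, so the T-intercept at V = 0
# estimates absolute zero.
T_c = np.array([0.0, 25.0, 50.0, 75.0, 100.0])    # degrees C
V   = np.array([22.4, 24.45, 26.5, 28.55, 30.6])  # liters (illustrative)

slope, intercept = np.polyfit(T_c, V, 1)  # linear fit: V = slope*T + intercept
abs_zero = -intercept / slope             # where the fitted volume hits zero
print(f"Extrapolated absolute zero: {abs_zero:.0f} C")  # about -273 C
```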
Modern cryogenics can achieve extremely low temperatures, well into the millikelvin range.
Apparently, the best thermometer for extreme cold is a negative-tempco resistance thermometer (its resistance rises as the temperature falls).
This unit from Lake Shore Cryotronics (https://www.lakeshore.com/products/categories/specification/temperature-products/cryogenic-temperature-sensors/ultra-low-temperature-rox) is calibrated down to 10 mK = 0.01 K, and usable to 5 mK. I assume the lower limit is set by self-heating: it doesn't take much power to screw up a 5 mK reading.
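A back-of-envelope check on that self-heating guess: the excitation current dissipates P = I²R in the sensor, and the resulting temperature rise depends on the thermal resistance between the sensing element and the cold stage. All numbers below are illustrative assumptions, not Lake Shore specifications:

```python
def self_heating_rise(i_exc, r_sensor, r_thermal):
    """Temperature rise from sensor excitation: power P = I^2 * R is
    dissipated in the sensor and flows out through a thermal
    resistance r_thermal (K/W) to the cold stage."""
    p = i_exc**2 * r_sensor
    return p * r_thermal

# Assumed values: 10 nA excitation into a 10 kOhm sensor with
# 10^7 K/W thermal resistance to the cold stage:
dT = self_heating_rise(10e-9, 10e3, 1e7)
print(f"Self-heating rise: {dT*1e3:.3f} mK")  # 0.010 mK here
# That looks harmless, but a 10x larger excitation gives 100x the power,
# which is already a meaningful fraction of a 5 mK reading.
```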
The typical resistance curve is shown in https://www.lakeshore.com/products/categories/overview/temperature-products/cryogenic-temperature-sensors/ultra-low-temperature-rox and goes from about 2000 Ω at 1 K up to about 10,000 Ω at 10 mK.
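To convert a measured resistance back to temperature between those two endpoints, one crude option is a power-law interpolation (a straight line in log-log space). Real sensors ship with a factory calibration table or polynomial fit, so this is only a sketch built from the two published values above:

```python
import numpy as np

# Two points read off the published curve: about 2 kOhm at 1 K and
# about 10 kOhm at 10 mK. Resistance rises as temperature falls.
R_pts = np.log([2000.0, 10000.0])  # increasing, as np.interp requires
T_pts = np.log([1.0, 0.010])

def temperature_from_resistance(r_ohm):
    """Power-law (log-log linear) interpolation between the two
    calibration points; a stand-in for the real calibration table."""
    return float(np.exp(np.interp(np.log(r_ohm), R_pts, T_pts)))

print(f"5 kOhm -> ~{temperature_from_resistance(5000)*1000:.0f} mK")  # ~73 mK
```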