Author Topic: How does a CPU calculate its own temperature?  (Read 3789 times)


Offline fubar.gr (Topic starter)

  • Supporter
  • ****
  • Posts: 366
  • Country: gr
    • Fubar.gr
How does a CPU calculate its own temperature?
« on: June 23, 2014, 05:44:58 pm »
There are several applications that will display the CPU temperature, even individual core temps.

However, these applications do not all report the same number; readings can differ by up to 5 °C.

This leads me to believe that CPUs output temperature in some raw form (e.g. a voltage) and that each program applies its own algorithm to convert it to a temperature.

What's the straight dope on CPU temps?

Online AndyC_772

  • Super Contributor
  • ***
  • Posts: 4228
  • Country: gb
  • Professional design engineer
    • Cawte Engineering | Reliable Electronics
Re: How does a CPU calculate its own temperature?
« Reply #1 on: June 23, 2014, 06:05:23 pm »
It's fairly straightforward to fabricate a silicon diode which is temperature sensitive, ie. its forward voltage varies with temperature.

It's quite a lot harder to make that diode a good, repeatable indicator of absolute temperature. So much so that most devices which include them have data sheets that specifically say they should only be used to detect temperature changes, and that the absolute temperature they read has such a wide tolerance as to be essentially useless.

A CPU may incorporate many of these, which I guess might have some utility if the die area is large and it's prone to hot spots. It's even possible that the temperature diodes in different parts of the die might actually be fairly closely matched.

They might be factory calibrated, so they're measured at a known temperature as part of the wafer test process, and a calibration constant is stored in OTP memory.
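
As a very rough sketch of how that stored constant might be applied, here's a one-point calibration in C. Everything here - register values, slope, names - is invented for illustration, not taken from any real part:

Code:
#include <stdint.h>
#include <stdio.h>

#define CAL_TEMP_C     25.0f     /* known temperature at wafer test, C */
#define CODES_PER_DEGC -4.2f     /* nominal slope: the diode's Vf (and so
                                    its ADC code) falls as the die warms */

static uint16_t otp_cal_code = 1892;  /* pretend code burned into OTP */

static float die_temp_c(uint16_t raw_code)
{
    /* one-point calibration: offset trimmed per die, slope taken as the
       characterized average for the process */
    return CAL_TEMP_C + (raw_code - otp_cal_code) / CODES_PER_DEGC;
}

int main(void)
{
    /* (1724 - 1892) / -4.2 = +40 C above the calibration point */
    printf("%.1f C\n", die_temp_c(1724));
    return 0;
}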

Offline retrolefty

  • Super Contributor
  • ***
  • Posts: 1648
  • Country: us
  • measurement changes behavior
Re: How does a CPU calculate its own temperature?
« Reply #2 on: June 23, 2014, 06:10:08 pm »
Also keep in mind that the die temp may be influenced to a large degree by the current draw from all the output pins in use. I've never seen a practical use of the internal temp sensor in an AVR ATmega chip demonstrated.
 

Offline katzohki

  • Frequent Contributor
  • **
  • Posts: 378
  • Country: us
    • My Blog
Re: How does a CPU calculate its own temperature?
« Reply #3 on: June 23, 2014, 06:14:47 pm »
I would think that a typical computer CPU outputs its temperature as digital data, not as a voltage. Microcontrollers for embedded applications may be another story.
 

Offline Len

  • Frequent Contributor
  • **
  • Posts: 547
  • Country: ca
Re: How does a CPU calculate its own temperature?
« Reply #4 on: June 23, 2014, 10:28:58 pm »
Quote from: fubar.gr
What's the straight dope on CPU temps?
For the straight dope, read the datasheet! For example:

http://www.intel.com/content/dam/www/public/us/en/documents/datasheets/4th-gen-core-family-desktop-vol-1-datasheet.pdf

That's for current Intel CPUs. In brief, they have a digital temperature sensor on the die that can be read by software, or by external hardware via a single-wire interface (PECI). For these CPUs, the reported temperature is not an absolute number in degrees: it's relative to the defined maximum operating temperature (TjMax), which can differ between CPU models. So the software has to convert it to degrees based on exactly which CPU it is, and if that's not done correctly, it could be one reason for the discrepancy you're seeing between various programs.

I've also seen differences like that caused by different temp sensors in different places, e.g. one in the CPU and another on the motherboard under the CPU. I don't know if that's still an issue; it seems like all CPUs have built-in temp sensors these days.
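
For the curious, here's a minimal sketch of how a monitoring tool might do that conversion on Linux via the msr driver (needs the msr module loaded and root). The MSR numbers and bit fields are the documented ones for recent Intel Core parts, but other models can differ, so treat it as illustrative only:

Code:
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <unistd.h>

#define IA32_THERM_STATUS      0x19C  /* bits 22:16: degrees below TjMax */
#define MSR_TEMPERATURE_TARGET 0x1A2  /* bits 23:16: TjMax in degrees C */

static uint64_t rdmsr(int fd, uint32_t reg)
{
    uint64_t val = 0;
    if (pread(fd, &val, sizeof val, reg) != sizeof val)
        perror("rdmsr");
    return val;
}

int main(void)
{
    int fd = open("/dev/cpu/0/msr", O_RDONLY);
    if (fd < 0) { perror("open msr"); return 1; }

    int tjmax   = (int)((rdmsr(fd, MSR_TEMPERATURE_TARGET) >> 16) & 0xFF);
    int readout = (int)((rdmsr(fd, IA32_THERM_STATUS)      >> 16) & 0x7F);

    /* the DTS reports distance below TjMax, not an absolute temperature */
    printf("core 0: %d C (TjMax %d C)\n", tjmax - readout, tjmax);

    close(fd);
    return 0;
}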
DIY Eurorack Synth: https://lenp.net/synth/
 

Offline senso

  • Frequent Contributor
  • **
  • Posts: 951
  • Country: pt
    • My AVR tutorials
Re: How does a CPU calculate its own temperature?
« Reply #5 on: June 23, 2014, 11:55:36 pm »
On my i7 (740QM), sensors (under Ubuntu), RealTemp 3.70 and FanSpeed (latest version) all agree on the CPU and GPU temperatures. It's funny to see the CPU always at 50 °C or more and the GPU at 40 °C. On some other laptops they also agree; where they don't, it might be software that is older or not as well maintained.
 

Offline SeanB

  • Super Contributor
  • ***
  • Posts: 16284
  • Country: za
Re: How does a CPU calculate its own temperature?
« Reply #6 on: June 24, 2014, 12:48:21 am »
Generally the sensor values are read by a monitoring chip that has the diode connected to it, and which also has the power supply voltages scaled and fed to its other ADC inputs. The voltage channels come back as 8-bit values that have to be scaled, while the diode channel generally reads directly in degrees. Often you get a motherboard temperature in there as well, read either from the monitoring chip itself or from a small thermistor on the board. You can also get two temperature readings from the HDD, one being the inside airflow temperature and the other the HDD controller board temperature. Those at least are scaled by the drive's controller, and in my case they agree with the MB temp as well.
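
Something like this, with made-up register values and a made-up divider ratio, just to show the two kinds of scaling (the 16 mV LSB is typical of the old monitor chips, but check your own board's datasheet):

Code:
#include <stdint.h>
#include <stdio.h>

#define ADC_MV_PER_LSB 16.0f   /* LSB size common on classic monitor chips */

int main(void)
{
    uint8_t temp_reg = 42;     /* board temp channel: reads directly in C */
    uint8_t v12_reg  = 188;    /* +12 V rail, fed in through a divider */

    float divider = 4.0f;      /* hypothetical on-board divider ratio */
    float v12 = v12_reg * ADC_MV_PER_LSB * divider / 1000.0f;

    printf("board: %u C, +12 V rail: %.2f V\n", temp_reg, v12);
    return 0;
}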
 

Offline raw-electrons

  • Newbie
  • Posts: 5
  • Country: us
Re: How does a CPU calculate its own temperature?
« Reply #7 on: June 24, 2014, 05:13:17 am »
If I understand what the OP is asking, it's: "Why do different Windows temperature applications show different temperatures while running at the same time?"

From what I've read, there are quite a few ways to get temperature data: the BIOS provides some info, and so do the CPU and motherboard. But the authors of various applications, faced with the daunting task of being compatible with the widest possible range of hardware, probably each pick their own method.

On top of this there is the issue of sampling. Two programs might sample at different times, getting different readings of the "instantaneous" temperature. The applications might use an API to get their readings, and who knows how the API is written: it could return a direct sample or a rolling average, or the applications themselves could be smoothing the data.
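
A toy illustration of that effect, with invented sample values: two views of the same sensor during a brief load spike, one raw and one smoothed with an exponential average:

Code:
#include <stdio.h>

int main(void)
{
    /* pretend sensor samples taken during a short load spike, in C */
    float samples[] = { 48, 49, 62, 71, 69, 55, 50 };
    float avg = samples[0];

    for (int i = 0; i < 7; i++) {
        avg = 0.8f * avg + 0.2f * samples[i];   /* smoothed view */
        printf("instantaneous: %2.0f C   smoothed: %4.1f C\n",
               samples[i], avg);
    }
    return 0;
}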

I suppose it's possible, as the OP suggests, that some temperature sensors simply provide a raw value, e.g. the voltage across a diode, and each program is left to apply its own approximation of the diode's behaviour to estimate the temperature; each program's model would then give a different answer.

What's actually happening depends as much on the software as the hardware it's running on.  Without any specifics, it's hard to speculate on exactly what's happening in your particular case.
 

Offline Jeroen3

  • Super Contributor
  • ***
  • Posts: 4078
  • Country: nl
  • Embedded Engineer
    • jeroen3.nl
Re: How does a CPU calculate its own temperature?
« Reply #8 on: June 24, 2014, 05:54:41 am »
If you read the datasheet of your Intel chip, the values at which it enables thermal protection are not given as absolute temperatures but as ranges of 10 or more degrees.

Basically, characterization defines a value at 25 °C, then the average slope up to Tjmax and down to Tjmin is characterized and given as the spec, which you can linearize.

But Tjunction can also be calculated when you know the power going through the chip and the ambient temperature. An example of such a formula:
Quote from: STM32F407xx datasheet
TJ max = TA max + (PD max × ΘJA)
– TA max is the maximum ambient temperature in °C,
– ΘJA is the package junction-to-ambient thermal resistance, in °C/W,
– PD max is the sum of PINT max and PI/O max (PD max = PINT max + PI/O max),
– PINT max is the product of IDD and VDD, expressed in watts. This is the maximum chip internal power.
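
Plugging in some made-up numbers (the ΘJA and power figures below are assumptions for the example, not values from any specific device):

Code:
#include <stdio.h>

int main(void)
{
    float ta_max   = 85.0f;   /* max ambient, C */
    float theta_ja = 38.0f;   /* junction-to-ambient, C/W (assumed) */
    float pint_max = 0.40f;   /* IDD * VDD, W (assumed) */
    float pio_max  = 0.10f;   /* I/O switching power, W (assumed) */

    /* TJ max = TA max + PD max * theta_JA */
    float tj_max = ta_max + (pint_max + pio_max) * theta_ja;
    printf("TJ max = %.1f C\n", tj_max);   /* 85 + 0.5 * 38 = 104 C */
    return 0;
}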
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 16618
  • Country: us
  • DavidH
Re: How does a CPU calculate its own temperature?
« Reply #9 on: June 26, 2014, 07:23:27 pm »
Quote from: AndyC_772
It's quite a lot harder to make that diode a good, repeatable indicator of absolute temperature. So much so that most devices which include them have data sheets that specifically say they should only be used to detect temperature changes, and that the absolute temperature they read has such a wide tolerance as to be essentially useless.

It is actually trivial: instead of measuring the absolute forward voltage drop of the diode, you measure the change in voltage drop at two different currents, which is proportional to absolute temperature. This value is about 198 microvolts per degree for a decade of current.
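
The underlying relation is delta-Vbe = (kT/q) × ln(I2/I1), so for a decade of current the slope is (k/q) × ln(10) ≈ 198 µV/K, as above. A quick sketch of inverting a measured difference for temperature (the 59.3 mV reading is an invented measurement):

Code:
#include <math.h>
#include <stdio.h>

#define K_BOLTZMANN 1.380649e-23   /* J/K */
#define Q_ELECTRON  1.602177e-19   /* C */

int main(void)
{
    double current_ratio = 10.0;       /* one decade, as in the post */
    double delta_v       = 59.3e-3;    /* measured Vf difference, V */

    /* invert delta_v = (k*T/q) * ln(ratio) for T */
    double t_kelvin = delta_v * Q_ELECTRON /
                      (K_BOLTZMANN * log(current_ratio));

    printf("T = %.1f K (%.1f C); slope = %.0f uV/K per decade\n",
           t_kelvin, t_kelvin - 273.15,
           1e6 * (K_BOLTZMANN / Q_ELECTRON) * log(10.0));
    return 0;
}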
 

