Author Topic: Soldering iron display temp vs tip temp  (Read 4568 times)


Offline billbyrd1945Topic starter

  • Regular Contributor
  • *
  • Posts: 203
  • Country: us
Soldering iron display temp vs tip temp
« on: May 18, 2019, 05:09:41 pm »
The display temp on my soldering station is 842F (wide open). A hand-held IR tester shows the tip to be only about 275F. Sounds like a stupid question, but should the tip be at the same temperature as the set temp?
 

Offline bob91343

  • Super Contributor
  • ***
  • Posts: 2675
  • Country: us
Re: Soldering iron display temp vs tip temp
« Reply #1 on: May 18, 2019, 06:01:21 pm »
Chances are the set temperature is the value at the sensor, wherever that is.  Of course the tip will be cooler.  Add to this the accuracy of calibration.

The numbers are less important than how well it does the job.  So the best course of action seems to be to adjust until it works the way you want and use that setting.
 

Offline Daruosha

  • Regular Contributor
  • *
  • Posts: 181
  • Country: ir
Re: Soldering iron display temp vs tip temp
« Reply #2 on: May 18, 2019, 06:05:53 pm »
Quote
The display temp on my soldering station is 842F (wide open). A hand-held IR tester shows the tip to be only about 275F. Sounds like a stupid question, but should the tip be at the same temperature as the set temp?
Make sure you set the right emissivity on your IR thermometer.
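As a rough illustration of how much that matters, here is a minimal sketch using the Stefan-Boltzmann T^4 law as a first-order model. Real guns sense a narrow infrared band and also see reflected ambient radiation, and the 0.1 emissivity for a shiny tip is an assumed round number, so treat the output as ballpark only.

Code: (Python)
# First-order model of an emissivity mismatch on an IR thermometer.
# Assumes pure Stefan-Boltzmann (T^4) behaviour and ignores reflected
# ambient radiation, so the numbers are ballpark only.

def indicated_temp_c(true_temp_c, true_emissivity, set_emissivity=0.95):
    """Temperature the gun would display, given its emissivity setting."""
    true_k = true_temp_c + 273.15
    indicated_k = true_k * (true_emissivity / set_emissivity) ** 0.25
    return indicated_k - 273.15

# A shiny metal tip might have an emissivity near 0.1 (assumed value),
# while most guns default to about 0.95:
shown_c = indicated_temp_c(450.0, 0.1)   # 450 C is the 842 F setpoint
print(f"{shown_c:.0f} C, i.e. {shown_c * 9 / 5 + 32:.0f} F")  # ~139 C / ~282 F

That lands in the same ballpark as the 275F reading in the first post, so an emissivity setting left at the default could plausibly explain most of the discrepancy.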
 
The following users thanked this post: Psi

Offline KL27x

  • Super Contributor
  • ***
  • Posts: 4103
  • Country: us
Re: Soldering iron display temp vs tip temp
« Reply #3 on: May 18, 2019, 06:10:10 pm »
To add to this:

If the calibration is correct AND the tip is not under any load (just sitting in free air), then you would expect the tip to reach the set temp, more or less.

When you press the tip against an effective heat sink, you will undoubtedly see a gradient.

All that said, a non-contact IR gun isn't consistent across all types of surfaces. IME, on a shiny metal surface I would expect the reading to be way low.

edit:
Quote
Make sure you set the right emissivity on your IR thermometer.
I guess that's how you correct it.
 

Offline billbyrd1945Topic starter

  • Regular Contributor
  • *
  • Posts: 203
  • Country: us
Re: Soldering iron display temp vs tip temp
« Reply #4 on: May 18, 2019, 09:01:04 pm »
Thanks for all the responses. Since posting this I watched a review that Dave did wherein he used some sort of Hakko device for measuring the temp. The temp at the tip was essentially the same as the displayed temp. Neither of my two meters will accept a thermocouple. Bingo: excuse for a new DMM.
 

Offline MosherIV

  • Super Contributor
  • ***
  • Posts: 1530
  • Country: gb
Re: Soldering iron display temp vs tip temp
« Reply #5 on: May 18, 2019, 10:43:25 pm »
Quote
Neither of my two meters will accept a thermocouple. Bingo: excuse for a new DMM.
Nope!
An ordinary thermocouple won't measure soldering iron temperatures correctly!
You need the same device that Dave used! It is basically a special thermocouple that is designed to be soldered to the tip, and by being soldered, it can measure the actual tip temperature.

If you solder to a normal K-type thermocouple, you will ruin it. You will have mixed another metal with the 2 original metals, degrading its voltage-generating properties.

Do not let this stop you getting another DMM  ;)
Use this as an excuse to get a soldering iron thermometer/calibrator instead  :-+
 

Offline A Hellene

  • Frequent Contributor
  • **
  • Posts: 602
  • Country: gr
Re: Soldering iron display temp vs tip temp
« Reply #6 on: May 18, 2019, 11:20:27 pm »
There are specialised thermometers out there (with specially formed thermocouple sensors) for reading soldering iron tip temperatures.



--George
Hi! This is George; and I am three and a half years old!
(This was one of my latest realisations, now in my early fifties!...)
 

Online wraper

  • Supporter
  • ****
  • Posts: 16864
  • Country: lv
Re: Soldering iron display temp vs tip temp
« Reply #7 on: May 18, 2019, 11:44:12 pm »
Quote
The display temp on my soldering station is 842F (wide open). A hand-held IR tester shows the tip to be only about 275F. Sounds like a stupid question, but should the tip be at the same temperature as the set temp?
Why did you expect it to be able to do the job to begin with? IR thermometers have quite a wide measurement area, and the laser pointer is at least half an inch off center.
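To put a number on that, here is a quick sketch; the 12:1 distance-to-spot ratio is an assumed typical spec, not from any particular gun, so check the one you own.

Code: (Python)
# The distance-to-spot ratio of an IR gun sets the diameter it
# averages over; 12:1 is a common spec (assumed here, check yours).

def spot_diameter_mm(distance_mm, d_to_s_ratio=12.0):
    return distance_mm / d_to_s_ratio

print(f"{spot_diameter_mm(300):.0f} mm spot at 300 mm distance")  # 25 mm
# A 1-2 mm iron tip covers only a few percent of a 25 mm spot, so the
# gun mostly reads the cooler background behind the tip.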

 

Offline billbyrd1945Topic starter

  • Regular Contributor
  • *
  • Posts: 203
  • Country: us
Re: Soldering iron display temp vs tip temp
« Reply #8 on: May 19, 2019, 02:11:13 am »
"Use this as an excuse to get a soldering iron thermometer/calibrator instead  :-+"

Exactly! Far better idea!
 

Offline billbyrd1945Topic starter

  • Regular Contributor
  • *
  • Posts: 203
  • Country: us
Re: Soldering iron display temp vs tip temp
« Reply #9 on: May 19, 2019, 02:14:04 am »
"Why did you expect it able doing the job to begin with?"
You know how it is. You don't have anything that even claims to be the right tool. But then you think about that thing that you do have (the IR gun). I posted my question because I had no idea if the IR readings were meaningful.
 

Online wraper

  • Supporter
  • ****
  • Posts: 16864
  • Country: lv
Re: Soldering iron display temp vs tip temp
« Reply #10 on: May 19, 2019, 02:52:08 am »
"Use this as an excuse to get a soldering iron thermometer/calibrator instead  :-+"

Exactly! Far better idea!
If it works, do not bother. The needed temperature depends a lot on the soldering iron, the particular job (a 1-layer PCB, or a hefty multi-layer PCB that sinks a lot of heat), and the tips used. In any case, you should set the temperature based on soldering results rather than the temperature reading. Once you touch a solder joint, the temperature is not the same as it was at idle. In practice, a soldering iron temperature calibrator is really only useful if you have a lot of stations of the same type (like an assembly line) and you want to set them all exactly the same to ensure process consistency.
« Last Edit: May 19, 2019, 02:58:02 am by wraper »
 

Offline Shock

  • Super Contributor
  • ***
  • Posts: 4218
  • Country: au
Re: Soldering iron display temp vs tip temp
« Reply #11 on: May 19, 2019, 03:12:21 am »
Watch this video by our forum member Joe Smith. There are several factors at play: shorting the thermocouple, or exceeding the operating temp of either the thermocouple or the meter, can produce different results.

The Pace station shown in the video has fairly accurate regulation, as you can see. On other stations, calibration and regulation may have wider swings, so that's not necessarily a defect, just poorer regulation.

The latest Pace station has single-degree accuracy and doesn't require calibrating, so that's one less thing to mess around with. But not all manufacturers necessarily show what's really going on with tip temp. Since it's software, some stations will not display more than the user-set temp, regardless of the actual tip temp.

Soldering/Rework: Pace ADS200, Pace MBT350
Multimeters: Fluke 189, 87V, 117, 112   >>> WANTED STUFF <<<
Oszilloskopen: Lecroy 9314, Phillips PM3065, Tektronix 2215a, 314
 
The following users thanked this post: billbyrd1945

Offline A Hellene

  • Frequent Contributor
  • **
  • Posts: 602
  • Country: gr
Re: Soldering iron display temp vs tip temp
« Reply #12 on: May 19, 2019, 07:41:36 am »
I am sorry, I forgot to mention where and for how much I got that soldering tip thermometer above (no affiliations whatsoever --just a happy customer):

1. FG-100 Digital Soldering Iron Tips Thermometer with 5 temperature Sensors 0..700°C (€9.75)
2. 20 x FG-100 temperature sensors 0..700°C (€5.58)

-George
Hi! This is George; and I am three and a half years old!
(This was one of my latest realisations, now in my early fifties!...)
 
The following users thanked this post: billbyrd1945

Online magic

  • Super Contributor
  • ***
  • Posts: 6779
  • Country: pl
Re: Soldering iron display temp vs tip temp
« Reply #13 on: May 19, 2019, 08:18:35 am »
Quote
If you solder to a normal K-type thermocouple, you will ruin it. You will have mixed another metal with the 2 original metals, degrading its voltage-generating properties.
I broke the thermocouple which came with my DMM, twisted the wires together, soldered them, trimmed the excess, and it works again. How come? >:D
 

Offline billbyrd1945Topic starter

  • Regular Contributor
  • *
  • Posts: 203
  • Country: us
Re: Soldering iron display temp vs tip temp
« Reply #14 on: May 19, 2019, 02:57:09 pm »
"If it works, do not bother."
Well, that conflicts with my desire to learn how things work. If I need the full 842F to do ALL soldering, does that mean my iron is really only reaching maybe 400F at that setting? If so, I have no headroom for dialing up hotter temps, and that would mean my preset buttons have no purpose. It also seems odd that the small-diameter solder used for through-hole soldering would need such a setting. The only explanation that seemed reasonable to me was that the displayed temp was perhaps double the actual temp, and that maybe I needed to think about what happens if the wide-open setting becomes less than adequate. I don't want to be stuck waiting for a new iron or some parts to come in.
« Last Edit: May 19, 2019, 11:38:49 pm by billbyrd1945 »
 

Offline billbyrd1945Topic starter

  • Regular Contributor
  • *
  • Posts: 203
  • Country: us
Re: Soldering iron display temp vs tip temp
« Reply #15 on: May 19, 2019, 03:00:54 pm »
Thanks. Will order right away.
 

Offline billbyrd1945Topic starter

  • Regular Contributor
  • *
  • Posts: 203
  • Country: us
Re: Soldering iron display temp vs tip temp
« Reply #16 on: May 19, 2019, 03:16:25 pm »
So, what I'm taking from this video is that a type K thermocouple can be used by soldering a bridge from it to the iron, and by keeping them motionless. Yes? No? But, at the low price of the device suggested by A Hellene, I think I'll just go that route. Thanks to all.
 

Online wraper

  • Supporter
  • ****
  • Posts: 16864
  • Country: lv
Re: Soldering iron display temp vs tip temp
« Reply #17 on: May 19, 2019, 05:17:02 pm »
I don't get what the "wide-open setting" you mentioned even means. If the tip were really at 400°F, lead-free solder would not melt at all, and tin-lead solder would only barely start to melt.
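A quick check of that claim, using published eutectic/liquidus figures; SAC305 is assumed here as the lead-free example since the thread doesn't name an alloy.

Code: (Python)
# Compare a 400 F tip against common solder melting points.
# SAC305 is assumed as the lead-free example (liquidus ~217 C).

def f_to_c(f):
    return (f - 32) * 5 / 9

tip_c = f_to_c(400)                       # ~204 C
alloys = {"Sn63/Pb37 (tin-lead)": 183, "SAC305 (lead-free)": 217}
for name, melt_c in alloys.items():
    print(f"{name}: melts at {melt_c} C, margin {tip_c - melt_c:+.0f} C")
# Tin-lead has only ~21 C of margin; lead-free would not melt at all.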
« Last Edit: May 19, 2019, 05:19:43 pm by wraper »
 

Offline Shock

  • Super Contributor
  • ***
  • Posts: 4218
  • Country: au
Re: Soldering iron display temp vs tip temp
« Reply #18 on: May 19, 2019, 07:28:08 pm »
842F/450C is the max temp of the station, but that is too hot for most soldering. You want to use as little heat as possible while still getting a fast melt and good wetting on different-sized joints.

The best way is to get a scrap board and remove and resolder some components, or start reflowing some joints. Typically 570F/300C to 660F/350C is a good middle ground, but it depends on your gear, your technique, the type of solder/flux, and what you're soldering. You will find a sweet spot with experience.

When you are soldering proficiently, standard through-hole joints take roughly a two-count from the moment you touch the tip and solder to the joint until removing them. So you can use this as a guide to see if you have enough temp to get a nice fillet and good wetting into the joint (inspect the board from the other side). Spending minimum time on the joint means the iron temp has less chance to damage the component. This video shows what to aim for.

Soldering/Rework: Pace ADS200, Pace MBT350
Multimeters: Fluke 189, 87V, 117, 112   >>> WANTED STUFF <<<
Oszilloskopen: Lecroy 9314, Phillips PM3065, Tektronix 2215a, 314
 
The following users thanked this post: billbyrd1945

Offline MosherIV

  • Super Contributor
  • ***
  • Posts: 1530
  • Country: gb
Re: Soldering iron display temp vs tip temp
« Reply #19 on: May 19, 2019, 08:28:22 pm »
Quote
I broke the thermocouple which came with my DMM, twisted the wires together, soldered them, trimmed the excess, and it works again. How come? >:D
I never said that adding solder will stop it working; it will change the calibration of the thermocouple, because the junction of 2 metals is no longer 2 metals, it is now 3.
A K-type thermocouple should be chromel and alumel, and it should give approximately 41 µV/°C.
As you can see, that is a very small voltage. Changing the metals will change the voltage generated, making the thermocouple less accurate than before.
https://en.wikipedia.org/wiki/Thermocouple#Type_K
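For a sense of scale, here is a minimal sketch using the linear ~41 µV/°C figure above; real meters use the standard NIST polynomials plus cold-junction compensation, so this is only to show how small the voltages are.

Code: (Python)
# Back-of-envelope K-type conversion using the ~41 uV/C figure above.
# Real meters use the NIST polynomial and cold-junction compensation.

SEEBECK_UV_PER_C = 41.0  # approximate K-type sensitivity

def temp_from_microvolts(v_uv, cold_junction_c=25.0):
    """Hot-junction temperature from loop voltage, linear approximation."""
    return cold_junction_c + v_uv / SEEBECK_UV_PER_C

# A 450 C tip against a 25 C cold junction develops only ~17 mV:
v_uv = (450 - 25) * SEEBECK_UV_PER_C
print(f"{v_uv / 1000:.1f} mV -> {temp_from_microvolts(v_uv):.0f} C")  # 17.4 mV -> 450 C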
 

Online wraper

  • Supporter
  • ****
  • Posts: 16864
  • Country: lv
Re: Soldering iron display temp vs tip temp
« Reply #20 on: May 19, 2019, 08:35:07 pm »
Quote
I broke the thermocouple which came with my DMM, twisted the wires together, soldered them, trimmed the excess, and it works again. How come? >:D
I never said that adding solder will stop it working; it will change the calibration of the thermocouple, because the junction of 2 metals is no longer 2 metals, it is now 3.
The 3rd metal in the junction barely matters, as it is the temperature difference along the wires that generates the voltage, not the junction itself. Although if there are temperature gradients across such a wacky junction, you may have issues. But in that case you would likely get an incorrect measurement regardless. Basically, you want a homogeneous temperature over the whole junction, with the temperature change happening only along the wires.
« Last Edit: May 19, 2019, 08:53:52 pm by wraper »
 

Online wraper

  • Supporter
  • ****
  • Posts: 16864
  • Country: lv
Re: Soldering iron display temp vs tip temp
« Reply #21 on: May 19, 2019, 09:00:43 pm »
Representations like the picture below make me cringe, because what it shows is basically shorting the thermocouple after the measurement point. The actual point of measurement is where the 2 wires first start touching each other, not where the junction is heated.

 

Offline billbyrd1945Topic starter

  • Regular Contributor
  • *
  • Posts: 203
  • Country: us
Re: Soldering iron display temp vs tip temp
« Reply #22 on: May 19, 2019, 11:41:20 pm »
"I don't get what "wide-open setting" you mentioned even means."

Slang for maximum setting. Comes from wide-open throttle.
 

Online magic

  • Super Contributor
  • ***
  • Posts: 6779
  • Country: pl
Re: Soldering iron display temp vs tip temp
« Reply #23 on: May 21, 2019, 04:34:42 pm »
Quote
I never said that adding solder will stop it working; it will change the calibration of the thermocouple, because the junction of 2 metals is no longer 2 metals, it is now 3.
It's not that bad; my probe measured 98°C in boiling water, which is close enough and still within spec for that lousy DMM.
It doesn't matter that another metal is inserted between the chromel and alumel parts if there is no thermal gradient across it.
The absolute Seebeck coefficient of metals appears to be typically a few µV/°C (link; sadly, tin isn't listed but lead is). It would presumably take about 10°C of difference across the solder layer to cause 1°C of error.
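That estimate checks out on paper. A quick sketch follows; the 4 µV/°C mismatch between the solder and the thermocouple wire is an assumed round number, not a measured value.

Code: (Python)
# Sanity check of the estimate above: an intermediate metal adds error
# roughly in proportion to the temperature drop across it.

K_TYPE_UV_PER_C = 41.0      # approximate K-type loop sensitivity
MISMATCH_UV_PER_C = 4.0     # assumed Seebeck mismatch, solder vs wire

for gradient_c in (1, 5, 10):
    error_c = MISMATCH_UV_PER_C * gradient_c / K_TYPE_UV_PER_C
    print(f"{gradient_c:>2} C across the solder -> ~{error_c:.2f} C error")
# 10 C across the solder -> ~0.98 C error, i.e. about 1 C as estimated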
 

