Author Topic: Soldering iron calibration  (Read 10172 times)


Offline G7PSK (Topic starter)

  • Super Contributor
  • ***
  • Posts: 3859
  • Country: gb
  • It is hot until proved not.
Soldering iron calibration
« on: February 07, 2012, 03:21:30 pm »
The instructions for the temperature-controlled SCI soldering iron that I recently purchased say that it should be recalibrated each time the bit is changed.
Is this really necessary? Is any form of calibration really required at all? I tested the iron with the thermocouple on my meter and found that the reading was 30 °C lower than indicated on the dial, but as I tend to drive it by feel (if it seems too hot I turn it down, and if too cool I turn it up) any scale markings would suffice; in fact color shading would do, blue for cool and red for hot with shades of yellow and orange in between.
The truth is I cannot really see why the temperature needs to be variable at all.
The 63/37 solder that I use is a eutectic alloy anyway, with the liquidus/solidus boundary on the phase diagram at 183 °C. About the only other solder I use is 60/40, which has a solidus of 183 °C and a liquidus of 190 °C, a transition range of just 7 °C. As long as the iron is hot enough to overcome the temperature gradient, and not so hot as to burn the flux or destroy the components, I cannot see the need for so great a range, or indeed any range at all, as long as the temperature is stable.
So why the fuss about calibrating, and why the wide temperature range on the irons?
 

alm

  • Guest
Re: Soldering iron calibration
« Reply #1 on: February 07, 2012, 07:55:09 pm »
Calibration is not very useful when you just adjust it until it melts the solder fast enough. I see it mainly as useful in a production setting, where you want all joints to be the same. This allows you to have an ISO 9001 certified process of producing cold joints ;).

The main reason to change the temperature is to compensate for finite power and heat conductivity. If your iron had infinite power, the pads would reach the iron temperature instantly. In the real world, it takes time for the iron to heat the pads. Setting the iron to 190°C is unlikely to work very well, since the joint will always be colder. A hotter iron gets the pads above the melting range faster, overcoming their thermal inertia. Dirty pads/components or old solder with little flux can make this problem worse. This is why desoldering or big ground planes may require higher temperatures. Lead-free would be another reason.
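To put rough numbers on that, here is a minimal first-order (lumped RC) sketch of a joint being heated through a fixed thermal resistance; the resistance and capacitance figures are invented purely for illustration, not measured from any iron:

Code: [Select]
import math

def time_to_liquidus(t_iron, t_ambient=25.0, t_liquidus=183.0,
                     r_thermal=30.0, c_joint=0.05):
    """Seconds for a joint, modelled as one thermal capacitance c_joint (J/K)
    fed through a thermal resistance r_thermal (K/W) from an ideally
    regulated tip at t_iron (deg C), to reach t_liquidus.
    Lumped model: T(t) = T_iron - (T_iron - T_amb) * exp(-t / (R*C))."""
    if t_iron <= t_liquidus:
        return math.inf  # the joint can never reach liquidus
    tau = r_thermal * c_joint
    return tau * math.log((t_iron - t_ambient) / (t_iron - t_liquidus))

for setpoint in (250, 300, 350):
    print(f"{setpoint} C tip -> liquidus in ~{time_to_liquidus(setpoint):.1f} s")

With these made-up values, dropping the setpoint from 350 °C to 250 °C nearly doubles the time the joint needs to reach 183 °C; dirty pads or a heavy joint just make the effective R and C worse.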
 

Offline saturation

  • Super Contributor
  • ***
  • Posts: 4787
  • Country: us
  • Doveryai, no proveryai
    • NIST
Re: Soldering iron calibration
« Reply #2 on: February 07, 2012, 08:06:33 pm »
It's a question we raised a while back, and there are good links and discussion:

https://www.eevblog.com/forum/general-chat/do-you-calibrate-your-soldering-station-if-not-or-if-so-why/30/
Best Wishes,

 Saturation
 

Offline benemorius

  • Regular Contributor
  • *
  • Posts: 173
Re: Soldering iron calibration
« Reply #3 on: February 07, 2012, 08:24:54 pm »
This is why desoldering or big ground planes may require higher temperatures. Lead-free would be another reason.

This.

Grab an old motherboard and try taking an electrolytic off using your normal temperature setting. Most of your heat will be carried away by multiple massive power planes, and you'll quickly see first-hand why your iron can go over 400 °C.
 

Offline wkb

  • Frequent Contributor
  • **
  • Posts: 905
  • Country: nl
Re: Soldering iron calibration
« Reply #4 on: February 07, 2012, 08:44:05 pm »
So why the fuss about calibrating, and why the wide temperature range on the irons?

Mankind got to the Moon and back without calibrating its soldering irons..  ::)
 

Offline G7PSK (Topic starter)

  • Super Contributor
  • ***
  • Posts: 3859
  • Country: gb
  • It is hot until proved not.
Re: Soldering iron calibration
« Reply #5 on: February 07, 2012, 08:54:42 pm »
Being able to turn the heat up does not really make up for a lack of thermal capacity, which comes only from having a larger chunk of copper in the soldering iron bit. My 40 watt temperature-controlled iron has only about a fifth of the copper that my 25 watt Antex iron has, and this lack of copper appears to be quite common with temperature-controlled models, probably to allow a faster heat-up time and because of the way the element is fitted to the bit. If I want to solder a really large chunk of metal I still reach for an externally heated soldering iron that I own; it has about two pounds of copper in its head and takes ten minutes to heat in a gas flame, but it does have the thermal capacity to get the job done.
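Just to put rough numbers on the stored-energy argument (all figures assumed for illustration, not measured):

Code: [Select]
# Rough stored-energy comparison for the "big lump of copper" argument.
CP_COPPER = 385.0  # J/(kg*K), specific heat of copper

def stored_energy_joules(mass_kg, temp_drop_k):
    """Heat a copper bit gives up while its temperature falls by temp_drop_k."""
    return mass_kg * CP_COPPER * temp_drop_k

big_head = stored_energy_joules(0.9, 50)    # ~2 lb externally heated head, 50 K drop
small_tip = stored_energy_joules(0.005, 50) # ~5 g tip on a small controlled iron

print(f"~2 lb head: ~{big_head / 1000:.0f} kJ available")  # roughly 17 kJ
print(f"~5 g tip:   ~{small_tip:.0f} J available")         # roughly 96 J

The 5 g figure is only my guess at a small temperature-controlled bit; the point is just that the big lump gives up a couple of orders of magnitude more heat for the same temperature drop.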
 

Online IanB

  • Super Contributor
  • ***
  • Posts: 11790
  • Country: us
Re: Soldering iron calibration
« Reply #6 on: February 07, 2012, 09:00:27 pm »
Thermal capacity is not really the answer as it will only help transient response. Most of the time a big thermal capacity in an iron is more of a hindrance than a help. It slows the heat up time and slows the thermal recovery time.

What you really need is heat flow (power) supplied effectively at the tip. A small tip with good temperature controlled heat flow will solder big things quite happily. The only restriction is the thermal path (thermal resistance) between the tip and the work. If the tip contact area is too small or too poor it will throttle the heat flow.
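A rough steady-state illustration of that point (the contact resistances are invented for the example): the sustained heat flow into the work is set by the temperature difference and the tip-to-work thermal resistance, not by how much heat the tip has stored.

Code: [Select]
# Steady-state heat flow through the tip-to-work contact: Q = dT / R_contact.
# R values are assumed purely for illustration.
def heat_flow_watts(t_tip_c, t_work_c, r_contact_k_per_w):
    """Sustained heat flow (W) from the tip into the work."""
    return (t_tip_c - t_work_c) / r_contact_k_per_w

print(heat_flow_watts(350, 183, 20.0))  # small or dry contact: ~8 W into the joint
print(heat_flow_watts(350, 183, 4.0))   # larger, well-wetted contact: ~42 W

With these invented numbers, improving the contact does far more for the joint than any plausible increase in tip mass, provided the station can actually supply the power.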
 

Offline wkb

  • Frequent Contributor
  • **
  • Posts: 905
  • Country: nl
Re: Soldering iron calibration
« Reply #7 on: February 07, 2012, 09:20:15 pm »
Thermal capacity is not really the answer as it will only help transient response. Most of the time a big thermal capacity in an iron is more of a hindrance than a help. It slows the heat up time and slows the thermal recovery time.

What you really need is heat flow (power) supplied effectively at the tip. A small tip with good temperature controlled heat flow will solder big things quite happily. The only restriction is the thermal path (thermal resistance) between the tip and the work. If the tip contact area is too small or too poor it will throttle the heat flow.

In short: a Metcal (or similar)
 

Offline G7PSK (Topic starter)

  • Super Contributor
  • ***
  • Posts: 3859
  • Country: gb
  • It is hot until proved not.
Re: Soldering iron calibration
« Reply #8 on: February 07, 2012, 09:59:05 pm »
If the mass of the iron is sufficiently greater than that of the item being soldered, it will not cool down appreciably during the soldering process. We managed to solder very well for hundreds of years without temperature controls; I think that to a large extent it just makes up for a lack of skill. I can see why such things are required in a factory: apart from anything else, it helps prevent the tip burning during the lunch break when the iron gets left on unattended. But from my perspective as someone who has spent the best part of the last forty years in the metal-joining industry, including the nuclear industry, I do not personally see its advantages.
 

Online IanB

  • Super Contributor
  • ***
  • Posts: 11790
  • Country: us
Re: Soldering iron calibration
« Reply #9 on: February 07, 2012, 10:19:50 pm »
It depends a little on what you are joining. If making mechanical joints, a large iron with significant mass may not be a disadvantage. But if making electrical joints, then small parts, lack of space around the joint, and the need for agility and flexibility will favour a small pencil-sized iron, light in weight and with a flexible cord.
 

Offline Kibi

  • Supporter
  • ****
  • Posts: 385
  • Country: england
Re: Soldering iron calibration
« Reply #10 on: February 07, 2012, 11:35:23 pm »
Thermal capacity is not really the answer as it will only help transient response. Most of the time a big thermal capacity in an iron is more of a hindrance than a help. It slows the heat up time and slows the thermal recovery time.

What you really need is heat flow (power) supplied effectively at the tip. A small tip with good temperature controlled heat flow will solder big things quite happily. The only restriction is the thermal path (thermal resistance) between the tip and the work. If the tip contact area is too small or too poor it will throttle the heat flow.

In short: a Metcal (or similar)

Yes, precisely. You don't quite know what you're missing out on until you use one of these types of iron. You can quite happily go from soldering little 1N4148 diodes to small traces in confined spaces to soldering great big 10 W resistors to big fat traces without even noticing any difference. The solder just flows as easily either way. Sure, if the trace is really big and the component is huge, you may need to use a bigger tip to be able to transfer more heat, but that is just plain physics.
 

Offline saturation

  • Super Contributor
  • ***
  • Posts: 4787
  • Country: us
  • Doveryai, no proveryai
    • NIST
Re: Soldering iron calibration
« Reply #11 on: February 08, 2012, 05:49:23 pm »
Having been raised on non-regulated irons, I know it can be done. You just have to get the touch, and use far less contact time than you would with a cooler iron.

However, setting temperature limits came out of studying failures, which gave us the MIL-SPEC and aerospace standards that became the civilian IPC standards.


Overall, the bottom line is simply that keeping your temperatures under 350 °C:

  • prolongs tip and heater life
  • reduces the risk of failures on soldered components

As modern components are smaller and more fragile, excessive heat is easily applied by irons kept hotter than ~350 °C. Many datasheets specify soldering limits, for example ~300 °C for 10 seconds or less; the relationship isn't linear, so it doesn't mean the part can take 400 °C for 7 seconds.

The manual adjustability on irons basically compensates for tip wear and brings the iron back into spec; it's not really meant to pump up the heat for large ground planes, although you could use it that way. It is better to use a tool or tip suited to that job than to wear out a tool designed for fine work.

Analog adjustable irons use a sensor far from the true tip, usually near the base, so the reported temperature is not the tip temperature; calibration compensates for this error. Tip wear can reduce conducted heat, the heating element can weaken, and so on, and all of these require readjustment. The need to 'recal' with each tip replacement is to compensate for different tip geometries: a larger tip can run cooler than the station reports if the station was calibrated against a smaller tip.

You only need to recalibrate if you must know the temperature for certain, and if you have equipment that makes such tests convenient.

In practice, as the station ages and drifts out of calibration, soldering becomes difficult, so you bump up the adjustment without calibrating. I suspect that, when adjusted by an experienced user, the measured tip temperature ends up quite close to what a formal calibration would give.

For non-production labs, you can field-calibrate a tip without resorting to a tip thermometer simply by checking whether it melts eutectic solder, since that goes to liquidus within about 1 °C.

There are stations that need no calibration because the sensor is essentially at the tip, such as Metcal, and they have no adjustment. The station will deliver up to its rated power to maintain the standardized tip temperature, even when touching a large heat sink such as a ground plane.
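As a crude illustration of correcting for that base-sensor offset (the calibration points are invented, loosely following the ~30 °C error the OP measured):

Code: [Select]
def dial_for_tip_temp(wanted_tip_c, cal_points):
    """Two-point linear correction: given (dial_setting, measured_tip_c) pairs,
    return the dial setting expected to give wanted_tip_c at the tip.
    Purely illustrative; real stations drift and different tips read differently."""
    (d1, t1), (d2, t2) = cal_points
    slope = (d2 - d1) / (t2 - t1)
    return d1 + slope * (wanted_tip_c - t1)

# Tip measured roughly 30 C below the dial, much as in the opening post
measured = [(300, 270), (400, 365)]
print(f"Dial about {dial_for_tip_temp(330, measured):.0f} for a 330 C tip")

That is really all a formal calibration does for an analog station, and the eutectic-solder melt check above gives you one such point for free.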


« Last Edit: May 28, 2012, 12:24:19 pm by saturation »
Best Wishes,

 Saturation
 

