EEVblog Electronics Community Forum

General => General Technical Chat => Topic started by: Regected on June 17, 2017, 01:26:36 pm

Title: Lab equipment argument
Post by: Regected on June 17, 2017, 01:26:36 pm
I work as a technician for a small engineering firm specializing in oil well drilling equipment.  All of our electrical engineers are PhD students straight out of university.  They have no real-world experience outside of this job and as such have some very poor habits.  They think their experience at university is the be-all and end-all of how to do things.

We have been producing tools for 6 years now, with good success in our designs.  We do what we do with as little investment in lab equipment as possible.  I'm talking these guys are more than happy to forgo any Fluke or better meter in favor of one of those crappy $10 meters with the built-in transistor tester.  This drives me nuts.  I've designed a transfer standard to, at the very least, allow an in-house comparison of the multimeters, power supplies and whatnot, just so we know.  That's when my supervisor informed me that I was overthinking this, and that none of the equipment needs to be checked.  His argument is that if the meters are within 10%, then the readings are close enough.  My mind is just blown at how he can think that.  He also went on, in great depth and at overly long length, to explain that the equipment never needs to be calibrated, as we've never had any problems.  I'm just gobsmacked.  What do you say to someone with this mindset?
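For context, the transfer standard comparison I have in mind is nothing exotic; the bookkeeping amounts to a few lines. A minimal sketch in Python, with a hypothetical 5 V reference and made-up meter readings:

```python
# Minimal sketch of an in-house cross-check against a transfer standard.
# The reference value and all readings are hypothetical; in practice each
# meter measures the same stable reference back to back.

REFERENCE_V = 5.000          # value assigned to the transfer standard
TOLERANCE = 0.10             # the supervisor's "within 10%" claim

readings = {                 # meter -> reading of the reference, in volts
    "bench DMM #1": 4.987,
    "bench DMM #2": 5.120,
    "$10 handheld": 5.610,
}

for name, value in readings.items():
    error = (value - REFERENCE_V) / REFERENCE_V
    verdict = "OK" if abs(error) <= TOLERANCE else "FAIL"
    print(f"{name:14s} {value:6.3f} V  error {error:+7.1%}  {verdict}")
```

Even a crude check like that would at least tell us whether the $10 meters are anywhere near the claimed 10%.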
Title: Re: Lab equipment argument
Post by: CatalinaWOW on June 17, 2017, 01:45:21 pm
Sounds to me like you might both be wrong. 

I don't know your product, but it is entirely believable to me that 10% accuracy is all that is required.  That is true of a great many things.  Would sales improve if higher accuracy were used?  Would reliability improve?  Good questions, but the answers are not obvious.

On the other hand, even 10% accuracy requires some form of verification.  It doesn't necessarily need to be traceable to NIST, but it needs to be credible that it is within 10%.  A transfer standard is one way to achieve this, but there are many others.  Obviously, at the 10% accuracy level, none of them are costly.

Questions for your boss might be: What happens if our meters are 50% off?  How would we know that they are within 10%?  You may find some resistance even to this, since errors of this magnitude are really unusual (but not unheard of).  If you have personal knowledge, or even better, records, it can be worth bringing up the circumstances.

It is a risk management question, and the low probability of occurrence doesn't justify large levels of effort.
Title: Re: Lab equipment argument
Post by: xrunner on June 17, 2017, 01:57:08 pm
On the other hand, even 10% accuracy requires some form of verification.  It doesn't necessarily need to be traceable to NIST, but it needs to be credible that it is within 10%.  A transfer standard is one way to achieve this, but there are many others.  Obviously, at the 10% accuracy level, none of them are costly.

Right. There is no way to have confidence that readings are within 10% if nothing has been calibrated.  :-//

That's when my supervisor informed me that I was overthinking this, and that none of the equipment needs to be checked.  His argument is that if the meters are within 10%, then the readings are close enough.

Where did he get the 10% criterion? He pulled it out of the place where the sun don't shine.

I think he just used that remark "within 10%" as a substitute for "leave me alone and don't worry about it." I don't know what you make, but it would require some kind of analysis to show whether or not a >10% error in some measurements would put the tool, or whatever it is, out of tolerance for your use. It may be that DMM readings could be out by >25% and things would be OK, but the point is nobody really knows, if I understand correctly?
Title: Re: Lab equipment argument
Post by: yada on June 17, 2017, 02:02:27 pm
And we wonder why there are so many oil spills and blowouts.
Title: Re: Lab equipment argument
Post by: SeanB on June 17, 2017, 07:12:34 pm
I would ask him if he would be happy getting 10% less fuel for his dollar at the pump, or 10% less pay, as that is within his accuracy spec.  Would he be happy if the equipment he supplied was 10% too large, and thus would not fit down the drill string, or if a 10% error meant they had to pay for a blowout when a valve failed from overpressure that the display showed as low?

Does he want to have an industrial incident named after him?
Title: Re: Lab equipment argument
Post by: slurry on June 17, 2017, 07:42:49 pm
As CatalinaWOW said, it might not be a problem, but if you don't have any reference at all there's no way to know that you are even in the ballpark.
There is, of course, always a balance between the cost of an investment and its outcome,
but a bunch of Flukes and sensible power supplies should not pose a big problem even for a smaller company.

I once had a similar problem convincing management that we had to upgrade our instruments from crappy ones to less crappy ones,
all to no avail despite perfectly good arguments... so I left for another company.
Sometimes there is no use wasting time and effort on a company that does not want to improve its quality or production rate.
Title: Re: Lab equipment argument
Post by: kalel on June 17, 2017, 07:56:53 pm
As CatalinaWOW said, it might not be a problem, but if you don't have any reference at all there's no way to know that you are even in the ballpark.
There is, of course, always a balance between the cost of an investment and its outcome,
but a bunch of Flukes and sensible power supplies should not pose a big problem even for a smaller company.

I once had a similar problem convincing management that we had to upgrade our instruments from crappy ones to less crappy ones,
all to no avail despite perfectly good arguments... so I left for another company.
Sometimes there is no use wasting time and effort on a company that does not want to improve its quality or production rate.

I'd guess they need to somehow be convinced that upgrading the equipment/quality of work/employee satisfaction will make them more money. Does it seem logical? For the latter two, definitely yes, but they won't always agree, and convincing them might be difficult or impossible. E.g. some might think that a satisfied employee might not be as hard-working as one who gets a lot of criticism. Everyone thinks differently. Anyway, if they can somehow be convinced that it will result in them making more money, they are much more likely to listen.
Title: Re: Lab equipment argument
Post by: Electro Detective on June 17, 2017, 08:22:08 pm
You tried and got as far as it gets; let it go and keep your job.
An apparently fair and reasonable supervisor can quickly turn into a sniping POS if his boat gets rocked.

If their 10% tolerance acceptance means no personal danger or financial disadvantage to you, then let them roll with that   :horse:

It's not your business, therefore not your problem.   :-+

Title: Re: Lab equipment argument
Post by: rstofer on June 17, 2017, 08:57:43 pm
Ever done a sensitivity analysis on the design?  This type of analysis assumes certain tolerances and iterates through various combinations to see how the output is affected.  It's a pretty complex undertaking.  LTspice allows for component tolerances but I have never tried it.  I remember doing it with IBM's Electronic Circuit Analysis Program (ECAP) back around '70 or so.  Lots and lots of paper printed doing that!
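These days you don't need a mainframe for a crude version; a brute-force Monte Carlo pass fits in a few lines. A sketch in Python (the divider values and ±10% tolerances are invented for illustration, not taken from anyone's design):

```python
# Crude Monte Carlo tolerance analysis: how far does the output of a
# resistive divider spread when both resistors vary within tolerance?
# All values are hypothetical.
import random

VIN = 24.0
R1_NOM, R2_NOM = 10_000.0, 5_000.0
TOL = 0.10                              # +/-10% parts

samples = []
for _ in range(100_000):
    r1 = R1_NOM * random.uniform(1 - TOL, 1 + TOL)
    r2 = R2_NOM * random.uniform(1 - TOL, 1 + TOL)
    samples.append(VIN * r2 / (r1 + r2))

nominal = VIN * R2_NOM / (R1_NOM + R2_NOM)
print(f"nominal {nominal:.2f} V, "
      f"spread {min(samples):.2f} V to {max(samples):.2f} V")
```

The same loop, pointed at the parameter you actually care about, tells you whether ±10% parts (or ±10% meters) matter at all.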

The thing is, if there is no analysis, how can you argue that ±10% (or more) is a problem?  It's kind of like measuring the voltage of a 24V doorbell transformer: don't be surprised if it reads closer to 30V, but the doorbell doesn't care.

So, what kind of electronics are we talking about?  Doorbells or FPGAs?

BTW, when Motorola invented the Six Sigma process, they did it to eliminate the requirement to test pagers.  They designed the entire assembly, electronics and all, such that it would always work properly.  If 3.4 in a million failed, so what?  Replace them and move on.  See the first two sentences here:

http://thequalityportal.com/q_6sigma.htm
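For the curious, the 3.4-per-million figure falls straight out of the normal distribution: it is the one-sided tail at 6 sigma minus the customary 1.5-sigma allowance for long-term drift of the process mean. A quick check using only the Python standard library (my sketch of the arithmetic, not Motorola's):

```python
# Where the famous 3.4 defects-per-million comes from: the one-sided
# normal tail at (6 - 1.5) sigma, i.e. a 6-sigma process with the
# customary 1.5-sigma allowance for long-term mean drift.
from math import erfc, sqrt

def dpmo(sigma_level, shift=1.5):
    z = sigma_level - shift
    tail = 0.5 * erfc(z / sqrt(2))      # P(X > z), standard normal
    return tail * 1e6

print(f"{dpmo(6.0):.1f} defects per million")   # -> about 3.4
```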


Title: Re: Lab equipment argument
Post by: innkeeper on June 17, 2017, 09:11:15 pm
Unless the meters are tested (certified) regularly, how do they know they're within 10%?
But that is true even for your Fluke.

The Fluke is less likely to drift out of spec anytime soon, as we know. I'm preaching to the choir when I say the pitfalls of cheap meters are many; they are well documented on this site and I wouldn't even think of defending using them. But frankly, that doesn't mean they can't or don't work.

I've been in the unfortunate position of having to work with cheap gear out of necessity. You can make it work for you. But if it was my livelihood, with my business and reputation on the line, I would do what I could to get reliable equipment for the job and make sure it is accurate. Sure, it doesn't have to be a Fluke; there are less expensive options. But it doesn't have to be some no-name Chinese-made meter that not even the engineers in China would buy and use.

This is one of those life lessons they can't afford to learn the hard way, especially if one bad reading from an out-of-spec meter can cause people or the company harm.  This frankly isn't a matter of opinion, but a matter of quality control.  You have to question the ethics of a company that would say "well, we never had an issue before." I hope they never make airplanes. You wonder how many other things they handle that way.
Title: Re: Lab equipment argument
Post by: CopperCone on June 18, 2017, 01:04:09 am
Like I said, there needs to be a smiley that looks like a money-crazed madman: $_$.

That's all it is. This is common in the industry. Why? Because people get props from some idiot because they 'keep costs down'.  It means people's understanding of physics is poor. It means people don't understand failure modes and the things that elevate failure probabilities. It means your management is high on yayo.

Hilarious. For instance, testing the battery in a caliper. Some calipers have their scales change when the battery goes down (GOOD BRANDS). It could be a gain error where some test gauge block reads correctly, but something smaller does not. Some pipe fitting is inspected with it. It results in a gas leak. Boom. Could an electrical measurement prevent this problem? Sure.

But this is basically "yeah, it won't happen, the chances of that are zilch", etc. Basically you won't get anywhere talking to these people because they're greedy.  :palm:

This is common everywhere. Then someone just has to say 'he is paranoid and costing us money, ignore him'.

But then, when a gas leak occurs because some meter is fucked up... :popcorn:

Title: Re: Lab equipment argument
Post by: CopperCone on June 18, 2017, 01:09:26 am
Ever done a sensitivity analysis on the design?  This type of analysis assumes certain tolerances and iterates through various combinations to see how the output is affected.  It's a pretty complex undertaking.  LTspice allows for component tolerances but I have never tried it.  I remember doing it with IBM's Electronic Circuit Analysis Program (ECAP) back around '70 or so.  Lots and lots of paper printed doing that!

The thing is, if there is no analysis, how can you argue that ±10% (or more) is a problem?  It's kind of like measuring the voltage of a 24V doorbell transformer: don't be surprised if it reads closer to 30V, but the doorbell doesn't care.

So, what kind of electronics are we talking about?  Doorbells or FPGAs?

BTW, when Motorola invented the Six Sigma process, they did it to eliminate the requirement to test pagers.  They designed the entire assembly, electronics and all, such that it would always work properly.  If 3.4 in a million failed, so what?  Replace them and move on.  See the first two sentences here:

http://thequalityportal.com/q_6sigma.htm

If this is the company culture, with 'experience' as the rationale, then this kind of study will get cooked and everything you come up with will be dismissed as 'low probability'.

Also, this is dog shit, because without a test well you are just bowing to some statistical god. I reckon crack cocaine is behind these decisions.
 
If you work for a place like that, it's basically like being a whore. Hopefully you are looking for an exit from the business. That is, if you told most consumers of the product what you allow to happen, or what you are complacent in making money off of, they would get upset with you, kind of like how prostitutes are held in low social standing for causing marital issues and disease spread (in some places). And of course the management team is basically pimp-slapping you and telling you to suck it.
Title: Re: Lab equipment argument
Post by: Brumby on June 18, 2017, 01:38:51 am
Without knowing the specifics of the situation, criticizing the "10% is good enough" attitude is not really valid.  Values might still be acceptable if they varied 20% from nominal.  Take a 7805 or an SMPS: their input can vary enormously, yet they still deliver the correct voltage.  Checking that voltage, however, could become a problem.

The most valid comment I have seen above is about whether the meter being used can be trusted.  Calibration, or at least a comparison of some sort, needs to be in the mix somewhere.

How to get that message through? ... Unfortunately, I can see it only coming about from a problem (which, hopefully, won't be a disaster) encountered by using a meter that is way off.

The question then becomes - how likely is that?  Even my oldest and most neglected meters (that still work) are still well within 10% of my latest acquisition (BM235).
Title: Re: Lab equipment argument
Post by: Tomorokoshi on June 18, 2017, 05:19:32 am
There are industrial standards that apply to any number of products and processes. What are some of the standards your company uses as they apply to the products? For instance:

- UL, ANSI, ISO, etc.
- NFPA, NEC, etc.
- Hazardous Location
- EMC

Standards apply. The implementation of those standards will at some point require tests, and those often require calibrated equipment. That's why larger organizations have metrology and calibration departments, along with procedures that enforce the calibration schedule.

For the fun of it, look up a competitor and see what they have for marking, labeling, and standards listed in their manuals. Is your organization doing anything similar?
Title: Re: Lab equipment argument
Post by: ed_reardon on June 18, 2017, 09:07:26 am
Speaking of standards, I work for a company with the complete opposite problem.

We are bound by a host of regulations and standards (with which the company I work for readily and strictly complies), as we deal with mission-critical systems where there could be a wide-ranging impact on the general public if something went wrong.

As such, all staff are issued a Fluke 87V as part of their standard toolkit, and these are returned to the local lab for validation and calibration every 180 days.

Trouble is, 50% of the staff with these meters simply do not have the skills or inclination to use them, another 25% only use them to check for the presence of mains voltage or other rather rudimentary requirements, and only a handful of staff may ever use them for anything in-depth.

So essentially, hundreds of meters still in their retail packages are returned for traceable validation every 180 days, and it's the only time they see sunlight.

Maybe we should come to some sort of swap :P
Title: Re: Lab equipment argument
Post by: hans on June 18, 2017, 10:21:48 am
From the supplied story, it's unclear how the 10% was established.

Is it a number pulled out of the sky?
Or did they calculate it?

There are many products that do not justify the need for an accurate, industrial-grade multimeter.
If you're working with low-energy, low-accuracy stuff (say you're making USB webcams), then who cares about the accuracy and CAT IV ratings on a Fluke? How big is the chance such a meter will drift or be thrown out of spec due to an 'event'? Probably very, very small.

For example, I cannot justify spending $200 on such a meter for my hobby lab. I'd rather spend that money on a new logic analyzer. I don't design precision analog, power, or mains equipment, so I have no need for it.

Do more with less.
Title: Re: Lab equipment argument
Post by: Neomys Sapiens on June 18, 2017, 05:50:26 pm
Of course, it is not necessary to discard an instrument over a minor deviation from its specs (just record it), or to have some QC honcho place a 'do not use' sticker on it because calibration is one day overdue.

But if the accuracy of the instrument is not known, the error of the measurement will be unknown too. And if that is the case, what criteria are applied to a result, and how are they described? It should be sufficient to lay a trail leading there for the next audit, and the result should be scathing!

If money is the only language understood, the magic word is LIABILITY. I have used this approach often enough to build up some pressure. Here, the QA department can be really helpful. On the other hand, I did agree to some compromise solutions regarding calibration intervals, as long as I could cross-check my instruments to some degree. Better to calibrate at a 2 or 3 year interval for instruments used in a benign environment than not at all.
Title: Re: Lab equipment argument
Post by: CopperCone on June 18, 2017, 06:06:18 pm
Speaking of standards, I work for a company with the complete opposite problem.

We are bound by a host of regulations and standards (with which the company I work for readily and strictly complies), as we deal with mission-critical systems where there could be a wide-ranging impact on the general public if something went wrong.

As such, all staff are issued a Fluke 87V as part of their standard toolkit, and these are returned to the local lab for validation and calibration every 180 days.

Trouble is, 50% of the staff with these meters simply do not have the skills or inclination to use them, another 25% only use them to check for the presence of mains voltage or other rather rudimentary requirements, and only a handful of staff may ever use them for anything in-depth.

So essentially, hundreds of meters still in their retail packages are returned for traceable validation every 180 days, and it's the only time they see sunlight.

Maybe we should come to some sort of swap :P

"Hey John, let me borrow your meter for something, I don't feel like going to my desk."
Title: Re: Lab equipment argument
Post by: Brumby on June 19, 2017, 12:09:34 am
From the supplied story, it's unclear how the 10% was established.

Is it a number pulled out of the sky?
Or did they calculate it?

My vote: "a number pulled out of the sky".

The point you subsequently make about not needing high accuracy is probably the origin of the attitude that plucked the 10% figure.
Title: Re: Lab equipment argument
Post by: Regected on June 20, 2017, 12:20:15 am
Thank you for all the valuable information.  This is much to digest.

As for the question of what we make, it's high reliability logging tools for drilling oil wells.  This involves everything from power generation to FPGAs to RF equipment.  Most everything is low power, but we do have a few high current/high voltage modules.  Our tools are low volume, high value products.  Generally, each tool shipment (5 to 10 tools) is customized for the designated customer.

As for the 10 percent, given the general nature of the conversation, I'm leaning toward it being a number he pulled out of his rear.  He's not exactly the type of person to calculate stuff like that unless absolutely necessary.  As a matter of fact, I'm certain that he would not have, as he only works with ideal values.  We ran into a problem with this years ago when troubleshooting an RF output stage.  Out of 6 engineering samples, only two would work correctly.  After weeks of him digging, his boss finally took it over.  I was not privy to the exact solution, but it boiled down to the gate threshold voltage varying too much from chip to chip.

I think I may find a different channel for funding for calibration equipment.  It's not like I'm trying to spend tens of thousands of dollars.  Heck, this all got started on $400 worth of parts.  I've started throwing around a new phrase to describe this crap.  Penny wise and dollar dumb.
Title: Re: Lab equipment argument
Post by: CatalinaWOW on June 20, 2017, 01:14:58 am
Since you don't trust your boss's answer, where are you going to go to find out how well you have to calibrate your instrumentation?  Is 0.0001% good enough?  If you go out and set up a calibration lab in the absence of this answer, you are being as foolish in your own way as he apparently is.
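If you want a defensible number rather than a guess, the usual metrology rule of thumb is a test uncertainty ratio of about 4:1 against the actual product tolerance. A rough sketch in Python (the 5% product tolerance is a made-up example; plug in whatever your product really has to hold):

```python
# Rough answer to "how good do the meters need to be?" using the common
# 4:1 test-uncertainty-ratio rule of thumb. The product tolerance here
# is hypothetical.
PRODUCT_TOLERANCE = 0.05     # parameter under test must hold +/-5%
TUR = 4.0                    # desired test uncertainty ratio

required_meter_accuracy = PRODUCT_TOLERANCE / TUR
print(f"meter needs to be within +/-{required_meter_accuracy:.2%}")
# -> +/-1.25%: far looser than 0.0001%, far tighter than "who knows"
```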

Title: Re: Lab equipment argument
Post by: f5r5e5d on June 20, 2017, 01:51:05 am
At the least, you need to be able to verify that supply voltages are safely less than the datasheet absolute maximum values for chips and caps.
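Something like this worst-case check, sketched in Python with hypothetical numbers, makes the point: an unverified 10% meter error eats most of the margin.

```python
# Worst-case check that a measured rail stays under an absolute-maximum
# rating, allowing for the meter's own (unverified) error. All numbers
# are hypothetical.
ABS_MAX_V = 6.0        # datasheet absolute maximum for the part
MEASURED_V = 5.5       # what the meter says
METER_ERROR = 0.10     # if the meter could be 10% off...

worst_case_v = MEASURED_V * (1 + METER_ERROR)
print(f"worst case {worst_case_v:.2f} V vs abs max {ABS_MAX_V:.1f} V:",
      "OK" if worst_case_v < ABS_MAX_V else "VIOLATION POSSIBLE")
# 5.5 V read on a 10%-uncertain meter could really be 6.05 V
```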
Title: Re: Lab equipment argument
Post by: vk6zgo on June 20, 2017, 02:03:20 am
Thank you for all the valuable information.  This is much to digest.

As for the question of what we make, it's high reliability logging tools for drilling oil wells.  This involves everything from power generation to FPGAs to RF equipment.  Most everything is low power, but we do have a few high current/high voltage modules.  Our tools are low volume, high value products.  Generally, each tool shipment (5 to 10 tools) is customized for the designated customer.

As for the 10 percent, given the general nature of the conversation, I'm leaning toward it being a number he pulled out of his rear.  He's not exactly the type of person to calculate stuff like that unless absolutely necessary.  As a matter of fact, I'm certain that he would not have, as he only works with ideal values.  We ran into a problem with this years ago when troubleshooting an RF output stage.  Out of 6 engineering samples, only two would work correctly.  After weeks of him digging, his boss finally took it over.  I was not privy to the exact solution, but it boiled down to the gate threshold voltage varying too much from chip to chip.

Your 10% error, if applied to RF amplifiers, would be less than 1 dB, whether you measured power or voltage! ;D
I'm a little dubious about the "explanation".
RF PAs are a lot more "clunky" than that, & often need the bias, etc. to be individually set for each power device.

All that said, it is easy to get caught when thinking in dB.

We had some UHF transmitters made in the dear old PRC.
They obtained the required 1kW output by paralleling up  two LDMOS amplifiers on one board, then again paralleling two boards, & in turn paralleling that pair of boards with another pair.
Each step required a combiner with an unbalance load.
The complete device had a power sensor at the output which controlled the overall drive to the amp, so as to keep the output constant.

The "rent-an-engineers" they pulled off the street didn't know that the individual amplifiers needed to have matched gains at the frequency in use, & just slapped 'em together.
All this might have been OK if they had used sufficiently rated resistors for the "unbalance loads", but they didn't.

Unbalanced gains on one board cooked the load & killed both devices on that board.
The power control immediately increased the drive, so the unbalance load for that pair of boards died, & so on.

We found that the power gain variations were often 3-4 dB between the amps on each board.

Doesn't sound like much, but that is a 50% to 60% variation, quite sufficient to kill the unbalance loads.
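For anyone who wants to check the arithmetic, here is a quick Python sketch of the dB-to-percent conversions above:

```python
# Quick dB <-> ratio checks for the figures quoted above.
from math import log10

def db_from_ratio(r, power=True):
    return (10 if power else 20) * log10(r)

# A 10% error is well under 1 dB either way:
print(f"10% power error:   {db_from_ratio(1.10):.2f} dB")               # 0.41
print(f"10% voltage error: {db_from_ratio(1.10, power=False):.2f} dB")  # 0.83

# But a 3-4 dB gain difference between paralleled amps is huge:
for db in (3.0, 4.0):
    ratio = 10 ** (-db / 10)
    print(f"{db:.0f} dB down = {1 - ratio:.0%} less power gain")        # 50%, 60%
```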

So there I was, in my declining years, shuttling chip capacitors around RF amp boards whilst running an RF sweep through them, something I hadn't done for 20+ years!!
All the ones I did came out within 1 dB at the wanted frequency, so it wasn't hard to do if you had the first clue about RF stuff.

We tried to get them to make us some spares which were tuned correctly, but the old "selective understanding of English" thing occurred.

I think I may find a different channel for funding for calibration equipment.  It's not like I'm trying to spend tens of thousands of dollars.  Heck, this all got started on $400 worth of parts.  I've started throwing around a new phrase to describe this crap.  Penny wise and dollar dumb.

Another mob I worked for supplied earlier-model UNI-T multimeters (I'm not a fan!) for normal testing.
They had several problems:
(1) On the resistance setting, the reading took ages to stabilise.
(2) The stupid things would shut down right when you were using them.
(3) The resistance range went crazy when the battery was getting low, well before the "batt" icon came up.

After much agitation, they bought a Fluke, but it was locked away, & only the "Gurus" could use it.

These were the same lot that wouldn't supply the people actually building their stuff with schematics, to protect their IP.
The dumb thing was that if there was any IP on the boards, it belonged to National Semiconductor, who had been telling the world about their stuff for years.

Another clever trick was when they held up production for a couple of days because they ran out of 1.1k resistors.
(The 1.1k resistors were used in an integrating network with electrolytic capacitors having something like ±20% variation in value, so 1k or 1.2k would have been OK.)
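A quick Python sketch of why the substitution would have been harmless (only ratios matter here, so no cap value is needed):

```python
# Why a 1.0k or 1.2k substitute for the 1.1k would have been fine:
# the resistor change is small compared with the +/-20% spread the
# electrolytic cap forces the design to tolerate anyway.
R_NOM = 1100.0
CAP_TOL = 0.20

print(f"cap alone moves the time constant by +/-{CAP_TOL:.0%}")
for r_sub in (1000.0, 1200.0):
    shift = r_sub / R_NOM - 1
    print(f"substituting {r_sub:.0f} ohm shifts it by only {shift:+.1%}")
```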
Title: Re: Lab equipment argument
Post by: CatalinaWOW on June 20, 2017, 05:04:48 am
We can all tell horror stories.  Like when I was called by a buyer and asked why transistor umpty-umph couldn't be substituted for the much more expensive one specified.  The buyer pointed out that both parts had the same letters: E, B and C.

The only defense against all of this is thinking.  Sometimes 3 dB is OK.  Sometimes 0.5 dB isn't good enough.  Sometimes you need ultra-accurate measurements.  Other times you only need high precision.  And there are times you don't need much of either.  If you don't understand the real needs of your application, you are likely to deliver bad product, waste money, or both.

Even the simple suggestion that you need to assure that your IC's don't exceed their absolute maximum ratings can be subtle.  Are these ICs assured to fail when this voltage is reached?  Are they assured to work within these limits?  Generally the answer to both of those questions is no.  The answer to how high a voltage you can apply (and thus how accurately you need to know what you are applying) requires an understanding of your reliability requirements and an understanding of how to combine often hard to get manufacturers data with estimates of how your measurements perform.  Your expectations of making a claim on the manufacturer for a defective product will also feed into this analysis.