
EMC conducted emissions pre-compliance testing with home-made LISN


uski:
Hi folks!

I recently had the opportunity to do pre-compliance testing for conducted emissions (CISPR 25).
I'd like to share some feedback.

Initially, I used a purely home-made LISN (courtesy of Idpromnut).
Later, I was able to use a Tekbox TBOH1 LISN. Results were similar to the home-made one.

I'm using a Rigol DSA832-TG (well, the TG is irrelevant here) with the EMI option (I use that for the 9kHz CISPR RBW, and I can also do quasi-peak measurements).

However, I only had one of each of those LISNs, and I wanted to do the measurements with two matched LISNs. I believe that's the proper way of doing things, and it allows me to differentiate between common-mode noise and differential-mode noise (by comparing the noise on the + and - power supply wires).
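As an aside, the CM/DM arithmetic is simple once the two line readings are converted to linear units. Here is a quick sketch with hypothetical readings; note that a scalar spectrum analyzer only gives magnitudes, so this is the idealized in-phase/anti-phase decomposition, not a full vector measurement:

```python
import math

def db_uv_to_v(db_uv):
    """Convert a level in dBuV (dB relative to 1 uV) to volts."""
    return 10 ** (db_uv / 20) * 1e-6

def v_to_db_uv(v):
    """Convert volts to dBuV."""
    return 20 * math.log10(v / 1e-6)

# Hypothetical readings on the + and - supply lines at one frequency.
v_plus = db_uv_to_v(72.0)
v_minus = db_uv_to_v(66.0)

# Idealized decomposition: common-mode appears in phase on both lines,
# differential-mode appears in anti-phase.
v_cm = (v_plus + v_minus) / 2   # common-mode component
v_dm = (v_plus - v_minus) / 2   # differential-mode component

print(f"CM: {v_to_db_uv(v_cm):.1f} dBuV, DM: {v_to_db_uv(v_dm):.1f} dBuV")
```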

So I used the design available here (thanks Jay_Diddy_B :-+) to build two identical LISNs:
https://www.eevblog.com/forum/projects/5uh-lisn-for-spectrum-analyzer-emcemi-work/msg404662/#msg404662

Here is my test setup. The LISNs are located at the top-left of the picture.


Screenshots:


Here's another one (the measurements are not as good, because of different input settings and a lower sweep time, but I wanted to show the pass/fail limits feature):



And here are the measurements from the certification lab:


As you can see, the results are actually pretty close. It looks like the home-made setup underestimates the noise by 2dB, but I wouldn't resubmit the design to the certification lab with only a 2dB margin anyway, so it's still pretty useful (plus, I now know the bias).

NB: I have enabled the input corrections to compensate for the 10dB attenuator built into the LISN, so that what's displayed on the spectrum analyzer matches the actual level at the LISN port. I like this built-in 10dB attenuator; it really makes a difference. Previously (with other LISNs) I was getting warnings from the spectrum analyzer when I turned on the UUT ("IF signal out of range"); now it's silent, which is good.
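For illustration, here is a minimal sketch of what an input-correction feature does: add a (possibly frequency-dependent) known path loss back onto the raw reading. The table values below are made up; a real table would come from characterizing the LISN attenuator, DC block, and cable:

```python
import bisect

# Hypothetical (frequency in Hz, correction in dB) pairs: the 10 dB
# LISN attenuator plus a small assumed cable/DC-block loss.
CORRECTION_TABLE = [
    (10e3, 10.0),
    (1e6, 10.2),
    (30e6, 10.5),
]

def correction_at(freq_hz):
    """Linearly interpolate the correction table at freq_hz,
    clamping outside the table's range."""
    freqs = [f for f, _ in CORRECTION_TABLE]
    i = bisect.bisect_left(freqs, freq_hz)
    if i == 0:
        return CORRECTION_TABLE[0][1]
    if i == len(freqs):
        return CORRECTION_TABLE[-1][1]
    (f0, c0), (f1, c1) = CORRECTION_TABLE[i - 1], CORRECTION_TABLE[i]
    return c0 + (c1 - c0) * (freq_hz - f0) / (f1 - f0)

def corrected_level(raw_db_uv, freq_hz):
    """Raw analyzer reading plus the known path loss at that frequency."""
    return raw_db_uv + correction_at(freq_hz)

# A raw 62 dBuV reading at 10 kHz corresponds to 72 dBuV at the port.
print(corrected_level(62.0, 10e3))
```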

Also, there is a DC block at the input of the spectrum analyzer as additional protection (Mini-Circuits BLK-89S+). I could probably safely remove it, but it doesn't hurt; its insertion loss is negligible.

I used a peak detector for fast measurements, with a relatively long sweep time (I chose 50s to make sure I capture all the noise). I could go much faster, but the noise from my device is transient by nature, so I need to make sure the spectrum analyzer spends enough time on each frequency to get a good reading.
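A quick back-of-the-envelope check of the dwell time per displayed point (the 601-point trace length is an assumption here, not stated in the post):

```python
# Sweep time divided across the trace points gives the time the
# analyzer spends near each frequency.
sweep_time_s = 50.0
trace_points = 601          # assumed trace length

dwell_per_point_s = sweep_time_s / trace_points
print(f"{dwell_per_point_s * 1e3:.1f} ms per point")

# For the peak detector to reliably catch a transient that repeats
# every T seconds, the dwell per point should exceed T.
slowest_transient_s = 0.05  # hypothetical 20 Hz repetition rate
assert dwell_per_point_s > slowest_transient_s
```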

The span I used was 10kHz-15MHz because I know (from the pre-compliance measurements at the certification lab) that there is nothing interesting between 15MHz and 30MHz.

If anyone has suggestions for improving the setup, or if you have any questions, just let me know! :)

PS: The black cable running from the bottom right of the picture is a USB UART for configuring the UUT. Don't forget to disconnect it before starting testing, otherwise it will screw up the grounding scheme (the computer ground is safety ground, just like the copper plate and the LISN "-", which is different from the UUT "-").

uski

uski:
Hi,

EDIT: This post is a reply to a post that has since been deleted (why? No idea! There are no stupid questions!)
The question asked was: how do you explain the difference in frequency of the peaks between the test house results and yours? Also, it looks like you have a serious noise problem around 650kHz!

Yes, the reason I got into all this is that we (clearly :-DD) failed the conducted emissions test.
We were able to fix the issue by adding a common-mode choke and two capacitors in line with the power supply wires, and to verify the fix with the setup above. I thought showing the failed tests was funnier than the passed ones :popcorn:

The main culprit in the failed test is a DC/DC switcher on our device, which operates around (you guessed it) 650kHz. All the other lines in the spectrum are also caused by it.

The certification lab used a $150'000 EMI test receiver (R&S ESIB 40), which takes many more measurement points than my DSA832 over the spectrum shown here (150kHz-30MHz), so it's more precise.
I'm pretty sure I can get a better measurement if I reduce the span of the DSA832 around the problematic frequency.
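One common rule of thumb is to keep the frequency step between trace points at or below half the RBW, so that no narrow peak can fall between points. A sketch of that arithmetic (the rule itself is a heuristic, not a requirement taken from any standard):

```python
import math

def min_sweep_points(span_hz, rbw_hz, bins_per_rbw=2):
    """Minimum trace points so adjacent points are no more than
    rbw_hz / bins_per_rbw apart (rule-of-thumb sizing)."""
    step = rbw_hz / bins_per_rbw
    return math.ceil(span_hz / step) + 1

# Full 150 kHz - 30 MHz band with the 9 kHz CISPR RBW:
print(min_sweep_points(30e6 - 150e3, 9e3))

# Narrow 100 kHz span around the 650 kHz peak:
print(min_sweep_points(100e3, 9e3))
```

The full band needs thousands of points, which is why a narrow span (or an EMI receiver that steps the whole band) resolves the peaks so much better.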

Rigol actually provides (if you buy the EMI option) a piece of software that does this: it first does a prescan of the entire band, then "zooms into" the problematic peaks to get more data around them. But the software is buggy, and according to their support no fix is scheduled for release for several months (thumbs down, Rigol).

Furthermore, the board I used for the tests is not the same as the one the certification lab received. And the switcher IC I'm using can be set to a wide range of frequencies, so a small (even just 0.5%) change in the setting value could explain part of the frequency shift we see.

To confirm this, I just did some measurements again, with the same setup, but a 100kHz span around 650kHz.

With a 9kHz EMI RBW:

The peak is at 648.6kHz

And with a 200Hz EMI RBW :

The peak is at 649.5kHz (it's normal for the amplitude to change when the RBW is changed)
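For broadband noise, the displayed level is expected to change by roughly 10·log10 of the RBW ratio when the bandwidth changes (a discrete CW line would stay put, which is one way to tell the two apart). A quick check of the expected shift:

```python
import math

def noise_level_delta_db(rbw_from_hz, rbw_to_hz):
    """Expected change in displayed level for broadband noise when
    the RBW changes; a pure CW carrier would not change at all."""
    return 10 * math.log10(rbw_to_hz / rbw_from_hz)

# Going from the 9 kHz CISPR RBW down to 200 Hz:
print(f"{noise_level_delta_db(9e3, 200):.1f} dB")
```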

Pretty close to the lab results, which put the peak at 654kHz!
Those 5kHz could easily be explained by a slight shift in the resistor setting the switcher frequency and the tolerances of the IC itself, plus maybe some slight differences between the test instruments themselves (I'm not sure what the accuracy of the DSA832 is). After all, it's a spectrum analyzer, not a frequency counter, and I'm only using its internal timebase.

I checked in the switcher IC datasheet:
- 649.5kHz : 207.25kohms
- 654.0kHz : 205.71kohms
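Those two datasheet points suggest the switching frequency is roughly inversely proportional to the frequency-set resistor, as is common for switcher ICs. A sketch under that assumption (the constant is fitted from the first point above, not taken from any actual datasheet formula):

```python
# Assume f_sw = k / R and fit k from the 649.5 kHz / 207.25 kohm point.
k = 649.5e3 * 207.25e3   # Hz * ohm (fitted constant, an assumption)

def switch_freq_hz(r_ohm):
    """Hypothetical inverse frequency-vs-resistor relation."""
    return k / r_ohm

# Check against the second datasheet point (205.71 kohm -> ~654 kHz):
print(f"{switch_freq_hz(205.71e3) / 1e3:.1f} kHz")
```

The model lands within a fraction of a percent of the second point, which supports the idea that a small resistor shift explains the 5kHz difference.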

I'm not too concerned about that. What concerns me more is the 2dB difference in amplitude compared to the results from the lab, which should actually be lower than mine, because they're using a quasi-peak detector, not a peak detector like me! But that isn't a big problem by itself.
EDIT: I think I found why there's a 2dB offset. See below.

If I have time, I'll try to measure the insertion loss of the LISN (I have a tracking generator, so I can make use of it!)

uski

EMC:
Uski,

Well done, looks like a good result. The main problem with the LISNs would probably not have been an issue in this test because of the low DC current. The test lab used a LISN that still meets the impedance spec when carrying 100 amps. The LISN inductors in the little guys you made would probably start dropping inductance at a few hundred milliamps DC; but if that's all you need, then that's great. I would be interested to know if "verification through substitution" would have reduced or detected the error in your measurement (BTW, who says you had the error? It could be the test house that had the error!). I.e.: 1) remove DC power and ground the power-supply end of the LISNs. 2) Attach an ARB/sig gen instead of the DUT at the other end of the LISNs. 3) Increase the ARB/sig gen level until you get the same DSA832 deflection/amplitude. I was wondering if the resulting ARB reading would be closer to 75dBuV rather than 72dBuV. If closer to 75dBuV, then we could assume a 2-3dB error in your setup, and you may be able to track it down and reduce it.

Steve

PS: The ARB/sig gen will load down below 10MHz. The impedance at 600kHz should be about 25 ohms. This shouldn't affect the substitution test.
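For reference, the 75dBuV vs 72dBuV figures above can be expressed in linear units (dBuV is dB relative to 1 µV):

```python
import math

def db_uv_to_uv(db_uv):
    """dBuV to microvolts."""
    return 10 ** (db_uv / 20)

def uv_to_db_uv(uv):
    """Microvolts to dBuV."""
    return 20 * math.log10(uv)

# The 3 dB gap as a voltage ratio:
print(f"{db_uv_to_uv(75.0):.0f} uV vs {db_uv_to_uv(72.0):.0f} uV")
ratio = db_uv_to_uv(75.0) / db_uv_to_uv(72.0)
print(f"ratio {ratio:.2f}")   # 3 dB is a factor of about 1.41 in voltage
```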

T3sl4co1l:
Nice!  Excellent example of how to do it!

Tim

uski:
Hi,

Thanks guys ! It's great to discuss stuff like this.


--- Quote from: EMC on October 07, 2015, 09:06:20 am ---Well done, looks like a good result. The main problem with the LISNs would probably not have been an issue in this test because of the low DC current. The test lab used a LISN that still meets the impedance spec when carrying 100 amps. The LISN inductors in the little guys you made would probably start dropping inductance at a few hundred milliamps DC; but if that's all you need, then that's great.
--- End quote ---

This particular UUT draws 200mA at most. I'm curious to see how much the inductance drops at high currents; is there a way to test this? For example, I could draw 5 amps of DC current through the LISN (I have a floating power supply and a floating active load) and inject the same test signal through a DC block. Would that work, or do I need a coupler or something more complex?


--- Quote from: EMC on October 07, 2015, 09:06:20 am ---I would be interested to know if "verification through substitution" would have reduced or detected the error in your measurement (BTW who says you had error, it could be the test house that had the error!).
--- End quote ---

Well I did just that.
I used an HP 8648B signal generator (calibration due date 04-2012, so it's not terribly well calibrated, but it shouldn't be too far off...), with a DC block on the spectrum analyzer, the generator on the other side, and the LISN in between (I tried removing the DC block at 300kHz and the results were the same).

I soldered a coax directly to the UUT side of the LISN:


The setup looks like this (for this test, open at the source side):


I took a few measurement points, setting the generator to 75dBuV:


It looks like there is quite a bit of attenuation at sub-1MHz frequencies; any idea of the possible cause?
EDIT: I believe it's simply because the LISN impedance starts to decrease below a few MHz. The Tekbox LISN does the same thing:



--- Quote from: EMC on October 07, 2015, 09:06:20 am ---PS The ARB/Sig Gen will load down below 10MHz.   The impedance at 600kHz should be about 25 ohms.   This shouldn't effect the substitution test.
--- End quote ---

Could that explain the dip below 1MHz?
EDIT: I think so, yes; it's caused by the impedance dip.
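A very crude model of that dip: at the measurement port, the LISN's 5µH inductor effectively shunts the 50Ω analyzer input, so the port impedance (and hence the voltage developed by a given source) falls off below a few MHz. The sketch below ignores the damping resistor and coupling capacitor of the real network, so it somewhat understates the impedance, but it shows the trend:

```python
import math

def lisn_port_impedance_ohm(freq_hz, l_h=5e-6, r_ohm=50.0):
    """Magnitude of jwL in parallel with the 50-ohm analyzer input.
    A deliberately simplified model of a 5 uH LISN: it omits the
    series damping resistor and coupling capacitor of the real
    network, so treat the numbers as a trend, not a spec."""
    xl = 2 * math.pi * freq_hz * l_h
    z = (1j * xl * r_ohm) / (r_ohm + 1j * xl)
    return abs(z)

for f in (100e3, 600e3, 1e6, 10e6):
    print(f"{f/1e6:5.2f} MHz: {lisn_port_impedance_ohm(f):5.1f} ohm")
```

The impedance climbs toward 50Ω only above a few MHz, which matches both the measured roll-off and the Tekbox LISN's published impedance curve shape.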

uski
