
Coaxial cable for low leakage current measurement (question about connection)


Hi all!

First, sorry if the question looks very primitive, but I really need some advice to make sure I am not going to damage something!

I want to make measurements on some coax cables to see how good they are against EMC during leakage-current measurement.

At the moment I have my hands on a picoammeter and an SMU. My idea is to source the lowest current possible from the SMU (that is something around 300 pA) and see how it is measured by the picoammeter (whose manual says it has a resolution of 20 fA).
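Just to sanity-check my own numbers (this is my back-of-envelope math, not from either manual):

```python
# Back-of-envelope check: headroom between the smallest current the SMU
# can source (300 pA) and the picoammeter's quoted resolution (20 fA).

i_source = 300e-12   # SMU minimum source current (A)
i_res = 20e-15       # picoammeter resolution per its manual (A)

print(f"headroom = {i_source / i_res:.0f}x")   # about 4 digits of resolution
```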

My first question is how to connect the coax cable (terminated with BNC on both sides). As you can see, the SMU only has female banana inputs (is that what they are called?!), so I am using a banana-to-BNC adapter there.

The input connector on the picoammeter is a female BNC, so I will just plug the coax cable into it.

Now the real question is about the COMMON input of the picoammeter. I really have no idea how I should connect it. Do I need to connect it to the ground of the SMU?

Can someone please explain what they mean by "LO and chassis ground should not exceed 42 volts"? And what does it mean that connecting common or analog output to earth while floating can damage the instrument? What would be an example case of this?

This is the connection for the picoammeter:

This is the connection for the SMU:

I assume you are measuring the DC insulation resistance (or leakage conductance) of the cable.

I am not sure whether the 42 V is the working range or an absolute maximum rating. Either way, there is no need in this case to let the shield get anywhere near 42 V.

Just connect the shield to ground, which will guarantee it is operating at a safe level for the meter.

Now I think what you really want to do is apply a known voltage across the cable under test and measure the current. So I would probably set, say, 20 V with a current limit of 100 nA (assuming you are expecting much lower than that). I would apply it to the cable shield, and connect the centre core of the cable under test to the current-sense input of the meter.
Connect it via a good-quality coax cable connected at the meter end to the sense input and the shield. At the cable end, the shield of the cable from the meter is left unconnected. Now you shouldn't have any leakage due to this cable from the meter, as there is no voltage between the sense input and the shield, but be careful that the cable and connectors are very clean.
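Once you have a reading, turning it into an insulation resistance is just Ohm's law. A minimal sketch (the 150 pA reading below is a made-up example, not a real measurement):

```python
# Ohm's law sanity check: convert a leakage reading into insulation
# resistance. The 150 pA reading is a made-up example value.

def insulation_resistance(v_source: float, i_leak: float) -> float:
    """DC insulation resistance (ohms) from test voltage and leakage current."""
    return v_source / i_leak

v = 20.0        # applied test voltage (V), as suggested above
i = 150e-12     # example measured leakage current (A)
print(f"R_insulation = {insulation_resistance(v, i):.3g} ohm")
```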

If there is a chance of environmental EMF affecting the measurement, put the cable under test in a metal box connected to the shield/ground and connect the source as well via a shielded coax cable with the cable shield connected to the meter shield/ground. Leakage in the source cable will not affect the measurement, as long as it is well under the 100nA limit.

Test with both positive and negative source voltages. To prevent capacitive transients as you apply the source voltage, you can apply the voltage with the meter's sense lead plugged into the shield/ground, then move it to the sense input. Allow plenty of time to settle.
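How long "plenty of time" is depends on the capacitance being charged. A rough sketch, assuming the transient decays exponentially with time constant tau = R*C (the resistance, capacitance, and initial-current figures below are illustrative guesses, not measured values):

```python
import math

# Rough settling estimate, assuming the turn-on transient decays
# exponentially with time constant tau = R * C. The R, C and initial
# current below are illustrative guesses, not measured values.

def settle_time(r_ohm: float, c_farad: float, i_initial: float, i_target: float) -> float:
    """Seconds for an exponential current transient to decay from i_initial to i_target."""
    tau = r_ohm * c_farad
    return tau * math.log(i_initial / i_target)

# e.g. 1 Gohm effective charging resistance, 1 nF of cable capacitance
# (roughly 10 m of typical coax), waiting for a 1 uA charging spike to
# fall below the meter's 20 fA resolution floor:
t = settle_time(1e9, 1e-9, 1e-6, 20e-15)
print(f"settle below 20 fA after ~{t:.0f} s")
```

The point is that the wait scales with the log of the current ratio, so reaching the last few femtoamps takes many time constants.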

To explain the sense input and the shield: the meter keeps the input potential exactly equal to the shield potential. It does this by internally draining away any current from the input until its potential sits at exactly the shield potential (i.e. zero volts between the input and the shield). The amount of current the input has to drain away to hold the input at the shield voltage is the measured input current. Currents going into the shield don't matter; the shield just sets the voltage for the input to operate at, but in other respects it is just another terminal. You do want something to set the shield potential, though; you shouldn't just leave it floating. So if the shield is connected to ground, then the sense input will also be at ground potential.
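That drain-until-zero behaviour can be illustrated with a toy discrete-time model (purely illustrative; the capacitance and loop-gain values are arbitrary assumptions, not the instrument's real internals):

```python
# Toy discrete-time model of a feedback ammeter: a loop drains current in
# proportion to the input node's deviation from the shield potential (0 V),
# so in steady state the drained current equals the input current.
# C_NODE, GAIN and DT are arbitrary illustrative values, not the real
# instrument's internals.

C_NODE = 10e-12   # assumed stray capacitance at the input node (F)
GAIN = 1e-3       # assumed feedback transconductance (A per V of error)
DT = 1e-9         # time step (s), small enough for a stable simulation

def simulate(i_input: float, steps: int = 2000) -> float:
    """Return the drained (i.e. displayed) current once the loop settles."""
    v_node = 0.0                                 # node potential vs. shield
    for _ in range(steps):
        i_drain = GAIN * v_node                  # loop drains current toward 0 V
        v_node += (i_input - i_drain) * DT / C_NODE
    return GAIN * v_node

reading = simulate(300e-12)                      # source 300 pA into the input
print(f"reading = {reading * 1e12:.1f} pA")      # settles at the sourced current
```

Note the node voltage it settles at is tiny (i_input/GAIN), which is exactly why the input sits "at" the shield potential while still measuring the current.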


