Author Topic: High Voltage amplifier bandwidth - how to measure it with high accuracy?  (Read 334 times)


Offline monologue231 (Topic starter)

  • Newbie
  • Posts: 2
  • Country: pl
Hello all,

I would like to measure the bandwidth and linearity of a high-voltage amplifier.
The exact specifications are below:

Gain: X20
Maximum output voltage: 300Vpp
Expected bandwidth: DC - 1 MHz.

My goal is to check:
- How linear the amplifier is, and how much the gain deviates from X20 at a given frequency.
- Where the -3 dB point lies when the amplifier is driven at maximum gain.

I thought an oscilloscope would be enough for these measurements. In the lab I've got a Tektronix MSO3054, but unfortunately, according to the datasheet, the DC gain accuracy above 10 mV/div is 1.5%, plus whatever error the probe itself adds.
I'm looking for something like 0.5%.
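As a sanity check on what 0.5% means here, a quick sketch (the readings below are made-up example numbers, not measurements):

```python
import math

NOMINAL_GAIN = 20.0

def gain_deviation(v_in_rms, v_out_rms):
    """Return (gain, % deviation from X20, deviation in dB)."""
    gain = v_out_rms / v_in_rms
    pct = (gain / NOMINAL_GAIN - 1.0) * 100.0
    db = 20.0 * math.log10(gain / NOMINAL_GAIN)
    return gain, pct, db

# A 0.5% gain error is only about 0.043 dB -- that sets the bar
# for whatever instrument does the measuring:
print(20.0 * math.log10(1.005))  # ~0.0433 dB

# Hypothetical readings at one frequency: 5.00 V rms in, 99.1 V rms out
gain, pct, db = gain_deviation(5.00, 99.1)
print(gain, pct, db)  # ~19.82, ~-0.9 %, ~-0.079 dB
```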

Another solution I came across was a spectrum analyzer. We have an R&S FSV.
I wanted to use it with additional attenuation (the maximum input level for this unit is +30 dBm, so by my calculation, adding 36 dB of attenuation would keep a 300 Vpp signal from burning the input).
Still, I'm a little concerned about feeding a high-voltage signal into an expensive SA. I'm not sure this is the best choice for this type of measurement, and I haven't yet worked out the accuracy it offers.

Could someone more experienced advise me on the best solution? I was also thinking about using multimeters.
Thanks in advance!
 

Offline TimFox

  • Super Contributor
  • ***
  • Posts: 7957
  • Country: us
  • Retired, now restoring antique test equipment
The specification of gain accuracy is different from the specification of linearity for such an input.
You should use a good x100 probe to look safely at a +/- 150 V swing.
If you use a x10 'scope probe, check its maximum ratings carefully before using it.
Within the maximum ratings of the probe and the DSO, the linearity should be far better than 1.5%, even though the absolute accuracy of the measured voltage is only guaranteed to 1.5%.
Spectrum analyzer:  normally, a SA has an input impedance of 50 ohms, and is used with 50 ohm matched attenuators to measure high power signals:  +30 dBm in 50 ohms is 1 W, or 7.07 V into 50 ohms.
Probably, your high-voltage amplifier is not designed to drive 50 ohms:  300 V pk-pk into 50 ohms is 106 V rms, or 225 W mean power for a sine wave.
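That level arithmetic is easy to verify; a quick sketch (the 36 dB figure is the OP's proposed attenuator):

```python
import math

V_PP = 300.0   # peak-to-peak output swing
R = 50.0       # spectrum analyzer input impedance

v_rms = (V_PP / 2) / math.sqrt(2)          # sine: peak / sqrt(2)
p_watts = v_rms ** 2 / R                   # mean power into 50 ohms
p_dbm = 10.0 * math.log10(p_watts * 1000)  # convert W to dBm

print(v_rms)    # ~106.1 V rms
print(p_watts)  # 225 W
print(p_dbm)    # ~53.5 dBm; 36 dB of attenuation leaves ~17.5 dBm,
                # under the +30 dBm limit -- but only if the amplifier
                # could actually deliver that power into 50 ohms
```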
If you use a high-impedance 'scope probe (usually 10 megohm input impedance), be sure to adjust the compensation with the DSO (see manual) to get a flat response past 1 MHz.
 

Offline Vovk_Z

  • Super Contributor
  • ***
  • Posts: 1419
  • Country: ua
300 Vpp at 1 MHz isn't anything extreme. That is pretty close to testing a good Hi-Fi power amplifier.
I would start by testing linearity (THD) at 1 kHz. It's very easy to make a divider that is linear over 1-20 kHz, and almost any modern sound card can measure THD at 1 kHz (I use ARTA in demo mode, but there is plenty of other software).
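A rough sketch of such a divider for the 1 kHz test (values are illustrative assumptions, not a tested design; mind the voltage rating and dissipation of the top resistor):

```python
import math

# Bring 300 Vpp (~106 V rms) down to roughly 1 V rms for a
# sound-card line input. Component values are assumptions.
v_in_rms = (300.0 / 2) / math.sqrt(2)   # ~106 V rms at full output
r_top = 100e3                            # series resistor (needs HV rating)
r_bot = 1e3                              # shunt across the sound-card input

ratio = r_bot / (r_top + r_bot)          # ~1/101 division
v_out_rms = v_in_rms * ratio
p_top = (v_in_rms * (1 - ratio)) ** 2 / r_top  # dissipation in the top R

print(v_out_rms)  # ~1.05 V rms
print(p_top)      # ~0.11 W -- use a resistor rated well above this
```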
The next test, bandwidth, can be done with any >=10-50 MHz scope and a function generator.
The only difficult test left is linearity at 1 MHz, but there may be easy ways there too. For example, if you have a sufficiently linear 1 MHz oscillator (0.1-0.5%), you can simply compare the input and output sine waveforms. If there is a visible difference between input and output, the amp is worse (or much worse) than your required 0.5%. The accuracy of this method depends greatly on the oscilloscope's screen size and how well your eyes are trained to it. I'm talking about either an analog scope or a good enough digital one (>=10-bit); I'm not sure about cheap 8-bit scopes.
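A quick way to see why the ADC resolution matters for the visual comparison (this assumes the waveform fills the full ADC range):

```python
# Compare a 0.5% waveform deviation (as a fraction of full scale)
# with one quantization step of the scope's ADC.
deviation = 0.005  # 0.5% of full scale

for bits in (8, 10, 12):
    lsb = 1.0 / (2 ** bits)       # one code step, fraction of full scale
    print(bits, deviation / lsb)  # deviation expressed in LSBs

# 8-bit: ~1.3 LSB, buried in quantization; 10-bit: ~5 LSB, resolvable
```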
« Last Edit: May 09, 2023, 08:57:46 pm by Vovk_Z »
 

Offline mawyatt

  • Super Contributor
  • ***
  • Posts: 3274
  • Country: us
At 1 MHz the classic two-tone IMD test might prove useful. Of course, you need two signal sources around 1 MHz, but the OP may have access to those. If viewed with a SA, a suitable attenuator needs to be used, but not the usual 50 ohm in/out type: a simple high-input-impedance voltage divider should work well enough; just compensate with a small lead cap across the high-value series R to flatten the overall attenuator response.
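The compensation condition is the same one a x10 probe's trimmer satisfies: R_top*C_top = R_bot*C_bot. A quick sketch with assumed values (the bottom-side capacitance in particular is a guess at input plus cable capacitance):

```python
import math

r_top = 1e6      # high-value series resistor
r_bot = 10e3     # shunt resistor at the instrument input
c_bot = 115e-12  # ~15 pF input + ~100 pF cable (assumption)

# Lead cap across r_top that flattens the response:
c_top = r_bot * c_bot / r_top

# Without it, the divider has a pole at (r_top || r_bot) * c_bot:
r_par = r_top * r_bot / (r_top + r_bot)
f_3db = 1.0 / (2 * math.pi * r_par * c_bot)

print(c_top * 1e12)  # ~1.15 pF -- a small lead cap indeed
print(f_3db / 1e3)   # ~140 kHz uncompensated, far short of 1 MHz
```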

Anyway, just a thought.

Best,
Curiosity killed the cat, also depleted my wallet!
~Wyatt Labs by Mike~
 

