Author Topic: Is it better to operate a transmitter at high V, low I, or vice-versa?


Offline DrMag (Topic starter)

  • Regular Contributor
  • *
  • Posts: 61
I've been working toward a project where I will be transmitting data using a home-brew transmitter, likely on either the 2m or 70cm band. As I've been researching the design requirements for the project, a question popped into my head that I'm not sure how to answer. It mostly relates to what I use to power the transmitter.

Is it better to operate a transmitter at a low voltage with high current, or at a high voltage with low current?

I suspect it's six of one, half a dozen of the other in the end, but would I run into any problems powering a transmitter at 5V as opposed to the 7.4V used in commercial devices?
 
The following users thanked this post: AF6LJ

Offline T3sl4co1l

  • Super Contributor
  • ***
  • Posts: 21688
  • Country: us
  • Expert, Analog Electronics, PCB Layout, EMC
    • Seven Transistor Labs
The motivating factor is usually the junction capacitance and operating (V, I) range of the finals.

Back in the toob days, you could choose anything from 100V (TV sweep tubes) to several kV (proper "transmitter" tubes), and up from there at higher power levels.  (The largest single linear amplifying device in history* is a ceramic transmitter triode, rated for around 30kV and 30A, capable of 2MW signal output in class C operation.)

(*Hmm, correct me if I'm wrong on this.  MOSFET dies only go to a few kW in linear operation, and the rest are switching devices: IGBTs, SCRs and so on.  Most of those are multiple dies in an integrated module, too, not necessarily "single".  I don't think cold-cathode field emission and ionization devices can do linear operation, but I've heard very little about them.  There may be larger magnetic amplifiers; in that case, I might add the condition: largest in watts * maximum operating frequency.)

Anyways, tubes have the predominant limitation that their transconductance and load conductance are very low, relative to capacitance, and to other more fundamental limitations of the device.

The bandwidth limit is set by the product of load resistance and plate (or collector, or drain) capacitance: the output pole sits at f = 1/(2*pi*R*C), so a larger R*C product means less bandwidth.
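To put rough numbers on that single-pole approximation (a quick sketch; the load resistances and the 10pF output capacitance are made-up but plausible values, not figures from any datasheet):

Code: [Select]
from math import pi

def f3db_hz(r_load, c_out):
    # Single-pole -3dB bandwidth of the output node: f = 1/(2*pi*R*C)
    return 1.0 / (2.0 * pi * r_load * c_out)

print(f3db_hz(5e3, 10e-12))    # tube-ish plate load: 5k, 10pF    -> ~3.2 MHz
print(f3db_hz(12.5, 10e-12))   # LDMOS-ish drain load: 12.5R, 10pF -> ~1.3 GHz

Same capacitance, but the low-impedance load pushes the pole up by the ratio of the resistances.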

More bandwidth is desirable, because the transmitter needs less precise tuning for a given operating band, or can span multiple bands.

It turns out that the best transistors -- in terms of bandwidth, efficiency, power capability, and generally being reasonable to operate (not requiring dangerously high voltages or annoyingly high currents) -- fall in the range of 30-200V (and as many amps as the junction width and power dissipation allow for).  For higher voltage ratings, the junction is physically longer (i.e., the path the current takes from "+" to "-" -- MOSFET channel length or BJT collector depletion region).  Larger distances mean lower frequency limits.

There are some high voltage RF MOSFETs available, but they're usually rated for only 30-100MHz, for Vds in the 1200V to 600V range.  (Whereas a regular LDMOS transistor rated for 120V and 10A will do low GHz without a problem.)

I'm not aware that anyone is making power PHEMTs or MESFETs or GaAs/GaN FETs in high voltage ratings, for microwave service; there are GaN power switching transistors, which are relatively new, that are much faster than Si MOSFETs for the same applications.

So, where your question fits into all of this:
- For a given device, operating it at lower voltage and the same maximum current requires a correspondingly lower load impedance.  You have less V at the same I, so V*I = P is lower.  The reduced load impedance extends bandwidth by the same factor (but whether that is available as signal bandwidth depends on the design of the rest of the circuit).
- For an optimal device, you can match to a lower load impedance, and draw more current, keeping power constant.

Too-low impedance is also undesirable, due to stray inductance having the same bandwidth-limiting effect that capacitance has at high impedances.
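Here's a rough sketch of both effects together, assuming an idealized stage that swings the full supply (so R = Vpk^2 / 2P), with made-up values of 10pF output capacitance and 5nH of stray inductance:

Code: [Select]
from math import pi

P_OUT = 2.5      # watts, held constant across the comparison
C_OUT = 10e-12   # output capacitance (assumed)
L_STRAY = 5e-9   # stray lead/trace inductance (assumed)

for v_pk in (5.0, 12.0, 50.0):
    r = v_pk**2 / (2.0 * P_OUT)          # load needed for full swing at this supply
    f_rc = 1.0 / (2.0 * pi * r * C_OUT)  # capacitance corner (hurts at high R)
    f_lr = r / (2.0 * pi * L_STRAY)      # inductance corner (hurts at low R)
    print(f"Vpk={v_pk:4.1f}V  R={r:6.1f}R  RC corner={f_rc/1e6:7.0f}MHz  LR corner={f_lr/1e6:7.0f}MHz")

The usable corner is the lower of the two, so there's a middle range of load impedance (and hence supply voltage) where bandwidth is best; push R too far in either direction and one of the parasitics takes over.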

Tim
Seven Transistor Labs, LLC
Electronic design, from concept to prototype.
Bringing a project to life?  Send me a message!
 

Offline DrMag (Topic starter)

  • Regular Contributor
  • *
  • Posts: 61
I guess that should have been intuitive after all...

Just to be certain I understood clearly: if we assume the impedance of my antenna is fixed (50R-ish), then the only way to increase the operating power is to increase the driving voltage, since fixing the voltage would by definition fix the current.

If I power my transmitter with 5V, then the best I'd be able to get out of it is 250 mW, correct? ((5/sqrt(2))**2 / 50)
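Spelling that arithmetic out both ways (a quick sketch, assuming an ideal 50R load; which case applies depends on whether the output stage can swing 5V peak or only 5V peak-to-peak):

Code: [Select]
from math import sqrt

R = 50.0
for label, v_pk in (("5V peak swing", 5.0), ("5V peak-to-peak (+/-2.5V)", 2.5)):
    v_rms = v_pk / sqrt(2)
    print(f"{label}: {v_rms:.2f}V RMS -> {1000 * v_rms**2 / R:.1f} mW")
# -> 250.0 mW and 62.5 mW respectively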
 

Offline rfeecs

  • Frequent Contributor
  • **
  • Posts: 807
  • Country: us
Quote from: DrMag
I guess that should have been intuitive after all...

Just to be certain I understood clearly: if we assume the impedance of my antenna is fixed (50R-ish), then the only way to increase the operating power is to increase the driving voltage, since fixing the voltage would by definition fix the current.

If I power my transmitter with 5V, then the best I'd be able to get out of it is 250 mW, correct? ((5/sqrt(2))**2 / 50)

If you are using resistors to feed DC power to your amp instead of inductors, in a single-ended configuration, then you will get at most 5V peak-to-peak output, or ±2.5V peak, or 1.77V RMS, which is 62.5mW into a 50R load.

Which is why you wouldn't do it that way.

Most RF power amplifiers employ an impedance transformer, so with 5V you can get as much power as you need.  For example, 50 ohms can be transformed down to 5 ohms.  A 5V peak voltage swing into 5 ohms is 2.5W.  The transformer would step that up to 15.8V peak at the 50 ohm antenna for 2.5W out, assuming everything is ideal and lossless.
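Checking those numbers (a minimal sketch; the 5R device-side load and the lossless 10:1 impedance transformation are the assumptions from the paragraph above):

Code: [Select]
from math import sqrt

R_ANT = 50.0   # antenna impedance
R_DEV = 5.0    # device-side load after 10:1 impedance transformation

# Resistive feed, single-ended: at most 2.5V peak at the load.
p_direct = (2.5 / sqrt(2))**2 / R_ANT    # 0.0625 W = 62.5 mW

# Choke-fed stage swinging 5V peak into the transformed 5R load:
p_xfmr = 5.0**2 / (2.0 * R_DEV)          # 2.5 W
v_ant_pk = sqrt(2.0 * p_xfmr * R_ANT)    # 15.8 V peak at the antenna

print(p_direct, p_xfmr, v_ant_pk)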
 

Offline AF6LJ

  • Supporter
  • ****
  • Posts: 2902
  • Country: us
Quote from: DrMag
I've been working toward a project where I will be transmitting data using a home-brew transmitter, likely on either the 2m or 70cm band. As I've been researching the design requirements for the project, a question popped into my head that I'm not sure how to answer. It mostly relates to what I use to power the transmitter.

Is it better to operate a transmitter at a low voltage with high current, or at a high voltage with low current?

I suspect it's six of one, half a dozen of the other in the end, but would I run into any problems powering a transmitter at 5V as opposed to the 7.4V used in commercial devices?
Not knowing very much about what you are doing makes this difficult.
Generally it is good to run with the highest voltage you can get away with, but that is not always true.

It's hard to answer given that:
1. We don't know what power level you are considering.
2. You haven't decided on the band you want to use, which makes a big difference in the transistors you have to choose from.
3. We don't know what supply voltage you have available.
4. We don't know what emission mode you are planning to use to transmit your data.  We need to know this because your amplifier may have to be linear.

So at this point the possibilities are wide open.
Sue AF6LJ
 
The following users thanked this post: voltz

