I've been reading about driving MOSFETs and the different methods of doing so. I don't understand all of it, so I'm trying to clarify a few things to get a better idea of how to go about it. In other words, there's no single question here; I'm trying to check whether I've understood what I've been reading, and I couldn't see a way to compose these questions without posting several on the same topic.
A MOSFET's gate behaves like a small capacitor that has to be either charged or discharged in order to turn the MOSFET on or off. Assuming that an intermediate state is undesirable when PWM-ing a MOSFET (it should be either completely on or completely off), the gate has to be charged or discharged as fast as possible (the driving waveform has to be as close to square as possible). A partially charged gate leaves the MOSFET in a high-resistance state, dissipating energy unnecessarily.
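To put some illustrative (and entirely made-up) numbers on that last point, conduction loss is roughly

$$ P \approx I_D^2 \, R_{DS} $$

so a MOSFET with RDS(on) = 10 mΩ carrying 5 A dissipates about 0.25 W when fully on, but if a partially charged gate leaves it at, say, 1 Ω, that becomes 25 W.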
At very low switching speeds, something like an IO pin on an MCU would be enough. A slight step up would be passing the IO signal through an inverter/buffer, which has slightly more drive strength.
At all but the lowest frequencies you need to dedicate some external components to charging and discharging the gate. In the simplest form that would be an NPN and a PNP BJT in a push-pull arrangement. At higher switching speeds you'd use a dedicated MOSFET driver.
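If I've understood why this ladder of drivers exists, it comes down to the transition time you get out of a limited drive current, roughly

$$ t_{transition} \approx \frac{Q_g}{I_{drive}} $$

So (with hypothetical numbers) an MCU pin sourcing 20 mA into a gate with a total gate charge Qg = 50 nC would take roughly 2.5 µs per edge, while a dedicated 2 A driver would do the same job in about 25 ns.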
Assuming that drive voltage isn't a problem (you're driving either a low-side N-channel or a high-side P-channel, so in neither case is there a need to bootstrap), is current the only factor in driving a MOSFET at high frequencies?
A slow transition on the part of the driver would mean a less square waveform, which would lead to more time spent half-on, and therefore more energy lost as heat. Is it possible for a driver (in whatever form it takes) to have a high current capability, and yet not be able to produce a "square enough" waveform?
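For reference, the rough hard-switching loss estimate I keep running into (assuming approximately linear voltage and current transitions) is

$$ P_{sw} \approx \frac{1}{2} \, V_{DS} \, I_D \, (t_r + t_f) \, f_{sw} $$

which, if I'm reading it right, is why the rise and fall times matter just as much as the driver's raw current rating.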
It seems that the larger the MOSFET, the higher its gate capacitance (is this always the case?). It's easier to drive a smaller MOSFET than a larger one because you have to move less charge into and out of its gate. And the faster you switch the MOSFET on and off, the more times per unit of time you have to charge and discharge that gate.
So there are two factors in determining how much power is required to switch a MOSFET: the gate capacitance and the frequency. Is there a practical way to calculate this? If I have the datasheet for a particular MOSFET, and I know at what frequency I want to drive it, is it possible to determine how much current I'm going to need in order to drive it effectively?
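From what I've gathered so far, the back-of-the-envelope version, using the total gate charge Qg from the datasheet, a gate drive voltage VGS, a switching frequency fsw, and a target transition time tsw, would be

$$ I_{g,avg} = Q_g \, f_{sw}, \qquad I_{g,peak} \approx \frac{Q_g}{t_{sw}}, \qquad P_{gate} = Q_g \, V_{GS} \, f_{sw} $$

with the peak current being what actually sizes the driver. Is that the right starting point?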
This seems like a matter of calculating the current required to charge and discharge the gate capacitance at the necessary rate, but almost every document on the subject insists that this is an oversimplification. In what way? (I'm guessing things like the Miller plateau and the nonlinear gate-drain capacitance come into play, but I don't know how to account for them.)
Is driving high-power MOSFETs at high frequencies always going to mean a lot of energy "lost" to charging and discharging the gate? Even if the MOSFET is switching very little current, as long as it's being driven at a high frequency (and it's a large MOSFET), will the circuit constantly be "burning" energy just to switch the MOSFET on and off?
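Plugging hypothetical numbers into the gate-power formula above: a large MOSFET with Qg = 200 nC, driven to 12 V at 500 kHz, gives

$$ P_{gate} = 200\,\text{nC} \times 12\,\text{V} \times 500\,\text{kHz} = 1.2\,\text{W} $$

dissipated in the gate-drive path regardless of how much load current the MOSFET carries, which is what prompted the question.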
I apologize in advance for the meandering post... Any feedback would be welcome.