...For example, on one part Microchip defines the "dropout voltage" to be 100 mV at the given datasheet output current, measured as the input comes down from a 1 volt input-to-output differential. That would mean that for a 3.3 volt output you'd have to be sure the device always had at least 3.4 volts at its input...
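To make that rule-of-thumb reading concrete, here is a minimal sketch (Python, function name is my own) of the naive interpretation: minimum input equals nominal output plus the datasheet dropout figure. As the rest of this thread shows, this is not the whole story.

```python
# Naive reading of a dropout spec: V_in(min) = V_out(nominal) + V_dropout.
# Illustrative sketch only; real datasheets specify dropout at a particular
# load current and with the output already sagging, as discussed below.

def min_input_naive(v_out_nominal: float, v_dropout: float) -> float:
    """Rule-of-thumb minimum input voltage for an LDO."""
    return v_out_nominal + v_dropout

print(min_input_naive(3.3, 0.100))  # 3.4 V for a 3.3 V LDO with a 100 mV dropout spec
```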
I would recommend reading TI's slup239a.pdf on the dropout voltage and what it really means.
You will see it is not as simple a topic as you make it out to be.
This paper provides a basic understanding of the dropout performance of a low dropout linear regulator (LDO). It shows how both LDO and system parameters affect an LDO’s dropout performance, as well as how operating an LDO in, or near, dropout affects other device parameters. Most importantly, this paper explains how to interpret an LDO’s datasheet to determine the dropout voltage under operating conditions not specifically stated in the datasheet.
The perception that an LDO with a "100 mV dropout voltage" only needs an input more than 100 mV above the output (at the given Iout) to regulate properly is simply wrong.
Hello again,
You've made this topic even more interesting now. I for one appreciate that.
I think I see what you are saying here. The way "dropout voltage" is specified is sort of back-asswards.
However, does it make a large difference?
What you are showing here is that if we had a 5 volt regulator with a dropout of 1.2 volts, we would probably assume we needed 6.2 volts input to get that 5 volt output, because the input-to-output differential is 1.2 volts and 5 + 1.2 = 6.2 volts.
In reality, if the dropout is specified as the input-to-output differential voltage WHEN the output has fallen to 100 mV less than the nominal value, then the input voltage would really have to be a little bit higher than that to maintain a perfect 5.000 volt output (in theory, with the output current at the specified level).
With that second statement in mind, we would measure the differential voltage when the output got down to 4.9 volts, because that is 100 mV less than the nominal value of 5 volts.
So let's go with this and see what happens.
If the differential voltage of 1.2 volts is specified at 800 mA, and we know the output at that point is 4.900 volts, then (assuming a resistive load of 4.900/0.800 = 6.125 Ohms) the current would go up once we finally got that desired 5.000 volt output. It would go up to about 0.816 amps.
Assuming the internal resistance stays roughly constant over that small increase in output, the internal resistance to start with would have been R = (6.1 - 4.9)/0.800 = 1.5 Ohms, and the new current of 0.816 amps means a voltage drop of about 1.224 volts, so we would really need 5 + 1.224 = 6.224 volts input instead of 6.200 volts in order to get 5.000 volts at the output.
What this means is that we would have to add about 24 mV to that original estimate of 6.200 volts in order to be sure we got 5.000 volts at the output.
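To keep the arithmetic in one place, here is a small sketch of the numbers above. It treats the regulator in dropout as a fixed series resistance and the load as purely resistive, which is the same assumption the estimate makes, not something any particular datasheet guarantees.

```python
# Numbers from the discussion: 5 V regulator, 1.2 V dropout specified at 800 mA,
# with the output allowed to sag 100 mV (to 4.900 V) at that measurement point.
# Model (an assumption for this estimate): the pass element in dropout acts like
# a fixed series resistance, and the load is purely resistive.

v_nom = 5.000          # nominal output voltage
v_drop_spec = 1.2      # specified dropout voltage
i_spec = 0.800         # current at which dropout is specified (A)
v_out_at_spec = 4.900  # output is 100 mV below nominal at the dropout point

v_in_at_spec = v_out_at_spec + v_drop_spec              # 6.100 V at the test point
r_load = v_out_at_spec / i_spec                         # 6.125 Ohm resistive load
r_internal = (v_in_at_spec - v_out_at_spec) / i_spec    # 1.2 / 0.8 = 1.5 Ohm

i_at_nominal = v_nom / r_load                           # ~0.816 A at a full 5.000 V out
v_in_needed = v_nom + i_at_nominal * r_internal         # ~6.224 V required input

print(round(i_at_nominal, 3), "A")
print(round(v_in_needed, 3), "V input needed")
print(round((v_in_needed - (v_nom + v_drop_spec)) * 1000, 1), "mV above the 6.200 V estimate")
```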
So now the question becomes, what would we get at the output with just 6.200 volts input?
If we assume the max current of 816 mA at 5 volts even though we may have slightly less than 5 volts, that would be the worst-case current. With that worst-case current of 816 mA the load resistance works out to about 6.127 Ohms, and using the voltage divider formula (with the 1.5 Ohm internal resistance) that gives an output of close to 4.981 volts. That's an error of about 0.38 percent (slightly greater than one-third of one percent). So we lose less than one-half of one percent if we go with the theoretically less accurate rule-of-thumb estimate.
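The same simple series-resistance model (my assumption, continued from above) gives the "only 6.200 volts in" case as a plain voltage divider between the 1.5 Ohm internal resistance and the load:

```python
# Continuing the resistive model: with only 6.200 V at the input, the 1.5 Ohm
# internal resistance and the load form a simple voltage divider.
v_in = 6.200
r_internal = 1.5
r_load = 5.000 / 0.816                           # ~6.127 Ohm, worst-case load at 816 mA

v_out = v_in * r_load / (r_load + r_internal)    # ~4.981 V
error_pct = (5.000 - v_out) / 5.000 * 100        # ~0.38 % low

print(round(v_out, 3), "V out,", round(error_pct, 2), "% below nominal")
```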
The question now becomes, what types of applications would suffer because of this inaccuracy?
Regular run-of-the-mill applications would probably not be affected, but voltage-reference-type applications might suffer, so for those the effect would have to be considered on a case-by-case basis.
Feel free to go over this and offer a correction if needed or whatever else you want to add.