I've attached a schematic of a possible LED driver circuit that I want to make sure is correct. I'm using the http://www.diodes.com/_files/datasheets/AP3211.pdf as a buck converter at 2.4 V. It has a feedback voltage of 0.81 V, so I set R1 and R2 to 10 kΩ and 5.1 kΩ respectively. This should give me an output voltage of 2.4 V, just above the forward voltage of the LEDs at 2.2 V. The MCU controls the LEDs via BJTs that sink the current to ground. Does anyone see anything wrong with my design? Is a 0.5 A diode OK to use if the circuit will be drawing between 400-500 mA? Did I do the math correctly on the resistors?
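For what it's worth, the divider maths can be sanity-checked with the usual formula for this kind of converter, Vout = Vfb × (1 + R1/R2), assuming R1 is the top resistor (Vout to FB) and R2 the bottom one (FB to GND) — a quick sketch:

```python
# Quick check of the feedback divider maths
# (assumes R1 = Vout -> FB, R2 = FB -> GND, per the usual buck topology)
v_fb = 0.81      # feedback reference voltage, V
r1 = 10_000      # ohms
r2 = 5_100       # ohms

v_out = v_fb * (1 + r1 / r2)
print(f"Vout = {v_out:.2f} V")  # -> Vout = 2.40 V
```

So the resistor maths checks out: the divider gives about 2.40 V out.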
You seem to be progressing nicely.
The voltage drop across an LED (Vf) is not rigidly fixed. It varies from device to device, with temperature, and with LED colour, so trying to give it exactly the right voltage is not the best way of doing it. You need to use something which controls and limits the current, or use resistors and a slightly higher drive voltage, to give the resistors some headroom.
That's the advantage of switch-mode LED drivers: as well as the buck conversion, they regulate the current for you, for all 4 channels. But they are a bit harder to use, so I understand if you want to avoid them.
If you raise the voltage to about 3.5 V (though that is a bit low and tight), or better 4 V or more, you can then calculate resistors (in series with each LED) which will give the desired current (100 mA). The datasheet you supplied gives the typical forward voltages. Since Vf changes with LED colour, you will probably need a different resistor value for each colour.
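As a sketch of the series-resistor maths (the 4 V rail, 2.2 V Vf, and 100 mA figures are just the example numbers from this thread — your real Vf will differ per colour):

```python
# Series resistor for an LED: R = (Vsupply - Vf) / I
v_supply = 4.0   # V, the raised rail suggested above (assumed)
v_f = 2.2        # V, LED forward voltage (varies per colour!)
i_led = 0.100    # A, desired LED current

r = (v_supply - v_f) / i_led
p = (v_supply - v_f) * i_led   # power dissipated in the resistor
print(f"R = {r:.0f} ohms, dissipating {p:.2f} W")  # -> R = 18 ohms, dissipating 0.18 W
```

Round to the nearest standard E12/E24 value and check the resistor's power rating.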
Here is an online LED resistor calculator, or better still do the maths yourself: http://ledcalc.com/

Opinions will probably vary as to how much more voltage you need, if using resistors.
Also, the transistors will have some voltage drop across them, depending on how saturated each transistor is, which in turn depends on the base current and on which transistor you use (see its datasheet).
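That drop can be folded into the same resistor calculation. A Vce(sat) of roughly 0.2 V is a common ballpark for a small-signal BJT, but that figure is an assumption here — check your transistor's datasheet at your base and collector currents:

```python
# Series resistor including the BJT saturation drop:
# R = (Vsupply - Vf - Vce_sat) / I
v_supply = 4.0    # V (assumed rail from the example above)
v_f = 2.2         # V, LED forward voltage (assumed)
v_ce_sat = 0.2    # V, ballpark small-signal BJT Vce(sat) (assumed, check datasheet)
i_led = 0.100     # A, desired LED current

r = (v_supply - v_f - v_ce_sat) / i_led
print(f"R = {r:.0f} ohms")  # -> R = 16 ohms
```

With only ~1.6 V of headroom, a 0.2 V error in Vf or Vce(sat) shifts the current noticeably, which is another argument for a rail of 4 V or more.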