Currently, I am studying transistors and the equations necessary to design circuits around them. I have learned that when using them as a switch, the base current dictates the collector current. If I calculate the base current to suit the requirements of the load, why would there be a need to limit the current after the fact? Will the transistor limit it based on base current? I ask because I see lots of circuits where a transistor is used to turn on LEDs, and most of them have current limiting resistors with them.
Thanks for any input.
Transistors aren't inherently switches; they are amplifiers, so in any application, be it a "Class A" audio amplifier or a switch, the base current dictates the collector current.
You could adjust your base current on test, so that your collector current was just right to light your LED.
In this mode, even though you only supply this forward bias when you need the LED on, your transistor is not a switch, but a DC current amplifier.
Here comes a big "BUT": Hfe is not controlled very accurately over millions of transistors of the same type, so you would have to "tweak" each and every such circuit individually.
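To see how much the Hfe spread matters, here is a small sketch with assumed, illustrative numbers: a hypothetical small-signal transistor whose datasheet allows Hfe anywhere from 100 to 300, driven with a base current chosen for 20 mA at the "typical" gain of 200.

```python
# Assumed datasheet spread for an illustrative small-signal transistor:
# Hfe may be anywhere from 100 to 300 for the same part number.
base_current_ma = 0.1  # chosen so that Ic = 20 mA at the "typical" Hfe of 200

for hfe in (100, 200, 300):
    # In the linear (amplifier) region, Ic is roughly Hfe * Ib.
    collector_current_ma = hfe * base_current_ma
    print(f"Hfe = {hfe}: Ic = {collector_current_ma:.0f} mA")
```

With the same base current, the LED current lands anywhere from 10 mA to 30 mA depending on which transistor came off the production line, which is exactly why each circuit would need individual tweaking.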
This is a problem with any type of transistor circuit, so practical circuits use "work-arounds" which remove the reliance upon device parameters, to such an extent that in many cases you can replace a device with another similar one having a totally different part number.
Getting back to your "switch": if the transistor is taking the place of a resistor in series with an LED, all the power which would normally be dissipated in that resistor must now be dissipated by the transistor.
If we include a resistor in series with the LED, and arrange for the transistor to be turned on "hard", it has only a very low voltage drop between collector and emitter for the same collector current, and dissipates much less power.
At the same time, the current through the LED is mainly controlled by the series resistor.
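A quick worked comparison makes the point concrete. The supply voltage, LED forward drop, target current, and Vce(sat) below are all assumed illustrative values, not from the original post.

```python
# Assumed values: 5 V supply, 2 V LED forward drop, 20 mA target current.
v_supply, v_led, i_led = 5.0, 2.0, 0.020

# Case 1: transistor in the linear region takes the place of the resistor,
# so it must drop the entire remaining voltage itself.
p_transistor_linear = (v_supply - v_led) * i_led  # 60 mW, all in the transistor

# Case 2: transistor turned on "hard" (Vce(sat) ~ 0.2 V assumed) with a
# series resistor that sets the LED current.
v_ce_sat = 0.2
r_series = (v_supply - v_led - v_ce_sat) / i_led   # resistor value that fixes Ic
p_transistor_sat = v_ce_sat * i_led                # dissipated in the transistor
p_resistor = (v_supply - v_led - v_ce_sat) * i_led # dissipated in the resistor

print(f"Series resistor: {r_series:.0f} ohm")
print(f"Transistor (linear): {p_transistor_linear * 1000:.0f} mW")
print(f"Transistor (saturated): {p_transistor_sat * 1000:.0f} mW")
print(f"Resistor: {p_resistor * 1000:.0f} mW")
```

Under these assumptions the saturated transistor dissipates only about 4 mW instead of 60 mW, the cheap resistor takes the remaining 56 mW, and the LED current is set by the resistor value rather than by the poorly controlled Hfe.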