I'm glad someone brought this up.
I'm working on a circuit where I have a voltage of, for example, 6.7 V. I need to get it down to a maximum of 5.5 V for logic gates. Space on my board is limited, and since the logic gates require very little current, I'm thinking a 5.1 V zener would be perfect for this, vs. a voltage regulator (which requires more board space and more components).
Thoughts?
Someone did mention that zeners are useless compared to a voltage regulator if you need any significant amount of current.
What does your actual circuit look like? Do you want to regulate the supply voltage to the logic chips, the inputs, or both?
If it's for the supply, and you will only draw a low current, you can get away with connecting an ordinary diode (not a zener) in series with the supply to the logic chips to produce the desired voltage drop. For example, two 1N4148 diodes in series with the supply will drop it by about 1.4 to 2 V, depending on the current. Also place decoupling caps to ground on the logic-chip side of the diodes.
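The arithmetic above can be sketched quickly. This is just illustrative: the ~0.7 V per diode is a nominal assumption, and a real 1N4148's forward drop varies roughly between 0.6 V and 1 V with current.

```python
# Sketch: estimating the supply rail left after series silicon diodes.
# v_f_per_diode = 0.7 V is an assumed nominal forward drop; the actual
# 1N4148 drop depends on the forward current.

def dropped_supply(v_in, n_diodes, v_f_per_diode=0.7):
    """Supply voltage seen by the logic chips after n series diodes."""
    return v_in - n_diodes * v_f_per_diode

v_logic = dropped_supply(6.7, 2)  # two diodes leave roughly 5.3 V
```

At higher currents the per-diode drop creeps toward 1 V, so the rail could sag closer to 4.7 V; that's the "1.4 to 2 V depending on the current" range mentioned above.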
If it's for the inputs, a series resistor (maybe 1 k), or a series resistor plus a zener, might do the trick. It depends on how sensitive the inputs of the logic chips are.
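To get a feel for the input-clamp case, here's a rough sketch of the current the driving signal would push through that series resistor into a 5.1 V clamp. The 1 k value is the hypothetical one suggested above, not a fixed recommendation:

```python
# Sketch: current through a series resistor into a zener input clamp.
# Assumes the zener holds the input node at v_clamp once it conducts.

def clamp_current(v_drive, v_clamp, r_series):
    """Current (A) forced into the clamp; zero if the drive is below it."""
    return max(0.0, (v_drive - v_clamp) / r_series)

i_clamp = clamp_current(6.7, 5.1, 1000)  # about 1.6 mA into the zener
```

A milliamp or two is well within the rating of a small zener, which is why this works for signal inputs even though it's a poor choice for a supply rail.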
A zener regulator is not recommended for most applications because it wastes power. The zener shunts current to hold its nominal voltage, so you need a series resistor to limit that current. But the maximum current draw of the circuit sets a maximum value for the resistor: otherwise the drop across it grows when the circuit draws more current, and the resulting supply voltage sags too low. On the other hand, the low resistance you're forced into means that when the load current drops, the zener must conduct the difference to hold the voltage down. In effect, a zener regulator draws roughly the maximum power of the load at all times, if the load is variable.
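The trade-off above can be put into numbers. This is a sketch with assumed values (6.7 V input, 5.1 V zener, 20 mA maximum load, 5 mA minimum zener current to stay in regulation), not sizing advice for any particular part:

```python
# Sketch of the zener-regulator sizing trade-off described above.
# All component values are illustrative assumptions.

def zener_series_resistor(v_in, v_z, i_load_max, i_z_min):
    """Largest series resistor that still holds v_z at full load."""
    return (v_in - v_z) / (i_load_max + i_z_min)

def zener_power_no_load(v_in, v_z, r_series):
    """Power the zener must shunt when the load draws nothing."""
    i_total = (v_in - v_z) / r_series
    return v_z * i_total

r = zener_series_resistor(6.7, 5.1, 0.020, 0.005)  # about 64 ohms
p = zener_power_no_load(6.7, 5.1, r)               # about 0.13 W in the zener
```

Note that the full 25 mA flows whether the load uses it or not; at no load the zener eats all of it, which is exactly the "always consumes the maximum power" behaviour described above.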