Hey all,
I am working on understanding the error caused by the input bias currents of op amps. I know that FET-input op amps have much lower bias current than BJT-input op amps, so I took the TL081 and LM358 to experiment with. The circuit I am working with is an AC-coupled non-inverting amplifier with a DC offset of VCC/2. To create this DC offset, a simple voltage divider is used, as visible in the attached image. To improve on this design, an extra resistor can be added which "injects" the divider's DC offset onto the AC signal. This is of course so the lower half of the voltage divider can be bypassed with a capacitor, turning it into a low-pass filter and keeping some power supply noise out of the op amp.
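For anyone following along, the corner frequency of that supply-noise low-pass filter is set by the divider's Thevenin resistance and the bypass cap. A quick sketch (the 100k/100k divider and the 10 µF cap here are assumed example values, not from my actual circuit):

```python
import math

# Hypothetical values: a 100k/100k divider from VCC to ground,
# with a bypass capacitor across the lower resistor.
R1 = 100e3   # upper divider resistor (ohms)
R2 = 100e3   # lower divider resistor (ohms)
C = 10e-6    # bypass capacitor (farads), assumed value

# Supply noise sees R1 in series, with R2 and C to ground;
# the -3 dB corner is set by (R1 || R2) together with C.
R_th = R1 * R2 / (R1 + R2)          # 50 kohm Thevenin resistance
f_c = 1 / (2 * math.pi * R_th * C)  # corner frequency in Hz

print(f"Thevenin R = {R_th:.0f} ohm, corner = {f_c:.2f} Hz")
# -> Thevenin R = 50000 ohm, corner = 0.32 Hz
```

So with these example values everything above a fraction of a hertz on the supply gets attenuated before it reaches the + input.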
I've been experimenting with different values for the voltage divider resistors and the injection resistor to see when some kind of DC error starts showing up. For each op amp, the tests were:
1) Voltage dividers of 2x1k, 2x100k, and 2x2M, without the injection resistor.
2) As above, but with an injection resistor of 1k, 100k, and 2M, tested with each voltage divider pair.
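For each of these combinations, the DC source resistance the bias current has to flow through works out as below. A quick sketch, assuming the injection resistor sits between the divider tap and the + input, with the bypass cap (open at DC) across the lower divider resistor:

```python
# Source resistance seen by the op amp's + input for each test case.
# Without the injection resistor it is simply R || R; with it, R_inj
# adds in series (at DC the bypass cap is open, so the divider still
# looks like its Thevenin resistance R || R = R/2).
dividers = [1e3, 100e3, 2e6]           # two equal resistors each
injections = [None, 1e3, 100e3, 2e6]   # None = no injection resistor

for R in dividers:
    R_div = R / 2                      # R || R for equal resistors
    for R_inj in injections:
        R_th = R_div + (R_inj or 0)
        tag = f"R_inj = {R_inj:.0e}" if R_inj else "no injection"
        print(f"2x{R:.0e} divider, {tag}: R_th = {R_th:.3g} ohm")
```

The worst case among my tests is the 2x2M divider plus the 2M injection resistor, giving about 3 Mohm of source resistance at DC.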
I cranked the sine wave at the input until the output almost clipped at the op amp's headroom, then scoped the output while running experiments (1) and (2). But while doing this, none of the experiments seemed to have any dramatic effect on the quality of the sine wave at the output. The 1k divider significantly reduced the amplitude as expected, but adding the injection resistor fixed that.
My expectation was a large shift in DC offset resulting in heavy asymmetrical clipping, but that didn't happen. Am I not understanding this problem well? Is my experiment wrong (gain too low for it to have an effect)? Does input bias current not matter with AC-coupled signals (doubt it...)?
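For scale, the DC error I'd expect is roughly I_bias x R_source, amplified by the stage's DC gain (which is unity if the feedback leg is AC-coupled too). A back-of-envelope sketch using datasheet-typical room-temperature bias currents (these are typicals, not guaranteed maximums, so check your own datasheet):

```python
# Typical 25 C input bias currents from the datasheets (assumed typicals):
I_bias = {"TL081": 30e-12, "LM358": 45e-9}  # amps

R_src = 1e6   # 2x2M divider -> 1 Mohm Thevenin source resistance
dc_gain = 1   # AC-coupled feedback leg -> unity gain at DC (assumption)

for part, ib in I_bias.items():
    v_err = ib * R_src * dc_gain
    print(f"{part}: ~{v_err * 1e3:.3g} mV of DC shift at the output")
# -> TL081: ~0.03 mV, LM358: ~45 mV
```

If this estimate is right, even the LM358 in the worst case only shifts the output by tens of millivolts, which would be invisible next to volts of headroom and might explain why I never saw asymmetrical clipping.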
For further reading on this issue:
https://www.analog.com/en/analog-dialogue/articles/avoiding-op-amp-instability-problems.html under "Decoupling the Biasing Network from the Supply"
Please enlighten me!