Hi everyone,
I'm trying to calculate the input current noise for an opamp, but the result doesn't agree with the datasheet. First I put a big resistor (10 MΩ) at the opamp's input and run a noise simulation like so:
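Roughly this netlist (a sketch; OPAMPX, the pin order and the supply values are placeholders for whatever the actual .cir model uses):

* input current noise test bench (sketch)
V1   vin  0   AC 1                 ; reference source for .noise
Rs   inp  vin 10Meg                ; big source resistor at the + input
XU1  inp  out vcc vee out OPAMPX   ; opamp wired as a unity-gain follower
Vcc  vcc  0   15
Vee  vee  0   -15
.noise V(out) V1 dec 100 10 100k
.lib opampx.cir
.end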
Supposedly most of the noise at the input should come from the noise current flowing through the big resistor (6.9 fA/√Hz at 1 kHz according to the datasheet), and the noise voltage contribution should be negligible (3.2 nV/√Hz at 1 kHz). The math checks out; the total input-referred noise density at 1 kHz should be:
V_RTI = √( (6.9e-15 × 10e6)² + (3.2e-9)² ) ≈ 69 nV/√Hz
And finally, since those 3.2 nV/√Hz are negligible, we can get the noise current back by dividing the total input noise by the source resistance:
i_noise = 69 nV / 10 MΩ = 6.9 fA
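For reference, I believe you can also read this straight off the .noise run with .meas directives (a sketch, assuming the unity-gain follower above so that V(inoise) is the input-referred density):

.meas NOISE vn1k FIND V(inoise) AT 1k   ; input-referred noise density at 1 kHz
.meas NOISE in1k PARAM vn1k/10e6        ; divide by Rs -> implied current noise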
The problem is that running it in LTspice gives a total input noise of 137 μV instead, which corresponds to a current of 13.7 pA, about three orders of magnitude above the datasheet!
The opamp model's .cir file states that current and voltage noise are appropriately simulated. I don't understand where the disparity stems from, but I can't go on optimizing the system's noise floor if the simulation is wrong in a way I don't understand :/
Here's a link to the sim files since the forum won't let me attach them:
https://drive.google.com/drive/folders/1zqDaC4hu1jqUYtOfWeSqcT3Ud-xvuTNG?usp=sharing