So, we have this nice requirement to build a voltage integrator that takes an exponential signal at its input (called VC_OUT). It is a negative exponential, decaying from -0.8 V to 0 V, and its decay time can vary anywhere between 5 and 100 µs. The purpose of the integrator is to encode the "area under the graph" of this signal into a signal called INTEG_OUT, whose amplitude is directly proportional to the decay time of the exponential. The end goal is to capture the peak of INTEG_OUT and feed it to an ADC that does not need to be particularly fast or accurate (as opposed to feeding the decaying exponential, perhaps inverted, directly to the ADC and doing digital signal analysis to determine the decay time).
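To make the intent concrete, here is the idealised relationship I am counting on, assuming an ideal inverting op-amp integrator with time constant RC (R and C stand for the integrator's resistor and capacitor, not specific values from the schematic):

$$
\int_0^\infty V_0\,e^{-t/\tau}\,dt = V_0\,\tau
\quad\Rightarrow\quad
V_{\text{INTEG\_OUT,peak}} \approx -\frac{1}{RC}\int_0^\infty V_0\,e^{-t/\tau}\,dt = -\frac{V_0\,\tau}{RC}
$$

With $V_0 = -0.8\ \text{V}$, the peak comes out positive and directly proportional to the decay time $\tau$, which is exactly the encoding I want.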
Anyway, we came up with the circuit in the first picture below, which seems fit for the purpose. As you can see in the second picture, the output signal (INTEG_OUT) has a variable amplitude that depends directly on the decay time of the input signal (VC_OUT). Don't mind the two MOSFETs; they are there to let us control exactly when integration starts, relative to the input signal.
However, the circuit has a problem. If I run the simulation for a given decay time and vary the supply voltage of the op-amp, I observe an unexpected and rather significant dependency of the output signal on that supply voltage (see the third picture). I am a bit puzzled, as I was under the impression that the operation of a voltage integrator does not depend on its supply voltage, as long as we do not run into output clipping.
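Just to show what I mean by "should not depend on the supply": a quick numerical sanity check of the ideal inverting-integrator model (R, C and V0 below are placeholder values, not the ones in my schematic) gives a peak proportional to the decay time, with no supply-voltage term appearing anywhere in the calculation:

```python
import numpy as np

# Placeholder component values for the ideal-integrator sketch,
# not the actual ones used in the simulated circuit.
R = 10e3      # integrator resistor, ohms (assumed)
C = 10e-9     # integrator capacitor, farads (assumed)
V0 = -0.8     # input peak, volts (decaying toward 0 V)

for tau in (5e-6, 20e-6, 100e-6):              # decay times from the 5...100 us range
    t = np.linspace(0, 10 * tau, 100_000)      # integrate well past the decay
    v_in = V0 * np.exp(-t / tau)
    # Trapezoidal area under VC_OUT, then the ideal inverting-integrator gain -1/(RC).
    area = np.sum(0.5 * (v_in[:-1] + v_in[1:]) * np.diff(t))
    v_out_peak = -area / (R * C)
    print(f"tau = {tau*1e6:5.1f} us -> INTEG_OUT peak ~ {v_out_peak:.3f} V")
```

The ideal model only knows about the input waveform and RC, which is why the supply dependence in the simulation surprises me.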
Please help me determine what is the cause of this dependency.
Best regards,
Cristian