So here are the results with the real circuit!
With the input of the first op amp shorted to ground, I'm getting about 8 nV/sqrt(Hz), which is perfectly fine with me for now, until I shield the whole circuit properly.
I built the whole measurement setup: I connected the MCU I want to use in this application, recorded 100 ms of data, calculated the RMS, then divided it by the gain and by sqrt(noise bandwidth of the filter) to get the input-referred noise density. I think the circuit is mostly a success.
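For reference, here's a minimal sketch of that calculation in Python. The gain, noise bandwidth, and input file are placeholder assumptions, not my actual values; substitute your own:

```python
import numpy as np

# Hypothetical values -- substitute the actual circuit parameters.
GAIN = 1000.0   # total amplifier gain (V/V)
NBW_HZ = 10e3   # equivalent noise bandwidth of the filter (Hz)

# 'samples' is the 100 ms record from the MCU's ADC, converted to volts.
samples = np.loadtxt("adc_record.txt")   # placeholder input file
samples = samples - np.mean(samples)     # remove the DC offset first

rms_out = np.sqrt(np.mean(samples**2))            # RMS at the amplifier output
density = rms_out / GAIN / np.sqrt(NBW_HZ)        # input-referred V/sqrt(Hz)

print(f"input-referred noise: {density * 1e9:.1f} nV/sqrt(Hz)")
```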
But then I have a problem: I use an LM2594 DC-DC converter running at 140 kHz to generate the negative -2.5 V rail for the op amps. When I connect the transformer to the input of the op amp (and remove the short between the op amp's input and GND, of course), somehow the 140 kHz and 280 kHz from the converter get coupled in so strongly that the output of the circuit saturates.
I've added a lot of filtering on the -2.5 V rail, including an additional LC filter, which should bring the ripple on that rail down to about 2 uV. Since the rail itself should be that clean, I conclude that the transformer is most likely picking up the switching noise directly. What do you think?
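As a sanity check on that 2 uV figure, here's a rough estimate of the LC filter's attenuation at the switching frequency, using the ideal second-order roll-off and ignoring ESR and self-resonance. The L, C, and raw ripple values below are made-up assumptions for illustration, not my actual parts:

```python
import math

# Hypothetical filter values -- substitute the actual L and C.
L = 100e-6      # filter inductance (H)
C = 100e-6      # filter capacitance (F)
f_sw = 140e3    # LM2594 switching frequency (Hz)

f_c = 1.0 / (2.0 * math.pi * math.sqrt(L * C))  # LC corner frequency
atten = (f_sw / f_c) ** 2                       # ideal -40 dB/decade roll-off

ripple_in = 20e-3   # assumed raw ripple out of the converter (V)
ripple_out = ripple_in / atten

print(f"corner: {f_c/1e3:.2f} kHz, "
      f"attenuation at {f_sw/1e3:.0f} kHz: {20*math.log10(atten):.0f} dB, "
      f"residual ripple: {ripple_out*1e6:.2f} uV")
```

With those assumed values the corner lands around 1.6 kHz, giving roughly 78 dB of attenuation at 140 kHz and a few uV of residual ripple, which is consistent with the rail not being the problem.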
I probably should have used a pot core for the transformer because it's better shielded, but honestly I thought that toroids don't leak magnetic field and therefore don't pick it up either.
I've tried putting a small tin box over the toroid, but it didn't change much.