I've been asked to design a load cell amplifier. The load cell has a sensitivity of 2 mV/V, so at the recommended 10 V excitation I should get 20 mV at the rated load.
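Spelling out that arithmetic (full-scale output is just sensitivity times excitation):

$$ V_\text{FS} = S \cdot V_\text{exc} = 2\ \tfrac{\text{mV}}{\text{V}} \times 10\ \text{V} = 20\ \text{mV} $$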
I'm using an instrumentation amplifier to amplify the differential voltage (usually around 2 mV, since we aren't running the cell at its rated load yet). The problem I keep running into is a lot of noise at both the input and the output, even with aggressive filtering at each. My guess is that the probes on my scope (a Rigol DS1052) are picking up RFI from a nearby radio tower (1.3 MHz). With nothing connected to the scope, the noise floor is a few hundred µV; with the probes connected, it rises to a few mV. Note that a few mV of pickup is larger than my ~2 mV input signal, so at the in-amp input the noise swamps what I'm trying to measure.
Even more peculiar: if I have a probe on the output of my in-amp and then try to measure anything with one of my multimeters, the scope shows a very strong 50 Hz signal. In other words, touching the in-amp with any sort of wire or cable induces 50 Hz. My guess here is that the leads are picking up 50 Hz mains. The meter's display also fluctuates by a few mV when connected to the in-amp output, so I can't even trust those readings. This has caused me a lot of headache, because my measured gain appears to vary with input voltage rather than staying constant - I suspect the noisy measurements are to blame.
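To put numbers on that suspicion (the pickup figure here is an assumption for illustration, not something I've isolated): if the in-amp sees the true signal plus induced pickup, the apparent gain is

$$ G_\text{apparent} = \frac{V_\text{out}}{V_\text{sig}} = G \, \frac{V_\text{sig} + V_\text{pickup}}{V_\text{sig}} $$

so with $V_\text{sig} \approx 2$ mV, even 0.5 mV of 50 Hz referred to the input swings the apparent gain by ±25% over a mains cycle, which would look exactly like a gain that changes with input.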
So I'm just looking for advice on measuring these low-voltage signals: how do I know, with reasonable certainty, that what I see on the scope is real? And how do I tell whether the meter's fluctuation is 50 Hz pickup rather than the output actually fluctuating?
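One check I've been considering, sketched below (it assumes I can export a scope capture as a CSV of time/voltage pairs; the filename and column layout are placeholders, not necessarily the DS1052's actual export format): take an FFT of the captured output and see whether the energy is concentrated at 50 Hz rather than spread broadband.

```python
import numpy as np

# Hypothetical CSV export from the scope: one header row,
# then rows of "time_s,volts". Adjust to the real format.
data = np.loadtxt("capture.csv", delimiter=",", skiprows=1)
t, v = data[:, 0], data[:, 1]

fs = 1.0 / np.mean(np.diff(t))   # sample rate recovered from the time column
v = v - np.mean(v)               # strip DC so it doesn't dominate the spectrum

# Windowed FFT magnitude; a Hann window keeps 50 Hz from smearing into
# neighboring bins.
spectrum = np.abs(np.fft.rfft(v * np.hanning(len(v))))
freqs = np.fft.rfftfreq(len(v), d=1.0 / fs)

peak = freqs[np.argmax(spectrum)]
print(f"dominant component: {peak:.1f} Hz")
```

If the dominant bin lands at 50 Hz (or its harmonics), the fluctuation is mains pickup; if it's broadband or sits elsewhere, something else is going on. But I'd still welcome pointers on doing this properly at the bench.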