Assuming the per-pixel noise is close to truly random, averaging four pixels should land closer to the true value than sampling a single pixel would (e.g. shooting 32MP and downsizing to 8MP instead of shooting 8MP natively).

In addition, this is helped by the fact that the SNR of image sensors scales with the _linear_ pixel dimension, not the area: a big pixel only has twice the SNR of a pixel with a quarter of its area (rather than the four times you would expect if SNR depended on the area).
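A minimal simulation of the averaging argument, with assumed numbers (a true signal of 100 and a per-pixel noise sigma of 10): the mean of four independent noisy samples has half the noise standard deviation of a single sample, i.e. twice the SNR.

```python
import random
import statistics

# Hypothetical illustration: per-pixel noise modeled as Gaussian.
# Averaging N independent samples shrinks the noise standard
# deviation by sqrt(N), so 4 pixels -> half the noise.
random.seed(42)
true_value = 100.0  # assumed true pixel value
sigma = 10.0        # assumed per-pixel noise standard deviation
trials = 100_000

single = [random.gauss(true_value, sigma) for _ in range(trials)]
averaged = [
    statistics.fmean(random.gauss(true_value, sigma) for _ in range(4))
    for _ in range(trials)
]

print(statistics.stdev(single))    # close to 10: noise of one pixel
print(statistics.stdev(averaged))  # close to 5: half the noise, twice the SNR
```

This is the same sqrt(N) behavior the quoted Imatest passage describes in terms of pixel area vs. noise voltage.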

Source:

http://www.imatest.com/docs/noise/

"Pixel size. Simply put, the larger the pixel, the more photons reach it, and hence the better the signal-to-noise ratio (SNR) for a given exposure. The number of electrons generated by the photons is proportional to the sensor area (as well as the quantum efficiency). Noise power is also proportional to the sensor area, but noise voltage is proportional to the square root of power and hence area. If you double the linear dimensions of a pixel, you double the SNR.

The electron capacity of a pixel is also proportional to its area. This directly affects dynamic range."

That's a lot of theory, but as a practical exercise just compare the per-pixel quality of your images at 100% vs. downsized to the 1MP version you'd create for posting on the web.

Edit/PS: Dynamic range is another story, and probably the real weak point of this approach.
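To see why dynamic range is the weak point, here is some back-of-the-envelope arithmetic with assumed numbers (a 40,000 e- full well for the big pixel and 4 e- RMS read noise per pixel, both illustrative): the quoted passage says full-well capacity scales with area, while read noise per pixel stays roughly constant, so summing four small pixels recovers the capacity but doubles the read noise.

```python
import math

# Illustrative arithmetic, not measured sensor data:
# full-well capacity assumed proportional to pixel area,
# read noise assumed roughly constant per pixel.
full_well_big = 40_000  # electrons, big pixel (assumed)
read_noise = 4.0        # electrons RMS per pixel (assumed)

full_well_small = full_well_big / 4            # quarter-area pixel
read_noise_binned = read_noise * math.sqrt(4)  # 4 read noises add in quadrature

def dr_stops(full_well, noise):
    """Engineering dynamic range in stops: log2(full well / read noise)."""
    return math.log2(full_well / noise)

print(f"big pixel:       {dr_stops(full_well_big, read_noise):.1f} stops")
print(f"small pixel:     {dr_stops(full_well_small, read_noise):.1f} stops")
print(f"4 small, summed: {dr_stops(full_well_big, read_noise_binned):.1f} stops")
```

With these numbers the summed result still trails the single big pixel by a stop, which is the sense in which downsizing recovers SNR but not dynamic range.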