I've been trying to design a slightly-better-than-trivial "UART over laser pointers" circuit, and this is what I ended up with:

Ideally, this should remove the ambient light from the signal.
The photodiode's pre-amplified output is split in two: one side is fed through an RC filter and a buffer, producing what is essentially the average DC level of the ambient light.
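(To make my intent explicit: the averaging filter's corner frequency is meant to sit well below the UART bit rate, so only the slow ambient component gets through. The component values below are just illustrative, not necessarily what's in my circuit:)

$$f_c = \frac{1}{2\pi R C} \approx \frac{1}{2\pi \cdot 100\,\mathrm{k\Omega} \cdot 1\,\mathrm{\mu F}} \approx 1.6\ \mathrm{Hz}$$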
So far so good.
This average level is then subtracted from the actual signal by a differential amplifier, and the remaining "useful" signal is amplified and thresholded to produce the 0/1 levels a computer can digest.
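(My mental model of the subtraction stage, assuming the usual four-resistor difference amplifier topology with matched resistor ratios, which is what I believe R6-R9 form:)

$$V_{out} = \frac{R_f}{R_{in}}\left(V_{signal} - V_{ambient}\right)$$

so with equal ratios the ambient term should cancel exactly and only the modulated part should remain.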
And here comes the problem: above a certain ambient light level, the output of IC2B starts tracking the brightness rather than cancelling it out.
It looks like a rising sheet of paper covering the signal, rather than the signal simply shifting up along with its zero point.
I've tried varying the R6-R9 values: the higher they are, the more light it takes before the output starts rising, but the less sensitive the circuit becomes. 100k seems to be the optimal value.
Other than that, nothing I tried made any difference.
I suspect the problem comes down to some case of "real op-amps don't work like ideal op-amps", but I don't know enough about it to even know where to start looking for the problem.
So, the question is: is this a workable design at all, and in either case, what am I doing wrong?