I've modified a power supply from an old server to give it voltage and current control, from 1 to 14V with a 36A limit. The 1V minimum is because SMPSes aren't very happy below that, so part of the design was to add a 1V drop before the output: being able to go down to 0V is very useful for what I need.
I designed the circuit below to provide the 1V drop: it subtracts V_drop from V_supply, and the output is taken across RL.
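In other words, the intended relationship is simply

$$V_{RL} = V_{supply} - V_{drop} \approx V_{supply} - 1\,\mathrm{V},$$

so the 1 to 14V supply range maps to 0 to 13V at the output.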
At first I had major stability problems, but after reading up on op-amp compensation methods I found that C2, in conjunction with a gain >> 1 for U1, made the simulation stable across all my loads and over the full 1 to 14V V_supply range.
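For what it's worth, here is a rough phase-margin sketch that illustrates why running U1 at a gain well above 1 helped. All of the numbers in it are made up, not taken from my schematic, and it doesn't model C2 at all; it only shows the generic effect of a lower feedback factor pulling the loop-gain crossover below a second pole:

```python
import numpy as np

# Generic two-pole loop-gain sketch -- every value below is assumed, NOT
# from the real circuit. It only illustrates why a closed-loop gain >> 1
# (i.e. a smaller feedback factor) improves phase margin.
A0   = 1e5      # op-amp DC open-loop gain (assumed)
f_p1 = 10.0     # op-amp dominant pole, Hz (assumed)
f_p2 = 100e3    # second pole, e.g. from the pass transistor (assumed)

def phase_margin(noise_gain):
    f = np.logspace(0, 7, 200_000)   # sweep 1 Hz .. 10 MHz
    s = 2j * np.pi * f
    # two-pole open-loop op-amp model
    A = A0 / ((1 + s / (2 * np.pi * f_p1)) * (1 + s / (2 * np.pi * f_p2)))
    T = A / noise_gain                          # loop gain = A * beta
    i = np.argmin(np.abs(np.abs(T) - 1.0))      # unity-gain crossover index
    return 180.0 + np.degrees(np.angle(T[i]))

for ng in (1, 10, 100):
    print(f"noise gain {ng:>3}: phase margin ~ {phase_margin(ng):.0f} deg")
```

With these placeholder poles the phase margin climbs from roughly 18° at unity noise gain to 80-odd degrees at a noise gain of 100. In my simulation C2 presumably helps further by rolling the loop off even earlier, but I haven't modelled that here.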
Now the real circuit doesn't work nearly as well. For V_supply from 1V to 4V, U1 oscillates badly, but it is perfectly stable above that range. The oscillation frequency appears to be roughly 625Hz, as best I can tell from my analog scope's divisions.
Despite the oscillation, the output voltage is correct, but I know it shouldn't be doing this and I'm stumped as to why. My circuit is laid out on stripboard, which is probably the cause of some problems, but I still don't understand what's happening.
Edit: Scope is set to 5V/div and 2ms/div
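For reference, the 625Hz figure is just divisions-to-frequency arithmetic; the ~0.8 div/cycle reading is my estimate off the graticule:

```python
# Frequency from horizontal scope divisions: f = 1 / (divs_per_cycle * time_per_div)
time_per_div   = 2e-3   # 2 ms/div, as set above
divs_per_cycle = 0.8    # my estimated reading off the screen

print(f"{1 / (divs_per_cycle * time_per_div):.0f} Hz")   # -> 625 Hz
```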