Forgive me for not having the time to dig deeply into your problem, but from reading your responses to others, I gather that your 1 Hz square-wave output is riding on a DC offset from the battery, and that the battery voltage varies as its load changes.
Doesn't that imply that the battery voltage can be measured independently? That there's some point at which all the separate cells are tied together? If the battery's ESR isn't too high (and it shouldn't be for an EV application, given the peak current requirements), then presumably your 1 Hz square wave doesn't modulate the actual battery voltage.

If that's true, you should be able to take the actual battery voltage and do a simple subtraction with an opamp to remove the bias from your 1 Hz output waveform. It might take some scaling to calibrate, but once dialed in, the battery's "variable" output voltage would feed one opamp input and be subtracted from your output waveform, which contains that same battery voltage as an undesired DC offset. As the battery voltage varies, so does your DC offset - and so does the input to the opamp. No filters, no phase problems, no firmware.
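Just to make the idea concrete, here's a quick numerical sketch of the subtraction (all the numbers - 12 V pack, 0.8 V of sag, sample rate - are made up for illustration; the real circuit would be a difference amplifier doing this in analog):

```python
import numpy as np

fs = 1000                      # sample rate in Hz (arbitrary, for the simulation)
t = np.arange(0, 2, 1 / fs)    # two seconds of signal

# The wanted 1 Hz square wave, +/-0.5 V (made-up amplitude)
square = 0.5 * np.sign(np.sin(2 * np.pi * 1 * t))

# Battery rail sagging from 12.0 V to 11.2 V under load (made-up sag)
v_batt = 12.0 - 0.8 * t / t[-1]

measured = v_batt + square     # output waveform riding on the drifting DC offset
recovered = measured - v_batt  # opamp-style subtraction of the sensed battery voltage

# The recovered waveform tracks the original square wave no matter how the rail sags
print(np.allclose(recovered, square))
```

The point is that the correction is driven by the battery itself, so there's nothing to track or filter: however the rail moves, the offset and the subtracted reference move together.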
Again, I apologize for not digging deeper - perhaps there's a reason you can't do this - but from the 100,000-foot view, this is where I'd start.