Did my subject line catch your attention? Yes, that's right -- if you source current from Ch3+ and sink it into Ch2-, the current flows through the voltage sense wires and induces a drop (and therefore an error in the voltage readout/regulation) of up to 57 mV, which is well outside spec (at 1 V: 0.05% + 10 mV = 10.5 mV). The sense wires appear to have a resistance of about 19 mΩ, so 3 A through them drops 3 A × 19 mΩ = 57 mV. Here's a specific example:
Configure both outputs to 1 V with 3 A current limits. Attach a voltmeter across Ch2.
A. Measure voltage. You get 1.000V. Win!
B. Short out Ch3+ to Ch3-. No worries, you still get 1.000V. Win!
C. Now, instead, short Ch3+ to Ch2-. You get 0.943V, even though the power supply's display still reports 1.00V. Boo!
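To make the arithmetic behind step C concrete, here's a quick back-of-the-envelope check (the 19 mΩ sense-wire resistance is my own measured estimate, and I'm assuming the full 3 A current limit returns through the Ch2 sense wire):

```python
# Back-of-the-envelope check of the sense-wire drop described above.
# Assumptions: full 3 A returning through a ~19 mOhm sense wire.

I_return = 3.0    # A, Ch3 current limit, returned into Ch2-
R_sense = 0.019   # Ohm, estimated sense-wire resistance
V_set = 1.0       # V, programmed Ch2 output

drop = I_return * R_sense   # V dropped across the sense wire
measured = V_set - drop     # what the external voltmeter actually sees

spec_limit = 0.0005 * V_set + 0.010  # 0.05% + 10 mV accuracy spec

print(f"drop       = {drop * 1000:.0f} mV")        # 57 mV
print(f"measured   = {measured:.3f} V")            # 0.943 V
print(f"spec limit = {spec_limit * 1000:.1f} mV")  # 10.5 mV
print("out of spec:", drop > spec_limit)           # True
```

So the supply regulates its sensed 1.000 V faithfully, but the terminal you're actually measuring sits 57 mV lower -- more than 5× the accuracy spec.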
You might say I'm being a bit silly, returning the current to the "wrong" common terminal. But I'm of the opinion that common terminals should be just that: common, completely identical in every way. More to the point, if you configure a system with ±12V rails for your audio power output (for example) and +5V for some logic, then you've got a common ground all around, and you should be able to run just one ground wire from the power supply to the circuit under test. Having to carefully make sure that every electron has an easy path back to its "home" negative terminal seems a bit nuts.
I mean, the whole point of sense wires is that they're supposed to have (practically) zero current, and therefore zero voltage drop, right?
What do you guys think? Am I being pedantic and making a fuss, or is this standard practice for non-isolated channels on power supplies? Is it generally known in the profession to be bad practice to rely on common terminals being, well, common? Or is this a real problem particular to this power supply?
Update: Threw together a video to demonstrate this: