I think phase shift is a result not so much of an alternating current or voltage as of any varying quantity.
Could a large capacitor be placed in parallel with the inductor to make it less inductive? Or would that make things just so much worse? Probably opening a can of worms here...
A varying quantity of current or voltage? Doesn't DC mean it will stay the same, provided the power supply is doing its job?
An electronic current (or voltage) source is typically a closed control loop: the output current is measured, compared against a target value, and the output voltage is adjusted to minimise the error. This does not happen instantaneously, so the adjustment is never exactly in phase with the measurement. Adding an inductive load increases that delay further, since the inductor resists changes in current. A fixed time delay equates to a phase shift that grows with frequency. Loop gain falls with increasing frequency at the same time, but if you accumulate enough phase shift (180°) before the loop gain has fallen to unity, your current source turns into an oscillator.
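To put rough numbers on that, here is a minimal sketch of the phase-margin arithmetic. All figures (unity-gain frequency, delay, pole count) are hypothetical examples, not data for any particular supply; the point is only that a fixed delay costs 360·f·τ degrees of lag, so the margin shrinks as frequency, delay, or added poles grow.

```python
def phase_margin_deg(f_unity_hz, delay_s, n_poles=1):
    """Rough phase margin of a control loop whose gain reaches unity
    at f_unity_hz, assuming each loop pole contributes its full 90
    degrees of lag there, plus a fixed time delay delay_s.

    A margin at or below zero means the loop rings or oscillates.
    """
    delay_lag = 360.0 * f_unity_hz * delay_s  # delay lag grows with frequency
    pole_lag = 90.0 * n_poles                 # lag from the loop pole(s)
    return 180.0 - pole_lag - delay_lag

# Hypothetical 10 kHz unity-gain loop with 5 us of measurement/adjust delay:
# the delay alone costs 360 * 10e3 * 5e-6 = 18 degrees of the margin.
print(phase_margin_deg(10e3, 5e-6))            # 72 degrees: comfortably stable

# An inductive load adds a pole and slows the loop response; with a second
# pole and 25 us of delay the margin goes negative and the loop oscillates.
print(phase_margin_deg(10e3, 25e-6, n_poles=2))  # -90 degrees: oscillator
```

This is deliberately crude (real poles contribute lag gradually, not a flat 90°), but it shows why the same DC setpoint can be stable with a resistive load and unstable with an inductive one.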
The fact that you are setting a DC current makes no difference; if you introduce enough phase shift, it will very likely oscillate. This is why I suggest you speak to the manufacturer about the application: they will (or should) have in-depth knowledge of their device's performance and should be able to tell you whether your load is likely to be problematic.