A simple way to reduce the required pre-bias voltage would be to connect another output from the microcontroller to the power supply's feedback resistor network to switch the supply voltage. It could normally run at, say, 140V to ensure that off elements don't glow even with a relatively low pre-bias voltage, but be periodically raised to 170V+ for 50µs or so to strike/ionize any digits which have changed or failed to strike earlier. Raising the anode voltage isn't strictly necessary, but it speeds up the strike time, which depends on a random event such as a charged particle initiating ionization. In the dark it could take a second or more (possibly never) for a digit to strike, but that would be unusual.
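As a rough sketch of the feedback trick: a boost converter regulates so that Vout = Vref × (1 + Rtop/Rbot), so an extra resistor switched to ground by the micro (open-drain style) appears in parallel with the bottom resistor and bumps Vout up. All component values and the 1.25V reference below are illustrative assumptions (MC34063-class controller), not values from this thread:

```python
# Back-of-envelope check of switching a boost converter's output via its
# feedback divider. All values are illustrative assumptions.
VREF = 1.25        # V, feedback reference, typical MC34063-class part (assumption)
RTOP = 1_000_000   # ohms, upper feedback resistor (assumption)
RBOT = 9_000       # ohms, lower feedback resistor (assumption)
RBOOST = 42_000    # ohms, extra resistor switched to ground by the MCU (assumption)

def vout(rtop, rbot):
    # Standard boost-converter feedback equation.
    return VREF * (1 + rtop / rbot)

def parallel(r1, r2):
    return r1 * r2 / (r1 + r2)

v_normal = vout(RTOP, RBOT)                    # MCU pin high-impedance
v_boost = vout(RTOP, parallel(RBOT, RBOOST))   # MCU pin pulled low

print(f"normal: {v_normal:.0f} V, strike boost: {v_boost:.0f} V")
# -> normal: 140 V, strike boost: 170 V
```

With those (assumed) values you get roughly the 140V run / 170V strike levels mentioned above; in practice you'd pick the divider around whatever regulator the clock actually uses.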
The anode resistor values would probably also need to be reduced, as the anode voltage will rise considerably when no digit, or only the DP, is lit, requiring a higher pre-bias voltage to prevent unlit elements glowing. E.g. Dave's 22k will cause a 22V rise in anode voltage when the 1.6mA digit current drops to the 0.6mA or less of the DP.
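That 22V figure is just Ohm's law on the change in resistor drop, using the currents quoted above:

```python
# Anode node rises by delta_I * R when the tube current falls, because
# less voltage is dropped across the anode resistor.
R_ANODE = 22_000   # ohms, Dave's anode resistor (from the post)
I_DIGIT = 1.6e-3   # A, full digit current (from the post)
I_DP = 0.6e-3      # A, decimal point only (from the post)

rise = (I_DIGIT - I_DP) * R_ANODE
print(f"anode voltage rise: {rise:.0f} V")  # -> anode voltage rise: 22 V
```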
You could probably get away with 33V drivers if all the Nixies are reasonably matched, but in general at least 50V would be required, even if you modulate the supply voltage, to allow for the relatively wide range of operating voltages of different Nixie types or changes over their lifetime.
Nixies are actually current-controlled devices, so ideally constant-current drivers would be used. Voltage drive plus resistors is a cheaper and more convenient proxy, but it's actually a bit tricky to design to accommodate all the variances in tube characteristics, power supply and driver voltages, component tolerances, etc. without a high-voltage supply of 180V+ and a high pre-bias of 70V+.
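To see why resistor drive needs so much headroom: the tube current is I = (Vsupply − Vmaintain)/R, so any spread in maintaining voltage turns directly into current spread, and the smaller the headroom the worse the relative error. The numbers below are illustrative assumptions for a generic IN-14-class tube, not measured values:

```python
# Worst-case current spread with simple resistor drive (illustrative
# assumptions, not measured tube data).
V_SUPPLY = 180.0      # V, HV supply (assumption)
V_MAINT_MIN = 130.0   # V, low end of maintaining voltage (assumption)
V_MAINT_MAX = 155.0   # V, high end over tube life (assumption)
R_ANODE = 15_000.0    # ohms, picked for a few mA at nominal (assumption)

# I = (Vsupply - Vmaintain) / R
i_max = (V_SUPPLY - V_MAINT_MIN) / R_ANODE
i_min = (V_SUPPLY - V_MAINT_MAX) / R_ANODE
print(f"current spread: {i_min*1e3:.2f} mA to {i_max*1e3:.2f} mA")
# -> current spread: 1.67 mA to 3.33 mA
```

A 25V spread in maintaining voltage against only 25-50V of headroom gives a 2:1 current spread; with a 180V+ supply the headroom is larger and the same tube spread matters much less, which is exactly why the high supply and high pre-bias make the resistor approach workable.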
The Burroughs application notes are helpful - see N102:
http://worldpowersystems.com/archives/Burroughs/
[EDIT] Here is another useful Burroughs datasheet/application note:
http://www.decadecounter.com/vta/pdf2/burroughs_616.pdf