Is it possible that it doesn't automatically generate a delay for a reason?
For starters, that minimum delay will depend on the output impedance of the voltage source being sampled.
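To put a rough number on that: the usual first-order estimate is that the hold capacitor has to settle to within half an LSB, which for an N-bit converter works out to about t_acq ≈ (R_source + R_internal) × C_hold × ln(2^(N+1)). Plugging in ballpark values in the style of a mid-range PIC datasheet (C_hold ≈ 10 pF, R_internal ≈ 8 kΩ) with a 10 kΩ source gives roughly 10 pF × 18 kΩ × 7.6 ≈ 1.4 µs, while a stiff low-impedance source can be sampled almost immediately. Treat those numbers purely as an illustration; the exact formula and values are in your part's ADC spec section.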
Second, even if you are sampling multiple pins, you might not need those delays at all.
You only need that delay after changing the ADC input, and nothing says the channel change has to happen right before you take the sample. You're fairly likely to be using a timer (or timers) to trigger periodic sampling of these pins, so if you select the next input right after your current read, the acquisition time elapses on its own and you won't necessarily need that delay, as in the sketch below.
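A minimal sketch of that ordering, using helper names in the style of what MCC typically generates for a PIC ADC (check your generated adc.h; the channel list and function names here are placeholders for whatever your project actually has):

```c
#include <stdint.h>
#include "adc.h"   // MCC-generated ADC driver (names may differ by MCC version/part)

// Pins to scan; AN0..AN2 are placeholders for your inputs.
static const adc_channel_t channels[] = { channel_AN0, channel_AN1, channel_AN2 };
static uint16_t results[3];
static uint8_t  idx = 0;

// At init, call ADC_SelectChannel(channels[0]) once so the first sample is valid.
// Then call this from whatever triggers your periodic sampling (timer flag/ISR).
void sample_next(void)
{
    // The mux was pointed at channels[idx] at the END of the previous call,
    // so the hold cap has had a whole sample period to charge: no delay here.
    ADC_StartConversion();
    while (!ADC_IsConversionDone())
        ;
    results[idx] = ADC_GetConversionResult();

    // Select the NEXT channel now, so it settles while the rest of the
    // program runs, instead of burning a blocking delay later.
    idx = (uint8_t)((idx + 1u) % (sizeof(channels) / sizeof(channels[0])));
    ADC_SelectChannel(channels[idx]);
}
```

The only requirement is that your sampling period is comfortably longer than the acquisition time, which it almost always is.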
*Addendum: If you left that delay in and just deducted that amount of time from your timer, that would be 200 microseconds of your main loop where the processor can't do anything useful, every cycle of your ADC read period. The processor can still be interrupted, but otherwise the program counter is just pissing away time it could spend on whatever else the program is tasked with when it isn't taking ADC readings. And of course, if you put that code in a top-priority ISR, it couldn't be interrupted at all.
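For contrast, this is the blocking pattern that addendum is describing, again with hypothetical MCC-style names and XC8's __delay_us() (which assumes _XTAL_FREQ is defined):

```c
// Blocking read: every call eats the full worst-case acquisition time.
uint16_t read_pin_blocking(adc_channel_t ch)
{
    ADC_SelectChannel(ch);
    __delay_us(200);            // CPU spins here doing nothing useful...
    ADC_StartConversion();
    while (!ADC_IsConversionDone())
        ;                       // ...and again here while the conversion runs.
    return ADC_GetConversionResult();
}
```

Called from a high-priority ISR, all of that dead time is time during which nothing else in the system can run.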
**Additional Addendum: This is just one potential example, and the cons might not be a big deal for a given project. I just wanted to throw that out there.
***Another Additional Addendum: In places where you do need an inline delay but have too many other high-priority tasks to handle through ISRs (because they're all spoken for, perhaps), a dab of assembly lets you code software-interrupt-style dispatching right into a custom delay. I bet you can do that in C just fine, too.
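A C flavor of that idea, a delay that services pending work instead of idling, might look something like this. The tick counter and "work pending" flags are made up for the example; you'd substitute whatever time base and background jobs your project actually has:

```c
#include <stdbool.h>
#include <stdint.h>

extern volatile uint32_t g_ticks_us;       // assumed to be maintained by a timer ISR
extern volatile bool     g_uart_pending;   // hypothetical "work waiting" flags
extern volatile bool     g_spi_pending;

static void service_background_tasks(void)
{
    if (g_uart_pending) { g_uart_pending = false; /* drain the UART buffer here */ }
    if (g_spi_pending)  { g_spi_pending  = false; /* kick off the SPI transfer here */ }
}

// A delay that isn't dead time: keep dispatching lower-priority work
// until the requested interval has elapsed. (On an 8-bit part you'd
// want to read the tick counter atomically.)
void cooperative_delay_us(uint32_t us)
{
    uint32_t start = g_ticks_us;
    while ((uint32_t)(g_ticks_us - start) < us)
        service_background_tasks();
}
```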
It might be better to think of MCC as a device-specific code template generator that spits out the bare minimum needed to operate the peripheral, rather than an IDE "smart" feature that starts defining the sandbox without knowing what you intend to do with it.