If DC measurement is not required, it's better to use an impedance transformer: it lets you add gain or attenuation without adding noise. Just put a 2:1 voltage-ratio transformer on the ADC input. However, the transformer's frequency response may not be flat over a wide bandwidth, so you may need to add a digital FIR filter with a compensating response, which has to be calculated for the specific transformer at calibration time.
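A minimal sketch of that compensation step, assuming you have already measured the transformer's magnitude response at calibration time; the sample rate, frequency points and gain values below are hypothetical placeholders, not real calibration data:

```python
import numpy as np
from scipy.signal import firwin2, lfilter

fs = 1_000_000  # ADC sample rate, Hz (assumed)

# Calibration data (placeholders): normalized frequencies (0..1, 1 = Nyquist)
# and the measured transformer gain at each point.
freq_norm = np.array([0.0, 0.05, 0.1, 0.3, 0.6, 0.9, 1.0])
meas_gain = np.array([0.70, 0.95, 1.00, 1.00, 0.97, 0.85, 0.80])

# Compensation target: inverse of the measured response, so that
# transformer * FIR is roughly flat. Cap the boost so the filter does not
# amplify noise where the transformer rolls off heavily (it cannot restore
# DC, which the transformer blocks entirely).
comp_gain = np.clip(1.0 / meas_gain, None, 4.0)

# Design a linear-phase FIR matching that target magnitude response.
fir = firwin2(numtaps=101, freq=freq_norm, gain=comp_gain)

# Apply to the raw ADC samples (placeholder signal).
adc_samples = np.random.randn(10_000)
flattened = lfilter(fir, 1.0, adc_samples)
```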
If DC (or VLF) measurement is required, then you need to increase the Vref voltage of your ADC so that its input range matches the signal dynamic range. If the ADC doesn't allow that, you need to use a different ADC.
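A rough way to pick Vref, assuming a single-ended ADC whose full-scale input equals Vref (some parts use 2*Vref or a differential range, so check the datasheet); the resolution, signal peak and headroom figures are assumptions for illustration:

```python
n_bits = 16                        # assumed ADC resolution
v_signal_peak = 2.0                # expected signal peak, volts (assumed)
headroom = 1.1                     # ~10% headroom so normal peaks never clip

vref_required = v_signal_peak * headroom
lsb = vref_required / 2**n_bits    # quantization step at that Vref
print(f"Vref >= {vref_required:.2f} V, LSB = {lsb * 1e6:.1f} uV")
```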
Another way is to add an operational amplifier as a buffer before the ADC, but it will add some noise and distortion and will reduce the dynamic range, which may be critical if you use a modern high-resolution ADC. In addition, such a buffer (properly designed) can protect your ADC from voltage spikes, but you still need to protect the op-amp input according to its specification.
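A back-of-the-envelope sketch of how the buffer's noise eats into the ADC's dynamic range: independent noise sources add in quadrature, so the combined SNR is always worse than the ADC alone. All numbers below (Vref, datasheet SNR, op-amp noise density, bandwidth) are assumed for illustration:

```python
import numpy as np

vref = 4.0                          # ADC full-scale, volts (assumed)
adc_snr_db = 100.0                  # ADC datasheet SNR, dB (assumed)
buf_noise_density = 5e-9            # op-amp output noise, V/sqrt(Hz) (assumed)
bandwidth = 1e6                     # noise bandwidth, Hz (assumed)

v_fs_rms = vref / (2 * np.sqrt(2))  # RMS of a full-scale sine
adc_noise = v_fs_rms / 10**(adc_snr_db / 20)
buf_noise = buf_noise_density * np.sqrt(bandwidth)

# Uncorrelated noise sources combine as root-sum-of-squares.
total_noise = np.sqrt(adc_noise**2 + buf_noise**2)
snr_total_db = 20 * np.log10(v_fs_rms / total_noise)
print(f"ADC alone: {adc_snr_db:.1f} dB, with buffer: {snr_total_db:.1f} dB")
```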
PS: don't confuse ADC input protection with gain/attenuation used to match the signal dynamic range. Input protection is intended to protect the ADC from unexpected voltage spikes that fall outside the normal signal dynamic range. Its goal is to leave the signal untouched while its voltage stays within the expected range, and to engage only when something goes wrong. Usually this is two anti-parallel Schottky diodes attached to the ADC input for ESD protection. But such protection will not work properly if your signal voltage range overlaps the clamping range: the diodes simply start conducting, and your signal gets distorted by clipping.
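A tiny sketch of that failure mode, modeling the conducting diodes as a hard clamp; the clamp level (a typical Schottky forward drop) and the signal amplitude are assumed purely for illustration:

```python
import numpy as np

v_clamp = 0.3                                  # Schottky forward drop, V (assumed)
t = np.linspace(0, 1e-3, 1000)
signal = 0.5 * np.sin(2 * np.pi * 1e3 * t)     # 0.5 V peak: overlaps the clamp level

clipped = np.clip(signal, -v_clamp, v_clamp)   # diodes conduct, waveform is clipped
print(f"Peak in: {signal.max():.2f} V, peak out: {clipped.max():.2f} V")
```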