Indeed, much greater values can be used, but don't go too far: even leakage through PCB surface contamination could cause issues.
I have seen people intuitively assume that with a leakage resistance of, say, around 1 GOhm, a measurement impedance of, say, 10 MOhm would be fine because it is "two orders of magnitude less". But those two orders of magnitude translate into a 1% measurement error, which is probably already significant for a battery SoC measurement.
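To make the "two orders of magnitude" point concrete, here is a minimal sketch of the loading error, using the 10 MOhm / 1 GOhm figures above (both are illustrative values, not measured ones). The leakage path simply forms another divider with the measurement impedance:

```python
# Sketch: gain error caused by a leakage path loading a high-impedance divider.
# Assumed values from the discussion: 10 Mohm source impedance, 1 Gohm leakage.
R_source = 10e6   # Thevenin impedance of the measurement divider, ohms
R_leak = 1e9      # leakage resistance (PCB contamination, input leakage), ohms

# The leakage resistance and the source impedance form an unintended divider:
error = R_source / (R_source + R_leak)
print(f"{error * 100:.2f} % gain error")  # prints "0.99 % gain error"
```

So "two orders of magnitude" buys you roughly 1%, and every additional order of magnitude one more decimal place.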
(Surface leakage seems to be hard to quantify reliably.)
The same goes for the ADC input leakage current: even if it's tiny, you need to be at least three orders of magnitude away from it to keep the error below 0.1%. Of course, if the leakage is constant it can be calibrated out, but how do you know it's constant?
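As a rough sketch of why input leakage matters at these impedance levels: the leakage current flows through the divider's Thevenin impedance and shows up as an offset. The 100 nA leakage and 3.3 V full scale below are assumed example values (many MCU datasheets spec worst-case input leakage in the tens to hundreds of nA); the divider is the 2M2 + 4M7 one discussed elsewhere in this thread:

```python
# Sketch: offset error from ADC input leakage through the divider impedance.
# Assumed: 100 nA worst-case leakage, 2.2M/4.7M divider, 3.3 V full scale.
I_leak = 100e-9                      # ADC input leakage current, amps (assumed)
R_th = 1 / (1/2.2e6 + 1/4.7e6)       # divider Thevenin impedance, ~1.5 Mohm
V_fullscale = 3.3                    # ADC reference / full scale, volts (assumed)

V_offset = I_leak * R_th             # ~0.15 V of offset at the ADC pin
error = V_offset / V_fullscale
print(f"{error * 100:.1f} % of full scale")  # prints "4.5 % of full scale"
```

With these (pessimistic but plausible) numbers the error is percent-level, which is exactly why the worst-case leakage spec, not the typical one, should drive the impedance choice.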
All this being said, I'm still almost 99% sure you could just increase the divider values to solve the problem in a simpler way.
Note that while the ADC's DC leakage is small, it consumes considerable charge every time you measure. There is a cheap, passive way, and a slightly more expensive, active way to deal with this:
1) a capacitor at the ADC pin (say, 10 nF, or a 100 nF power-decoupling cap reused from the BOM),
2) an opamp driving the ADC input.
Option 1) suffers from the fact that the driving source needs to replenish the cap after each measurement, and if your resistor divider values are big, this takes time; otherwise, the results start to drift.
For example:
With an ADC sampling capacitance of 20 pF, a 10 nF capacitor would cause an error of approximately 20/10000 = 0.2 percent (the change of voltage on the 10 nF cap) during sampling. So we don't want to go much below 10 nF; below that, it starts making a difference.
With a 10n cap and the 2M2 + 4M7 divider talked about earlier, the RC time constant is 1/(1/2200000 + 1/4700000) * 10e-9, approximately 15 milliseconds. After 5 time constants, i.e., 75 ms, the capacitor is charged to 99.3% of the final value (assuming it started from zero, which is a fair assumption for handling edge cases like startup). Of course, in practice each ADC sample only depletes the capacitor by 0.2 percent, so there's less charge to replenish. In any case, this means you can only trigger the ADC every 75 ms. Put it in a free-running conversion mode at 10 ksps by accident, and you'll likely see severely drifted measurements!
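The numbers in the example above can be sketched in a few lines; all component values come from the discussion (20 pF sampling cap, 10 nF external cap, 2M2 + 4M7 divider):

```python
# Sketch: charge-sharing droop and settling time for an external cap
# at the ADC pin, fed from a 2.2M/4.7M resistive divider.
import math

C_samp = 20e-12                      # ADC sampling capacitance, farads
C_ext = 10e-9                        # external cap at the ADC pin, farads
R_th = 1 / (1/2.2e6 + 1/4.7e6)       # divider Thevenin impedance, ~1.5 Mohm

# Each sample shares charge between C_ext and C_samp, dropping the pin voltage:
droop = C_samp / (C_ext + C_samp)
print(f"per-sample droop: {droop * 100:.2f} %")        # prints "0.20 %"

# Recharging C_ext through the divider takes about 5 time constants:
tau = R_th * C_ext
print(f"tau = {tau * 1e3:.1f} ms, 5*tau = {5 * tau * 1e3:.0f} ms")  # 15.0 / 75
print(f"settled to {(1 - math.exp(-5)) * 100:.1f} % after 5 tau")   # 99.3 %
```

This is why the maximum safe trigger rate falls out of the divider impedance and the external cap, not out of the ADC's own conversion time.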
Circuit-wise, such a capacitor is by far the simplest solution, and analog filtering is often desirable anyway. This is usually the best option for battery measurement, unless you need to measure and trace voltage spikes, i.e., sample at higher rates. If that is the case, you'll need an opamp to buffer the divided signal for the ADC.