Hey,
I have had this question for a while. Can't quite figure out why 18V panel configurations are used for 12V installations.
Most silicon solar cells hit their maximum power point around 0.5V/cell (assuming ~50°C surface temperature). The typical configuration is 36 cells per panel, which gives around 21V open circuit.
The thing is, the matched (maximum-power) voltage for such panels is 18V, not 12V.
When charging lead-acid at 13.5V, or li-ion at 12.0V, roughly 25-33% of this potential power is wasted, based on the I-V curve of most cells (such as the A300 here):
https://cs.wmich.edu/~sunseeker/files/A-300%20data%20sheet.pdf

Previously I thought that solar charge controllers act as DC-DC step-down converters, but based on the ones I tested, that's not the case.
They output the same current into a 12V battery as the bare panel does.
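For what it's worth, the waste figures above can be sketched numerically. This is a rough model that assumes the panel current stays nearly flat (close to short-circuit current) below the MPP voltage, which is approximately true for silicon cells; the 5 A figure is made up for illustration, not taken from the A300 datasheet:

```python
# Rough sketch of the loss argument: a PWM-style controller pins the panel
# at the battery voltage, so current stays ~I_SC while voltage drops below
# the maximum power point. All numbers are illustrative assumptions.

I_SC = 5.0    # assumed panel current in the flat region of the I-V curve, amps
V_MPP = 18.0  # maximum-power-point voltage: 36 cells * 0.5 V/cell

def delivered_power(v_bat):
    """Power delivered when the panel is held at the battery voltage."""
    return I_SC * v_bat

def fraction_wasted(v_bat):
    """Fraction of the available MPP power that is thrown away."""
    return 1.0 - delivered_power(v_bat) / (I_SC * V_MPP)

for v in (13.5, 12.0):
    print(f"{v:>4} V battery: {delivered_power(v):.1f} W delivered, "
          f"{fraction_wasted(v):.0%} of MPP power wasted")
```

Under this flat-current assumption the waste is just 1 - V_bat/V_mpp: about 25% at 13.5V and 33% at 12.0V, which is where my numbers come from.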
So, where's the catch?

How do I get maximum output from an 18V panel when charging a 12V battery?
Should I set up an additional buffer closer to the matched voltage and then step down to 12V?
Prolly a stupid question, but still...