System voltage choice is really an optimization problem, but there is no clear sweet spot in general; the usable range is fairly wide, with multiple local optima scattered across it. Furthermore, these optima shift with the market situation, component availability, etc. It comes down to:
* What MOSFETs/IGBTs are available at the time, and at what prices (roughly Vds(max) / (Rds(on) × price) — volts blocked and conductance per dollar)
* Motor dimensions discretize the winding choices: e.g., a slot can take 10 turns or 11, but not 10.5.
* Magnet wire availability and choice: you can't have an arbitrary thickness, so the number of paralleled strands discretizes the options as well
...
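The switch-selection point can be made concrete with a tiny ranking sketch. The part names, ratings, and prices below are entirely made up for illustration, and the figure of merit (volts blocked per ohm-dollar, higher is better) is just one plausible way to formalize "Vds(max), Rds(on), and price":

```python
# Hypothetical FET shortlist -- part names and figures are invented for
# illustration, not taken from any real datasheet.
fets = {
    "FET_A": {"vds_max": 60.0,  "rds_on": 0.002, "price": 0.80},  # V, ohm, USD
    "FET_B": {"vds_max": 100.0, "rds_on": 0.004, "price": 1.20},
    "FET_C": {"vds_max": 150.0, "rds_on": 0.010, "price": 1.50},
}

def figure_of_merit(spec):
    """Volts blocked per (ohm * dollar): higher is better."""
    return spec["vds_max"] / (spec["rds_on"] * spec["price"])

# Rank the candidates; the winner suggests which system voltage the
# currently available silicon favors.
ranked = sorted(fets, key=lambda name: figure_of_merit(fets[name]), reverse=True)
for name in ranked:
    print(name, round(figure_of_merit(fets[name])))
```

Because the market refreshes constantly, re-running a shortlist like this a year later can favor a different voltage class, which is exactly why the "sweet spot" moves.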
But the sweet spot tends to be quite wide, as can be seen from the fact that 500 hp muscle EVs have been built as sub-100 V systems, while, on the other hand, small 20 kW drivetrains have been designed at voltages over 700 V.
So the actual choice is far from trivial, but on the other hand, many different choices work almost equally well; it's a small optimization.
I'm assuming a lawn mower needs around 1 kW, maybe 2 kW max. At this power level, the battery-inverter-motor system can be sanely designed anywhere between about 30 and 150 V, I'd say, with little difference in the resulting efficiency or performance.
And guess what: higher voltage means lower current and thus thinner wiring, which is lighter and therefore slightly boosts efficiency in a self-propelled lawnmower.
Assuming a total wire length (battery to inverter plus inverter to motor) of around 500 mm and 1 kW of power, the wire-mass difference between a 40 V (25 A) and an 80 V (12.5 A) design would be on the order of tens of grams.
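That tens-of-grams figure is easy to sanity-check. The sketch below sizes a copper conductor for each current at a fixed current density; the 2.5 A/mm² figure is my assumption (a conservative sizing rule), and real cables have more than one conductor, so the absolute grams shift with those choices but the trend doesn't:

```python
# Rough copper-mass comparison for the 40 V vs 80 V options above.
RHO_CU = 8960.0   # copper density, kg/m^3
J = 2.5e6         # assumed current density, A/m^2 (2.5 A/mm^2)
LENGTH = 0.5      # total wire run, m (battery -> inverter -> motor)

def wire_mass_g(current_a):
    """Mass in grams of one copper conductor sized for current_a at density J."""
    cross_section = current_a / J                      # m^2
    return cross_section * LENGTH * RHO_CU * 1000.0    # kg -> g

m40 = wire_mass_g(25.0)    # 1 kW at 40 V
m80 = wire_mass_g(12.5)    # 1 kW at 80 V
print(f"40 V: {m40:.1f} g, 80 V: {m80:.1f} g, difference: {m40 - m80:.1f} g")
# -> 40 V: 44.8 g, 80 V: 22.4 g, difference: 22.4 g
```

Halving the current halves the copper at a given current density, so doubling the voltage saves roughly 20-odd grams here — real but small, which is the whole point.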