The conductor thickness comparison was mostly an exercise to illustrate how much more copper is needed for those higher currents. It is a lot (P = I²R losses).
I can see it was an exercise, but clearly an incorrect one, which is why I corrected you. Obviously higher current requires more copper; there is no way around this, and it is not related to 1-phase vs. 3-phase. Your example supposedly showed that 3-phase requires fewer mm² per A - in other words, better utilization - but I showed that your calculation was incorrect, and now you have just moved the goalposts to match the reality that the 1-phase 75A use case actually carries more current, which is indeed true.
If you need x A, then you need x A. A single-phase system has optimal copper use, since the current in the incoming conductor and the return path is always, automatically, equal.
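To make the I²R point concrete, here is a quick sketch (run length, loss budget, and resistivity value are my own assumed numbers) of how the required cross-section scales with current in a single-phase line-plus-return run:

```python
# Sketch (assumed numbers): how required copper scales with current.
# Single-phase copper loss is P = I^2 * R over line + return,
# with R = rho * L / A per conductor.

RHO_CU = 1.68e-8  # ohm*m, resistivity of copper at ~20 C
L = 20.0          # m, assumed one-way run length

def loss_single_phase(current_a, area_mm2):
    """Total I^2*R loss in line + return conductor (watts)."""
    r_per_conductor = RHO_CU * L / (area_mm2 * 1e-6)
    return 2 * current_a**2 * r_per_conductor

def area_for_loss(current_a, max_loss_w):
    """Cross-section (mm^2) needed to keep total loss under max_loss_w."""
    # Invert max_loss = 2 * I^2 * rho * L / A  =>  A = 2 * I^2 * rho * L / max_loss
    return 2 * current_a**2 * RHO_CU * L / max_loss_w * 1e6

for i in (25, 75):
    print(i, "A ->", round(area_for_loss(i, 50.0), 2), "mm^2 for <=50 W loss")
# Tripling the current needs 9x the copper for the same loss: I^2 scaling.
```

With these assumed numbers, 25 A needs about 8.4 mm² and 75 A about 75.6 mm² for the same loss budget - the 9× factor is the point, not the absolute values.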
A perfectly balanced 3~230V system equals a 1~400V system regarding losses, AFAIK, so it does a bit better at best. But you could just as well use 400V in a single-phase system, or do what our US of A friends do and use a center-tapped transformer to power "small" trivial loads at the lower voltage and big loads at the higher one.
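The equal-copper comparison can be checked with a few lines (power level and per-conductor resistance are assumed numbers): a perfectly balanced 3~230/400V feed with a full-size but idle neutral, vs. a 1~400V feed using the same total copper in two conductors.

```python
# Sketch (assumed numbers): balanced 3~230/400V (4 conductors, idle
# neutral) vs. 1~400V (2 conductors, same total copper), same power.

from math import sqrt

P = 10_000.0  # W delivered, assumed
R = 0.05      # ohm per conductor at the 3-phase system's cross-section, assumed

# 3-phase: 4 conductors of area A; line current P / (sqrt(3) * 400);
# the neutral carries zero but still costs copper.
i3 = P / (sqrt(3) * 400)
loss3 = 3 * i3**2 * R

# 1-phase 400V: same total copper in 2 conductors -> each has area 2A,
# hence half the resistance per conductor.
i1 = P / 400
loss1 = 2 * i1**2 * (R / 2)

print(loss3, loss1)  # identical losses for identical total copper
```

Both come out at 31.25 W here, i.e. the perfectly balanced case only *matches* 1~400V; it pulls ahead only if the idle neutral is undersized or omitted.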
3-phase is excellent for large-scale distribution and motors, but what it does
not offer is good copper utilization in residential wiring.
In other words, in nominal 3*25A systems it is completely possible, and usual, that some of the wires carry 30-35A (beyond the design current) while the total is still just, say, 50A. This leads to poor copper utilization. In 1-phase house wiring, on the other hand, copper utilization is always optimal, and geez, if you don't need 75A, then don't wire for 75A.
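The unbalanced-household scenario above looks like this in numbers (the per-phase loads are hypothetical):

```python
# Sketch of the unbalanced residential case (hypothetical loads):
# a 3x25A service where one phase runs hot while the total stays modest.

phase_currents = {"L1": 33.0, "L2": 12.0, "L3": 5.0}  # A, assumed loads

total = sum(phase_currents.values())  # 50 A of actual load in total
worst = max(phase_currents.values())  # 33 A on one conductor

print(f"total load {total} A, worst phase {worst} A")
# Every phase conductor must be sized for the worst case (here already
# beyond the 25 A nominal), even though a 1-phase service carrying the
# same 50 A would load its two conductors exactly equally.
```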
Now I understand where your confusion is coming from: "3-phase distribution has good copper utilization" is a widely touted claim, because it is true. But you need to understand the scope of the claim. It applies to distribution at wide scale, where balancing the phase currents is quite feasible; this enables complete elimination of the neutral wire, or using the Earth as the neutral, or using a heavily undersized neutral wire.
But at small scale this completely turns around: due to heavily unbalanced loads and zero-sequence harmonics, the neutral wire has to be dimensioned larger than the phase wires (or all conductors are simply overdimensioned as a matter of routine).
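Both effects - unbalance and zero-sequence harmonics - show up directly in the phasor sum that is the neutral current. A small sketch (phase currents are assumed numbers):

```python
# Sketch: neutral current as the phasor sum of the three phase currents,
# and why triplen (zero-sequence) harmonics pile up in the neutral.
# All current values are assumed, for illustration only.

import cmath
from math import pi

def neutral_current(i_l1, i_l2, i_l3):
    """Fundamental-frequency neutral current magnitude (phases 120 deg apart)."""
    a = cmath.exp(2j * pi / 3)  # 120-degree rotation operator
    return abs(i_l1 + i_l2 * a + i_l3 * a**2)

print(neutral_current(20, 20, 20))  # balanced -> ~0 A
print(neutral_current(33, 12, 5))   # unbalanced -> ~25 A in the neutral

# 3rd-harmonic components are in phase across all three phases
# (3 * 120 deg = 360 deg), so they ADD arithmetically in the neutral:
i_3rd_per_phase = 5.0  # A of 3rd harmonic per phase, assumed
print(3 * i_3rd_per_phase)  # 15 A of 3rd harmonic in the neutral alone
```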
Another strong reason for 3-phase distribution is naturally 3-phase generators, plus large 3-phase motor loads in industry. But again: 1) this does not apply to households, and 2) it is slowly changing.
A third reason for the "3-phase is superior" claim is that it still lives on from the days when 3-phase was compared to a 2-phase system (not split-phase; true 2-phase with a 90deg phase shift). That discussion was relevant over 100 years ago, but back then the 3-phase system offered zero neutral conductor current in a balanced motor, while the 2-phase system put √2 times the phase current (about 41% more) in the shared neutral conductor. Tesla knew what he was doing!
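The historical comparison is again just a phasor sum (per-phase current is an assumed number):

```python
# Sketch: shared-return current in the old 2-phase (90 deg) system
# vs. balanced 3-phase, via simple phasor sums. I is assumed.

import cmath
from math import pi

I = 10.0  # A per phase, assumed

# 2-phase, 90 degrees apart, sharing one return conductor:
i_neutral_2ph = abs(I + I * cmath.exp(1j * pi / 2))
print(i_neutral_2ph)  # sqrt(2) * I, ~14.14 A in the shared return

# 3-phase, 120 degrees apart, balanced:
a = cmath.exp(2j * pi / 3)
i_neutral_3ph = abs(I + I * a + I * a**2)
print(round(i_neutral_3ph, 9))  # ~0 A: no return current at all
```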
But the Brits and Americans are not stupid in their 1-phase house wiring scheme. It is used because it is superior. Our inferior system leads to stupid compromises, like manufacturers unwilling to design 3-phase loads (they are significantly more complex) and instead designing 400V single-phase loads to be connected between two phases, which at least allows extracting 2/3 of the available current without causing massive imbalance. Examples of such systems are some Scandinavian water heaters, and most EVs, which come in Europe with 3-phase plugs but only utilize 2 of the phases, causing imbalance and arbitrarily limiting their charging power. A true 3-phase design would have been prohibitively expensive. No such issue if the house were just 1-phase to begin with.
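The 2/3 figure for the phase-to-phase compromise can be checked quickly (supply rating is an assumed 3×25A household service):

```python
# Sketch: a 400V single-phase load across two phases of an assumed
# 3x25A service - what fraction of the supply does it actually use?

from math import sqrt

I_MAX = 25.0  # A per phase conductor, assumed service rating
V_LL = 400.0  # V phase-to-phase

full_3ph_power = sqrt(3) * V_LL * I_MAX  # ~17.3 kW, fully balanced service
two_phase_load = V_LL * I_MAX            # 10 kW, load across L1-L2 at 25 A

current_utilization = (2 * I_MAX) / (3 * I_MAX)  # 2 of 3 conductors loaded
print(current_utilization)                 # 2/3 of the available current
print(two_phase_load / full_3ph_power)     # 1/sqrt(3), ~58% of the power
```

So the load uses 2/3 of the available conductor current but only about 58% of the available power - still far better balanced than dumping the whole load on a single phase.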