I'm sorry, but I think you are alone in this viewpoint.
Saying that when others, such as myself, have raised the same point only underlines the "secret math cult" arrogance. You are totally proving our point.
Besides, this thread is titled "admit your brain lock". What the fuck are you doing? We are discussing honestly the issues we have understanding this material, and you come here to ridicule us for it.
There really is a reason to use the uppercase and lowercase versions of the same letter in formulas. For example, \$v\$ can represent velocity while \$V\$ can represent volume, and both can appear in the same formula.
Nice idea, but this shows how little material you have actually worked with. The idea of fixed assignments for fewer than 100 symbols is completely dead. \$v\$ and \$V\$ can represent plenty of things besides velocity and volume, and notoriously \$E\$ can and WILL represent both energy and voltage, which often appear in the same formula. Likewise, \$W\$ can represent energy (often a change in energy; why not write \$dE\$ for consistency?), and \$U\$ and \$V\$ can represent voltage, so much for "volume". This is a total mess. Improvement starts with accepting that fact, not with expecting fixed symbols to work.
The only real way to deal with this mess is to choose the most suitable symbols for the job, and always explain the symbols used. Come on, if you have a formula with ten variables, it's ten lines of text to explain them all. There is no excuse not to do this. But no, you people come here to defend this mess and outright refuse to admit it exists. You think math and its notation are elegant and consistent; in reality, this couldn't be further from the truth.
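To make the "always explain the symbols" practice concrete, here is a sketch of what it looks like in LaTeX. The formula and symbol choices are my own toy example (deliberately using the contested letters \$W\$ and \$U\$), not from anyone's post:

```latex
% State the formula, then define every symbol on its own line.
\[
  W = q \, U
\]
where
\begin{itemize}
  \item $W$ is the electrical work done on the charge (J),
  \item $q$ is the charge being moved (C),
  \item $U$ is the voltage across the gap (V).
\end{itemize}
```

Three variables, three lines of explanation, and no reader has to guess whether \$W\$ means work or watts, or whether \$U\$ means voltage or internal energy.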
This discussion has made me more certain than ever that what we are seeing is 100% deliberate obfuscation.
I'm not suggesting we graft modern-day programming practices onto math directly, but it is quite revealing to consider how most mathematical writing would never pass any code review process.
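To illustrate the code-review point with my own toy sketch (the function names and formulas here are just examples, not anyone's real code): in a program, no reviewer would let one name stand for two different quantities. The textbook habit of writing \$E\$ for both energy and voltage looks like this when forced into reviewable code:

```python
# Toy example: textbooks write E = P * t ("energy = power * time") and
# E = I * R ("voltage = current * resistance") with the same letter E.
# In code, each quantity gets one unambiguous, unit-carrying name.

def energy_joules(power_watts: float, time_seconds: float) -> float:
    """Electrical energy delivered: the E in E = P * t."""
    return power_watts * time_seconds

def voltage_volts(current_amps: float, resistance_ohms: float) -> float:
    """Ohm's law: the *other* E in many textbooks, E = I * R."""
    return current_amps * resistance_ohms

print(energy_joules(60.0, 10.0))  # 600.0 (J)
print(voltage_volts(2.0, 5.0))    # 10.0 (V)
```

The point is not that math should be written this way, only that "one symbol, one meaning, stated explicitly" is the baseline any reviewer would demand.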