I'm afraid I (like most of us) would tend to focus on details while missing the big picture.
And precisely you focus on details while missing out on the big picture:
Uh? I didn't see this coming.
Humility is probably not one of your strong points.
I am just saying -1- do not buy x86 only because it's the most popular when you can buy something else, and -2- inform yourself before pointing the finger only at the most popular solutions!
That's a very general statement that could be made about almost anything, and it does look sad in a way. Now, I'm guessing you're addressing this to the world in general, because you can't hold that against me: I've repeatedly talked about alternatives on various topics rather than always going for what's trendy and what everybody else uses just because it's popular.
That said, what I've been trying to say here is that there is usually a reason why such decisions are taken, and you are refusing to understand those reasons. And those reasons are not simply that everyone is a stupid moron; it's a bit more complicated than that. If you had just said what you wrote right above, I don't think your thread would have triggered the reactions it did.
I've referred you (and anyone interested) to a book that discusses this in detail and that I find pretty interesting. Basically, again: innovation can't really happen in a large established corporation. Pretty much all the people who had good ideas that turned out to be great innovations made them happen after quitting their job and starting their own company, or joining a small one. That's just the way it works.
Understanding and even accepting a situation doesn't mean that you have to agree with it. But just keep in mind - as you have probably noticed over the years - that fighting a reality, especially in the business world, comes at a cost.
As to:
Second, because it is really difficult to give an estimate of the deadline both to the customers and to oneself for the calculation of the hours, and therefore we often end up "working practically for free".
Yep, that is already true for software development in general. Estimation is a very hard thing to do right - many argue that it's impossible.
I feel the pain, but what you do in such a case is: 1/ either you refuse the project altogether (again, if you can afford to), or 2/ you manage to get paid hourly (which the client may not want; if they refuse, you've dodged a bad project), or 3/ you make your best estimate and multiply it by 2 or 3. Probably 3 or 4 in your case for anything x86-related.
As a general rule, computing systems can be more or less described as a triangle (a bit like the project management triangle): hardware, low-level software, and application software.
One of the vertices always has to give in. Which one tends to rotate a bit over time, but in most cases it's the low-level software that ends up being the adjustment variable (and thus has to suffer).
So, as another example: while writing application software on Windows has always been rather easy, with a good level of backward compatibility, writing device drivers for Windows is close to a nightmare. (I think it has become a bit easier, mainly because many drivers can now be written as simple filter drivers. But I still wouldn't want to touch that ever again.)
Regarding ARM, the issues are different. Calling ARM rubbish, as magic did, was extreme, but he probably wanted to make it sound a bit trollesque (as he likes to) while mimicking your own bold assertions. The issues have more to do with fragmentation, which is precisely what you tend to get over time when you don't/can't favor compatibility. You can't have it all.
In that regard, RISC-V, with its heavily modular ISA, runs the risk of ending up even more fragmented. While I fully support RISC-V - mostly in the sense that it's an open-source ISA that allows anyone to design their own CPU, royalty-free and with no potential licensing issues ever (which opens a world of new possibilities) - I don't know how the fragmentation problem is going to get solved if it's to be used for general-purpose computing (unless it gets gobbled up by a big fish that dictates a standard all on its own, which would not be good news).