I do not believe for a second that the ARM Cortex-M processor family is being "phased out".
It's like claiming English is being phased out because more and more people speak X.
Just look at the existing 8-bit processor families, which are still used in new designs.
And they will be used in new designs for as long as it makes commercial sense.
For obvious reasons, the 32-bit processor niche is much, much larger.
It is, however, true that with the GNU toolchain at least, it is relatively painless to move between hardware implementations; and in that sense, those looking at ARM chips for new designs don't care that much about backwards compatibility.
Nevertheless, the existing codebase for ARMv6-M through ARMv8.1-M is so large that existing product lines are easier to upgrade with somewhat compatible newer processors. And this is a very big reason why the ARM Cortex-M 32-bit family, at least, will not be going anywhere soon.
If we extend this to 32-bit processors in general, we'd need to look at the business cases where a 64-bit processor would be a hindrance compared to a 32-bit one, if for nothing else, then because of the added code size, or the need to port an existing codebase to 64-bit compatibility (a transition that has traditionally revealed quite a lot of idiotic assumptions programmers make, which then have to be fixed).
The vast majority of processors are in embedded devices. Cellphones are just one class, albeit a big one, and because of their multipurpose use they probably do benefit from 64-bit support. But what about the display controllers ("graphics cards") in them? The biggest scalars those use are 32-bit, so 32-bit plus SIMD extensions on vectors makes the most sense. And what about modems, routers, TVs, storage devices, and so on?
A vast majority of embedded devices at this point gains basically nothing from having 64-bit support. Typical routers, modems, TVs, etc. that you do not notice have 32–256 megabytes of memory and gain basically nothing from having more; they aren't even close to the 4 GiB limit of a 32-bit address space. (Again, multifunction devices do differ. And perhaps programmers are getting worse year after year, so that it makes more economic sense to buy more powerful hardware, so that even bloated, crappy code works, somewhat, on it. After all, Microsoft has already managed to convince at least one generation of humans that devices are supposed to crash every now and then for no discernible reason.)
One should examine what kind of processors human interface peripherals – mice, keyboards, etc. – still use when considering the above question. They need no more than a few hundred bytes of memory, plus full native USB support, so the vast majority of them run on some 8-bit processor. Why would they move to 32-bit? The same goes for currently 32-bit appliances that are not hitting any limitations for the kinds of workloads customers have.
Put simply, I think the claims in the initial post are complete bullshit from someone who can only perceive a small slice of the world, believes that is everything there is, and thinks it is his Dog-given mission to convince everybody else. Or at least get them to part with their money. The only reason the video exists is to gain views by sowing discord (like this thread here), just like the news no longer reports actual events or facts, but spins everything in an effort to rile people up.