No, it is not useless. You really need it:
- To learn how microcontrollers (and real computers, but to a lesser extent) really work and what they are actually doing
- To learn why you don't want to use it, unless you really have to
- To write compilers and OS kernels
- When you need sub-microsecond jitter or a response time of a few microseconds in your interrupts (but more often than not, you are doing it wrong)
- When it makes economic sense to use a chip that is 10c cheaper and the application is very simple (but you still might be doing it wrong, although not as often as in the previous case)
- When you are forced to update a very old system (but then, you are doing it wrong) or one that was done wrong to begin with
- In hobbyist cases, when you can't afford to do it right
And yes, it is mostly useless in the real world, where productivity and reliability count.
[ I build real-time embedded systems for a living. As a rough estimate, I would say the ratio of assembler to high-level language code is about 1 to 100,000. But I don't write OS kernels; I just use them. At design time, the estimated quantities are in the thousands (not in the hundreds of thousands, which would change the equation!). So a chip costing 1€ more is fine if it saves a month of engineering. And a month is not much at all amortized over design, development, testing and maintenance. And 1-2€ buys a lot today: that's the difference between a tiny 4- or 8-bit micro and a powerful 32-bit one. It also pays for a dedicated peripheral controller or FPGA. YMMV, but assembler is hardly used in real life; in my work, practically never.
When I hire microcontroller engineers, I would not hire somebody who doesn't know assembler, nor would I hire somebody who prefers to use assembler. ]
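The cost trade-off above can be sketched with a quick back-of-the-envelope calculation. The specific numbers here (5,000 units, 1€ saved per unit, 10,000€ per engineer-month) are illustrative assumptions, not figures from the answer:

```python
# Back-of-the-envelope: does a 1 EUR cheaper chip justify extra engineering time?
units = 5_000                      # assumed production run ("thousands")
chip_saving_per_unit = 1.0         # EUR saved per board with the cheaper chip
engineering_month_cost = 10_000.0  # assumed fully loaded cost of one engineer-month

total_chip_savings = units * chip_saving_per_unit
print(total_chip_savings)          # 5000.0 EUR saved over the whole run

# Units at which the cheaper chip pays for one extra engineering month:
break_even_units = engineering_month_cost / chip_saving_per_unit
print(break_even_units)            # 10000.0 units
```

At these assumed volumes, one extra month of engineering (10,000€) outweighs the 5,000€ saved on parts, so the dearer, easier-to-program chip wins; only above the break-even volume does squeezing the cheaper chip start to pay.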