IMHO it is still better to use industry standards to teach principles than to rely on obscure languages and hardware. There is a general consensus that there is a disconnect between what students learn in school and what they encounter at their first job. There is also a difference between the various degree levels: a BSc graduate will be expected to work hands-on with hardware, while an MSc graduate is likely to land a job at a more theoretical level.
Yes and no. I agree with you in theory, but in practice it is just not possible in many cases. There are both pedagogical reasons for this (teaching concepts, not tools) and purely practical ones.
E.g. I am a software engineer by trade who has taught a lot of programming and computer graphics courses. If I had to teach programming using "industry standards", what should I pick? Should I focus on JavaScript because web programming is *the thing* today? Which one? The classic one? ES6? Which frameworks? React? Vue? Angular? Whatever-the-fad-of-the-day-is? Which versions? (They tend to become obsolete 2 months after release ...) This is an extreme example, but it isn't that far off in other fields either. And then someone will cry about why I am teaching JavaScript and not TypeScript, why I am using Node.js instead of Rails (or vice versa), etc.
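To make the churn concrete, here is a toy illustration (the exercise and variable names are made up for the example): the same trivial task written in pre-ES6 JavaScript and in ES6+ style. Both run fine today, yet course handouts written for one style look dated next to the other within a year or two.

    // The same "sum an array" exercise in two generations of JavaScript.
    var numbers = [1, 2, 3, 4];

    // Pre-ES6, the "classic" style still found in older course material:
    var total = 0;
    for (var i = 0; i < numbers.length; i++) {
        total += numbers[i];
    }

    // ES6+, using const, an arrow function and Array.prototype.reduce:
    const total2 = numbers.reduce((acc, n) => acc + n, 0);

    console.log(total, total2); // 10 10

Neither version is wrong; the point is that a curriculum pinned to one of them starts to look stale almost immediately.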
Something changing that fast is not something you can build a curriculum on, especially when curricula are updated and approved in 5-10 year cycles. The teacher is not free to teach whatever they want; they have a curriculum to follow, with more or less implementation freedom depending on the course, department and university.
Also, in order to teach a certain technology you need people who actually know it. University profs rarely follow the latest tech for teaching purposes because teaching is almost never rewarded: their tenure and funding hang entirely on research output (papers, patents, etc.), while teaching gets little to no credit. You can be the best teacher in the world, but if you don't produce research, you get fired or your funding gets cut. Under such conditions, teaching is often seen as a necessary evil the profs don't really get paid for and don't get resources for, so only the minimum of time and money is spent on it. Nobody is going to train themselves on e.g. a new CPU architecture only for the sake of teaching it.
Apart from the teachers, you then need all the supporting materials for the new tech (books, software, lab equipment, etc.). That all costs non-trivial amounts of money, and a school will not be eager to replace e.g. a Z80-based lab course with an ARM-based one. It is not only about swapping the devboards: all the course material has to be rewritten, lab staff retrained, new hardware bought, new software acquired, etc. This is a huge and expensive job, even if you use vendor-provided course material (common in the US, much less so in Europe, where requiring obscenely overpriced textbooks & tools is just not something many teachers want to do).
This is the main reason why universities are still teaching with 20+ year old technologies. They work, all the course materials are there, tested and proven, and they let students learn the concepts (also, will you really learn more about how a CPU works from the newest whizbang ARM chip that you cannot even probe without special tools, or from an old Z80 in a DIP socket where you can literally stick LEDs on the bus?). The money can then be spent on other, more useful or urgent things.
I am going to repeat myself, but university is not there to teach tools and technologies. That's the job of professional training or a vocational school. Uni is there to teach you the concepts; any tools you pick up in the courses are only a bonus. Students are expected to learn the tools they need (editors, IDEs, compilers, stuff like Matlab, ...) on their own.
And no matter what you do, no university will ever be able to teach the latest and greatest tools, for the reasons outlined above. This is a common misconception many students have too: "Why do I have to do X using Y and not Z that everyone in the industry uses?!" Well, because there are reasons for it that you may not see or understand (e.g. the pedagogical ones). Or simply because the Uni doesn't have the resources for it - it is not just about acquiring the tool; the most significant costs are the overhead around it.
So bear this in mind before bashing someone's course only because they are using old technology to teach it.