Simplifying the interdependencies is always important, but I'm not sure I'd put it in terms of "... same level of abstraction".
What I mean by that is that I want a low-level imperative language with minimal abstractions to implement low-level libraries, number crunching and data processing libraries, and system services. For graphical UIs, I want an event-driven language with high-level abstractions (but a good interface to low-level machine code, preferably with minimal added machine code).
I do not want a single language that can do both, because that means others modifying the code or writing more of it have a high chance of using the wrong paradigm. (And that is assuming
I get it right myself, which is definitely not guaranteed!)
A classic example is distributed computing, where too many libraries try to pretend that https://en.wikipedia.org/wiki/Fallacies_of_distributed_computing don't exist. Many times low-level failures can only be correctly dealt with at the top level.
Ah yes, now I see your point. I definitely agree, because one of my pet peeves is how most current distributed molecular dynamics simulators compute, then communicate, wasting a lot of potential CPU time, rather than communicating while computing. Their designers simply did not want the added complexity of thinking about how to do it right: spatially subdivide the simulated volume, and compute the subregions in an order that allows the shared regions to be communicated to neighbouring nodes while the non-shared regions are still being computed.
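To make the overlap concrete, here is a minimal sketch of the compute-while-communicating pattern, written with mpi4py and NumPy purely for illustration; the one-dimensional neighbour layout, the array shapes, and the trivial "update" are my assumptions, standing in for a real simulator's decomposition and force computation.

    # Sketch of overlapping halo communication with interior computation.
    # The 1-D neighbour layout and the trivial update are illustrative only.
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()
    left, right = (rank - 1) % size, (rank + 1) % size

    # Local subdomain: interior rows plus one shared (halo) row per neighbour.
    local = np.random.rand(1000, 1000)
    send_left, send_right = local[0].copy(), local[-1].copy()
    recv_left, recv_right = np.empty_like(send_left), np.empty_like(send_right)

    # 1. Start non-blocking transfers of the shared regions.
    reqs = [comm.Isend(send_left,  dest=left,    tag=0),
            comm.Isend(send_right, dest=right,   tag=1),
            comm.Irecv(recv_left,  source=left,  tag=1),
            comm.Irecv(recv_right, source=right, tag=0)]

    # 2. Compute the non-shared interior while the transfers are in flight.
    local[1:-1] *= 0.5   # stand-in for the real per-particle computation

    # 3. Only now wait for the halos, then finish the boundary rows.
    MPI.Request.Waitall(reqs)
    local[0]  += 0.5 * recv_left
    local[-1] += 0.5 * recv_right

The point is simply that step 2 keeps the CPU busy during the communication that the compute-then-communicate style would spend waiting.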
That is a serious and important impedance mismatch, but I don't agree with your conclusion. That impedance mismatch is largely a culture clash, with the technology clash being the less important aspect.
Perhaps; I am often wrong. In that case, however, I don't see how a cultural change will remove the added workload of maintaining the C-Rust interfaces that Rust requires. My opinion is also coloured by observing how many abstractions and changes to Rust itself are being discussed on LKML just to achieve a usable C-Rust interface.
The counter example to your conclusion will come when Rust becomes sufficiently effective to implement a kernel.
No, because you can write a kernel in any sufficiently low-level programming language. The counterexample will be when you have a functional kernel written in one of these new languages whose core functionality is easier to manage than the Linux kernel's (considering the modification rate in the Linux kernel, i.e. refactoring and changes, not just bugfixes and maintenance against bitrot), and preferably with a lower bug density.
That, I believe, is the true trial by fire for a low-level language. For a kernel, it will also need high enough performance, which in practice means low-cost kernel calls and high-bandwidth, low-latency userspace-to-kernel data transfers, regardless of kernel design (monolithic or microkernel); but the main weight, in my opinion, should be on how well human developers can use the language.
(I do not believe in using the average human developer as the metric, though. Any tool can be misused; see real-world issues with firearms. Rather, I want to focus on the best existing examples, the best open source code implemented with finite resources, to evaluate the language used.)
Unfortunately, in the 1990s C also fell into the trap of trying to be both a low-level and an application-level language. Either choice would have been OK, but the result is a mess.
Yep. I can do a graphical UI in C using Gtk+, and it isn't horrible, but one has to be quite careful to avoid bugs and issues.
When combining a Gtk+ UI and low-level data processing or computing, I personally need to "switch" between paradigms, or complexity levels, or whatever you want to call them, and treat the two as if I were using different programming languages: I need to think in different patterns. When you add the communication needed between the two, it just isn't worth it for anything except basic tools and examples (of what is possible, to motivate learners).
I like languages whose white paper states "X and Y have previously been shown to work well and work together, but Z has been omitted because of the problems it causes". In other words, simple modular concepts and abstractions that play well together.
Yep, exactly: interoperability, not encompassing frameworks.
I do not particularly like Python either. I use it because it interfaces to low-level code so easily, and allows one to create event-driven graphical user interfaces at the level of abstraction humans actually think at. (My examples of Python + Qt5, using Qt Designer-edited XML .ui files with events mapped to systematically named methods in your main window class implementation, are exactly that: you give a widget a name, look up the name of the signal/event it generates in the documentation, and that tells you the name of the method you need to implement. It also allows editing the UI by modifying the XML, even choosing between multiple UIs at run time, and changing the UI behaviour by modifying the interpreted, human-readable code.)
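To show how mechanical that naming convention is, here is a minimal PyQt5 sketch; the file name "main.ui" and the widget name "quitButton" are hypothetical, and the automatic wiring comes from Qt's connectSlotsByName, which connects methods named on_<objectName>_<signalName>.

    # Minimal sketch of the Qt Designer workflow described above (PyQt5).
    # "main.ui" and the widget name "quitButton" are hypothetical examples.
    import sys
    from PyQt5 import QtWidgets, uic
    from PyQt5.QtCore import pyqtSlot

    class MainWindow(QtWidgets.QMainWindow):
        def __init__(self):
            super().__init__()
            # Load the Designer-edited XML; this also auto-connects signals
            # to methods named on_<objectName>_<signalName>.
            uic.loadUi("main.ui", self)

        @pyqtSlot()
        def on_quitButton_clicked(self):
            # Widget name + documented signal name dictate this method's name.
            self.close()

    if __name__ == "__main__":
        app = QtWidgets.QApplication(sys.argv)
        window = MainWindow()
        window.show()
        sys.exit(app.exec_())

Swapping in a different .ui file at run time changes the layout without touching this code, as long as the widget names stay the same.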
I do believe it was a masterstroke by Guido van Rossum to ensure C interoperability. Without that, I do not believe Python would be as popular as it is now. Interoperability, with any shimming implemented on the high-level language side, is possibly Python's strongest point.
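As an example of what "shimming on the high-level side" can look like, here is a minimal ctypes sketch; the library name "libcrunch.so" and the C function sum_doubles are made up for illustration.

    # Minimal sketch of a Python-side shim over a C library, using ctypes.
    # "libcrunch.so" and "sum_doubles" are hypothetical.
    # Assumed C prototype: double sum_doubles(const double *data, size_t n);
    import ctypes
    import numpy as np

    lib = ctypes.CDLL("./libcrunch.so")
    lib.sum_doubles.argtypes = [ctypes.POINTER(ctypes.c_double), ctypes.c_size_t]
    lib.sum_doubles.restype = ctypes.c_double

    def sum_doubles(array):
        """Python-side shim: validate, convert, and forward to the C routine."""
        data = np.ascontiguousarray(array, dtype=np.float64)
        ptr = data.ctypes.data_as(ctypes.POINTER(ctypes.c_double))
        return lib.sum_doubles(ptr, data.size)

All the type conversion and validation lives on the Python side; the C side stays a plain, reusable function.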
Every {technical, legal, corporate, political} empire starts small, then accretes crap, and that forms the basis for their demise. They rot from within.
Competition that drives crap removal is the only cure I know.
I do not believe in revolutions, because they just tend to turn the pyramid upside down without otherwise changing its composition; very soon it is indistinguishable from the previous one. Same with enemies, as we see in the world right now: all countries seem to be striving towards 1984-style dystopian eusocial systems, pointing out the flaws in others while blithely ignoring the same ones in themselves.
Gradual changes seem to work much better.
It is also the reason why I don't dabble in creating my own "better than C" low-level programming language. C has a track record and untold hours of development by brains better than mine. What I do as a hobby is investigate the freestanding environment: replacing the (optional!) C Standard Library with something better, because that remains compatible and interoperates with everything fluently. It is by no means perfect, and I do wish I could change specific things even in C (namely, swap the roles of pointers and arrays, allowing compile-time bounds checks across function calls), but it is better than any "revolution" I could instigate (any programming language I could design).
None of this is to say I won't use Rust myself! I wrote all this because I wanted to point out how silly the idea of automatically converting code from one paradigm and language to another is, and how silly, in light of history and all we know, the idea of Rust becoming the new C is. Rust should be Rust, and evolve to fit whatever niche it finds, and not try to be everything for everyone, because that has been tried several times and has always failed, C included!
What that niche will be, I do not know, but seeing how I happily use even PHP in specific circumstances (web hotels with limited server-side language support), I fully expect to be happily using Rust at some point, too. (Thus far, I've been looking at how its developers approach problem-solving, and I'm not sure I like it yet: it seems too abstract for my needs at this point, too focused on achieving their goals, and not focused enough on the actual results and the behaviour of the result. Like academics discussing mathematics from the point of view of proofs, when all I'm interested in is using it as a tool to solve problems. It is my impression only, though.)