You're right there, I'm certainly not even a programmer, let alone a software developer.

Though, the SW dev you are describing sounds more like a SW researcher, from academia rather than from a SW company. In SW production, software development is prosaic execution rather than philosophical thinking.
The fun philosophical realization is that everything around us computes. Even a pebble sitting still computes by its mere existence - by computing, "thinking" the laws of physics that made the pebble possible and that keep it sitting there the way it does.
Absolutely everything computes something, in its own way, it's just that we usually don't take advantage (in an explicit manner) of the computing properties of the world. We don't usually harvest the computation in a numerical form, but we indirectly use/take advantage of those pebble-specific computations as properties of the given object.
For example, we harvest the computation result of a water molecule by simply drinking the water, instead of laying out in a table all those numbers computed by that water molecule. The slide rule is a good example where we explicitly harvest the computation power in the form of numbers, and not as the object properties of the rule (though one may improperly use a slide rule as a pointing stick instead of a multiplier).
In my understanding, the need for numbers and computation (in a mathematical way) came first from the need to keep an inventory (to count the sheep), then from the need to simulate "something" in order to predict the future evolution/behavior of that given something. And in the last century computation turned into a goal in itself, and now we have armies of computer science engineers, and thinking trends like the Unix philosophy. (Maybe the Unix philosophy should explicitly include the rule saying "Unix will not try to stop you from doing wrong things, because that would also stop you from doing smart things".)
We are now at the point where we start realizing the importance of yet another thing coming after computation, the so-called big data. Huge piles of data, which are nothing but stored computation results. And suddenly we are in the ML (Machine Learning) and AI (Artificial Intelligence) realm, where a trained NN (Neural Network) computes in a very different way than a slide rule or a digital computer.
Very different in the sense that there is a new factor coming into play, that new factor being the training data, more precisely the context (the world) from which the training data was collected. If we look at a computer as a "number crunching" machine, then a trained NN will also crunch numbers, but the computation laws of a NN are not coming from math, they are coming from the training data. The numbers in the training data were also capturing the behavior of an entire world, the world from which the training data was harvested.
It's interesting to compare how a NN based computation differs from an ALU (Arithmetic Logic Unit) based computation (a toy sketch contrasting the two follows right after this list):

ALU based computation:
- requires an algorithm for the given computation/problem
- the programmer must completely describe the algorithm
- produces exact results, relevant for each and every input data

NN based computation:
- requires a pile of data harvested from a situation/world similar to the problem to solve, and the data doesn't have to be a complete description of that world
- the programmer must describe only the goal function, not the algorithm of how to achieve that goal
- produces good enough but inexact results, relevant only from a statistical standpoint over all the possible input data
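To make the contrast concrete, here is a toy sketch (my own made-up example, plain NumPy, with a single linear "neuron" standing in for a real NN): the ALU-style path encodes the exact Fahrenheit-to-Celsius formula, while the NN-style path only gets noisy example pairs plus a goal function (squared error) to minimize, and never sees the formula itself.

[code]
import numpy as np

# ALU style: the programmer fully describes the algorithm; exact for every input.
def alu_style(f):
    return (f - 32.0) * 5.0 / 9.0

# "Pile of data harvested from the world": noisy (F, C) measurement pairs.
rng = np.random.default_rng(0)
f_train = rng.uniform(-40.0, 120.0, size=200)
c_train = alu_style(f_train) + rng.normal(0.0, 0.5, size=200)  # imperfect sensor

# NN style: one neuron c ~ w*x + b, with x = F/100 only to help convergence.
# Only the goal (mean squared error) is written down, never the conversion rule.
x = f_train / 100.0
w, b = 0.0, 0.0
learning_rate = 0.1
for _ in range(10_000):
    err = w * x + b - c_train              # current prediction errors
    w -= learning_rate * np.mean(err * x)  # gradient step on the goal w.r.t. w
    b -= learning_rate * np.mean(err)      # gradient step on the goal w.r.t. b

print("ALU style, 100 F ->", alu_style(100.0))         # exactly 37.777... C
print("NN  style, 100 F ->", w * (100.0 / 100.0) + b)  # close, but not exact
[/code]

The learned answer is only trustworthy in the statistical sense: inside the range covered by the training pile it is good enough, but the further the input drifts away from that range, the more the small errors baked into w and b show up, while the ALU formula stays exact everywhere.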
From a bird's eye view, we can as well conclude that the ALU type of computation is based on mathematically exact truths, like true/false, while the NN type of computation is based on statistical and contextual truths, like good/bad.
(In fact there is no such thing as good or bad by itself. Good and bad make sense only in the context of a given goal. The same thing can be either good or bad, depending on the given goal. Good is whatever suits that goal, bad is whatever stands against the goal. Sure, one can say that setting a goal on which we define good/bad is similar to setting the math axioms on which we define what is true/false, but the fuzziness difference between good/bad and true/false still remains.)
For a NN, the rules (axioms) are extracted from the training pile of data, and thus whatever situation happens more often in the training data will become the truth for the trained NN. A corollary of this is that, for a NN, the definition of truth can be changed by simply cherry-picking the training data. Whoever controls the data can create a totally new/different reality for a NN.
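A toy illustration of that corollary (my own contrived example; not a real NN, just the barest possible statistical learner): the same question, "learned" from two piles of data describing the same world, comes out with two different truths purely because of which samples were allowed into the pile.

[code]
# The simplest possible "learner": decide whether a claim is true by how often
# the training pile says so. Both piles come from the same world; only the
# sample selection differs.
honest_pile        = [True] * 500 + [False] * 500  # collected without bias
cherry_picked_pile = [True] * 900 + [False] * 100  # contradicting samples dropped

def learned_truth(pile):
    # "Training" reduced to its bare statistical bones: a majority vote.
    return sum(pile) / len(pile) > 0.5

print(learned_truth(honest_pile))         # False: the evidence is split 50/50
print(learned_truth(cherry_picked_pile))  # True: repetition manufactured a truth
[/code]

Nothing about the world changed between the two runs, only the curation of the data.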
The funny thing is, we humans are the NN kind of computing machines, and thus we have the saying "A lie repeated a thousand times becomes truth" to remind us of all the above in just one line.
Sorry for the long rambling slip, it's a "rainy Sunday afternoon"™ outside.

Pic embedded from: http://readingjimwilliams.blogspot.com/2011/09/book-1-chapter-7.html

\[ \star \ \star \ \star \]
Another interesting book in my collection of "many a quaint and curious volume of forgotten lore" from before that:
R M Howe: Design Fundamentals of Analog Computer Components, D Van Nostrand Co, 1961
Couldn't find that one on https://www.pdfdrive.com/ but found a few other Analog Computation books from around the same decade. Some are very well structured, and it was very entertaining to browse them.

\[ \star \ \star \ \star \]
[quote][analog computing] is one of my favorite topics![/quote]
Totally believe you about that.

pic stolen from https://www.eevblog.com/forum/testgear/what-did-you-do-with-or-to-your-oscilloscope-today/msg3030544/#msg3030544

It's funny how the "analog electronics" term we use nowadays was coined from "analog computation", where the "analog" word was denoting the similarity between a physical dynamic system and its corresponding differential equation(s) describing the given dynamic system. It just happened that "the analog" was also producing continuous and smooth signals. Or at least that's how I remember reading somewhere.
[quote user=https://www.vocabulary.com/dictionary/analogue]The word analogue (also spelled analog) comes from the Greek ana, meaning "up to," and logos, meaning, among other things, "ratio" and "proportion." In 1946, it entered computer language as an adjective to describe a type of signal that is continuous in amplitude.[/quote]
Analog circuits could as well have been called "similar circuits", and the term "Analog Engineer" in EE would have become "Similar Engineer".
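To make the "similar" part concrete, the classic textbook correspondence (my example, not something from the quoted page) is that a mass-spring-damper and a series RLC circuit obey the same differential equation:
\[ m\,\ddot{x} + c\,\dot{x} + k\,x = F(t) \quad\longleftrightarrow\quad L\,\ddot{q} + R\,\dot{q} + \tfrac{1}{C}\,q = V(t) \]
Match \( L \leftrightarrow m \), \( R \leftrightarrow c \), \( 1/C \leftrightarrow k \), \( V(t) \leftrightarrow F(t) \), and the charge \( q(t) \) (i.e. the capacitor voltage you probe with a scope) traces out the displacement \( x(t) \), up to the scale factors you dialed in. The circuit is literally an analogue of the mechanical system, and letting it run "computes" the motion.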
\[ \star \ \star \ \star \]
Just imagine the dialog:
- I'm a mechanical engineer working for Ford, and you?
- I'm a similar engineer working for Similar Devices.
