Certainly he does! He was a smart guy. He states the reason why earlier:

Quote: My second remark is that our intellectual powers are rather geared to master static relations and that our powers to visualize processes evolving in time are relatively poorly developed.
Iteration in a program requires the programmer to think explicitly about time: THIS happens, and then THIS.
Recursion expresses a static relation: if THIS is correct, then THIS is also correct. e.g. if fib(9) and fib(8) can be somehow relied upon (we don't need to care how) to produce the correct results, then fib(9) + fib(8) is certainly the correct result for fib(10). No notion of time is required, only correctness.
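A minimal C sketch of that static relation (my example, not from the post): the body only asserts that fib(n) is correct provided fib(n-1) and fib(n-2) are, with no ordering of events in time:

Code: [Select]
/* fib(n) is correct if fib(n-1) and fib(n-2) are: a static relation.
   (Assumes small, non-negative n; exponential time, illustration only.) */
unsigned long fib(unsigned n)
{
    if (n < 2) return n;              /* base cases, trivially correct     */
    return fib(n - 1) + fib(n - 2);   /* correct, assuming both calls are  */
}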
If this premise were true, HDLs (e.g. VHDL, Verilog) would be much easier to master than C. In reality, the exact opposite happens. People grasp time relationships and learn programming in C rather quickly, but the HDL learning curve is usually much steeper, and people find HDL concepts more confusing than C.
Quote: This is tied in again with whether you need the notion of time or not.

That would neatly explain the way that embedded programmers tend to disagree with algorithm designers.
Arguably, real-time and interactive programming (both relatively rare at the time of The Letter) always include a notion of time...
I'm not talking about ignoring execution time or efficiency.
Quote: I'm not talking about ignoring execution time or efficiency.

I wasn't, either. Although at some point that comes into play, of course. I mostly meant, oh, "process oriented" vs "data oriented", or something like that. I wasn't even necessarily including "fast" under "real time" (although perhaps understanding the difference is an example of what I'm talking about.)
I'm a professional compiler guy. What you say is ABSOLUTELY untrue of any modern production compiler. In particular it is untrue of gcc or llvm.

If you write this...

Code: [Select]
/* sum the ints starting at p until a zero terminator is found */
int sum(int *p)
{
    int i, t = 0;
    while ((i = *p++)) t += i;
    return t;
}
Quote: I'm a professional compiler guy. What you say is ABSOLUTELY untrue of any modern production compiler. ...
I did not specifically mention a "while" loop - try it on "do-while".

And I would never write such a pointless function. Building a test case around it with a specific compiler optimization parameter - all that is pretty much useless. What usually limits a loop is a counter, in the simplest form, or a complex boolean expression. Try a better example, and the results and conclusions can be rather different...
I hope you know that any "while" loop can be turned into a "do-while" in assembler quite trivially. Then you will have no unconditional jumps, and slightly more compact and faster code. There is not much point in an unconditional jump followed immediately by a conditional one...
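For illustration, here is that rotation applied by hand to the sum() function above (a sketch of the transformation being described, not actual compiler output):

Code: [Select]
/* Top-tested loop: naive codegen branches back to the test (or over
   the body) on every iteration. */
int sum_while(int *p)
{
    int i, t = 0;
    while ((i = *p++)) t += i;
    return t;
}

/* Rotated form: one guard test up front, then a bottom-tested
   do-while whose only branch is the conditional one at the end. */
int sum_rotated(int *p)
{
    int t = 0;
    int i = *p++;
    if (i) {
        do {
            t += i;
            i = *p++;
        } while (i);
    }
    return t;
}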
I suppose the same company allowed labels, goto, endless loops, and multiple continues and breaks? That certainly breaks the fundamental rules of structured programming, which is a paradox. I would have left that company immediately.
Who made these "rules"? Even MISRA-2012 has relaxed the rule banning gotos, after the ban was found to INCREASE the error rate. The goto rule comes from a horribly misunderstood paper by Dijkstra. Dijkstra was writing about people creating spaghetti code by misunderstanding how structured languages were supposed to work and simply using gotos for all their control flow. Somehow this got turned into the rule "you can't use gotos... gotos are bad." It's a STUPID rule and has caused countless extra lines of code, filled with countless extra bits of logic and countless more bugs. More than just being a stupid rule, I'll go one step further and say that I believe the rule is simply wrong, because it actively leads to code which is more complicated, with more potential for bugs, more difficult to understand, and more difficult to test (higher complexity).
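A common counterexample (my sketch, with hypothetical file and buffer names, not from the MISRA text): centralized error cleanup in C, where goto gives one exit path instead of deeply nested conditionals:

Code: [Select]
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical init routine: every failure funnels through a single
   cleanup path instead of nesting if/else three levels deep. */
int init(void)
{
    FILE *f = NULL;
    char *buf = NULL;
    int ret = -1;

    f = fopen("config.txt", "r");
    if (!f) goto out;

    buf = malloc(4096);
    if (!buf) goto out_close;

    if (fread(buf, 1, 4096, f) == 0) goto out_free;

    ret = 0;            /* success; fall through and release resources */

out_free:
    free(buf);
out_close:
    fclose(f);
out:
    return ret;
}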
But is that true for random people from the general (intelligent) population who have never been exposed to programming before? Or is it only true for people who have already passed through the filter of "CS101" and learned C/Java/Python?
Don't forget that beginner programming courses all have HUGE dropout rates, despite a lot of effort to understand and change that. Reasoning about assignment and loops is HARD at first, even for people who eventually get it. And many never do.
I don't know about dropout rate statistics, but if you're right, then perhaps programming courses for beginners do not try to teach the basics before they attempt to teach programming. If someone knows how computer memory works, then understanding variables and assignment won't be hard. If you don't explain the concept of memory first, then teaching programming will be hard.
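A tiny C illustration of that point (my example of the kind of thing such a course might show first): a variable is just a named storage location, and assignment writes into it:

Code: [Select]
#include <stdio.h>

int main(void)
{
    int x = 5;        /* reserve a storage location and put 5 in it */
    int *p = &x;      /* p holds the address of that location       */

    *p = 7;           /* write through the address...               */
    printf("x = %d, stored at %p\n", x, (void *)&x);  /* ...x is 7  */
    return 0;
}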
Quote: As we know now ... programming is best approached not as highly abstracted science, but rather as a down-to-the-earth trade, where languages such as C are the most practical.
Thus there cannot be any theoretical rules which would mandate the use of recursion or forbid it. Like any other tool, recursion gets used where it is called for.
People thought that they would be able to create abstract languages (not unlike Algol-68), which would make programming more organized through the application of higher abstractions. I have a feeling that the author of the article had a similar vision.

As we know now, 50 years later, this vision happened to be incorrect, and programming is best approached not as highly abstracted science, but rather as a down-to-the-earth trade, where languages such as C are the most practical.
Quote: I don't know about dropout rate statistics, but if you're right, then perhaps programming courses for beginners do not try to teach the basics before they attempt to teach programming. ...
I agree. I believe everyone wanting to learn imperative programming (C, Pascal, Java/C#, Python, ...) should start with assembly level programming. Not on something torturous like PIC or 6502 or even x86. Something where you can document the entire instruction set and programmer's model on two sides of an A4 sheet (which admittedly you can with many 8-bit CPUs) AND where you don't have to jump through hoops to manipulate decent-sized integers and pointers and function call/return and a stack and struct members etc.
Quote: I believe everyone wanting to learn imperative programming (C, Pascal, Java/C#, Python, ...) should start with assembly level programming. ...
What you're looking for is CESIL - Computer Education in Schools Instruction Language. It's a pseudo-assembler designed by a couple of chaps at ICL back in the mid-seventies as part of ICL's planned schools computer education course (well before it was fashionable to teach computing in schools).
While something like that might have value for very young students, or for the first one or two lessons, it is far too limited to be useful:
- no subroutines
- no arrays or structs
- no conception of memory as storage locations with addresses
It's pretty much BASIC with worse syntax and all the useful parts taken out.
The worst sin of all -- it doesn't even give you the tools required to build the useful and missing parts yourself!
Why would anyone want to change that bootloader - what could it do differently, for better or worse, than the one it comes with?
Quote: Why would anyone want to change that bootloader - what could it do differently, for better or worse, than the one it comes with?

Meh. If you're going to teach assembly language, you ought to at least teach (a subset of) a REAL assembly language.
One of the tenets of the RISC movement might be stated: "it doesn't make sense to design a processor with an 'elegant' assembler/machine language; you should design the processor to execute code that compilers can emit easily." So modern CPUs tend to be a bit ugly.
The VAX, for example, could do

Code: [Select]
a[x] = b[y] + c[z]

in a single instruction, and also added single instructions to do all the work of function call or return -- saving registers, adjusting stack frames etc. x86 got some of the same philosophy too. The RISC answer was: "break a[x] = b[y] + c[z] into four simple instructions (or even seven or ten) and it runs just as fast or faster -- EVEN ON YOUR VAX." If you want elegance, in the absence of the lovely PDP11 (sigh), I think I'd recommend the MSP430. ARM or AVR wouldn't be awful.
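Spelled out in C, the four simple steps look like this (my sketch; on a load/store machine each line maps to roughly one instruction, assuming the array bases and indices are already in registers and the elements are int-sized):

Code: [Select]
int t1 = b[y];      /* load b[y]       */
int t2 = c[z];      /* load c[z]       */
int t3 = t1 + t2;   /* add             */
a[x] = t3;          /* store into a[x] */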
PDP11 doesn't have enough registers. Only six, effectively (or five if you don't save/restore the link register). In

Code: [Select]
a[x] = b[y] + c[z]

that's only enough to hold a, b, c, x, y, z but not the actual data! Need to keep at least some of them in the stack frame. Ugh. (32 bit x86 has the same problem of course)
Code: [Select]
mov b(y), a(x)   ; a[x] = b[y]   (destination on the right, IIRC?)
add c(z), a(x)   ; a[x] += c[z]  -- memory to memory, no data registers needed
it's not like [RISC designers] made things deliberately ugly or something.