Author Topic: Compiler Design at Cornell  (Read 3901 times)


Offline EEVblogTopic starter

  • Administrator
  • *****
  • Posts: 38995
  • Country: au
    • EEVblog
Compiler Design at Cornell
« on: October 16, 2024, 09:59:45 am »
Can someone please explain what on earth all this crap has to do with actual compiler design?
Looks more like some arsehat professor proving how smart and obnoxious they can be.

https://x.com/deedydas/status/1846366712616403366







 

Online tszaboo

  • Super Contributor
  • ***
  • Posts: 8174
  • Country: nl
  • Current job: ATEX product design
Re: Compiler Design at Cornell
« Reply #1 on: October 16, 2024, 10:26:36 am »
Quote from: EEVblog
Can someone please explain what on earth all this crap has to do with actual compiler design?
Looks more like some arsehat professor proving how smart and obnoxious they can be.

I've definitely had uni. courses like this. "Consider a programming language Xi (Greek character)". And then just a bunch of nonsense after that. It's just way too much abstraction to be useful. I remember absolutely zero from that class except the memes we made on the spot. And I got an EE degree at the end, not computer science.
 

Offline EEVblogTopic starter

  • Administrator
  • *****
  • Posts: 38995
  • Country: au
    • EEVblog
Re: Compiler Design at Cornell
« Reply #2 on: October 16, 2024, 10:31:40 am »
Quote from: tszaboo
I've definitely had uni. courses like this. "Consider a programming language Xi (Greek character)". And then just a bunch of nonsense after that.

Luckily I didn't have any of that in programming courses at uni, but then again, it was EE, not CS.
 

Online brucehoult

  • Super Contributor
  • ***
  • Posts: 4696
  • Country: nz
Re: Compiler Design at Cornell
« Reply #3 on: October 16, 2024, 10:46:37 am »
This is standard kind of stuff for people who are more logicians than programmers. They probably love Haskell and related things. And what they refer to as "strong typing".

For what the stuff at the top means see...

https://en.wikipedia.org/wiki/Natural_deduction

Basically, a notation for modus ponens.
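For the curious: the horizontal-bar notation puts the premises above the line and the conclusion below it. Modus ponens itself, in that dress, is just:

```latex
% Inference-rule notation: premises above the bar, conclusion below.
% Read: "if A implies B holds, and A holds, then B holds."
\frac{A \rightarrow B \qquad A}{B}
```

The course notes add a context Γ and a turnstile ⊢ in front of everything, but the shape of every rule is the same.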
 
The following users thanked this post: RAPo, 5U4GB

Online tszaboo

  • Super Contributor
  • ***
  • Posts: 8174
  • Country: nl
  • Current job: ATEX product design
Re: Compiler Design at Cornell
« Reply #4 on: October 16, 2024, 10:50:55 am »
Quote from: tszaboo
I've definitely had uni. courses like this. "Consider a programming language Xi (Greek character)". And then just a bunch of nonsense after that.
Quote from: EEVblog
Luckily I didn't have any of that in programming courses at uni, but then again, it was EE, not CS.
Yes, same, EE. It was unnecessary and exhausting to attend these classes, I probably skipped a few.
I think it was shoehorned into some class where they also taught about machine learning; understanding that concept was useful. I don't think this was even on any of the tests.
 

Online brucehoult

  • Super Contributor
  • ***
  • Posts: 4696
  • Country: nz
Re: Compiler Design at Cornell
« Reply #5 on: October 16, 2024, 01:05:52 pm »
Quote from: tszaboo
I've definitely had uni. courses like this. "Consider a programming language Xi (Greek character)". And then just a bunch of nonsense after that.
Quote from: EEVblog
Luckily I didn't have any of that in programming courses at uni, but then again, it was EE, not CS.
Quote from: tszaboo
Yes, same, EE. It was unnecessary and exhausting to attend these classes, I probably skipped a few.
I think it was shoehorned into some class where they also taught about machine learning; understanding that concept was useful. I don't think this was even on any of the tests.

I didn't have any of that rubbish because it hadn't been invented yet, lol. The closest we got to super formal stuff was Dijkstra's predicate transformer semantics with preconditions, postconditions, and weakest preconditions. They mentioned van Wijngaarden's two-level grammars defining Algol 68 but we didn't go into details.

I think I first came across "natural deduction" and its notation from hiring and otherwise interacting with new grads in around 2005.
 

Online golden_labels

  • Super Contributor
  • ***
  • Posts: 1439
  • Country: pl
Re: Compiler Design at Cornell
« Reply #6 on: October 16, 2024, 02:57:16 pm »
I had a Prolog and basic logic course in my master's degree curriculum. And that was a major leaning towards software engineering, not proper computer science.

Post-2010, a mandatory compiler design course almost always(1) implies a proper, hard computer science programme. If that's the case, the students are not being prepared to be webapp frontend sculpting drones; they are being reared to deal with the theoretical foundations of computing, the kind that is as relevant to IT as it is to physics, mathematics, (neuro)psychology, and philosophy. And that means deep abstraction, exactly like the material shown, which would otherwise be merely a mind exercise.


(1) If not, I'd say somebody is stuck in the 1980s and 1990s. It's not about the notation used; it's failing to see that there is too much knowledge to drag students through to dedicate an entire course to something as niche nowadays as compiler design. Maybe somebody treats it as a course to shape students, but that would be the only explanation I can see other than being out of one's mind. I myself had to pass philosophy, physics, and law.
People imagine AI as T1000. What we got so far is glorified T9.
 

Online brucehoult

  • Super Contributor
  • ***
  • Posts: 4696
  • Country: nz
Re: Compiler Design at Cornell
« Reply #7 on: October 17, 2024, 01:24:49 am »
Quote from: golden_labels
there is too much knowledge to drag students through to dedicate an entire course to something as niche nowadays as compiler design

I disagree!

I agree that rigorous formal semantics and designing whatever the people who used to do Haskell are working on today is too niche to make everyone do it. Designing stuff that might filter down into industry in 20 or 50 years from now.

But basic compiler knowledge is valuable everywhere! There are many things in practical programming -- web sites or otherwise -- that benefit from knowing how to design a DSL (Domain Specific Language), the syntax and semantics, parse it, interpret it or transform it to another language (whether actual asm/machine code or C or Python or whatever).

These days the lexing and to some extent parsing is often side-stepped by using JSON or XML or similar (YAML, S-exprs, TOML) instead of using lex&yacc (or similar), but it's still valuable to learn the basics of grammars, regexps, LL1, LALR, FSMs, NFAs.
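To make the "basics of grammars" point concrete, here is a minimal sketch of the classic exercise: a regex lexer plus a recursive-descent evaluator for integer arithmetic, following an LL(1) grammar. The grammar and names are my own, not from any particular course.

```python
import re

# Grammar (LL(1)):
#   expr   -> term   (('+' | '-') term)*
#   term   -> factor (('*' | '/') factor)*
#   factor -> NUMBER | '(' expr ')'

def tokenize(src):
    # Lexer: integers and single-character operators/parentheses.
    return re.findall(r"\d+|[-+*/()]", src)

def evaluate(src):
    toks = tokenize(src)
    pos = 0

    def peek():
        return toks[pos] if pos < len(toks) else None

    def take():
        nonlocal pos
        pos += 1
        return toks[pos - 1]

    def factor():
        if peek() == "(":
            take()                      # consume '('
            val = expr()
            assert take() == ")", "missing ')'"
            return val
        return int(take())

    def term():
        val = factor()
        while peek() in ("*", "/"):
            op, rhs = take(), factor()
            val = val * rhs if op == "*" else val // rhs
        return val

    def expr():
        val = term()
        while peek() in ("+", "-"):
            op, rhs = take(), term()
            val = val + rhs if op == "+" else val - rhs
        return val

    result = expr()
    assert pos == len(toks), "trailing input"
    return result
```

`evaluate("1+2*3")` gives 7 with the usual precedence. Swap the `int(take())` and the arithmetic for instruction emission, and you have the front half of a compiler.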

Web pages and web back ends often have quite complex protocols for communicating between them and keeping what is seen in the browser in sync with the back end database. I don't know how you're supposed to understand and maintain that if you don't know anything about state machines.
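As a toy illustration of that last point, such a sync protocol can be pinned down as an explicit transition table, where any event outside the table is a protocol bug. This is a made-up editor/server protocol purely for illustration:

```python
# A made-up editor/server sync protocol as an explicit finite state machine.
# Each (state, event) pair maps to the next state; anything else is a bug.
TRANSITIONS = {
    ("idle",   "edit"): "dirty",    # user starts changing the document
    ("dirty",  "edit"): "dirty",    # further edits keep it dirty
    ("dirty",  "save"): "saving",   # push changes to the back end
    ("saving", "ack"):  "idle",     # server confirmed; in sync again
}

def run_protocol(events, state="idle"):
    for ev in events:
        if (state, ev) not in TRANSITIONS:
            raise ValueError(f"protocol violation: {ev!r} in state {state!r}")
        state = TRANSITIONS[(state, ev)]
    return state
```

Making the table explicit means illegal sequences fail loudly instead of silently corrupting the back-end state.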

It's also useful, for example, to be able to at least understand and modify, if not design from scratch, things such as this simple compiler from a subset of Lisp to RISC-V machine code, which runs on a Raspberry Pi Pico 2 microcontroller.

http://www.ulisp.com/show?4Y20

This is very very far from the formal programming language material Dave showed in the first post in this thread, but it's definitely "compilers". And useful for real-world practitioners.

Maybe you'll be generating Javascript on the fly (or WASM), not RISC-V machine code, but all the ideas are the same.
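The structure really is the same regardless of target: walk the expression tree, emit instructions, and only the emitter changes between RISC-V, WASM, or JavaScript. A minimal stack-machine sketch (names are my own, not from the uLisp compiler):

```python
# Compile a nested-tuple expression tree, e.g. ("add", 1, ("mul", 2, 3)),
# into instructions for a tiny stack machine, then run them. Only the
# instruction format would change if the target were WASM or JS.
def compile_expr(node, code=None):
    if code is None:
        code = []
    if isinstance(node, int):
        code.append(("push", node))         # leaf: load a constant
    else:
        op, lhs, rhs = node                 # interior node: operator
        compile_expr(lhs, code)             # emit left operand first
        compile_expr(rhs, code)             # then right operand
        code.append((op, None))             # then the operation itself
    return code

def execute(code):
    ops = {"add": lambda a, b: a + b, "mul": lambda a, b: a * b}
    stack = []
    for op, arg in code:
        if op == "push":
            stack.append(arg)
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(ops[op](a, b))
    return stack[0]
```

Post-order tree traversal plus a stack is exactly how WASM and the JVM expect their code, which is why this shape keeps turning up.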
 

Online golden_labels

  • Super Contributor
  • ***
  • Posts: 1439
  • Country: pl
Re: Compiler Design at Cornell
« Reply #8 on: October 17, 2024, 08:17:11 am »
brucehoult, I believe you misunderstood what I meant. I didn't say this knowledge is not valuable. What I mean is that there is other, much more useful knowledge, and it all has to fit into an inadequately short time. Choices need to be made. There are hundreds of “valuable items” to put in your backpack, but the backpack is small.
People imagine AI as T1000. What we got so far is glorified T9.
 

Offline DiTBho

  • Super Contributor
  • ***
  • Posts: 4303
  • Country: gb
Re: Compiler Design at Cornell
« Reply #9 on: October 17, 2024, 12:24:44 pm »
Quote from: EEVblog
Can someone please explain what on earth all this crap has to do with actual compiler design?
Looks more like some arsehat professor proving how smart and obnoxious they can be.

In my opinion, it's a bit like getting to the third year of electronic engineering and having to choose between
  • (A): { optoelectronics, digital electronics (HDL, ASIC, etc.), telecom electronics (GSM, spread spectrum radio applications, ATM, etc.), power electronics }
  • (B): { solid state electronics }

A: 90% of the choices fall into package (A), and the common fundamentals of analysis, physics, computer science, analog electronics fundamentals, digital electronics, basic components, numerical analysis, and telecommunications are needed.

B: 10% of the choices fall into package (B); additional courses in differential and numerical mathematics are needed, and it is necessary to introduce Hilbert spaces, Hamiltonians, AutoCat, wave equations, etc., up to deep quantum mechanics.
The opposite of courage is not cowardice, it is conformity. Even a dead fish can go with the flow
 

Offline xvr

  • Frequent Contributor
  • **
  • Posts: 536
  • Country: ie
    • LinkedIn
Re: Compiler Design at Cornell
« Reply #10 on: October 17, 2024, 01:35:58 pm »
This is a very academic style of presentation; in real life it is practically never used.
But it can be even worse: when someone attempts to present such an academic style in ordinary language.
I once tried to write a program to simplify systems of logical expressions (for a CPLD) using a book that described algorithms for this. First, the authors used exclusively single-letter variables, and by the end of the book they had almost exhausted the alphabet. Of course, there are no comments in the algorithms (and very few in the text of the book itself). And they saved the most amazing thing for last: some variables in the algorithms turned out to be not numbers but vectors, with arithmetic on them performed in Galois fields. Of course, there was no mention of this in the text of the algorithms or in the rest of the book.
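For anyone hitting the same book: in GF(2), the smallest Galois field, addition of bit-vectors is plain XOR (1 + 1 = 0, no carry) and the inner product is AND followed by parity, which is presumably what those algorithms were silently assuming. A sketch:

```python
# Bit-vectors over GF(2), represented as Python ints used as bit masks.
# Addition is XOR (1+1 = 0, no carry), so every vector is its own negative.
def gf2_add(a, b):
    return a ^ b

def gf2_dot(a, b):
    # Inner product over GF(2): multiply component-wise (AND), then sum
    # mod 2, i.e. take the parity of the popcount.
    return bin(a & b).count("1") % 2
```

Once you know the field, the "vector arithmetic" in such books collapses to a couple of bitwise operations.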
 

Offline DimitriP

  • Super Contributor
  • ***
  • Posts: 1415
  • Country: us
  • "Best practices" are best not practiced.© Dimitri
Re: Compiler Design at Cornell
« Reply #11 on: October 18, 2024, 01:56:19 am »
Quote
webapp frontend sculpting drones
    :-DD

   If three 100  Ohm resistors are connected in parallel, and in series with a 200 Ohm resistor, how many resistors do you have? 
 

Offline KE5FX

  • Super Contributor
  • ***
  • Posts: 2088
  • Country: us
    • KE5FX.COM
Re: Compiler Design at Cornell
« Reply #12 on: October 18, 2024, 04:51:02 am »
Answer: "Meh, go ask ChatGPT.  I can't be arsed to do a robot's job for free. ... Wait, what?  I'm paying you?  Excuse me, I have pressing business at the registrar's office."
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 7133
  • Country: fi
    • My home page and email address
Re: Compiler Design at Cornell
« Reply #13 on: October 18, 2024, 07:01:12 am »
Some people latch on to a notation rather than the ideas and concepts that notation describes.

Requiring some rigorous formal notation is important, because it is our best tool for organizing human thought and logic.  The problem is, latching on to a specific notation, to the exclusion of others, is always an error.  It is only the conveyance of the idea and concept, not the idea and concept itself.  Ceci n'est pas une pipe, as René Magritte illustrated in the Treachery of Images.

When the lecture notes fail to convey the key concepts and ideas, it is the lecturer's fault. It does not matter if they are the most brilliant human scientist ever to live, the wisest of us all: if they cannot convey the ideas and concepts across, they are a shitty lecturer.

Here, the problem is the ultra-dense notation.  There is no need for condensing and abstracting programming logic and rulesets to such a degree, it simply hinders the propagation of the ideas and concepts.

Being a good lecturer is not easy, and is an utterly separate skill from domain knowledge.  It is why science popularizers are so famous and well-regarded: they managed to do it well to an audience with wide variety of backgrounds.

All the best physics lecturers and professors I've met have had one thing in common: they never hesitated or thought twice about switching notations, and simply used whatever conveyed the idea across best.  They all were passionate about the ideas and concepts, and not of any specific notation.  Some end up inventing their own, simply to use whatever fits their workflow best and lets them concentrate on the content instead of the appearance.
 
The following users thanked this post: nctnico, tggzzz

Online nctnico

  • Super Contributor
  • ***
  • Posts: 28351
  • Country: nl
    • NCT Developments
Re: Compiler Design at Cornell
« Reply #14 on: October 18, 2024, 06:09:10 pm »
Quote from: golden_labels
brucehoult, I believe you misunderstood what I meant. I didn't say this knowledge is not valuable. What I mean is that there is other, much more useful knowledge, and it all has to fit into an inadequately short time. Choices need to be made. There are hundreds of “valuable items” to put in your backpack, but the backpack is small.
Very true, but OTOH it is important to know a bit about the basics: at some point the grey-haired people will retire to a tropical island, and nobody will understand the underlying theory of what they have programmed.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 15694
  • Country: fr
Re: Compiler Design at Cornell
« Reply #15 on: October 18, 2024, 07:51:20 pm »
Also keep in mind some of these lectures are not made just for teaching a topic, but for selecting students. Uni is not just for teaching stuff, it's also a selection "tool".
 

Online brucehoult

  • Super Contributor
  • ***
  • Posts: 4696
  • Country: nz
Re: Compiler Design at Cornell
« Reply #16 on: October 19, 2024, 07:33:06 am »
Some bloke by the name of Chris Lattner (@clattner_llvm) has commented on this on twitter:

Quote
I’m glad I didn’t take this compiler class, I would have also gotten 0/100. No wonder people think compilers are scary, they shouldn’t be taught this way!  It’s also flawed in many ways (and old) but I think this is more approachable https://llvm.org/docs/tutorial/
 

Offline westfw

  • Super Contributor
  • ***
  • Posts: 4341
  • Country: us
Re: Compiler Design at Cornell
« Reply #17 on: October 19, 2024, 07:42:33 am »
I probably learned more in my compilers class than in any other class that I failed.  :-)
(Failure to complete a required final project.  I just had a lot of trouble getting excited about writing a “compiler” that didn’t actually generate code.  (Though I realized sometime after graduating (EE) that I had a side project that might have qualified.  Based more on a Byte article than material from the class, though.  Sigh.))

 

Online golden_labels

  • Super Contributor
  • ***
  • Posts: 1439
  • Country: pl
Re: Compiler Design at Cornell
« Reply #18 on: October 19, 2024, 10:33:34 am »
Quote from: nctnico
(…) it is important to know a bit about the basics as at some point the grey haired people will move to a tropical island for retirement and nobody understands the underlying theory of what they have programmed.
Oh, I certainly agree that knowing more is usually(1) valuable! But that's from one's own perspective. The view of the people constructing the curriculum is very different: their task is fitting 10,000 pigeons into 50 pigeonholes, and Dirichlet correctly observed that this isn't possible.

Though, perhaps, I find it valuable for different reasons. Humanity isn't going to run out of people possessing the necessary knowledge; if the process discussed could cause that, we'd already be back before the Paleolithic. It could only happen if two conditions were met: first, that people stop learning that “extra” knowledge; second, that the “extra” knowledge remains relevant. But clearly, cutting down one curriculum doesn't eliminate others: it's specialization, not extinction. And the needs are not static either. Yes, we do retire, but more than ever, so does the knowledge.


(1) I feel tempted to write “always”. But there are edge cases. Knowing too much becomes a burden in efficient decision making, under deadline pressure, or when perfectionism kicks in.

People imagine AI as T1000. What we got so far is glorified T9.
 

Online tszaboo

  • Super Contributor
  • ***
  • Posts: 8174
  • Country: nl
  • Current job: ATEX product design
Re: Compiler Design at Cornell
« Reply #19 on: October 19, 2024, 10:36:47 am »
Quote from: westfw
I probably learned more in my compilers class than in any other class that I failed.  :-)
(Failure to complete a required final project.  I just had a lot of trouble getting excited about writing a “compiler” that didn’t actually generate code.  (Though I realized sometime after graduating (EE) that I had a side project that might have qualified.  Based more on a Byte article than material from the class, though.  Sigh.))
Engineers solve problems. The problem with these classes is not that you need to learn compiler design; the problem is that you need a passing grade to get a degree.
Solutions vary: you might take the class with a different teacher or in a different semester, or take a different class for the same amount of credits. Learning all this is probably just one way of solving the problem.
 

Offline DiTBho

  • Super Contributor
  • ***
  • Posts: 4303
  • Country: gb
Re: Compiler Design at Cornell
« Reply #20 on: October 19, 2024, 01:25:08 pm »
Quote from: SiliconWizard
Also keep in mind some of these lectures are not made just for teaching a topic, but for selecting students. Uni is not just for teaching stuff, it's also a selection "tool".

(That's what I was getting at above when I described the distribution of choices in the third year of electronic engineering:
{ 90%, 10% } -> there was undoubtedly a selection for those (the 10%) who would "digest",
and those who would "digest_and_appreciate", lessons based on the super advanced mathematics needed for quantum mechanics.)
The opposite of courage is not cowardice, it is conformity. Even a dead fish can go with the flow
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 7133
  • Country: fi
    • My home page and email address
Re: Compiler Design at Cornell
« Reply #21 on: October 19, 2024, 03:34:47 pm »
Still, using one specific dense notation in course materials as the selector/filter is stupid: you'll select only for overachievers that way.

Programming language theory and compiler theory grew out of lambda calculus, which has its own established "standard" notation (similar to basic algebra).  The dense notation used is based on combining that with the standard notation for inference rules, where premises are above the line and conclusions below the line, plus sequent notation, formulas ⊢ consequents, where at least one consequent is true if all the formulas are true (which is why ⊢ is typically read as "proves", "implies", or "yields").
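As a concrete instance, the typing rule for function application combines all three ingredients: a context Γ, the turnstile, and premises over a bar:

```latex
% If e1 has function type t1 -> t2 in context Gamma, and e2 has type t1,
% then the application e1 e2 has type t2.
\frac{\Gamma \vdash e_1 : \tau_1 \to \tau_2 \qquad \Gamma \vdash e_2 : \tau_1}
     {\Gamma \vdash e_1\,e_2 : \tau_2}
```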

It isn't difficult to learn, but it takes months if not years of use for it to become faster than written human language.  The entire notation can be described in just a couple of pages, so the notation being a question mark to Cornell students is, in my opinion, a definite indicator of a poor lecturer and/or poor lecture notes.  If it is an intentional "student filter", their selection criteria are such that I want nothing to do with Cornell students or alumni.

Plus, the inference rule notation has that annoying significant whitespace issue above the line: whitespace acts as a separator between rules.  Comma cannot be used, because it is already a formula and consequent separator.  I can accept significant whitespace at the beginning of a line like in Python, but as a list separator, I find it extremely slow to parse correctly.
 

Offline Haenk

  • Super Contributor
  • ***
  • Posts: 1278
  • Country: de
Re: Compiler Design at Cornell
« Reply #22 on: November 03, 2024, 08:53:11 am »
This is the "science" part of "computer science"; it is not meant to be "practical programming on a RasPi".
Even though I only had "basic" maths at university (compared to a "full on" maths degree), we had our share of theoretical maths and abstraction. Not quite at the level shown, but not too far away.
You will learn the nomenclature, and this will make it an almost readable text with maximum efficiency of information. Of course, this is an extremely abstract way of doing things, with the advantage of creating "provable" concepts, which is what science requires.
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 7133
  • Country: fi
    • My home page and email address
Re: Compiler Design at Cornell
« Reply #23 on: November 03, 2024, 08:16:05 pm »
Apologies if I sound like I want to have the last word in this thread.  I do not, and it is not my intention. 

I just feel like my points have been ignored by those who defend the lecture approach.  I've been bitten by this myself (it hurt my pride, being one of the few courses I've ever failed) and have overcome it quite successfully, so I have rather strong personal opinions on this.  I believe that applying something like what I describe below will help others learn better, and especially more thoroughly, leading to understanding rather than memorization (which I detest, but which is sufficient to pass most University courses).

Quote from: Haenk
You will learn the nomenclature, and this will make it an almost readable text with maximum efficiency of information.
Incorrect in this context.

Maximum density of information ≠ maximum efficiency in learning.  This is the core of my point above.

There is a reason humans tell stories, why we have parables and archetypes.  It is not cultural at all; only their content varies from culture to culture, but all human cultures have them.  The reason is how we learn.

Providing the minimal facts, or the proof of said facts, without proper context, is an extremely poor and inefficient way of lecturing/teaching/learning.

I realized this myself when I failed a physics math course (one of the very few courses I've ever failed) on special functions (Bessel functions et cetera) and second-order and higher nonlinear differential equations.  The lecturer said absolutely nothing about applying these functions, and instead spent the lectures showing how to derive them and prove them, with "applications left as an exercise".  (About a third of the students in that class did pass.)

At that point, I'd already created a couple of courses for art and design students on basic computer skills and such, so I created my own lecture notes (about ten pages) from the opposite viewpoint: I noted the basis of the derivation and the proof, but looked up all the various applications, and their base idea, and how they linked to the proof or derivation or both, and the base solution methods (separation of variables and so on).  Ended up getting 4/5 ("B" on A-F scale), on a course where pass rate was about 35%, and 65% or so failed.

There are many different ways to learn, and that is why you do not want maximum density in your learning materials.  The opposite, in fact: for best results, you want to provide multiple approaches to the same conclusion, from which the students can choose the one that suits them best.  The most skilled lecturers do this automatically, testing the most likely one, and quickly changing to a different tack if the students do not respond positively to that particular one.  The most skilled science popularizers can construct one that will work for almost all readers/listeners.  For fun, just compare them to the structure of legends, parables, and archetypical hero stories!

It is also exactly this that makes teaching or lecturing a completely different skill from domain knowledge.

Information-dense exact notation does have its place: when that knowledge and understanding is applied, either in further science, or to solve problems or prove results in relation to that knowledge and understanding.  It should be built either as part of the learning process (but always secondary), or afterwards.  If you disagree, consider why learning Standard Chinese is easier if you learn Hanyu Pinyin first.
« Last Edit: November 03, 2024, 08:20:42 pm by Nominal Animal »
 
The following users thanked this post: rhodges, magic

Offline abeyer

  • Frequent Contributor
  • **
  • Posts: 408
  • Country: us
Re: Compiler Design at Cornell
« Reply #24 on: November 03, 2024, 10:53:23 pm »
Quote from: Nominal Animal
Information-dense exact notation does have its place: when that knowledge and understanding is applied, either in further science, or to solve problems or prove results in relation to that knowledge and understanding.  It should be built either as part of the learning process (but always secondary), or afterwards.  If you disagree, consider why learning Standard Chinese is easier if you learn Hanyu Pinyin first.

While I don't disagree with the premise that it would be easier to decouple learning the notation from learning the concepts... given the context of this being something that would be taught as an upper-level undergrad or graduate course, there's also going to be an expectation that students are at least reading and engaging with academic research, if not doing their own, and being able to consume the notation is pretty much table stakes there.

This style was never intended to maximize efficiency in learning, but rather for communication among the learned. And even there, I think it's debatable to what degree it's actually more efficient vs just customary and expected (and a bit of a shibboleth.) But either way, someone getting started in the field has to at least learn to read it, and trying to buck the trend and write in a different style isn't impossible, but is definitely a risk. I've seen people do it, but the ones who succeed tend to be people who have really good research content to start with, and are exceptionally good communicators and writers of prose on top of that. Getting to that level if you aren't already might be even more difficult than just learning the notation.
« Last Edit: November 03, 2024, 10:55:54 pm by abeyer »
 

