General > General Technical Chat

New technology doesn’t exist; everything was invented in the 60’s


Cerebus:

--- Quote from: magic on June 01, 2021, 10:37:00 pm ---Theory of computability is still mathematical esoterica today and has little to do with designing real world machines capable of solving real world problems. Turing completeness easily "emerges" from an ability to follow any arbitrary finite flowchart of operations combined with "infinite" memory and pointers. That stuff has so many obvious uses that it would be built with or without Turing.

--- End quote ---

I've actually seen, in real life, someone set out to try and solve a computationally undecidable problem; they didn't know it was computationally undecidable, so it was giving them problems, to say the least. While computability does indeed seem like esoterica, it's surprising how often you run into it in real life, and I've seen it, and the closely related field of computational complexity, floor more than a few programmers.
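The undecidability point can be made concrete with the classic diagonal argument, which fits in a few lines of Python. Everything below (`defeat`, the toy oracles) is my own illustrative sketch, not something from the thread: given *any* claimed halting oracle, you can mechanically construct a program it misjudges.

```python
def defeat(halts):
    """Given any claimed halting oracle halts(f) -> bool,
    construct a program g that the oracle must misjudge."""
    def g():
        if halts(g):        # oracle predicts g halts...
            while True:     # ...so g loops forever instead
                pass
        # oracle predicts g loops, so g halts immediately
    return g

# A naive oracle that answers "loops forever" for everything:
always_no = lambda f: False
h = defeat(always_no)
h()  # returns immediately, so always_no was wrong about h

# An oracle that answers "halts" for everything is wrong the other way:
always_yes = lambda f: True
g = defeat(always_yes)
# calling g() would loop forever, contradicting always_yes(g) == True
```

The same construction works against any candidate oracle, however clever, which is why no general halting decider can exist.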

It's easy to say that you don't need Turing to get to computers, but in the same vein you don't need Newton to get to gravity, Maxwell to get to electromagnetism, or Watt or Newcomen to get to steam engines, but history is what it is. What is obvious now often wasn't obvious to everybody else until someone pointed it out. We managed to build things that didn't fall down for many thousands of years before Hooke and friends came up with elasticity, but until they did we couldn't say with certainty that a given building design was sound. Practical things need sound, proven theoretical underpinnings.

It's all well and good to claim that Turing completeness "emerges" from having a Turing machine, but you'd have to recognise that and then prove it if you start from the machine, completely arse about face from what Turing set out to do and did. The important rôle that Turing's work probably played in the creation of computers, beyond giving theoreticians the warm fuzzies, is likely rather more prosaic and familiar to any of us who have had a non-technical boss.

Without Turing:
Minion: "I want to build this machine I'm calling a computer. It'll cost $XYZ."
Boss: "Will it work?"
Minion: "I think so."

With Turing:
Minion: "I want to build this machine I'm calling a computer. It'll cost $XYZ."
Boss: "Will it work?"
Minion: "Yes, and I can prove it mathematically."

My parts just arrived, I'm off to solder something at last.

coppice:

--- Quote from: magic on June 01, 2021, 10:37:00 pm ---Theory of computability is still mathematical esoterica today and has little to do with designing real world machines capable of solving real world problems.

--- End quote ---
The theory of computability is foundational to security, and is widely important in all sorts of heavy compute activities.

CatalinaWOW:

--- Quote from: magic on June 01, 2021, 10:37:00 pm ---
--- Quote from: Cerebus on June 01, 2021, 02:26:18 pm ---On how long it takes to get from conception to application, there are some notable exceptions. Alan Turing's "On Computable Numbers, with an Application to the Entscheidungsproblem." was true mathematical esoterica when it was published in 1936, but I think it can be argued that this led to the first modern computer (Eniac 1945) in next to no time at all. (Yes, I know I'm conveniently ignoring the parallel work of Konrad Zuse, but his work was both mired and concealed by WWII).

--- End quote ---
Everything can be "argued" but in truth nothing new was invented in the 20th century and the first "Turing complete" computer was of course designed long before his birth :D

Theory of computability is still mathematical esoterica today and has little to do with designing real world machines capable of solving real world problems. Turing completeness easily "emerges" from an ability to follow any arbitrary finite flowchart of operations combined with "infinite" memory and pointers. That stuff has so many obvious uses that it would be built with or without Turing.

--- End quote ---

Since "nothing new was invented in the 20th century" I would really enjoy seeing the 19th and earlier century descriptions or uses of:

LASERs
LEDs
Transistors
Fission bombs
Fission reactors

For the first two one could argue that light sources such as candles were invented far earlier and these are just improved light sources; you could argue that transistors are just an improved embodiment of the spinning-ball speed governors that came a couple of centuries earlier; and the last two are really just improvements on fire. Personally I would say they were all new inventions.

magic:
You may want to carefully read the very first line of the very first post ;)


Meanwhile I came across this paragraph in the nonsensopedia article on von Neumann:

--- Quote ---Independently, J. Presper Eckert and John Mauchly, who were developing the ENIAC at the Moore School of Electrical Engineering, at the University of Pennsylvania, wrote about the stored-program concept in December 1943. [8][9] In planning a new machine, EDVAC, Eckert wrote in January 1944 that they would store data and programs in a new addressable memory device, a mercury metal delay line memory. This was the first time the construction of a practical stored-program machine was proposed. At that time, he and Mauchly were not aware of Turing's work.
--- End quote ---
No source is given as usual, but if true, the famous ENIAC would be another example of a "Turing complete" system designed without Turing.


--- Quote from: Cerebus on June 02, 2021, 11:29:53 am ---With Turing:
Minion: "I want to build this machine I'm calling a computer. It'll cost $XYZ."
Boss: "Will it work?"
Minion: "Yes, and I can prove it mathematically."

one year later:
Boss: "What do you mean, additions is O(n²)?!" :-DD

--- End quote ---
FTFY. The truth is, nobody cared about theoretical "universality". From hand-cranked calculators, through Babbage's Analytical Engine and the electro-mechanical cipher-breaking machines of WW2, up to ENIAC, stuff was designed with applications in mind, and people weren't stupid: they knew whether the application would be possible and how long it would take to execute. And if a new, more complex application arrived, the design was expanded. Apparently both Babbage and ENIAC reached the limit of "computability" without even knowing a limit exists. That's how easy it is. Even the page fault handler of x86 and C++ templates are said to be Turing-complete.
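The "finite flowchart plus memory and pointers" point really is that cheap to reach: a Minsky counter machine with just two instructions (increment, and decrement-or-jump-if-zero) is already Turing-complete, and an interpreter for it fits in a dozen lines. A hedged sketch in Python (the instruction names, program encoding, and `ADD` program are my own, not from the thread):

```python
def run(program, counters, max_steps=10_000):
    """Interpret a two-instruction Minsky counter machine.
    Instructions: ("inc", r) and ("decjz", r, target).
    Halts when the program counter leaves the program."""
    pc, steps = 0, 0
    while 0 <= pc < len(program) and steps < max_steps:
        op, *args = program[pc]
        if op == "inc":
            counters[args[0]] += 1
            pc += 1
        else:  # "decjz": jump if zero, else decrement
            reg, target = args
            if counters[reg] == 0:
                pc = target
            else:
                counters[reg] -= 1
                pc += 1
        steps += 1
    return counters

# Addition: drain counter 1 into counter 0.
ADD = [
    ("decjz", 1, 3),  # 0: if c1 == 0, jump past the end (halt); else c1 -= 1
    ("inc", 0),       # 1: c0 += 1
    ("decjz", 2, 0),  # 2: c2 is always 0, so this is an unconditional jump to 0
]

print(run(ADD, [3, 4, 0]))  # [7, 0, 0]
```

Two counters of unbounded size are enough for universality in this model, which is exactly the sense in which completeness "emerges" from an arbitrary flowchart plus memory; proving that, of course, is the part the theoreticians supplied.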

At the same time, execution time is a rather important factor that the "foundations of mathematics" crowd didn't care about at all, because building actual machines was not their point; the point was to come up with any non-handwavy definition of "effective method", and that's what they did. Turing may or may not have done some later work on complexity too, I don't know, but that's not what he is famous for and that's not what his "computing machine" paper was about.


--- Quote from: coppice on June 02, 2021, 01:00:16 pm ---The theory of computability is foundational to security, and is widely important in all sorts of heavy compute activities.

--- End quote ---
That's complexity, not computability.
Every problem that can be solved by brute force is computable, and that's another reason why computability is mostly irrelevant in practice.
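A concrete example of that distinction: Boolean satisfiability is trivially computable by brute force (try every assignment; the loop always terminates), yet it is NP-complete, so the interesting question is time, not computability. A sketch (the function name and clause encoding are assumptions of mine):

```python
from itertools import product

def brute_force_sat(n_vars, clauses):
    """Decide satisfiability by trying all 2**n_vars assignments.
    Always terminates (computability), but takes exponential time
    (complexity). A clause is a list of ints: +i means variable i,
    -i means NOT variable i (variables numbered from 1)."""
    for bits in product([False, True], repeat=n_vars):
        # literal is true iff its variable's bit XOR its negation flag
        value = lambda lit: bits[abs(lit) - 1] ^ (lit < 0)
        if all(any(value(lit) for lit in clause) for clause in clauses):
            return True
    return False

# (x1 or x2) and (not x1 or x2) and (not x2 or x1): satisfiable (x1=x2=True)
print(brute_force_sat(2, [[1, 2], [-1, 2], [-2, 1]]))  # True
# x1 and (not x1): unsatisfiable
print(brute_force_sat(1, [[1], [-1]]))  # False
```

Whether anything fundamentally better than the exponential loop exists is, of course, the P vs NP question, which is complexity theory's territory, not computability's.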

I will tell you who actually cares about that Turing/Church stuff outside of academia: advanced compiler design, static analysis, solvers for abstract logical problems, tools for formal verification of digital IC designs, that sort of thing. And even there, it's a 50:50 mix of undecidable problems and "merely" NP-hard ones. But of course, as Cerebus says, you won't get far in those fields if you can't recognize the former, so it is important in this context.

tpowell1830:
"... standing on the shoulders of giants..."  Isaac Newton
