New technology doesn’t exist; everything was invented in the 60’s
thermistor-guy:

--- Quote from: TimNJ on May 30, 2021, 07:11:15 pm ---Joke title and obviously not true..
...
What are some other examples?

--- End quote ---

A counter-example: the Polymerase Chain Reaction (PCR), invented in 1983 by Kary Mullis and used to amplify (and so detect and quantify) DNA. He shared a Nobel Prize in Chemistry for it, the first time the prize had been awarded for an experimental technique; PCR was that important. It changed biotechnology profoundly: there is before-PCR and after-PCR.
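Just to put a number on why amplification matters: each thermal cycle roughly doubles the copy count, so growth is exponential. A back-of-the-envelope sketch (Python; the 90% efficiency figure is purely illustrative):

--- Code: ---
# Idealized PCR yield: each cycle multiplies the copy number by (1 + efficiency).
def pcr_copies(initial_copies, cycles, efficiency=1.0):
    return initial_copies * (1 + efficiency) ** cycles

print(pcr_copies(1, 30))         # perfect doubling: 2^30 ~ 1.07e9 copies from one template
print(pcr_copies(1, 30, 0.9))    # 90% per-cycle efficiency: still ~2.3e8 copies
--- End code ---

Thirty cycles turn a single template molecule into around a billion copies, which is why vanishingly small samples become detectable at all.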
CatalinaWOW:

--- Quote from: Cerebus on June 01, 2021, 01:08:59 am ---
--- Quote from: nctnico on May 30, 2021, 07:15:46 pm ---Just the other day I was explaining to my youngest son that the majority of the math that drives our digital world was invented before or soon after 1900.

--- End quote ---

Try starting earlier, in the 18th century, and then carry on until the early 20th century and you're nearer the whole picture.

Leonhard Euler 1707-1783 (Not really a direct electronics connection, but you can't leave old Leonhard out.  :) )
Pierre-Simon Laplace 1749-1827
Joseph Fourier 1768-1830
Carl Friedrich Gauss 1777-1855
Friedrich Wilhelm Bessel 1784-1846
George Boole 1815-1864
Pafnuty Chebyshev 1821-1894
Gustav Kirchhoff 1824-1887
James Clerk Maxwell 1831-1879

To name just a very few of the 18th and 19th century mathematicians whose work underpins electronics. When you look at the dates you realise that a lot of the higher-level mathematics we rely on is really quite old.

--- End quote ---

Another way of looking at this is that it takes about 100 years for mathematical esoterica to develop real engineering applications. There is newer mathematics in use now, and stuff that is fresh in the math department today will probably find uses in the future.
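To make the connection concrete, here are a few of the places that mathematics turns up in routine circuit work (just a sketch, nothing exhaustive):

--- Code: ---
% Laplace: transfer function of a first-order RC low-pass filter
H(s) = \frac{1}{1 + sRC}
% Fourier: the same network's steady-state frequency response
\lvert H(j\omega) \rvert = \frac{1}{\sqrt{1 + (\omega R C)^2}}
% Boole: De Morgan's theorem, part of the algebra inside every logic gate
\overline{A \cdot B} = \overline{A} + \overline{B}
--- End code ---

Every one of those is 19th-century (or older) mathematics that a working engineer reaches for without a second thought.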
Cerebus:

--- Quote from: CatalinaWOW on June 01, 2021, 02:59:56 am ---Another way of looking at this is that it takes about 100 years for mathematical esoterica to develop real engineering applications. There is newer mathematics in use now, and stuff that is fresh in the math department today will probably find uses in the future.

--- End quote ---

I'm sure that is entirely true, but it's more of a perspective thing for me. Until nctnico forced me to think about it, it hadn't quite sunk in how old some really quite complex (no pun intended) maths is. A lot of the stuff from the likes of Laplace and Gauss is hard going for modern students, so think what a mountain it was to climb to invent it out of whole cloth in the first place.

On how long it takes to get from conception to application, there are some notable exceptions. Alan Turing's "On Computable Numbers, with an Application to the Entscheidungsproblem" was true mathematical esoterica when it was published in 1936, but I think it can be argued that it led to the first modern computer (ENIAC, 1945) in next to no time at all. (Yes, I know I'm conveniently ignoring the parallel work of Konrad Zuse, but his work was both mired in and concealed by WWII.)
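For anyone who has never looked at the paper, the machine it describes is almost embarrassingly easy to simulate today. A toy sketch in Python (the state table is mine, just a binary "+1" machine, not anything from Turing's paper):

--- Code: ---
# Toy Turing machine: adds 1 to a binary number written on the tape.
# Rules: (state, symbol read) -> (symbol to write, head movement, next state)
RULES = {
    ('seek_end', '0'): ('0', +1, 'seek_end'),
    ('seek_end', '1'): ('1', +1, 'seek_end'),
    ('seek_end', ' '): (' ', -1, 'add'),      # fell off the right end: start adding
    ('add', '1'): ('0', -1, 'add'),           # 1 + carry -> 0, carry moves left
    ('add', '0'): ('1',  0, 'halt'),          # absorb the carry
    ('add', ' '): ('1',  0, 'halt'),          # carry ran off the left end
}

def run(bits, state='seek_end', head=0):
    tape = dict(enumerate(bits))              # "infinite" tape as a sparse dict
    while state != 'halt':
        write, move, state = RULES[(state, tape.get(head, ' '))]
        tape[head] = write
        head += move
    return ''.join(tape.get(i, ' ') for i in range(min(tape), max(tape) + 1)).strip()

print(run('1011'))   # 11 + 1 = 12 -> prints 1100
--- End code ---

A handful of lines today, yet the 1936 paper had to invent the very idea that a table of rules like this could capture "computation" at all.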
dietert1:
The OP wanted to discuss semiconductor packaging and wiring/bonding. That's why I mentioned FPGAs = "runtime wiring" and AI systems = "self-wiring". Those technologies did not exist in the 1960s.
Since about 2000, chess computers have reliably beaten all humans. I guess it takes the coherent action of many, many human brains to build something like that; there is no single genius and no closed-form mathematical solution. The same will be true for autonomous vehicles. You may also know that digital computers have played a decisive role in solving old number-theory problems. In some sense the modern digital infrastructure is post-mathematics, and its time scale of development seems to be shorter.

Regards, Dieter
magic:

--- Quote from: Cerebus on June 01, 2021, 02:26:18 pm ---On how long it takes to get from conception to application, there are some notable exceptions. Alan Turing's "On Computable Numbers, with an Application to the Entscheidungsproblem" was true mathematical esoterica when it was published in 1936, but I think it can be argued that it led to the first modern computer (ENIAC, 1945) in next to no time at all. (Yes, I know I'm conveniently ignoring the parallel work of Konrad Zuse, but his work was both mired in and concealed by WWII.)

--- End quote ---
Everything can be "argued", but in truth nothing new was invented in the 20th century, and the first "Turing-complete" computer was of course designed long before his birth :D

The theory of computability is still mathematical esoterica today, and it has little to do with designing real-world machines capable of solving real-world problems. Turing completeness easily "emerges" from the ability to follow an arbitrary finite flowchart of operations, combined with "infinite" memory and pointers. That stuff has so many obvious uses that it would have been built with or without Turing.
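To illustrate the point, here is a complete interpreter for a Brainfuck-style language: eight fixed operations, one memory tape and one data pointer, which is already enough for Turing completeness if you pretend the tape is unbounded (toy code; the input command is omitted for brevity):

--- Code: ---
# Minimal interpreter for a Brainfuck-style language: a fixed "flowchart" of eight
# operations plus a memory tape and a data pointer.
def run(program, tape_size=30000):
    tape, ptr, pc, out = [0] * tape_size, 0, 0, []
    jumps, stack = {}, []
    for i, c in enumerate(program):           # pre-compute matching bracket positions
        if c == '[':
            stack.append(i)
        elif c == ']':
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    while pc < len(program):
        c = program[pc]
        if c == '>':   ptr += 1
        elif c == '<': ptr -= 1
        elif c == '+': tape[ptr] = (tape[ptr] + 1) % 256
        elif c == '-': tape[ptr] = (tape[ptr] - 1) % 256
        elif c == '.': out.append(chr(tape[ptr]))
        elif c == '[' and tape[ptr] == 0: pc = jumps[pc]   # skip loop body
        elif c == ']' and tape[ptr] != 0: pc = jumps[pc]   # repeat loop body
        pc += 1
    return ''.join(out)

print(run('++++++++[>+++++++++<-]>.'))   # 8 * 9 = 72 -> prints 'H'
--- End code ---

Nothing in there needs computability theory; it's just a loop, an array and an index, exactly the kind of thing that would have been built with or without Turing.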