How correct is this video about the future of silicon

pcprogrammer:
Being an engineer who is out of touch with current development, I don't know if what is presented in this video is correct.

I have read before about the tunneling effect mentioned as a holdback for smaller transistors, and understand it to some extent, but I wonder whether the other materials mentioned can indeed overcome this problem.



Also in the comments someone wrote:


--- Quote ---Silicon photonics is what I have the most hope in for in the next couple of decades. Imo, transitioning from electricity to light is just the most logical step forward. It will set moores law back by quite a bit, but the insane clock rate of the processors will make up for it. Most modern keyboards already use light to transmit signals.

--- End quote ---

And here I wonder about these keyboards using light to transmit their signals?

Sure, my communication with the eevblog server runs via fiber optics and I understand how that works, but keyboards with a fiber optic connection I have not heard of before. Audio equipment using fiber, yes, but other computer peripherals, no.

Another interesting aspect, of course, is switching with photons and how this can be done. Is it possible to make something similar to a transistor for photons? But this falls outside my knowledge too.

But the technology far beyond my grasp is quantum computing  :o
Will it indeed become a replacement for what we have now on our desks, and how will they be programmed?
Being a very binary person used to sequential instructions, making sense of zeros and ones, I wonder about the intermediate states of these qubits  :o

So there is a wide area of discussion possible here.

pcprogrammer:
Apparently not much interest in these topics.  :-//

But to address one issue of what I mentioned: it dawned on me that the "modern keyboards" referred to as using light to pass signals are the ones that use infrared, like remote controls. You can hardly call that modern, since the infrared remote control has been around since the nineteen-eighties.

I also doubt it is used much with computers. Wireless based on radio signals is much more likely.

T3sl4co1l:
I mean, radio is light, too.  Don't be a visible chauvinist. ;D

As I understand it, at least as of the last state of technology I read much about -- photonics is a lie because it can't be miniaturized.  Roughly speaking, nothing can be smaller than the wavelength of the light itself (within the medium).  So, ~50nm sorts of things, but we're already well below that.  The other oft-touted premise is faster interconnects, which apparently hasn't come to fruition: it seems the burden of a transmitter, detector, and perhaps all the encoding/decoding to make that useful, makes the cross-chip propagation delay that much worse than just using stupid old wires and repeaters (buffers).
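
To put rough numbers on that wavelength floor -- a minimal back-of-envelope sketch, with illustrative textbook wavelengths and refractive indices (nothing process-specific):

--- Code: ---
# Wavelength inside the medium, lambda0 / n, which sets a rough floor on
# photonic feature size.  All values are illustrative textbook numbers,
# not tied to any particular process.
cases = {
    "1550 nm (telecom) in Si, n ~ 3.5": 1550 / 3.5,
    "1310 nm (telecom) in Si, n ~ 3.5": 1310 / 3.5,
    "850 nm (VCSEL) in SiO2, n ~ 1.45":  850 / 1.45,
}
for name, lam_nm in cases.items():
    print(f"{name}: ~{lam_nm:.0f} nm")
--- End code ---

Give or take the exact wavelength and index, that floor lands in the hundreds of nanometers, while transistor geometries are far below that.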

The equivalent circuit in play here is a loss-dominant (RC) transmission line, for most metal layers with respect to substrate or other metal layers as ground plane.  The delay of such a line grows quadratically with length (Elmore delay ~ r*c*L^2/2, for wire resistance r and capacitance c per unit length), and usable bandwidth falls off accordingly.  Clearly you combat that by adding repeaters -- just toss in an inverter every so often, which incurs some gate delay, but by balancing the two you make total delay linear in length and avoid the quadratic cost.  And it hardly takes any additional space, so it's feasible for wide buses (256 bits+).
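
A minimal sketch of that trade-off -- the r, c, t_buf, and L values below are made-up (but plausible-order) assumptions, not any real process:

--- Code: ---
import math

# Toy repeater-insertion model for an on-chip RC wire.
r = 2e5         # wire resistance per metre [ohm/m] (assumed)
c = 2e-10       # wire capacitance per metre [F/m]  (assumed)
t_buf = 10e-12  # delay of one repeater (inverter) [s] (assumed)
L = 5e-3        # wire length: 5 mm across a die

def total_delay(n_segments):
    """N buffer delays + N Elmore delays of the shorter L/N wire pieces."""
    seg = L / n_segments
    return n_segments * (t_buf + 0.5 * r * c * seg**2)

unbuffered = 0.5 * r * c * L**2   # quadratic in L
n_opt = max(1, round(L * math.sqrt(r * c / (2 * t_buf))))
print(f"unbuffered delay: {unbuffered*1e12:.0f} ps")
print(f"{n_opt} repeaters:  {total_delay(n_opt)*1e12:.0f} ps")
--- End code ---

The optimal segment count falls out of d/dN [N*t_buf + r*c*L^2/(2N)] = 0, i.e. N = L*sqrt(r*c/(2*t_buf)); past that point, buffer delay dominates.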

The alternative is an optical waveguide, which either uses metal layers and dielectric (SiO2), or additional materials (perhaps including a different index to make a waveguide akin to optical fiber), to guide waves without wires.  These tend to have higher Q factors, but are bulky and have other sometimes undesirable properties (dispersion: velocity varies with frequency).  So the interconnect itself might be good (give or take optical index, but it's likely better than the RC TL overall), but the problem is generating and receiving the signals themselves, which inevitably must be translated to logic-level voltages in wires, among other things.

You might ask about the intermediate case; and indeed there are modest-Q transmission lines (LC-dominant).  AFAIK, these involve many metal layers, so that reasonable conductor thickness can be built up (for ~GHz on monolithic processes, layers are comparable to or thinner than the skin depth, remember!), and so that reasonable volume can be had for the dielectric (TLs are waveguides that support a DC mode: remember the wave flows in the space between conductors).  So you might have an, I don't know, eight metal layer process, and in those ~um of height, a "bathtub" structure can be made, with a wall of stitching vias through all layers making up the sidewalls (well, all layers that don't need to cross the TL -- obviously you can't go up or down forever in the stack if you also need signals or power to cross these TLs!), and solid metal on the bottom for the base, or perhaps substrate still (preferably at degenerate doping levels for higher conductivity).  The line itself is some metal layers near the top in the middle.  So, surrounded by SiO2 (an excellent dielectric, at least) and made of Al or Cu (mediocre given the tiny thicknesses, but usable).  The PCB equivalent is CPWG, but with more layers relieved under the trace, and vias densely packed (unlike for PCBs, "drilling" is 100% free on a photolithographic process).
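
On that skin-depth remark, a quick check with standard copper handbook numbers (the frequencies are just sample points):

--- Code: ---
import math

# Skin depth delta = sqrt(2*rho / (omega*mu)), to check the claim that
# on-chip metal layers are comparable to or thinner than the skin depth
# at GHz frequencies.  Standard handbook material values.
rho_cu = 1.7e-8        # resistivity of copper [ohm*m]
mu0 = 4e-7 * math.pi   # permeability of free space [H/m]

for f_ghz in (1, 10, 100):
    omega = 2 * math.pi * f_ghz * 1e9
    delta = math.sqrt(2 * rho_cu / (omega * mu0))
    print(f"{f_ghz:>3} GHz: skin depth ~{delta*1e6:.2f} um")
--- End code ---

About 0.7 um at 10 GHz -- indeed comparable to typical on-chip metal thicknesses.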

I don't know that such TLs are feasible for logic chips (CPUs etc.), haven't heard anything with respect to that really -- but they can be used for low-Q and resonant structures, like the inductors and tuned circuits in WiFi radios and such.  A typical on-chip spiral inductor might be, whatever, a few nH? (I forget the typical orders of magnitude..) And have a Q around 5-12 in the GHz range.  So, pretty piss-poor in general, but definitely good enough to be usable.
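
And a one-liner's worth of math on what that Q implies -- the 2 nH / 5 GHz example values are picked purely for illustration:

--- Code: ---
import math

# Q = omega*L / R_series for a spiral inductor; solve for the series
# resistance a given Q implies.  L_h and f are assumed example values.
L_h = 2e-9   # inductance [H]
f = 5e9      # frequency [Hz]
for q in (5, 12):
    r_series = 2 * math.pi * f * L_h / q
    print(f"Q = {q:2d} at {f/1e9:.0f} GHz implies R_series ~ {r_series:.1f} ohm")
--- End code ---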

And that's not to say there's no application for optics and waveguides.  It's basically the only feasible way to get signals off chip in the fractional-THz range and up.  Hence we have RADAR chips today which are something like a phased array over the chip itself, with the package being a "transparent" (i.e., dielectric) window on top.  There are optical devices (probably 10G+ fiber transceivers among them?, but I haven't read about them in a long time, I don't know what current tech is -- probably a conventional or VCSEL laser diode plus an electro-optic or electro-absorption modulator, something like that?) with the optical interface integrated with the package, no faffing about on-board with any of that alignment mess.  (Not that you'd be likely to work with PCB designs of them anyway, as the modules come standard as well, i.e. SFP and such -- unless you work at such an OEM, you're probably just putting in the module socket, even!)

So, as for the fate of silicon -- it's just clickbait.  Silicon is too damn useful to ever go away.  I mean, how do you know -- of course you can't -- but just for the sake of comparison: we have all these fancy compound semiconductors and GHz computers and (nearly) THz radios now, and... we still have fuckin' CD4000 logic floating around, man.  It's not going away, it's only getting better as far as having everything from the highest to the lowest technology node available to us.

Put another way: technology very rarely, if ever, wholly discards things.  There aren't many stepping stones, at least in practice.  Vacuum tubes are probably the most important example of one -- we'd never have developed the level of technology we did without them, but we have superseded them in all but a few niche cases (and many of those are more like physics apparatus than electronic devices, anyway).  That is, I posit it was a necessary condition to have vacuum tubes, to then develop semiconductors.  The chemical processes alone (like precision analysis (AA, MS) and zone refining) require control so precise (or signals so wholly electronic) that electronic control is almost mandatory; while limited aspects of the semiconductor process could likely be implemented mechanically (or, say, with mag amps), I doubt the whole thing could've been developed without something as general as the vacuum tube -- or the transistor, hence it's a self-sustaining technology, but not self-starting.  (Whereas vacuum tubes are tolerant enough of chemical variation, and mostly depend on mechanical tolerances; precision metalforming is a precondition, which had been ably solved through the 19th century.  And all the required chemical elements were available by then, too.)

And even then, it's not that vacuum tubes -- or, say, horses -- are gone.  They're just not economic dependencies now.  Almost everything that's been developed, will still be around in some capacity, albeit maybe just by enthusiasts and historians.

So, I don't see silicon ever going away.  It's just too goddamned useful.  Even if only used as a base to build on (see current eGaN transistors for instance), it's just that good.

As for the main active semiconductor, who knows; it seems likely that, at least the way things are going these days, technology will continue to spread out -- that is, literally, widen.  Clock speeds may even decrease, but total computation continues to rise, moving ahead with more generalized types of computation, like neural network stuff.  This is probably reading into current hype too much, but just to say: it's conceivable that NN/ML/AI stuff will continue to miniaturize and implement deeper and deeper in the hardware (not necessarily as memristor arrays and the like, which might not be consistent enough even with significant development -- but arrays of specialized "neuron"-like CPUs can do the job), and eventually lead to basically silicon (or whatever) "brains" that are "taught" by dumping in a bitstream, and then you get out whatever (still approximate) function you need computed.  Along with this, low-loss logic methods may be combined with multilayer chips to get unprecedented volumetric density and compute-per-watt performance.

And to facilitate these approximate methods (and perhaps some of the computing becomes stochastic by itself), new programming paradigms will have to be developed that check the results and correct them, either by reinforcement, or by more fundamental (error-correcting computing?) methods.  There's probably some kind of problem space, similar to zero-knowledge proofs, where, given an exact specification of some problem (function, algorithm, etc.), a process can be devised to compute it in enough alternative ways simultaneously, and combine the results into a more-correct result, either giving complete coverage (all errors provably corrected), or repeating the process until it's arbitrarily correct.
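
As a toy illustration of the "compute it several ways and combine the results" idea -- plain old triple modular redundancy with majority voting, nothing specific to any proposed scheme, and the flaky adder is entirely made up:

--- Code: ---
import random

def flaky_add(a, b, error_rate=0.1):
    """An adder that occasionally returns a corrupted result."""
    result = a + b
    if random.random() < error_rate:
        result ^= 1 << random.randrange(8)  # flip a random low bit
    return result

def voted_add(a, b, copies=3):
    """Run the computation several times and keep the majority answer."""
    results = [flaky_add(a, b) for _ in range(copies)]
    return max(set(results), key=results.count)

random.seed(0)
trials = 10_000
errors = sum(voted_add(2, 3) != 5 for _ in range(trials))
print(f"residual error rate with voting: {errors/trials:.4f}")  # << 0.1
--- End code ---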

Quantum computing seems unlikely to be better than luggably portable, and at significant power consumption at that, due to the need for cryocooling.  But noise coupling is an arbitrary thing, and some scheme may yet be found that can be effectively shielded from the disturbances of ambient temperature, meanwhile growing to arbitrary sizes (qubits).  In general, once more people are working in this domain (designing, programming, using), solutions to far more difficult problems will become available, especially to notoriously intractable problems in QM for example -- and so bootstrapping can continue.  There is provably no limit to the power available from condensed matter*, I mean, up to the limit of information theory (information flow is power), so we have plenty of orders of magnitude to continue, we just have to figure out how.

*There's a proof in physics, equivalent to the Halting Problem.  It's no accident that condensed matter physics is notoriously difficult; you could potentially be integrating over a manifold of computers, and, who knows what the hell that even means in terms of statistical mechanical properties!

Tim

pcprogrammer:
Thanks Tim,

some bits cleared up and others darkened a bit.  :o


--- Quote from: T3sl4co1l on February 26, 2023, 11:37:35 am ---I mean, radio is light, too.  Don't be a visible chauvinist. ;D

--- End quote ---

Already wrote that I'm a binary person  :-DD

Light is light and radio signals are radio signals.

But I see what you mean.

I agree with you that silicon is very useful and available in abundance, and I don't see it fully replaced by something else. But from what I have read about it, and what is shown in the video, it is reaching the limits of miniaturization.

So it is just wait and see what is thought of next.

Thinking back to my early days when I started to play with electronics, it was with germanium transistors and diodes, and if I recall correctly they broke down more easily than the later silicon ones. But this is long, long ago.

About the mix of technologies, I was also wondering about the size problems that would bring. Even though fibers are very thin, the connection units are still big compared to them, and it would take a lot of them to interface between a CPU and memory. Sure, serialization can be used, but that reduces the speed, limiting the gain of switching to light.

And even when faster systems are developed the software will ruin it again, as seen over the many years since the first home computer.  >:D

But kidding aside, today I looked a bit more into quantum computing, and I have some understanding of it now and think it will always be a special branch of computing. They are apparently good for solving complex algorithms based on a lot of mathematics. Not really your average embedded controller that you write some nice code for in some mainstream language.



This video provided some insights, but most of it is still above my pay grade. With the superposition and entanglement I get this parallel-universes vibe off of it. We put an apple in the box, and as long as it is closed it can be any fruit you like, but as soon as we open it, it is an apple -- that kind of logic.

We think it is the right answer because the computer says it is the answer, but is it the right answer out of all the probabilities that could be the right answer? I will just have to wait and see what the future brings.

But it is good for the brain to ponder on these things once in a while.

Cheers,
Peter

T3sl4co1l:
Yeah, I have no idea if QC will bring anything more than niche or mainframe sorts of applications, but as with the other materials, there's no proof either way whether it can or can't be miniaturized or generalized like anything else.

And there's a ton of "room at the bottom" for molecular computing, which might be anything from synthetic chemistry to genetic engineering, most likely carbon-based in either case.  The problem space for that kind of work is just so vast and complicated that we have no hope of working with it right now, but it's very much something QC can contribute to (and is presently, as I understand it, e.g. protein folding).  Maybe that's next, or maybe we need one more stepping stone first.

And yeah, QC is above my grade, to say the least.  I certainly know the basics of QM, well enough to understand a low-level description of QC hardware, say -- but how that relates to what space of equations you can program into, and solve with, such a system, I don't know.  The basic idea is to effectively run a superposition of states, then "refrigerate" away the "lossy" (read: by applied constraints*) energy, cooling it towards a desired final state, which happens to be your desired transformation.  In this way, most (all?) states of the system are explored, while the system is annealed towards a local (or hopefully global) minimum.  (And global minima are reachable via tunneling between states, so that's big.)

*The setup is something like setting the coupling constants (phase and magnitude) between qubits.  Whereas on a conventional computer, you'd initialize the memory of a process then let it run, here the system is set up with initial conditions (at least, to the extent those states can be set, i.e. energy levels or even superpositions thereof on the qubits), and coupling factors between qubits, then left to run (perhaps with removing excess energy from the system, driving it towards a desired final state).  That's a very linearized sort of explanation, anyway; I don't know how much nonlinearity affects that (or if such are being used or researched at all, presently), but probably the problem space of nonlinear systems is even more vast than the space of "just doing anything at all" in QC right now.  That's also a very "analog computing" explanation, and, I don't know offhand how procedural, or functional or other conventional paradigms for that matter, QC can really get; these are all very much a matter of developing the very frameworks, to develop the tools, to develop the programs to actually run on the things.
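
The "set the couplings, then let it settle" setup is basically an Ising-model energy minimization; here's a purely classical toy stand-in (simulated annealing on random couplings -- assumed values throughout, and no quantum effects whatsoever):

--- Code: ---
import math
import random

# Classical toy of the "set couplings, then anneal" idea: an Ising model
# E = -sum_ij J_ij * s_i * s_j, minimized by simulated annealing.  A
# quantum annealer does something loosely analogous with real qubits;
# this is purely an illustrative classical stand-in.
random.seed(1)
N = 8
J = {(i, j): random.uniform(-1, 1) for i in range(N) for j in range(i + 1, N)}

def energy(s):
    return -sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())

s = [random.choice((-1, 1)) for _ in range(N)]
T = 2.0
while T > 0.01:
    i = random.randrange(N)
    dE = energy(s[:i] + [-s[i]] + s[i+1:]) - energy(s)
    if dE < 0 or random.random() < math.exp(-dE / T):
        s[i] = -s[i]   # accept the flip
    T *= 0.999         # cool slowly towards the ground state

print("spins:", s, " energy:", round(energy(s), 3))
--- End code ---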

So yeah, very early, very high-concept and hard to do any sort of work with them; in time, applications will get easier, more broadly applicable, and they may even become general, who knows.  That will surely take decades to figure out.  There's no instant revolution, you'll see it coming and before you know it it'll seem natural and great (and terrible, all together at once as technology has always been).


And there's carbon-based computing, which very much seems a likely "end game" of self-sustaining tech, much as life is already (by definition).  I suppose that's the most likely case where [pure crystalline] silicon goes away.  (Silicon, the element, likely remains useful for polymer backbones and structures -- silicones, and glass and ceramics, are likely to stick around a long while.  Heck, even plankton, and some (many??) grasses, themselves use silica structurally!)  But that could take centuries (with shades of transhumanism in there if you like).


With respect to germanium (and other early semiconductors: copper oxide, selenium, etc.): it's easier to process but performs more poorly.  Carrier mobility is actually quite good (Ge is a bit better than Si even, especially for holes), but the maximum temperature limit is quite low (corresponding to the bandgap, which is lower than Si's).  Actually that's part of the deal: lower bandgap <--> lower temp limit <--> higher n_i (intrinsic carrier concentration) <--> higher doping required for well-defined p/n domains <--> more tolerant of impurities (n_i dominates over n_impurity).  So it's no accident it came first (sort of).  It takes really pure Si to do the job, which came along later.
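
To put rough numbers on that bandgap <--> n_i link, a quick sketch using common textbook constants at 300 K (illustrative values, not measured data):

--- Code: ---
import math

# Intrinsic carrier concentration n_i = sqrt(Nc*Nv) * exp(-Eg / (2*k*T)),
# with common textbook values at 300 K: the lower bandgap of Ge buys it
# orders of magnitude more intrinsic carriers than Si.
k = 8.617e-5  # Boltzmann constant [eV/K]
T = 300.0

materials = {
    #        Eg [eV]  Nc [cm^-3]  Nv [cm^-3]
    "Ge":   (0.66,    1.0e19,     5.0e18),
    "Si":   (1.12,    2.8e19,     1.0e19),
    "GaAs": (1.42,    4.7e17,     7.0e18),
}
for name, (eg, nc, nv) in materials.items():
    ni = math.sqrt(nc * nv) * math.exp(-eg / (2 * k * T))
    print(f"{name}: n_i ~ {ni:.1e} cm^-3")
--- End code ---

That gives roughly 2e13 cm^-3 for Ge versus 1e10 cm^-3 for Si, which is why modest doping (and modest impurity control) was enough to get Ge devices working first.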

Likewise, high-bandgap materials are harder to process, and while they can offer higher on/off ratios, gm, current density, etc. (especially because of compound-gradient effects like the 2DEG, the key ingredient of a HEMT), they're also not likely to displace Si for computing, because of the higher switching voltage (or something like that? I forget exactly how it works out, despite gm being higher) and quite poor hole mobility (CMOS is basically exclusive to Si and Ge, and a few others that aren't nearly as easy to use).  Hence why, for example, that one Cray (the Cray-3?) was GaAs NMOS -- good performance, sure, but resistor pullups cost stupendous power consumption!

Or consider SiC: with a higher bandgap and poorer mobility than Si, it seems an inconvenient choice; its high thermal conductivity and exceptional breakdown voltage (critical field strength) win out, however.  The chip inside, say, a 1200V 30A 200W MOSFET is absolutely dinky!  Bulk resistance still dominates Rds(on) (less substantial in MOSFETs probably, but extremely noticeable in SiC Schottky diodes).  Hence the need for die thinning, backside grinding, that stuff.  On top of the hardness and high-temperature processing, the sheer number of polymorphs it can crystallize into, and its propensity for pernicious defects like screw dislocations, are reasons for its late development.  The other thing it's got going for it is high-temperature operation (again, mainly thanks to the higher bandgap); not that commercial parts benefit from it, but specialized applications (downhole sensors, mil/aerospace) can.
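
For a sense of why the SiC die can be so dinky, a sketch of the ideal drift-region resistance scaling (the Baliga figure of merit; the material constants are rough textbook values, purely illustrative):

--- Code: ---
# Ideal drift-region specific on-resistance vs. breakdown voltage,
# R_on,sp = 4*BV^2 / (eps * mu_n * Ec^3).  Rough textbook constants.
EPS0 = 8.85e-14  # vacuum permittivity [F/cm]

materials = {
    #          eps_r  mu_n [cm^2/Vs]  Ec [V/cm]
    "Si":     (11.7,  1350,           3.0e5),
    "4H-SiC": (9.7,   900,            2.2e6),
}
BV = 1200.0  # breakdown voltage [V]
for name, (eps_r, mu, ec) in materials.items():
    r_sp = 4 * BV**2 / (eps_r * EPS0 * mu * ec**3)
    print(f"{name}: R_on,sp ~ {r_sp*1e3:.3f} mohm*cm^2")
--- End code ---

The cubic dependence on critical field is the whole story: SiC's ~7x higher Ec buys a couple hundred times lower specific on-resistance at the same voltage rating, hence the tiny die.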

I forget why GaAs, GaN, etc. aren't suitable for high temperatures, or not normally anyway.  Other than packaging of course -- also the main reason AFAIK why commercial SiC parts are limited to 150/175C max, the damn epoxy!

Anyway, wide bandgap probably won't lead computing, is my understanding -- aside from possible exceptions like computers using quantum dots made of whatever materials; but they'll continue to serve high frequency and high power applications ably.

Tim
