Electronics > Projects, Designs, and Technical Stuff

What's 50 fF?


MK14:

--- Quote from: mawyatt on January 29, 2022, 12:47:37 am ---The speed limitation with today's small-feature CMOS processors is not the device speed but thermal limitations; it's been that way for some time now.

Best,

--- End quote ---

Assuming you were referring to my post(s): the upper frequencies are no higher than the fastest frequencies already in the CPUs I was talking about. But the relatively large number of logic delays needed for the big main processor to calculate powerful things, such as floating point, is not needed in a tiny CPU or FSM with a tiny, very simple instruction set; it (potentially) needs only a few logic delays per clock cycle. So it can run at far higher frequencies, while still keeping the individual transistors at the same speed as the main processor(s).
I.e. if each transistor stage (logic gate) takes 100 ps, and there are 20 logic gates' worth of delay in the main CPU (per pipeline stage), the cycle time is 20 x 100 ps = 2 ns, so it can run at 500 MHz. Ignoring safety margins and other real-life practicalities.
But if a tiny CPU or FSM only needs 5 logic gates' worth of delay, with the same 100 ps per transistor/gate stage, then the cycle time is 5 x 100 ps = 500 ps, so it can run at 2 GHz.
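That back-of-envelope arithmetic can be sketched as a tiny calculation (the 100 ps gate delay and the stage counts are the illustrative figures from this post, not real process data):

```python
# Back-of-envelope: max clock frequency from gate delays per pipeline stage.
# The 100 ps per-gate delay is the illustrative figure from the post above.

GATE_DELAY_PS = 100.0  # assumed delay per logic-gate stage, in picoseconds

def max_clock_hz(gates_per_stage: int, gate_delay_ps: float = GATE_DELAY_PS) -> float:
    """Clock period = gates in the critical path * per-gate delay."""
    period_s = gates_per_stage * gate_delay_ps * 1e-12
    return 1.0 / period_s

big_cpu = max_clock_hz(20)  # 20 gate delays -> 2 ns period -> 500 MHz
tiny_fsm = max_clock_hz(5)  # 5 gate delays -> 500 ps period -> 2 GHz
print(f"{big_cpu / 1e6:.0f} MHz vs {tiny_fsm / 1e9:.0f} GHz")
```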
If I remember correctly, there is some internal trickery that allows the 2 GHz section to effectively run at 2 GHz, despite the distributed clock actually running at a lower physical frequency (1 GHz?). It is some technique (I've forgotten which), like double-edge clock triggering or very advanced clock-phase distribution: the clock is kept at 500 MHz and distributed to where it needs to be, then double-edge clocking and/or a local PLL and/or phased-clock techniques (whatever it's called) let it effectively perform 2 GHz worth of operations, without consuming the huge amount of power it would if the full-rate, single-phase clock had to be distributed everywhere.
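The double-edge part of that recollection can likewise be sketched numerically (a hypothetical illustration only; real double-edge and multi-phase clocking schemes involve far more than multiplying two numbers):

```python
# Effective operation rate when both clock edges launch work (DDR-style).
# Hypothetical illustration: a slower distributed clock can still yield a
# higher effective operation rate if both edges (or multiple phases) are used.

def effective_rate_hz(physical_clock_hz: float, edges_per_cycle: int = 2) -> float:
    """Each rising AND falling edge triggers an operation."""
    return physical_clock_hz * edges_per_cycle

# A 1 GHz physical clock, used on both edges, gives 2e9 operations/s.
print(effective_rate_hz(1e9))
```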

MK14:

--- Quote from: mawyatt on January 28, 2022, 11:58:49 pm ---Funny you mention that!! I argued that very Solid State Physics fact in the 80s, about hole/electron mobility, against a GaAs CMOS process that Honeywell was developing. They wanted a process that would produce a Rad-Hard Space Computer that was also very fast, and even got a bunch of USG funding!! I lost this argument on a number of occasions, and basically was told to "go away, you don't know anything about GaAs!!"....
...
....
so after dumping ~$1B into the GaAs CMOS fab process & designs it was abandoned, and nobody remembered those earlier discussions about hole mobility in GaAs |O

--- End quote ---

That's sad. A pity they didn't listen to you.

I've encountered that sort of situation myself. What can be even more annoying is that nobody remembers the original discussions, except the person who pointed out the serious problems long before things went horribly wrong.

In all fairness to everyone, it is difficult to accurately predict the future. Sometimes things seem virtually impossible to get right. Nevertheless, it does eventually get sorted out and invented.
For a long time, most people thought computers would never beat humans at Chess, or at least not for another 50 or 100 years. But the huge pace of ever-faster CPUs ended that argument.
Computer speech recognition was also considered a hurdle that would take a very long time to master. Yet we now have Amazon's Alexa and other speech-recognition devices.

I suppose the current argument would be when/if we will have self-driving cars, in a big way, on most road systems around the world. Will it be now, 5, 10, 25, 50, or even 100 years into the future?

mawyatt:

--- Quote ---author=MK14

Yes. The Intel 4004 is on record as the first (1971) commercially available single-chip microprocessor. But not only was it possible to do this before then, they actually did (for a US fighter jet; it even had hardware floating point built in, which is quite amazing for the time). But it was both secret (military) and not commercially available, so it is not usually considered first, even though it was around and in use before the Intel 4004. Also, at a much later point, someone claiming (it probably was them) to be the inventor of that rather advanced military-aircraft CPU was actually complaining on this very forum about their lack of recognition for inventing an earlier microprocessor.

https://www.eevblog.com/forum/chat/first-microprocessor-mp-944-and-the-f-14-cadc/msg1600678/#msg1600678
Invented by: Ray Holt  https://en.wikipedia.org/wiki/Ray_Holt


--- End quote ---

Very interesting, didn't know that history. Thanks!!

Best,

Edit: Back in those days the military and various USG agencies led the advanced silicon developments, but after the advent of the PC the tide began to shift towards commercial and mostly CMOS. After the arrival of the cell phone, the acceleration rate in commercial CMOS completely outpaced any government investment, and it quickly became the flagship technology we see today.

MK14:

--- Quote from: mawyatt on January 29, 2022, 02:12:31 am ---Edit: Back in those days the military and various USG agencies led the advanced silicon developments, but after the advent of the PC the tide began to shift towards commercial and mostly CMOS. After the arrival of the cell phone, the acceleration rate in commercial CMOS completely outpaced any government investment, and it quickly became the flagship technology we see today.

--- End quote ---

The big, almost certainly confirmed rumor (expanding on what you just said) is that around the late 1950s and early 1960s, the US was extremely worried about the Cold War, especially if it went nuclear. The electronics of the time would have meant (for precise navigation and location calculations and such) strapping a massive, multi-cabinet mainframe computer on top of a ballistic missile. Maybe not impossible as a payload, but highly impractical: it would have been difficult to power, and there would have been many other problems, like surviving the G-forces at times during the flight.

So the US Government poured a ridiculous (normally impractical) amount of money into what ended up being the TTL 7400 integrated circuits, which could be turned into (relatively) much more compact units (computers) for fitting to ballistic missiles. Hence the TTL 7400 helped launch modern computer technology.
There were also partly competing (or earlier) logic IC types, such as Resistor-Transistor Logic (RTL) and Diode-Transistor Logic (DTL). But I recently read that Fairchild had developed the best logic gates of the time (effectively the first good TTL or similar technology).

So Texas Instruments (TI) realized how good they were and tried VERY hard to buy a licence from Fairchild to make them. But Fairchild refused to make a deal (they later acknowledged it was a big mistake on their part), so TI made their own versions of the chips anyway (I'm not sure of the right terminology for that time, but it was something like 'clone', or 'copied, but changed enough to NOT go to court'). Hence the (US/TI) TTL 7400 logic devices were born, and the military-spec'd/cased versions were used in the applicable military projects.

The 74181 (ALU) could, with suitable surrounding TTL chips, be made into a reasonably compact and powerful computer, suitable for going somewhere inside a ballistic missile, and with not-too-unreasonable power requirements. Perhaps suitcase-sized, depending on its specifications and the 74-series packages used.
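As a rough illustration of that building-block role, here is a minimal Python sketch of cascading 4-bit ALU slices into a wider adder, in the spirit of chaining 74181s. It models only simple ripple-carry addition, not the real chip's 32 selectable functions or its carry-lookahead outputs:

```python
# Simplified model of cascading 4-bit ALU slices into a 16-bit adder,
# in the spirit of chaining 74181s. Only the "A plus B" arithmetic
# function is modelled here; the real 74181 offers 16 arithmetic and
# 16 logic functions, plus carry-lookahead signals (P, G).

def alu_slice_add(a4: int, b4: int, carry_in: int) -> tuple[int, int]:
    """One 4-bit slice: returns (4-bit sum, carry out)."""
    total = (a4 & 0xF) + (b4 & 0xF) + carry_in
    return total & 0xF, total >> 4

def add16(a: int, b: int) -> int:
    """Four cascaded 4-bit slices form a 16-bit ripple-carry adder."""
    result, carry = 0, 0
    for nibble in range(4):
        s, carry = alu_slice_add((a >> 4 * nibble) & 0xF,
                                 (b >> 4 * nibble) & 0xF, carry)
        result |= s << 4 * nibble
    return result & 0xFFFF

print(hex(add16(0x1234, 0x0FFF)))  # 0x2233
```

Each slice handles 4 bits and passes its carry to the next, exactly the way the carry-out of one 74181 can feed the carry-in of the next when building a wider datapath.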

Off-hand, I'm not sure of the various internet sources I read somewhat recently about it. But I just had a quick look and found this:
https://computerhistory.org/blog/the-rise-of-ttl-how-fairchild-won-a-battle-but-lost-the-war/

EDIT: I mentioned UK (Fairchild), but the source doesn't (it says US), so I removed references to the UK from my post.
