Author Topic: MCU with FPGA vs. SoC FPGA  (Read 24007 times)


Online coppice

  • Super Contributor
  • ***
  • Posts: 8652
  • Country: gb
Re: MCU with FPGA vs. SoC FPGA
« Reply #100 on: July 14, 2023, 08:04:09 pm »
Bit slice was always, in essence, a research tool.  While you could build systems with it, IC technology was advancing so fast, the bit slice would always be overrun quickly.  I guess there were a few designs that never made it to high volume production, where the bit slice was the right choice.
Er, very much no. Bit slice was the basis for a large part of the minicomputer industry for a number of years. It was how the DSP industry got started, especially with the Am2900 family. It was the basis for many microcoded systems, like graphics machines and video manipulators. It was big before the design of the MC68k family had even started.

 

Offline DiTBho

  • Super Contributor
  • ***
  • Posts: 3915
  • Country: gb
Re: MCU with FPGA vs. SoC FPGA
« Reply #101 on: July 14, 2023, 08:10:15 pm »
CAD system based on a 68000 type processor (don't recall exactly which one)

must have been 68020 or 68030.
The opposite of courage is not cowardice, it is conformity. Even a dead fish can go with the flow
 

Offline gnuarm

  • Super Contributor
  • ***
  • Posts: 2218
  • Country: pr
Re: MCU with FPGA vs. SoC FPGA
« Reply #102 on: July 14, 2023, 08:37:29 pm »
CAD system based on a 68000 type processor (don't recall exactly which one)

must have been 68020 or 68030.

No, that would have been much faster.  It was either a 68000 or 68010.  I can't recall which one, but either the 68010 or the 68020 was quite a leap from the previous version. 
Rick C.  --  Puerto Rico is not a country... It's part of the USA
  - Get 1,000 miles of free Supercharging
  - Tesla referral code - https://ts.la/richard11209
 

Offline gnuarm

  • Super Contributor
  • ***
  • Posts: 2218
  • Country: pr
Re: MCU with FPGA vs. SoC FPGA
« Reply #103 on: July 14, 2023, 08:47:48 pm »
Bit slice was always, in essence, a research tool.  While you could build systems with it, IC technology was advancing so fast, the bit slice would always be overrun quickly.  I guess there were a few designs that never made it to high volume production, where the bit slice was the right choice.
Er, very much no.

I'm not saying it was not used.  I'm saying that LSI overtook it rapidly.  Bit slice has significant limitations, such as slow carry propagation due to going off chip so much.  They had to produce ECL versions to try to maintain reasonable speeds, but those were still outclassed by LSI once the entire ALU was on one chip.


Quote
Bit slice was the basis for a large part of the minicomputer industry for a number of years. It was how the DSP industry got started, especially with the Am2900 family. It was the basis for many microcoded systems, like graphics machines and video manipulators. It was big before the design of the MC68k family had even started.

Yep, bit-slice was used in many systems, for a very short time, virtually all of them low volume, very high priced.  Calling that "big" is a misuse of the word "big". 
Rick C.  --  Puerto Rico is not a country... It's part of the USA
  - Get 1,000 miles of free Supercharging
  - Tesla referral code - https://ts.la/richard11209
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 16621
  • Country: us
  • DavidH
Re: MCU with FPGA vs. SoC FPGA
« Reply #104 on: July 14, 2023, 08:49:06 pm »
No, that would have been much faster.  It was either a 68000 or 68010.  I can't recall which one, but either the 68010 or the 68020 was quite a leap from the previous version.

Apollo?

https://en.wikipedia.org/wiki/Apollo_Computer

The dual 68000 processor configuration was designed to provide automatic page fault switching, with the main processor executing the OS and program instructions, and the "fixer" processor satisfying the page faults. When a page fault was raised, the main CPU was halted in mid (memory) cycle while the fixer CPU would bring the page into memory and then allow the main CPU to continue, unaware of the page fault.[8] Later improvements in the Motorola 68010 processor obviated the need for the dual-processor design.
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19518
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: MCU with FPGA vs. SoC FPGA
« Reply #105 on: July 14, 2023, 08:52:47 pm »
In theory you can have latches in asynchronous logic, but you'll need to set up delay paths to meet setup & hold to avoid metastability. I guess you'd have to analyse the whole circuit as combinatorial logic. There are probably good books on how to tackle such a project.

If you can find a way to avoid metastability without putting limitations on the external inputs, do yourself (and everybody else) a favour and write it up for publication in, say, an IEEE Journal of record :)
There are always limitations when it comes to timing. But the nice thing about asynchronous logic is that it can react to external signals without those signals needing to be synchronised to a clock first. Think of a simple D-latch as an example, or the good old ripple counters like the 7493 (and all its later CMOS incarnations).

Yes, those are advantages.

But all technologies have their disadvantages; with asynchronous logic there are metastability and dynamic hazards.
No. Metastability only occurs when your timing analysis is incorrect and thus the setup & hold times of flipflops are violated. This is equally true for synchronous and asynchronous logic. Metastability is such a fundamental problem for digital designs (*) that working around it is about the first thing they teach you, so that you create robust designs.

* Actually not only for digital designs but for any system, be it software or hardware.


It appears that "asynchronous" is being used with two different meanings: "no clock" and "no defined timing relationship".
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Online coppice

  • Super Contributor
  • ***
  • Posts: 8652
  • Country: gb
Re: MCU with FPGA vs. SoC FPGA
« Reply #106 on: July 14, 2023, 08:53:13 pm »
Yep, bit-slice was used in many systems, for a very short time, virtually all of them low volume, very high priced.  Calling that "big" is a misuse of the word "big". 
So you think DEC + Data General + most other mini-computer makers added up to a small business? Just how big is your threshold for big?
 

Offline gnuarm

  • Super Contributor
  • ***
  • Posts: 2218
  • Country: pr
Re: MCU with FPGA vs. SoC FPGA
« Reply #107 on: July 14, 2023, 08:54:48 pm »
No, that would have been much faster.  It was either a 68000 or 68010.  I can't recall which one, but either the 68010 or the 68020 was quite a leap from the previous version.

Apollo?

I don't  think so, but I can't say for sure. 


Quote
https://en.wikipedia.org/wiki/Apollo_Computer

The dual 68000 processor configuration was designed to provide automatic page fault switching, with the main processor executing the OS and program instructions, and the "fixer" processor satisfying the page faults. When a page fault was raised, the main CPU was halted in mid (memory) cycle while the fixer CPU would bring the page into memory and then allow the main CPU to continue, unaware of the page fault.[8] Later improvements in the Motorola 68010 processor obviated the need for the dual-processor design.

I read the book, "The Soul of a New Machine", which was pretty good.  It mentions how they designed the machine to handle page faults, but forgot to consider a page fault in the page fault handler.  lol  They had to wire in the pages for the page fault handler, so they were never swapped out.

This book also has one of my favorite quotes.  A guy on the team got so tired of counting nanoseconds that he quit and bought a farm, saying, "I don't want to deal with any time frames shorter than a season."  LOL  I know the feeling.
Rick C.  --  Puerto Rico is not a country... It's part of the USA
  - Get 1,000 miles of free Supercharging
  - Tesla referral code - https://ts.la/richard11209
 

Offline gnuarm

  • Super Contributor
  • ***
  • Posts: 2218
  • Country: pr
Re: MCU with FPGA vs. SoC FPGA
« Reply #108 on: July 14, 2023, 08:56:34 pm »
Yep, bit-slice was used in many systems, for a very short time, virtually all of them low volume, very high priced.  Calling that "big" is a misuse of the word "big". 
So you think DEC + Data General + most other mini-computer makers added up to a small business? Just how big is your threshold for big?

How long did they use any of the bit-slice designs? 

I don't want to argue with you.  Believe what you wish. 
Rick C.  --  Puerto Rico is not a country... It's part of the USA
  - Get 1,000 miles of free Supercharging
  - Tesla referral code - https://ts.la/richard11209
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19518
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: MCU with FPGA vs. SoC FPGA
« Reply #109 on: July 14, 2023, 09:01:40 pm »
Bit slice was always, in essence, a research tool.  While you could build systems with it, IC technology was advancing so fast, the bit slice would always be overrun quickly.  I guess there were a few designs that never made it to high volume production, where the bit slice was the right choice.

I didn't realise these graphics terminals, arcade games (and computers) were research tools:
PDP-11/23, PDP-11/34, and PDP-11/44 floating-point option, DEC VAX 11/730
Tektronix 4052, Pixar Image Computer, Ferranti Argus 700,
Atari's vector graphics arcade machines
and many others
https://en.m.wikipedia.org/wiki/AMD_Am2900#Computers_made_with_Am2900-family_chips
« Last Edit: July 14, 2023, 09:03:43 pm by tggzzz »
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Online coppice

  • Super Contributor
  • ***
  • Posts: 8652
  • Country: gb
Re: MCU with FPGA vs. SoC FPGA
« Reply #110 on: July 14, 2023, 09:02:55 pm »
Yep, bit-slice was used in many systems, for a very short time, virtually all of them low volume, very high priced.  Calling that "big" is a misuse of the word "big". 
So you think DEC + Data General + most other mini-computer makers added up to a small business? Just how big is your threshold for big?

How long did they use any of the bit-slice designs? 

I don't want to argue with you.  Believe what you wish.
Early 70s to early 80s. If you look at the number of generations that things like TI's bit-slice chips went through, they were obviously selling, with customers demanding better follow-ons.
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26907
  • Country: nl
    • NCT Developments
Re: MCU with FPGA vs. SoC FPGA
« Reply #111 on: July 14, 2023, 09:05:08 pm »
In theory you can have latches in asynchronous logic, but you'll need to set up delay paths to meet setup & hold to avoid metastability. I guess you'd have to analyse the whole circuit as combinatorial logic. There are probably good books on how to tackle such a project.

If you can find a way to avoid metastability without putting limitations on the external inputs, do yourself (and everybody else) a favour and write it up for publication in, say, an IEEE Journal of record :)
There are always limitations when it comes to timing. But the nice thing about asynchronous logic is that it can react to external signals without those signals needing to be synchronised to a clock first. Think of a simple D-latch as an example, or the good old ripple counters like the 7493 (and all its later CMOS incarnations).

Yes, those are advantages.

But all technologies have their disadvantages; with asynchronous logic there are metastability and dynamic hazards.
No. Metastability only occurs when your timing analysis is incorrect and thus the setup & hold times of flipflops are violated. This is equally true for synchronous and asynchronous logic. Metastability is such a fundamental problem for digital designs (*) that working around it is about the first thing they teach you, so that you create robust designs.

* Actually not only for digital designs but for any system, be it software or hardware.


It appears that "asynchronous" is being used with two different meanings: "no clock" and "no defined timing relationship".
As a real-world circuit can't work without a defined timing relationship, it means that I, as someone dealing mostly with the practical side of things, am discussing circuits with no clock.  ;D
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Online coppice

  • Super Contributor
  • ***
  • Posts: 8652
  • Country: gb
Re: MCU with FPGA vs. SoC FPGA
« Reply #112 on: July 14, 2023, 09:08:53 pm »
Bit slice was always, in essence, a research tool.  While you could build systems with it, IC technology was advancing so fast, the bit slice would always be overrun quickly.  I guess there were a few designs that never made it to high volume production, where the bit slice was the right choice.

I didn't realise these graphics terminals, arcade games (and computers) were research tools:
PDP-11/23, PDP-11/34, and PDP-11/44 floating-point option, DEC VAX 11/730
Tektronix 4052, Pixar Image Computer, Ferranti Argus 700,
Atari's vector graphics arcade machines
and many others
https://en.m.wikipedia.org/wiki/AMD_Am2900#Computers_made_with_Am2900-family_chips
The 11/730 was the biggest selling mini-computer of all time, and it was awful. I used to use them. I was shocked when I found how well they sold.
 

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14488
  • Country: fr
Re: MCU with FPGA vs. SoC FPGA
« Reply #113 on: July 14, 2023, 09:16:42 pm »
Yep, bit-slice was used in many systems, for a very short time, virtually all of them low volume, very high priced.  Calling that "big" is a misuse of the word "big". 
So you think DEC + Data General + most other mini-computer makers added up to a small business? Just how big is your threshold for big?

How long did they use any of the bit-slice designs? 

They were used, a lot.
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19518
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: MCU with FPGA vs. SoC FPGA
« Reply #114 on: July 14, 2023, 11:06:15 pm »
In theory you can have latches in asynchronous logic, but you'll need to set up delay paths to meet setup & hold to avoid metastability. I guess you'd have to analyse the whole circuit as combinatorial logic. There are probably good books on how to tackle such a project.

If you can find a way to avoid metastability without putting limitations on the external inputs, do yourself (and everybody else) a favour and write it up for publication in, say, an IEEE Journal of record :)
There are always limitations when it comes to timing. But the nice thing about asynchronous logic is that it can react to external signals without those signals needing to be synchronised to a clock first. Think of a simple D-latch as an example, or the good old ripple counters like the 7493 (and all its later CMOS incarnations).

Yes, those are advantages.

But all technologies have their disadvantages; with asynchronous logic there are metastability and dynamic hazards.
No. Metastability only occurs when your timing analysis is incorrect and thus the setup & hold times of flipflops are violated. This is equally true for synchronous and asynchronous logic. Metastability is such a fundamental problem for digital designs (*) that working around it is about the first thing they teach you, so that you create robust designs.

* Actually not only for digital designs but for any system, be it software or hardware.


It appears that "asynchronous" is being used with two different meanings: "no clock" and "no defined timing relationship".
As a real-world circuit can't work without a defined timing relationship, it means that I, as someone dealing mostly with the practical side of things, am discussing circuits with no clock.  ;D

A synchroniser is an asynchronous circuit with two inputs and no defined timing relationship between those two inputs.
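
For anyone wanting to put numbers on that trade-off, the standard textbook expression for synchroniser reliability is the MTBF formula below (this is generic textbook material, not something quoted by anyone in this thread, and the symbols are the usual ones rather than values from any specific device):

\mathrm{MTBF} = \frac{e^{\,t_r/\tau}}{T_W \cdot f_{clk} \cdot f_{data}}

where t_r is the settling time allowed after the first flip-flop, \tau is the flip-flop's metastability resolution time constant, T_W is its metastability window, f_{clk} is the sampling clock frequency, and f_{data} is the rate of asynchronous input transitions. Adding a second flip-flop stage increases t_r by roughly one clock period, and the exponential is why the common two-flop synchroniser is "good enough" in practice even though metastability can never be eliminated entirely.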
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline chickenHeadKnob

  • Super Contributor
  • ***
  • Posts: 1056
  • Country: ca
Re: MCU with FPGA vs. SoC FPGA
« Reply #115 on: July 14, 2023, 11:08:59 pm »
No, that would have been much faster.  It was either a 68000 or 68010.  I can't recall which one, but either the 68010 or the 68020 was quite a leap from the previous version.

Apollo?

https://en.wikipedia.org/wiki/Apollo_Computer

The dual 68000 processor configuration was designed to provide automatic page fault switching, with the main processor executing the OS and program instructions, and the "fixer" processor satisfying the page faults. When a page fault was raised, the main CPU was halted in mid (memory) cycle while the fixer CPU would bring the page into memory and then allow the main CPU to continue, unaware of the page fault.[8] Later improvements in the Motorola 68010 processor obviated the need for the dual-processor design.

I bet it was Mentor Graphics software running on an Apollo DN660. I used one of those. It had a bit-sliced main CPU that went through a horrible number of design revisions. The board was covered with patch wires and was very unreliable. The repair guy admitted it was a stop-gap design because the 68020 was delayed, so somehow they were using the 2900 to emulate a 68020.
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19518
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: MCU with FPGA vs. SoC FPGA
« Reply #116 on: July 14, 2023, 11:11:25 pm »
Bit slice was always, in essence, a research tool.  While you could build systems with it, IC technology was advancing so fast, the bit slice would always be overrun quickly.  I guess there were a few designs that never made it to high volume production, where the bit slice was the right choice.

I didn't realise these graphics terminals, arcade games (and computers) were research tools:
PDP-11/23, PDP-11/34, and PDP-11/44 floating-point option, DEC VAX 11/730
Tektronix 4052, Pixar Image Computer, Ferranti Argus 700,
Atari's vector graphics arcade machines
and many others
https://en.m.wikipedia.org/wiki/AMD_Am2900#Computers_made_with_Am2900-family_chips
The 11/730 was the biggest selling mini-computer of all time, and it was awful. I used to use them. I was shocked when I found how well they sold.

The alternatives weren't wonderful.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline gnuarm

  • Super Contributor
  • ***
  • Posts: 2218
  • Country: pr
Re: MCU with FPGA vs. SoC FPGA
« Reply #117 on: July 15, 2023, 01:59:31 am »
Bit slice was always, in essence, a research tool.  While you could build systems with it, IC technology was advancing so fast, the bit slice would always be overrun quickly.  I guess there were a few designs that never made it to high volume production, where the bit slice was the right choice.

I didn't realise these graphics terminals, arcade games (and computers) were research tools:
PDP-11/23, PDP-11/34, and PDP-11/44 floating-point option, DEC VAX 11/730
Tektronix 4052, Pixar Image Computer, Ferranti Argus 700,
Atari's vector graphics arcade machines
and many others
https://en.m.wikipedia.org/wiki/AMD_Am2900#Computers_made_with_Am2900-family_chips

Dude, read what I write: "bit slice would always be overrun quickly".  Every one of the examples you cite was either very low volume or quickly overrun by fast CPUs.  Why do you continue to argue about this???
Rick C.  --  Puerto Rico is not a country... It's part of the USA
  - Get 1,000 miles of free Supercharging
  - Tesla referral code - https://ts.la/richard11209
 

Offline gnuarm

  • Super Contributor
  • ***
  • Posts: 2218
  • Country: pr
Re: MCU with FPGA vs. SoC FPGA
« Reply #118 on: July 15, 2023, 02:07:40 am »
Yep, bit-slice was used in many systems, for a very short time, virtually all of them low volume, very high priced.  Calling that "big" is a misuse of the word "big". 
So you think DEC + Data General + most other mini-computer makers added up to a small business? Just how big is your threshold for big?

How long did they use any of the bit-slice designs? 

I don't want to argue with you.  Believe what you wish.
Early 70s to early 80s. If you look at the number of generations that things like TI's bit-slice chips went through, they were obviously selling, with customers demanding better follow-ons.

None of that contradicts what I said.  The volumes of bit-slice products were never large, and any given product with the potential for high volume was redesigned with custom chips, or even general-purpose CPUs as they ramped up in speed.

BTW, the Am2900 family was not released until 1975, so "early 70s" is a bit of a stretch.

Bit-slice was always a niche, able to obtain high performance at the cost of very high power consumption and high price.  It had inherent speed limitations that ruled out continued improvement.  The entire history of electronics has been as much about economics as it has been about technology.  Now, with the huge market for mobile products, it's as much about power.
Rick C.  --  Puerto Rico is not a country... It's part of the USA
  - Get 1,000 miles of free Supercharging
  - Tesla referral code - https://ts.la/richard11209
 

Offline gnuarm

  • Super Contributor
  • ***
  • Posts: 2218
  • Country: pr
Re: MCU with FPGA vs. SoC FPGA
« Reply #119 on: July 15, 2023, 02:09:15 am »
Yep, bit-slice was used in many systems, for a very short time, virtually all of them low volume, very high priced.  Calling that "big" is a misuse of the word "big". 
So you think DEC + Data General + most other mini-computer makers added up to a small business? Just how big is your threshold for big?

How long did they use any of the bit-slice designs? 

They were used, a lot.

Hard to argue with such a clearly defined point.  "A lot"!  LOL
Rick C.  --  Puerto Rico is not a country... It's part of the USA
  - Get 1,000 miles of free Supercharging
  - Tesla referral code - https://ts.la/richard11209
 

Offline gnuarm

  • Super Contributor
  • ***
  • Posts: 2218
  • Country: pr
Re: MCU with FPGA vs. SoC FPGA
« Reply #120 on: July 15, 2023, 02:12:01 am »
No, that would have been much faster.  It was either a 68000 or 68010.  I can't recall which one, but either the 68010 or the 68020 was quite a leap from the previous version.

Apollo?

https://en.wikipedia.org/wiki/Apollo_Computer

The dual 68000 processor configuration was designed to provide automatic page fault switching, with the main processor executing the OS and program instructions, and the "fixer" processor satisfying the page faults. When a page fault was raised, the main CPU was halted in mid (memory) cycle while the fixer CPU would bring the page into memory and then allow the main CPU to continue, unaware of the page fault.[8] Later improvements in the Motorola 68010 processor obviated the need for the dual-processor design.

I bet it was Mentor Graphics software running on an Apollo DN660. I used one of those. It had a bit-sliced main CPU that went through a horrible number of design revisions. The board was covered with patch wires and was very unreliable. The repair guy admitted it was a stop-gap design because the 68020 was delayed, so somehow they were using the 2900 to emulate a 68020.

What I used was definitely not Mentor Graphics.  This was before Mentor was much of a player.  We talked to their rep, who had a "mouse", which he described as a fantastic invention but couldn't get to work!

No, their product was more of a toy at the time, very incomplete.  A lot of the "magic" of Mentor came a bit later, through acquisitions.
Rick C.  --  Puerto Rico is not a country... It's part of the USA
  - Get 1,000 miles of free Supercharging
  - Tesla referral code - https://ts.la/richard11209
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19518
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: MCU with FPGA vs. SoC FPGA
« Reply #121 on: July 15, 2023, 06:38:33 am »
Bit slice was always, in essence, a research tool.  While you could build systems with it, IC technology was advancing so fast, the bit slice would always be overrun quickly.  I guess there were a few designs that never made it to high volume production, where the bit slice was the right choice.

I didn't realise these graphics terminals, arcade games (and computers) were research tools:
PDP-11/23, PDP-11/34, and PDP-11/44 floating-point option, DEC VAX 11/730
Tektronix 4052, Pixar Image Computer, Ferranti Argus 700,
Atari's vector graphics arcade machines
and many others
https://en.m.wikipedia.org/wiki/AMD_Am2900#Computers_made_with_Am2900-family_chips

Dude, read what I write: "bit slice would always be overrun quickly".  Every one of the examples you cite was either very low volume or quickly overrun by fast CPUs.  Why do you continue to argue about this???

I, and other people, did read what you wrote, and have pointed out the inaccuracies.

The DEC machines were the highest volume (mini)computers of the time. Not research tools.

Arcade machines were widely distributed production machines. Not research tools.

Everything was "quickly overrun" in the 70s and 80s, from mainframes downwards. In the ARM/x86 era, it may be difficult for youngsters to imagine, but that was a "pre-Cambian" evolutionary era.

And I'm not a "dude".
« Last Edit: July 15, 2023, 06:43:05 am by tggzzz »
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline gnuarm

  • Super Contributor
  • ***
  • Posts: 2218
  • Country: pr
Re: MCU with FPGA vs. SoC FPGA
« Reply #122 on: July 15, 2023, 07:59:30 am »
Bit slice was always, in essence, a research tool.  While you could build systems with it, IC technology was advancing so fast, the bit slice would always be overrun quickly.  I guess there were a few designs that never made it to high volume production, where the bit slice was the right choice.

I didn't realise these graphics terminals, arcade games (and computers) were research tools:
PDP-11/23, PDP-11/34, and PDP-11/44 floating-point option, DEC VAX 11/730
Tektronix 4052, Pixar Image Computer, Ferranti Argus 700,
Atari's vector graphics arcade machines
and many others
https://en.m.wikipedia.org/wiki/AMD_Am2900#Computers_made_with_Am2900-family_chips

Dude, read what I write: "bit slice would always be overrun quickly".  Every one of the examples you cite was either very low volume or quickly overrun by fast CPUs.  Why do you continue to argue about this???

I, and other people, did read what you wrote, and have pointed out the inaccuracies.

The DEC machines were the highest volume (mini)computers of the time. Not research tools.

How long were the bit-slice models sold?  Like I said, DEC had the LSI-11 in 1975, using custom LSI chips.  It was not designed to be especially fast, but it was inexpensive (relatively speaking).  I did a bit of reading and found that the VAX-11/730 and VAX-11/725 (same CPU) were the only bit-slice renditions of the VAX line.  DEC was using custom LSI for the VAX line, and used bit-slice to build less expensive and much slower versions of it.  So, in reality, these were not "high performance" machines using bit-slice, but low-end economy models!  LOL


Quote
Arcade machines were widely distributed production machines. Not research tools.

Sorry, arcade machines were not widely distributed in any real sense.  They sold in a tiny fraction of the volumes of high-volume devices like PCs.  If you consider arcade machines to be "high volume", then you are right: bit-slice was wildly successful.  But again, how long were the bit-slice models sold before being replaced with much cheaper CPU-based designs?


Quote
Everything was "quickly overrun" in the 70s and 80s, from mainframes downwards. In the ARM/x86 era, it may be difficult for youngsters to imagine, but that was a "pre-Cambian" evolutionary era.

And I'm not a "dude".

Ok, dudette...

Yes, every individual design is quickly obsoleted, but that's not the same as obsoleting a technology, or design approach.  Once LSI started producing CPUs with significant processing power, they outpaced what could be done in bit-slice and that technology was effectively buried.  Do you actually read what I write?  Read the next paragraph carefully. 

Bit-slice was made up of TTL logic slice chips.  That was faster than CMOS in 1975, when the devices were introduced.  But any time a signal goes off chip, it is much slower than a signal remaining on the chip.  The carry chain of bit-slice had to propagate through multiple chips, giving a fundamental limit to the speed.  By 1980 or so CMOS was faster, because the entire CPU could be put on a single die.  Why didn't they use CMOS for bit-slice?  Because it was effectively dead at that point.  They tried ECL, which gave them more speed, but at a huge cost in power.  So it was used in specialized products with big price tags and lots of power.
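
To make the carry-chain point concrete, a rough back-of-the-envelope figure (the delay numbers below are purely illustrative assumptions, not datasheet values for any particular slice):

t_{\text{ripple}} \approx N_{\text{slices}} \times (t_{\text{carry,chip}} + t_{\text{board}}) = 4 \times (15\,\mathrm{ns} + 5\,\mathrm{ns}) = 80\,\mathrm{ns}

So a 16-bit ALU built from four 4-bit slices spends on the order of 80 ns just rippling the carry across packages, before register setup time is even counted. Look-ahead carry parts (the Am2902 in the 2900 family) existed precisely to claw some of that back, but a single-die ALU avoids the package crossings altogether.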

This is essentially the same thing that happened with the array processing business.  They were cabinet sized machines that cost $200,000 and up, performing 100 MFLOPS (in the case of the machines I worked on).  Within 10 years, this technology was available in a chip from Intel.  Maybe not 100 MFLOPS, but a significant number.  Slave a few together and you now have a $10,000 machine with more performance.  The company is now out of business. 
« Last Edit: July 15, 2023, 08:22:06 am by gnuarm »
Rick C.  --  Puerto Rico is not a country... It's part of the USA
  - Get 1,000 miles of free Supercharging
  - Tesla referral code - https://ts.la/richard11209
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19518
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: MCU with FPGA vs. SoC FPGA
« Reply #123 on: July 15, 2023, 09:45:09 am »
Bit slice was always, in essence, a research tool.  While you could build systems with it, IC technology was advancing so fast, the bit slice would always be overrun quickly.  I guess there were a few designs that never made it to high volume production, where the bit slice was the right choice.

I didn't realise these graphics terminals, arcade games (and computers) were research tools:
PDP-11/23, PDP-11/34, and PDP-11/44 floating-point option, DEC VAX 11/730
Tektronix 4052, Pixar Image Computer, Ferranti Argus 700,
Atari's vector graphics arcade machines
and many others
https://en.m.wikipedia.org/wiki/AMD_Am2900#Computers_made_with_Am2900-family_chips

Dude, read what I write: "bit slice would always be overrun quickly".  Every one of the examples you cite was either very low volume or quickly overrun by fast CPUs.  Why do you continue to argue about this???

I, and other people, did read what you wrote, and have pointed out the inaccuracies.

The DEC machines were the highest volume (mini)computers of the time. Not research tools.

How long were the bit-slice models sold?  Like I said, DEC had the LSI-11 in 1975, using custom LSI chips.  It was not designed to be especially fast, but it was inexpensive (relatively speaking).  I did a bit of reading and found that the VAX-11/730 and VAX-11/725 (same CPU) were the only bit-slice renditions of the VAX line.  DEC was using custom LSI for the VAX line, and used bit-slice to build less expensive and much slower versions of it.  So, in reality, these were not "high performance" machines using bit-slice, but low-end economy models!  LOL


Quote
Arcade machines were widely distributed production machines. Not research tools.

Sorry, arcade machines were not widely distributed in any real sense.  They sold in a tiny fraction of the volumes of high-volume devices like PCs.  If you consider arcade machines to be "high volume", then you are right: bit-slice was wildly successful.  But again, how long were the bit-slice models sold before being replaced with much cheaper CPU-based designs?


Quote
Everything was "quickly overrun" in the 70s and 80s, from mainframes downwards. In the ARM/x86 era, it may be difficult for youngsters to imagine, but that was a "pre-Cambian" evolutionary era.

And I'm not a "dude".

Ok, dudette...

Yes, every individual design is quickly obsoleted, but that's not the same as obsoleting a technology, or design approach.  Once LSI started producing CPUs with significant processing power, they outpaced what could be done in bit-slice and that technology was effectively buried.  Do you actually read what I write?  Read the next paragraph carefully. 

Bit-slice was made up of TTL logic slice chips.  That was faster than CMOS in 1975, when the devices were introduced.  But any time a signal goes off chip, it is much slower than a signal remaining on the chip.  The carry chain of bit-slice had to propagate through multiple chips, giving a fundamental limit to the speed.  By 1980 or so CMOS was faster, because the entire CPU could be put on a single die.  Why didn't they use CMOS for bit-slice?  Because it was effectively dead at that point.  They tried ECL, which gave them more speed, but at a huge cost in power.  So it was used in specialized products with big price tags and lots of power.

This is essentially the same thing that happened with the array processing business.  They were cabinet sized machines that cost $200,000 and up, performing 100 MFLOPS (in the case of the machines I worked on).  Within 10 years, this technology was available in a chip from Intel.  Maybe not 100 MFLOPS, but a significant number.  Slave a few together and you now have a $10,000 machine with more performance.  The company is now out of business.

Some of those points have some validity, but mostly they miss the context of the times.

They completely fail to support your false contention that "Bit slice was always, in essence, a research tool".
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Online coppice

  • Super Contributor
  • ***
  • Posts: 8652
  • Country: gb
Re: MCU with FPGA vs. SoC FPGA
« Reply #124 on: July 15, 2023, 03:49:32 pm »
Bit slice was always, in essence, a research tool.  While you could build systems with it, IC technology was advancing so fast, the bit slice would always be overrun quickly.  I guess there were a few designs that never made it to high volume production, where the bit slice was the right choice.

I didn't realise these graphics terminals, arcade games (and computers) were research tools:
PDP-11/23, PDP-11/34, and PDP-11/44 floating-point option, DEC VAX 11/730
Tektronix 4052, Pixar Image Computer, Ferranti Argus 700,
Atari's vector graphics arcade machines
and many others
https://en.m.wikipedia.org/wiki/AMD_Am2900#Computers_made_with_Am2900-family_chips
The 11/730 was the biggest selling mini-computer of all time, and it was awful. I used to use them. I was shocked when I found how well they sold.

The alternatives weren't wonderful.
A modestly configured 11/730 was 100k pounds. There were a lot of things you could buy for that much which performed so much better. We used them because of software and hardware issues that locked us into "needing a VAX". VMS was the problem. It was a dog on most VAX machines because of its weird file system, which required a super complex disc controller to recover some of the performance its design lost you.

 

