Author Topic: if any computer was sent back to 1960, how would it affect computer evolution?  (Read 9784 times)


Offline jmelson

  • Super Contributor
  • ***
  • Posts: 2758
  • Country: us
Quote
There is also the distinct possibility that by sending back "advanced" technology that you will actually slow down technological development.
Yes, for a number of years in the '70s and '80s, it was said, fairly seriously, that the IBM 360 slowed down computer advancement in the Soviet Union by at least a decade.  Instead of thinking ahead, they blindly copied a design that was already a little dated when introduced in 1965.  They did develop their own circuit technology, but they just STOLE the OS, compilers, etc.

Jon
 
The following users thanked this post: Richard Crowley

Offline Circlotron

  • Super Contributor
  • ***
  • Posts: 3168
  • Country: au
Instead of sending back the latest and greatest computer that would be impossible for them to reproduce, why not something more within reach, like say a Z80 or even a 68000-based system? Still quite a jump from what they had at the time. They might be way impressed instead of just hopelessly bamboozled. A case of less is more.
 

Offline TerraHertz

  • Super Contributor
  • ***
  • Posts: 3958
  • Country: au
  • Why shouldn't we question everything?
    • It's not really a Blog
Also, do we want to advance computer evolution?

http://www.smbc-comics.com/comic/captcha
Collecting old scopes, logic analyzers, and unfinished projects. http://everist.org
 
The following users thanked this post: BrianHG

Offline Mr. Scram

  • Super Contributor
  • ***
  • Posts: 9810
  • Country: 00
  • Display aficionado
Quote
Also, do we want to advance computer evolution?

http://www.smbc-comics.com/comic/captcha
Why not? Humans will go extinct eventually, and whether our evolutionary branch ends there, we're superseded by our own evolved selves, or we're superseded by our own technology isn't wholly relevant to us. Thinking our species can and should remain relevant is just more typical human hubris. Whatever happens, we've had a pretty good run.
 

Offline RayHolt

  • Newbie
  • Posts: 5
  • Country: us
"1970 The first practical IC microprocessor (Intel 4004) in 1970."

I could not think of a more IMPRACTICAL small computer chip set than the 4-bit 4004. It was so incapable that Intel quickly replaced it with the 8008 and then the 8080. The date is LATE 1971, and it was not really used much until 1974, as engineers did not know how to program it.

On the practical side, there was a 20-bit microprocessor chip set that actually flew a mil-spec fighter jet in 1970, all with the exact same technology as the 4004.
http://FirstMicroprocessor.com
 
The following users thanked this post: Richard Crowley

Offline Berni

  • Super Contributor
  • ***
  • Posts: 4922
  • Country: si
That is quite the impressive 20-bit chip.

You have to keep in mind that the Intel 4004 CPU was originally designed to run a calculator and that's it. Any electronic calculator of the time had to be heavily optimized in terms of part count, so they were never powerful. The humans that use them are also very slow, so they didn't need to calculate particularly fast. The chip design processes of the time were also very primitive: the schematics for the whole CPU were drawn by hand, and the manufacturing masks were laid out onto transparent film by hand using black tape, then optically reduced down to the final mask. The manufacturing process was also not as sophisticated, so yield would suffer if the chip got too big. To top it off, this chip was designed on an incredibly tight deadline, so the thing had to be kept as simple as possible and use as few transistors as possible.

What you get as a result of this "transistor penny pinching" is a very dumb and clumsy CPU. But it was powerful enough to run the calculator it was designed for. Only after it was designed was it realized how useful a CPU on a chip is, and that it's worth putting more effort into making it better.

You can still see the legacy of transistor penny pinching in the good ol' PIC16F series of MCUs. Memory addressing is horrible, as it can only address 128 bytes of memory at a time (including hardware registers), so paging needs to be used constantly; there are barely any CPU registers, so you constantly have to shuffle things to RAM and back; the call stack is in hardware and limited in depth; and the instruction set leaves a lot to be desired. Yet this crappy MCU got everywhere because it was cheap.
 
The following users thanked this post: Richard Crowley

Offline KL27x

  • Super Contributor
  • ***
  • Posts: 4099
  • Country: us
Quote
it can only address 128 bytes of memory at a time (including hardware registers) so paging needs to be constantly used
128 bytes of data memory (including hardware registers, so maybe 96 or so bytes of actual user memory) per bank, not per page.

The pages are for the program memory. So both banking and paging must be constantly used. |O But pages are larger, at least.
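
If you've never fought with one of these, here's a rough C sketch (my own illustration, not anything from Microchip) of how the two address spaces get stitched together on the classic mid-range PIC16F parts, assuming the usual RP1:RP0 bank-select bits in STATUS and the PCLATH<4:3> page-select bits:

Code: [Select]
#include <stdint.h>
#include <stdio.h>

/* Data memory: a 7-bit operand in the instruction plus two bank-select
 * bits (RP1:RP0 in STATUS) gives up to 4 banks of 128 bytes each. */
static uint16_t data_address(uint8_t rp1, uint8_t rp0, uint8_t operand7)
{
    return (uint16_t)((((rp1 << 1) | rp0) << 7) | (operand7 & 0x7F));
}

/* Program memory: CALL/GOTO carry an 11-bit operand; PCLATH<4:3> supply
 * PC<12:11>, so jumps always land inside 2K-word pages. */
static uint16_t call_target(uint8_t pclath, uint16_t operand11)
{
    return (uint16_t)(((pclath & 0x18) << 8) | (operand11 & 0x7FF));
}

int main(void)
{
    /* File address 0x20 with bank 2 selected really means 0x120... */
    printf("bank 2, file 0x20 -> 0x%03X\n", (unsigned)data_address(1, 0, 0x20));
    /* ...and a CALL 0x123 with PCLATH pointing at page 1 lands at 0x923. */
    printf("page 1, CALL 0x123 -> 0x%04X\n", (unsigned)call_target(0x08, 0x123));
    return 0;
}

Forget to set the bank or page bits before an access or a CALL and you silently land in the wrong 128-byte bank or 2K-word page, which is exactly why both have to be juggled constantly.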

Also, you have to remember what the competition was back then, and what it took for a chip company to succeed. There was no internet. There was no Adafruit or Sparkfun teaching people how to buy a chip, wire it up, and make stuff as a hobby. There was no ICSP. Early chips were OTP. Microcontrollers weren't as dirt cheap as they are now. The reason they succeeded may have had something to do with the infrastructure, toolchain, documentation, availability, and marketing, in addition to whatever the price point per device was. And also the peripherals. I believe they were a bit novel in making the PIC available to individual engineers, students, and small businesses, whereas some of the other chips were sorta only available to specific industries or buyers who wanted thousands of chips, minimum.

As for the OP... Steve Jobs would have been a motivational speaker living in a van down by the river?
« Last Edit: January 10, 2019, 07:45:39 am by KL27x »
 

Offline Berni

  • Super Contributor
  • ***
  • Posts: 4922
  • Country: si
Ah yes, sorry, I meant to say banking rather than paging. But the worst thing about the PIC16F family is really the CPU registers. You only get a single 8-bit accumulator register as your general-purpose register to actually do stuff. All the other registers are for status, the PC, and the awful memory addressing.
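
Just to illustrate the shuffling (a toy C model of the data flow, not real PIC code, and the file addresses are made up): even something as trivial as c = a + b has to pass through that single W register as a load, an add and a store, because nothing else can do arithmetic:

Code: [Select]
#include <stdint.h>

static uint8_t W;          /* the single working register (accumulator) */
static uint8_t ram[128];   /* one 128-byte bank of file registers       */

/* Rough equivalents of the three instructions that end up being emitted */
static void movf (uint8_t f) { W = ram[f]; }                 /* MOVF  f,W */
static void addwf(uint8_t f) { W = (uint8_t)(W + ram[f]); }  /* ADDWF f,W */
static void movwf(uint8_t f) { ram[f] = W; }                 /* MOVWF f   */

/* c = a + b, with a at 0x20, b at 0x21, c at 0x22 (all in the same bank) */
void add_example(void)
{
    movf (0x20);   /* W  = a */
    addwf(0x21);   /* W += b */
    movwf(0x22);   /* c  = W */
}

Every intermediate result bounces through W and back out to RAM, which is where all that shuffling to RAM and back comes from.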

And yes, Microchip always tried to make their chips as accessible as possible; cost had something to do with it too.

The importance of getting your chips into the hands of the average Joe did show too. Their biggest competitor, Atmel, was making 8-bit MCUs for the exact same market with an architecture superior to the PIC's, yet Microchip still kept a big following and their chips found their way into so many products. Atmel was still very successful, but you would expect them to sell even better since they made better chips.
 

Offline Richard Crowley

  • Super Contributor
  • ***
  • Posts: 4317
  • Country: us
  • KJ7YLK
Quote
I could not think of a more IMPRACTICAL small computer chip set than the 4-bit 4004.
"Practical" in the sense that it was used in a commercial product and sold for a profit by the manufacturer, i.e. not a laboratory curiosity or a secret government project.

Quote
It was so incapable Intel quickly replaced it with the 8008 and then the 8080.  Date LATE 1971 and not really  used much until 1974 as engineers did not know how to program.
Here in the commercial, for-profit world, the concept of a monolithic processor evolved into the 8008 and then the 8080. The 4004 had a 10-year product life-span. Even today, very small-scale microcontrollers are very practical for many applications. Ben Heck recently posted a video about the ATTINY10 microcontroller in a SOT-23-6 package, and Dave has been doing videos about the 3-cent Padauk PMS150C in SOP-8.


Quote
On a practical side a 20-bit microprocessor chip set that actually flew a mil-spec fighterjet in 1970 and all with the exact technology as the 4004.
http://FirstMicroprocessor.com

A very remarkable accomplishment, and well ahead of what Intel was doing.  Too bad it was hidden away for so long because of government/military secrecy. Also too bad such technology was apparently unknown to NASA. The MIT-developed technology for the Apollo Guidance Computer seems pretty crude by comparison. They even used hand-woven mag core technology to store the firmware.

Your story deserves better coverage.  Have you contacted the computer history people? There are many videos on YouTube covering much more mundane projects than what you are describing.

The 4004 was developed rather from the opposite end of the process. Federico Faggin had to "sell" Intel management on making a general-purpose, programmable solution for Busicom's desk calculator, and then Intel had to buy back the rights from Busicom in order to sell it as a general commercial product. Very interesting to hear your scenario of just pouring more money/resources into the project to meet the schedule.

Mr. Holt's video is very interesting, and it is amazing that it is not better covered by documentary videos, etc.

https://youtu.be/aVEm5SSUULc
« Last Edit: January 10, 2019, 01:25:48 pm by Richard Crowley »
 

