EEVblog Electronics Community Forum

Electronics => Microcontrollers => Topic started by: Trader on July 11, 2022, 04:01:01 am

Title: Zilog & Z80 History
Post by: Trader on July 11, 2022, 04:01:01 am
https://www.youtube.com/watch?v=P1aqtfXUCEk (https://www.youtube.com/watch?v=P1aqtfXUCEk)
Title: Re: Zilog & Z80 History
Post by: DiTBho on July 11, 2022, 08:12:12 am
The Z80 is fun to program, and it's used in some first-year university courses  :D
Title: Re: Zilog & Z80 History
Post by: DiTBho on July 11, 2022, 09:37:03 am
And Soviet (https://www.cpushack.com/2021/01/26/the-story-of-the-soviet-z80-processor/) z80 processors  :o :o :o
Title: Re: Zilog & Z80 History
Post by: RoGeorge on July 11, 2022, 12:02:03 pm
The video says at 3:15 that the i4004 was the first microprocessor, which is not exactly correct; it seems the first microprocessor was Ray Holt's MP944, designed before the i4004, but the MP944 was for the F-14 fighter jet and was therefore kept secret:
https://www.eevblog.com/forum/chat/fun-for-nerds/msg3990257/#msg3990257 (https://www.eevblog.com/forum/chat/fun-for-nerds/msg3990257/#msg3990257)



Back to Z80, we used to have a Romanian Z80 before 1989, produced by Microelectronica:
https://en.wikipedia.org/wiki/MMN80CPU (https://en.wikipedia.org/wiki/MMN80CPU)

During the '80s I designed a Z80-based computer compatible with both the CP/M OS and the ZX Spectrum home computer.
(https://cdn.hackaday.io/images/2572221402262586660.jpg)

(https://cdn.hackaday.io/images/5382331402262596379.jpg)
https://hackaday.io/project/1411-xor-hobby-a-vintage-z80-computer-prototype (https://hackaday.io/project/1411-xor-hobby-a-vintage-z80-computer-prototype)


The Z80 was a great microprocessor for its time, and I think the Z80 core (Z80 code-compatible parts) is still in production in some microcontrollers.
Title: Re: Zilog & Z80 History
Post by: chickenHeadKnob on July 11, 2022, 02:35:48 pm
When the Z80 came out, Intel was not as dependent on microprocessors as they are now. They were more of an all-rounder, with DRAM and EPROMs and other stuff.
Year over year they were still making good profits, so no big panic, certainly not a nightmare. Besides, many players including Intel had more advanced microprocessors in the design pipeline. The general competition was stronger back then.
Title: Re: Zilog & Z80 History
Post by: Sal Ammoniac on July 11, 2022, 08:52:31 pm
I have fond memories of the Z80. The company I worked for in the 80s used the Z80 and derivatives (such as the Z180 and Hitachi 64180) as an embedded processor for their products and I probably wrote more than 100,000 lines of Z80 assembler code.
Title: Re: Zilog & Z80 History
Post by: brucehoult on July 12, 2022, 01:06:21 am
Nice video. And the 6502 one too.

I'm a bit puzzled that the 8080 is not mentioned until the part where old ads are shown comparing the Z80 to the 8080. Did these guys do the 8080 before they left, or did it happen while they were already designing the Z80? It's not at all clear in this video.
Title: Re: Zilog & Z80 History
Post by: SiliconWizard on July 12, 2022, 01:27:24 am
The 8080 was released in early '74 by Intel. The engineer that founded Zilog left Intel in late '74. I'll let you do the maths. ;)
Title: Re: Zilog & Z80 History
Post by: brucehoult on July 12, 2022, 03:15:49 am
The 8080 was released in early '74 by Intel. The engineer that founded Zilog left Intel in late '74. I'll let you do the maths. ;)

"it's not at all clear in this video"
Title: Re: Zilog & Z80 History
Post by: RoGeorge on July 12, 2022, 06:10:47 am
IIRC, a few engineers from the team that designed the Intel 8080 left Intel to form Zilog, and launched the Z80.

- Z80 is backward compatible with the i8080; it can run 8080 binary code without any change.
- Z80 had some hardware improvements, reducing the total number of chips needed to build a computer.
- Z80 has different mnemonic names for its assembly language (for the same binary instruction codes).
- Z80 has more instructions than the i8080.
- Z80 was the precursor of the 8086, 80186, 286, 386, 486, etc.
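To make the mnemonic point concrete, here's a tiny sketch of my own (not from the video): a few opcodes that are bit-for-bit identical on the 8080 and Z80, just spelled differently by each vendor's assembler.

```python
# A few opcodes shared by the 8080 and Z80: same binary encoding,
# different mnemonics. (My own illustrative table, not exhaustive.)
SAME_OPCODE = {
    0x00: ("NOP",     "NOP"),     # (8080 name, Z80 name)
    0x78: ("MOV A,B", "LD A,B"),
    0x3E: ("MVI A,n", "LD A,n"),
    0xC3: ("JMP nn",  "JP nn"),
    0x76: ("HLT",     "HALT"),
}

for opcode, (i8080, z80) in SAME_OPCODE.items():
    print(f"{opcode:02X}: 8080 '{i8080}'  ==  Z80 '{z80}'")
```

Same binaries, two assembly languages, which is why the Z80 could run unmodified 8080 code such as CP/M.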
Title: Re: Zilog & Z80 History
Post by: DiTBho on July 12, 2022, 06:14:35 am
- Z80 was the precursor of 8086, 80186, 286, 386, 486, etc.

wtf?  :o :o :o
Title: Re: Zilog & Z80 History
Post by: brucehoult on July 12, 2022, 06:41:16 am
- Z80 was the precursor of 8086, 80186, 286, 386, 486, etc.

wtf?  :o :o :o

"Four things obvious to anyone who knows anything about it, and one lie"
Title: Re: Zilog & Z80 History
Post by: RoGeorge on July 12, 2022, 07:02:07 am
The 8085/8088 and 8086 were perceived as Intel's response to the Z80.  IIRC the 8088 was still 8 bits and had the same registers as the 8080 plus some more, though I think it was not binary compatible with the 8080 or Z80, yet the assembly source code was directly translatable (one to one) into 8088/8086 assembly, or at least that's how I remember it.  Certainly not a lie.  Lying means intentionally trying to deceive.

If my info is not correct, would you mind telling me how it was, please, or why you think that's a lie?
Title: Re: Zilog & Z80 History
Post by: DiTBho on July 12, 2022, 07:07:51 am
When IBM announced the "Personal Computer" program, Intel x86 was chosen only because the Zilog option was rejected as "financially problematic".

What a pity; with the { Z80 (8-bit), Z8000 (16-bit), Z80000 (32-bit) } family we would have had better computers nowadays.

The Z80000 includes a memory management unit that provides protected memory and virtual memory addressing, with three methods of accessing memory.

Isn't that better than Intel's x86? Isn't that already a "Unix-style segment"?
For over 30 years Intel has promoted nothing but confusion with its lousy definition of the "Intel segment"  |O

- - - -

Intel x86 sucks, but it has evolved over the years. It's now about multiprocessing, and both Intel and AMD have had problems with the informal-prose causal-consistency descriptions in the then-current "x86-CC" documentation, which unfortunately turned out to be unsound, forbidding some behaviour which actual processors exhibit.

I still remember the early revisions of the Intel SDM (2006), which gave an informal-prose model called "processor ordering", unsupported by any examples. Basically it doesn't work when you try to implement it, and it doesn't work because it is hard to give a precise interpretation of the description.

Doesn't this sound familiar for Intel? When they do something, they always do it badly, or make it too complicated, or mess it up.

Indeed, by Intel SDM rev. 29 (2008) they were talking about memory-barrier instructions, now including "Reads cannot pass LFENCE and MFENCE instructions" and "Writes cannot pass SFENCE and MFENCE instructions". Writes were also explicitly ordered ("stores are not reordered with other stores"), so writes by a single processor are observed in the same order by all processors. Unfortunately this is still problematic: it doesn't tell you how to interpret "causality", and it says nothing about how two processors observe their own two stores.

In short, it was still so weak, ugly, complex and potentially catastrophic that you wouldn't have given a fsck for that crap. Again: pretend nothing has happened, touch the kernel as little as possible, and as long as it seems to work, that's okay  |O

Then someone looked at SPARC. Oh, see, their Total Store Ordering (TSO) memory model actually works! Why don't we copy it? And now they are talking about "x86-TSO", a memory model which suffers from neither problem and has been formalized in HOL4.
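For the curious, the store-buffer behaviour that x86-TSO captures can be sketched in a few lines of Python (my own toy model, not Intel's spec or the HOL4 formalization): each hardware thread gets a FIFO store buffer, loads check the local buffer first, and the classic "SB" litmus test can then observe both loads returning 0.

```python
# Toy model of TSO-style store buffering (an illustrative sketch).
# Each thread writes into its own FIFO store buffer; stores drain to
# shared memory later. Loads forward from the thread's own buffer
# first, then fall back to memory.
class TSOThread:
    def __init__(self, memory):
        self.memory = memory   # shared memory (dict: addr -> value)
        self.buffer = []       # FIFO store buffer: (addr, value)

    def store(self, addr, value):
        self.buffer.append((addr, value))   # buffered, not yet globally visible

    def load(self, addr):
        for a, v in reversed(self.buffer):  # newest local store wins
            if a == addr:
                return v
        return self.memory.get(addr, 0)

    def drain(self):
        for a, v in self.buffer:            # flush FIFO to shared memory
            self.memory[a] = v
        self.buffer.clear()

memory = {}
t0, t1 = TSOThread(memory), TSOThread(memory)

# SB litmus test:  t0: x=1; r0=y   ||   t1: y=1; r1=x
t0.store("x", 1)
t1.store("y", 1)
r0 = t0.load("y")   # 0: t1's store still sits in t1's buffer
r1 = t1.load("x")   # 0: t0's store still sits in t0's buffer
t0.drain(); t1.drain()

print(r0, r1)                     # 0 0
print(memory["x"], memory["y"])   # 1 1
```

Under sequential consistency r0 == r1 == 0 is impossible; under TSO it's allowed, which is exactly why MFENCE exists.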

- - - - -

That's intel!

A mediocre company that spat out a crappy, disgusting architecture that has nevertheless sold well. Now it's POWER10/11 and RISC-V time, and since they are still stuck with their x86 crap, I hope the company dies a painful but quick death so they won't ruin the future of computing the way they did in the '90s and 2000s.
Title: Re: Zilog & Z80 History
Post by: DiTBho on July 12, 2022, 07:31:02 am
Olivetti M20 (1982), OS = PCOS (developed by Olivetti), CPU = Zilog Z8001 ---> business failure due to the lack of software (90% of it was for x86 PCs)
Olivetti M24 (1983), OS = Olivetti MS-DOS (developed by Microsoft, rebranded) + GW-BASIC (in ROM), CPU = Intel 8088/86 ---> business success

There was no open source (and no GNU/Linux or BSD) at that time ... programs were binary-only  :o :o :o
Title: Re: Zilog & Z80 History
Post by: David Hess on July 12, 2022, 08:08:32 am
IIRC the 8088 was still 8 bits and had the same registers as the 8080 plus some more, though I think it was not binary compatible with the 8080 or Z80, yet the assembly source code was directly translatable (one to one) into 8088/8086 assembly, or at least that's how I remember it.

The 8088 and 8086 have the same 16-bit ALU and 16-bit register widths, so both are 16-bit despite the 8-bit external bus on the 8088.  Their instruction set had a lot of the same improvements, especially in orthogonality, that the Z80 had over the 8080.
Title: Re: Zilog & Z80 History
Post by: chickenHeadKnob on July 12, 2022, 08:26:50 am
I have fond memories of the Z80. The company I worked for in the 80s used the Z80 and derivatives (such as the Z180 and Hitachi 64180) as an embedded processor for their products and I probably wrote more than 100,000 lines of Z80 assembler code.

The problem for both the 64180 and Z180 is that they came too late. I considered the 64180 for a design at one point. It looked good, but by that time the NEC V25 was available in the same market segment, and that is what I ended up choosing. The V25 had an 8088-compatible core plus some peripherals, and despite my initial doubts it turned out to be very pleasant to program.
Title: Re: Zilog & Z80 History
Post by: DiTBho on July 12, 2022, 09:40:07 am
Thank god, my VT100 terminal has a Z80 chip inside. Some VT100s(1), VT220s and later units have an Intel 8080 or 8086/88 CPU; I feel lucky ;D

The VT100 terminal, released in 1978 by DEC, comes with an Intel 8080; the later VT102 uses an Intel 8085; and the VT180, released in 1982 by DEC, introduced a Z80-based(1) option board which turned the VT100 into a CP/M microcomputer, though the main board is still based on the Intel 8080.

(1) some are Mostek MK3880N (Z80 compatible) based, some are Zilog Z80 based

[attachimg=1]
Title: Re: Zilog & Z80 History
Post by: SiliconWizard on July 12, 2022, 06:02:44 pm
It's interesting to note that Intel's CPUs were not particularly great compared to their competitors. Also, while not everyone will agree here, I consider the most interesting Intel CPUs to be the ones that were commercial failures. Go figure.

Title: Re: Zilog & Z80 History
Post by: Benta on July 12, 2022, 06:35:51 pm
In the late 70s, Intel was teetering on the edge of bankruptcy. What saved their bacon was IBM's choice of the 8088 for the IBM PC.
The tech guys at IBM actually preferred the M68k from Motorola, but as Motorola was a computer manufacturer in those days (VersaDOS, UNIX, VersaModules, big racks, minicomputers etc.), selecting Intel seemed the safer choice: a small guy that could be kept under control and supported as necessary to supply the CPUs.
Boy, was that a miscalculation.
Title: Re: Zilog & Z80 History
Post by: bson on July 12, 2022, 06:37:56 pm
The 8085/8088 and 8086 were perceived as Intel's response to the Z80.  IIRC the 8088 was still 8 bits and had the same registers as the 8080 plus some more, though I think it was not binary compatible with the 8080 or Z80, yet the assembly source code was directly translatable (one to one) into 8088/8086 assembly, or at least that's how I remember it.  Certainly not a lie.  Lying means intentionally trying to deceive.

If my info is not correct, would you mind telling me how it was, please, or why you think that's a lie?
What?  The 8088 was an 8086 with an 8-bit external data bus.  Both are 16-bit processors and both have the same 20-bit segmented memory model and will run exactly the same binaries. They have absolutely nothing in common with the Z80 or 8080.  The 8086 was released in 1978, designed as a general-purpose personal computer microprocessor.

The 8085 is an 8080 with a simplified pinout and some minor tweaks (like improved interrupt handling).
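As a quick sketch of the 20-bit segmented model mentioned above (my own illustration): a 16-bit segment value is shifted left four bits and added to a 16-bit offset, so many different seg:off pairs alias the same physical byte.

```python
# 8086 real-mode address formation: physical = (segment << 4) + offset,
# truncated to the chip's 20 address lines (1 MB address space).
def phys(seg, off):
    return ((seg << 4) + off) & 0xFFFFF   # 20-bit wrap

print(hex(phys(0xF000, 0xFFF0)))                      # 0xffff0 (reset vector)
print(phys(0x1000, 0x0000) == phys(0x0FFF, 0x0010))   # True: aliasing
```

Every 64 KB "segment window" starts a mere 16 bytes after the previous one, which is exactly the scheme people complained about for a decade.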
Title: Re: Zilog & Z80 History
Post by: SiliconWizard on July 12, 2022, 06:51:24 pm
"Nothing in common with the 8080" is a bit of a stretch (give us your definition of that), but otherwise yeah, the 8088 was just an 8086 with a reduced external data bus AFAIK. Like the 68008 with its 8-bit external data bus. That made it cheaper to integrate.

Now, while the 8088 was a commercial success, I don't think the 68008 ever was. That's also an interesting thing to ponder.
I don't even know what kind of product the 68008 was used in, except for the Sinclair QL. Probably a bunch of niche stuff I have never seen.
Title: Re: Zilog & Z80 History
Post by: woofy on July 12, 2022, 07:01:59 pm
It's interesting to note that Intel's CPUs were not particularly great compared to their competitors.
Something not lost on Intel, judging by the effort they have put into trying to move away from x86 over the years:
i432, i960, i860 and, biggest disaster of all, Itanium.
x86/amd64 has such massive momentum it's near unstoppable. At this stage only ARM and RISC-V stand any real chance.
Title: Re: Zilog & Z80 History
Post by: bson on July 12, 2022, 07:02:55 pm
In the late 70s, Intel was teetering on the edge of bankruptcy. What saved their bacon was IBM's choice of the 8088 for the IBM PC.
The tech guys at IBM actually preferred the M68k from Motorola, but as Motorola was a computer manufacturer in those days (VersaDOS, UNIX, VersaModules, big racks, minicomputers etc.), selecting Intel seemed the safer choice: a small guy that could be kept under control and supported as necessary to supply the CPUs.
Boy, was that a miscalculation.
IBM started designing the PC in 1980.  The 68008 didn't come out until 1982, so it didn't exist and wasn't an option.  The 8088 was the only 16-bit processor available with an 8-bit external data bus.  The latter was important for keeping board complexity down and fitting everything on a single mainboard.  The 68000 was also very new when IBM started designing the PC, while the 8086 series had been shipping for a number of years and could be sourced in volume.  I'm sure that was a factor.

The original 68000 wasn't that great either, but its size and need for support chips meant the resulting design would have been priced more along the lines of a Cromemco, which would have placed it completely beyond IBM's intended market.
Title: Re: Zilog & Z80 History
Post by: RoGeorge on July 12, 2022, 07:43:13 pm
The 8085/8088 and 8086 were perceived as Intel's response to the Z80.  IIRC the 8088 was still 8 bits and had the same registers as the 8080 plus some more, though I think it was not binary compatible with the 8080 or Z80, yet the assembly source code was directly translatable (one to one) into 8088/8086 assembly, or at least that's how I remember it.  Certainly not a lie.  Lying means intentionally trying to deceive.

If my info is not correct, would you mind telling me how it was, please, or why you think that's a lie?
What?  The 8088 was an 8086 with an 8-bit external data bus.  Both are 16-bit processors and both have the same 20-bit segmented memory model and will run exactly the same binaries. They have absolutely nothing in common with the Z80 or 8080.

Yes, in the 8088 only the bus was 8 bits, but the registers in the 8086/8088 were a superset of the 8080 registers, and even though the ALU was 16 bits, it could still operate on 8 bits.  Sure, the 8086 had some extra features, including segment registers, instruction prefetch, pipelined instruction decoding and so on.

I still think the 8080 architecture (and thus the Z80, too) was the ancestor of the 8086, and I still think the 8086 was an augmented 8080 rather than a totally new microprocessor.  The 8086 was a follow-up to the 8080.

I won't argue any more about this, and nowadays such details are irrelevant anyway, though for the curious: just look at the registers, and it will be clear that the 8086 is an enhanced 8080.  XLT86™, 8080 to 8086 Assembly Language Translator, USER'S GUIDE, Copyright © 1981, Digital Research, Inc. (http://www.s100computers.com/Software%20Folder/Assembler%20Collection/Digital%20Research%20XLT86%20Manual.pdf)

See the similarities between 8080 and 8086:
(https://qph.cf2.quoracdn.net/main-qimg-f95b51d7947e0d556560c3c6bfbaee6d-c)
Picture from:  http://www.eazynotes.com/pages/microprocessor/notes/block-diagram-of-intel-8086.html (http://www.eazynotes.com/pages/microprocessor/notes/block-diagram-of-intel-8086.html)

Attached snapshot is from Quora page https://www.quora.com/What-is-the-difference-between-8080-and-8086 (https://www.quora.com/What-is-the-difference-between-8080-and-8086)
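For reference, the 8080-to-8086 register correspondence that source translators like XLT86 exploit looks roughly like this (reconstructed from memory; the XLT86 manual linked above has the authoritative table):

```python
# 8080 -> 8086 register correspondence as used by 8080-source translators
# (my reconstruction; check the XLT86 manual for the exact mapping).
REG_MAP = {
    "A":  "AL",  "B":  "CH", "C": "CL",
    "D":  "DH",  "E":  "DL", "H": "BH", "L": "BL",
    "BC": "CX",  "DE": "DX", "HL": "BX",
    "SP": "SP",  "PC": "IP",
}

def translate(i8080_reg):
    """Return the 8086 register corresponding to an 8080 register name."""
    return REG_MAP[i8080_reg]

# e.g. 8080 'MOV A,M' (load A via HL) would become something like 'MOV AL,[BX]'
print(translate("A"), translate("HL"))   # AL BX
```

The one-to-one fit of the register files is the structural reason 8080 source code translates so mechanically to the 8086.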
Title: Re: Zilog & Z80 History
Post by: paf on July 12, 2022, 07:43:47 pm
Well, IBM had a 68000 based computer in those days:
https://en.wikipedia.org/wiki/IBM_System_9000 (https://en.wikipedia.org/wiki/IBM_System_9000)
Title: Re: Zilog & Z80 History
Post by: SiliconWizard on July 12, 2022, 07:45:52 pm
HP had workstations with 68k's too.

Anyway, whoever thinks commercial success is in any way directly linked to the technical merits would be severely deluded. ::)
Title: Re: Zilog & Z80 History
Post by: chickenHeadKnob on July 12, 2022, 08:39:00 pm
Well, IBM had a 68000 based computer in those days:
https://en.wikipedia.org/wiki/IBM_System_9000 (https://en.wikipedia.org/wiki/IBM_System_9000)

IBM also had the XT/370 and AT/370. These were co-processor cards you put into a PC that ran a custom version of VM/370. They had an MC68000 on board, and I seem to remember they may have had custom microcode to emulate the 370 instruction set.
https://en.wikipedia.org/wiki/PC-based_IBM_mainframe-compatible_systems#Personal_Computer_XT/370 (https://en.wikipedia.org/wiki/PC-based_IBM_mainframe-compatible_systems#Personal_Computer_XT/370)
Title: Re: Zilog & Z80 History
Post by: David Hess on July 12, 2022, 09:09:13 pm
What?  The 8088 was an 8086 with an 8-bit external data bus.  Both are 16-bit processors and both have the same 20-bit segmented memory model and will run exactly the same binaries. They have absolutely nothing in common with the Z80 or 8080.  The 8086 was released in 1978, designed as a general-purpose personal computer microprocessor.

Besides being able to run translated 8080 code, the 8088, and nominally the 8086, (1) maintained compatibility with the 8080 peripheral chips, which were available at low cost.

(1) The 8086, with its 16-bit-only external bus, could use the same 8-bit peripheral chips, but it seems like the addressing would be different, so I am not sure how they handled it if they wanted compatibility.  The 16-bit external bus has separate strobes for the upper and lower halves to support 8-bit access, so I guess a separate set of latches and some glue logic would be required to access 8-bit peripherals in a compatible way; early 8086 systems did have compatibility problems.


Title: Re: Zilog & Z80 History
Post by: David Hess on July 12, 2022, 09:21:16 pm
IBM started designing the PC in 1980.  The 68008 didn't come out until 1982, so it didn't exist and wasn't an option.  The 8088 was the only 16-bit processor available with an 8-bit external data bus.  The latter was important for keeping board complexity down and fitting everything on a single mainboard.  The 68000 was also very new when IBM started designing the PC, while the 8086 series had been shipping for a number of years and could be sourced in volume.  I'm sure that was a factor.

The way I remember it, Motorola could have produced the 68008 much earlier, but brushed IBM off for whatever reason and then changed their mind too late.  My own later experiences dealing with Motorola duplicated this "our way or the highway" attitude.


I do not remember now, but Motorola might not have agreed to the second source agreement IBM demanded as quickly as Intel did.  In any event, the 8088 was definitely less expensive.
Title: Re: Zilog & Z80 History
Post by: Sal Ammoniac on July 12, 2022, 11:11:34 pm
The Z80 lives on! Zilog still sells several MCUs based on the Z80 with various enhancements.

Rabbit MCUs have a Z80-compatible core. https://www.digi.com/products/browse/rabbit (https://www.digi.com/products/browse/rabbit)
Title: Re: Zilog & Z80 History
Post by: DiTBho on July 13, 2022, 09:31:53 am
My own experiences dealing with Motorola later duplicated this "our way or the highway" attitude from Motorola.

Yup, I remember a seminar; the brochure they gave us proudly said exactly that, "moto-way or the highway"  :o :o :o

(they were advertising mc683xx for Ford Racing)
Title: Re: Zilog & Z80 History
Post by: Kleinstein on July 13, 2022, 09:38:09 am
Initially the 68000 was quite expensive, and at that cost level the 16-bit bus was not a big deal; one would usually need multiple ROM and RAM chips anyway.
So I don't think an earlier 68008 would have changed much. It is only rather limited systems that really profit from a smaller bus. It was only later that small systems, like those in test instruments, could get away with a single ROM and maybe one SRAM chip and still wanted more than a 6809 or Z80.

The 8088 was more of an oddity that allowed building a slightly cheaper 16-bit computer, though at reduced performance. So IBM's choice was kind of odd, more like part of rushing it to market. They mainly had to outperform Z80-based CP/M systems.  Very few of the PC clones used the 8088 - using the 8086 (or later a NEC V...) was the more obvious choice and gave them extra performance with little extra costs.
Title: Re: Zilog & Z80 History
Post by: brucehoult on July 13, 2022, 10:19:03 am
Very few of the PC clones used the 8088 - using the 8086 (or later a NEC V...) was the more obvious choice and gave them extra performance with little extra costs.

That's not how I remember it.

Other MS-DOS / CP/M-86 machines that were *not* PC clones did often use an 8086 or v30, but the IBM clones that could run Flight Simulator and 1-2-3 and so forth stuck with the 8088 so that things also ran the same speed as on the IBM (sometimes with a switchable "Turbo" clock speed).
Title: Re: Zilog & Z80 History
Post by: iMo on July 13, 2022, 11:02:12 am
I got my Atari 520STM (68000, I upgraded it to 1MB dram soon) in Oct 1986. At that time IBM PCs were a joke..
Title: Re: Zilog & Z80 History
Post by: DiTBho on July 13, 2022, 01:30:44 pm
HP had workstations with 68k's too.

At the last hacking camp, Ania, the Estonian girl with weird hair and mirror lenses, showed me the Beehive ATL-008, a very nice ANSI/VT102 terminal that supports user programming in C and, she explained, is based on the 68008, with quite a nice interface and extensions like soft keys.

She also typed on a Durango VT-Poppy, fully restored and repainted in fresh bright green :o :o :o
Title: Re: Zilog & Z80 History
Post by: peter-h on July 13, 2022, 01:42:25 pm
The 68k won (for a long time, until 80x86 took over the universe) because it had 32-bit linear addressing, while everybody else was trying to flog the dead horse of segmentation, which was a crap solution to just about everything unless you were an "academic OS purist" who wanted complex memory-protection schemes that nobody actually wanted for any real product.

Zilog made the same mistake with the Z8000 (Z8001), which was also stupidly segmented. Only the Z80000 could do linear addressing, but it (a) came too late and (b) was NMOS and drew loads of power (I saw some samples).

The rest of the story was, as has been said, good and bad timing.

Very interesting document on the Zilog history... thanks!

I was, according to Zilog, the first European design-in for the Z280. Great chip, lots of potential. Must have used ~10k of them. Wrote a simple RTOS for it too, and then the 64K addressing was completely circumvented (max task size still 64K).
Title: Re: Zilog & Z80 History
Post by: brucehoult on July 13, 2022, 01:44:45 pm
I got my Atari 520STM (68000, I upgraded it to 1MB dram soon) in Oct 1986. At that time IBM PCs were a joke..

The Atari ST was not bad at all. The thing that always got me about the Amiga and Atari machines was the poor quality of the screens and keyboards. Not really something you could use 12+ hours a day. Apple was always ahead on that, even in the Apple ][ days.

From 1984 I pretty much always had the ability to take a university or work Mac home, so I never bought anything myself, but I finally took the plunge in 1989, when it was starting to be possible (in New Zealand) to dial into a local BBS that had at first usenet and email, then later full internet with telnet and ftp and so forth. And 2400 bps modems at reasonable prices had also just arrived. So I got a Mac IIcx with a "full page" (640x870) mono screen and, I think, initially a 65 MB Rodime hard disk. I forget how much RAM. Probably 4 or 8 MB.
Title: Re: Zilog & Z80 History
Post by: coppice on July 13, 2022, 02:29:42 pm
- Z80 has more instructions than i8080
The Z80 had a LOT more instructions than an 8080. It also had a LOT more registers than an 8080. And almost none of that was used by 99% of the Z80 chips out there. For all its sophistication, it was generally used as a faster 8080 that needed only a 5 V supply and had a little more integration.
Title: Re: Zilog & Z80 History
Post by: peter-h on July 13, 2022, 02:36:04 pm
It depends on how you programmed. In the early days most code was asm, and then everybody did use the extra instructions, and often the extra registers. For example, float maths using the alternate register set (EXX and EX AF,AF') was much faster. This was in the days when floats took milliseconds at least, and that was a real problem. The extra registers also made 32-bit integer maths much faster, which was the "smart" way to get precision and performance.

C compilers were also crap in those days.
Title: Re: Zilog & Z80 History
Post by: coppice on July 13, 2022, 02:37:06 pm
Very few of the PC clones used the 8088 - using the 8086 (or later a NEC V...) was the more obvious choice and gave them extra performance with little extra costs.

That's not how I remember it.

Other MS-DOS / CP/M-86 machines that were *not* PC clones did often use an 8086 or v30, but the IBM clones that could run Flight Simulator and 1-2-3 and so forth stuck with the 8088 so that things also ran the same speed as on the IBM (sometimes with a switchable "Turbo" clock speed).
You are right. The IBM PC clone market went exactly the same way as IBM: 8088s for the early machines, moving to the 80286 as they expanded to more capable machines. So much early MS-DOS code, especially games, was so clock-rate specific that even using a V20 instead of an 8088 risked alienating some potential customers.
Title: Re: Zilog & Z80 History
Post by: Sal Ammoniac on July 13, 2022, 04:20:56 pm
- Z80 has more instructions than i8080
The Z80 had a LOT more instructions than an 8080. It also had a LOT more registers than an 8080. And almost none of that was used by 99% of the Z80 chips out there. For all its sophistication, it was generally used as a faster 8080 that needed only a 5 V supply and had a little more integration.

That was probably the case with CP/M and application programs, which had to run on both 8080-based and Z80-based systems. For embedded applications we used every new feature of the Z80--the new registers and the new instructions. I was particularly fond of the EXX and EX AF,AF' instructions for minimizing interrupt-handler overhead.
Title: Re: Zilog & Z80 History
Post by: peter-h on July 13, 2022, 05:38:02 pm
Exactly. I implemented CP/M 2.2 on a number of machines, writing a BIOS.

The techniques used then for performance were the same as today. You used DMA, for example, just as today you use DMA in an ARM32.

Stuff which was really hard to do was fast floats, so printf/scanf-type functions were slow as a pig. We used a £1k IAR C compiler whose runtimes were also written in C, so you got a double hit :) And their code was really dumb. I rewrote a lot of it in asm, but the real speedup came from specialisation of the data formats.
Title: Re: Zilog & Z80 History
Post by: David Hess on July 13, 2022, 07:35:59 pm
Initially the 68000 was quite expensive, and at that cost level the 16-bit bus was not a big deal; one would usually need multiple ROM and RAM chips anyway.

So I don't think an earlier 68008 would have changed much. It is only rather limited systems that really profit from a smaller bus. It was only later that small systems, like those in test instruments, could get away with a single ROM and maybe one SRAM chip and still wanted more than a 6809 or Z80.

The 8088 was more of an oddity that allowed building a slightly cheaper 16-bit computer, though at reduced performance. So IBM's choice was kind of odd, more like part of rushing it to market. They mainly had to outperform Z80-based CP/M systems.

The cost advantage came from using memory with half the bank width.  Back then memory was a major part of the cost of a system, and an 8088 system could have half the minimum memory size of an 8086 (or 68000) system.  The memory size could also be incremented in half-sized steps.

Quote
Very few of the PC clones used the 8088 - using the 8086 (or later a NEC V...) was the more obvious choice and gave them extra performance with little extra costs.

Huh?  They practically all used the 8088 (or NEC V20), except for systems which were not completely hardware compatible but could still run MS-DOS or CP/M-86.  I think the hardware compatibility issue came from the addressing difference that I described earlier.  Eventually there were some 8086 systems which were hardware compatible, or at least supposed to be.
Title: Re: Zilog & Z80 History
Post by: David Hess on July 13, 2022, 07:39:43 pm
Zilog made the same mistake with the Z8000 (Z8001) which was also stupidly segmented. Only the Z80000 could do linear addressing but it a) came too late and b) was NMOS and drew loads of power (I saw some samples).

They did make the same mistake, but when the 8088 was adopted by IBM, segmentation was seen as a great improvement over the bank switching which was becoming available on 8080 and Z80 CP/M machines to support more than 64 kilobytes.

If the 68000 had become the ISA standard, presumably via the 68008, then we would be complaining about a different set of legacy features, like separate address and data registers, and unrecoverable double faults from indirect addressing modes.
Title: Re: Zilog & Z80 History
Post by: SiliconWizard on July 13, 2022, 08:05:46 pm
Zilog made the same mistake with the Z8000 (Z8001) which was also stupidly segmented. Only the Z80000 could do linear addressing but it a) came too late and b) was NMOS and drew loads of power (I saw some samples).

They did make the same mistake, but when the 8088 was adopted by IBM, segmentation was seen as a great improvement over the bank switching which was becoming available on 8080 and Z80 CP/M machines to support more than 64 kilobytes.

If the 68000 had become the ISA standard, presumably via the 68008, then we would be complaining about a different set of legacy features, like separate address and data registers, and unrecoverable double faults from indirect addressing modes.

Yes of course. Complaining is a sport.
And history shows us that the main blocker is not any particular technical limitation; it's our resistance to any change. Which is why we have stuck with mediocre architectures for decades. Still doing useful stuff with them.
Title: Re: Zilog & Z80 History
Post by: langwadt on July 13, 2022, 08:57:20 pm
Zilog made the same mistake with the Z8000 (Z8001) which was also stupidly segmented. Only the Z80000 could do linear addressing but it a) came too late and b) was NMOS and drew loads of power (I saw some samples).

They did make the same mistake, but when the 8088 was adopted by IBM, segmentation was seen as a great improvement over the bank switching which was becoming available on 8080 and Z80 CP/M machines to support more than 64 kilobytes.

If the 68000 had become the ISA standard, presumably via the 68008, then we would be complaining about a different set of legacy features like separate address and data registers, and unrecoverable double faults from double-indirect addressing modes.

Yes of course. Complaining is a sport.
And history shows us that the main blocker is not any particular technical limitation, it's our resistance to any change. Which is why we have stuck to mediocre architectures for decades. Still doing useful stuff with it.

Reminds me of a lecture on Allied technology and war production during WWII: designs generally fell into three categories: too expensive, too late, and good enough. The choice was obvious.
Title: Re: Zilog & Z80 History
Post by: DiTBho on July 13, 2022, 09:00:11 pm
Complaining is a sport.

Except in my previous post, one page ago, when I complained about the Intel-TSO: in that case, it is mandatory to complain about Intel ;D

Especially because neither Intel nor AMD reveals the internal details of their x86 memory-consistency implementation, so you can't verify whether they satisfy the axioms in every case.

I don't know if they do that because they don't want to give an advantage to their competitors (who?), and I don't care too much; for me, only IBM POWER10 and POWER11 give a general consensus on which model is adequate for multi-processor workloads.
Title: Re: Zilog & Z80 History
Post by: SiliconWizard on July 13, 2022, 09:04:09 pm
Complaining is a sport.

Except in my previous post, one page ago, when I complained about the Intel-TSO: in that case, it is mandatory to complain about Intel ;D

Just because this is a sport doesn't mean it's never justified.
The general point was that we usually prefer sticking to what is familiar, even when it sucks, which makes us complain and find workarounds. And then, it becomes good enough. As langwadt pointed out. ;D
Title: Re: Zilog & Z80 History
Post by: David Hess on July 14, 2022, 01:51:45 am
And history shows us that the main blocker is not any particular technical limitation, it's our resistance to any change. Which is why we have stuck to mediocre architectures for decades. Still doing useful stuff with it.

Backwards compatibility is a virtue in itself.  The 8086/8088 and MS-DOS started with an advantage over the competition by being backwards compatible with the established 8080 and CP/M base.

The inferior option was "just recompile".  The RISC vendors tried that.

Title: Re: Zilog & Z80 History
Post by: brucehoult on July 14, 2022, 02:58:30 am
And history shows us that the main blocker is not any particular technical limitation, it's our resistance to any change. Which is why we have stuck to mediocre architectures for decades. Still doing useful stuff with it.

Backwards compatibility is a virtue in itself.  The 8086/8088 and MS-DOS started with an advantage over the competition by being backwards compatible with the established 8080 and CP/M base.

The inferior option was "just recompile".  The RISC vendors tried that.

"Recompile the important stuff and emulate the rest" has worked pretty well for Apple over three transitions between four very different ISAs: M680x0 -> PowerPC -> AMD64 -> ARM64.

Actually, there was also i686 in there briefly, with a few Pentium 4 machines sold to developers and then 32 bit Core Solo and Core Duo CPUs going into a few machines (Mac Mini, 17" and 20" iMacs, 13" MacBooks, 15" and 17" MacBook Pro) from February 2006 until Core2 Duo machines were introduced starting in September 2006 and moving through the range over the next six months or so.

The 8080 -> 8086/8088 transition also of course required a recompile, or at least a mechanical translation from one assembly language to the other. Some 8080 instructions required more than one 8086/8088 instruction.
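That mechanical translation route actually existed: Intel shipped a translator (CONV86) built around a fixed register mapping (A→AL, B/C→CH/CL, D/E→DH/DL, H/L→BH/BL, with the M pseudo-register becoming a byte access at [BX]). A minimal sketch in that spirit; the register mapping is the published one, but the three-entry instruction table is a hypothetical subset for illustration only:

```python
# Toy 8080 -> 8086 assembly translator in the spirit of Intel's CONV86.
REG_MAP = {"A": "AL", "B": "CH", "C": "CL",
           "D": "DH", "E": "DL", "H": "BH", "L": "BL",
           "M": "[BX]"}  # 8080's M pseudo-register = byte at (HL)

def translate(line):
    """Translate a single 8080 instruction to 8086 syntax (tiny subset)."""
    op, _, args = line.strip().partition(" ")
    regs = [REG_MAP[a.strip()] for a in args.split(",")] if args else []
    if op == "MOV":                       # MOV dst,src -> MOV dst,src
        return "MOV {}, {}".format(*regs)
    if op == "ADD":                       # ADD src -> ADD AL,src
        return "ADD AL, {}".format(regs[0])
    if op == "XRA":                       # XRA src -> XOR AL,src
        return "XOR AL, {}".format(regs[0])
    raise ValueError("not in this toy subset: " + op)

print(translate("MOV A,M"))   # MOV AL, [BX]
print(translate("ADD B"))     # ADD AL, CH
```

A real translator also has to cope with the cases brucehoult mentions, where one 8080 instruction expands to several 8086 instructions (flag fix-ups, for instance), which is why translated code was bigger and slower than native code.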
Title: Re: Zilog & Z80 History
Post by: SiliconWizard on July 14, 2022, 03:15:17 am
Yes, Apple is an excellent example of that. Oh, there's also a version of Win 10 for ARM, btw.

But sure, minimizing the *apparent* effort almost always wins. So we stick to what we know works.
That's also why I chose the example of IBM and Commodore. Neither believed in the personal computer. They wanted to keep doing what they knew sold.
It took someone very persuasive at IBM to pursue the project, and the rest is history.

Sometimes it's just a matter of time. The RISC approach may have been marginal when the first CPUs were released, but it's now literally everywhere.
Title: Re: Zilog & Z80 History
Post by: DiTBho on July 14, 2022, 06:52:41 am
z80 and 8080 ...good old times when recompiling meant looking at the assembly-list ...  :D

(and ".org" didn't mean what it currently means in GNU-AS way)
Title: Re: Zilog & Z80 History
Post by: DiTBho on July 14, 2022, 07:36:10 am
Sometimes it's just a matter of time. The RISC approach may have been marginal when the first CPUs were released, but it's now literally everywhere.

The 6502 is often said to be a sort of "proto-RISC" design, but in the beginning, there was TV!

The British BBC programme called "The Computer Programme" was an attempt by the BBC to educate Britons about just what the hell all these new fancy machines that looked like crappy typewriters connected to your telly were all about. Anyway, the TV show explained fundamental computing concepts and taught a bit of BASIC programming.

Acorn was commissioned by the BBC to make kits for children, and the BBC Micro proved to be a big success (they made a lot of money) for Acorn, becoming the dominant educational computer in the UK in the 1980s.

At that time, Acorn was a Cambridge-based firm that started in 1979, after developing computer systems originally designed to run fruit machines (slot machines) and then turning them into small hobbyist computer systems based on 6502 processors.


The 6502 was not enough if they wanted to compete, so they looked at the other processors available at the time.
None of these were really doing the job, though, and Acorn reached out to Intel to see about using Intel 80286 CPUs in their new architecture.

Thank God, Intel ignored them completely  :-DD

And so Acorn next made the fateful decision to design their own CPU. Inspired by the company that was developing new 6502 versions (Western Design Center) and by various research into a new processor design concept called Reduced Instruction Set Computing (RISC), Acorn decided to move ahead. Engineers Steve Furber and Sophie Wilson proved to be key players on the project.

Many, many months later, using the BBC Micro's Tube interface as a "testbed", they had the new RISC-based CPU, called ARM (I have the full story in a book about that). The chip manufacturing supplier VLSI began to produce ARM CPUs, first for Acorn's internal research and development. Not long after, a production version, the ARM2, was ready.

1987, Acorn RISC, with better performance than Intel's 286

After the launch of the Macintosh, Steve Jobs was fired, and Apple's new CEO, John Sculley (formerly of Pepsi-Cola), came sniffing around in the late 1980s looking for a CPU powerful enough to translate handwriting into text and run a GUI, all while being powered by AA batteries and without turning the handheld device it was to run into a hand-burning block of pain.

That's why Apple and Acorn's chip partner VLSI partnered with Acorn to spin off the ARM division into its own new company, called Advanced RISC Machines, allowing the ARM name to stick (at the beginning, A.R.M. stood for "Acorn RISC Machines"). Under this alliance, with Apple's considerable resources added, ARM would develop the ARM6 core, with the ARM610 CPU being the first production chip based on that core; in a 20 MHz version, it would go on to power the Apple Newton in 1993.

1993,  Apple Newton, powered by an ARM6 core

just a matter of time ...  and domino-effect ;D
Title: Re: Zilog & Z80 History
Post by: RoGeorge on July 14, 2022, 07:57:51 am
The Z80 had a LOT more instructions than an 8080.

Not only that, but it has even more instructions than a lot more!  ;D

I mean there were a few undocumented instructions.  Those undocumented instructions were not intended by design, but they were a side effect of the design.  I've used those a few times to make my own programs harder to disassemble, and so harder to reverse engineer.



I implemented CP/M 2.2 on a number of machines, writing a BIOS.
...
We used a £1k IAR C compiler ...

Same here!  :D
I had to write my own CP/M BIOS, too (for the Z80 computer I've posted in the beginning of this thread).

The most complicated thing was reading/writing floppy-disk sectors, since my Z80 design didn't have a DMA chip, so the data transfer was very tricky: I optimized it to work on interrupts and kept track of each clock cycle of the CPU (adding them by hand so it wouldn't be late for the next data coming from the floppy-disk controller).  I was a teenager, so no money for fancy compilers.  All the BIOS was written in Z80 assembly, and stored on a cassette-tape recorder before the BIOS routines for the floppy were good to use.

Not sure if mine was CP/M 2.0 or 2.2, but I remember the BIOS (Basic Input Output System) part was pretty simple; it had 7 entry points for very basic hardware abstraction, like reading a key from the keyboard, typing a character on the screen, printing a character to a printer, reading/writing a floppy-disk sector, etc.

The higher abstractions were made in the BDOS (Basic Disk Operating System) and CCP (Console Command Processor) components of the CP/M OS, but I had those as binaries and didn't have to write them, too.  It was all a hack, with the CP/M copied as a binary from another 8080 CP/M computer; then the addresses of the entry points of the BIOS subroutines were identified, and in the same memory area I placed my own BIOS binaries.  The luck was that BIOS, BDOS and CCP were separated not only as software layers, but as memory locations for the binaries, too.

(http://cpmarchives.classiccmp.org/cpm/mirrors/www.geocities.com/SiliconValley/5711/arch_mem.gif)
Image source:  http://cpmarchives.classiccmp.org/cpm/mirrors/www.geocities.com/SiliconValley/5711/architec.html (http://cpmarchives.classiccmp.org/cpm/mirrors/www.geocities.com/SiliconValley/5711/architec.html)

The only documentation I had was a CP/M listing printed on tractor paper in 8080 assembly, and an unrelated book about 8272 floppy-disk controller.  ::)



Speaking of CP/M, found out this morning CP/M is now free, from this HaD article:
https://hackaday.com/2022/07/13/cp-m-is-now-freer-than-it-was/ (https://hackaday.com/2022/07/13/cp-m-is-now-freer-than-it-was/)
http://www.cpm.z80.de/license.html (http://www.cpm.z80.de/license.html)
Title: Re: Zilog & Z80 History
Post by: iMo on July 14, 2022, 09:01:13 am
I got my Atari 520STM (68000, I upgraded it to 1MB dram soon) in Oct 1986. At that time IBM PCs were a joke..
Before that I had a ZX81 (with DIY 16kB DRAM) and a ZX Spectrum 48kB, both Z80 of course. I did a lot of programming in BASIC and ASM. The 68k in the Atari was a completely different dimension, however.. You cannot compare the Z80 and the 68k, no way..
Title: Re: Zilog & Z80 History
Post by: peter-h on July 14, 2022, 01:43:34 pm
Quote
Those undocumented instructions were not intended by design, but they were a side effect of the design

Reportedly, some clones didn't have these.
Title: Re: Zilog & Z80 History
Post by: DiTBho on July 15, 2022, 11:14:21 am
is there any z80 laptop?
are there other z80 vt-terms?
Title: Re: Zilog & Z80 History
Post by: oPossum on July 15, 2022, 12:40:16 pm
is there any z80 laptop?

https://en.wikipedia.org/wiki/Cambridge_Z88
Title: Re: Zilog & Z80 History
Post by: peter-h on July 15, 2022, 01:32:11 pm
Quote
is there any z80 laptop?
are there other z80 vt-terms?

There were no Z80 laptops because the technology required to build a "laptop" didn't exist until many years later.

In the early 1980s I did most of the design of a specialised portable computer, named "Visaid", to be carried by pharmaceutical sales reps. About 15 x 30 x 30cm. It looked somewhat like the Osborne "portable" computer but much nicer. It would run graphical presentations on how drugs worked in the body. So we had to do animated images of things like a beating heart. It had a Z80, 8MHz, Z80 DMA, a CPU board about 30x30cm, a custom built graphics controller (I vaguely remember an NEC UPD7220) a 9" colour CRT for which I designed the drive board, a NiCd battery pack big enough for about an hour, a mains (110-240V) switch-mode charger / power supply. A London advertising agency developed the visual material to be presented, and a guy with long ginger hair and sandals and a Communist Party membership card wrote a software package (it ran CP/M 2.2) which would be basically a video effects editor to be used by the ad agency. The data to be displayed lived on a 3.5" diskette, encoded as a list of macros for which I wrote an interpreter. The box was a foam injection moulding, like some PC tower cases in recent years, and about 5mm thick. Sprayed internally with a zinc coating for EMC reasons. In the lid was a keyboard, which we also built, with all the buttons populated on a PCB.

That was state of the art in miniaturisation!

And it actually worked! I think we made about 30 of them. It was scrapped after a year or so because only male salesmen with the build of Geoff Capes could carry it (it was over 10kg). There wasn't anything actually wrong with the performance, for that job. Obviously the graphics wasn't 4K (it was something like 320x240) but the same system was used for the software development (Macro-80 assembler etc, and a Dbase 2 database for storing the sequence of graphics commands).

It could never have run something like Win 3.1, etc. But in those days people didn't need that. What sold in fantastic quantities was a word processor, a spreadsheet, a means of printing onto paper, and stuff like that.

The Z80 later went to something like 50MHz.

By the time it became possible to build a "laptop", CPU technology moved on to run GUIs properly.
Title: Re: Zilog & Z80 History
Post by: David Hess on July 15, 2022, 05:19:16 pm
"Recompile the important stuff and emulate the rest" has worked pretty well for Apple over three transitions between four very different ISAs: M680x0 -> PowerPC -> AMD64 -> ARM64.

It barely worked for Apple, but they had no choice, having initially relied on Motorola; the 68k had big problems when scaled up.  How many customers did they shed doing that?  How close did they come to bankruptcy?

IBM managed backwards compatibility with their mainframes very successfully; they knew how important it was to their customers.  The Z80 did it very successfully by building off of the established 8080; would it have amounted to anything otherwise?

is there any z80 laptop?

There were several Z80 "luggables" from companies like Osborne and Kaypro.  I worked on a Kaypro II for a while.  I remember at least being aware of most of the systems listed below.

In 1983 the Tandy TRS-80 Model 100 notebook used an 80C85, and the Canon X-07 used the NSC800, a Z80-instruction-set-compatible microprocessor (https://www.cryptomuseum.com/spy/fs5000/files/NSC800.pdf).  The Tandy was very popular, and some people still preferred them long after they were out of production.  Nothing made today really compares; it could operate for 20 hours on 4 x AA cells.

https://en.wikipedia.org/wiki/TRS-80_Model_100#Similar_computers_from_other_companies (https://en.wikipedia.org/wiki/TRS-80_Model_100#Similar_computers_from_other_companies)

1982 - 8086 - Grid Compass
1983 - 80C85 - Tandy TRS-80 Model 100
1983 - NSC800 (Z80) - Canon X-07
1985 - Z80 - Bondwell-2
1987 - Z80 - Cambridge Z88
Title: Re: Zilog & Z80 History
Post by: SiliconWizard on July 15, 2022, 05:29:38 pm
is there any z80 laptop?

https://en.wikipedia.org/wiki/Cambridge_Z88

And then the Amstrad NC200. And I'm sure quite a few others.
Title: Re: Zilog & Z80 History
Post by: langwadt on July 15, 2022, 05:46:38 pm
Quote
Those undocumented instructions were not intended by design, but they were a side effect of the design

Reportedly, some clones didn't have these.

so they were reimplementations and not verbatim clones?
Title: Re: Zilog & Z80 History
Post by: RoGeorge on July 15, 2022, 06:31:31 pm
I think I've only tested the hidden instructions on the 4 brands I had back then: the Zilog Z80, the Japanese NEC D780, the DDR (former East Germany) MME U880, and the Romanian MMN80-CPU, and maybe 1-2 more from the Eastern European bloc countries, I don't recall for sure; all DIL 40 pins.

What I remember for sure is that all the models I tested behaved the same, including when running the hidden instructions.  There might be some out there that don't, just that I never met any.
Title: Re: Zilog & Z80 History
Post by: David Hess on July 15, 2022, 06:45:16 pm
Quote
Those undocumented instructions were not intended by design, but they were a side effect of the design

Reportedly, some clones didn't have these.

so they were reimplementations and not verbatim clones?

Ken Shirriff's article here (http://www.righto.com/2016/02/reverse-engineering-arm1-instruction.html) describes what is going on.  The instruction decoder relies on a programmable logic array (PLA) instead of a ROM, so an undecoded instruction produces an undefined output.
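The PLA mechanism is easy to picture: a set of AND terms, each matching certain opcode bit patterns, whose outputs are OR'd together into the control word. Any opcode, documented or not, simply activates whichever terms happen to match. A toy model; the terms and control-line names here are made up for illustration, not the real Z80 decode:

```python
# Toy PLA-style instruction decoder: each term fires when
# (opcode & mask) == match; the control word is the union of the
# outputs of all firing terms.
TERMS = [
    (0b11000000, 0b01000000, {"ld_r_r"}),                # 01 xxx xxx
    (0b11000000, 0b10000000, {"alu_op"}),                # 10 xxx xxx
    (0b11000111, 0b11000110, {"alu_op", "imm_operand"}), # 11 xxx 110
]

def decode(opcode):
    ctrl = set()
    for mask, match, outputs in TERMS:
        if opcode & mask == match:
            ctrl |= outputs
    return ctrl

# A documented opcode activates the intended term; an opcode the
# designers never listed still activates whatever terms match, giving
# well-defined but undocumented behaviour rather than a clean no-op.
```

This is also why the undocumented behaviour is usually consistent: the leftover decode is fixed by the PLA contents, so a faithful die-level copy reproduces it exactly, while a reimplementation generally won't.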
Title: Re: Zilog & Z80 History
Post by: langwadt on July 15, 2022, 06:57:12 pm
Quote
Those undocumented instructions were not intended by design, but they were a side effect of the design

Reportedly, some clones didn't have these.

so they were reimplementations and not verbatim clones?

Ken Shirriff's article here (http://www.righto.com/2016/02/reverse-engineering-arm1-instruction.html) describes what is going on.  The instruction decoder relies on a programmable logic array (PLA) instead of a ROM, so an undecoded instruction produces an undefined output.

"programmable" so they were different depending mask programming?

if they were verbatim clones, you'd have the same undecoded, undefined outputs
Title: Re: Zilog & Z80 History
Post by: m k on July 15, 2022, 07:28:10 pm
Clones, not copies.
East had no such restrictions.

I remember that all values were present, or not blocked.
So A = A + A was present, like all similar variations of all instructions.
Title: Re: Zilog & Z80 History
Post by: SiliconWizard on July 15, 2022, 07:33:14 pm
Were there any *useful* undocumented instructions though? I admit I never really got interested in them.
The one you just mentioned, for instance, is just doubling A, which could be done with a left shift.

So, does anyone have a curated list of undocumented instructions showing some that could actually be interesting?
Title: Re: Zilog & Z80 History
Post by: m k on July 15, 2022, 07:48:24 pm
I remember that some hardware had shortcuts but can't remember which one.
Title: Re: Zilog & Z80 History
Post by: SiliconWizard on July 15, 2022, 07:51:21 pm
I remember that some hardware had shortcuts but can't remember which one.

Ah, meaning that possibly some undocumented instructions could be executing faster than "equivalent" standard instructions?  I have a hard time finding much info on that right now.
Title: Re: Zilog & Z80 History
Post by: langwadt on July 15, 2022, 08:02:11 pm
Were there any *useful* undocumented instructions though? I admit I never really got interested in them.
The one you just mentioned, for instance, is just doubling A, which could be done with a left shift.

So, does anyone have a curated list of undocumented instructions showing some that could actually be interesting?

http://www.z80.info/z80undoc.htm (http://www.z80.info/z80undoc.htm)
Title: Re: Zilog & Z80 History
Post by: David Hess on July 15, 2022, 08:36:33 pm
"programmable" so they were different depending mask programming?

That is right, but unlike a mask ROM, unused terms can be removed from the programming.

Quote
if they were verbatim clones, you'd have the same unencoded undefined outputs

That is true, but there are multiple programmed configurations of the PLA which produce the same behaviour for the defined instructions, and maybe they used some of the undefined instructions for their own purposes, like self-testing.
Title: Re: Zilog & Z80 History
Post by: brucehoult on July 16, 2022, 12:20:49 am
Clones, not copies.
East had no such restrictions.

I remember that all values were present, or not blocked.
So A = A + A was present, like all similar variations of all instructions.

Wait, what? ADD A,A is not documented?

Admittedly it's a long long time since I saw a real z80 manual, and I just go off sites such as https://clrhome.org/table/ (https://clrhome.org/table/) and http://z80-heaven.wikidot.com/instructions-set (http://z80-heaven.wikidot.com/instructions-set) and it's right there at 87. Same for ADD HL,HL and ADD IX,IX and ADD IY,IY.

I don't know why they wouldn't be official instructions. The semantics are obvious, they are useful, and it's probably harder to not have them than to allow them.

This looks like an official Intel manual for the 8080 and the section "REGISTER OR MEMORY TO ACCUMULATOR INSTRUCTIONS" lists A as a valid source operand (as well as the implied destination), with code 111, right after 110 for M (which is "(HL)" in Zilog-speak). So you have ADD A, ADC A, XRA A, CMP A etc.

https://altairclone.com/downloads/manuals/8080%20Programmers%20Manual.pdf (https://altairclone.com/downloads/manuals/8080%20Programmers%20Manual.pdf)
Title: Re: Zilog & Z80 History
Post by: David Hess on July 16, 2022, 03:41:42 am
A more useful instruction was to AND or OR the accumulator with itself and place the result in the accumulator, to deliberately set the flags based on the contents of the accumulator without changing it.  XORing the accumulator with itself is a way to zero the accumulator, but it destroys the flags, so a load is sometimes preferred.

There is a small difference in the flag results between adding the accumulator to itself and a left shift of the accumulator, but I do not see how it would be useful.
Title: Re: Zilog & Z80 History
Post by: brucehoult on July 16, 2022, 04:05:29 am
A more useful instruction was to AND or OR the accumulator with itself and place the result in the accumulator, to deliberately set the flags based on the contents of the accumulator without changing it.  XORing the accumulator with itself is way to zero the accumulator, but destroys the flags so a load is preferred sometimes.

There is a small difference in the flag results between adding the accumulator to itself and a left shift of the accumulator, but I do not see how it would be useful.

Adding a number to itself is useful because you often want to double a number.

It's also a single-bit left shift, which makes a dedicated shift-left instruction basically useless and redundant. Some machines have a right shift but no left shift for this reason.
Title: Re: Zilog & Z80 History
Post by: David Hess on July 16, 2022, 04:51:39 am
A more useful instruction was to AND or OR the accumulator with itself and place the result in the accumulator, to deliberately set the flags based on the contents of the accumulator without changing it.  XORing the accumulator with itself is way to zero the accumulator, but destroys the flags so a load is preferred sometimes.

There is a small difference in the flag results between adding the accumulator to itself and a left shift of the accumulator, but I do not see how it would be useful.

Adding a number to itself is useful because you often want to double a number.

It's a single bit shift left which is a basically useless and redundant instruction. Some machines have right shift but not left shift for this reason.

I am not questioning the usefulness.  Adding a number to itself is a multiply by two, as is a left shift, but there are deliberate differences in how these instructions affect the flags because of their intended applications.  On the Z80 I think the only flag which would matter in this specific case is parity and overflow since they are the same flag.

The 8086 did not share flags like the Z80, and RISC processors do not bother storing stateless flags.
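Per the standard Z80 flag tables, the divergence in this case is in the P/V flag: ADD A,A sets it as signed overflow, while SLA A sets it as parity of the result (the H flag also differs). A small sketch of just the P/V computation; for A = 0x40 the two instructions produce the same result byte but opposite P/V:

```python
# P/V flag difference between Z80 "ADD A,A" and "SLA A" (sketch).
# ADD sets P/V = signed overflow; SLA sets P/V = even parity of result.
def pv_add_a_a(a):
    res = (a + a) & 0xFF
    # Both operands are A, so overflow iff the sign bit changed.
    return int((a ^ res) & 0x80 != 0)

def pv_sla_a(a):
    res = (a << 1) & 0xFF
    return int(bin(res).count("1") % 2 == 0)   # P set on even parity

# A = 0x40 (+64): both give the result byte 0x80, but ADD flags the
# signed overflow (+64 + +64 = -128) while SLA reports odd parity.
print(pv_add_a_a(0x40), pv_sla_a(0x40))
```

As David Hess says, it's hard to see a use for the difference here, but it shows the two opcodes are genuinely distinct instructions, not aliases.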
Title: Re: Zilog & Z80 History
Post by: Kleinstein on July 16, 2022, 07:50:00 am
Combinations like adding a number to itself, or the AND / OR / XOR of a register with itself, can get their own name, though they are the same op-code. For the AVRs there is for example:
ADD R,R   ->   LSL R              multiply by 2
AND R,R   ->   TST R              (test the register, especially for zero)
EOR R,R   ->   CLR R              clear a register


This can somewhat confuse disassemblers, as they don't know which way to write them.
These are not really new commands.
With a processor that uses logic to decode instructions, undocumented commands may happen; these could be undefined, possibly unreliable (e.g. driving more registers from the same bus and therefore needing more time, so that it may not work at the same full clock speed), or just rarely useful. Another point with undocumented op-codes is that they may change in future variants (e.g. the NMOS and CMOS versions of the 6502 were a little different in this respect). Some oddities even persist and need extra effort to keep, like the A20 gate with x86 processors.
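Those AVR synonyms can be checked against the encodings in the AVR instruction set manual: ADD Rd,Rr encodes as 0000 11rd dddd rrrr, and LSL Rd is documented with the encoding 0000 11dd dddd dddd, i.e. it is literally ADD Rd,Rd. A quick sketch verifying the two encodings coincide for every register:

```python
# AVR encodings per the instruction set manual:
#   ADD Rd,Rr = 0000 11rd dddd rrrr   (r,d = high bits of the 5-bit numbers)
#   LSL Rd    = 0000 11dd dddd dddd
def encode_add(d, r):
    return (0b000011 << 10) | ((r & 0x10) << 5) | ((d & 0x10) << 4) \
           | ((d & 0x0F) << 4) | (r & 0x0F)

def encode_lsl(d):
    return (0b000011 << 10) | ((d & 0x10) << 5) | ((d & 0x10) << 4) \
           | ((d & 0x0F) << 4) | (d & 0x0F)

# LSL Rd and ADD Rd,Rd assemble to the same 16-bit opcode for every
# register, so a disassembler cannot tell which mnemonic was written.
assert all(encode_add(d, d) == encode_lsl(d) for d in range(32))
```

This is exactly the disassembler ambiguity described above: the binary carries no trace of which mnemonic the programmer used.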
Title: Re: Zilog & Z80 History
Post by: RoGeorge on July 16, 2022, 08:13:41 am
Z80 is not RISC, while AVR is.

For the Z80 there are different machine codes for additions vs rotations, and IIRC the flags are affected differently, too, for a left shift vs an add.  For the curious, here's a brief list of Z80 opcodes and affected flags; I didn't check it but it seems OK:  https://worldofspectrum.org/z88forever/dn327/z80instr.htm#rotshft
Title: Re: Zilog & Z80 History
Post by: iMo on July 16, 2022, 09:02:55 am
I think I've only tested the hidden instructions on the 4 brands I had back then: the Zilog Z80, the Japanese NEC D780, the DDR (former East Germany) MME U880, and the Romanian MMN80-CPU, and maybe 1-2 more from the Eastern European bloc countries, I don't recall for sure; all DIL 40 pins.

What I remember for sure is that all the models I tested behaved the same, including when running the hidden instructions.  There might be some out there that don't, just that I never met any.
I collected a couple of U880s (plus some "Z80 DDR" peripheral chips too) at that time, but never actually used them. Never heard about the MMN80-CPU, though.. Interesting - so many clones were produced in the Eastern European Bloc (also clones of the 8080, 8086, 8048, etc).

PS: they started to clone the ARM chip in '87 - '88 or so (as it was decided the ARM architecture would be the architecture for small computers in the Eastern Bloc), but with the end of the Eastern European Bloc in '89 they never finished it, afaik..
Title: Re: Zilog & Z80 History
Post by: m k on July 16, 2022, 09:19:05 am
Clones, not copies.
East had no such restrictions.

I remember that all values were present, or not blocked.
So A = A + A was present, like all similar variations of all instructions.

Wait, what? ADD A,A is not documented?

Admittedly it's a long long time since I saw a real z80 manual, and I just go off sites such as https://clrhome.org/table/ (https://clrhome.org/table/) and http://z80-heaven.wikidot.com/instructions-set (http://z80-heaven.wikidot.com/instructions-set) and it's right there at 87. Same for ADD HL,HL and ADD IX,IX and ADD IY,IY.

I don't know why they wouldn't be official instructions. The semantics are obvious, they are useful, and it's probably harder to not have them than to allow them.

This looks like an official Intel manual for the 8080 and the section "REGISTER OR MEMORY TO ACCUMULATOR INSTRUCTIONS" lists A as a valid source operand (as well as the implied destination), with code 111, right after 110 for M (which is "(HL)" in Zilog-speak). So you have ADD A, ADC A, XRA A, CMP A etc.

https://altairclone.com/downloads/manuals/8080%20Programmers%20Manual.pdf (https://altairclone.com/downloads/manuals/8080%20Programmers%20Manual.pdf)

It was a bad example.
z80undoc does it much better.
Title: Re: Zilog & Z80 History
Post by: m k on July 16, 2022, 09:21:23 am
I remember that some hardware had shortcuts but can't remember which one.

Ah, meaning that possibly some undocumented instructions could be executing faster than "equivalent" standard instructions?  I have a hard time finding much info on that right now.

I think speed was some other CPU, maybe 6502.

That z80undoc didn't ring a bell much either.
Some IX/IY and addressing-mode stuff came up, but the arguments against useless instructions were stronger.

Z80 is pretty old so I'd say that useless instructions are a byproduct, not something that was wasted.
I understand that Z80 style was already competing internally with 8085 style.
Title: Re: Zilog & Z80 History
Post by: RoGeorge on July 16, 2022, 09:46:44 am
Never heard about MMN80-CPU, though.

That was the Romanian Z80.  The naming convention for the Romanian chips of the Z80 family was MMN80 for the family, followed by the function, e.g. MMN80-SIO for the serial chip, or MMN80-PIO for the Z80 PIO chip.  https://en.wikipedia.org/wiki/MMN80CPU

I suspect the wafers came from East Germany (the DDR), and were only cut and packaged at Microelectronica Romania, but I'm not sure where the dies were made.



Looking at the online pics of the Z80 (CPU): if the Zilog Z80 was always in a white ceramic case, like the DALLAS one, then I never had one, though I've certainly tested SHARP and SGS parts, too; I might still have one of each somewhere.
Title: Re: Zilog & Z80 History
Post by: iMo on July 16, 2022, 09:51:19 am
Never heard about MMN80-CPU, though.

That was the Romanian Z80.
https://en.wikipedia.org/wiki/MMN80CPU

The naming convention was MMN80 for the family, followed by the function, e.g. MMN80-SIO for the serial chip, or MMN80-PIO for the Z80 PIO chip.
..
My bet would be that your MMN80-series clones were made from the DDR's U880+SIO+PIO dies..
Noopy could prove that provided somebody sends him the samples  :D ..
Title: Re: Zilog & Z80 History
Post by: iMo on July 16, 2022, 10:06:47 am
The Eastern European Bloc countries had agreed on some kind of diversification and focus on specialties in their semiconductor and electronics production; sometimes they rebranded others' silicon. It was basically "not allowed or recommended" to develop and produce the same device in several EEB countries, afaik.

PS: info on the USSR's Z80 during and after the end of the EEB (and the east German "connection"):
https://www.cpushack.com/2021/01/26/the-story-of-the-soviet-z80-processor/?hmsr=joyk.com&utm_source=joyk.com&utm_medium=referral (https://www.cpushack.com/2021/01/26/the-story-of-the-soviet-z80-processor/?hmsr=joyk.com&utm_source=joyk.com&utm_medium=referral)
Title: Re: Zilog & Z80 History
Post by: m k on July 16, 2022, 01:54:49 pm
Some may still remember the Verifone Tranz 420 from card payments in restaurants.
That's a souped-up 330, but still a Z80, including PIO and SIO, the actual chip being a Z84C-something.
Title: Re: Zilog & Z80 History
Post by: peter-h on July 18, 2022, 11:01:52 am
Quote
so they were reimplementations and not verbatim clones?

I think the Hitachi 64180 / Zilog Z180 did not implement these undoc opcodes.

And probably none of the "IP blocks" used in FPGAs etc. do either, being written to emulate the Z80 in VHDL or whatever.
Title: Re: Zilog & Z80 History
Post by: RoGeorge on July 18, 2022, 01:05:21 pm
For the docs:
http://www.z80.info/z80undoc.htm (http://www.z80.info/z80undoc.htm)
https://www.z80cpu.eu/mirrors/www.z80.info/ (https://www.z80cpu.eu/mirrors/www.z80.info/)
http://www.bitsavers.org/components/zilog/ (http://www.bitsavers.org/components/zilog/)
Title: Re: Zilog & Z80 History
Post by: peter-h on July 18, 2022, 03:11:28 pm
In that interview (linked further back) the Zilog guy said they intentionally didn't publish these extra instructions, but he doesn't say why.

Perhaps because they had plans for new chips, and the extra opcode space would have been used for new instructions. For example the 64180 (which I am sure was a joint project with Hitachi, because Zilog sold the same chip as the Z180) has an 8x8=16 multiply instruction. Later Zilog did a Z180 with some "improvements", but they broke the UARTs and never fixed them, so they continued to sell the old silicon.
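To see why that 8x8=16 multiply mattered: on a plain Z80 you build wider multiplies out of shift-and-add loops, whereas with a hardware 8x8=16 step you only need four partial products for a 16x16=32 multiply. A minimal sketch (a Python model for illustration, not Zilog code; the function names are made up):

```python
# Model of a Z180-style 8x8=16 multiply step, and how four such partial
# products compose a 16x16=32 multiply -- the operation a plain Z80 has
# to grind out with shift-and-add loops instead.

def mlt(hi, lo):
    """8-bit x 8-bit -> 16-bit product (one hardware multiply step)."""
    return (hi & 0xFF) * (lo & 0xFF)

def mul16(a, b):
    """16x16 -> 32 multiply from four 8x8=16 partial products."""
    ah, al = a >> 8, a & 0xFF
    bh, bl = b >> 8, b & 0xFF
    return ((mlt(ah, bh) << 16) + (mlt(ah, bl) << 8)
            + (mlt(al, bh) << 8) + mlt(al, bl))

print(hex(mul16(0x1234, 0x5678)))  # same as 0x1234 * 0x5678
```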

Then the Z800, which AFAIK never really existed properly in silicon, would have had loads of new instructions.

The Z280 (which I used in production, writing vast amounts of asm code) had loads of very useful new instructions while staying Z80 backward compatible. That was a very good chip, with 16MB physical addressing and an internal MMU, so it would easily have done a TCP/IP stack, with an RTOS mapping in a fresh 64k per task. Very fast too, 2-3 clocks per instruction thanks to the pipeline. It was done by an external subcontractor and was a different design, so it probably would not have had those undocumented opcodes, and anyway many of those opcode slots were taken by new instructions. It did have the low/high halves of IX and IY accessible though, same as the undocumented Z80 opcodes do.

The Z80 etc. was always going to die for higher-end embedded work once 32-bit linear-address parts arrived at the right price.
Title: Re: Zilog & Z80 History
Post by: SiliconWizard on July 18, 2022, 05:52:13 pm
If those undocumented instructions were just byproducts of the design and not intended, I find it reasonable not to publish anything about them. They probably didn't intend to even *validate* those instructions. You don't sell something that hasn't been properly validated.
Title: Re: Zilog & Z80 History
Post by: peter-h on July 19, 2022, 10:03:33 am
They were perfectly good instructions, resulting from the opcode bit-field decoding.

I am certain they were unpublished to preserve space for future use of the opcode space.

They were useful. Both uint32 and float maths were greatly sped up by keeping everything in registers, and IXL/IXH/IYL/IYH would have been very handy. Well, Z80 floats thus implemented were not IEEE-standard floats; those were about 5x slower :)
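The "they fall out of the bit-field decoding" point can be shown concretely. In the LD r,r' group the opcode is 01 ddd sss, and a DD/FD prefix simply redirects H/L to the index-register halves; the decoder was never taught to forbid that combination, which is where LD IXH,IXL and friends come from. A toy Python model of that decoding (my own sketch, simplified: it ignores the real CPU's (HL) -> (IX+d) displacement handling):

```python
# Toy model of Z80 "LD r,r'" decoding (opcode pattern 01 ddd sss).
# A DD/FD prefix redirects H/L to IXH/IXL or IYH/IYL -- nothing in the
# bit-field decode forbids it, hence the undocumented instructions.

REGS = ["B", "C", "D", "E", "H", "L", "(HL)", "A"]  # 3-bit register code

def remap(reg, prefix):
    """Apply the DD/FD prefix substitution to a register name."""
    table = {0xDD: {"H": "IXH", "L": "IXL"},
             0xFD: {"H": "IYH", "L": "IYL"}}
    return table.get(prefix, {}).get(reg, reg)

def decode_ld(opcode, prefix=None):
    """Decode an LD r,r' opcode, optionally preceded by DD/FD."""
    if opcode >> 6 != 0b01:
        raise ValueError("not an LD r,r' opcode")
    dst = remap(REGS[(opcode >> 3) & 7], prefix)
    src = remap(REGS[opcode & 7], prefix)
    return f"LD {dst},{src}"

print(decode_ld(0x65))        # documented:   LD H,L
print(decode_ld(0x65, 0xDD))  # undocumented: LD IXH,IXL
```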
Title: Re: Zilog & Z80 History
Post by: SiliconWizard on July 20, 2022, 12:03:42 am
The point is not whether they were useful or not. Anything extra to be validated is extra cost and effort; if it wasn't planned, then it's perfectly reasonable not to. And since those were probably just a byproduct of the decoding, on top of avoiding useless effort, they probably also wanted not to be *constrained* by those unintended instructions in later revisions of the silicon, which might have implemented the decoding completely differently.

Again, either way, not documenting unintended features especially if you have no intention of supporting those now or in the future is usually the right way to go.
Title: Re: Zilog & Z80 History
Post by: kleiner Rainer on July 20, 2022, 10:07:33 am
Quote
is there any z80 laptop?
are there other z80 vt-terms?

Epson PX-4 and PX-8. The PX-4 can run for many hours on 4 AA cells. I own both; they came from industrial applications: there is a DMM module for the PX-4, and a barcode reader was also available. They run standard CP/M and come with BASIC in ROM. An interesting feature was the way they hibernate: CTRL + power switch to OFF. This shut down the main CPU, but the memory controller kept refreshing the dynamic memory chips at a temperature-dependent refresh rate. Wake-up was nearly instantaneous; cold boot was a little slower, but still much faster than on diskette-based systems, because it booted from ROM.

https://en.wikipedia.org/wiki/Epson_PX-4
https://en.wikipedia.org/wiki/Epson_PX-8_Geneva

Greetings,

Rainer