[solved] how disturbing is x86? Unreal-Mode!!!


DiTBho:
so, you have an architecture that was made to be CP/M compatible, and this explains why 8086 has "Real-Mode", but ... wait, what is "Unreal-Mode"?   :o :o :o

it has been mentioned here and there in Grub (legacy), and it looks to me like a hardware bug in the CPU design, so it won't work on some CPUs(1), but when it works it's an exploit: you switch the CPU into Protected Mode, load your segment registers from descriptors whose limit is higher than 64Kbyte (which updates the hidden descriptor caches), and finally switch the CPU back to Real Mode, and ...

... and boom, the magic power of quirks breaks the 64Kbyte limit of real mode segments while retaining 16-bit instructions and the segment * 16 + offset addressing mode by tweaking the descriptor caches.

* code still limited to 64Kbyte
* data allowed to access the full 4Gbyte
* CPU in real mode!!! 16bit!!!
It's useful when you're trying to load something that will run in 32-bit mode but is larger than what you're allowed to load into "conventional" Real-Mode memory (I mean, without the Unreal-Mode trick). You don't want to bother writing a protected-mode driver yet, but you also want to avoid switching back and forth between real and protected mode just to copy chunks from the conventional-memory buffer into extended memory.
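
Just to make it concrete, here is a minimal sketch in plain C of the two numbers behind the trick: the "flat" data descriptor you load in protected mode (base 0, limit 0xFFFFF with the granularity bit set, i.e. 4Gbyte), and the segment * 16 + offset arithmetic that keeps working afterwards. It only models the values involved; the real mode switch itself has to be done in assembly, and the function names are mine:

--- Code: ---
/* Sketch only: builds the descriptor that stretches the hidden
   descriptor-cache limit, and shows real-mode address arithmetic. */
#include <stdint.h>
#include <stdio.h>

/* 8-byte GDT data descriptor: base 0, limit 0xFFFFF, G=1 (limit
   counted in 4Kbyte pages, so the effective limit is 4Gbyte - 1). */
static uint64_t flat_data_descriptor(void)
{
    uint64_t d = 0;
    d |= 0xFFFFull;          /* limit bits 15:0                 */
    d |= 0xFull  << 48;      /* limit bits 19:16                */
    d |= 0x92ull << 40;      /* present, ring 0, writable data  */
    d |= 0xCull  << 52;      /* flags: G=1, 32-bit              */
    return d;                /* base fields all zero            */
}

/* Effective limit from the fields the descriptor cache keeps. */
static uint64_t effective_limit(uint64_t d)
{
    uint64_t limit = (d & 0xFFFF) | (((d >> 48) & 0xF) << 16);
    int g = (int)((d >> 55) & 1);
    return g ? (((limit + 1) << 12) - 1) : limit;
}

/* Real-mode addressing, unchanged by the trick. */
static uint32_t linear(uint16_t seg, uint32_t off)
{
    return ((uint32_t)seg << 4) + off;
}

int main(void)
{
    uint64_t d = flat_data_descriptor();
    printf("descriptor   : %#018llx\n", (unsigned long long)d);
    printf("cached limit : %#llx\n", (unsigned long long)effective_limit(d));
    /* once the cache holds that limit, a 32-bit offset passes the
       limit check even though code execution stays 16-bit: */
    printf("DS=0, off=2Mbyte -> linear %#x\n",
           (unsigned)linear(0x0000, 0x200000));
    return 0;
}
--- End code ---

Presumably, on the CPUs where the trick fails, the hidden limit simply doesn't survive the switch back to real mode.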




(1) tested on
- PC, AMI BIOS, Intel 486-DX2 ---> it worked
- PC, Phoenix BIOS, Intel Pentium II ---> it worked
- Apple Mac Mini, Apple BIOS, Intel Core 2 Duo ---> it worked
- Acorn RiscPC/x86_guest_card, A1 BIOS, AMD 586 ---> it worked
- Acorn RiscPC/x86_guest_card, A1 BIOS, Cyrix Cx486 ---> it didn't work!!! <---- Is it a problem?!?
- Acorn RiscPC/x86_guest_card, A1 BIOS, Cyrix Cx5x86 ---> it didn't work!!! <---- Is it a problem?!?

untested on Xeon, Opteron, etc  :-//


edit:
solved

SiliconWizard:
Ouch. And people whine about Intel for considering getting past all that.
 :-DD

SiliconWizard:
Well, depends on how you look at it.
Their success, at least for a few decades, has largely been due to their high degree of backwards compatibility. One can always claim in retrospect that one could have done better, but that's just speculation.
Brain-dead engineers? Uh. A bit of humility never hurts.

As I said before, pretty much every time they tried to depart from that a bit too much, they failed. And no, that wasn't just because what they came up with was inferior. It's mainly because that's not what Intel's customers were looking for.
Trapped.

DiTBho:
See how other people, those who know these things in detail, title the problem: "A20 - a pain from the past", link here

And it's referred to as "nonsense"


--- Quote ---The 8088 in the original PC had only 20 address lines, good for 1 MB. The maximum address FFFF:FFFF addresses 0x10ffef, and this would silently wrap to 0x0ffef. When the 286 (with 24 address lines) was introduced, it had a real mode that was intended to be 100% compatible with the 8088. However, it failed to do this address truncation (a bug), and people found that there existed programs that actually depended on this truncation. Trying to achieve perfect compatibility, IBM invented a switch to enable/disable the 0x100000 address bit. Since the 8042 keyboard controller happened to have a spare pin, that was used to control the AND gate that disables this address bit. The signal is called A20, and if it is zero, bit 20 of all addresses is cleared.

Present
Why do we have to worry about this nonsense? Because by default the A20 address line is disabled at boot time, so the operating system has to find out how to enable it, and that may be nontrivial since the details depend on the chipset used.

--- End quote ---
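
Just to make the wraparound concrete: the A20 gate is literally bit 20 of the address being forced to zero, so the arithmetic from the quote fits in a few lines of C (sketch of mine):

--- Code: ---
/* FFFF:FFFF = 0x10FFEF on a 286+, but aliases 0x0FFEF on an 8088,
   and again on a 286+ whenever the A20 gate is disabled. */
#include <stdint.h>
#include <stdio.h>

static uint32_t linear(uint16_t seg, uint16_t off)
{
    return ((uint32_t)seg << 4) + off;
}

int main(void)
{
    uint32_t a = linear(0xFFFF, 0xFFFF);
    printf("A20 enabled : %#x\n", (unsigned)a);                 /* 0x10ffef */
    printf("A20 disabled: %#x\n", (unsigned)(a & ~(1u << 20))); /* 0xffef   */
    return 0;
}
--- End code ---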

When I updated my RiscPC to ROM v4.39 and a faster companion x86 card, I spent 2 days fixing this nonsense for the x86-CPU guest card; the previous BIOS was incompatible with "!PC-v308", which was itself incompatible with the new 586 guest cards (they use a different ASIC), and again, I found a dummy keyboard controller mapped to the bridge just to allow DOS32 to see its 32Mbyte partition of the 128Mbyte system RAM.
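
For reference, the classic way a loader flips that A20 bit through the 8042 keyboard controller is the sequence below. This is a bare-metal sketch in C with GCC inline asm; it only makes sense in ring 0 or plain DOS, and the helper names are mine:

--- Code: ---
/* Enable A20 via the 8042 "write output port" command (0xD1).
   0xDF sets bit 1 (A20) while keeping bit 0 (CPU reset) high. */
#include <stdint.h>

static inline void outb(uint16_t port, uint8_t val)
{
    __asm__ volatile ("outb %0, %1" : : "a"(val), "Nd"(port));
}

static inline uint8_t inb(uint16_t port)
{
    uint8_t v;
    __asm__ volatile ("inb %1, %0" : "=a"(v) : "Nd"(port));
    return v;
}

static void kbc_wait(void)
{
    while (inb(0x64) & 0x02)   /* bit 1: input buffer still full */
        ;
}

void enable_a20_via_kbc(void)
{
    kbc_wait();
    outb(0x64, 0xD1);          /* command: write 8042 output port */
    kbc_wait();
    outb(0x60, 0xDF);          /* new output port value: A20 on   */
    kbc_wait();
}
--- End code ---

And the quote's point stands: this is only one of several methods, and which one actually works depends on the chipset.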


Anyway, the Unreal-Mode trick doesn't work with my Cyrix CPUs  :-//

james_s:
I feel like a broken record saying this, but backward compatibility is king, and nobody outside of a very rare few who do low-level development cares what real mode is. Many really cool and innovative computers have appeared over the years. The Amiga (I have one) was fantastic, superior to the DOS PCs of the day, but it couldn't run Lotus 1-2-3 or Microsoft Office. The Macintosh (I have several) was way ahead also; it could run Microsoft Office fairly early on, but it was not affordable to the masses and it did not cater to game developers, which everyone knows is what really drove the home PC market despite every kid saying they wanted one to do their homework. The BeBox was a neat and innovative machine, but there was virtually no software for it, so that one went nowhere. The NeXT was a neat machine, state of the art at the time, but it had a workstation-class price tag that made the Macintosh look inexpensive, and again there was not really any software beyond what came with it.

None of these offered backward compatibility with the established platforms, and thus all of them failed in the market. As with smartphones, there is really only room for 2-3 platforms; anything else is doomed to fail and remain a niche. There is a chicken & egg problem where few are going to develop software for a platform that hardly anyone is using, and hardly anyone will use a platform that doesn't have a sizable library of software.

There's nothing "disturbing" at all about x86. It has a lot of baggage, but that's because it has long offered one thing that no other platform can: seamless compatibility with an enormous range of software. You could buy a 286 that could run almost everything you ran on your 8088, you could buy a 386 that ran almost everything that ran on your 286, you could buy a 486 that ran almost everything you ran on your 386, and so on. Given the historically high cost of software, that was an *enormous* advantage, and to a large degree it still is.

For all but maybe 1,000 people on the entire planet, most of the x86 baggage is totally irrelevant; they never see any of it. Most technical users never have to dive that deep, and non-technical users aren't even vaguely aware it exists, and they don't care. Why should they? They just want to use their computer; they don't care how it works. It doesn't matter what whiz-bang innovations are lurking under the hood: if it can't run the software a person wants to use, then it's useless. You must recognize that you are part of an extremely small and exclusive club of people who ever touch any of the low-level stuff, and you cannot possibly expect the vast majority of people to care. It simply doesn't matter to them, because they will never see any of the ugliness; they don't have a clue how a computer works internally, nor are they interested in knowing, and there's nothing wrong with that.

For some other platform to become the de facto standard, it will have to offer something tangible that normal people care about. For example, it will have to be much cheaper than other solutions, or lower power; the latter is what has led to the success of ARM in the smartphone market. But smartphone users don't care about the gritty details of the architecture either; they just care about having a powerful device that fits in their pocket, goes a reasonable amount of time on a charge, and runs all the popular apps.
