Author Topic: The phasing out of 32 bit  (Read 10644 times)

0 Members and 1 Guest are viewing this topic.

Offline Ed.Kloonk

  • Super Contributor
  • ***
  • Posts: 4000
  • Country: au
  • Cat video aficionado
Re: The phasing out of 32 bit
« Reply #50 on: June 10, 2021, 01:53:31 am »
Hey, confirm my suspicion.

I suspect that this move away from 32-bit is centered only on the Arm architecture and its ever more prevalent in-built crypto. Supporting 32-bit backwards compatibility is a ball and chain.

iratus parum formica
 

Offline DiTBho

  • Super Contributor
  • ***
  • Posts: 3909
  • Country: gb
Re: The phasing out of 32 bit
« Reply #51 on: June 10, 2021, 09:54:45 am »
Linux fails to compile on mips with gcc v10.2.0.
I see the same issue too, but only when compiling natively on a mips64r2 machine.
Cross-compiling on an x86-64 box works nicely.

At first I thought it was a problem with setting the "cross_compiling" flag in ./Makefile.
But that's not sufficient.

mips-gcc and mips64-gcc are separate compilers.

I don't have a 64-bit MIPS userspace yet (just the kernel):
- kernel: 64-bit
- userspace: 32-bit

This means that all builds on my MIPS machines are 32-bit, cross-compiling to a mips64 kernel if requested in the .config.

Worse still, some files like asm-offsets.c are still preprocessed with the 32-bit compiler, i.e. with gcc, not mips-gcc.

This introduces a lot of flaws and extra work to double-check everything twice; basically, it's prone to failure.


Crazy?


We have a lot of things to fix  :D
The opposite of courage is not cowardice, it is conformity. Even a dead fish can go with the flow
 

Offline nigelwright7557

  • Frequent Contributor
  • **
  • Posts: 689
  • Country: gb
    • Electronic controls
Re: The phasing out of 32 bit
« Reply #52 on: June 10, 2021, 10:08:26 am »
Microchip's MPLAB X has moved up to 64 bits, and with that they threw out MPASM, which is a disaster for some people.
As for backwards compatibility, Windows 10 still uses the DOS-style command prompt!
It's still used as a shell for npm, Node.js, etc.



 

Offline NiHaoMikeTopic starter

  • Super Contributor
  • ***
  • Posts: 9007
  • Country: us
  • "Don't turn it on - Take it apart!"
    • Facebook Page
Re: The phasing out of 32 bit
« Reply #53 on: June 10, 2021, 10:39:40 pm »
It certainly wasn't the case when x86-64 was introduced 20 years ago. The absolute first priority was to run existing 32b systems and existing 32b software under 64b systems with maximum performance. Nobody would buy those CPUs if they were slower than Intel's.

As for die area savings, this annotated photograph of a VIA Nano CPU from some 10 years ago has been available for a while:
https://www.viagallery.com/isaiah-architecture/
As you can see, almost half of it is cache to begin with. SIMD would be unaffected by dropping 32b support, load/store probably too. You would only be fighting for some simplifications of the bottom right corner, and perhaps not a great one, because a lot of those transistors are devoted to the complex task of tracking dependencies, scheduling and reordering instructions.
In the beginning, it certainly made sense to have the transition as smooth as possible. But the transition to 64 bit being mainstream was over a long time ago, and the question now is whether it's a good time to start transitioning to 64-bit-only CPUs. The video mentioned that the x86 CPUs in modern game consoles are 64-bit only; not sure if they have any actual hardware differences.

The savings that matter aren't the silicon area but rather the complexity of the logic that needs to run as fast and efficiently as possible.
Microchip's MPLAB X has moved up to 64 bits, and with that they threw out MPASM, which is a disaster for some people.
As for backwards compatibility, Windows 10 still uses the DOS-style command prompt!
It's still used as a shell for npm, Node.js, etc.
I only did a quick search on MPASM but it looks like they have a way to migrate projects to a new assembler?

I haven't used the "ghetto shell" in Windows for many years, Powershell is the modern replacement.
Cryptocurrency has taught me to love math and at the same time be baffled by it.

Cryptocurrency lesson 0: Altcoins and Bitcoin are not the same thing.
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 16607
  • Country: us
  • DavidH
Re: The phasing out of 32 bit
« Reply #54 on: June 11, 2021, 04:06:46 am »
I suspect that this move away from 32-bit is centered only on the Arm architecture and its ever more prevalent in-built crypto. Supporting 32-bit backwards compatibility is a ball and chain.

ARM does not have a significant installed base of 32-bit software to preserve, and this is not the first time ARM has dropped backwards compatibility, which did not help it succeed as a desktop processor.
 

Online brucehoult

  • Super Contributor
  • ***
  • Posts: 4028
  • Country: nz
Re: The phasing out of 32 bit
« Reply #55 on: June 11, 2021, 04:36:10 am »
I suspect that this move away from 32-bit is centered only on the Arm architecture and its ever more prevalent in-built crypto. Supporting 32-bit backwards compatibility is a ball and chain.

ARM does not have a significant installed base of 32-bit software to preserve, and this is not the first time ARM has dropped backwards compatibility, which did not help it succeed as a desktop processor.

Say what?

ARM has been making 32 bit processors for 35 years and they and their customers have a HUGE installed base of 32 bit software, both embedded and Linux.

RISC-V would be the one without a significant installed base of 32 bit software, at least for Linux, as 64 bit has been the focus for applications processors from the start.
 

Offline DiTBho

  • Super Contributor
  • ***
  • Posts: 3909
  • Country: gb
Re: The phasing out of 32 bit
« Reply #56 on: June 11, 2021, 08:05:21 am »
He probably meant "Wintel" (Windows + Intel x86) legacy  :-//

The aggressive commercial policy of Microsoft, coupled with Intel's interest in ruling the marketplace.
The opposite of courage is not cowardice, it is conformity. Even a dead fish can go with the flow
 

Offline DiTBho

  • Super Contributor
  • ***
  • Posts: 3909
  • Country: gb
Re: The phasing out of 32 bit
« Reply #57 on: June 11, 2021, 08:19:01 am »
HUGE installed base of 32 bit software, both embedded and Linux.

DOS has a huge installed base of 16-bit software.
Windows has a huge installed base of 32-bit software.
Mobile devices have a huge installed base of ARM software (mostly Android/Arm based).

Linux/ARM and RISC-OS/ARM (the latter mainly used in the UK) are, in this scenario, like a grain of sand on a beach. And as for workstation and server stuff, due to the nature of "free software", things that run on Linux/ARM can easily be migrated to Linux/x86, which is mainstream with a much larger user base.

I can say this based on what I see in repositories.
The opposite of courage is not cowardice, it is conformity. Even a dead fish can go with the flow
 

Offline magic

  • Super Contributor
  • ***
  • Posts: 6758
  • Country: pl
Re: The phasing out of 32 bit
« Reply #58 on: June 11, 2021, 08:22:46 am »
Say what?

ARM has been making 32 bit processors for 35 years and they and their customers have a HUGE installed base of 32 bit software, both embedded and Linux.
He is right, though. No customer will buy a 64-bit refrigerator expecting it to run his old 32-bit refrigerator firmware. Meanwhile, refrigerator vendors have all the means to port their firmware to 64 bit to take advantage of 64-bit features, or to just keep buying 32-bit ARM to save money. ARM's installed base is either throwaway or de facto rented software, unlike x86, where you own software binaries and want them to run faster on your new and faster machine.
 

Offline newbrain

  • Super Contributor
  • ***
  • Posts: 1719
  • Country: se
Re: The phasing out of 32 bit
« Reply #59 on: June 11, 2021, 03:33:21 pm »
Windows 10 still uses command prompt DOS mode !
It depends what you mean by "DOS mode".
  • There has been no DOS code in Windows for many years: cmd.exe is a 64-bit executable (or 32-bit on 32-bit Windows).
  • There hasn't even been a 16-bit subsystem in 64-bit versions of Windows, since at least Windows 7.
  • The command interpreter (cmd.exe) understands so many more internal commands compared to DOS command.com it's not even funny (conditionals, && and ||, string handling, etc.).
    I use PowerShell 99% of the time, but sometimes cmd is quicker to load and still quite useful.
But yes, most batch files of MS-DOS era might still work in cmd.exe, as the syntax is largely the same.
This is a testament to backwards compatibility.
Just an example off the top of my head:
Code: [Select]
C:\Users\newbrain>set /a (3+4)*10
70
C:\Users\newbrain>set /P VAR1="Please enter var1: " && echo %VAR1%
Please enter var1: 1234
1234
Nandemo wa shiranai wa yo, shitteru koto dake.
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26891
  • Country: nl
    • NCT Developments
Re: The phasing out of 32 bit
« Reply #60 on: June 11, 2021, 05:01:24 pm »
Say what?

ARM has been making 32 bit processors for 35 years and they and their customers have a HUGE installed base of 32 bit software, both embedded and Linux.
He is right, though. No customer will buy a 64-bit refrigerator expecting it to run his old 32-bit refrigerator firmware. Meanwhile, refrigerator vendors have all the means to port their firmware to 64 bit to take advantage of 64-bit features, or to just keep buying 32-bit ARM to save money. ARM's installed base is either throwaway or de facto rented software, unlike x86, where you own software binaries and want them to run faster on your new and faster machine.
I agree. For embedded devices, where ARM rules the world, binary backward compatibility is not necessary. Changing to a different platform requires recompiling the software anyway. This is already the case for microcontrollers, for which there are at least a dozen different ARM cores in use.
« Last Edit: June 11, 2021, 05:05:23 pm by nctnico »
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6239
  • Country: fi
    • My home page and email address
Re: The phasing out of 32 bit
« Reply #61 on: June 12, 2021, 04:58:52 am »
I do not believe for a second that ARM Cortex M processor family is being "phased out".
It's like claiming English is being phased out, because more and more people speak X.

Just look at the existing 8-bit processor families, which are still used in new designs.
And they will be used in new designs for as long as it makes commercial sense.
For obvious reasons, the 32-bit processor niche is much, much larger.

It is, however, true that with at least the GNU toolchain, it is relatively painless to move between hardware implementations; and in that sense, those looking for ARM chips for new designs don't care that much about backwards compatibility.

Nevertheless, the existing codebase for ARMv6-M to ARMv8.1-M is so large, that existing product lines are easier to upgrade with somewhat compatible newer processors.  And this is a very big reason why ARM Cortex M 32-bit family at least will not be going anywhere soon.

If we extend this to 32-bit processors in general, we'd need to look at the business cases where a 64-bit processor would be a hindrance compared to 32-bit; if for nothing else, then because of the added code size, or the need to port the existing codebase to 64-bit compatibility (which traditionally has revealed quite a lot of idiotic assumptions programmers make, which have to be fixed in such transitions).

The vast majority of processors are in embedded devices.  Cellphones are just one class, albeit a big one; and because of their multipurpose use, they probably do benefit from 64-bit support.  But what about the display controllers ("graphics cards") in them?  The biggest scalars those use are 32-bit, so 32-bit plus SIMD extensions on vectors makes the most sense.  And what about modems, routers, TVs, storage devices, and so on?
A vast majority of embedded devices at this point gain basically nothing from having 64-bit support.  Typical routers, modems, TVs, etc. that you do not notice have 32 to 256 megabytes of memory, and basically gain nothing from having more than that; they just aren't hitting the limits of 32 bits.  (Again, multifunction devices do differ.  And perhaps programmers are getting worse year after year, so that it makes more economical sense to buy more powerful hardware, so that even bloated, crappy code works, somewhat, on it.  After all, Microsoft has already managed to convince at least one generation of humans that devices are supposed to crash every now and then for no discernible reason.)

One should examine what kinds of processors human interface peripherals – mice, keyboards, etc. – still use when considering the above question.  They do not need more than a few hundred bytes of memory, plus full native USB support, so the vast majority of them run on some 8-bit processor.  Why would they move to 32-bit?  The same goes for currently 32-bit appliances that are not hitting any limitations for the kinds of workloads customers have.

Put simply, I think the claims in the initial post are complete bullshit by someone who can only perceive a small slice of the world and believes that is everything there is, and thinks it is his Dog-given mission to convince everybody else.  Or at least get them to part with their money.  The only reason the video exists is to gain views by sowing discord (like this thread here), just like news no longer report actual events or facts, but just spin everything in an effort to rile people up.
« Last Edit: June 12, 2021, 05:01:05 am by Nominal Animal »
 

Offline Siwastaja

  • Super Contributor
  • ***
  • Posts: 8167
  • Country: fi
Re: The phasing out of 32 bit
« Reply #62 on: June 12, 2021, 09:17:58 am »
For embedded, and by embedded I don't mean smartphones*, 32-bit MCUs are, depending on the case, either extremely overkill or just slightly overkill most of the time. There is very little to gain from going to 64 bits. Some embedded algorithms do need 64-bit number crunching, but that is very efficient on a 32-bit core by just upping the frequency a bit and "software emulating" the 64-bit arithmetic; and, for example, a Cortex-M7 already includes 64-bit features like a 64-bit-wide memory data bus and a few instructions operating on 64-bit data, while still being a nominally 32-bit CPU.

32-bit is kind of sweet spot by being able to satisfy 99.99% of embedded needs while not being too much of an overkill for the smallest tasks. You can have a Cortex M0 for some 30-40 cents. An 8-bit PIC won't be any/much cheaper.

In classical embedded the memory requirements seem to vary from a few bytes to maybe half a gigabyte max. Special applications requiring more are expensive anyway and can just choose to use FPGAs, custom ASICs or, of course, just general purpose desktop CPUs.

*) IMHO, smartphones are exactly what desktop computing is, just not physically on a desk: general-purpose computing. Smartphones have a somewhat different use case, namely mostly rented-out, simplified, single-purpose entertainment software, which enables recompiling for changing architectures; but from a CPU capability point of view, they benefit from all the general-purpose performance-enhancing features exactly like classic desktop CPUs do, making the transition to 64 bits appealing. The same can't be said about microcontrollers in refrigerators, routers or ECUs. These applications naturally fit into 8- to 32-bit architectures, and anything in excess is waste.
 
The following users thanked this post: SilverSolder, newbrain, DiTBho

Offline DiTBho

  • Super Contributor
  • ***
  • Posts: 3909
  • Country: gb
Re: The phasing out of 32 bit
« Reply #63 on: June 12, 2021, 11:44:35 am »
32-bit is kind of sweet spot by being able to satisfy 99.99% of embedded needs

Yup, indeed  :D
The opposite of courage is not cowardice, it is conformity. Even a dead fish can go with the flow
 

Offline DiTBho

  • Super Contributor
  • ***
  • Posts: 3909
  • Country: gb
Re: The phasing out of 32 bit
« Reply #64 on: June 12, 2021, 11:52:17 am »
I do not believe for a second that ARM Cortex M processor family is being "phased out".
It's like claiming English is being phased out, because more and more people speak X.

LOL, the more Russian people I meet, the more it seems that Soviet dialects are spoken by more people than English. Going by that, if I were an "A.I." programmed with such statistics-based deductive logic, I would infer that "English is being phased out" because more and more of the people I meet speak Soviet dialects ;D


(funny, there really are A.I. programs that think that way, with a statistics-based deductive logic)
The opposite of courage is not cowardice, it is conformity. Even a dead fish can go with the flow
 

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14445
  • Country: fr
Re: The phasing out of 32 bit
« Reply #65 on: June 12, 2021, 05:13:55 pm »
Of course, 32-bit is likely not going away anytime soon for small embedded stuff (microcontrollers).

Now for "desktop" and "server" applications, 32-bit has already been largely phased out.

And I agree with Siwastaja, powerful mobile devices such as smartphones and tablets are just small computers in disguise. Given that even mid-range phones these days have 6GB or 8GB of RAM, going 64-bit makes sense. (Now whether having that much power and memory in a phone makes sense, that's another story entirely, but that's just the way it is now.)
 

Offline SilverSolder

  • Super Contributor
  • ***
  • Posts: 6126
  • Country: 00
Re: The phasing out of 32 bit
« Reply #66 on: June 12, 2021, 09:07:33 pm »
Of course, 32-bit is likely not going away anytime soon for small embedded stuff (microcontrollers).

Now for "desktop" and "server" applications, 32-bit has already been largely phased out.

And I agree with Siwastaja, powerful mobile devices such as smartphones and tablets are just small computers in disguise. Given that even mid-range phones these days have 6GB or 8GB of RAM, going 64-bit makes sense. (Now whether having that much power and memory in a phone makes sense, that's another story entirely, but that's just the way it is now.)

To the last point, I think it does make sense to have serious amounts of RAM in a mobile platform, simply because all the active apps tend to be in memory at the same time in order to keep the thing responsive to the user (loading from Flash memory can be slow).  I'm not convinced that having a boatload of RAM necessitates a 64-bit data word size, though...  that seems to have big potential for wasting resources more than anything else.

Modern programmers / dev environments appear to be largely indifferent to efficiency, irrespective of 32 bit vs. 64 bit...




 

Offline magic

  • Super Contributor
  • ***
  • Posts: 6758
  • Country: pl
Re: The phasing out of 32 bit
« Reply #67 on: June 12, 2021, 10:12:12 pm »
Would you rather have it with x86-style segmentation or AVR-style X,Y,Z registers? :D

(I would probably prefer the latter, perhaps because it wouldn't be me to implement the hardware).
 

Online brucehoult

  • Super Contributor
  • ***
  • Posts: 4028
  • Country: nz
Re: The phasing out of 32 bit
« Reply #68 on: June 12, 2021, 10:42:10 pm »
32-bit is kind of sweet spot by being able to satisfy 99.99% of embedded needs while not being too much of an overkill for the smallest tasks. You can have a Cortex M0 for some 30-40 cents. An 8-bit PIC won't be any/much cheaper.

Waaaaay out, if you're talking about embedded cores doing some controlling task inside some chip. By a factor of 1000.

A Cortex-M0+ is 0.009 mm^2 on 40nm or 0.035 mm^2 on 90nm. [1]

TSMC price for a 300 mm wafer in 2020 was $2274 for 40nm and $1650 for 90nm. [2]

That makes the cost for the core 0.03 cents on 40nm or 0.08 cents on 90nm.

An 8051 or PIC will cost quite a lot less, while a 64-bit M0+ would cost a little under twice as much. If ARM offered 64-bit M0s, which they don't. But SiFive do (with the RV64I or RV64E instruction set, of course).


If your idea of "embedded" is buying off-the-shelf chips and assembling them onto a board then, yeah, you can get any of 8-, 16-, or 32-bit cores with some SRAM and flash and peripherals, in a package, for 30-40 cents. A 64-bit chip wouldn't cost any more to make. The packaging costs are the same, and most of the silicon area is taken up by the pads, not the core.

There's little to no advantage in putting a 64-bit core in such a packaged chip, because you won't have 4 GB of RAM inside it and you don't have an external address bus. You quite likely do have more than 64 KB, though. And you often need numbers bigger than 255 or 65535, which take multiple instructions to process on an 8- or 16-bit core, using more energy than a single instruction on a 32-bit core.

But for little embedded cores doing some task inside a larger chip, such as controlling a SERDES, or 5G radio, or any other peripheral that needs real-time supervision, if the main application processors are 64 bit and the addressing inside the chip is 64 bit, then it makes a lot of sense to use a "64 bit M0" for the minions.

[1] https://www.anandtech.com/show/8400/arms-cortex-m-even-smaller-and-lower-power-cpu-cores
[2] https://www.tomshardware.com/news/tsmcs-wafer-prices-revealed-300mm-wafer-at-5nm-is-nearly-dollar17000
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 16607
  • Country: us
  • DavidH
Re: The phasing out of 32 bit
« Reply #69 on: June 13, 2021, 10:14:22 am »
For deep embedded applications, I have already run into problems using ARM microcontrollers to replace 8 and 16 bit parts; none of them support the lowest power applications as well.

Another complication is that microcontrollers must be built on larger legacy processes to support embedded memory, which prevents ARM microcontrollers from taking advantage of greater integration from a denser process.
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6239
  • Country: fi
    • My home page and email address
Re: The phasing out of 32 bit
« Reply #70 on: June 13, 2021, 11:36:01 am »
So, wouldn't the proper statement here be more like "We now have a variety of processor size classes to choose from", then?

It's not like 32 bit hardware is going away; it's more that we now have 64-bit hardware that can be used for the tasks where we hit 32-bit limits.
 

Offline AntiProtonBoy

  • Frequent Contributor
  • **
  • Posts: 988
  • Country: au
  • I think I passed the Voight-Kampff test.
Re: The phasing out of 32 bit
« Reply #71 on: June 14, 2021, 04:55:47 am »
Intel, Transmeta, and DEC all tried that and failed, and I think Apple's attempts hurt them more than they helped.  Maybe these were all implementation failures, but if every attempt has failed, then that argues that the concept is flawed.
Transmeta didn't really fail, per se. They had a fairly successful prototype showcasing the proof of concept. Their issue was more to do with patents and the use of the x86 instruction set. Intel took them to court, and the case was settled out of court in the end, on terms that basically killed the project.
 

Online ejeffrey

  • Super Contributor
  • ***
  • Posts: 3713
  • Country: us
Re: The phasing out of 32 bit
« Reply #72 on: June 14, 2021, 04:09:49 pm »
Would you rather have it with x86-style segmentation or AVR-style X,Y,Z registers? :D

(I would probably prefer the latter, perhaps because it wouldn't be me to implement the hardware).

I suspect what SilverSolder was suggesting was something like PAE: 32-bit virtual memory addresses with 40+ bit physical addresses and page table entries.  x86 has had this since the Pentium Pro.  With it, an individual application on a 32-bit CPU can still only access ~4 GiB of RAM at once, but the OS can manage up to 64 GiB -- allowing for multiple large processes, or additional memory for disk cache.  It makes more work for the kernel, which cannot directly access all physical memory at once, but it provides a relatively simple way to use more than 4 GB of memory without requiring changes to user-mode code.
 
The following users thanked this post: SilverSolder

Offline SilverSolder

  • Super Contributor
  • ***
  • Posts: 6126
  • Country: 00
Re: The phasing out of 32 bit
« Reply #73 on: June 14, 2021, 06:47:03 pm »
Would you rather have it with x86-style segmentation or AVR-style X,Y,Z registers? :D

(I would probably prefer the latter, perhaps because it wouldn't be me to implement the hardware).

I suspect what SilverSolder was suggesting was something like PAE: 32-bit virtual memory addresses with 40+ bit physical addresses and page table entries.  x86 has had this since the Pentium Pro.  With it, an individual application on a 32-bit CPU can still only access ~4 GiB of RAM at once, but the OS can manage up to 64 GiB -- allowing for multiple large processes, or additional memory for disk cache.  It makes more work for the kernel, which cannot directly access all physical memory at once, but it provides a relatively simple way to use more than 4 GB of memory without requiring changes to user-mode code.

This kind of stuff was supported back in the days of MS Server 2003...   it worked well!

Speaking of which, I recently spun up a copy of Server 2003 in a virtual machine.  The performance was stunning...   just so lean and mean!

 

Offline magic

  • Super Contributor
  • ***
  • Posts: 6758
  • Country: pl
Re: The phasing out of 32 bit
« Reply #74 on: June 14, 2021, 10:04:34 pm »
Okay, I suppose PAE is an option too.

But I just happen to use one old 32 bit machine for web browsing sometimes and let me tell you - a dozen or two tabs, particularly with images, a few weeks of uptime, and then despite a few gigs of RAM and swap available, the browser runs out of 32 bit address space and crashes.
|O
 

