Author Topic: Is the Byte obsolete?  (Read 4618 times)


Offline ConnecteurTopic starter

  • Regular Contributor
  • *
  • Posts: 231
  • Country: 00
Is the Byte obsolete?
« on: July 17, 2021, 06:10:26 pm »
Memory and processing are still expressed in bytes (kilobytes, megabytes, etc.), but how useful is an 8-bit word nowadays? It gives us 256 data states, but most applications now require more. Are there names for 16-bit, 32-bit and 64-bit words?
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 8042
  • Country: gb
Re: Is the Byte obsolete?
« Reply #1 on: July 17, 2021, 06:29:53 pm »
Let us from now on describe memory sizes in kibilonglongs.
 
The following users thanked this post: bd139

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11638
  • Country: us
    • Personal site
Re: Is the Byte obsolete?
« Reply #2 on: July 17, 2021, 06:31:07 pm »
Half word, word and double word are commonly used names for 16-, 32- and 64-bit quantities. And quad word for 128 bits. So we are covered for quite some time.

This is a really strange question. No, the byte is not obsolete, since there is still a lot of stuff designed around 8-bit bytes.
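
The names do vary by architecture, though: on x86 a "word" is 16 bits with "dword"/"qword" for 32/64, while ARM and most RISC ISAs use halfword/word/doubleword for 16/32/64. In portable C you usually just spell the sizes out with the fixed-width types. A minimal sketch:

#include <stdint.h>
#include <stdio.h>

/* The fixed-width types corresponding to the sizes named above,
 * regardless of what a particular architecture calls a "word". */
int main(void)
{
    uint8_t  byte_v   = 0xFF;                   /* 8 bits,  256 states      */
    uint16_t half_v   = 0xFFFF;                 /* 16 bits ("word" on x86)  */
    uint32_t word_v   = 0xFFFFFFFFu;            /* 32 bits ("dword" on x86) */
    uint64_t double_v = 0xFFFFFFFFFFFFFFFFULL;  /* 64 bits ("qword" on x86) */

    printf("%zu %zu %zu %zu\n", sizeof byte_v, sizeof half_v,
           sizeof word_v, sizeof double_v);     /* prints: 1 2 4 8 */
    return 0;
}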
« Last Edit: July 17, 2021, 06:33:04 pm by ataradov »
Alex
 

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 15127
  • Country: fr
Re: Is the Byte obsolete?
« Reply #3 on: July 17, 2021, 07:11:14 pm »
Not being able to address bytes - at least not remotely efficiently, since you could always do it using bitwise operations - would be a real pain in many applications.
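
To illustrate that emulation, a hypothetical C sketch of byte access on a machine that can only load and store aligned 32-bit words (little-endian byte order within the word is assumed): every byte read becomes a shift and mask, and every byte write becomes a read-modify-write.

#include <stddef.h>
#include <stdint.h>

/* Hypothetical word-addressed machine: only whole 32-bit words can be
 * loaded or stored, so byte accesses must be synthesized. */
static uint8_t load_byte(const uint32_t *words, size_t byte_index)
{
    uint32_t w     = words[byte_index / 4];   /* word holding the byte */
    unsigned shift = (byte_index % 4) * 8;    /* byte position in word */
    return (uint8_t)((w >> shift) & 0xFFu);
}

static void store_byte(uint32_t *words, size_t byte_index, uint8_t value)
{
    unsigned shift = (byte_index % 4) * 8;
    uint32_t w     = words[byte_index / 4];   /* read...                */
    w = (w & ~(0xFFu << shift)) | ((uint32_t)value << shift); /* modify */
    words[byte_index / 4] = w;                /* ...write back          */
}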
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 17051
  • Country: us
  • DavidH
Re: Is the Byte obsolete?
« Reply #4 on: July 17, 2021, 07:13:36 pm »
"Word" used to be the natural operand size for the CPU whether that was 8, 9, 16, 18, or whatever bits.  It only became synonymous with 16 bits after x86 became so popular, and it remained 16 bits instead of being changed to 32 or 64 bits on x86 for backwards compatibility.

As for bytes, they are still used for character data.
 

Online Gyro

  • Super Contributor
  • ***
  • Posts: 9869
  • Country: gb
Re: Is the Byte obsolete?
« Reply #5 on: July 17, 2021, 07:46:19 pm »
Quote
Is the Byte obsolete?

4E 6F 21
Best Regards, Chris
 
The following users thanked this post: Tom45, newbrain, golden_labels

Offline Bud

  • Super Contributor
  • ***
  • Posts: 7061
  • Country: ca
Re: Is the Byte obsolete?
« Reply #6 on: July 17, 2021, 07:55:55 pm »
@OP, this may be a shock to you, but not only the Byte but also the Bit is still in use. Tell me how efficiently you can encode a True or False state in your 64-bit (no pun intended) variable.
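
For example, a minimal sketch (flag names made up) of packing eight independent true/false states into a single byte with bit masks, instead of burning a whole variable per boolean:

#include <stdbool.h>
#include <stdint.h>

enum {
    FLAG_POWER_ON  = 1u << 0,   /* each flag costs one bit */
    FLAG_ERROR     = 1u << 1,
    FLAG_CONNECTED = 1u << 2
    /* ... up to 1u << 7 still fits in one byte */
};

static uint8_t flags;           /* eight booleans in a single byte */

static void set_flag(uint8_t mask, bool on)
{
    if (on)
        flags |= mask;
    else
        flags &= (uint8_t)~mask;
}

static bool get_flag(uint8_t mask)
{
    return (flags & mask) != 0;
}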
Facebook-free life and Rigol-free shack.
 

Offline TimFox

  • Super Contributor
  • ***
  • Posts: 8168
  • Country: us
  • Retired, now restoring antique test equipment
Re: Is the Byte obsolete?
« Reply #7 on: July 17, 2021, 08:04:00 pm »
The digital "word" has always had a variable length, depending on the equipment, and was not always an integer number of 8-bit bytes.
Why redefine the only stable nomenclature in sight, just because it is not as big as the one with an unstable nomenclature?
 

Online coppice

  • Super Contributor
  • ***
  • Posts: 9252
  • Country: gb
Re: Is the Byte obsolete?
« Reply #8 on: July 17, 2021, 08:25:41 pm »
Now that the vast majority of text has settled down to being UTF-8, I guess the byte will be relevant for a very, very long time.
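
A quick illustration of why: UTF-8 is a byte-oriented encoding, so plain ASCII stays at one byte per character while other code points take two to four bytes (the hex escapes below are the UTF-8 encodings of "é" and "€"):

#include <stdio.h>
#include <string.h>

int main(void)
{
    /* "A" (ASCII), "é" (U+00E9) and "€" (U+20AC) as UTF-8 byte strings */
    const char *samples[] = { "A", "\xC3\xA9", "\xE2\x82\xAC" };

    for (size_t i = 0; i < sizeof samples / sizeof samples[0]; i++) {
        printf("%s -> %zu byte(s):", samples[i], strlen(samples[i]));
        for (const unsigned char *p = (const unsigned char *)samples[i]; *p; p++)
            printf(" %02X", *p);
        printf("\n");
    }
    return 0;
}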
 

Online brucehoult

  • Super Contributor
  • ***
  • Posts: 4376
  • Country: nz
Re: Is the Byte obsolete?
« Reply #9 on: July 18, 2021, 02:59:24 am »
Quote
Memory and processing are still expressed in bytes (kilobytes, megabytes, etc.), but how useful is an 8-bit word nowadays? It gives us 256 data states, but most applications now require more. Are there names for 16-bit, 32-bit and 64-bit words?

No.

An 8 bit byte is a very useful size for storing reasonable fidelity text, audio, and greyscale images (or RGBA or CMYK components of colour). Some applications need more, but many don't. UTF-8 is a good solution for text forever.

Using word addresses instead of byte addresses severely complicates accessing byte values. It becomes necessary to pass around both a word base address and a byte offset.

Non power of two word sizes are a non-starter for modern machines because at least masking and shifting are cheap, but div and mod of weird values such as 36 by 8 is and always will be expensive. You're faced with either storing just 4 chars in each word (wasting 11% of the space), or splitting chars between words, or using 9 bit chars (which will upset modern software, and also waste 11% of the storage).
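
As a rough sketch of that cost difference (assuming 36-bit words stored in 64-bit containers and characters packed starting from the least significant bit): with a power-of-two word the character's position falls out of shifts and masks, whereas packing 8-bit characters densely into 36-bit words needs genuine division and modulo by 36, and a character can even straddle two words.

#include <stddef.h>
#include <stdint.h>

/* 32-bit words: the character position reduces to shift and mask. */
static unsigned get_char_pow2(const uint32_t *mem, size_t i)
{
    return (mem[i >> 2] >> ((i & 3u) * 8)) & 0xFFu;
}

/* 36-bit words (low 36 bits of each 64-bit element used), with 8-bit
 * characters packed end to end: now it takes a divide and modulo by 36,
 * plus a fix-up when a character crosses a word boundary. */
static unsigned get_char_packed36(const uint64_t *mem, size_t i)
{
    uint64_t bit  = (uint64_t)i * 8;
    size_t   word = (size_t)(bit / 36);
    unsigned off  = (unsigned)(bit % 36);
    uint64_t lo   = mem[word] & 0xFFFFFFFFFULL;      /* low 36 bits      */

    if (off <= 28)                                   /* fits in one word */
        return (unsigned)((lo >> off) & 0xFFu);

    uint64_t hi = mem[word + 1] & 0xFFFFFFFFFULL;    /* straddles words  */
    return (unsigned)(((lo >> off) | (hi << (36 - off))) & 0xFFu);
}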

Word addressing only made sense when

1) computers were used primarily for numerical processing (whether integer or FP) with very little use of text, and

2) registers and ALUs were so small that it made a significant difference whether you could address N bytes or N words of RAM.

In the history of 32 bit computers -- starting from the IBM S/360 in the early 1960s -- there has only been a brief period when anyone cared whether the amount of memory a program could address was 4 GB or 16 GB. Moving to 64 bits was coming sooner or later anyway, and in any given application area the difference between running out of 2^32 bytes and running out of 2^32 32 bit words is only about three or four years -- roughly 1995 in workstations, 2005 in PCs, and 2015 in smartphones (with the pioneers 2 or 3 years earlier in each case).

64 bit byte addresses should be enough for at least 50 years in most applications, quite possibly 100 or more.
 

Online brucehoult

  • Super Contributor
  • ***
  • Posts: 4376
  • Country: nz
Re: Is the Byte obsolete?
« Reply #10 on: July 18, 2021, 03:26:03 am »
Quote
In the history of 32 bit computers -- starting from the IBM S/360 in the early 1960s -- there has only been a brief period when anyone cared whether the amount of memory a program could address was 4 GB or 16 GB. Moving to 64 bits was coming sooner or later anyway, and in any given application area the difference between running out of 2^32 bytes and running out of 2^32 32 bit words is only about three or four years -- roughly 1995 in workstations, 2005 in PCs, and 2015 in smartphones (with the pioneers 2 or 3 years earlier in each case).

A little bit of data on this. Intel introduced PAE with the Pentium Pro in 1995. This extended total system memory (but not an individual application) from 32 bits to 36 bits (i.e. 16x more). This lasted until AMD introduced the Opteron/Athlon64 in 2003 -- eight years.

Intel had no intention of ever further expanding the x86 supported RAM, or allowing more than 4 GB in a single program. If Sir requires more memory, may I direct Sir's attention to our fine line of Itanic processors that AMD doesn't have a license for?
 
The following users thanked this post: newbrain

Offline GlennSprigg

  • Super Contributor
  • ***
  • Posts: 1259
  • Country: au
  • Medically retired Tech. Old School / re-learning !
Re: Is the Byte obsolete?
« Reply #11 on: July 18, 2021, 10:46:44 am »
Many years ago I laughed when finding out that 1/2 a Byte was 'cutely' called a Nybble   :-DD
Diagonal of 1x1 square = Root-2. Ok.
Diagonal of 1x1x1 cube = Root-3 !!!  Beautiful !!
 

Online Gyro

  • Super Contributor
  • ***
  • Posts: 9869
  • Country: gb
Re: Is the Byte obsolete?
« Reply #12 on: July 18, 2021, 10:55:41 am »
Nibble
Best Regards, Chris
 

Offline GlennSprigg

  • Super Contributor
  • ***
  • Posts: 1259
  • Country: au
  • Medically retired Tech. Old School / re-learning !
Re: Is the Byte obsolete?
« Reply #13 on: July 18, 2021, 11:07:38 am »
Ok...  I've seen it written with a 'Y' too, to match the 'Y' in Byte...
Diagonal of 1x1 square = Root-2. Ok.
Diagonal of 1x1x1 cube = Root-3 !!!  Beautiful !!
 

Offline Siwastaja

  • Super Contributor
  • ***
  • Posts: 8608
  • Country: fi
Re: Is the Byte obsolete?
« Reply #14 on: July 18, 2021, 11:43:49 am »
Very very widely used to signify the amount of red, green or blue light in a pixel. Used in 99.99999% of images and videos on the web and otherwise. Yes, for really high image quality, high-dynamic-range stuff it's a bit on the low side and 10-bit display formats are in use instead, but 256 levels is enough for most cases, and there appears to be no general or widespread shift towards higher bit depths in the near future.

Anything large and high-bandwidth requires optimization of storage size, and arbitrary bit counts get used; it's not odd at all to see a value stored in, for example, 3 bits if the expected range is 0..7. A byte (8 bits) is more convenient and efficient, as fewer instructions are required to extract and use the value than with, say, 3-bit fields. But most of the performance penalty comes from memory access, so if you can squeeze more into memory you save time even if your CPU needs to do some bit shifting and masking.

And yes, half a byte (4 bits) is a nibble; this isn't some esoteric funny joke but a normal, widely used term. You can find it in instruction set manuals, for example, whenever they have instructions like "swap nibbles".
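
For illustration, a minimal C sketch of both points - a made-up 3-bit field packed into a status byte, plus the classic nibble swap that many instruction sets implement as a single instruction:

#include <stdint.h>

#define MODE_SHIFT  2u
#define MODE_MASK   (0x7u << MODE_SHIFT)     /* 3-bit field, bits 2..4 */

static uint8_t set_mode(uint8_t reg, uint8_t mode)       /* mode in 0..7 */
{
    return (uint8_t)((reg & ~MODE_MASK) | ((mode << MODE_SHIFT) & MODE_MASK));
}

static uint8_t get_mode(uint8_t reg)
{
    return (uint8_t)((reg & MODE_MASK) >> MODE_SHIFT);
}

static uint8_t swap_nibbles(uint8_t b)                   /* 0xA5 -> 0x5A */
{
    return (uint8_t)((b << 4) | (b >> 4));
}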
« Last Edit: July 18, 2021, 11:48:25 am by Siwastaja »
 

Offline GlennSprigg

  • Super Contributor
  • ***
  • Posts: 1259
  • Country: au
  • Medically retired Tech. Old School / re-learning !
Re: Is the Byte obsolete?
« Reply #15 on: July 18, 2021, 12:29:18 pm »

Quote
And yes, half a byte (4 bits) is a nibble; this isn't some esoteric funny joke but a normal, widely used term. You can find it in instruction set manuals, for example, whenever they have instructions like "swap nibbles".

I know that it's a real word in the computing world...   :)
I just think it's 'cute', because a 'Byte' sounds like 'Bite'   ;D
Diagonal of 1x1 square = Root-2. Ok.
Diagonal of 1x1x1 cube = Root-3 !!!  Beautiful !!
 

Online Gyro

  • Super Contributor
  • ***
  • Posts: 9869
  • Country: gb
Re: Is the Byte obsolete?
« Reply #16 on: July 18, 2021, 12:42:09 pm »
The apparent explanation from Wikipedia...

Quote
The term byte was coined by Werner Buchholz in June 1956, during the early design phase for the IBM Stretch computer, which had addressing to the bit and variable field length (VFL) instructions with a byte size encoded in the instruction. It is a deliberate respelling of bite to avoid accidental mutation to bit.
...
« Last Edit: July 18, 2021, 12:50:56 pm by Gyro »
Best Regards, Chris
 

Offline PKTKS

  • Super Contributor
  • ***
  • Posts: 1766
  • Country: br
Re: Is the Byte obsolete?
« Reply #17 on: July 18, 2021, 12:53:54 pm »
Hilarious question   ^-^ :-DD

42 59 54 45

RS232 is still a de facto industry-wide standard, without any sign of vanishing any time soon...

Paul
 

Online coppice

  • Super Contributor
  • ***
  • Posts: 9252
  • Country: gb
Re: Is the Byte obsolete?
« Reply #18 on: July 18, 2021, 06:46:03 pm »
Quote
Very very widely used to signify the amount of red, green or blue light in a pixel.

Most video is moving to at least 10 bits per colour, so that may not be true for long. It might be that pixels change from linear to a pseudo-log format and shrink back to 8 bits while in storage, but I think the jury is still out on that one.
 

Offline DiTBho

  • Super Contributor
  • ***
  • Posts: 4217
  • Country: gb
Re: Is the Byte obsolete?
« Reply #19 on: July 20, 2021, 02:13:35 pm »
the basic unit should be one nibble (half a byte, 4 bits), so you can comfortably use a hex Contraves switch  ;D

The opposite of courage is not cowardice, it is conformity. Even a dead fish can go with the flow
 

Offline Siwastaja

  • Super Contributor
  • ***
  • Posts: 8608
  • Country: fi
Re: Is the Byte obsolete?
« Reply #20 on: July 20, 2021, 02:22:44 pm »
Quote
RS232 is still a de facto industry-wide standard, without any sign of vanishing any time soon...

In case you haven't noticed, RS232 has almost vanished during the last 20 years. Well, not quite, but it's quite a rarity today. It was widely used for computer peripherals, and later in the special configuration/access ports of embedded devices.

Today you often find logic-level serial ports in devices instead, eliminating level-conversion ICs that would otherwise sit mostly unused: such ports are fine with the signal integrity of single-ended 3V3 or 5V CMOS logic, and RS232 isn't that robust in the end anyway. Where good signal integrity is required, RS422/485 is used, and that hasn't diminished the way RS232 has.

You can still find RS232 in some newly manufactured measurement instruments, but isolated interfaces are better there as well.
 

Offline Siwastaja

  • Super Contributor
  • ***
  • Posts: 8608
  • Country: fi
Re: Is the Byte obsolete?
« Reply #21 on: July 20, 2021, 02:27:08 pm »
Quote
Most video is moving to at least 10 bits per colour, so that may not be true for long. It might be that pixels change from linear to a pseudo-log format and shrink back to 8 bits while in storage, but I think the jury is still out on that one.

Pixels already are in a kind of "pseudo-log" format, and always have been. This is called "gamma correction", and although it originally existed for a historical reason that is no longer valid, it has the side effect of making 8 bits go significantly further than a linear representation would, roughly corresponding to 10 bits linear.

You may be right that the majority of new video transitions to 10 bits in the near future. Video compression is a rapidly changing technology anyway; since the huge bandwidth requirement keeps driving improvements in compression, other improvements can be rolled in at the same time.

Still images we look at on a computer screen are pretty much still JPEG and PNG, at 8 bits per channel.
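
A back-of-the-envelope check of the "8 bits with gamma roughly matches 10 bits linear" point, using a plain 2.2 power law rather than the exact sRGB curve: count how many of the 256 gamma-encoded codes land in the darkest 1% of the linear range, where banding is most visible.

#include <math.h>
#include <stdio.h>

int main(void)
{
    int gamma_codes = 0;
    for (int code = 0; code < 256; code++) {
        double linear = pow(code / 255.0, 2.2);   /* decode the code to linear light */
        if (linear <= 0.01)
            gamma_codes++;
    }
    /* 8-bit linear puts only 3 codes (0, 1, 2) below 1% of full scale. */
    printf("gamma-encoded codes in darkest 1%%: %d (vs 3 for 8-bit linear)\n",
           gamma_codes);
    return 0;
}

This prints 32 for the gamma-encoded case; a 10-bit linear encoding would put about 11 codes in the same range, so the gamma curve spends its 8 bits where the eye is most sensitive, at the cost of coarser steps in the highlights.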
« Last Edit: July 20, 2021, 02:30:50 pm by Siwastaja »
 

Offline PKTKS

  • Super Contributor
  • ***
  • Posts: 1766
  • Country: br
Re: Is the Byte obsolete?
« Reply #22 on: July 20, 2021, 02:34:47 pm »
Quote
RS232 is still a de facto industry-wide standard, without any sign of vanishing any time soon...

In case you haven't noticed, RS232 has almost vanished during the last 20 years. Well, not quite, but it's quite a rarity today. It was widely used for computer peripherals, and later in the special configuration/access ports of embedded devices.



Nah, I sure did notice... actually I have been "collecting" these small USB dongles and converters since then..

I have a FULL BOX about 10 inches wide to store them..

Any crappy USB serial gizmo needs a F*K dongle...

and still some of my DMMs require an old-school RS232 COM port..

WTF did they do with USB  :palm:

Paul
 

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 15127
  • Country: fr
Re: Is the Byte obsolete?
« Reply #23 on: July 20, 2021, 07:00:54 pm »
Quote
Memory and processing are still expressed in bytes (kilobytes, megabytes, etc.), but how useful is an 8-bit word nowadays? It gives us 256 data states, but most applications now require more. Are there names for 16-bit, 32-bit and 64-bit words?

No.

An 8 bit byte is a very useful size for storing reasonable fidelity text, audio, and greyscale images (or RGBA or CMYK components of colour). Some applications need more, but many don't. UTF-8 is a good solution for text forever.

That's just the point. It's the most useful granularity we have found so far for handling general data.
The way we need to see it IMHO is just this: granularity.

People "advocating" increasing this granularity should really ask themselves what the implications would be. They probably have not thought this through.

As to "general-purpose word widths" being anything other than powers of two, this is something I remember we discussed a while ago. Non-power-of-two widths cause various issues that are not worth the trouble for general-purpose computing.
 

Offline AaronLee

  • Regular Contributor
  • *
  • Posts: 229
  • Country: kr
Re: Is the Byte obsolete?
« Reply #24 on: July 21, 2021, 12:27:44 am »
There are still MCUs in common use which are very memory limited, and necessarily so due to cost constraints. I regularly design stuff to be mass produced, where saving just a few cents per unit of production cost can be quite significant. In these cases memory optimization is often mandatory: 8-bit bytes must be used, or sometimes even individual bits within a byte assigned to different uses. If I'm writing a PC app with gigabytes of available memory, I don't care whether a simple variable uses 8, 16, or 32 bits. I generally just use an int or unsigned int and forget about it. When it's an MCU with only a few kilobytes of RAM, I need to be mindful of what the maximum values will be and use the appropriate bit size.
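
As a small illustration (the record layout is hypothetical), the same data laid out size-consciously for a RAM-starved MCU versus the lazy plain-int version that is perfectly fine on a PC:

#include <stdint.h>

struct sensor_record_packed {
    uint16_t adc_raw;        /* 0..4095 fits in 16 bits           */
    uint8_t  channel;        /* 0..7                              */
    uint8_t  flags;          /* individual status bits, see below */
};                           /* 4 bytes per record                */

#define SR_FLAG_VALID     (1u << 0)
#define SR_FLAG_OVERRANGE (1u << 1)

struct sensor_record_lazy {
    int adc_raw;             /* 4 bytes each on a 32-bit target */
    int channel;
    int valid;
    int overrange;
};                           /* 16 bytes per record             */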
 

Offline Rick Law

  • Super Contributor
  • ***
  • Posts: 3474
  • Country: us
Re: Is the Byte obsolete?
« Reply #25 on: July 21, 2021, 12:42:37 am »
Hmmm...

I guess 4-passenger cars are obsolete as well. Forget the Honda Civic and Toyota Camry; we have 5-axle, 18-wheel tractor-trailer trucks that can carry over 150,000 pounds in one go, or pack in 100+ standing "passengers". Why still keep these limiting tiny things that can carry merely 4 people...
 

Online brucehoult

  • Super Contributor
  • ***
  • Posts: 4376
  • Country: nz
Re: Is the Byte obsolete?
« Reply #26 on: July 21, 2021, 01:16:34 am »
Quote
There are still MCUs in common use which are very memory limited, and necessarily so due to cost constraints. I regularly design stuff to be mass produced, where saving just a few cents per unit of production cost can be quite significant. In these cases memory optimization is often mandatory: 8-bit bytes must be used, or sometimes even individual bits within a byte assigned to different uses. If I'm writing a PC app with gigabytes of available memory, I don't care whether a simple variable uses 8, 16, or 32 bits.

It's not as different as you might think.

While your PC might have 8 or 16 or 128 GB of RAM, that RAM is hundreds of clock cycles away from the CPU. Most PC/server CPUs only have 32K or so of L1 data cache, not so very different from the SRAM size on many microcontrollers. Your program won't crash if you exceed the cache size, but there are significant performance benefits to working mostly within it.
 

Offline Siwastaja

  • Super Contributor
  • ***
  • Posts: 8608
  • Country: fi
Re: Is the Byte obsolete?
« Reply #27 on: July 22, 2021, 11:07:42 am »
Exactly, and a cache line is as small as 32 or 64 bytes.

It's kind of funny to consider that on a state-of-the-art number-crunching gaming PC running with 32 GB of RAM, saving a few bits somewhere may be crucial to making that 3D virtual reality run at a decent frame rate.

Arguably, variable granularity is the best granularity. And that's exactly what computers do: in most instances the smallest accessible unit is a byte or even a bit, yet you can access the same data with wider instructions, going all the way up to large parallel SIMD processing.
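
A small C sketch of that: the same buffer walked byte by byte and then in 64-bit chunks (memcpy is used for the wide load to stay within strict-aliasing rules; compilers typically turn it into a single 8-byte load, or vectorize further):

#include <stddef.h>
#include <stdint.h>
#include <string.h>

static uint32_t sum_bytes(const uint8_t *buf, size_t len)
{
    uint32_t sum = 0;
    for (size_t i = 0; i < len; i++)    /* one byte per iteration */
        sum += buf[i];
    return sum;
}

static uint64_t xor_words(const uint8_t *buf, size_t len)   /* len multiple of 8 */
{
    uint64_t acc = 0;
    for (size_t i = 0; i + 8 <= len; i += 8) {
        uint64_t w;
        memcpy(&w, buf + i, sizeof w);  /* one 64-bit access instead of eight */
        acc ^= w;
    }
    return acc;
}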
« Last Edit: July 22, 2021, 11:10:54 am by Siwastaja »
 

Offline rsjsouza

  • Super Contributor
  • ***
  • Posts: 6044
  • Country: us
  • Eternally curious
    • Vbe - vídeo blog eletrônico
Re: Is the Byte obsolete?
« Reply #28 on: July 22, 2021, 11:27:22 am »
Quote
There are still MCUs in common use which are very memory limited, and necessarily so due to cost constraints. I regularly design stuff to be mass produced, where saving just a few cents per unit of production cost can be quite significant. In these cases memory optimization is often mandatory: 8-bit bytes must be used, or sometimes even individual bits within a byte assigned to different uses.

Indeed. In my line of work, bytes, UART/RS232 and direct bit manipulation are all day-to-day lingo, with no end in sight.

In the earlier days of Intel's MMX (SIMD), I still remember doing some programming using Intel's library for byte manipulation. I have since left that field of large-computer programming, and I wonder whether most of these optimizations vanish when hidden under layers and layers of SW. My impression (which could be wrong) is that cache-size optimizations and considerations get completely washed away at an OS task context switch (as the cache is flushed for the next, completely unrelated task), and the only serious work is done at the subsystem level (video/graphics processors, for example).
Vbe - vídeo blog eletrônico http://videos.vbeletronico.com

Oh, the "whys" of the datasheets... The information is there not to be an axiomatic truth, but instead each speck of data must be slowly inhaled while carefully performing a deep search inside oneself to find the true metaphysical sense...
 

Online brucehoult

  • Super Contributor
  • ***
  • Posts: 4376
  • Country: nz
Re: Is the Byte obsolete?
« Reply #29 on: July 22, 2021, 12:31:06 pm »
Quote
My impression (which could be wrong) is that cache-size optimizations and considerations get completely washed away at an OS task context switch (as the cache is flushed for the next, completely unrelated task)

That depends on the cache design. If the cache uses physical address tags -- as is normal for L1 cache -- then it doesn't need to be flushed.
 

Offline dmills

  • Super Contributor
  • ***
  • Posts: 2093
  • Country: gb
Re: Is the Byte obsolete?
« Reply #30 on: July 24, 2021, 11:37:21 am »
Quote
Not being able to address bytes - at least not remotely efficiently, since you could always do it using bitwise operations - would be a real pain in many applications.

Analog Devices SHARC DSPs had me reworking an SPI protocol to better accommodate the fact that on that platform sizeof(char) == sizeof(short) == sizeof(int) == 1. Rather annoying, that was.
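
One way to cope with that, as a sketch (not SHARC-specific code): never memcpy structs straight onto the wire; build the frame one octet's worth of value per array element, with explicit masking, so it still behaves when CHAR_BIT is 32 and uint8_t doesn't exist:

/* Explicit per-octet serialization: works whether CHAR_BIT is 8 or 32,
 * because each array element carries only one octet's worth of value. */
static void put_u16_le(unsigned char *out, unsigned int v)
{
    out[0] = (unsigned char)(v & 0xFFu);         /* low octet first on the wire */
    out[1] = (unsigned char)((v >> 8) & 0xFFu);
}

static unsigned int get_u16_le(const unsigned char *in)
{
    return (in[0] & 0xFFu) | ((in[1] & 0xFFu) << 8);
}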

Regards, Dan.
 

Offline dmills

  • Super Contributor
  • ***
  • Posts: 2093
  • Country: gb
Re: Is the Byte obsolete?
« Reply #31 on: July 24, 2021, 11:51:59 am »
Quote
You may be right that the majority of new video transitions to 10 bits in the near future.

Outside of computers, the vast majority of video has been 10 bits (generally 4:2:2 chroma-subsampled Y'CbCr) for many, many years.
Even the old parallel BT.656 digital video (which nobody has used since the early days of standard-definition digital video) was generally 10 bits per pixel.

Note that that is 10 bits after the non-linear gamma curve has been applied; to work properly in linear light you actually need significantly more than 10 bits.

We are now starting to see displays with enough dynamic range that 10 bits is not sufficient even with a standard gamma applied, and there are various hacks to the gamma curves to allow greater dynamic range for HDR display (which IMHO makes a FAR bigger difference than the move from 1080 to 4K).

8 bits per pixel is fine for a word processor; it leaves something to be desired for video, and a lot to be desired for photography.
 

