
Is the Byte obsolete?


dmills:

--- Quote from: SiliconWizard on July 17, 2021, 07:11:14 pm ---Not being able to address bytes - at least not remotely efficiently, since you could always do this using bitwise operations - would be a real pain in many applications.

--- End quote ---

Analog Devices SHARC DSPs had me reworking a SPI protocol to better accommodate the fact that on that platform sizeof (char) == sizeof (short) == sizeof (int) == 1; rather annoying, that was.
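
Roughly the sort of packing dance you end up doing (just a sketch, not the actual protocol code; spi_write_word() is a made-up driver call, and char is 32 bits on that target):

--- Code: ---#include <stddef.h>

/* Hypothetical driver call: sends one 32-bit word over SPI. */
extern void spi_write_word(unsigned int word);

/* Pack a stream of octet values (one per element, low 8 bits used) into
   32-bit words before sending, since the smallest addressable unit on
   this platform is a 32-bit char. */
static void spi_send_octets(const unsigned int *octets, size_t count)
{
    unsigned int word = 0;

    for (size_t i = 0; i < count; i++) {
        word = (word << 8) | (octets[i] & 0xFFu);
        if ((i & 3u) == 3u) {           /* four octets packed: flush */
            spi_write_word(word);
            word = 0;
        }
    }
    if (count & 3u)                     /* left-justify and flush a partial word */
        spi_write_word(word << (8u * (4u - (count & 3u))));
}
--- End code ---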

Regards, Dan.

dmills:

--- Quote from: Siwastaja on July 20, 2021, 02:27:08 pm ---You may be right that the majority of new video transitions to 10 bits in the near future.

--- End quote ---
Outside of computers, the vast majority of video has been 10 bit (generally 4:2:2 chroma-subsampled Y'CbCr) for many, many years.
Even the old parallel BT.656 digital video (which nobody has used since the early days of standard-definition digital video) was generally 10 bits per sample.

Note that is 10 bits after the non-linear transfer curve (gamma) has been applied; to work properly in linear light you actually need significantly more than 10 bits.
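
To put a rough number on it (just a sketch, assuming the BT.709 curve; the exact figure depends on which transfer function you use):

--- Code: ---#include <math.h>
#include <stdio.h>

/* BT.709 OETF: the "gamma" curve that maps linear scene light (0..1)
   onto the coded signal (0..1). */
static double oetf_709(double L)
{
    return (L < 0.018) ? 4.5 * L
                       : 1.099 * pow(L, 0.45) - 0.099;
}

int main(void)
{
    /* The darkest non-zero code in a 10-bit signal... */
    double coded_step = 1.0 / 1023.0;
    /* ...lies on the curve's linear segment (slope 4.5), so it
       corresponds to this much linear light: */
    double linear = coded_step / 4.5;

    printf("gamma code for that level: %.6f (one 10-bit step = %.6f)\n",
           oetf_709(linear), coded_step);
    printf("~%.1f bits of linear code needed to resolve the same step\n",
           log2(1.0 / linear));
    return 0;
}
--- End code ---

That works out to roughly 12 bits just to match the bottom end of the plain BT.709 curve; deeper blacks push the requirement further still.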

We are now starting to see displays with enough dynamic range that 10 bits is not sufficient even with a standard gamma applied, and there are various hacks to the gamma curves to allow greater dynamic range for HDR display (which IMHO makes a FAR bigger difference than the move from 1080p to 4K).
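
For the curious, one of those "hacks" is the SMPTE ST 2084 PQ curve; very roughly the encode side looks like this (just a sketch, pq_encode() is a made-up name, constants as published in the standard):

--- Code: ---#include <math.h>
#include <stdio.h>

/* SMPTE ST 2084 "PQ" encoding: maps absolute luminance (cd/m^2, up to
   10000) onto a 0..1 signal that then gets quantized to 10 or 12 bits. */
static double pq_encode(double cd_per_m2)
{
    const double m1 = 2610.0 / 16384.0;
    const double m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0;
    const double c2 = 2413.0 / 4096.0 * 32.0;
    const double c3 = 2392.0 / 4096.0 * 32.0;
    double ym = pow(cd_per_m2 / 10000.0, m1);
    return pow((c1 + c2 * ym) / (1.0 + c3 * ym), m2);
}

int main(void)
{
    /* Roughly SDR white (100 cd/m^2) only lands about half way up the
       10-bit code range, leaving the rest for the highlights. */
    printf("10-bit code for 100 cd/m^2: %d / 1023\n",
           (int)lround(pq_encode(100.0) * 1023.0));
    return 0;
}
--- End code ---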

8 bits per pixel is fine for a word processor; it leaves something to be desired for video, and a lot to be desired for photography.
