Author Topic: Just because technology can do something, doesn't mean it's always right  (Read 18854 times)


Offline strawberryTopic starter

  • Super Contributor
  • ***
  • Posts: 1155
  • Country: lv
Is an 8K TV set, which is not quite a desktop computer, really necessary for the general public?
Is technology really evolving, or is it just a shift of pros and cons (high yields vs. the environment)?
The fusion reactor could be proof of technology's limits.
 

Offline Gyro

  • Super Contributor
  • ***
  • Posts: 9410
  • Country: gb
It's certainly the case in the TV example. When I was in the industry it was all about "we must have 3D support in the HDMI spec right now!"... and that flew like a lead balloon, as did curved panels. There is far too much upscaling required for most content for 4k or 8k resolutions etc.

I've had lots of experiences (particularly with semiconductor manufacturers) of technology looking for an application, rather than the other way round.
Best Regards, Chris
 

Offline james_s

  • Super Contributor
  • ***
  • Posts: 21611
  • Country: us
It's certainly the case in the TV example. When I was in the industry it was all about "we must have 3D support in the HDMI spec right now!"... and that flew like a lead balloon, as did curved panels. There is far too much upscaling required for most content for 4k or 8k resolutions etc.

I've had lots of experiences (particularly with semiconductor manufacturers) of technology looking for an application, rather than the other way round.

The problem I see is that the TV manufacturers all spent years trying to replicate the huge sales volume of the early HD boom, when everyone was replacing their old sets with HD. The thing they seemed to completely miss was the fact that the vast majority of people didn't buy new TVs to get HD; they bought them to get a big flatscreen that didn't take up a massive volume of space like the older CRT-based sets. I can't even count how many HD sets I saw that were hooked up to only SD sources; most people don't even care about picture quality that much, and the cheapest, crappiest TV sets have always been the biggest sellers. The TV manufacturers should have realized that once everyone already has a thin flat TV there is absolutely NOTHING that can be offered on a new set that will replicate that kind of sales volume.
 
The following users thanked this post: rsjsouza, tooki, Cubdriver, 2N3055

Offline Black Phoenix

  • Super Contributor
  • ***
  • Posts: 1129
  • Country: hk
It's certainly the case in the TV example. When I was in the industry it was all about "we must have 3D support in the HDMI spec right now!"... and that flew like a lead balloon, as did curved panels. There is far too much upscaling required for most content for 4k or 8k resolutions etc.

I've had lots of experiences (particularly with semiconductor manufacturers) of technology looking for an application, rather than the other way round.

Nothing against curved panels, but as a PC monitor. I had one at the company, and being able to have two windows open side by side, each at 1920x1200, as if they were two monitors side by side, was a godsend in terms of space saved. But that was the only place where it seemed logical to me.
 

Offline rsjsouza

  • Super Contributor
  • ***
  • Posts: 5980
  • Country: us
  • Eternally curious
    • Vbe - vídeo blog eletrônico
It's certainly the case in the TV example. When I was in the industry it was all about "we must have 3D support in the HDMI spec right now!"... and that flew like a lead balloon, as did curved panels. There is far too much upscaling required for most content for 4k or 8k resolutions etc.

I've had lots of experiences (particularly with semiconductor manufacturers) of technology looking for an application, rather than the other way round.

The problem I see is that the TV manufacturers all spent years trying to replicate the huge sales volume of the early HD boom, when everyone was replacing their old sets with HD. The thing they seemed to completely miss was the fact that the vast majority of people didn't buy new TVs to get HD; they bought them to get a big flatscreen that didn't take up a massive volume of space like the older CRT-based sets. I can't even count how many HD sets I saw that were hooked up to only SD sources; most people don't even care about picture quality that much, and the cheapest, crappiest TV sets have always been the biggest sellers. The TV manufacturers should have realized that once everyone already has a thin flat TV there is absolutely NOTHING that can be offered on a new set that will replicate that kind of sales volume.
I agree with you that the industry was still inebriated by years of a sales surge, but I think the catalyst came slightly before the flat screens: the DVD. The boom caused by the DVD frenzy was the first leap in the video/audio arena that could compete with Hollywood, as it was considered "Hi-Q/Hi-Fi enough" to create the concept of home theater. Basically, an entire ecosystem was created, with many different devices and accessories to provide the most immersive experience. Not to mention it was much more convenient compared to the sequential access of VCR tapes.
Vbe - vídeo blog eletrônico http://videos.vbeletronico.com

Oh, the "whys" of the datasheets... The information is there not to be an axiomatic truth, but instead each speck of data must be slowly inhaled while carefully performing a deep search inside oneself to find the true metaphysical sense...
 

Offline daqq

  • Super Contributor
  • ***
  • Posts: 2301
  • Country: sk
    • My site
Well, there's the problem - to sell, manufacturers need something new. Not because it's necessary, nor because it adds 500% to the experience, but because if they don't, people don't have a reason to buy stuff as quickly.

My current fridge is some 25 years old. Odds are that it'll last another 10 or 20. This is bad from the manufacturer's point of view, since I have no motivation to buy a new one. The solution is either to make shitty products that break after some time, or to create a product with features or parameters enough better to warrant the purchase of a new device whilst the old one is still good.

Improving upon core features is difficult and in many cases is possible only for the top names in the established trades. But adding features that sound nice...

Take some 90% of consumer products equipped with IoT these days - they are mostly useless gimmicks that the marketing department can take advantage of, saying that you really aren't up to date if your semi-sentient fridge is not connected to the WobbleTyGoo cloud over 6G and can't connect to your Bolloxer account. IoT toaster? Sure. IoT washing machine? Yup. IoT bloody kettle? Of course ( https://smarter.am/collections/smarter-ikettle )! And there are sillier examples.

Most new products today are not revolutionary new concepts that people will buy to get a never-before-seen feature that will impact their lives. They rarely offer anything but an incremental improvement in parameters over the same-priced model from two years ago. But they do offer ever more obscure and sillier gimmicks.
Believe it or not, pointy haired people do exist!
+++Divide By Cucumber Error. Please Reinstall Universe And Reboot +++
 

Offline TimFox

  • Super Contributor
  • ***
  • Posts: 7934
  • Country: us
  • Retired, now restoring antique test equipment
The current trend in video entertainment seems to be to view cinema content either on a huge wall-mounted flat screen in one's home theater, or on a tiny screen on one's smartphone.
 

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14297
  • Country: fr
Uh, sorry, but there are currently much worse uses of technology than 8K displays. Why not 8K on a large screen?
4K on a 6" display is already questionable, sure, and we have that; that would be a better example.
But there are still way worse uses of technology than that.

And forcing people to use some technology is way, way worse IMHO, and that's what I really have a problem with.
 
The following users thanked this post: james_s

Offline james_s

  • Super Contributor
  • ***
  • Posts: 21611
  • Country: us
I agree with you that the industry was still inebriated by years of a sales surge, but I think the catalyst came slightly before the flat screens: the DVD. The boom caused by the DVD frenzy was the first leap in the video/audio arena that could compete with Hollywood, as it was considered "Hi-Q/Hi-Fi enough" to create the concept of home theater. Basically, an entire ecosystem was created, with many different devices and accessories to provide the most immersive experience. Not to mention it was much more convenient compared to the sequential access of VCR tapes.

I used DVD for quite a few years before I ever owned a flatscreen. I'm sure it didn't hurt anything, but it wasn't the root driving factor; that was the flat screens, loved by housewives everywhere in homes that had big rear-projection boxes or the giant 300+ lb direct-view CRTs that were offered for a while. Even modest 25" sets were quite bulky, and people loved the fact that they could get a reasonably large-screen set just a few inches thick. The fact that it could do HD was just a cherry on top.
 

Offline eti

  • Super Contributor
  • ***
  • !
  • Posts: 1801
  • Country: gb
  • MOD: a.k.a Unlokia, glossywhite, iamwhoiam etc
Where does the madness end? Watch THE FILM, not the pixels. People OBSESS over resolution and HDR - they can all parrot spec sheets, yet have no idea what is actually WORTH watching.... But as long as it's a higher resolution than the Joneses' next door? Well....  :palm: |O
 
The following users thanked this post: tooki

Offline james_s

  • Super Contributor
  • ***
  • Posts: 21611
  • Country: us
Well, there's the problem - to sell, manufacturers need something new. Not because it's necessary, nor because it adds 500% to the experience, but because if they don't, people don't have a reason to buy stuff as quickly.

My current fridge is some 25 years old. Odds are that it'll last another 10 or 20. This is bad from the manufacturer's point of view, since I have no motivation to buy a new one. The solution is either to make shitty products that break after some time, or to create a product with features or parameters enough better to warrant the purchase of a new device whilst the old one is still good.

Improving upon core features is difficult and in many cases is possible only for the top names in the established trades. But adding features that sound nice...

The problem is that things like TVs have become way too cheap. We used to be just fine with TVs having a ~20 year life cycle, but that was when a decent TV cost several weeks' wages or more. As an example, when I was sorting through my grandmother's stuff after she died I found the receipt for the 25" console TV my grandpa bought in, I think, 1984; IIRC it was $699, and that was with $100 off since it was a floor model. $699 was a lot of money in 1984, around $2k in today's dollars. Now average TVs are a few hundred bucks for a 50-60" screen.
 
The following users thanked this post: eti

Offline eti

  • Super Contributor
  • ***
  • !
  • Posts: 1801
  • Country: gb
  • MOD: a.k.a Unlokia, glossywhite, iamwhoiam etc
People are INSANE with the amount they spend on A TELEVISION every 3-5 years. They need their heads testing :LOL:
 

Online Psi

  • Super Contributor
  • ***
  • Posts: 9889
  • Country: nz
It's less about the resolution and more about the DPI.

As TVs get bigger and bigger you need to up the resolution to keep them looking sharp.

There's zero point in having a 32" TV that's 8K, IMHO.

But if you have a projector making a 3 meter image then 8K is going to look much better than 1080p, and probably noticeably better than 4K. (Though I can't say I have seen one yet.)


It seems probable that we will get to the point where an entire wall of your house is a display, and you can do conference calls that simulate your wall being joined to other people's walls, so it's like you are both standing in the same room, because the DPI and HDR are so high.
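As a rough illustration of the density argument, here is a minimal sketch; the 135" diagonal is just an assumed stand-in for a roughly 3 m wide 16:9 image, not a figure from this thread:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a display of the given resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

# A 3 m wide 16:9 image works out to roughly a 135" diagonal.
examples = [
    ('32" 8K',     7680, 4320, 32),
    ('135" 1080p', 1920, 1080, 135),
    ('135" 4K',    3840, 2160, 135),
    ('135" 8K',    7680, 4320, 135),
]

for label, w, h, diag in examples:
    print(f'{label:11s} ~{ppi(w, h, diag):5.1f} PPI')
```

On those assumptions, 8K at 32" lands around 275 PPI, while even 8K on the 3 m image is only around 65 PPI, which is the gist of the argument above.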
« Last Edit: June 10, 2022, 12:37:06 am by Psi »
Greek letter 'Psi' (not Pounds per Square Inch)
 

Offline NiHaoMike

  • Super Contributor
  • ***
  • Posts: 8972
  • Country: us
  • "Don't turn it on - Take it apart!"
    • Facebook Page
There is far too much upscaling required for most content for 4k or 8k resolutions etc.
My 970 (now approaching 8 years old) only has about 30% utilization upscaling 1080p to 4K, about 1 TFLOPS. If we assume the GPU performance required scales linearly with pixel count, upscaling 4K to 8K would require 4 TFLOPS, which an entry-level RTX 3050 should have no problems with. (That's also ignoring the DLSS hardware in the RTX GPUs, which would accelerate upscaling and make things even easier.)
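As a quick sanity check of that scaling argument, a minimal sketch; the ~1 TFLOPS figure is the GTX 970 estimate above, and linear scaling with output pixel count is the stated assumption:

```python
# Back-of-the-envelope check of the scaling argument above.
pixels = {
    "1080p": 1920 * 1080,
    "4K":    3840 * 2160,
    "8K":    7680 * 4320,
}

cost_1080p_to_4k_tflops = 1.0  # estimated cost of the 1080p -> 4K upscale

# Assume cost scales linearly with the number of output pixels produced.
cost_4k_to_8k_tflops = cost_1080p_to_4k_tflops * pixels["8K"] / pixels["4K"]
print(f"Estimated 4K -> 8K upscaling cost: ~{cost_4k_to_8k_tflops:.1f} TFLOPS")
```

The 8K frame has exactly four times the pixels of the 4K frame, so the estimate comes out at ~4 TFLOPS, consistent with the figure above.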
There's zero point in having a 32" TV that's 8K, IMHO.
Photo editing can certainly use more than 4K. Maybe 5K or 6K would be a more reasonable next step, but what's the cost difference between 6K and 8K nowadays?
Cryptocurrency has taught me to love math and at the same time be baffled by it.

Cryptocurrency lesson 0: Altcoins and Bitcoin are not the same thing.
 

Offline strawberryTopic starter

  • Super Contributor
  • ***
  • Posts: 1155
  • Country: lv
I can't see the difference between 1280x720 and 2560x1440 on a cellphone; only the battery usage is different.
Cameras have too many pixels, which reduces light sensitivity and increases noise and compression effort.
Ten years ago I had a Sony Ericsson with the same camera quality as the top tech developed recently.
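As a rough illustration of that trade-off, a minimal sketch; the 1/2.3" sensor dimensions are an assumed, typical phone-class sensor, not a figure from this thread:

```python
import math

def pixel_pitch_um(sensor_w_mm, sensor_h_mm, megapixels):
    """Approximate pixel pitch in micrometres for a given sensor size and pixel count."""
    pixel_area_mm2 = (sensor_w_mm * sensor_h_mm) / (megapixels * 1e6)
    return math.sqrt(pixel_area_mm2) * 1000.0  # mm -> um

# Assumed sensor: a typical 1/2.3" phone-class sensor, roughly 6.17 mm x 4.55 mm.
for mp in (8, 12, 48, 64):
    print(f"{mp:3d} MP -> ~{pixel_pitch_um(6.17, 4.55, mp):.2f} um pixel pitch")
```

On the same sensor area, each pixel shrinks as the count rises and collects less light, which is the sensitivity and noise trade-off being described.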
 
The following users thanked this post: tooki

Offline AndyC_772

  • Super Contributor
  • ***
  • Posts: 4208
  • Country: gb
  • Professional design engineer
    • Cawte Engineering | Reliable Electronics
But if you have a projector making a 3 meter image then 8K is going to look much better than 1080p, and probably noticeably better than 4K. (Though I can't say I have seen one yet.)

It's academic; movies just aren't made in that resolution. Most aren't even in 4K, they're upscaled.

I have a true 4K projector with a 120 inch screen. A good 1080P image looks excellent, and the best examples of a 4K image are better still - but a good 1080P image beats a mediocre 4K one hands down.

Most 4K content is pretty mediocre, and if the resolution already isn't the limiting factor, a change to 8K can't offer any real improvement because the information simply isn't there to begin with.

Online Psi

  • Super Contributor
  • ***
  • Posts: 9889
  • Country: nz
There's zero point in having a 32" TV that's 8K, IMHO.
Photo editing can certainly use more than 4K. Maybe 5K or 6K would be a more reasonable next step, but what's the cost difference between 6K and 8K nowadays?
Yeah, agreed, but that's different; that's a monitor rather than a TV.
Greek letter 'Psi' (not Pounds per Square Inch)
 

Online Psi

  • Super Contributor
  • ***
  • Posts: 9889
  • Country: nz
Most 4K content is pretty mediocre, and if the resolution already isn't the limiting factor, a change to 8K can't offer any real improvement because the information simply isn't there to begin with.

Gaming content at native 8K is one exception.
But you obviously need the GPU horsepower to do that, and it will take quite a while before that is commonplace.
I just mean there is no lack of 8K gaming content, since any game can be run at 8K if you have the hardware.
Greek letter 'Psi' (not Pounds per Square Inch)
 

Offline AndyBeez

  • Frequent Contributor
  • **
  • Posts: 855
  • Country: nu
Thought for the day: Why do I need 4K/8K when playing retro games like Pong or Tetris?
 

Offline rsjsouza

  • Super Contributor
  • ***
  • Posts: 5980
  • Country: us
  • Eternally curious
    • Vbe - vídeo blog eletrônico
Another example of technology that fails to do something right: future generations will never know that a TV can show an image in less than 5 seconds.

This was a huge push since the old vacuum tube days and was fully solved by solid state, only to be hampered now by the lag to boot the embedded OS (or bring it up from its sleep state) and fill a frame buffer.
Vbe - vídeo blog eletrônico http://videos.vbeletronico.com

Oh, the "whys" of the datasheets... The information is there not to be an axiomatic truth, but instead each speck of data must be slowly inhaled while carefully performing a deep search inside oneself to find the true metaphysical sense...
 
The following users thanked this post: tooki

Offline SpacedCowboy

  • Frequent Contributor
  • **
  • Posts: 292
  • Country: gb
  • Aging physicist
Nothing against curved panels, but as a PC monitor. I had one at the company, and being able to have two windows open side by side, each at 1920x1200, as if they were two monitors side by side, was a godsend in terms of space saved. But that was the only place where it seemed logical to me.

Below is my home-office setup (the place formerly known as 'the shed at the bottom of the garden', post installation of a/c and insulation). The curved monitor is in the middle, in portrait, and I really like it...

It's incredibly useful for data sheets when I've got a PCB window up on one of the monitors, and the schematic on the other.

Or when coding - no more scrolling up and down through colleagues' enormous functions/methods to see what's going on, just a glance up or down. (I've given up trying to instill a shorter-focused-methods-are-better approach; apparently the throw-in-the-kitchen-sink approach is a more common coding style...)

The two 4K monitors at the side are easy to work with and can show an enormous amount of detail, but the widescreen one is where I put text so it's easier to read at a glance.
« Last Edit: June 10, 2022, 03:30:48 pm by SpacedCowboy »
 

Offline james_s

  • Super Contributor
  • ***
  • Posts: 21611
  • Country: us
It's academic; movies just aren't made in that resolution. Most aren't even in 4K, they're upscaled.

I have a true 4K projector with a 120 inch screen. A good 1080P image looks excellent, and the best examples of a 4K image are better still - but a good 1080P image beats a mediocre 4K one hands down.

Most 4K content is pretty mediocre, and if the resolution already isn't the limiting factor, a change to 8K can't offer any real improvement because the information simply isn't there to begin with.

Movies used to be shot on 35mm film, which IIRC, while analog, is roughly equivalent to 8K. I don't know much about the equipment used today for digital cinema. I definitely agree that there is much more to this than resolution; far too many rips of HD movies have such low bandwidth that the resolution is just wasted.
 

Offline TimFox

  • Super Contributor
  • ***
  • Posts: 7934
  • Country: us
  • Retired, now restoring antique test equipment
It's academic; movies just aren't made in that resolution. Most aren't even in 4K, they're upscaled.

I have a true 4K projector with a 120 inch screen. A good 1080P image looks excellent, and the best examples of a 4K image are better still - but a good 1080P image beats a mediocre 4K one hands down.

Most 4K content is pretty mediocre, and if the resolution already isn't the limiting factor, a change to 8K can't offer any real improvement because the information simply isn't there to begin with.

Movies used to be shot on 35mm film, which IIRC, while analog, is roughly equivalent to 8K. I don't know much about the equipment used today for digital cinema. I definitely agree that there is much more to this than resolution; far too many rips of HD movies have such low bandwidth that the resolution is just wasted.

The SMPTE estimates that there are roughly 30 standards in use for digital cinema.
A popular specification (DCI) can be found at  https://dcimovies.com/specification/DCI-DCSS-v141_2021-1013.pdf
That spec references "4k" at 4096 x 2160 pixels, and "2k" at 2048 x 1080 pixels, and specifies that decoders for DCI shall handle both, and be capable of 24 or 48 frames/sec.
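For a feel for the raw numbers behind those container sizes before compression, a minimal sketch; the 12 bits per colour component figure is an assumption from memory of the DCI encoding, not something quoted from the post above:

```python
# Rough uncompressed throughput for the DCI container sizes quoted above.
# Assumption: 12 bits per colour component, 3 components per pixel.
BITS_PER_PIXEL = 12 * 3

formats = {
    "2K @ 24 fps": (2048, 1080, 24),
    "2K @ 48 fps": (2048, 1080, 48),
    "4K @ 24 fps": (4096, 2160, 24),
}

for name, (w, h, fps) in formats.items():
    gbit_per_s = w * h * fps * BITS_PER_PIXEL / 1e9
    print(f"{name}: ~{gbit_per_s:.1f} Gbit/s uncompressed")
```

That puts 2K at 24 fps around 1.9 Gbit/s and 4K at 24 fps around 7.6 Gbit/s of raw pixel data, which is why the distributed streams are heavily compressed.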
 

Offline Bassman59

  • Super Contributor
  • ***
  • Posts: 2501
  • Country: us
  • Yes, I do this for a living
It's academic; movies just aren't made in that resolution. Most aren't even in 4K, they're upscaled.

I have a true 4K projector with a 120 inch screen. A good 1080P image looks excellent, and the best examples of a 4K image are better still - but a good 1080P image beats a mediocre 4K one hands down.

Most 4K content is pretty mediocre, and if the resolution already isn't the limiting factor, a change to 8K can't offer any real improvement because the information simply isn't there to begin with.

Movies used to be shot on 35mm film, which IIRC, while analog, is roughly equivalent to 8K. I don't know much about the equipment used today for digital cinema. I definitely agree that there is much more to this than resolution; far too many rips of HD movies have such low bandwidth that the resolution is just wasted.

My local art-house movie theater is showing "2001: A Space Odyssey" from a 70 mm print next month. I saw it last time that print came around here. It is unbelievable: sitting in the middle of a large theater you can still read the writing on the control panels and the space suits.
 

Offline james_s

  • Super Contributor
  • ***
  • Posts: 21611
  • Country: us
The SMPTE estimates that there are roughly 30 standards in use for digital cinema.
A popular specification (DCI) can be found at  https://dcimovies.com/specification/DCI-DCSS-v141_2021-1013.pdf
That spec references "4k" at 4096 x 2160 pixels, and "2k" at 2048 x 1080 pixels, and specifies that decoders for DCI shall handle both, and be capable of 24 or 48 frames/sec.

2K sounds like a significant step backward compared to old-fashioned 35mm film. Frankly, even 4K sounds low for a movie-theater-sized screen; personally, I would consider 8K to be about the bare minimum to be called reasonable, and given the high-dollar movie industry, I'm surprised they aren't using something really exotic.
 
The following users thanked this post: SiliconWizard

