Author Topic: Sharp pulls the pin on all LCD televisions  (Read 13258 times)


Online TimFox

  • Super Contributor
  • ***
  • Posts: 7949
  • Country: us
  • Retired, now restoring antique test equipment
Re: Sharp pulls the pin on all LCD televisions
« Reply #25 on: June 21, 2015, 10:43:11 pm »
Just as in the movie "Bull Durham", one must learn the cliches.
"Pulling the pin" refers to arming a grenade.
"Pulling the plug" refers to stopping life support in a medical facility.
The latter is a more appropriate cliche for this business decision.
 

Offline NiHaoMike

  • Super Contributor
  • ***
  • Posts: 9018
  • Country: us
  • "Don't turn it on - Take it apart!"
    • Facebook Page
Re: Sharp pulls the pin on all LCD televisions
« Reply #26 on: June 22, 2015, 01:33:14 am »
RESOLUTION:
This is an old argument with new energy from the 4k developments. Honestly, 4k is a tough sell because there is not much available yet AND to appreciate it your screen has to be HUGE or you need to be VERY close to it to perceive the difference. I am in the camera business in Hollywood and I am not motivated to go buy a 4k display for my house.
GPUs that can do 4K for gaming are becoming quite affordable now. Besides, who wants to spend a lot on a big screen with less resolution than an iPad? What I would have liked to see are large 1440p ("2.5K") displays for those who don't need the full 4K and would like lower GPU requirements, but the niche is very small now that affordable 4K displays are out there, and Nvidia's (inverse) DSR works nicely to cut back on GPU requirements.

And then there are 5K displays, with even greater GPU requirements. Pretty sure inverse DSR can do wonders there as well, but at current prices, most 5K users are getting SLI anyway.
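Rough numbers, to make the GPU-load argument concrete (a back-of-envelope sketch counting raw pixel throughput only, ignoring everything else that loads a GPU):

```python
# Raw pixel throughput at 60 fps for common resolutions.
# Rendering load scales roughly with pixels pushed per second
# when fill-rate bound, so this gives a feel for relative GPU cost.
resolutions = {
    "1080p": (1920, 1080),
    "1440p (2.5K)": (2560, 1440),
    "4K UHD": (3840, 2160),
    "5K": (5120, 2880),
}

FPS = 60
base = 1920 * 1080  # normalise against 1080p

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:13s} {pixels / 1e6:4.1f} Mpx, "
          f"{pixels * FPS / 1e6:6.0f} Mpx/s at {FPS} fps, "
          f"{pixels / base:.2f}x the 1080p fill load")
```

1440p works out to roughly 1.8x the 1080p load, against 4x for 4K and over 7x for 5K, which is exactly the middle ground being described.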
Cryptocurrency has taught me to love math and at the same time be baffled by it.

Cryptocurrency lesson 0: Altcoins and Bitcoin are not the same thing.
 

Offline TerraHertz

  • Super Contributor
  • ***
  • Posts: 3958
  • Country: au
  • Why shouldn't we question everything?
    • It's not really a Blog
Re: Sharp pulls the pin on all LCD televisions
« Reply #27 on: June 22, 2015, 02:14:42 am »
Just as in the movie "Bull Durham", one must learn the cliches.
"Pulling the pin" refers to arming a grenade.
"Pulling the plug" refers to stopping life support in a medical facility.
The latter is a more appropriate cliche for this business decision.

Heh. Yes, I thought the same. Except I imagined Sharp standing there dumbly still holding the grenade after pulling the pin (and letting the lever flip off, for you weapons perfectionists.)

What's laughable about all the TV manufacturers blindly staggering around, searching for some kind of marketing gimmick to keep the sales flowing (while broadcast TV dies, since fewer and fewer people want to watch the shitty programs and lying news), is that there's an obvious one they are missing.

What's the hottest market in the whole home computing field?
Top end 3D engines, for gaming.
So turn the 'TV' into a computer peripheral that is a self-contained 3D engine. Include an Ethernet port with 10/100/gigabit speed, and a protocol that lets a PC offload the actual 3D scene drawing to the display. The PC just sends the scene model, textures, viewpoint, etc. to the screen, and updates that dataset only when something changes. The engine in the screen does all the frame-generation grunt work.
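A minimal sketch of what the PC side of such a scene-control protocol might look like; the framing, opcodes, hostname and port are all invented for illustration:

```python
# Hypothetical PC side of the proposed scene-control protocol: bulky
# assets cross the wire once, then each "frame" is a tiny update.
import socket
import struct

UPLOAD_ASSET = 1   # mesh/texture blob, sent once, cached by the screen
SET_VIEWPOINT = 2  # small per-frame update; the screen renders the rest

def send_msg(sock, opcode, payload):
    # Framing (invented): [1-byte opcode][4-byte big-endian length][payload]
    sock.sendall(struct.pack(">BI", opcode, len(payload)) + payload)

sock = socket.create_connection(("smart-display.local", 3366))  # made-up address

with open("level1_assets.bin", "rb") as f:
    send_msg(sock, UPLOAD_ASSET, f.read())  # heavy, but only once

# After the preload, a viewpoint move is just 24 bytes of floats:
send_msg(sock, SET_VIEWPOINT,
         struct.pack(">6f", 0.0, 1.7, -4.0,   # eye x, y, z
                            0.0, 1.0,  0.0))  # look-at x, y, z
```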

The screen should also be able to generate 2D screen images, for OS GUI interfaces, etc. Except it should be able to texture those 2D GUI images onto surfaces in 3D scenes. Voilà... the basis for implementing a truly 3D OS GUI environment. At f-ing last.

Another benefit of this functional refactoring is that with the PC-to-screen data flow being TCP/IP, you could suddenly have any combination of multiple PCs and screens you liked, with screens showing duplicate or unique scenes. Or one screen easily used as an interface to many PCs. And if a screen's content doesn't change, there's no data traffic.

A mating product would be a broadcast tuner as a standalone TCP/IP peripheral, with control and A-V-out via the LAN. Then people could have just one, or enough to capture as many channels simultaneously as they wished. Or none, for those not interested in broadcast garbage.

However, I have no expectation that the consumer electronics industry could ever get such a concept working. They are the same ones who couldn't get HDMI to provide universal remote control capability for multiple devices, because they collectively have no imagination, nor the ability to construct flexible, extensible protocols. Also, for 'universal remote control' there's a method they'd absolutely have to adopt, since it's the only workable way to do it; but they never could, since it's fundamentally incompatible with traditional ideas of product styling as a sales feature.

And let's not talk about USB being master-slave, instead of peer to peer. Shudder.
« Last Edit: June 22, 2015, 02:24:00 am by TerraHertz »
Collecting old scopes, logic analyzers, and unfinished projects. http://everist.org
 

Offline Halcyon (Topic starter)

  • Global Moderator
  • *****
  • Posts: 5679
  • Country: au
Re: Sharp pulls the pin on all LCD televisions
« Reply #28 on: June 22, 2015, 05:40:56 am »
However, I have no expectation that the consumer electronics industry could ever get such a concept working. They are the same ones who couldn't get HDMI to provide universal remote control capability for multiple devices, because they collectively have no imagination, nor the ability to construct flexible, extensible protocols.

Precisely why I opt to use SDI for audio/video transfer where possible. It's a standard that has been around for a long time on professional and broadcast gear, is far superior to HDMI, and uses bog-standard coax cable.
« Last Edit: June 22, 2015, 06:03:43 am by Halcyon »
 

Offline rx8pilot

  • Super Contributor
  • ***
  • Posts: 3634
  • Country: us
  • If you want more money, be more valuable.
Re: Sharp pulls the pin on all LCD televisions
« Reply #29 on: June 22, 2015, 05:28:28 pm »
How is SDI far superior to HDMI in a consumer application? You have to convert at every point since consumer displays do not have SDI/HD-SDI.

I have been in the broadcast TV (engineering) business for 25-ish years and saw the birth of SDI. I still to this day have not seen any reason to bring it home.
Factory400 - the world's smallest factory. https://www.youtube.com/c/Factory400
 

Offline Halcyon (Topic starter)

  • Global Moderator
  • *****
  • Posts: 5679
  • Country: au
Re: Sharp pulls the pin on all LCD televisions
« Reply #30 on: June 22, 2015, 06:59:59 pm »
How is SDI far superior to HDMI in a consumer application? You have to convert at every point since consumer displays do not have SDI/HD-SDI.

I have been in the broadcast TV (engineering) business for 25-ish years and saw the birth of SDI. I still to this day have not seen any reason to bring it home.

The point I was making was that the technology and design are superior. I think it should have been adopted in consumer gear instead of HDMI, but that wasn't to be. Of course, if you have to convert, there isn't much point. Thankfully most of my gear (including my PC) has native SDI support.

I can recall numerous issues I've had with HDMI, including complete signal dropout for no apparent reason. I've had one issue with SDI, where a specific make/model of camera wouldn't trigger recording on a hard disk recorder (although I believe that was fixed later in firmware).
 
« Last Edit: June 22, 2015, 07:19:38 pm by Halcyon »
 

Offline NiHaoMike

  • Super Contributor
  • ***
  • Posts: 9018
  • Country: us
  • "Don't turn it on - Take it apart!"
    • Facebook Page
Re: Sharp pulls the pin on all LCD televisions
« Reply #31 on: June 23, 2015, 04:21:24 am »
What's the hottest market in the whole home computing field?
Top end 3D engines, for gaming.
So turn the 'TV' into a computer peripheral that is a self-contained 3D engine. Include an Ethernet port with 10/100/gigabit speed, and a protocol that lets a PC offload the actual 3D scene drawing to the display. The PC just sends the scene model, textures, viewpoint, etc. to the screen, and updates that dataset only when something changes. The engine in the screen does all the frame-generation grunt work.
Even gigabit is way too slow to connect a decent GPU. PCIe 3.0 x16 does about 128 Gbit/s. Don't forget the latency either, which is critical in many GPGPU applications.
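To put rough numbers on that gap - time to move a 2 GB asset set over each link, ignoring protocol overhead and latency (which only make Ethernet look worse):

```python
# Approximate usable rates, in Gbit/s.
links_gbps = {
    "Gigabit Ethernet": 1,
    "10GbE": 10,
    "PCIe 3.0 x16": 128,
}

ASSET_GB = 2  # e.g. a game's texture/mesh working set

for name, gbps in links_gbps.items():
    seconds = ASSET_GB * 8 / gbps
    print(f"{name:17s} -> {seconds:6.2f} s to move {ASSET_GB} GB")
```

Sixteen seconds over gigabit versus an eighth of a second over PCIe, every time the working set changes.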
Quote
A mating product would be a broadcast tuner as a standalone TCP/IP peripheral, with control and A-V-out via the LAN. Then people could have just one, or enough to capture as many channels simultaneously as they wished. Or none, for those not interested in broadcast garbage.
Already exists:
http://www.hdhomerun.com/
Cryptocurrency has taught me to love math and at the same time be baffled by it.

Cryptocurrency lesson 0: Altcoins and Bitcoin are not the same thing.
 

Offline Halcyon (Topic starter)

  • Global Moderator
  • *****
  • Posts: 5679
  • Country: au
Re: Sharp pulls the pin on all LCD televisions
« Reply #32 on: June 23, 2015, 04:39:30 am »
Already exists:
http://www.hdhomerun.com/

I have one of these and have never been able to get it working correctly, despite all the software updates etc. It used to work through their own software, but for some reason it has stopped. I gave up. I'm going to buy a professional unit instead.
 

Offline mtdoc

  • Super Contributor
  • ***
  • Posts: 3575
  • Country: us
Re: Sharp pulls the pin on all LCD televisions
« Reply #33 on: June 23, 2015, 06:02:50 am »
We've got a 52" Samsung "smart LCD TV" and it works great.

We have no broadcast TV, no cable or satellite. Our only "TV" is streaming Netflix (or occasionally Amazon video) over 1 Mb/s DSL. Prior to this TV, I spent years streaming Netflix from a Mac mini wired to a "dumb" flat screen. We got this Samsung a year and a half ago, and with Netflix and Amazon built in, things are much simpler. Yes, it has a web browser, which is a joke to use, but for streaming Netflix or Amazon it works great.
 

Offline Towger

  • Super Contributor
  • ***
  • Posts: 1645
  • Country: ie
Re: Sharp pulls the pin on all LCD televisions
« Reply #34 on: June 23, 2015, 06:53:37 am »
The problem is when YouTube, Netflix and Amazon change their APIs and you find yourself back to needing an external box.
The majority of early smart TVs no longer work with YouTube at all.
 

Offline rdl

  • Super Contributor
  • ***
  • Posts: 3667
  • Country: us
Re: Sharp pulls the pin on all LCD televisions
« Reply #35 on: June 23, 2015, 04:27:27 pm »
My TV is a 42" 1080p. It's connected to a mid-range gaming PC. The TV has no antenna or cable connected to it; everything it puts on screen comes through the PC. All I need from a TV is a decent picture. Funny thing: I remember John Carmack complaining a few years ago that the PC-to-TV-screen lag was worse than the internet-to-PC lag.
 

Offline AF6LJ

  • Supporter
  • ****
  • Posts: 2902
  • Country: us
Re: Sharp pulls the pin on all LCD televisions
« Reply #36 on: June 23, 2015, 04:37:28 pm »
... without a lot of the bullshit 'Smart' features that seem to be plaguing other brands.
 Absolutely no built-in cameras or microphones.

That's the spirit. But you forgot to mention the Internet connection.
It will be 'interesting' if all available TV brands end up with this big brother spyware cam & mic online crap, won't it? Never mind the excuses, like smart voice command or whatever.

Don't need it, don't want it, flatly and non-negotiably don't trust it and will not stand for it.
Personally, I don't even have any broadcast TV receiver/antenna. Just PCs and DVD & VHS players.

Ha ha... a phrase I came across recently, and like. "We're entering the hockey stick of evil."

Television here in the States is a waste of time and money, and an invasion of one's privacy. I have a twenty-year-old analogue TV that has seen so little use it still smells new. If I want to watch something new and modern, I'll do it from my computer: no microphone, no camera, and behind a decent logging firewall.

My roommate has a "Smart TV" which has its MAC address blocked in my firewall, and the Wi-Fi is only on when absolutely needed.
Sue AF6LJ
 

Offline NiHaoMike

  • Super Contributor
  • ***
  • Posts: 9018
  • Country: us
  • "Don't turn it on - Take it apart!"
    • Facebook Page
Re: Sharp pulls the pin on all LCD televisions
« Reply #37 on: June 24, 2015, 03:17:13 am »
I have one of these and have never been able to get it working correctly, despite all the software updates etc. It used to work through their own software, but for some reason it has stopped. I gave up. I'm going to buy a professional unit instead.
Mine works just fine for what I need it to do, though it does work best with wired Ethernet - Wi-Fi N doesn't work very well at all, and HomePlug works only if the signal is great. In my case, I have the HDHomeRun in a wiring closet with the antenna in the attic above, then run back to my PC using a pair of HomePlug adapters hacked to run on telephone wire - the best I could do in a rental property where I cannot rewire.
Cryptocurrency has taught me to love math and at the same time be baffled by it.

Cryptocurrency lesson 0: Altcoins and Bitcoin are not the same thing.
 

Offline TerraHertz

  • Super Contributor
  • ***
  • Posts: 3958
  • Country: au
  • Why shouldn't we question everything?
    • It's not really a Blog
Re: Sharp pulls the pin on all LCD televisions
« Reply #38 on: June 24, 2015, 05:44:23 am »
What's the hottest market in the whole home computing field?
Top end 3D engines, for gaming.
So turn the 'TV' into a computer peripheral that is a self-contained 3D engine. Include an Ethernet port with 10/100/gigabit speed, and a protocol that lets a PC offload the actual 3D scene drawing to the display. The PC just sends the scene model, textures, viewpoint, etc. to the screen, and updates that dataset only when something changes. The engine in the screen does all the frame-generation grunt work.
Even gigabit is way too slow to connect a decent GPU with. PCIe 3.0 x16 does 128Gbps. Don't forget the latency either, which is very critical in many GPGPU applications.

You misunderstand the structure I meant. I didn't mean to still have the CPU in the PC managing 3D graphics generation by the GPU. I meant split off the entire 3D engine - CPU, GPU, RAM for 3D model structures, texture maps and physics models, rendering memory - all into 'the screen'. The only connection to the PC would be used to upload the mesh, lighting and textures to the screen (predictively precached before they are encountered in scenes), plus a high-level scene-control command stream containing only things like viewpoint moves, etc.

The 'screen' would end up being the equivalent of a high-end PC core plus GPU. But it wouldn't need an OS like Windows. Just turn it on, and it listens for the scene-control protocol on the data link.
Splitting 3D graphics generation out of the PC and into the display devices would change the update cycles needed to keep pace with improving graphics generation. PCs could remain useful much longer, while screens could be replaced as resolution and the corresponding 3D GPU power advanced together.

Also, for fairly static 3D scene models, such as a 3D GUI for some future OS, the bitrate requirements between the 'screen' (3D engine) and the user-interface master system could be very low. Pre-load all the scene data into non-volatile memory in the screen, then just send info about changes. A PIC or something could handle running a full high-res interactive GUI, if the screen(s) were doing all the graphics frame-generation grunt work.
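As a sketch of how small that steady-state traffic could be (message layout invented for illustration): moving one window on a preloaded 3D desktop might cost 66 bytes a frame, a trickle even a small micro could generate.

```python
# Hypothetical delta message for a mostly-static, preloaded 3D GUI scene:
# a 2-byte object id plus a 4x4 transform of 32-bit floats.
import struct

def encode_delta(object_id, transform):
    # 2 + 16*4 = 66 bytes per moved object
    return struct.pack(">H16f", object_id, *transform)

# A window dragged across a 3D desktop: identity rotation,
# translated 0.5 units along x (column-major 4x4 matrix).
transform = [1.0, 0.0, 0.0, 0.0,
             0.0, 1.0, 0.0, 0.0,
             0.0, 0.0, 1.0, 0.0,
             0.5, 0.0, 0.0, 1.0]

frame_update = encode_delta(7, transform)
print(len(frame_update), "bytes this frame")  # 66 bytes; ~4 KB/s at 60 fps
```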
Collecting old scopes, logic analyzers, and unfinished projects. http://everist.org
 

Offline Rmx77

  • Contributor
  • Posts: 10
Re: Sharp pulls the pin on all LCD televisions
« Reply #39 on: June 24, 2015, 10:39:29 am »
That's a shame. I recall there's a company that is pulling (or has pulled) the plug on plasma TVs because they don't sell well, and the reason they don't sell is the burn-in and the expensive bulbs. LED TVs are just LCD panels with an LED backlight. I think the drop in LCD is also down to LED technology, and to everyone trying to go to 4K.

I myself don't even own an LCD or HD TV of any sort, so I am stuck with standard-def TV, which is slowly going away, since most electronics manufacturers are going HDMI-output-only and most old TVs don't have HDMI ports. That cripples the old sets unless you want to search for (or build) an HDMI-to-component or HDMI-to-composite converter, and most of the converters I have found convert component or composite to HDMI, not the reverse. I do feel that CRT TVs outlast LCD and LED TVs any day though, because they are built better and are a little easier to work on than an LCD or LED TV.
 

Offline Halcyon (Topic starter)

  • Global Moderator
  • *****
  • Posts: 5679
  • Country: au
Re: Sharp pulls the pin on all LCD televisions
« Reply #40 on: June 24, 2015, 11:07:34 am »

That's what current graphics cards do. It's why they have gigabytes of RAM. In fact it's been that way for a long time now - over a decade, IIRC.

I think it was around 2000 that we started to get Transform and Lighting (T&L) hardware support. Upload the mesh to the graphics card, upload a transformation and lighting matrix, and let it do the calculations. Then came full vertex shaders, which basically replaced the fixed T&L functionality with a programmable system that allowed much more than just basic transformations. Now GPUs do most of the heavy lifting: tessellation, interpolation, curve generation, particle effects, etc. On top of that you have pixel shaders which, as well as making things like texturing fully programmable, allow for post-processing effects, more detailed lighting, shadowing, etc.

The PC uploads meshes and textures to the card, sends it vertex and pixel shaders, and then only does high-level control. Having a high-bandwidth, low-latency link does help, but not as much as you might expect. People find that the performance of some cards barely drops when moved from an x16 PCIe slot to an x4 slot, as long as the card has plenty of memory to avoid swapping stuff in and out all the time. It's why Thunderbolt works reasonably well for external GPUs, despite the higher latency.

You make some good observations here, but I can see some "off-board" processing occurring inside displays one day; after all, you can fit so much more PCB into the back of a monitor/TV chassis than into a PC slot. Perhaps fibre optics would be a good link between PC and monitor in this instance?
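To put rough numbers on the quoted x16-versus-x4 point (a back-of-envelope sketch; the per-frame traffic figure is a guess, but the right order of magnitude once assets are resident in VRAM):

```python
# Approximate usable PCIe 3.0 rates, in Gbit/s.
slots_gbps = {"x16": 128, "x4": 32}

PER_FRAME_MB = 2   # draw calls, uniforms, small dynamic buffers (assumed)
FPS = 144

needed_gbps = PER_FRAME_MB * 8 * FPS / 1000  # ~2.3 Gbit/s of command traffic

for slot, gbps in slots_gbps.items():
    print(f"PCIe 3.0 {slot}: {needed_gbps / gbps:.1%} of the link used")
```

Either slot has headroom to spare, which matches the observation that performance barely drops - until the working set no longer fits in VRAM and textures start streaming over the bus.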
 

Offline TerraHertz

  • Super Contributor
  • ***
  • Posts: 3958
  • Country: au
  • Why shouldn't we question everything?
    • It's not really a Blog
Re: Sharp pulls the pin on all LCD televisions
« Reply #41 on: June 24, 2015, 04:23:21 pm »
That's what current graphics cards do. It's why they have gigabytes of RAM. In fact it's been that way for a long time now - over a decade, IIRC.

I think it was around 2000 that we started to get Transform and Lighting (T&L) hardware support. Upload the mesh to the graphics card, upload a transformation and lighting matrix, and let it do the calculations. Then came full vertex shaders, which basically replaced the fixed T&L functionality with a programmable system that allowed much more than just basic transformations. Now GPUs do most of the heavy lifting: tessellation, interpolation, curve generation, particle effects, etc. On top of that you have pixel shaders which, as well as making things like texturing fully programmable, allow for post-processing effects, more detailed lighting, shadowing, etc.

The PC uploads meshes and textures to the card, sends it vertex and pixel shaders, and then only does high-level control. Having a high-bandwidth, low-latency link does help, but not as much as you might expect. People find that the performance of some cards barely drops when moved from an x16 PCIe slot to an x4 slot, as long as the card has plenty of memory to avoid swapping stuff in and out all the time. It's why Thunderbolt works reasonably well for external GPUs, despite the higher latency.

I know all this. Did I say anything that suggested I didn't? And you're making my own point: the 3D engine can be made a module independent of the PC, because at a higher level of scene control there doesn't have to be huge bandwidth over the control channel. And so, since 3D engine capability and screen resolution rise more or less in proportion, why not put the graphics engine in the screen? Along with a CPU capable of managing it, and enough memory to hold entire game scenery datasets.

One obstacle is that the software drivers for current 3D engines are only available as binary blobs targeted at Windows, with Linux driver releases lagging behind. The effect is that one can't do 3D in embedded systems (or experimental OSs) using current graphics cards. That was the saddest thing about nVidia buying out 3Dfx: 3Dfx made full driver source code available, and it could be ported to anything. It was plain C, with only a few tiny bits of ASM and not much OS dependency. nVidia, so far as I know, has never released full source code, except maybe to console manufacturers.

Perhaps TV manufacturers would have the leverage to get driver source code out of nVidia.
But I have a feeling Microsoft likes the status quo, in which Windows is pretty much a requirement for 3D PC gaming, thanks to nVidia's Windows-specific binary-blob drivers. So Microsoft may incentivise nVidia to keep it that way, if they aren't already.
« Last Edit: June 24, 2015, 04:24:56 pm by TerraHertz »
Collecting old scopes, logic analyzers, and unfinished projects. http://everist.org
 

Offline Halcyon (Topic starter)

  • Global Moderator
  • *****
  • Posts: 5679
  • Country: au
Re: Sharp pulls the pin on all LCD televisions
« Reply #42 on: June 25, 2015, 10:52:07 am »
« Last Edit: June 25, 2015, 10:54:59 am by Halcyon »
 

Online tom66

  • Super Contributor
  • ***
  • Posts: 6707
  • Country: gb
  • Electronics Hobbyist & FPGA/Embedded Systems EE
Re: Sharp pulls the pin on all LCD televisions
« Reply #43 on: June 25, 2015, 12:37:20 pm »
The 'screen' would end up being the equivalent of a high-end PC core plus GPU. But it wouldn't need an OS like Windows. Just turn it on, and it listens for the scene-control protocol on the data link.
Splitting 3D graphics generation out of the PC and into the display devices would change the update cycles needed to keep pace with improving graphics generation. PCs could remain useful much longer, while screens could be replaced as resolution and the corresponding 3D GPU power advanced together.

This would just make monitors obsolete more quickly.
I can upgrade my processor, GPU, etc. easily. PCIe has stayed relevant for some time and is likely to remain dominant for at least the next 2-3 years before it is replaced.
So there is no need for me to replace the PC, as I can upgrade parts individually, as needed...

Whereas your idea would make monitors obsolete, and I'd wager that monitors actually require more resources to make, what with having an LCD panel, backlights, etc. You would throw away the whole monitor because it could not be upgraded, like an iMac.
 

Offline rx8pilot

  • Super Contributor
  • ***
  • Posts: 3634
  • Country: us
  • If you want more money, be more valuable.
Re: Sharp pulls the pin on all LCD televisions
« Reply #44 on: June 25, 2015, 05:24:20 pm »
Agreed. The display should be the bare-bones necessities: take an HDMI or MM fiber signal and display the pixels. Fast-changing technologies like GPUs, CPUs, and software should live somewhere they can be altered and/or upgraded individually. You will never get trapped. If Hulu changed its API or whatever, just download the new software to connect and decode.
Factory400 - the world's smallest factory. https://www.youtube.com/c/Factory400
 

