Samsung can now remotely brick your TV
AaronLee:

--- Quote from: tom66 on September 09, 2021, 09:50:21 am ---It's cheaper to buy 4K TVs nowadays than 4K monitors.
I personally have 2 x 28" 4K monitors, and they're great.  But they cost £400 in total new (which was very cheap; they are now more expensive due to the silicon shortage, last I checked).
A 48" 4K TV would cost about the same and offers more screen area, although fewer total pixels.

I know a 3D CAD mechanical designer who has a 39" 4K TV as his main desktop display and has it partitioned into 2/3 displays as needed.

--- End quote ---

Well, maybe I've just looked at the wrong models, because back when 4K was still too expensive, I benchmarked several 2K monitors against 2K TV sets, and all the TV sets seemed to be doing some sort of scaling, so they weren't using the native panel resolution. Everything looked fuzzy on the TVs compared to real monitors. At that time, the ports on the TVs, and their specs, were also much worse than on typical monitors. For example, do 4K TVs currently typically support DisplayPort? Even if I need to spend double, it's worth it to me to get a good-quality display that renders computer text and graphics clearly and causes less eye strain. I'm looking at my monitor for most of the day, and a high-quality display is one of my top priorities for staying productive.
tom66:
4K TVs don't usually have DisplayPort.  But every one I have used supports 1:1 pixel-perfect rendering, so it's never fuzzy.  I don't know what models you were using; even my 2012 Panasonic FHD plasma TV supports 1:1.  It's been a common feature since HDMI was first a thing.

But Dual-Mode DisplayPort (DP++) supports passive HDMI adapters.  My monitors only have one DisplayPort and two HDMI inputs (one 1.4, one 2.0).  I can get 60 Hz into them via either the DisplayPort or the HDMI 2.0 port (HDMI 1.4 maxes out at 30 Hz at 4K, so it's unusable for a desktop).  Since I use my work laptop with them as well as my desktop, I have passive cables for the HDMI ports that take my graphics card's DP output and route it to the HDMI input, and I use regular DP for the work laptop.  The GPU in my desktop recognises this and switches to HDMI 2.0 mode; it's a standard DisplayPort feature.
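
If you want to double-check what native mode a display is actually reporting to the PC, you can read it straight out of the EDID. Here's a minimal Python sketch, assuming a Linux box that exposes raw EDID blobs under /sys/class/drm (the path layout is an assumption; adapt to your setup):

--- Code: ---
# Print the native (preferred) resolution each connected display reports,
# read from the first Detailed Timing Descriptor of its EDID block.
# Assumes Linux, which exposes raw EDID at /sys/class/drm/<connector>/edid.
from pathlib import Path

EDID_HEADER = bytes.fromhex("00ffffffffffff00")

def native_mode(edid: bytes):
    d = edid[54:72]  # first Detailed Timing Descriptor (the preferred mode)
    clock_khz = int.from_bytes(d[0:2], "little") * 10  # stored in 10 kHz units
    if clock_khz == 0:
        return None  # not a timing descriptor
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    return h_active, v_active, clock_khz

for path in Path("/sys/class/drm").glob("card*/edid"):
    blob = path.read_bytes()
    if len(blob) < 128 or blob[:8] != EDID_HEADER:
        continue  # connector with nothing attached
    mode = native_mode(blob)
    if mode:
        w, h, clk = mode
        print(f"{path.parent.name}: native {w}x{h} @ pixel clock {clk/1000:.1f} MHz")
--- End code ---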
Berni:

--- Quote from: AaronLee on September 09, 2021, 10:08:44 am ---
--- Quote from: tom66 on September 09, 2021, 09:50:21 am ---It's cheaper to buy 4K TVs nowadays than 4K monitors.
I personally have 2 x 28" 4K monitors, and they're great.  But they cost £400 in total new (which was very cheap; they are now more expensive due to the silicon shortage, last I checked).
A 48" 4K TV would cost about the same and offers more screen area, although fewer total pixels.

I know a 3D CAD mechanical designer who has a 39" 4K TV as his main desktop display and has it partitioned into 2/3 displays as needed.

--- End quote ---
Well, maybe I've just looked at the wrong models, because back when 4K was still too expensive, I benchmarked several 2K monitors against 2K TV sets, and all the TV sets seemed to be doing some sort of scaling, so they weren't using the native panel resolution. Everything looked fuzzy on the TVs compared to real monitors. At that time, the ports on the TVs, and their specs, were also much worse than on typical monitors. For example, do 4K TVs currently typically support DisplayPort? Even if I need to spend double, it's worth it to me to get a good-quality display that renders computer text and graphics clearly and causes less eye strain. I'm looking at my monitor for most of the day, and a high-quality display is one of my top priorities for staying productive.

--- End quote ---

You probably had problems with image 'enhancement' features in the TV.

I run into this most times I try to feed an HDMI signal into a TV from a PC. The picture looks great when playing a movie, but when I show a Windows desktop on it the picture looks horrible, as if I were connecting to the TV over a 20 m VGA cable. The TV's processing takes a pixel-perfect image from HDMI and tries to make it 'more perfect'.

If you are lucky, there might be a single menu option that turns off all processing on that input. If you are slightly less lucky, you might have to set five different toggles and sliders to zero or off. With a lot of TVs, though, I found that the sliders are actually 'off' somewhere in the middle of their range. In that case I put test patterns on the screen and adjust the sliders until the patterns look correct. Once you find the right settings you get a nice 1:1 image, like on a desktop monitor.

The test patterns I like to use for this are here: http://www.lagom.nl/lcd-test/
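
If you'd rather not rely on a website, a one-pixel checkerboard at the panel's native resolution makes any scaling or sharpening obvious at a glance: with true 1:1 mapping it looks like flat grey, while any processing turns it into moiré or ringing. A quick sketch using Python and Pillow (the 3840x2160 size is an assumption; set it to your panel's native resolution):

--- Code: ---
# Generate a one-pixel checkerboard at the panel's native resolution.
# Shown fullscreen at 100% zoom with true 1:1 pixel mapping it reads as
# uniform grey; any TV-side scaling or sharpening produces moire/ringing.
from PIL import Image

WIDTH, HEIGHT = 3840, 2160  # assumption: change to your panel's native size

img = Image.new("L", (WIDTH, HEIGHT))
img.putdata([255 * ((x + y) & 1) for y in range(HEIGHT) for x in range(WIDTH)])
img.save("checkerboard.png")
--- End code ---

Open it fullscreen at 100% zoom; if it doesn't look like smooth grey from across the room, something in the chain is still resampling.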

EDIT:
Oh, and the video upscalers in TVs have also gotten a lot better these days. So feeding a 1080p image into a 4K TV upscales pretty nicely, with sharp text, rather than coming out fuzzy or blocky.
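
You can get a rough feel for the blocky-vs-fuzzy trade-off on the PC side too. A small sketch with Python and Pillow (the input filename is a placeholder for any 1920x1080 screenshot; needs Pillow 9.1+ for Image.Resampling):

--- Code: ---
# Upscale a 1080p screenshot to 4K two ways: nearest-neighbour keeps text
# edges hard but blocky, Lanczos is smooth but slightly soft. TV upscalers
# land somewhere between these extremes.
from PIL import Image

src = Image.open("screenshot_1080p.png")  # placeholder: any 1920x1080 image
for name, method in [("nearest", Image.Resampling.NEAREST),
                     ("lanczos", Image.Resampling.LANCZOS)]:
    src.resize((3840, 2160), resample=method).save(f"upscaled_{name}.png")
--- End code ---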
AaronLee:
Well, when I'm next in the market for a monitor, I'll have to revisit the current state of TVs and see if I can find a suitable one that won't process the computer image and make it worse.

As for anyone who doesn't want a smart TV, shouldn't it be as simple as never connecting the TV to a network, or skipping the connectivity setup, to make it behave like a normal non-smart TV? If smart TVs are the new norm, then, as with computer monitors, non-smart TVs will become less popular and could even end up more expensive than smart TVs. With the last monitor I bought, I discovered that curved-screen monitors were significantly cheaper than flat screens, simply because most of the monitor market these days is gamers. I'm not one, and I find curved-screen monitors extremely annoying, to the point of being almost unusable.
Bud:
It was not clear to what extent Samsung's kill switch works. Maybe it disables the entire thing so you can't even use it as a monitor.