Author Topic: Which 27" Dell monitor to choose - 2K or 4K (not for gaming, but for work and..)  (Read 2696 times)


Offline Rinnake (Topic starter)

  • Contributor
  • Posts: 16
  • Country: cz
Hello.

I can't decide on a 27" monitor from Dell. I'm choosing between these two models:

1. P2720DC
2. S2721QS

I will use the monitor with a ThinkPad E14 (Gen 2) laptop that supports USB-C charging / data transfer / display output.
When I compare the two monitors, I see that the "P" model has 2K resolution, belongs to Dell's professional series, has a USB hub and supports USB-C, which is very convenient for my laptop: I will have everything connected with a single cable. The "S" model is from Dell's consumer line (lower build and panel quality?), but its biggest advantage is the 4K resolution.
I will use the monitor for work at home (school), but I would also like to watch movies/YouTube sometimes. I won't play any games on it.

What do you think is the better choice? Is 4K resolution at 27" really better than 2K? Is it worth it if the monitor has 4K resolution but no USB hub or USB-C? What is your experience with this? Which is more advantageous? If I had a desktop PC, would 4K probably be the better choice? But with a laptop, is USB-C priceless? How do you see it?

Thank you.
 

Offline bd139

  • Super Contributor
  • ***
  • Posts: 23096
  • Country: gb
Assuming you are using Windows?

I run a 4K monitor at 27" (Iiyama PROLITE XUB2792UHSU-B1), which is excellent, but there are some trade-offs to be aware of:

1. You will need to use fractional scaling in Windows to get a display that isn't too large and isn't too small. I run at 150% scaling, which is about right; it's unusable at both 100% and 200% (see the quick calculation after this list).
2. Some apps don't like fractional scaling and will have terrible icons or a blurry UI on Windows. This is improving rapidly, though; 90%+ of applications work fine.
3. At no point will this ever work acceptably with macOS or Linux if you move in that direction. Really you need a 5K 27" display running at 200% scaling (an integer scaling factor) for those platforms to work.
4. If your E14 has integrated graphics, dual-display mode may push it to the limit. Don't run it in dual-display mode or it'll lag terribly. Also beware that your display scaling is probably better at 125% on the E14 and 150% on the 4K display, which causes some logistical problems when switching displays.
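
Rough numbers behind point 1, in case it helps: a quick back-of-the-envelope Python calculation (purely illustrative arithmetic, not measurements of any particular panel) of the effective workspace and apparent pixel density you get from a 27" 3840x2160 screen at common Windows scale factors:

[code]
# Effective workspace and apparent density of a 27" 4K panel at common
# Windows scale factors. Illustrative arithmetic only.
import math

diag_in = 27.0
w_px, h_px = 3840, 2160

ppi = math.hypot(w_px, h_px) / diag_in          # physical density, ~163 PPI

for scale in (1.00, 1.25, 1.50, 2.00):
    eff_w, eff_h = w_px / scale, h_px / scale   # workspace you actually get
    apparent_ppi = ppi / scale                  # how large UI elements look
    print(f"{int(scale * 100):>3}%: workspace {eff_w:.0f}x{eff_h:.0f}, "
          f"apparent density ~{apparent_ppi:.0f} PPI")
[/code]

At 150% that works out to a 2560x1440 workspace at roughly 109 apparent PPI, i.e. the same layout as a 27" 1440p monitor but rendered with more pixels, which is why it lands in the comfortable zone; 100% gives a tiny ~163 PPI desktop and 200% throws away half the workspace.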

Worth looking at the Lenovo ThinkVision P27u as well. That has a 27" 4K IPS panel and a built-in USB-C charger. I'd have bought that instead of the Iiyama or Dell if it had existed at the time. But it isn't cheap!

It's a game changer for me though (work / software development). If I play games on it, which I do, then I run them at 1920x1080 and put up with the monitor scaling it up for me, as my GPU (NVIDIA GTX 1660) can't cope  :-DD
« Last Edit: November 12, 2020, 12:14:34 pm by bd139 »
 

Offline capt bullshot

  • Super Contributor
  • ***
  • Posts: 3033
  • Country: de
    • Mostly useless stuff, but nice to have: wunderkis.de
I've got a Dell S2817Q (4K) here; it's a dumpster find with an issue, but in general it works.

I've tried it with some Dell laptops; they can drive a 4K / 60 Hz picture. Anyway, I couldn't find any useful advantage over my dual-screen setup (2x 1600x1200 pixels). The pixels of this 4K monitor are just too small to give me any better view or image sharpness, and the CPU / GPU load gets high when playing videos. I had one edge case (viewing a scanned schematic with a very large page size) that was on the plus side for the 4K monitor.
So I decided to stay with my dual 1600x1200 monitor setup, which serves me quite well.
If you go for a monitor that is 1920 pixels wide, look for one with 1200 pixels in height. Full HD (1920 x 1080) is no good for computer work IMO.
« Last Edit: November 12, 2020, 12:29:13 pm by capt bullshot »
Safety devices hinder evolution
 

Offline ace1903

  • Regular Contributor
  • *
  • Posts: 240
  • Country: mk
I have a 27" 4K display from Dell. Most of the time, when used with a laptop, the display is underutilized due to scaling.
I have a cheap Lenovo laptop, and the 4K monitor slows down the whole system, because even a black-and-white PDF document needs extra time to render at 4K resolution.
Even scrolling down a web page slows down other tasks. I also use that 4K monitor when working from home for software development with my mid-range PC, and there it is perfect.
If the intended use is with a laptop only, I would recommend a 2K display with all the bells and whistles.

 

Offline bd139

  • Super Contributor
  • ***
  • Posts: 23096
  • Country: gb
27" 1440p is a good compromise as well.
 

Offline Rinnake (Topic starter)

  • Contributor
  • Posts: 16
  • Country: cz
And what do you say about the "S3221QS" model?
 

Offline bd139

  • Super Contributor
  • ***
  • Posts: 23096
  • Country: gb
That'd be fine at 100% scaling. However, I find curved monitors weird to use. It's down to personal preference. The world looks a bit weird for a couple of minutes when you walk away from them.
 

Offline Rinnake (Topic starter)

  • Contributor
  • Posts: 16
  • Country: cz
Yes, and it also has a VA panel, which is a pity. So maybe the P2720DC (or just the P2720D?) will be the best option, I hope...
« Last Edit: November 12, 2020, 03:14:41 pm by Rinnake »
 

Offline Fred27

  • Supporter
  • ****
  • Posts: 727
  • Country: gb
    • Fred's blog
I recently upgraded to a couple of 27" monitors. (My setup of choice is one portrait and one landscape, and I mostly do software development.) I decided to go for 2K (2560 x 1440) because I found that when I used a 4K monitor I had to scale things up to make them readable and usable. It seemed daft to pay more for a resolution I was then not really making use of. I went for a couple of Dell U2719D monitors, which I'm very happy with. DisplayPort daisy-chaining means I only need one DisplayPort output on my PC for both monitors.
 

Online Doctorandus_P

  • Super Contributor
  • ***
  • Posts: 3847
  • Country: nl
I agree with (most of) the others here.
4K does not make much sense for such a small monitor. What good are those pixels if you can't see them? The smallest pixels that are still useful are around 0.2 mm.
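
To put a number on that, here is a quick pixel-pitch calculation (my own arithmetic, using the 0.2 mm rule of thumb above) for a 27" panel at the two resolutions being discussed:

[code]
# Pixel pitch of a 27" panel at 1440p and 4K, versus the ~0.2 mm
# "smallest useful pixel" rule of thumb. Illustrative arithmetic only.
import math

def pitch_mm(diag_in, w_px, h_px):
    ppi = math.hypot(w_px, h_px) / diag_in
    return 25.4 / ppi                                        # mm per pixel

print(f'27" 2560x1440: {pitch_mm(27, 2560, 1440):.3f} mm')   # ~0.233 mm
print(f'27" 3840x2160: {pitch_mm(27, 3840, 2160):.3f} mm')   # ~0.155 mm
[/code]

So a 27" 1440p panel sits just above the 0.2 mm threshold, while 27" 4K is well below it.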
 

Offline IDEngineer

  • Super Contributor
  • ***
  • Posts: 1941
  • Country: us
My personal experience is that, except for a few specialty applications (very high-end CAD, detailed photo/video editing), screen real estate is vastly more important than increased resolution. A 4K video environment is actually a disadvantage if you're having to scale up icons and text to make them readable. At that point you're consuming processing cycles (whether hardware or software) to render pixels that you then de-rez into meaninglessness with the upscale. What's the point?

When I last upgraded my video environment, I ended up choosing two 32-inch DisplayPort monitors with 2K resolution. It's the perfect balance of sharpness and screen real estate, and I can run everything at native "zoom" scale, so I'm not just grinding out lots of duplicate pixels for no delivered benefit. One screen is driven by the i7's on-chip video; the other is driven by an NVIDIA graphics card. The reason for the graphics card is to gain its hardware acceleration via CUDA for Premiere Pro and Photoshop; I don't play video games, so my sole consideration was its CUDA core count.

Somewhat related: several of my partners on our current project are using SolidWorks for the mechanical side of things, and they are encountering slowdowns with files containing lots of objects. I just researched ways to speed up SolidWorks and discovered that, despite this being 2020 and SolidWorks being the leading 3D modeling package out there:

* SolidWorks does not take advantage of GPUs for calculations

* SolidWorks does not take advantage of GPUs for graphics rendering

* SolidWorks is single-threaded, so it gains absolutely nothing from multi-core processors

I mean honestly... a seriously repetitive, calculation-intensive application that doesn't leverage GPUs? And, even worse, it flat-out ignores the only real CPU throughput improvement mechanism of the last decade? It's not like clock speeds are increasing by orders of magnitude anymore... multiple cores and multithreading are the single best and fastest way to leverage today's CPUs, yet SolidWorks is locked into a DOS-like single-threaded mentality. Sheesh... I was writing fully multithreaded Win32 apps over 20 years ago. This ain't exactly breaking news. Grrrrr.
« Last Edit: November 25, 2020, 07:56:13 pm by IDEngineer »
 

Offline grumpydoc

  • Super Contributor
  • ***
  • Posts: 2906
  • Country: gb
27" 1440p is a good compromise as well.

+1 for that. I'd also suggest looking at 21:9 if you have the space - I have a BenQ EX3501R, about the same height as a 27" but with way more screen area (3440x1440).
 

Offline olkipukki

  • Frequent Contributor
  • **
  • Posts: 790
  • Country: 00
A 4K video environment is actually a disadvantage if you're having to scale up icons and text to make them readable.
In my experience, Windows scaling is crap, but on macOS it works fine.

* SolidWorks does not take advantage of GPUs for calculations
* SolidWorks does not take advantage of GPUs for graphics rendering
Do you mean SolidWorks completely ignores your Quadro (or Radeon Pro)?
 

Offline brucehoult

  • Super Contributor
  • ***
  • Posts: 4488
  • Country: nz
1. You will need to use fractional scaling in Windows to get a display that isn't too large and isn't too small. I run at 150% scaling, which is about right; it's unusable at both 100% and 200%.
2. Some apps don't like fractional scaling and will have terrible icons or a blurry UI on Windows. This is improving rapidly, though; 90%+ of applications work fine.
3. At no point will this ever work acceptably with macOS or Linux if you move in that direction. Really you need a 5K 27" display running at 200% scaling (an integer scaling factor) for those platforms to work.

No idea about Windows, but I run Linux on a 32" 4K monitor (I have both Philips and Samsung, both around the US$400 mark when I got them a couple of years ago).

I don't "scale" the UI. The UI stays 1:1. I set particular applications to use a slightly larger font than I would on a 2K monitor, e.g. GNOME Terminal is set to use 11-point Ubuntu Mono Regular, whereas I'd use 9 point on any lower-resolution monitor. Emacs is likewise set to 11 pt Ubuntu Mono Regular.

For web pages I just hit Ctrl-+ until they look good. Some I leave at 100%, some at 110%, some at 125%. That doesn't scale the text; it chooses an appropriate font at the actual size. I guess photos do get scaled.

PDFs I have set to fit the page to the window, non-continuous. I have no idea what scaling that is. It renders the text at the appropriate final size, no scaling.

Nothing else gets adjusted. It's fine in 4K at 32" for these 58-year-old eyes. 27" might be a bit too small though; that's about 20% higher pixel density. I guess I'd be setting the terminal and Emacs up to 13 or 14 pt in that case. I dunno.
 

Offline IDEngineer

  • Super Contributor
  • ***
  • Posts: 1941
  • Country: us
* SolidWorks does not take advantage of GPUs for calculations
* SolidWorks does not take advantage of GPUs for graphics rendering
Do you mean SolidWorks completely ignores your Quadro (or Radeon Pro)?
That is my understanding, from both my own research and comments from extremely seasoned SolidWorks experts whom I consulted. It obviously uses the card to display graphics, but performance is no better than with the standard motherboard chipset video hardware.

The part that angers me is that they don't leverage CUDA to accelerate calculations. When our folks move an object to check for interferences and the like, that is the poster child for massively parallel nearly identical calculations - the exact use case CUDA was designed to accelerate at the hardware level. Yet SolidWorks doesn't take advantage of it.

What's even MORE aggravating is that they use one, and only one, core even if your CPU has 7-11 more cores just sitting there. Again, that's low-hanging fruit. Even if they ignore the opportunity to accelerate via GPU, their software's performance would skyrocket if they would just distribute the task across a few more threads so the OS could schedule them across all that idle hardware. I still almost don't believe it, except that every reference I check, and every expert I ask, confirms it.
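
To illustrate what that distribution would look like (this is not SolidWorks code or its API, just a hypothetical Python toy), an interference check over bounding boxes is embarrassingly parallel: every pair test is independent, so the candidate pairs can simply be split across worker processes:

[code]
# Toy sketch of why interference checking parallelizes well: each pair test
# is independent, so the pairs can be split across all available cores.
# Hypothetical example only - not SolidWorks code or its API.
from concurrent.futures import ProcessPoolExecutor
from itertools import combinations
import random

def aabb_overlap(a, b):
    # a, b: ((xmin, ymin, zmin), (xmax, ymax, zmax)) axis-aligned boxes
    return all(a[0][i] <= b[1][i] and b[0][i] <= a[1][i] for i in range(3))

def check_chunk(pairs):
    # Each worker tests its own chunk of candidate pairs independently.
    return [p for p in pairs if aabb_overlap(*p)]

if __name__ == "__main__":
    random.seed(0)
    boxes = []
    for _ in range(500):                        # 500 dummy parts
        x, y, z = (random.uniform(0, 100) for _ in range(3))
        boxes.append(((x, y, z), (x + 2, y + 2, z + 2)))

    pairs = list(combinations(boxes, 2))        # ~125k independent tests
    n_workers = 8
    chunks = [pairs[i::n_workers] for i in range(n_workers)]

    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        hits = [hit for chunk in pool.map(check_chunk, chunks) for hit in chunk]

    print(f"{len(hits)} potential interferences out of {len(pairs)} pairs")
[/code]

The per-pair cost is identical and the data is read-only, which is exactly the shape of workload that spreads across cores (or a GPU) with almost no effort.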
 

