> I guess I just don't see the point unless you're gaming, watching 4k content, or doing graphic design.

I have to say you have it exactly backwards!

Gaming at 4K is extremely expensive and still basically impossible at refresh rates above 60 Hz. In fact, you’ll be lucky to get even that much. 1440p/144 Hz is a much better and more realistic target for even the most enthusiastic gamers.

Also, a most welcome recent trend has been for games to ship with novel temporal antialiasing techniques, which completely redefine how games can look at lower resolutions.

Temporal aliasing artifacts (shimmering, crawling edges) have always been the bane of 1080p, forcing higher resolutions either directly or indirectly via supersampling. Once you take that out of the equation, the benefit of native 4K is much more modest.

4K movies are nice, but as with games, it’s more of an incremental improvement. I doubt most people could even tell the difference in a blind test.

Full-range HDR is, in my opinion, a much better investment if you want to improve your TV viewing experience (and lately gaming as well) in a noticeable way.

I don’t know much about graphic design, but I doubt 4K is all that essential. People have been using 96 dpi displays to create content for very high-density media for a long time. Even the most craptastic inkjet printer is 300 dpi+. All you need is a zoom function. Color reproduction is, I think, much more important than resolution.
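Just to put rough numbers on that, here’s a back-of-the-envelope sketch (plain Python; the only inputs are the standard A4 paper size and common print densities):

    # Pixels needed to cover an A4 page (8.27 x 11.69 in) at common densities,
    # compared with the ~96 dpi a desktop display shows at 100% zoom.
    for dpi in (96, 300, 600):
        w, h = round(8.27 * dpi), round(11.69 * dpi)
        print(f"{dpi:>3} dpi: {w} x {h} px  (~{w * h / 1e6:.1f} MP)")

Even a 300 dpi page works out to roughly 8.7 megapixels, i.e. more than a 4K display can show at once anyway, which is exactly why zooming and panning has always been part of the workflow.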

Where HiDPI displays really shine is actually in the most mundane: font rendering.

For anyone who works in the medium of text (programmers, writers, publishers, etc.), a 4K display will be a considerable and noticeable quality-of-life improvement.

Even the most Unix-neckbeardy terminal dwellers will appreciate the simply amazing improvement in visual fidelity and clarity of text on screen[1].

> I tested out a 4k 28" and unless I had it at 200% scaling couldn't use it for longer periods of time.

That’s what you are supposed to do! :)

It’s only HiDPI at 200% scaling. Otherwise it’s just plain 2160p, or whatever the effective resolution works out to for some fractional scaling value.

For 4K at 100% scaling you’d need something like a 45" screen at minimum, but that’s not actually practical once you consider the optimal viewing distance for such a screen, especially with a 16:9 ratio.
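To put numbers on that, here’s a quick sketch (Python; it’s just geometry, nothing vendor-specific assumed):

    import math

    def ppi(w_px, h_px, diagonal_in):
        # pixel density of a panel with the given resolution and diagonal
        return math.hypot(w_px, h_px) / diagonal_in

    for size in (24, 28, 32, 46):
        print(f'{size}" 4K panel: {ppi(3840, 2160, size):.0f} ppi')

    # At 200% scaling a 3840x2160 desktop lays out like 1920x1080 (~96 dpi class),
    # just rendered with 4x the pixels.
    print("diagonal needed for ~96 ppi at 100%:",
          round(math.hypot(3840, 2160) / 96, 1), "inches")

That last line is where the 45-inch-ish figure comes from: a 4K panel only falls back to the classic ~96 ppi density at around a 46" diagonal.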

> I get more screen estate and still readable text with the 25" 1440p.

A 4K display should only provide extra space indirectly. With text on the screen looking so much sharper and more readable, it might be possible to comfortably read smaller font sizes than on an equivalent 96 dpi display.

If you need extra space as well, then that’s what 5K is for.

Though for things like technical drawings or detailed maps you can actually use all of the extra ~6 million pixels (versus 1080p) to show more information on the screen.

A single-pixel–width hairline is still thick enough to be clearly visible on a HiDPI display[2].

> but now you have to render 2.4x (Vs a 2560x1440 monitor) the amount of information for what I think is fairly little gain.

Yes, that’s an issue with things like games. However, you can still display 1080p content on a 4K screen, and it looks just as good[3], often even better[4].

Most graphics software will also work with 1080p bitmaps just fine. Vector graphics requires a little extra work, but for a very good payoff.

Overall though, for things like programming or web browsing, it shouldn’t matter. I have a netbook with a cheap Atom SoC (Apollo Lake) and it can handle 3K without breaking a sweat. The much more capable[5] GPU in your Surface Pro should easily handle even multiple 4K displays.

Pushing some extra pixels is not a big deal, if all you’re doing is running a desktop compositor with simple effects.

> going from a docked to undocked state (or vice versa) with monitors that didn't match the same scaling as the surface resulted in graphical issues that can only be resolved by logging out then in again.

Yeah, that must suck. Still, it’s only a software bug, and you shouldn’t let it keep you from evaluating HiDPI on its merits.

> Also still come across apps that don't know how to scale so that can be really frustrating.

That’s life on the bleeding edge ;)

Sure, it’s annoying, but the situation is a lot better than it used to be. Even Linux is doing fine, at least if you stick to recent releases. Some distros like to ship outdated software for some reason :/

Still, in my opinion, the quality-of-life improvements of a HiDPI display very much outweigh the occasional inconvenience. Though obviously, YMMV.

[1] https://www.eizo.be/eizo-pixeldichte-im-zeitalter-von-4k-e5....

[2] Assuming you’re viewing at optimal distance.

[3] With the notable exception of 96 dpi native pixel art.

4K has exactly 4 times as many pixels as 1080p (2x in each dimension), so it shouldn’t be an issue in theory. Nearest-neighbor integer scaling will give you exactly what you want.

However, in practice you need to force the scaling in software; otherwise graphics drivers, and most monitors’ built-in postprocessing, tend to default to bicubic scaling, which blurs the result. That said, pixel art is not computationally expensive to scale, so it’s mostly just an inconvenience.
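For what it’s worth, “forcing it in software” can be as simple as pre-scaling the bitmap yourself. A minimal sketch with Pillow (the file names are made up for the example):

    # Integer nearest-neighbor upscale: each 1080p pixel becomes a crisp 2x2 block.
    from PIL import Image

    src = Image.open("sprites_1080p.png")              # 1920x1080 source
    dst = src.resize((src.width * 2, src.height * 2),
                     resample=Image.NEAREST)           # no interpolation, no blur
    dst.save("sprites_2160p.png")

Many emulators and pixel-art games expose an equivalent integer-scaling option these days, so in practice you rarely have to do this by hand.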

[4] You can use advanced scaling algorithms to upscale 1080p to 4K and it usually looks great, e.g. mpv with the opengl-hq profile, or madVR on Windows. For that you’ll need something a notch above integrated graphics though, e.g. an RX 560 or GTX 1050, or on mobile a Ryzen 2500U or equivalent.

[5] http://gpu.userbenchmark.com/Compare/Intel-HD-4000-Mobile-12...



