Somewhere in the 90's, games were in 320x200 pixel resolution, and the Windows 3.11 desktop was 640x480 pixels or so on a typical 386 computer.
Then up to around the year 2000, resolutions increased, going through such stages as 800x600, 1024x768 and 1280x1024 (your CRT would flicker if much higher, and then LCDs came along pinning it at 1280x1024 for 17").
Then the resolution (or especially DPI) growth basically stopped for 10 years imho ("HD" was not really any higher at all than what a year 2000 CRT could do).
But now there's some growth again :) Why the problems though? Has 10 years of the same DPI made developers forget about different DPIs?
Your timeline is pretty far off, to my memory. We had VGA graphics games in the early 90s. My SPARCstation 20 came with a 1280x1024 CRT monitor. The 15" Sony CRT I bought shortly after, along with a Matrox Millennium PCI graphics card, was capable of the same. I ran a 21" Sony at 2048x1536 in 1999. It did not flicker.
If he's talking about games and therefore mass market Intel micros, he's roughly right. 1990ish just-plain-VGA gives you 640x480x4bit or 320x200x8bit. You'd generally use the latter for games.
The development you're describing is the timeline of how this stuff trickled into inexpensive consumer hardware.
Expensive workstations already had big monitors and 1280x1024 resolutions in the 1980's.
Also, on a related tangent: prior to 1970, computers already ran with processor clocks over 30 MHz, with 32-bit (or larger) words, and virtual memory. These things did not first show up with the Intel 80386 and only start to be supported in Windows 95. :)
We used CRTs with a minimum resolution of over 3000 pixels at work.
But they were complete monsters (deeper than they were wide or tall), and not good for your eyes either; they needed filters, because some X-rays would hit your face.
We replaced them with much cheaper (and not as good quality) LCD arrays. They were so cheap that you could buy 6 for the price of 1 high-quality display.
What happened is very simple, LCD technology did not benefit from CRT advances in resolution, as they are completely orthogonal technologies, and people bought cheap panels.
The other change was 3D. We used 2D acceleration cards back in the day, but consumers made little use of them; what they did start doing was playing a lot of 3D games.
Comparing the original Lemmings ("sprites changing 100 small parts of the screen") with Call of Duty ("millions of polygons moving in 3D in real time") does not make very much sense.
So the cards evolved from accelerating a static 2D image or document on the screen like professionals needed, to updating a 3D screen 60 times per second, which is much more complex.
After LED backlighting (too expensive before), and with smartphones, tablets and TVs mass-producing even cheaper panels, quality has improved a lot.
Very true, although I would add that it is often best to sell moderately expensive stuff to a moderate number of people (i.e., why BMW doesn't sell $10k subcompacts). :)
No. Porsche tried to buy VW, nearly made it, but ran out of money during the financial crisis. VW took advantage and the deal reversed.
Note that Porsche SE used to own Porsche AG directly, but now VW owns it.
To quote your link:
In July 2012, it was announced that Volkswagen AG was taking over the Porsche AG automotive company completely, which bears the same name, but is only a subsidiary of Porsche SE.
I am guessing - is it because of HDMI and its abysmal bandwidth? Before, with VGA, the actual resolution of CRT monitors was limited by the RAMDAC. And 10 years ago I was able to have 2048x1536@70Hz on a top-end 15" CRT already.
I guess, as nowadays all digital signals are routed through a content-protection filter, the technology processing these streams wasn't performing well, causing stagnation in display resolutions for a long time (well, there was also a limit in the DPI of CRTs and immature LCD tech). I guess only nowadays, with HDMI 2.0, DisplayPort 1.3 or Thunderbolt, are we getting to the point where the digital signal can be "properly" controlled by content-protection chips, hence allowing 4K+ resolutions (though it's still a mess).
1. DVI has supported higher resolution displays for a long time; 1920x1200 since 1999 on single-link, and 2560x1600 on dual-link. Note that this is more than '10 years ago'.
2. HDMI has supported 4K displays since May 2009.
Most of your post belies a misunderstanding or underestimation of the technology involved. The bottleneck was never 'content protection filters' so much as the feasibility of building electronics that can handle higher-bandwidth signals over cables with no extra data channels (in a backwards-compatible manner), and the challenge of getting manufacturers to make hardware that would actually support it, at increased cost, for no practical benefit.
For older LCDs, it was an issue of new technology is expensive. I remember when everyone I knew had 15" CRT monitors, and having a 15" LCD monitor was a luxury that almost no one could afford. It might be hard to remember, but consumer LCD displays were new once, and new technology is never cheap. On top of that, the cost of an LCD panel doesn't scale linearly with diagonal size, so going from 15" to 17" to 19" was a huge cost curve. Until LCD production was more consistent and people understood the point in buying them over CRTs, the market didn't really heat up, and so sizes/resolutions never grew.
As for DPI, LCD DPIs depend on the size of pixels, which are a mechanical element. In contrast, CRT displays are a printed screen of phosphors; paint a smaller, more detailed phosphor grid, adjust the electronics for more precision control, and boom, higher DPI.
20Gbps is certainly a non-trivial amount of bandwidth to put in a cheap, consumer cable. The whole system is hard to make work for cheap; I would think HDCP chips would not be a bottleneck at all.
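To put that in perspective, a back-of-the-envelope calculation (numbers are approximate and from memory):

    # raw pixel payload for 4K at 60 Hz, 24 bits per pixel
    raw_bits_per_second = 3840 * 2160 * 24 * 60
    print(raw_bits_per_second / 1e9)   # ~11.9 Gbit/s before any overhead
    # blanking intervals and line coding push that into the 15-20 Gbit/s range,
    # roughly where HDMI 2.0 (18 Gbit/s) and DisplayPort 1.2 (21.6 Gbit/s) top out

Pushing that through a few metres of cheap passive cable is the hard part, not the HDCP handshake.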
It was/is because LCDs have fixed resolutions.
High-DPI displays were a niche; with CRTs you could satisfy that niche without sacrificing the mainstream audience, since you didn't have to force people to run any specific resolution.
The reason high DPI was a niche comes down to one thing, and one thing only: software. You had to have good eyesight and want a lot of screen real estate to even be able to make use of it.
Software didn't scale well. We had small laptop screens with 1920x1200 in the early 2000s. They barely cost more than lower-resolution displays (back then you typically had three different resolutions to choose from for every laptop; then came Apple). It was really apparent even back then that high DPI wasn't costly at all, it just wasn't mainstream enough for anyone to care.
Fast forward a decade and we have the exact same problem. Software can't scale for shit. The only reason Apple was first with high DPI was that they controlled the software, and they took the easy way out. They just scaled everything up 4 times and, look at that, exactly the same screen real estate with just higher DPI. Same thing with the iPhone 4 - the only reason they increased the resolution so much was so that they wouldn't have to bother with scaling ("retina" was just an added bonus). Remember that niche that sought large screen real estate and bought high-resolution CRTs in the 90's? Well, they are still waiting for a decent LCD...
As for HDMI, no, firstly HDMI was never intended for computers at all. But the hype in combination with small laptops forced it upon us anyway. And HDMI came much later anyway; the battle for high-resolution/high-DPI displays was already lost (because of software). The real technical hurdle was the OSD and scaling chips. Again, the reason Apple was first with the 30" 2560x1600 display was that they were the first who could ditch both the OSD and all scaling from the monitor. It only had brightness adjustment (no OSD) and all of the scaling was done on the graphics card. That way you couldn't pair it with a regular PC - if you did, you would have to use another monitor to be able to enter the BIOS, install the OS, enter safe mode, or game at any other resolution (which you pretty much had to). Of course, eventually most graphics cards could do this, but Apple was the first to be able to assume that the monitor would be paired with such a graphics card.
That and the fact that Dual-Link-DVI was quite rare (hardly surprising since there were no monitors on the market that used it).
Oh and people, I hope you (not parent, but lots of others) didn't run 1280x1024 on your 4:3 CRTs. The only 5:4 monitors that existed were 17" and 19" LCDs. You should have used 1600x1200 or 1280x960, that is, if you didn't want to stretch everything.
In the 90's/2000's, typical screen size also grew from 15" to 20"+, so the DPI didn't really change a lot, did it? The text size in OS X has never been adjustable, and in Windows 95+ you could only tweak it from 100% to 150% or so, which would still look good next to text that had a fixed size.
This time, the screen size stays the same, but the DPI goes through the roof.
In the CRT days people would change resolution to get the DPI/text/widget size they wanted, almost no one I saw was using the highest resolution of their monitor as a result. It was easy to buy a CRT because you just picked the size/cost you wanted and knew that you could just set the resolution.
LCDs introduced the problem of a fixed resolution, where basically you have to choose the right resolution at the time you bought it. There were 1920x1200 laptops [1] and 4k 22inch (T221) screens 10 years ago, it was clear this problem was coming, though the software never changed to become resolution scalable.
Game developers don't traditionally think about their games in terms of DPI. Think about it: a game is played on a TV or monitor that can display 1920x1080 pixels, but what size is it? 23" diagonal? 30" diagonal? ...etc.
Game developers these days usually think of a PC or console game display in terms of pixel count and aspect ratio, not DPI.
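Which is why the same pixel count gives wildly different DPIs depending on the panel it lands on; a quick illustration (diagonal sizes are just examples):

    import math
    diag_px = math.hypot(1920, 1080)   # ~2203 pixels along the diagonal
    print(diag_px / 23)                # ~96 DPI on a 23" monitor
    print(diag_px / 46)                # ~48 DPI on a 46" living-room TV

Same framebuffer, same art assets, half the DPI.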
When you use OpenGL to map textures onto triangles it gets handled automatically. You might get a quality setting which enables higher resolution textures. That's often there for tuning performance on different hardware, but the effect is higher resolution textures.
Many games have configurable resolution for a fixed screen size, which is basically a configurable DPI. Much of the rendering that goes on can use pixel shaders to take advantage of the extra pixels (e.g. via implicit surfaces).
(Not sure I 100% agree with your timing, but that doesn't matter too much...)
The last 10 years have seen a major shift from desktop to laptops (and mobile/tablet). That has different needs & constraints. e.g. Power, form-factor, capabilities of mobile GPUs. So I think that's part of the effect you've witnessed.
Resolution growth didn't really stop. It's just that vertical resolution growth took a huge backseat to horizontal res growth for a decade or so. I guess since vertical growth didn't really occur, there was little need for UI scaling.
No, it stopped. I just tossed a 10 year old CRT with a 2560x2048 resolution. The key is that tech switched over to favoring LCDs which are finally catching up, and in some cases, passing CRT.
I don't agree that the stagnation in resolution evolution was an artifact of the CRT to LCD conversion. I put the blame on the convergence of the living room and desktop display markets and the disturbing marketing effectiveness of the "HD" moniker [1].
Resolution didn't simply stagnate; it regressed. In approximately 2000, Dell sold laptops with 1600x1200 LCD displays. Once "HD" appeared, display manufacturers lost interest in resolutions higher than 1920x1080. For several years to a decade, the most common top resolution was 1920x1080 and many mid and low-spec laptops were sold with shockingly poor 1024x768 and 1366x768 resolutions. (The 2560x1600 30-inch monitors appeared on the market as a prosumer option in ~2004 but these were tainted by their own blight--a fixed price of $1,100 that never wavered--and didn't see any compelling competition until the Korean and Chinese manufacturers disrupted the incumbents.)
I blame porn, and more generally, the desire for flashy colours in the consumer market. Circa 1990 we had 1600×1200 monochrome CRTs at work, and nothing I've used since has equaled them for text. Considering displays as effectively limited by signal bandwidth, colour forces a √3 drop in linear resolution.
>Circa 1990 we had 1600×1200 monochrome CRTs at work, and nothing I've used since has equaled them for text.
Really? I've used monochrome hi-res CRTs in the early nineties (made by Sun for its workstations, no less) and they were shite -- (not to mention the text rendering of the software at the time was shite too).
I actually think you're just seeing those things through rose-colored glasses. Try a modern 5K retina iMac or 4K dell monitor.
I'm surprised that nobody has talked about using the Xorg -dpi flag. I am a MacBook Retina user (but exclusively run Linux -- a long story) and I actually find the results reasonable. I had to go to a bit of effort, but finally I managed a reasonable result by changing the way the X server is started (adding the -dpi 227 flag) and modifying Firefox's devPixelsPerPx option. The rest sort of "automagically" worked, IIRC. I should however add that my setup is minimalistic -- Xmonad and very few non-terminal apps.
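Concretely, the two knobs amount to something like this (a sketch, assuming startx; a display manager needs the flag in its own config, and the values are per-taste):

    # tell the X server the real pixel density of the panel
    startx -- -dpi 227

    # then in Firefox's about:config, scale CSS pixels to match
    layout.css.devPixelsPerPx = 2

Everything that honours the X server DPI (fonts, mostly) then comes out at a sane physical size.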
If you ever had time to put together a blog post with your config files, I'm sure more than a few people would be interested. I know I have struggled with setting up a reasonably looking Linux on the MBP Retina, and in the end given up.
Would be very interesting to see the results - I think knowing the DPI and scaling accordingly is probably the better way to go about it .. it'll reveal the broken apps quicker, anyway.
I have a Thinkpad Yoga 2 Pro (3200x1800 @ 13"), and often use an external monitor. My "solution" is to turn off HiDPI and rely on the zoom features in all of my apps. (terminal, emacs & browsers)
Browser windows on the laptop usually run at a zoom of 200%, whereas windows on the external display run at 90%.
Annoying, but the beautiful text is worth the bother.
What OS/Distro do you run? I've read some rather worrying things about the linux support and haven't taken windows off yet. I've been living in VMs, which are surprisingly snappy.
Even with a 1920x1080@13" I needed to change settings. Mainly just changing the font size in gnome-tweak-tool. And I just zoom manually in Firefox because my external monitor is much lower res (1280x1024@19").
Regarding the problem of the external display, the DPI disparity is "easily" solved by configuring the external display with a high apparent resolution but down-scaling to what the device actually supports[1]:
xrandr --output <output_name> --scale 2.0x2.0
The output_name can be found just by running "xrandr".
A related and more powerful feature is specifying an arbitrary matrix for the screen-display transformation (e.g. rotation by any angle, shear) (xrandr --transform).
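For example (values are illustrative only), a 2x scale written out as its matrix, and a slight rotation:

    # equivalent to --scale 2x2
    xrandr --output <output_name> --transform 2,0,0,0,2,0,0,0,1
    # rotate the output by roughly 10 degrees
    xrandr --output <output_name> --transform 0.985,-0.174,0,0.174,0.985,0,0,0,1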
Firefox on Linux currently doesn't support a mixture of HiDPI and non-HiDPI monitors, it has a single DPI setting. I think it will be supported when this [1] bug is resolved.
Shameless plug: I have written an addon to workaround this [2].
It does work well. The other day, there was an article on someone who did an extreme test on pixel scaling, and an Apple software engineer wrote the following comment:
The part of the comment that I found the most interesting was his take on a solution:
1. Make everything vectors. You'll have to choose between weird antialiasing artifacts and potential pixel cracks; either way things will look bad. And you'll encounter bitmaps eventually, and have to deal with the necessities of resampling and pixel aligning at that point.

2. Scale only to integral sizes, and resample. You'll avoid antialiasing and pixel alignment issues, but pay a performance penalty, and things may look slightly blurry.
I don't know about the "some", but that is what that Apple engineer says:
Which leads to the problem of centering. I wish to center a bitmap image within a button's border. The image is 101 logical points high, and the border is 200 logical points high. With a 2x scale factor, I can center absolutely, and still be aligned to device pixels. With a 1x, 1.25, 1.33, etc. scale factor, centering will align me on a partial pixel, which looks like crap. So I have to round. Which way? If the goal is "make it look good," then the answer is "whichever way looks good," which depends on the visual style of the bezel, i.e. whether the top or bottom has more visual weight. So now we need hinting.
And that's where things start to get really nasty. In order to make things look good at arbitrary resolutions, we want to round to device pixels. But the rounding direction is not a local question! Consider what happens if we have two visual elements abutting in logical coordinates, and they round in opposite directions: now there's a device pixel between them. That's very visible: you get a pixel crack! So you have to coordinate rounding.
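A toy illustration of that pixel-crack scenario (a sketch, not how any particular toolkit actually rounds):

    import math

    scale = 1.25
    edge = 101                            # logical coordinate where element A ends and B begins

    a_bottom = math.floor(edge * scale)   # A rounds its bottom edge down -> 126
    b_top    = math.ceil(edge * scale)    # B rounds its top edge up      -> 127

    print(a_bottom, b_top)                # device row 126 belongs to neither element:
                                          # a visible one-pixel crack between them

At 2x everything lands on whole device pixels and the question never comes up, which is a big part of why integral scale factors are so much easier.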
Question: even with hinting, is it actually possible to make a font where two glyphs align perfectly without white space between them?
WPF is a good example of a framework that attempted resolution independence and encountered this problem. Initially it has the "SnapsToDevicePixels" property, which triggers rounding behavior at draw time. But draw time is too late, because of the "abutting elements rounding in opposite directions" problem. So they introduced the "UseLayoutRounding" property, which does...something. And the guidance is basically "turn it on and see if it helps, if not, disable it." Great.
I think a way out of this is much higher DPI with some blurring as the last step. That's what print does. At 1000 or 2500 DPI nobody will notice aliasing anymore, and the blurring would take care of small cracks between objects that are supposed to abut, make cases where two objects accidentally overlap when they shouldn't less obnoxious, etc.
The saddest thing about this is that X11 was intended from the start to support multiple DPIs (see e.g. xdpyinfo(1)) and originally shipped with rasterized fonts in multiple resolutions. Sadly the "year of the desktop" crowd has been absolutely consistent in forcing *nix UIs down into the Windows mold.
If your display is over 1080p (I use 1920x1200), Linux (at least Mint 17) switches to HiDPI by default, reducing my effective resolution to 960x600... There is some obscure setting that can switch HiDPI off, but it's a mess overall for the time being as some apps still use HiDPI (like gdm).
I have the same display with Debian and no issue. So it's either introduced by Mint or Ubuntu (Mint is Ubuntu derived). Most likely by Mint, as the UI customization is very distro specific.
Yep, it's Cinnamon. I've just installed Debian on a new server and switched to Cinnamon, and the same happened there as well... Cinnamon has a scaling slider as well; HiDPI is a slightly different thing (and a different option in general settings).
For web browsers I thought CTRL-+ and CTRL-- were for dealing with this, but I think that is stored on a site-by-site basis. Something similar for the UI would be nice too, but IMHO it should be kept independent of the "data" the app displays.
I have a Thinkpad Yoga 2 --- 3200x1800 I think --- and I find that the GTK-based stuff works well on HiDPI, but not automatically, so you need to fiddle with the GTK settings a bit.
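For reference, with GTK 3.10 or newer the fiddling can be as little as a couple of environment variables (a sketch; the application name is just a placeholder):

    # scale widgets 2x; GDK_DPI_SCALE=0.5 undoes the font doubling if Xft.dpi is already raised
    GDK_SCALE=2 GDK_DPI_SCALE=0.5 some-gtk3-app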
The Qt stuff is worse, since it cannot be fixed by adjusting only the font size.
The older (Xaw3D, Tk) stuff is really bad.
I have never seen different DPI on different displays working.
99% of my day is spent in the terminal (xfce4-terminal) though and the crisp text makes it all worth while.
"Well, KDE can do what we are looking for, but GNOME does it in a hackish way. As we use GNOME, our review is all about it, and as a conclusion, Linux is no good."
Why do these people insist on using GNOME after all?
I assume you haven't used both GNOME and KDE on a high-DPI monitor, because KDE's support for it is certainly not better. Qt themes don't scale, web views don't scale, icon, window, plasma panel, etc sizes need to be configured independently of font DPI, some applications just fritz out at high DPI font sizes. I've tested both 4.13 and 5.3, and haven't noticed any significant improvement in the latter.
I'm a fan of KDE, but on my high-DPI laptop, I'm using GNOME purely because its "hackish" solution does work a lot better.
My guess is psychology: The default theme in KDE looks really alien, after having used Ubuntu for a long time. I am sure there would be a familiar looking Ambiance theme for KDE, too, but maybe people give up before trying to change and tweak themes.
KDE looks more like Mac or Windows than Gnome 3 does. I'm not saying that's a bad thing, I like Gnome 3. But KDE doesn't look particularly alien to me; it looks like a normal, boring PC desktop.
KDE is my desktop environment of choice since I first installed it, after years of Windows, Gnome, and after several months (each) of Unity, XFCE and LXDE. IMHO nothing beats KDE overall, save perhaps OSX.
I thought the whole point of high DPI was to free us from the need to render graphics pixel-perfect at native resolution. While scaling bitmaps will probably be necessary for photographs and most video for a long time, can't everything else be rendered using SVG? Does such an environment already exist?
It's a good article about high-DPI displays and GNOME, as the author writes: "Your editor's experiments have mostly been with GNOME, so this article will have an unavoidable bias in that direction."
I've had 1600x1200 CRT or LCD on my main desktop since... 00s? Late 90s? If you make money typing at a desk, you're foolish not to splurge on desk, chair, lighting, quiet, keyboard... and monitor. I've been dealing with this class of problem since the 80s although it was a little different back then...
If you go GNOME-less, "Linux", or more accurately FOSS in general, has no problem with high-res displays; in fact it's really nice, far from a problem.
I don't know if KDE in general has resolution problems. Konsole looks beautiful at high res and decent font. The only thing more flame war-ish than emacs vs vi is font choice, however I have noticed visible quality differences such that random font A in 2005 at 24 pt size looks much worse than it does at 10pt size, and this varies by font and over time.
An example of the above: I have FreeBSD on the other machine on my desk, and displaying Monospace in a Konsole at font size 12 is "normal", but when you kick it up to 13 it looks perma-bold, because that's the 1-pixel to 2-pixel stroke transition for Monospace on that platform and exact config. Luxi Mono displays "normally" at 11, perma-bold at 12, and at very large sizes like 17 it looks very peculiar: random-ish parts of letters display as 1 or 2 pixels wide. Nimbus Mono L Regular at 16 looks nice to my eyes, and that's mostly what I use on that machine at 1600x1200.
The biggest problem I have is finding really good UTF-8 fonts. There's a nice test file called UTF-8-demo.txt. It's mirrored and copied all over; here is one link:
I find this a true test of difficult display rendering on linux.
The ergonomics of widescreen for long term anything, including movie viewing, is another topic and I believe my multiple high res 4:3 monitors are a productivity booster for me, so until they break I'll keep them.
My 12-year-old IRIX Fuel (V10 graphics) that I used for my PhD has full vector graphics and fonts, and supports monitor resolutions much higher than I can afford/justify today.
Another interesting angle is that it’s not only application software we need to change, but also the hardware drivers are not quite there yet:
I have a Dell UP2414Q (3840x2160 resolution, driven via DisplayPort 1.2) connected to a nVidia GTX 660 card, which was one of the cheapest ones that support DP 1.2.
With the proprietary nvidia driver, I need to manually edit the xorg configuration file to have the correct modes and most importantly disable XRandR in favor of Xinerama.
This in turn breaks e.g. GNOME shell on Fedora 20 (without RandR, you’ll just get an exception in your syslog), and in general prevents plenty of use-cases (e.g. redshift for controlling display brightness, or changing rotation settings without restarting X11).
The reason for having to disable RandR is that there is not currently a standard way on how to represent multi-stream transport (MST) connections, and 4K displays require 2 streams (1920x1080 each) at the same time. With RandR enabled, what you’ll see is 2 connected outputs, and all applications will treat them as such, even though you have only one monitor connected.
Fixing this requires changes in RandR (i.e. the X server) and each driver. AFAIK, on the intel driver this should work, on nouveau there’s work under way, no clue about the proprietary nvidia driver.
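For the curious, the Xinerama part of that xorg.conf edit is just the standard server flag (a minimal sketch; the mode lines and driver-specific options are the fiddly part and vary per setup):

    Section "ServerFlags"
        Option "Xinerama" "on"
    EndSection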
Random mom-and-pop folks have been able to buy high-DPI laptops at Best Buy for well over a year. (And higher DPI than discussed in this article; for example, 3200x1800).
In computer lifecycle terms that is a long time and it is a little bit embarrassing that Linux works as poorly as it does in these situations. (Windows ain't so great at it either, which reflects poorly on Microsoft, but it still handles the situation a lot better than Linux does.)
My 11-inch Asus laptop has a full-HD display. So it's somewhere in the middle.
GNOME's HiDPI support doesn't allow fractional values, which makes this a serious issue for me. On top of that, you have to change the cursor size manually by diving into the config.
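For reference, the relevant gsettings keys as of current GNOME 3.x (key names may change between releases):

    # window/widget scaling: an unsigned integer, so only 1x, 2x, ...
    gsettings set org.gnome.desktop.interface scaling-factor 2
    # font scaling is a double, so it at least accepts fractional values
    gsettings set org.gnome.desktop.interface text-scaling-factor 1.5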
The iPhone 6 Plus display uses downsampling, so perhaps that's the way to go. But as things stand, I don't see myself using an external monitor in the near future, even after Wayland becomes mainstream.
Unfortunately, that's not the case. I have tried it, and the usable screen real estate is so small that GNOME is unusable. I like big fonts, and found that a scaling of 1.5 is about right.
Because the article mentions that Chromium doesn't have HiDPI support right now: you can actually compile it with the -Denable_hidpi=1 flag, which at least displays most web sites the right way, though the menu/navigation is broken (which is probably the main reason it isn't enabled by default).
There's also a setting in chrome://flags (ctrl+f search for dpi: "Overrides the device scale factor on the device"), although the dropdowns no longer work and the tabs look ridiculous when set to anything above 1.
Imo Linux, KDE and most other software works nicely with HiDPI.
A small issue is GTK apps that don't respect DPI settings.
The most significant problem AFAIK is for people who switch frequently between different displays. There are a few settings here and there you have to adjust every time you switch between Hi and Lo-DPI.
> Imo Linux, KDE and most other software works nicely with HiDPI.
Having written some Qt apps and used them on scaled screens... eh. It's got some work to go, but we're making progress. Apparently Qt 5.4 will have further improvements in this field too.
If we are talking about complete resolution independence I agree.
Currently, on two monitors with the same physical dimensions but different resolutions, you can't get a common rendering.
But you can get a decent rendering from both of them, albeit a bit different; one for example will have a tad larger icons, the other maybe slightly larger fonts, etc.
I suppose this is why Apple went to 5K for the 27" iMac. It's exactly double the pixel dimensions of the non-5K version, so everything looks exactly the same size but with higher fidelity.
Not necessarily a big deal for consumers, but vital for professionals that have preferred working screen layouts on their old machines and don't want to change that on the new ones.
Yeah, differing monitor DPIs is still a terrible experience on every OS except OS X in my experience. I've gone so far as to use separate machines and Synergy to get around multi-DPI issues on Windows and Linux.
What a mess. I didn't know HiDPI was so spotty on Linux. It's unclear to me why people would subject themselves to such an experience. If you want a HiDPI laptop, why not spend a little more to get a retina mac? On a developer's salary, the difference is a rounding error.
I would not call it a mess. I run Ubuntu on Lenovo X1 Carbon and Asus Zenbook UX31A and on both machines all the software that I run except Steam work just fine.
It really isn't. I've been running Fedora with gnome-shell for 3-4 months on HiDPI and most things run fine. The applications I had problems with were Chrome, which is solvable with 200% zoom although the UI remains small, and NetBeans, where it's more serious because the UI matters more than in Chrome. Another problem is the impossibility of having different scalings on different displays. Everything else that I've needed runs fine, and it's getting better with each gnome-shell release since 3.10. And anyone can try it before buying a display/laptop just by running the scaling command given in the article. In my opinion/case the sharpness of the fonts/icons/UI outweighs the small problems that I've had, but using NetBeans and Chrome isn't mandatory for me.
How do you survive with the UX31A's touchpad in Linux? On my UX32VD it's the most horrible part of the Linux experience (also, the battery draining in 2 hours while idle (2 SSDs) doesn't help).
I have a UX31A and my experience is like yours with Ubuntu 14.04.
The only quirk I have with the touchpad is that on Windows, with one finger resting idle on the touchpad and another finger moving, the cursor would still follow the moving finger. On Ubuntu, the cursor remains stationary any time there are two fingers on it. I probably would not have noticed it if I weren't already so used to the Windows behaviour, so it isn't a big deal at all.
Right, you shouldn't have to switch operating systems to get the benefits. It seems perfectly reasonable to expect the display subsystem of your preferred OS to support HiDPI mode, however it chooses to do so.
It's unclear to me why Linux running on a retina mac wouldn't have the same resolution problems as Linux running on PC hardware with a similar display.
It absolutely has the same resolution problems. I'm running Ubuntu 14.10 on a MacBook Pro with a Retina screen (installed from scratch last week). TweakUI helps make Unity look better. Google Chrome's UI looks terrible, although setting the default page scaling to 150% does pretty well for browsing.
I'm running 14.10 (and previous versions for about a year) on an Ultrabook with a HiDPI screen that is actually a higher resolution than the MBP's Retina display.
I've set the scale to 1.38 for menu and titles bars (in the Screen Display settings) and everything looks great.
Chrome is a nightmare, there are some builds of Chromium that are better with HiDPI support, but Firefox works perfectly with the setting "layout.css.devPixelsPerPx" set to 1.8.
I've also set the text scaling factor to 1.58 (Using Unity Tweak Tool).
I'm completely happy with the results and when I use my Wife's MacBook Air I think there is something wrong with my eyes as the difference is really noticeable.