Somewhere in the 90's, games were in 320x200 pixel resolution, and the Windows 3.11 desktop was 640x480 pixels or so on a typical 386 computer.
Then up to around the year 2000, resolutions increased, going through such stages as 800x600, 1024x768 and 1280x1024 (your CRT would flicker if much higher, and then LCDs came along pinning it at 1280x1024 for 17").
Then the resolution (or especially DPI) growth basically stopped for 10 years imho ("HD" was not really any higher at all than what a year 2000 CRT could do).
But now there's some growth again :) Why the problems though? Has 10 years of the same DPI made developers forget about different DPIs?
Your timeline is pretty far off, to my memory. We had VGA graphics games in the early 90s. My SPARCstation 20 came with a 1280x1024 CRT monitor. The 15" Sony CRT I bought shortly after, along with a Matrox Millennium PCI graphics card, was capable of the same. I ran a 21" Sony at 2048x1536 in 1999. It did not flicker.
If he's talking about games and therefore mass market Intel micros, he's roughly right. 1990ish just-plain-VGA gives you 640x480x4bit or 320x200x8bit. You'd generally use the latter for games.
The development you're describing is the timeline of how this stuff trickled into inexpensive consumer hardware.
Expensive workstations already had big monitors and 1280x1024 resolutions in the 1980's.
Also, on a related tangent: prior to 1970, computers already ran with processor clocks over 30 MHz, with 32 bit (or larger) words, and virtual memory. These things did not first show up in the Intel 80386 and only start to be supported in Windows 95. :)
We used CRTs with over 3000 pixels on a side at work.
But they were complete monsters (deeper than they were wide or tall), and not good for your eyes either: they needed filters, and some X-rays would still hit your face.
We replaced them with much cheaper (and lower quality) LCD arrays. They were so cheap that you could buy 6 for the price of 1 high-quality display.
What happened is very simple, LCD technology did not benefit from CRT advances in resolution, as they are completely orthogonal technologies, and people bought cheap panels.
The other change was 3D. We used 2D acceleration cards back in the day, but consumers made little use of them; what they did start doing was playing 3D games a lot.
Comparing the original Lemmings ("sprites changing 100 small parts of the screen") with Call of Duty ("millions of polygons moving in 3D in real time") does not make very much sense.
So the cards evolved from accelerating a static 2D image or document on the screen like professionals needed, to updating a 3D screen 60 times per second, which is much more complex.
After LED backlighting arrived (it was too expensive before), and smartphones, tablets and TVs drove mass production of even cheaper panels, quality has improved a lot.
Very true, although I would add that it is often best to sell moderately expensive stuff to a moderate number of people (i.e., why BMW doesn't sell $10k subcompacts). :)
No. Porsche tried to buy VW, nearly made it, but ran out of money during the financial crisis. VW took advantage and the deal reversed.
Note that Porsche SE used to own Porsche AG directly, but now VW owns it.
To quote your link:
In July 2012, it was announced that Volkswagen AG was taking over the Porsche AG automotive company completely, which bears the same name, but is only a subsidiary of Porsche SE.
I am guessing - is it because of HDMI and its abysmal bandwidth? Before, with VGA, the actual resolution of CRT monitors was limited by the RAMDAC. And 10 years ago I was already able to have 2048x1536@70Hz on a top-end 15" CRT.
I guess that since nowadays all digital signals are routed through content-protection filters, the technology processing these streams wasn't performing well, causing stagnation in display resolutions for a long time (well, there was also a limit in the DPI of CRTs, and immature LCD tech). I guess only nowadays, with HDMI 2.0, DisplayPort 1.3 or Thunderbolt, are we getting to the point where the digital signal can be "properly" controlled by content-protection chips, hence allowing 4K+ resolutions (though it's still a mess).
1. DVI has supported higher resolution displays for a long time; 1920x1200 since 1999 on single-link, and 2560x1600 on dual-link. Note that this is more than '10 years ago'.
2. HDMI has supported 4K displays since May 2009.
Most of your post betrays a misunderstanding or underestimation of the technology involved. The bottleneck was never 'content protection filters' so much as the feasibility of building electronics that can handle higher-bandwidth signals over cables with no extra data channels (in a backwards-compatible manner), and the challenge of getting manufacturers to make hardware that would actually support it, at increased cost, for no practical benefit.
For older LCDs, it was an issue of new technology is expensive. I remember when everyone I knew had 15" CRT monitors, and having a 15" LCD monitor was a luxury that almost no one could afford. It might be hard to remember, but consumer LCD displays were new once, and new technology is never cheap. On top of that, the cost of an LCD panel doesn't scale linearly with diagonal size, so going from 15" to 17" to 19" was a huge cost curve. Until LCD production was more consistent and people understood the point in buying them over CRTs, the market didn't really heat up, and so sizes/resolutions never grew.
As for DPI, LCD DPIs depend on the size of pixels, which are a mechanical element. In contrast, CRT displays are a printed screen of phosphors; paint a smaller, more detailed phosphor grid, adjust the electronics for more precision control, and boom, higher DPI.
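For a sense of scale, here is a quick sketch of how DPI falls out of resolution and diagonal size (the two example panels are illustrative, not specific products):

    import math

    def dpi(width_px, height_px, diagonal_in):
        """Pixels per inch from the pixel dimensions and the diagonal size in inches."""
        return math.hypot(width_px, height_px) / diagonal_in

    # A circa-2000 15" 1600x1200 panel vs. a typical 24" 1920x1080 "HD" monitor.
    print(round(dpi(1600, 1200, 15)))  # ~133 DPI
    print(round(dpi(1920, 1080, 24)))  # ~92 DPI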
20 Gbps is certainly a non-trivial amount of bandwidth to put through a cheap consumer cable. The whole system is hard to make work for cheap; I would think HDCP chips would not be a bottleneck at all.
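For a rough sense of where numbers like that come from, here is a back-of-the-envelope sketch counting only active pixels; real links also carry blanking intervals and line-coding overhead (TMDS 8b/10b in HDMI's case), which is roughly how 4K60 ends up needing an 18 Gbps-class link like HDMI 2.0:

    def raw_pixel_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
        """Uncompressed active-pixel data rate, ignoring blanking and line coding."""
        return width * height * refresh_hz * bits_per_pixel / 1e9

    print(raw_pixel_rate_gbps(1920, 1080, 60))  # ~3.0 Gbps  (1080p60)
    print(raw_pixel_rate_gbps(2560, 1600, 60))  # ~5.9 Gbps  (dual-link DVI territory)
    print(raw_pixel_rate_gbps(3840, 2160, 60))  # ~11.9 Gbps (4K60 before overhead)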
It was/is because LCDs have fixed resolutions.
High-DPI displays were a niche; with CRTs you could satisfy that niche without sacrificing the mainstream audience, since you didn't have to force people to run any specific resolution.
The reason high DPI was a niche came down to one thing, and one thing only: software. You had to have good eyesight and want a lot of screen real estate to even be able to make use of it.
Software didn't scale well. We had small laptop screens with 1920x1200 in the early 2000s. It barely cost more than lower-resolution displays (back then you typically had three different resolutions to choose from for every laptop (then came apple)). It was really apparent even back then that high DPI wasn't costly at all; it just wasn't mainstream enough for anyone to care.
Fast forward a decade and we have the exact same problem. Software can't scale shit. The only reason apple was first with high DPI was that they controlled the software, and they took the easy way out. They just scaled everything up 4 times and, look at that, exactly the same screen real estate with just higher DPI. Same thing with the iPhone 4 - the only reason they increased the resolution so much was so that they wouldn't have to bother with scaling ("retina" was just an added bonus). Remember that niche that sought large screen real estate and bought high-resolution CRTs in the 90's? Well, they are still waiting for a decent LCD...
As for HDMI: no. Firstly, HDMI was never intended for computers at all, but the hype, in combination with small laptops, forced it upon us anyway. HDMI also came much later; by then the battle for high-resolution/high-DPI displays had already been lost (because of software). The real technical hurdle was the OSD and scaling chips. Again, the reason apple was first with the 30" 2560x1600 display was that they were the first who could ditch both the OSD and all scaling from the monitor. It had only brightness adjustment (no OSD), and all of the scaling was done on the graphics card - which meant you couldn't pair it with a regular PC. If you did, you would need another monitor to enter the BIOS, install the OS, enter safe mode, or game at any other resolution (which you pretty much had to). Of course, eventually most graphics cards could do that scaling, but apple were the first who could assume the monitor would be paired with such a card.
That and the fact that Dual-Link-DVI was quite rare (hardly surprising since there were no monitors on the market that used it).
Oh and people, I hope you (not parent, but lots of others) didn't run 1280x1024 on your 4:3 CRTs. The only 5:4 monitors that existed were 17" and 19" LCDs. You should have used 1600x1200 or 1280x960, that is, if you didn't want to stretch everything.
In the 90's/2000's, typical screen size also grew from 15" to 20"+, so the DPI didn't really change a lot, did it? The text size in OS X has never been adjustable, and in Windows 95+ you could only tweak it from 100% to 150% or so, which would still look good next to text that had a fixed size.
This time, the screen size stays the same, but the DPI goes through the roof.
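A minimal sketch of the two scaling philosophies being contrasted in this thread, assuming the common 96 DPI logical baseline; real platforms are messier than this, and the function names and base_dpi parameter are just illustrative:

    def fractional_scale(dpi, base_dpi=96):
        """Windows-style fractional scaling: 120 DPI -> 1.25 (125%), 144 DPI -> 1.5 (150%)."""
        return dpi / base_dpi

    def integer_scale(dpi, base_dpi=96):
        """Apple-style 'retina' scaling: snap to 1x, 2x, 3x and never anything in between."""
        return max(1, round(dpi / base_dpi))

    print(fractional_scale(144))  # 1.5
    print(integer_scale(220))     # 2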
In the CRT days people would change resolution to get the DPI/text/widget size they wanted, almost no one I saw was using the highest resolution of their monitor as a result. It was easy to buy a CRT because you just picked the size/cost you wanted and knew that you could just set the resolution.
LCDs introduced the problem of a fixed resolution, where basically you have to choose the right resolution at the time you buy it. There were 1920x1200 laptops [1] and 4K 22-inch (IBM T221) screens 10 years ago, so it was clear this problem was coming, though the software never changed to become resolution-scalable.
Game developers don't traditionally think about their games in terms of DPI. Think about it: a game is played on a TV or monitor that can display 1920x1080 pixels, but what size is it? 23" diagonal? 30" diagonal? .. etc.
Game developers these days usually think of a PC or console game display in terms of pixel count and aspect ratio, not DPI.
When you use OpenGL to map textures onto triangles, it gets handled automatically. You might get a quality setting which enables higher-resolution textures. That's often there for tuning performance on different hardware, but the effect is higher-resolution textures.
Many games have configurable resolution for a fixed screen size, which is basically a configurable DPI. Much of the rendering that goes on can use pixel shaders to take advantage of the extra pixels (e.g. via implicit surfaces).
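A minimal sketch of why thinking in pixel count and aspect ratio (rather than DPI) works out: lay the game out in a virtual design space and map it onto whatever framebuffer is actually there, so output resolution is just a scale factor. The names and the 1920x1080 design space here are hypothetical, not any particular engine's API:

    def design_to_framebuffer(x, y, design=(1920, 1080), fb=(3840, 2160)):
        """Map a point in design coordinates to real pixels, preserving aspect
        ratio and letterboxing when the two spaces don't match."""
        scale = min(fb[0] / design[0], fb[1] / design[1])
        off_x = (fb[0] - design[0] * scale) / 2  # centre horizontally
        off_y = (fb[1] - design[1] * scale) / 2  # centre vertically
        return x * scale + off_x, y * scale + off_y

    # The same HUD element lands in the same relative spot at 4K or 720p.
    print(design_to_framebuffer(960, 540))                  # (1920.0, 1080.0)
    print(design_to_framebuffer(960, 540, fb=(1280, 720)))  # (640.0, 360.0)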
(Not sure I 100% agree with your timing, but that doesn't matter too much...)
The last 10 years have seen a major shift from desktops to laptops (and mobile/tablet). That has different needs & constraints, e.g. power, form factor, and the capabilities of mobile GPUs. So I think that's part of the effect you've witnessed.
Resolution growth didn't really stop. It's just that vertical resolution growth took a huge backseat to horizontal res growth for a decade or so. I guess since vertical growth didn't really occur, there was little need for UI scaling.
No, it stopped. I just tossed a 10-year-old CRT with a 2560x2048 resolution. The key is that tech switched over to favoring LCDs, which are finally catching up and, in some cases, passing CRTs.
I don't agree that the stagnation in resolution evolution was an artifact of the CRT to LCD conversion. I put the blame on the convergence of the living room and desktop display markets and the disturbing marketing effectiveness of the "HD" moniker [1].
Resolution didn't simply stagnate; it regressed. In approximately 2000, Dell sold laptops with 1600x1200 LCD displays. Once "HD" appeared, display manufacturers lost interest in resolutions higher than 1920x1080. For several years to a decade, the most common top resolution was 1920x1080 and many mid and low-spec laptops were sold with shockingly poor 1024x768 and 1366x768 resolutions. (The 2560x1600 30-inch monitors appeared on the market as a prosumer option in ~2004 but these were tainted by their own blight--a fixed price of $1,100 that never wavered--and didn't see any compelling competition until the Korean and Chinese manufacturers disrupted the incumbents.)
I blame porn, and more generally, the desire for flashy colours in the consumer market. Circa 1990 we had 1600×1200 monochrome CRTs at work, and nothing I've used since has equaled them for text. Considering displays as effectively limited by signal bandwidth, colour forces a √3 drop in linear resolution.
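To make the √3 explicit: holding total signal bandwidth (pixel data per second) fixed, three colour channels mean a third as many pixels, i.e. a 1/√3 drop along each axis. A quick check using the 1600×1200 monochrome figure from this comment:

    import math

    mono_w, mono_h = 1600, 1200
    drop = math.sqrt(3)  # 3 channels -> 1/3 the pixels -> 1/sqrt(3) per axis
    print(round(mono_w / drop), round(mono_h / drop))  # ~924 x 693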
>Circa 1990 we had 1600×1200 monochrome CRTs at work, and nothing I've used since has equaled them for text.
Really? I've used monochrome hi-res CRTs in the early nineties (made by Sun for its workstations, no less) and they were shite (not to mention the text rendering of the software at the time was shite too).
I actually think you're just seeing those things through rose-colored glasses. Try a modern 5K retina iMac or 4K dell monitor.