There has been a rather interesting shift over the past few years that has gone largely unnoticed by tech and gaming enthusiasts: I don’t believe that PC gaming represents a clearly superior technical proposition compared to console gaming anymore.

Three big changes took place to alter the technical gaming landscape: HDR, W-OLED, and the Xbox One X. To make my case, I will first elaborate on each of these.

HDR is deeply underappreciated

High dynamic range display essentially consists of two components: greater contrast, and new math. ST 2084 is the SMPTE’s standardized form of PQ, Dolby’s perceptual quantizer, an electro-optical transfer function (EOTF) that maps video levels to light output and replaces traditional gamma. If you don’t know what any of these words mean, but would like to learn more, see Dolby’s white paper. In short, this function encodes color data far more efficiently than gamma ever could across a wider luminance range and takes into account real-world modeling of human perception, allowing a larger spectrum of colors to be perceived from an otherwise identical source image.
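To make the "new math" half concrete, here is a minimal sketch of the PQ curve built from the constants published in ST 2084; the function names and the little comparison loop are my own illustration rather than anything lifted from Dolby's paper. Notice that the entire legacy ~100-nit range is squeezed into roughly the bottom half of the signal, leaving the rest of the code values for highlights all the way up to 10,000 nits.

```python
# Constants from SMPTE ST 2084 (Dolby's PQ perceptual quantizer).
M1 = 2610 / 16384          # ~0.1593
M2 = 2523 / 4096 * 128     # ~78.84
C1 = 3424 / 4096           # ~0.8359
C2 = 2413 / 4096 * 32      # ~18.85
C3 = 2392 / 4096 * 32      # ~18.69
PEAK_NITS = 10000.0        # PQ encodes absolute luminance up to 10,000 nits

def pq_encode(nits):
    """Map absolute luminance in nits to a normalized PQ signal value in [0, 1]."""
    y = min(max(nits / PEAK_NITS, 0.0), 1.0)
    return ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2

def pq_decode(signal):
    """Map a normalized PQ signal value in [0, 1] back to absolute luminance in nits."""
    e = signal ** (1 / M2)
    return PEAK_NITS * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

# The legacy ~100-nit SDR ceiling lands at roughly half of the PQ code range;
# the remaining code values cover highlights all the way up to 10,000 nits.
for nits in (0.1, 1, 100, 1000, 10000):
    print(f"{nits:>7} nits -> PQ signal {pq_encode(nits):.3f}")
```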

To focus on the most significant problem that HDR addresses: content such as movies and video games has been effectively constrained to 100 nits of maximum luminance for several decades. Considering that simply walking outside will showcase a distribution of luminance values well into the thousands of nits for reflected surfaces that are perfectly safe to observe, you can begin to appreciate how severely this historical constraint has restricted all of our media and computing interfaces. Color volume is one measure that quantifies the combined benefit of high dynamic range and wider color gamuts on modern displays: it expresses the full range of colors a display can actually realize across its luminance range.

In other words, HDR provides much more than a leap in contrast alone. It is the single biggest improvement to image quality since HD and the “Retina” era of high-DPI displays. And it does not apply to video alone, but to any digital display output, including photos and application UI.*

PC monitors have fallen behind for good

There is, however, a tragic downside to the HDR era: proper HDR is currently impossible on the PC, and will be for years. This comes down to two factors: power and OLED economics.

Over the past half-decade, OLED TVs quickly superseded plasma TVs in almost all technical respects, thanks to key (heavily patented) technologies that LG Display was able to bring to market with its W-OLED IGZO panels. Briefly, W-OLED displays use an RGBW subpixel structure built on a common white OLED emitter: color filters produce the red, green, and blue subpixels, while the unfiltered white subpixel provides extra light output.

W-OLED technology provides much greater luminance and efficiency, decreases the risk of burn-in, and greatly improves production yields. Because the emitting layer is shared across subpixels, the patterning precision required is far easier to achieve than for conventional RGB OLED fabrication, making the production economics viable even for very large panel sizes.
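To give a loose sense of why that white subpixel matters for luminance, here is a toy sketch of the basic RGBW idea. To be clear, this is emphatically not LG's actual processing pipeline (which involves gamut mapping, power limiting, and plenty of patented secret sauce); it only illustrates that the achromatic portion of a pixel can be emitted directly through the unfiltered white subpixel instead of being attenuated by color filters.

```python
def rgb_to_rgbw(r, g, b):
    """Naive RGBW decomposition for illustration only.

    Inputs and outputs are linear-light values in [0, 1]. The common
    (achromatic) component of the pixel is routed to the white subpixel,
    which does not lose light to a color filter, so bright, near-white
    content can be displayed far more efficiently.
    """
    w = min(r, g, b)                 # the achromatic part of the pixel
    return r - w, g - w, b - w, w

# Example: a bright near-white pixel is driven mostly by the efficient W subpixel.
r, g, b, w = rgb_to_rgbw(0.9, 0.85, 0.8)
print(f"R={r:.2f} G={g:.2f} B={b:.2f} W={w:.2f}")   # R=0.10 G=0.05 B=0.00 W=0.80
```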

While there was some early hope of bringing good OLED (and even IGZO OLED) panels to laptops, those attempts effectively ended in failure. The economics simply do not work. Worse, it is brutally hard to increase maximum luminance on monitors compared to TVs, as TVs are able to draw far greater amounts of power.

That sadly hasn’t stopped a recent sham push by monitor vendors and VESA to peddle “HDR” PC displays to consumers, but I assure you every single last one of these products is effectively a fake, and the overall marketing effort is stupid nonsense.

Meanwhile, the console space has seen widespread HDR adoption by developers, often to magnificent effect. Despite most games doing HDR poorly in various ways, it’s generally a massive improvement to image quality, one that happens to be effectively free from a performance perspective. It’s glorious.

Microsoft has changed the game

With the Xbox One X, Microsoft has pushed like hell to re-establish its technical dominance over Sony, and it has hugely succeeded in doing so. Despite a CPU microarchitecture that would best be described as horrendously uncompetitive, the One X's GTX 1070-class graphics, abundant memory bandwidth, and fixed hardware target with low-level developer access have combined to set a new benchmark in extracting pure pixel performance from a console. Locked 4K30 and even 4K60 on the One X are far more common than you might think.

I’ve watched hundreds of hours of Digital Foundry and other performance analysis videos, and One X games to date almost always perform outstandingly well. In the cases with consistent performance drops, the games nonetheless support variable refresh and are future-proofed to run with greatly improved performance on the next Xbox. You basically can’t lose.

This is because Microsoft’s vGPU mastery has brought one of the key advantages of the Windows PC platform, incredible backwards compatibility support in software, to the console space for the first time. And its amazing efforts haven’t stopped there. (If you’re wondering why any company would go to such lengths to resurrect old, existing games, it’s because game preservation is an enormous problem.)

Despite using a fairly terrible SoC, the Switch has also, to an extent, demonstrated the ever-shrinking gap between the performance of cost-optimized chipsets and silicon designed for powerful PCs. Mark my words: with its ARM efficiency advantage, Nintendo is well-poised to embarrass Microsoft and Sony on hardware technical capabilities over the coming years. Though the less constrained GPU die sizes, memory bandwidth, and especially bill of materials of the latter two companies’ consoles will continue to ensure advantages over Nintendo’s specs, Nintendo will trend ever closer to the x86 platforms in raw performance.

The PC’s remaining performance advantages

Some factors influencing graphics fidelity remain unchanged. Brute-force, high-quality anti-aliasing is still a major advantage in favor of the PC. 120fps console gaming is currently limited to a literal handful of Xbox One X games, and those require variable refresh to eliminate stutter. And a precious few PC games such as Battlefield V really do still take advantage of PC hardware superiority these days.

You will also always be able to drop huge sums of money on the best-binned, heavily overclocked CPU, with a completely overkill dual-GPU setup and liquid cooling to keep a sky-high power budget in check. This will only get more and more expensive and wasteful, however, with the recent death of Moore’s Law. (It’s not really the lack of competition that has NVIDIA’s GPU pricing soaring through the roof.)

Finally, I won’t neglect to acknowledge that the PC has recently gained one hell of a trick in its favor: real-time ray tracing. The current, very early best case for it is Battlefield V, which can just barely hold 1440p60 on an ~$800 GPU. Even though it’s early days, the ray tracing revolution has indeed already begun.

Weighing the tradeoffs for the gamer

But I don’t believe higher average frame rates and Ultra settings, which tend to be quite poor in performance efficiency, are worth as much as commonly held. I do, however, think the advantages of generally better overall image quality (thanks to HDR and OLED panels), more consistent frame-times (through micro-optimization), and far less frustration toiling with graphics and motherboard settings heavily tilt the accessibility of high-end graphics fidelity toward console gaming for the overwhelming majority of consumers.

If you truly value stable performance, it’s almost sufficient to note that optimizing settings to get a modern AAA PC game to run with extremely consistent frame delivery is generally an *enormous* pain requiring a great deal of research, and even then Windows will do its best to ensure your misery. Heaven forbid you use a workstation CPU with NUMA.

(There’s also more that can be said about the immaturity of Direct3D 12 and Vulkan drivers, and how wildly behind Intel and AMD are competitively on efficiency and hardware platform features compared to the ARM players, but these are secondary matters.)

For any PC gamers who think my overall argument is nonsense, genuinely, show me your frame-time data on a non-monster setup. I’m quite confident that 99+% of PC gamers are not playing AAA games at nearly locked frame rate multiples of 60, but I’m always open to more comprehensive data analysis.
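If you want to run that analysis yourself, the sketch below is roughly the check I have in mind; the function and its tolerance threshold are my own invention, and frame-time logs can be captured with tools such as CapFrameX, OCAT, or RTSS (their export formats vary, so treat the plain list input as a placeholder).

```python
def locked_frame_share(frame_times_ms, refresh_hz=60.0, tolerance_ms=0.5):
    """Rough frame-time consistency check.

    Returns the percentage of frames whose duration lands within
    `tolerance_ms` of an even multiple of the refresh interval
    (16.7 / 33.3 / 50.0 ms at 60 Hz, i.e. clean 60/30/20 fps frames).
    """
    interval = 1000.0 / refresh_hz

    def is_locked(ft):
        multiple = round(ft / interval)
        return multiple >= 1 and abs(ft - multiple * interval) <= tolerance_ms

    locked = sum(1 for ft in frame_times_ms if is_locked(ft))
    return 100.0 * locked / len(frame_times_ms)

# Example: mostly clean 16.7 ms frames plus a couple of uneven ones.
print(f"{locked_frame_share([16.7, 16.6, 16.8, 23.4, 16.7, 33.3, 16.7]):.0f}% locked")
```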

(PC gaming pro-tip: always set a frame rate cap of exactly 120, 60, or 30fps, or do whatever trick you need to achieve the equivalent with VSync on. And if you’re using a variable refresh display, still set a frame rate cap that ensures nearly 100% frame-time consistency for a given game. See here for further nuance about maximum VRR refresh rates with external frame rate control.)
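For what it’s worth, the reason a fixed cap helps is simply that frames get scheduled against a fixed timeline rather than rendered as fast as possible. The toy pacing loop below is my own illustration of that idea, not how in-game limiters, RTSS, or driver-level caps are actually implemented.

```python
import time

def run_capped(render_frame, fps_cap=60.0):
    """Toy frame limiter: pace frames to a fixed interval so frame-times stay
    consistent even when the GPU could render faster."""
    interval = 1.0 / fps_cap
    next_deadline = time.perf_counter()
    while True:
        render_frame()
        next_deadline += interval
        # Sleep off the slack; if we overran the deadline, resync instead of
        # "catching up" with a burst of uneven frames.
        slack = next_deadline - time.perf_counter()
        if slack > 0:
            time.sleep(slack)
        else:
            next_deadline = time.perf_counter()

# Example (runs forever, so it is commented out): cap a dummy workload at 60 fps.
# run_capped(lambda: None, fps_cap=60.0)
```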

Lastly, if you haven’t seen it yet and are interested in this topic, take a look at my console gaming optimization reference guide. I put quite a lot of free time into it.

* How well even tech enthusiasts can distinguish HDR from non-HDR content on mobile devices when not displayed side-by-side is another lengthy topic.