In the PC gaming world, it’s pretty much taken for granted that hardware makers are going to chase the latest specs. The games themselves are constantly getting more photorealistic, and failing to keep up with those graphics is a sure way to push PC buyers over to your rival. That’s led to some massive progress in rendering tech — in fact, the current AI boom might not have happened if it weren’t for Nvidia’s aggressive GPU development.
When it comes to home theaters, though, the pace of feature adoption tends to be much slower. It took years for HD to become the norm, never mind 4K. And there are advancements from the PC world that have yet to be carried over, most notably its main video port format: DisplayPort. Why does HDMI continue to reign supreme on TVs? Would DisplayPort legitimately be better? I’ll try to answer both of these questions as best I can.
DisplayPort vs HDMI
What the TV world is missing out on
In many respects, HDMI 2.1 is plenty capable. It has 48Gbps of bandwidth, more than the 40Gbps USB4/Thunderbolt 4 port on your computer. That allows it to handle 4K resolution at 120Hz while simultaneously delivering uncompressed spatial audio in the form of Dolby Atmos or DTS:X. Compressed Atmos is just fine, incidentally — it's what you'll get from streaming services — but Blu-ray fans with high-end speakers may demand the best of the best.
HDMI 2.1 also supports VRR (variable refresh rate) sync to match framerates, Auto Low Latency Mode for gaming, and dynamic HDR formats including Dolby Vision and HDR10+. It should also be enough to handle Dolby Vision 2 and HDR10+ Advanced once those finally hit the market. Really, for any immediate movie, show, or even gaming needs, it fits the bill.
DisplayPort, however, has supported 4K120 since its 1.3 release in 2014, when it had just 25.92Gbps of effective bandwidth. We're now up to DisplayPort 2.1b, which offers 77.37Gbps. That supports pumping video to two 8K120 displays simultaneously, with HDR and full chroma to boot. In theory, it can handle a 16K HDR display at 60Hz, albeit with compression.
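To put those bandwidth figures in perspective, here's a back-of-the-envelope check of the raw pixel data rates involved. This is my own rough math, not figures from either spec — it ignores blanking and link-encoding overhead, so the numbers are lower bounds on the bandwidth a format actually consumes.

```python
# Raw (uncompressed) pixel data rate for a given video mode.
# 30 bits per pixel = 10-bit RGB, the usual baseline for HDR.
def raw_gbps(width, height, hz, bits_per_pixel=30):
    return width * height * hz * bits_per_pixel / 1e9

uhd_120 = raw_gbps(3840, 2160, 120)    # 4K at 120Hz
uhd8k_120 = raw_gbps(7680, 4320, 120)  # 8K at 120Hz

# 4K120 10-bit comes in under 30 Gbps raw, so HDMI 2.1's 48 Gbps covers it.
print(f"4K120 10-bit: {uhd_120:.1f} Gbps")
# 8K120 10-bit is nearly 120 Gbps raw — beyond even DP 2.1b's 77.37 Gbps,
# which is where Display Stream Compression (DSC) comes in.
print(f"8K120 10-bit: {uhd8k_120:.1f} Gbps")
```

The gap between those two results is the whole story in miniature: today's mainstream formats fit comfortably in HDMI 2.1, while the formats DisplayPort is built for lean on its extra headroom plus compression.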
There’s more. DisplayPort supports AMD FreeSync and Nvidia G-Sync, an Alt Mode for connecting via USB and Thunderbolt, and the ability to daisy-chain multiple devices. And yes, 2.1b still carries plenty of room for uncompressed spatial audio. Simply put, it wipes the floor with HDMI 2.1.
So why do most TVs only have HDMI inputs?
Nothing personal, strictly business
The leading reason, as you might expect, is profit margins. An individual DisplayPort input might cost a trivial amount to include, but across thousands or millions of TVs, that adds up to a significant expense. Even that might not matter much, except that outside of the "premium" market, TVs are low-margin goods. They're relatively commoditized, and with customers often unwilling to pay more than a few hundred dollars, TV makers try to claw back profits wherever they can.
They can be remarkably stingy, sometimes. Many TVs still incorporate HDMI 2.0, despite the fact that it dates back to 2013 and imposes sharp limitations on audio and video. You'll run into TVs that are capped at 60Hz. Samsung, meanwhile, flat-out refuses to support Dolby Vision, no matter how expensive the TV, presumably because that would involve paying more royalties. If you've got a Samsung product, you have to be content with HDR10+.
There’s a less (inherently) greedy factor at work too: compatibility with existing products. HDMI 1.0 launched four years before DisplayPort, and since then, the number of cables, players, and (later) speakers with HDMI has only grown exponentially. You can safely assume that customers will own and expect to use HDMI accessories, which of course feeds into a self-perpetuating cycle, with both TV and accessory makers placing their bets in the most logical place.
As I mentioned, HDMI 2.1 is also good enough on a technical level, at least for now. In fact, it has a couple of conveniences that DisplayPort doesn't, namely CEC and ARC/eARC. The first lets you control power and volume with your regular remote, while ARC and eARC send audio from the TV back to a soundbar or receiver over the same cable. For a lot of customers, having everything working simply and in unison probably trumps options like daisy-chaining or running at 8K120.
Will we ever see DisplayPort become a standard feature on TVs?
Maybe, possibly, don’t count on it
Currently, the only real motivation to put DisplayPort on TVs is compatibility with your laptop or desktop. TVs have improved to the point that they can be used as monitors in some cases, whether for work or gaming, and that's obviously simpler without an HDMI adapter, which can also impose performance restrictions. Any audio or video chain is bottlenecked by its lowest common denominator.
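That last point can be made concrete with a toy example. The link speeds below are hypothetical, chosen only to illustrate the principle: the usable bandwidth of a source-to-display chain is the minimum link speed along it, so a single cheap adapter can throttle an otherwise fast setup.

```python
# Hypothetical source -> adapter -> display chain, in Gbps.
# A budget DP-to-HDMI adapter is often only HDMI 2.0 class (18 Gbps),
# even when both endpoints support far more.
chain_gbps = {
    "laptop DP 2.1 output": 77.37,
    "DP-to-HDMI adapter": 18.0,
    "TV HDMI 2.1 input": 48.0,
}

# The weakest link caps the whole chain.
effective = min(chain_gbps.values())
print(f"Effective chain bandwidth: {effective} Gbps")
```

In this sketch, the laptop and TV are both capable of 4K120, but the adapter drags the chain down to HDMI 2.0 territory — which is exactly why a native DisplayPort input on the TV would help.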
We may already be witnessing some bending in the industry. There's a small but burgeoning category of TV/monitor hybrids, which tend to support DisplayPort either directly or through USB-C/Thunderbolt's DisplayPort Alt Mode. The catch is that anything small enough to be a monitor is going to be too small for your living room, and monitor-quality specs command higher prices.

What people like me are really asking for is a conventional TV with a single DisplayPort alongside the usual HDMI options. It's not the most extravagant request, I think — for years after the introduction of HDMI, TVs continued to have component inputs, and modern sets typically include optical and Ethernet ports that few owners care about. I actually do make use of optical, and should be using Ethernet, but I realize I'm an outlier.
I’m not too optimistic. If there’s any hope, though, it might come from PC gaming. That sector is surprisingly strong — presumably because most people need a computer anyway, and the PC game catalog goes back decades, with sharp discounts that console makers like Sony and Nintendo can’t beat. While a new gaming PC might cost you upwards of $1,000 to $2,000, that’s offset by cheaper games and owning a multi-purpose device. I can use one as a workstation one moment and a game console the next. If TV makers discover they can attract PC gamers who want a bigger canvas, DisplayPort might stand a chance.
On the other hand, HDMI 2.2 is rolling out now. That enables a colossal 96Gbps of bandwidth, and hence many of the features of DisplayPort 2.1 combined with benefits like CEC and eARC. It’s going to be good enough for just about anything — if and when it’s widely adopted, that is. The stinginess of TV makers means that you can’t really find it on TVs just yet, so there might be time for it to be leapfrogged by DisplayPort once again.
- Brand: Samsung
- Screen Size: 32-inches
- Display Technology: LCD
- HDR: Yes, HDR10, HDR10+


