Hopefully I finally got it right...
TL;DR
In my opinion, Intel CPUs actually suck for HTPC use, especially 4K UHD HDR Blu-ray playback. But due to Intel's complex and confusing product line-up, and a general failure of the media to recognize the fundamental issues, unwary users like myself continue to make poor buying decisions.
If you want to play 4K UHD Blu-ray discs with HDR on your HTPC, all things considered I believe your best path to success is a discrete GPU (Nvidia RTX 2000 or 3000 series, or maybe an AMD Radeon), paired with whatever CPU you prefer (I really like AMD), going in knowing you'll need unlocking software that removes the disc protection and, with it, the SGX requirement.
If you want to play official, protected 4K UHD Blu-ray + HDR discs on an Intel setup connected to consumer HDMI gear, you'll need help from a DisplayPort to HDMI converter; otherwise you'll have to connect to a display device that accepts DisplayPort. Period, end of story.
Considering that the mainstream standard of 4K UHD Blu-ray disc playback on PCs is CyberLink's PowerDVD, which requires Intel's SGX technology, you'd think Intel would have better aligned their product stack to leverage this advantage. But they haven't. Read on if you want to know more.
The Backstory
For those that have followed along in these forums, you already know that I'm unhappy with my new HTPC setup. I attempted to assemble a value HTPC to use with a projector to play official 4K UHD Blu-ray HDR discs using PowerDVD, and without requiring help from unlocking software like the red foxy provides.
A year ago I created a value HTPC based around an AMD APU, and even though Windows recognized HDR capabilities, PowerDVD refused to play HDR content with the AMD iGPU. I don't know if that limitation extends to AMD discrete Radeon GPUs, or if it is an APU issue only. Regardless, since the AMD APU solution didn't work with PowerDVD, I figured an Intel only solution would surely work, right?...
Coming into this article, I had two main issues to troubleshoot. One is that even though I have an Intel 10th-gen CPU, the i3-10100, PowerDVD refuses to play any 4K Blu-rays, and the CyberLink Blu-ray Advisor reports that Intel SGX is not available. The second is that even though all components supposedly support HDR, Windows reports that my display does not support HDR (and of course PowerDVD agrees it's not available).
I figure I'm not the first person, nor the last person, to encounter these issues. I wanted to share my story here, in hopes it helps other users dealing with the same challenges.
The Hardware
As the centerpiece of my new HTPC setup, I recently picked up a new 4K projector, the Optoma UHZ50. I chose this projector because it had a few features that caught my eye. A true 4K DLP-based laser projector with tons of light output for daytime viewing, basic HDR capabilities (HDR10 & HLG), 24fps support for judder-free movie watching, wall color compensation so I don't have to paint my wall white or get a screen (but probably should), and all this for under $3-large. The laser should last the life of the projector, so I expect to easily get 10-20 years out of it and no expensive bulbs to buy. While this isn't my dream projector, it's good enough for now and I will find interesting uses for it in the future when I retire it from main theater duty. For example, while the projector displays 4K at up to 60Hz, if you drop the resolution to 1080p it jumps up to a 240Hz refresh rate, making this a great gaming projector.
For the HTPC I decided to go with an Intel processor, which is the only route to SGX support. As I wrote above, last year I discovered on another HTPC build that AMD's integrated graphics are not compatible with PowerDVD's HDR capabilities, something that seems more like an arbitrary software restriction than a hardware limitation, especially since Windows reports HDR support on AMD chips. So, my hope was that the Intel chip would solve both the PowerDVD HDR support issue and the SGX issue in one little package.
Oddly, all of Intel's 10th-gen desktop chips use the exact same integrated GPU, the UHD Graphics 630. With the i9-10900 and the i3-10100 having the exact same GPU, I went for the much cheaper and more energy-efficient i3-10100. I made sure to do my research, and the UHD Graphics 630 supports 4K at 10-bit color with HDR.
Sidenote, unknown at the time of purchase: Intel's last CPU generation to support SGX is the 10th gen. Newer CPUs in the 11th and 12th gen, like the i5-11400 and i5-12400, no longer support SGX. Apparently, I got lucky in my choice of an i3-10100. Well, lucky if it actually worked, which it currently doesn't, so... yeah... lucky.
Even though 8GB of RAM is normally enough for an HTPC, I sprang for 16GB so I had plenty to share with the iGPU. For the motherboard I chose the ASRock B460M Pro4. There's not much special about this board, but I figured for HTPC duty I didn't really need special. It had an HDMI port, and it's a 10th-gen Intel CPU board, so I figured this was a perfect match.
For the 4K UHD Blu-ray drive, I made sure to get a friendly drive. It actually shipped with a non-friendly firmware version, so I had to down-flash it to an earlier friendly firmware. In hindsight, I'm not quite sure why I wanted this drive to be friendly; since I was making a UHD Blu-ray compliant build, there was no need for it.
For the most astute and experienced out there, you may have already spotted two of my fatal flaws. What, you missed them? Me too. And it's taken me a lot of research to figure out how I went so wrong.
Troubleshooting SGX
PowerDVD 21 refused to play any 4K discs, indicating my PC was not compliant. To troubleshoot, I used CyberLink's Ultra HD Blu-ray Advisor (sad that such a tool is a necessity), which confirmed the issues with SGX and the friendly drive. I already had a theory on why SGX wasn't working, so I decided to troubleshoot this first.
Both my CPU and motherboard support SGX, and I had it enabled in BIOS. Why, then, did the CyberLink UHD Blu-ray Advisor say it wasn't available?
I had a hunch that while SGX might be an Intel CPU feature, it is enforced across the entire data path. If I was right, then I might have a lead on the problem. Here's the data path:
- Friendly UHD Drive > SATA Bus > Intel CPU > RAM > Intel GPU > HDMI w/ HDCP 2.2 > HDCP 2.2 Projector
But verifying that chain didn't solve the SGX problem. Intel SGX still presented as not available in the Advisor.
While double-checking that SGX was indeed enabled in the BIOS, I noticed that a newer BIOS version was available for my motherboard. The changes from version P1.50 to P1.60 didn't sound promising, but I tried anyway. As expected, the newer BIOS didn't resolve the SGX issue either.
I finally decided that I needed to install an official UHD drive (non-Friendly). Luckily, I had one available in another HTPC, so I installed it to see if that would resolve the issue. Nope. CyberLink's UHD Blu-ray Advisor still reported SGX was not available, though for the first time I had official UHD drive support.
I decided the next step was to focus on just SGX, outside of the boundaries of PowerDVD. My first Google query returned a recommendation to run the Intel SGX Activator, a software app available in the Microsoft Windows Store. I had no idea this tool existed, nor about any requirement to run it to enable SGX. I figured that since it was baked into the CPU+BIOS, it should just work. I never would have imagined a need to activate it in Windows. Regardless, I ran the tool, and sure enough I had to go through a couple steps to activate SGX.
Even with SGX now active, CyberLink's Advisor tool still reported it was unavailable. But one thing that bothered me about the Advisor is that it kept complaining that I wasn't running PowerDVD 17 - which is technically true as I'm running PowerDVD 21. How old is this app?!
On a hunch, I ran PowerDVD 21 again, in my new configuration, and for the first time I was able to play my UHD 4K copy of No Time to Die, yay! There were still plenty of warnings that I didn't have HDR, but otherwise, I was quite happy to have this working.
Because the Advisor still incorrectly reports SGX as not available, I'm not clear which steps above fixed the issue, though likely it was both installing an official UHD drive and activating SGX.
Of course, to keep this newfound capability I would have to keep using an official, non-friendly drive. I'd have to think on that a bit, because my bigger goal is HDR support, and I had a feeling I was going to have to break SGX again in pursuit of HDR...
Troubleshooting HDR
For a couple months now, I've been quite baffled as to why HDR wasn't working with my setup. Everything I could think to check fully supported HDR: the Optoma UHZ50 projector, the HDMI 2.0a compliant cable, the Intel i3-10100 CPU with UHD Graphics 630 iGPU, my copy of Win10 Pro, and the Intel GPU drivers.
I thought this was supposed to be plug and play, and the result just didn't make sense to me. I was beginning to have concerns that the projector, being a very new model, had a firmware issue. I checked for new firmware, but it was already on the latest release.
I started to wonder if perhaps I had made a mistake with the i3-10100, so I started researching its HDR compliance. It was then that I stumbled onto the real problem: UHD Graphics 630 only supports HDMI 1.4, not 2.0a or 2.0b! Arrggghhhh!!!!!
HDR support, even the most basic HDR10, requires 10-bit color, and at 4K you can only achieve this with the bandwidth and features provided by HDMI 2.0a or DisplayPort 1.4. HDMI 1.4 does not have enough bandwidth for 4K @ 10-bit color, so HDR is impossible on this connection.
Note that HDMI 2.0a adds only static HDR metadata (HDR10), with HDMI 2.0b adding HLG. Dynamic HDR formats (HDR10+ and Dolby Vision) are formally an HDMI 2.1 feature, though in practice they can tunnel over 2.0-class hardware.
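The bandwidth point above can be checked with some back-of-the-envelope arithmetic. This is only a sketch: the 340/600 MHz TMDS clock limits (HDMI 1.4 vs 2.0) and the CTA-861 total pixel counts are my assumptions from the published specs, and 10-bit deep color scales the TMDS clock by 10/8 for RGB/4:4:4 signals.

```python
# Rough TMDS-clock arithmetic showing why HDMI 1.4 can't carry 4K HDR.
# Total (active + blanking) timings come from the CTA-861 video formats;
# deep color scales the TMDS clock by bits-per-component / 8.

HDMI_1_4_MAX_MHZ = 340   # max TMDS character rate for HDMI 1.4
HDMI_2_0_MAX_MHZ = 600   # max TMDS character rate for HDMI 2.0

def tmds_clock_mhz(h_total, v_total, refresh_hz, bits_per_component):
    pixel_clock = h_total * v_total * refresh_hz / 1e6      # MHz
    return pixel_clock * bits_per_component / 8             # deep-color scaling

# 3840x2160p24 (CTA-861 total timing 5500 x 2250), 10-bit RGB:
clk24 = tmds_clock_mhz(5500, 2250, 24, 10)
print(f"4K24 10-bit RGB needs {clk24:.2f} MHz")   # 371.25 MHz
print("fits HDMI 1.4:", clk24 <= HDMI_1_4_MAX_MHZ)  # False
print("fits HDMI 2.0:", clk24 <= HDMI_2_0_MAX_MHZ)  # True

# 3840x2160p60 (total 4400 x 2250), 10-bit RGB:
clk60 = tmds_clock_mhz(4400, 2250, 60, 10)
print(f"4K60 10-bit RGB needs {clk60:.2f} MHz")   # 742.5 MHz
```

Notice that even 4K24, the native frame rate of UHD Blu-ray, overshoots HDMI 1.4's limit once you go to 10-bit, and 4K60 at 10-bit RGB exceeds even HDMI 2.0, which is why 4K60 HDR over HDMI 2.0 falls back to YCbCr 4:2:2 or 4:2:0.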
But for some reason I was thinking Intel's 10th gen had HDMI 2.0a support. I kept searching and realized I had been confusing Intel's 10th- and 11th-gen CPUs; only the 11th gen adds HDMI 2.0 support.
Wait, so SGX + HDR doesn't even exist?!
So, if Intel's 10th-gen desktop CPUs with UHD Graphics 630 don't support HDMI 2.0 or HDR, and Intel's 11th- and 12th-gen CPUs drop support for SGX, then how in the world has this ever worked for anyone?
The answer, it seems, is that this only worked when using the DisplayPort output. This applies equally to all 10th-gen and earlier CPUs: desktop, mobile, even NUC devices. Intel never added HDMI 2.0 support to any CPU until the 11th gen.
But you can't upgrade to an 11th-gen Intel CPU for HDMI 2.0, because then you lose Intel SGX support, which means you can't play protected UHD Blu-ray discs at all. So, it's IMPOSSIBLE to play official UHD Blu-ray discs with HDR using PowerDVD on an Intel CPU's stock HDMI output - any generation, doesn't matter.
One way around this limitation is to use unlocking software (i.e. AnyDVD/MakeMKV/etc.) in combination with 11th-gen or newer Intel CPUs. The legality of such tools varies by region, and for this reason I prohibit discussion of how to use these tools on this forum. Which generally makes for a really sad situation, where laws want to make criminals of legitimate consumers.
But as long as you plan to use unlocking software, you might as well rethink your hardware component selection. Do you prefer AMD CPUs over Intel? Cool, go AMD. Prefer Nvidia GPUs over Intel graphics? Awesome, get something powerful, crank up the eye candy, and power through AAA gaming.
Though it was relatively cheap, I now regret every dollar I wasted on this Intel i3-10100 CPU. And I'm absolutely dismayed that this limitation doesn't seem to be well understood.
The pessimist in me is apt to think it's all a conspiracy: that manufacturers have purposely used easy-to-confuse acronyms, product names, and even version numbers, making it a nightmare to discern the truth. For example, DisplayPort 1.4 supports HDR while HDMI 1.4 does not, yet they are confusingly similar, even down to the plug shape. You might remember that v1.4 has the feature you need, but confuse DisplayPort for HDMI. Ditto for HDMI 2.0 and DisplayPort 2.0; way too easy to confuse capabilities. If you happen to buy the wrong thing, the easiest solution is to simply buy more stuff, and the manufacturers have zero incentive to correct this mess, as they actually profit from it. The same thing is happening with USB4 and HDMI 2.1, where manufacturers seem to have gone out of their way to make sure a revision number means next to nothing.
Fixing HDR
Actually fixing my HDR issue was super easy. I had a spare Nvidia RTX 2080 Super that I had decommissioned from a previous build, and I simply installed it in my new HTPC, and disabled Intel's integrated UHD Graphics 630. After installing the latest Nvidia drivers, HDR magically started working in Windows 10 and in CyberLink PowerDVD 21.
Of course, using anything other than the Intel integrated GPU invalidates the SGX end-to-end compliance, so without SGX, PowerDVD won't actually play 4K UHD Blu-ray discs unless you also use unlocking software. And since you have to use unlocking software, SGX becomes a non-issue, which really opens the door to freedom of CPU, GPU, and software choice.
I don't know if an AMD Radeon discrete GPU would have also worked for HDR in PowerDVD, as I don't have one handy to test with. I'm doubtful, as the AMD integrated GPU in their APU series chips was flagged as incompatible by CyberLink, so that issue may also extend to their discrete GPUs.
To me, the biggest benefit of unlocking software is that now I don't have to use PowerDVD if I prefer another player, and don't have to use Intel iGPUs if I prefer a serious GPU solution. For example, MPC-BE with madVR and LAV Filters can make your movies look better than ever, and gives you control over HDR tone mapping if you need that for your particular setup, but you'll need a powerful Nvidia RTX GPU to do all the video processing.
Are You Supporting Piracy?!
To be clear, media piracy is theft, and I don't condone that behavior. We live in a very fortunate period in time, for this brief window during which 4K UHD Blu-ray discs exist and can be purchased once and viewed forever. If we fail to support this industry, this option will disappear, and the only replacement will be streaming services. Please don't use unlocking software for theft, even though it does seem to be almost a necessity to view 4K UHD HDR Blu-rays on a typical PC.
Your only legitimate solution to play official, protected 4K UHD HDR Blu-ray content is to use DisplayPort, or a DisplayPort to HDMI 2.0a or 2.0b converter, with a pre-11th-gen Intel CPU.
Uhm, DisplayPort!
I'm sure there's at least one of you reading this that's been screaming "just use DisplayPort, you dolt!" for half the article already. And you have a point.
If you have a display that accepts DisplayPort, then that completely changes the game. As long as you have a motherboard that supports DisplayPort 1.4, then you can use Intel's 10th-gen CPUs, and possibly even older generations. DisplayPort 1.4 fully supports 4K HDR signals.
But what if you don't have a DisplayPort input on your consumer TV or projector? After all, most consumer gear is HDMI only, not DisplayPort. Well, in that case they do make DisplayPort to HDMI converters. Many are marketed as DisplayPort 1.4 to HDMI 2.1 converters. Sounds great, right? Be sure to read the reviews, as these devices have definite limitations and performance issues...
Nahhh, You're Wrong! I play 4K UHD Blu-rays with HDR via HDMI on My PC
Interestingly, there are certainly reports of users playing official 4K UHD Blu-rays with HDR content using nothing more than their HDMI output on their motherboard. No external converter required, and their 10th-gen or earlier CPU works just fine for this. How is this possible?
The secret here is that some motherboards have a DisplayPort to HDMI 2.0a/b (or higher) converter chip (often an LSPCon) built in. So technically you are using the DisplayPort output of the CPU, but the motherboard is handling the conversion of the DisplayPort signal to HDMI 2.0a/b, and you might not even realize this is happening.
Of course, actually finding a motherboard with HDMI 2.0a/b support is easier said than done. I did some quick searching on Newegg, which has a nice motherboard search/filter tool, and while I could filter on HDMI ports, I couldn't filter on the supported spec. You'll have to tread carefully, read every last spec, possibly even check the manufacturer's website, to verify the board includes not just an HDMI 2.0 converter chip, but one that supports at least 2.0a, preferably 2.0b. The same goes when buying a NUC for HTPC use: you have to check the HDMI specs to make sure it has the HDR support you want, and don't buy 11th-gen or newer or you'll lose SGX.
Is HDR Worth It?
After this long, painful journey, now that I finally have 4K HDR working, was it worth it? For me personally, the jury's still out.
I rewatched the first half of Ghostbusters: Afterlife, thinking I really couldn't tell the difference, before realizing I was accidentally watching the regular 1080p Blu-ray without HDR!
I swapped in the 4K disc, and rewatched some of the same scenes. I think if I had two projectors side by side, showing the same scene rendered both ways, I might be able to discern the difference, but from visual memory I found it challenging to identify a true improvement in dynamic range. Part of the problem might be that the 1080p transfer of Ghostbusters: Afterlife is really good, minimizing the observable improvement of the 4K. And part of the problem is likely that projectors are notoriously bad at contrast ratio and HDR content, so even though my Optoma UHZ50 supports HDR10 and HLG, the benefit is expected to be limited. Perhaps a much more expensive projector would produce a more recognizable improvement.
I did perceive the colors were ever so slightly more vibrant on the 4K disc, and that color banding/solarization was reduced compared to the 1080p Blu-ray version. Color banding has long been one of my pet peeves, as this artifact will distract me from a film while I stare at ugly pixels and wonder how to fix them. So the 10-bit color of 4K UHD Blu-ray discs is a welcome improvement, with or without HDR.
I did find that the HDR image on Ghostbusters: Afterlife was by default much darker than the non-HDR version, and to solve this I had to crank up the projector's Brightness to compensate, otherwise those dark scenes became indecipherable. This gave me a flashback to watching The Hulk for the first time in 2004 on my first projector, the scene where Hulk is fighting the mutant dogs in the tree. The first time I watched it on my projector, I could barely make out dark blobs moving. Later I rewatched it with the brightness cranked up and discovered there was a ton of detail I had missed. In my experience, projectors have always been finicky to set up, because cranking up the brightness to see the shadow details can blow out the highlights and turn blacks to muddy grays. I had hoped a modern projector wouldn't suffer these issues, but here I was back in 2004 again, simply trying to get the video viewable at the expense of color accuracy.
For this reason, after cranking the brightness for Ghostbusters: Afterlife, I felt I was cutting into the impact HDR could deliver in bright lights, explosions, and other high-contrast scenes. I've also lost confidence that I'm setting brightness for HDR content correctly. My next step will be to use a 4K calibration disc to hopefully set HDR levels correctly.
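Part of why HDR discs look dark by default comes down to the transfer function. HDR10 is mastered to the SMPTE ST 2084 "PQ" curve, which maps code values to absolute luminance up to 10,000 nits; since no projector comes close to that, everything depends on how the display tone-maps it down. A minimal sketch of the PQ EOTF (constants are from the ST 2084 spec as I understand it):

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF that HDR10 content is mastered to.
# It maps a normalized code value E (0..1) to absolute luminance in nits,
# which is why shadow detail depends so heavily on the display's tone mapping.

M1 = 2610 / 16384          # PQ constants from SMPTE ST 2084
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(e: float) -> float:
    """Convert a normalized PQ code value (0..1) to luminance in cd/m^2."""
    ep = e ** (1 / M2)
    num = max(ep - C1, 0.0)
    den = C2 - C3 * ep
    return 10000.0 * (num / den) ** (1 / M1)

print(f"PQ 100% -> {pq_to_nits(1.0):.0f} nits")   # 10000 nits
print(f"PQ  50% -> {pq_to_nits(0.5):.1f} nits")   # ~92 nits
print(f"PQ  25% -> {pq_to_nits(0.25):.1f} nits")  # ~5 nits
```

Note how the bottom half of the signal range only reaches about 92 nits: most of the code space is devoted to dark and mid tones, so a projector that tone-maps poorly crushes exactly the shadow detail I was fighting with.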
This also makes switching between HDR and non-HDR content a bit clunky. Right after watching Ghostbusters: Afterlife I switched over to watch the Super Bowl using the same HTPC. I had to disable HDR on the PC and re-adjust brightness settings on the projector, two manual tasks that I would love to have auto-adjust. Perhaps a better projector with more user-configurable presets would handle this a bit more easily, but I doubt that my "value" projector is a champ in this regard.
The other big oddity with HDR is how the Windows 10 desktop looks once you turn on HDR Streaming support: the entire screen becomes washed out and very difficult to read. This is a known issue in Windows 10, and supposedly Windows 11 has improvements in this regard. To see if this is true, I've upgraded (for free!) to Windows 11 on this HTPC. My initial impression is that the Win11 desktop seems easier to read with HDR Streaming enabled, but I need more time to be sure there really is an improvement.
At this point, I'm concerned that I may need to toggle HDR on only when watching HDR content, and otherwise keep it disabled in Windows for proper display of everything else. I've noticed that the projector sometimes reacts incorrectly to toggling HDR on/off on the PC, requiring manual intervention to sync modes correctly. The end result might be that you have to make a real commitment to configuring everything in order to watch HDR, and to reverting back to non-HDR settings when you're done. Yuck.
Hopefully with more time and experience with HDR, I'll come to appreciate this visual upgrade, and learn how to better live with this feature. For now, my handful of hours is not enough to truly come to any conclusions.
I recently ordered the Spears & Munsil UHD HDR Benchmark disc which was just delivered today, so my next task is to learn how to use this calibration disc and spend hours tweaking my setup to extract the best possible picture. Perhaps that will become another write-up...