Daniel Stone
December 11, 2017
Recently, Sean Paul from Google's ChromeOS team submitted a patch series to enable HDCP support in the Intel display driver. HDCP - or High-bandwidth Digital Content Protection to its parents - is used to encrypt content over HDMI and DisplayPort links, so that it can only be decoded by trusted devices.
HDCP is typically used to protect high-quality content. A source device will try to negotiate an HDCP link with its downstream receiver, such as your TV or a frame-capture device. If an HDCP link can be negotiated, the pixel content will be encrypted over the wire and decrypted by the trusted downstream device. If an HDCP link cannot be successfully negotiated and pixel data remains unencrypted, the typical behaviour is to fall back to a lower resolution or quality that is in some way less desirable to capture.
This is a form of copy protection, usually lumped in with Digital Rights Management, something the open-source community is often jumpy about. Most of the sound and fury typically comes from people mixing up the acronym with the kernel's display management framework, the Direct Rendering Manager; this is thus the first known upstream submission of DRM for DRM.
Regardless, there is no reason for the open-source community to worry at all.
HDCP support is implemented almost entirely in the hardware. Rather than adding a mandatory encryption layer for content, the HDCP kernel support is dormant unless userspace explicitly requests an encrypted link. It then attempts to enable encryption in the hardware and informs userspace of the result. So there's the first out: if you don't want to use HDCP, then don't enable it! The kernel doesn't force anything on an unwilling userspace. Sinks (such as TVs) cannot demand an upstream link provide HDCP, either.
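To make that concrete, the interface proposed in the series is just a connector property. Below is a minimal sketch, using libdrm, of how a KMS client could opt in; the "Content Protection" property name and its Undesired/Desired/Enabled values are taken from the patches as posted, so the details may differ in whatever eventually lands.

/* Minimal sketch: ask the kernel to enable HDCP on every connector by setting
 * the "Content Protection" property to Desired. The kernel attempts the
 * handshake in hardware and, only if it succeeds, flips the property to
 * Enabled; re-reading the property later (not shown) tells you the result.
 * Error handling is elided for brevity. */
#include <fcntl.h>
#include <stdint.h>
#include <string.h>
#include <unistd.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

static void request_hdcp(int fd, uint32_t connector_id)
{
    drmModeObjectProperties *props =
        drmModeObjectGetProperties(fd, connector_id, DRM_MODE_OBJECT_CONNECTOR);

    for (uint32_t i = 0; props && i < props->count_props; i++) {
        drmModePropertyRes *prop = drmModeGetProperty(fd, props->props[i]);
        if (prop && strcmp(prop->name, "Content Protection") == 0) {
            /* 0 = Undesired, 1 = Desired, 2 = Enabled. Userspace may only
             * request Undesired or Desired; Enabled is set by the kernel. */
            drmModeObjectSetProperty(fd, connector_id,
                                     DRM_MODE_OBJECT_CONNECTOR,
                                     prop->prop_id, 1 /* Desired */);
        }
        drmModeFreeProperty(prop);
    }
    drmModeFreeObjectProperties(props);
}

int main(void)
{
    int fd = open("/dev/dri/card0", O_RDWR);
    drmModeRes *res = drmModeGetResources(fd);

    /* For brevity, just ask for link encryption on every connector. */
    for (int i = 0; i < res->count_connectors; i++)
        request_hdcp(fd, res->connectors[i]);

    drmModeFreeResources(res);
    close(fd);
    return 0;
}

If the handshake fails, nothing breaks: the property simply never reaches Enabled, and the pixels keep flowing unencrypted exactly as before.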
HDCP support is also only over the wire, not on your device. A common misconception is that DRM means that the pixel frames coming from your video decoder are encrypted. Not so: all content is completely unencrypted locally, with encryption only occurring at the very last step before the stream of pixels becomes a stream of physical electrons on a wire.
Technically speaking, this means that all framebuffers presented to DRM/KMS are provided unencrypted; if GPU composition is involved, the buffers presented through OpenGL or Vulkan for composition are also unencrypted, as is the GPU output. These unencrypted buffers are placed on a plane, which is mixed into a single CRTC's unencrypted output by the display controller. Only once the final CRTC pixel stream makes it to the encoder stage (where it is transformed from pixel content into a stream of DisplayPort/HDMI signals) does the encryption occur. By this stage, the content is already unrecognisable, as it has been prepared for electrical transmission by 8b/10b encoding, potentially cut into DisplayPort packets, and so on.
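The boundary is visible in the KMS object model itself: planes feed a CRTC, the CRTC feeds an encoder, and the encoder drives the connector on the wire. A small sketch with libdrm (hypothetical, error handling elided) that prints that chain for each connected output; everything upstream of the encoder in this chain handles plain, unencrypted pixels:

/* Sketch: walk the KMS object chain (connector <- encoder <- CRTC) to show
 * where the link-encryption boundary sits. Planes feed the CRTC, the CRTC's
 * composited (unencrypted) output feeds the encoder, and only the encoder's
 * output onto the wire is HDCP-encrypted. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

int main(void)
{
    int fd = open("/dev/dri/card0", O_RDWR);
    drmModeRes *res = drmModeGetResources(fd);

    for (int i = 0; i < res->count_connectors; i++) {
        drmModeConnector *conn = drmModeGetConnector(fd, res->connectors[i]);
        if (conn->connection == DRM_MODE_CONNECTED && conn->encoder_id) {
            drmModeEncoder *enc = drmModeGetEncoder(fd, conn->encoder_id);
            printf("connector %u <- encoder %u <- CRTC %u\n",
                   conn->connector_id, enc->encoder_id, enc->crtc_id);
            drmModeFreeEncoder(enc);
        }
        drmModeFreeConnector(conn);
    }

    drmModeFreeResources(res);
    close(fd);
    return 0;
}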
HDCP is only downstream-facing: it allows your computer to verify that the device it has been plugged into is trusted by the HDCP certification authority, and nothing more. It does not reduce user freedom, or impose any additional limitations on device usage.
The only way to implement a secure decode pipeline is with a complete hardware-backed verified boot sequence. In this, the hardware itself must be a vital link in the content pipeline (holding, e.g., decryption keys for the content), and it must be able to attest that the device is only running trusted code which will not leak that content. There are a number of ways to boil that particular ocean, but having your display hardware enable over-the-wire encryption is pretty much irrelevant.
In short, if you already run your own code on a free device, HDCP is an irrelevance and does not reduce freedom in any way.
Comments (13)
oiaohm:
Dec 12, 2017 at 07:56 AM
Really, you can buy all the hardware to have a Raspberry Pi capture and record an HDMI output whose transmission is encoded with HDCP 2.2. Yes: it's an HDMI to CSI-2 camera bridge supporting HDCP 2.2 connected to the Raspberry Pi, plus an HDCP 2.2 to HDCP 1.4 converter box for old monitors.
The idea that HDCP protects high-quality content is just a huge myth that content producers believe. What HDCP really protects against is Van Eck phreaking, where someone captures the radio signal leaking from the monitor cable; with the signal encrypted and mixed with background noise, recovering a screen image that way becomes impossible. So from a security point of view we should want to apply HDCP to the signal going to the monitor.
>>Sinks (such as TVs) cannot demand an upstream link provide HDCP, either.
Simon Farnsworth:
Dec 12, 2017 at 04:13 PM
The issue with sinks is that if they don't like your signal, they simply ignore it outright. So, in the case I described in my other comment, I had EDID saying that I could do 7.1 multichannel or bitstream audio, and a sink that "accepted" anything above HDMI Basic Audio by discarding it if you didn't do HDCP. I've seen people claim that Dolby TrueHD bitstream specs say that this is the correct behaviour with TrueHD audio over HDMI, too - which means that you can't bitstream TrueHD to the sink with free OSes unless you also do HDCP.
So, while they can't "demand" HDCP, they can react to you not using it by failing to work as expected (nothing stops a display putting up a "HDCP required" message on screen, either, bar angry users - and all my certified HDMI sources turn on HDCP automatically if the sink supports it, so there's a good chance that the average home user will not know that their display has this antifeature). Having HDCP available in a free OS lets you interoperate with said sinks; they can't tell that you don't do DRM, just HDCP.
Daniel Stone:
Dec 13, 2017 at 02:48 PM
No content protection scheme is, or ever will be, perfect. In the case of open systems, where users can run their own bootloaders, kernels, and underlying userspace, and manipulate the behaviour of applications through preload and ptrace, trying to protect content will always be a non-starter. Hence the only way to use it for content protection is if you already fully control the path from power-on to application: bootloaders, kernels, media decode blocks, applications, the works. The HDCP support does allow people to do this, but realistically, they were already patching the support in anyway. If you already control your own system, then this is a complete no-op.
Simon Farnsworth:
Dec 12, 2017 at 09:34 AM
It's actually a net gain for freedom lovers. I've encountered A/V gear that only does 48 kHz, 16-bit stereo (or less) PCM if you don't do HDCP. Negotiate HDCP and you can do bitstream audio, or the full 192 kHz, 8-channel, 24-bit PCM.
Cost me some time working out why the PC was silent...
Keith Curtis:
Dec 13, 2017 at 12:49 AM
You are making an interesting point, but just remember that if it's Blu-ray 4K content, it is encrypted on the device. In that scenario there are two encryptions/decryptions going on. We can argue whether this adds value or costs freedom, but it's worth remembering that doing it twice starts to sound silly.
Also remember that first they have to put the chains on, and then they tighten them over time. Things might be comfortable now, but what about in 5 years?
Daniel Stone:
Dec 13, 2017 at 02:45 PM
True, you're right that some content needs to be decrypted first. But the presence (or otherwise) of HDCP doesn't change that. HDCP does add an additional encryption stage at the end, but the content has to be decrypted first to get to that point, so the GPU and display controllers can consume it. Even if only for performance/power reasons, on-the-fly frame decryption in these devices is extremely unlikely to happen. Especially as any mixing of content (e.g. overlaying subtitles, controls, or descriptions) means that you'd need to decrypt the bytestream, encrypt the raw frames, decrypt the raw frames to composite, re-encrypt the full composited frame, decrypt it again and then encrypt it with HDCP. Much easier to just skip all the intermediate stages, and only decrypt content on completely trusted devices, as is already the case today.
Bob Hannent:
Feb 23, 2018 at 10:43 AM
One of the important parts of HDCP is that switching it on and off creates a visual disturbance while the display handshakes, so if any content explicitly requires HDCP then it should be enabled from boot; otherwise, when the secure component is invoked, it will cause disruption.
But what secure component? Well, in some architectures, you can run what is called a secure video pipeline: the encrypted content is sent directly from the stream/storage parser into a secure co-processor. The secure co-processor decrypts the video into scrambled RAM or pipes it directly (e.g. via DMA) into the video decoding component, and then the video decoding block decodes the video and composites it into the video frame above the framebuffer. If done correctly, nothing, not even the kernel or the video driver, can see the decrypted video content. You can complain about HDCP being a slippery slope, but there are tens of millions of Linux devices, possibly hundreds of millions, in existence which already do this. HDCP being exposed in public releases of Linux is just exposing something that has been used for a decade in private implementations.
Jason Ekstrand:
Dec 14, 2017 at 05:12 AM
At the risk of sowing fear, it's worth pointing out that there are ways of having the encryption go deeper than just the link to the display. There are extensions for EGL and OpenGL ES (EGL_EXT_protected_content and GL_EXT_protected_textures) which allow for creating protected images and keeping things protected as they pass through the compositor and on down to the display. It is possible to implement things in hardware such that the data gets encrypted in the producing process and remains encrypted and untouchable from software as it passes through the various window-system layers, through KMS, and on to the display.
That said, I'm still all for it. In the Wayland world, we've done a lot to protect against random applications that want to grab your screen. If I'm going to open a password manager, I would love it if the only things capable of seeing that password are the password management application and my eyes. Having a protected compositing environment would mean that applications could have some level of trust that nothing is going to grab their contents without their knowledge. The technology itself isn't evil but, just like a rock, it can be used for evil purposes.
Anonymous:
Dec 16, 2017 at 08:35 PM
I'll agree with that, with a significant caveat: the owner of the device (i.e., root) must still have full control and be able to bypass all of those "protections".
If I choose to, I should be able to install a version of EGL, or Wayland, or the i915 driver, or whatever, that *tells* applications they are using an "encrypted" API, but at the same time, continues to allow privileged applications to snoop on them.
If that's the case, I have no problem with it. However, if there is some sort of cryptographic verification protocol that would allow the application to tell the difference, that's when it becomes unacceptable for a libre operating system.
ralph stone:
Dec 17, 2017 at 05:29 AM
Sew it has come to this. Anyway as long as it's an unenforceable and negotiated encryption, user end, it's no worse than TLS. At least I think sow.
Howard Johnson:
Dec 21, 2017 at 07:27 PM
Credit card numbers, private databases, customer accounts, and on and on, require security. Holding someone else's private information securely on my open-source machine is what this is all about, I think.
The negotiation goes like this:
1) I can choose to not have their information on my machine, or
2) If I choose to hold their information, then they can either:
a) leave the security up to me, or
b) they can specify the type and method of security they want.
Because 2a has largely failed to secure content, now we're in 2b.
I think the Open Source Community should be the champion of protecting data, and thus embrace DRM as one way of accomplishing that.
James:
Jan 02, 2018 at 05:34 PM
It will also reduce splitting up the signal so it cannot be used around the home. How much extra data is required to encrypt HD content?
James Birkett:
Oct 03, 2019 at 05:18 AM
DRM has never worked for the stated purpose of preventing piracy, and it never will. It didn't stop DVD rips, Blu-ray rips or cable/satellite TV rips. It isn't going to get any better, because legitimate devices need to have access to the decryption keys (otherwise they would be unable to display the content), so the first time there's a device with any kind of hardware or software flaw that allows the keys to be extracted, the DRM scheme is dead.
What DRM does accomplish is causing a nuisance for legitimate users. Linux should not enable this shit; if vendors like Google want to do it they should be forced to maintain their own patches out of tree.
Bit late now though.