Compare to Debian, Ubuntu, RHEL, etc., where you can compile your own newer package, install it
None of these operating systems have Verified Boot. Desktop systems are inherently less secure than mobile devices.
even replace core operating system components
Don’t you see the problem there?
and then seamlessly upgrade to the OS vendor’s version when they catch up.
That’s not true either. Swapping out random parts of the OS will certainly lead to breakage and dependency hell in your package manager (unless you just replace files without using the package manager, which might make all of this even worse).
and don’t offer any way to install patches, besides building it again
Normal Android behavior. This will be the case for ANY Android-based OS with Verified Boot enabled.
and this is just TPM for Android
This isn’t comparable to TPM at all. TPM is a very insecure way of providing a hardware keystore. It can easily be bypassed. Here’s a demonstration of that https://www.youtube.com/watch?v=wTl4vEednkQ
I’ve opposed TPM from the start
TPM in desktop machines neither really provides a benefit nor inconveniences the user. I’m also opposed to it from a security perspective, since it’s misleading: it makes users think it’s actually secure and comparable to proper secure elements (such as the Titan M2, the Apple Secure Enclave Processor, or the Qualcomm SPU), while it really isn’t.
I don’t want a device that keeps secrets from me.
It’s not keeping secrets from you; the secure keystore is keeping cryptographic secrets away from attackers. Only you can use the cryptographic secrets in the secure element, by combining them with your PIN/passphrase (in a key derivation function) to derive the keys used for full-disk encryption. Without a secure element that safely guards these secrets and throttles unlock attempts, attackers like law enforcement agencies can easily brute-force your six-digit PIN and gain full access to your device, including all of your data. The iPhone 11 was the last iPhone generation without a proper secure element, and a six-digit passcode was bypassed on the iPhone 11 Pro Max just a few months after its release: https://appleinsider.com/articles/20/01/15/fbi-reportedly-accessed-locked-iphone-11-pro-max-with-graykey-third-party-tool
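The role the secure element plays in disk encryption can be sketched in a few lines. This is an illustrative model only, not Android’s actual key derivation scheme; the function name, iteration count, and parameters are made up for the example:

```python
import hashlib
import hmac

def derive_disk_key(user_pin: str, se_secret: bytes, salt: bytes) -> bytes:
    """Illustrative sketch: the full-disk encryption key depends on BOTH
    the user's PIN and a secret that never leaves the secure element."""
    # Stretch the low-entropy PIN first. A six-digit PIN has only
    # 10**6 = 1,000,000 possibilities -- trivial to brute-force on its own.
    stretched = hashlib.pbkdf2_hmac("sha256", user_pin.encode(), salt, 100_000)
    # Mix in the hardware-bound secret. Without it, an attacker who dumped
    # the raw flash storage cannot attempt PIN guesses off-device, and the
    # secure element throttles on-device guesses.
    return hmac.new(se_secret, stretched, hashlib.sha256).digest()
```

The point is that both inputs are required: a dump of the encrypted storage alone is useless, and the secure element never hands out its secret, only rate-limited derivations made with it.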
This hasn’t been possible since the iPhone 12 generation. That is even confirmed by leaked internal documents from forensics companies such as Cellebrite: https://discuss.grapheneos.org/d/14344-cellebrite-premium-july-2024-documentation I specifically recommend looking at the chart there. It clearly shows that data cannot be extracted from iPhones with secure elements unless the device is in the AFU (after first unlock) state, meaning that the encryption keys are kept in memory.
I do want comprehensive backups, including all cryptographic keys.
Having a secure element to isolate keys doesn’t make sense if you can just export them: the security of those keys can no longer be guaranteed once they’re outside the secure hardware keystore. This is not unique to Android or mobile devices. Look at U2F hardware security tokens such as the YubiKey or Nitrokey: you can’t export your keys there either, which is by design, and it’s a good thing.
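The “no export” property can be modeled in a few lines. This is a toy stand-in for a secure element or U2F token, with invented class and method names, and an HMAC standing in for a real signature scheme:

```python
import hashlib
import hmac
import os

class HardwareKeystoreModel:
    """Toy model of a secure element / U2F token: the private key is
    generated inside the device and can be *used*, but never read out."""

    def __init__(self) -> None:
        self._key = os.urandom(32)  # created inside the "element"

    def sign(self, message: bytes) -> bytes:
        # The element performs operations with the key on the caller's
        # behalf; the key itself never crosses the boundary.
        return hmac.new(self._key, message, hashlib.sha256).digest()

    def export_key(self) -> bytes:
        # Deliberately unsupported -- this restriction is the whole point.
        raise PermissionError("keys are non-exportable by design")
```

Backups can cover everything *except* this key material; losing the exportability of keys is the price of making them unextractable by attackers too.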
I’m fully aware that Widevine won’t run on a device where the owner has control over the whole device.
This has nothing to do with Widevine. The vast majority of Android devices currently on the market doesn’t have a secure element. Widevine still works on these devices.
and you seemingly can’t do it incrementally
What do you mean? Generating an incremental update .zip package and flashing it over your stock GrapheneOS installation? No, that’s certainly not gonna work, because Verified Boot checks that the signature of the update package matches the signature of the installed OS. This is a very important security feature: it prevents attackers (cybercriminals, but also law enforcement, intelligence agencies, etc.) from hacking into update servers and delivering malicious updates to users. If I remember correctly, that’s how EncroChat got bugged by the French police. They hacked into the update servers, which were hosted at OVH, and then distributed malicious update packages, gaining access to the devices of all EncroChat users.
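The update-acceptance policy described above can be sketched as a toy model. Real Android Verified Boot uses signed vbmeta structures rather than anything like this; the type and function names below are invented for the example, and the signature itself is not modeled, only the digest it covers:

```python
import hashlib
from dataclasses import dataclass

@dataclass
class UpdatePackage:
    payload: bytes
    signer_fingerprint: str  # fingerprint of the key that signed the package
    payload_sha256: str      # digest covered by the (not modeled) signature

def accept_update(pkg: UpdatePackage, pinned_fingerprint: str) -> bool:
    """An update is accepted only if it comes from the SAME signer as the
    installed OS and its payload matches the signed digest. A compromised
    update server can swap the payload, but cannot forge a signature under
    the pinned key."""
    if pkg.signer_fingerprint != pinned_fingerprint:
        return False  # signed by a different key -> rejected outright
    return hashlib.sha256(pkg.payload).hexdigest() == pkg.payload_sha256
```

This is why an EncroChat-style attack fails against Verified Boot: controlling the update server lets you change the payload, but not produce a valid signature under the key the device has pinned.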
since you have to flash an entire operating system at a time
That is correct. Changing the signing keys requires you to unlock the bootloader, wipe the Verified Boot keys, and replace them with a new set of trusted custom (i.e. non-stock) keys. Otherwise someone could just flash a malicious OS over your current OS while retaining all of your data, and then use it to extract your data.
Of course you can easily make changes and install new versions incrementally once you’ve installed your custom OS and signing keys onto the device. Also, none of this is some crazy GrapheneOS invention; it’s the default Android Verified Boot behavior, which GrapheneOS builds upon.
Swapping out random parts of the OS will certainly lead to breakage and dependency hell in your package manager (unless you just replace files without using the package manager, which might make all of this even worse).
I’ve done it, and it works. I’ve built packages of libraries and binaries before, at higher version numbers than Debian had, and deployed them to multiple Debian sid systems. They worked. When Debian caught up, I seamlessly upgraded all 3 systems with no problems.
Even in the worst-case scenario of dependency hell, you could downgrade to the Debian-supported version. But I never had to do anything like that.
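The “seamless upgrade when Debian catches up” works because of dpkg’s version ordering: a locally built package can use a version like `1.2-1~local1`, and since `~` sorts before anything (even end-of-string), the official `1.2-1` wins as soon as it appears in the archive. A simplified sketch of that comparison (no epoch or revision handling, unlike the real dpkg algorithm):

```python
def dpkg_compare(a: str, b: str) -> int:
    """Simplified dpkg-style version comparison.
    Returns -1 if a < b, 0 if equal, 1 if a > b."""
    def order(c: str) -> int:
        if c == "~":
            return -1        # '~' sorts before everything, even end-of-string
        if c.isalpha():
            return ord(c)
        return ord(c) + 256  # other punctuation sorts after letters

    i = j = 0
    while i < len(a) or j < len(b):
        # Compare the leading run of non-digit characters.
        while True:
            ca = order(a[i]) if i < len(a) and not a[i].isdigit() else 0
            cb = order(b[j]) if j < len(b) and not b[j].isdigit() else 0
            if ca == cb == 0:
                break
            if ca != cb:
                return -1 if ca < cb else 1
            i += 1
            j += 1
        # Compare the leading run of digits numerically.
        da = 0
        while i < len(a) and a[i].isdigit():
            da = da * 10 + int(a[i])
            i += 1
        db = 0
        while j < len(b) and b[j].isdigit():
            db = db * 10 + int(b[j])
            j += 1
        if da != db:
            return -1 if da < db else 1
    return 0
```

So `dpkg_compare("1.2-1~local1", "1.2-1")` returns -1: the local build sorts below the eventual official package, which is exactly what makes the handover seamless.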
I’m not going to respond to all the rest of your post, because I don’t think it will help with anything. It seems that we have very different ideas about device ownership.
you would be able to downgrade to the Debian supported version
That’s pretty specific to fixed-release distros, and it’s not gonna work on e.g. Arch Linux.
I’m not going to respond to all the rest of your post, because I don’t think it will help with anything. It seems that we have very different ideas about device ownership.
You don’t have to respond to it, I’d be happy enough if you would just acknowledge it. I too like the fact that one can tinker with Linux systems. I’ve always told people who want to study OS architecture to daily drive either Linux or one of the BSDs. They’re really fantastic operating systems for learning how computers and operating systems work. I too have built libraries and system utilities from scratch. I still wouldn’t recommend it on production systems. I built Linux from Scratch many times, and I think it’s pretty fun and informative (if you pay attention, instead of just copy-pasting the commands from the instructions).
Yet the fact remains that desktop operating systems are inherently less secure than mobile systems, which were designed with a strong focus on security from the ground up. SELinux is a good example: how many desktop Linux distributions do you know that deploy SELinux (or a comparable LSM) in enforcing mode, with meaningful policies? Some of the mainstream distros, such as Ubuntu, Fedora and SUSE, do (sometimes with pretty weak policies), but among the vast majority of distros? I’d say almost none. Android, on the other hand, has used SELinux by default for a long time, with actual meaningful, secure policies. Btw, if you’re looking for a more secure Linux OS, check out secureblue. It’s based on Fedora Atomic and applies lots of hardening on top. Not affiliated or anything, I just think it’s a nice and secure distro.
All in all, I think production devices should be secure. You can always have a second device that you can use to study the inner workings of an OS or make changes to it (or, in this case, run GrapheneOS in the Android emulator).