

susceptible to backdoors and other USA shenanigans.
that’s pretty much the major difference here: Signal is verifiably not back-doored:
- it’s frequently independently audited
- all their code is open, so there are plenty of eyes on it to catch shenanigans
- they have reproducible builds, which means you can prove that the code you can read is exactly the same code that produced the binary running on your device
  - if you don’t know what this means: compiling the same code always produces the exact same binary result
  - there are people that do this automatically, so if a different source tree had produced the shipped binary - with a secret backdoor or something - it would be very obvious, and public
- given that, it’s reasonable to assume the binary running on your device was produced from the same open code everyone can read: you don’t need to verify it yourself
- whilst you can’t prove their server is running exactly what’s in their open repos, it doesn’t really matter… the point of their architecture is that it doesn’t matter what the server runs: it could be announcing all its data publicly and you’d still be secure, because the encryption, security, and privacy features are all ensured by the client
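to make that last point concrete, here’s a toy sketch (python, with a one-time pad standing in for signal’s actual double ratchet protocol - everything here is illustrative, not the real scheme): the key exists only on the two clients, so the blob the server relays is worthless to it:

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Toy one-time pad standing in for Signal's real protocol (the actual
# Double Ratchet is far more sophisticated); the point is only that the
# key lives on the clients and never touches the server.
message = b"meet at noon"
key = secrets.token_bytes(len(message))  # shared only between the two clients

ciphertext = xor_bytes(message, key)     # this opaque blob is all the server relays
plaintext = xor_bytes(ciphertext, key)   # only a key-holder can recover the message
assert plaintext == message
```

even a fully malicious server that "announces all data publicly" only leaks `ciphertext` - the server never holds anything that can open it.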
they receive whatever Google/Apple give them which may be quite different from what’s in the source code.
i don’t disagree: it’d be better if we all had the time, skill, and energy to invest in auditing our own systems… but realistically nobody does, let alone people that don’t really care about privacy
with that in mind, it’s all about getting as close as possible… given signal’s reputation, you can be pretty sure the source code has a lot of eyes on it, and that if backdoors were found it would be news
and given reproducible builds, as i said earlier, you can (or rather, i certainly do) assume that if there were a mismatch between the binaries and the source you’d also hear about it
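a minimal sketch of what those automated build-checkers do, assuming builds are byte-for-byte deterministic (the “binaries” here are placeholder bytes, not real artifacts): build the public source yourself, hash it, and compare against the shipped binary:

```python
import hashlib

def digest(artifact: bytes) -> str:
    # Builds are compared byte-for-byte, so one hash per artifact is enough.
    return hashlib.sha256(artifact).hexdigest()

# Simulated artifacts: the binary you compiled from the public source,
# and the one shipped to your device (placeholder bytes, not real ELF).
my_build = b"\x7fELF signal-app build"
shipped = b"\x7fELF signal-app build"

# A reproducible build means the digests match exactly; any hidden change
# in the shipped binary - a backdoor, say - would change its hash.
assert digest(my_build) == digest(shipped)

tampered = shipped + b" + backdoor"
assert digest(tampered) != digest(shipped)
```

the people doing this publicly only need one mismatch, ever, to blow the whistle - which is why you (and i) can lean on their checks instead of running our own.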
of course, that doesn’t stop targeted attacks by nation states, but that’s never what we talk about in personal security and privacy situations… it’s just not the threat model that most (i’d wager any) of us should be thinking about, because defending against it is not just a full-time job: it’s an entire team’s full-time job… we just aren’t being directly targeted like that, and if we are then tbh it’s all over. we protect against general surveillance… we can’t protect against zero-days, physical device access, etc
If they can then Signal can as well, right?
kinda… again, reproducible builds: either of them could technically put code in their app that sends private keys to their servers somehow, but if you break it down it’s far more likely to be caught in signal than in whatsapp
more likely Google and Apple will
i’m not sure what you mean by this… sure, apple or google could send you an update to ios/android that extracts data from apps, but again that would be a very large-scale attack… you can protect against this by running grapheneos etc, which does similar reproducible builds, but in that case we aren’t talking about the app: signal is absolutely the app you would rely on if you’re going that far… you wouldn’t verify your hardware and OS integrity and then just skip the app integrity lol
or perhaps you mean that google or apple could send you specifically a binary of signal that’s been modified? but that’s actually not really likely because apps are signed by developers: apple and google can’t actually send you something that the developer hasn’t “approved”… sure, they control the OS so they can circumvent all the restrictions, but again that’s a massive attack, and really far beyond what’s reasonable to consider for most people (and again, that applies to both whatsapp and signal so it’s not really a point in favour of whatsapp)
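here’s a toy model of that signing restriction (real stores use asymmetric developer certificates; this sketch pins a symmetric key instead, purely to show the logic): the OS remembers which key signed the app at install time and rejects updates signed by anyone else:

```python
import hashlib
import hmac

# Toy stand-in for app signing: real stores use asymmetric developer
# certificates, but the pinning logic is the same - the OS remembers the
# key that signed the app and refuses updates signed with any other key,
# so the store can't silently swap in a modified binary.
def sign(key: bytes, binary: bytes) -> bytes:
    return hmac.new(key, binary, hashlib.sha256).digest()

def os_accepts_update(pinned_key: bytes, binary: bytes, signature: bytes) -> bool:
    return hmac.compare_digest(sign(pinned_key, binary), signature)

developer_key = b"held by the developer, not the store"
update = b"new app binary"

# An update signed by the developer is installed:
assert os_accepts_update(developer_key, update, sign(developer_key, update))
# The same binary signed by anyone else - the store, a MITM - is rejected:
assert not os_accepts_update(developer_key, update, sign(b"someone else", update))
```

circumventing this means shipping a modified OS, which is exactly the “massive attack” case above, not a quiet app swap.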
But as I understand it any US company will have to store and provide metadata, logs, etc when the government agencies tell them to
absolutely correct… the point of privacy the way signal does it is that they can hand everything over and it’s useless: the information signal themselves can extract, even by modifying their own code, is completely worthless. they have your IP address, phone number, some timestamps, and encrypted blobs (AFAIK they don’t store a lot of that, but that’s not provable, so we should assume it’s stored, either accidentally or because of coercion)… they can see when you messaged, but not even things like who you messaged
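to illustrate, a hypothetical sketch of the shape of such a handover - the field names and values are invented here, though signal does publish its real subpoena responses, which are similarly sparse:

```python
# Illustrative shape of what could plausibly be handed over for one
# account (invented field names; values are documentation placeholders):
compelled_record = {
    "phone_number": "+15550100",
    "last_connection_ip": "203.0.113.7",       # documentation-range IP
    "account_created": "2020-01-01T00:00:00Z",
    "queued_messages": [b"\x8a\x1f..."],       # opaque encrypted blobs
}

# Conspicuously absent: contact lists, group memberships, message content,
# and - thanks to sealed sender - even who messaged whom.
assert "contacts" not in compelled_record
assert "recipients" not in compelled_record
```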
if signal’s infra and private keys etc were literally handed over to the US government right now and they specifically wanted to target you personally, it’s highly unlikely they could do anything particularly useful with any of it before it was noticed, and then you can stop using signal before they actually intercept new communications (and old communications are protected, assuming you wipe the app and all its stored info before they can send you a poisoned update)
and with all of this, it doesn’t really matter where signal is based: US, China, Russia, Guam, Switzerland, Iran: doesn’t matter… the structure is built in such a way that if Signal the organisation is coerced, it’s either:
- obvious, and therefore noticed by the community at large and thus you’d hear about it
- not useful: ie all information that Signal has is provably garbage
- such a large scale that we globally have huge problems (and we do, but that’s not something you can solve)
- targeted, in which case you have big problems and whilst this may be part of it, you need to have a lot more resources to detect and solve it. this just isn’t the reality for most people
it’s about your threat model: you can’t worry about massive scale, and you can’t worry about being individually targeted… unless that is part of your threat model, in which case signal is still part of your solution (along with auditing and validating every part of the chain from hardware to OS to the apps, all of which requires reproducibility or building from audited source) and whatsapp fundamentally is not

that’s the incredibly clever part
https://signal.org/blog/sealed-sender
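roughly, sealed sender moves the sender’s identity inside the encrypted payload, so the server only sees the recipient it needs for routing… a toy sketch (XOR with a random key standing in for the real certificate-based crypto):

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Toy sealed-sender envelope: the only plaintext the server gets is the
# recipient, which it needs for routing; the sender's identity travels
# inside the encrypted payload. (The real scheme uses sender certificates
# and public-key encryption, not this XOR stand-in.)
recipient_key = secrets.token_bytes(64)   # known only to the recipient

inner = b"from:alice|hi bob"              # sender identity + message
envelope = {
    "to": "bob",                           # all the server can read
    "payload": xor_bytes(inner, recipient_key),
}

# The server can route the envelope but has no idea who sent it;
# only the recipient decrypts the payload and learns the sender.
assert "from" not in envelope
opened = xor_bytes(envelope["payload"], recipient_key)
assert opened.startswith(b"from:alice")
```

so even the “who messaged whom” metadata above stops existing server-side: there’s simply no sender field for them to log or hand over.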