Why I Tell Everyone to Use Signal Instead of WhatsApp
When a company’s entire business model depends on harvesting data, why on earth would you trust them to protect your privacy?
I tend to assume everyone has heard of Signal, but a surprising number of people still haven’t. Honestly, this baffles me. When I mention it, they look at me like I’ve asked them to switch from a smartphone to a ham radio. “But WhatsApp is encrypted,” they say. “It’s fine!”
Here’s the thing: WhatsApp is owned by Meta. The same company that paid $5 billion in FTC fines after the Cambridge Analytica scandal. The same company whose investors sued Mark Zuckerberg, alleging that he ran Facebook as “an illegal data harvesting operation.” The same company that settled for $190 million in November 2025 because board members allegedly failed to stop repeated violations of user privacy.
I also believe the encryption argument is a distraction. If I asked you to hand over your birth certificate, passport details, and driving license, promising to scan them and store them in an encrypted VeraCrypt container on my machine, what would the problem be?
The problem is that you’re trusting me with the keys. The encryption is irrelevant if I’m the one who controls access to it. I could decrypt it whenever I want, share it with whoever I want, or hand it over if someone with a court order comes knocking. The fact that it’s “encrypted” means nothing when you have no control over who holds the keys.
Now apply that to WhatsApp. To be fair, technically neither WhatsApp nor Signal “hold” your decryption keys in the traditional sense. Both apps use the Signal Protocol, which generates your private key on your device. When you install either app, your phone creates a key pair: a public key that gets shared with the server, and a private key that stays on your device. Messages are encrypted so that only your private key can decrypt them. The server never sees it.
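The key-exchange idea behind this can be illustrated with a toy Diffie-Hellman sketch. To be clear, this is my own simplification, not the Signal Protocol itself (which uses X25519 and a double ratchet): the point is simply that each device keeps a private value that never leaves it, shares only a public value, and both sides still arrive at the same secret.

```python
# Toy Diffie-Hellman key agreement -- illustration only, not the Signal
# Protocol. The private key never leaves the "device"; only the public
# value is shared via the server.
import secrets

# RFC 3526 group 14 prime (2048-bit MODP group) and generator.
P = int(
    "FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E08"
    "8A67CC74020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B"
    "302B0A6DF25F14374FE1356D6D51C245E485B576625E7EC6F44C42E9"
    "A637ED6B0BFF5CB6F406B7EDEE386BFB5A899FA5AE9F24117C4B1FE6"
    "49286651ECE45B3DC2007CB8A163BF0598DA48361C55D39A69163FA8"
    "FD24CF5F83655D23DCA3AD961C62F356208552BB9ED529077096966D"
    "670C354E4ABC9804F1746C08CA18217C32905E462E36CE3BE39E772C"
    "180E86039B2783A2EC07A28FB5C55DF06F4C52C9DE2BCBF695581718"
    "3995497CEA956AE515D2261898FA051015728E5A8AACAA68FFFFFFFF"
    "FFFFFFFF", 16)
G = 2

def make_keypair():
    private = secrets.randbelow(P - 3) + 2   # stays on the device
    public = pow(G, private, P)              # shared with the server
    return private, public

alice_priv, alice_pub = make_keypair()
bob_priv, bob_pub = make_keypair()

# Each side combines its own private key with the other's public key.
alice_shared = pow(bob_pub, alice_priv, P)
bob_shared = pow(alice_pub, bob_priv, P)
assert alice_shared == bob_shared  # same secret, never sent over the wire
```

The server relays the public values, but because the private keys never leave the devices, it cannot compute the shared secret, and that is the property the whole argument rests on.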
In theory.
With Signal, you can verify this is actually happening. The code is open source. Security researchers have audited it. You can compile it yourself if you’re paranoid enough. With WhatsApp, you’re taking Meta’s word for it. The private key should stay on your device, but you can’t verify that it does. And even if it does today, a silent update tomorrow could change that without you ever knowing.
Both apps use the same underlying protocol. The Signal Protocol. Open Whisper Systems developed it, and WhatsApp adopted it in 2014. So yes, technically, your messages are encrypted end-to-end. Meta cannot read the content of your texts.
But they can read everything else.
The Metadata Problem
WhatsApp collects your device details, IP addresses, usage patterns, and identifiers that tie you to your account. According to WhatsApp’s own privacy policy, this includes your network details, browser, ISP, and other identifiers linked to Meta products like Instagram and Facebook. They share certain categories of this information with Meta.
WhatsApp claims they don’t keep logs of who you message or call. But their privacy policy confirms they record “the time, frequency, and duration of your activities and interactions.” What exactly this means in practice remains disputed, though multiple reports indicate the FBI can access WhatsApp metadata in real time.
Signal collects almost nothing. They store only the timestamp of account creation and the last time you connected to their server. That’s it. No logs of who you talked to. No record of how long. Nothing useful for advertisers or governments.
One company makes money by knowing everything about you. The other is a nonprofit funded by donations and grants. When a company’s entire business model depends on harvesting data, why on earth would you trust them to protect your privacy?
Group Chat Vulnerability
In 2025, researchers from King’s College London published findings that, as Signal’s Meredith Whittaker put it, would make anyone say “WTF”. They reverse-engineered WhatsApp’s code and confirmed something first flagged back in January 2018: WhatsApp provides no cryptographic guarantees for group membership.
Your group messages are encrypted. The content is protected. But (and this is the big BUT) the membership of the group is controlled entirely by WhatsApp’s servers, not by cryptography.
What does this mean in practice? If someone compromises WhatsApp’s servers, they can silently add a spy to any group chat. You’d see “User X was added” in the chat, but there’s no cryptographic proof that a legitimate group admin actually made that change. The server just tells everyone to accept the new member, and every client obeys.
This is a fundamental flaw. End-to-end encryption is supposed to mean that even if the server is compromised, your communications remain private. But if a malicious actor can just add themselves to your group, the encryption is meaningless. They receive every message, decrypted, like any other member.
Signal handles this differently. Group membership changes are cryptographically signed and include a versioned group state. An admin’s device must actually sign off on adding someone. The server cannot fake that signature.
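The difference can be sketched in a few lines. In this toy model (my own simplification, not Signal’s actual wire format), a client only applies a membership change that carries a valid admin signature over the new, versioned group state, so a bare server announcement fails the check. I use HMAC with a key only admins hold as a stand-in for a real digital signature:

```python
# Toy model of signed group state vs server-asserted membership.
# HMAC stands in for a real admin signature -- illustration only.
import hashlib
import hmac
import json

ADMIN_KEY = b"admin-signing-key"  # stand-in for an admin's signing key

def sign_change(members, version):
    # Serialize the new group state deterministically, then sign it.
    state = json.dumps({"members": sorted(members), "version": version}).encode()
    return hmac.new(ADMIN_KEY, state, hashlib.sha256).hexdigest()

def client_accepts(members, version, signature):
    # A client applies a membership change only if the signature verifies.
    expected = sign_change(members, version)
    return hmac.compare_digest(expected, signature)

group = ["alice", "bob"]

# Legitimate change: an admin signs the new state, clients accept it.
new_group = group + ["carol"]
sig = sign_change(new_group, version=2)
assert client_accepts(new_group, 2, sig)

# Server-injected spy: no valid admin signature, so clients reject it.
spied_group = group + ["eve"]
assert not client_accepts(spied_group, 2, "forged-or-missing-signature")
```

In the WhatsApp model described above, there is no equivalent of that verification step: the server announces the new member and every client simply obeys.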
This vulnerability has been known for seven years. WhatsApp still hasn’t fixed it.
The Trust Question
People perform impressive mental gymnastics here. They want to believe that WhatsApp’s encryption makes it safe, that Meta’s privacy violations don’t matter because the protocol is sound, that closed-source code from a surveillance capitalism company is trustworthy.
Consider what we know about Meta’s leadership. In December 2025, California settled with Meta for $50 million over privacy violations. Shareholders alleged that Zuckerberg and other senior executives knew about privacy risks as early as 2012 but delayed action. Sheryl Sandberg was sanctioned during the shareholder litigation for deleting emails that may have been relevant to the Cambridge Analytica investigation. The case settled before these allegations could be tested at trial.
This is the company asking you to trust their encryption implementation. Their code is closed source. You cannot verify their claims. Independent security researchers cannot audit what’s actually running on your phone.
Signal’s code is open source. Every line is published on GitHub. If Signal tried to add a backdoor or some code that transfers data to their servers, they would have to do so publicly. Thousands of developers and security researchers scrutinize every change.
When a government asks Meta to help surveil a user, they have options. They could push a modified version of WhatsApp to that specific user. The user would never know. The app would look the same. But it could silently forward decrypted messages before encryption, or leak cryptographic keys. With closed source code, this is undetectable.
When governments have demanded data from Signal, the response has been simple: we don’t have it. They’ve published the subpoenas they’ve received. All Signal could provide was the timestamp of account creation and the last connection date. Nothing else exists on their servers.
My Advice
People ask me to explain the difference between WhatsApp and Signal in simple terms, so here it is:
Both apps encrypt your messages so nobody can read them in transit. But WhatsApp is owned by an advertising company that has been fined billions of dollars for privacy violations, collects extensive data about your device and usage patterns, uses closed-source code you cannot verify, and has a known vulnerability that lets server operators add spies to your group chats.
Signal is run by a nonprofit, collects almost nothing, publishes all their code for anyone to inspect, and cryptographically prevents unauthorized changes to group membership.
You can keep doing mental gymnastics to convince yourself WhatsApp is fine. Or you can just download Signal. It works the same way. It’s free. Everyone you know can join.
The hardest part is getting your friends to switch. But that’s a social problem, not a technical one. And it gets easier every time Meta settles another lawsuit.
A Note on Privacy in Layers
I should be clear that Signal alone isn’t extreme privacy. It’s one tool in a basket of many. For example, I don’t even use my real SIM card’s phone number. I only use VoIP numbers from services like MySudo. I have hundreds of email addresses and use tools like SimpleLogin to manage them at scale.
Why does this matter? Context. If I got a message supposedly from my bank asking me to reset my password through Signal, it would immediately seem absurd. That’s not where my bank contacts me. If my wife asks me what time I’m home for dinner and sends it to an email address I typically use for banking, something is also amiss. These boundaries create natural tripwires. I’ve written more about this approach in the ODSF framework, which applies these defensive techniques at an organisational level.
But here’s what I think is really going on with WhatsApp’s dominance: a combination of social proof and fear of missing out. Just because everyone else uses WhatsApp, like a herd of sheep, many of us follow along without questioning it.
Signal is the only way my close friends and family can message me. I’m very selective about who I add, and I still don’t trust Signal with access to my contacts. But my friends and family who aren’t any the wiser about the technical details find Signal an absolute joy to use. I have kids, and even they use it. We have a family group chat, and while most of the time it’s full of emojis and nonsense, that proves my point: Signal isn’t some obscure app that requires a PhD in cryptography to use.
There really is no excuse.
If you found this useful, share it with someone who still uses WhatsApp. They probably need to hear it.