If TikTok can read your DMs, someone else eventually will
TikTok told the BBC it will not add end-to-end encryption to direct messages because it thinks encryption would make people “less safe.” The thing that keeps hackers, the company, and governments out of your private conversations is being framed as a threat.
In the company's words, end-to-end encryption “prevents police and safety teams from being able to read direct messages if they needed to.”
“Safety” is the one word that can bulldoze privacy debates, especially when kids are involved. But if a platform says it needs the ability to read your messages to keep you safe, it is asking for a master key.
And once that master key exists, it is not just for TikTok. It becomes an access path for law enforcement. You can call that “lawful access” if you want. It is still domestic surveillance.
End-to-end encryption is a design choice
Most people hear “encrypted” and assume that means “private.” It usually does not. A lot of services encrypt messages in transit, then decrypt them on the server.
End-to-end encryption (E2EE) means only the people in the conversation can read it. The service that delivers the message cannot. There is no server-side copy employees can browse, and no plaintext to hand over.
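The difference is easy to see in miniature. Below is a toy sketch, not real deployable cryptography: it assumes the two endpoints have already agreed on a shared secret (real E2EE systems negotiate one with a key exchange such as Diffie-Hellman) and uses a one-time pad in place of a production cipher. The point is structural: the relay only ever holds ciphertext.

```python
import secrets

# Toy end-to-end encryption sketch. Assumes Alice and Bob already share a
# secret key (real systems negotiate one via a key exchange); the XOR
# one-time pad here stands in for a real cipher.

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR each byte with the key. Secure only if the key is random,
    # as long as the message, and never reused (a one-time pad).
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

message = b"meet at noon"
shared_key = secrets.token_bytes(len(message))  # known only to the endpoints

ciphertext = encrypt(shared_key, message)  # all the server ever sees

# The platform stores and forwards only ciphertext. Without the key there
# is no plaintext to browse, leak, or hand over.
assert ciphertext != message

# The recipient, holding the shared key, recovers the message.
assert decrypt(shared_key, ciphertext) == message
```

In a server-side scheme, by contrast, the service holds the key (or the plaintext itself), so "encrypted" describes the pipe, not the mailbox.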
TikTok says its DMs are still protected with “standard encryption,” and that only authorized employees can access messages in limited situations, like responding to a valid law enforcement request or a user report.
That is the whole story. If employees can read the content, your DMs are not end-to-end encrypted. The company can read them, which means the government can potentially get them too.
Safety teams do not need a master key to do safety work
TikTok's argument is the oldest one in tech policy: if the company cannot read messages, it cannot help when something bad happens.
Bad things do happen in DMs. Grooming and harassment are real. Child safety groups like the NSPCC and the Internet Watch Foundation praised TikTok's choice for that reason.
But “we need to be able to read everything” is a lazy, high-risk solution to a hard problem. No safety team is proactively reading DMs at scale. What platforms actually do is respond to reports, look for patterns of abuse, and use product design to reduce exposure in the first place.
More to the point, E2EE does not prevent a platform from doing the things that actually matter for protecting minors. You can make DMs from strangers opt-in by default for teens, put friction and rate limits in the path of unknown accounts, block obvious spam and grooming patterns, and let users report a thread by sharing what they choose to share.
You can build strong guardrails without giving the company a standing ability to open every private envelope.
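Those guardrails run on metadata the server already has, not on message content. A minimal sketch of what such a delivery gate might look like, with every name and threshold invented for illustration:

```python
from dataclasses import dataclass

# Hypothetical server-side guardrails that need no access to message
# content: gate DM delivery on metadata (age, relationship, send rate)
# before any ciphertext is forwarded. All names and numbers are invented.

@dataclass
class Sender:
    is_contact: bool      # does the recipient follow or know the sender?
    msgs_last_hour: int   # send-rate metadata the server already tracks

@dataclass
class Recipient:
    is_minor: bool
    allows_stranger_dms: bool  # opt-in, off by default for teens

def may_deliver(sender: Sender, recipient: Recipient,
                rate_limit: int = 20) -> bool:
    if sender.msgs_last_hour >= rate_limit:
        return False  # throttle spray-and-pray spam from any account
    if recipient.is_minor and not sender.is_contact:
        # strangers reach teens only if the teen explicitly opted in
        return recipient.allows_stranger_dms
    return True

# A stranger messaging a teen with default settings is blocked.
assert not may_deliver(Sender(is_contact=False, msgs_last_hour=0),
                       Recipient(is_minor=True, allows_stranger_dms=False))
# A known contact gets through.
assert may_deliver(Sender(is_contact=True, msgs_last_hour=0),
                   Recipient(is_minor=True, allows_stranger_dms=False))
```

None of this requires reading a single message body, which is the point: the safety logic and the encryption live at different layers.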
The Internet Society's piece “Encryption keeps kids safe online” makes a point worth repeating: encryption protects kids too, including from data breaches and profiling.
The real risk is the access path
Once you decide the company must be able to read DMs, you have created a permanent high-value target. Even with good controls, that access eventually leaks or gets used in ways users never agreed to.
And it aligns perfectly with what governments have asked for, over and over: a world where providers can be compelled to produce message content when served with legal process (see EFF on the long-running push for encryption backdoors).
The safest messages are the ones the platform cannot read. You cannot leak what you cannot access. You cannot be compelled to hand over plaintext you never had.
If you keep DMs readable, you are keeping an access path open for domestic surveillance. TikTok is explicit that it can access DMs in response to a “valid law enforcement request.” End-to-end encryption removes that option by design.
This matters for any company, but it matters even more for TikTok because of its “combustible optics,” as analyst Matt Navarra put it to the BBC. TikTok is already fighting suspicion about its ownership and state pressure. Keeping DMs decryptable just adds another “trust us.”
And this is where the “for safety” framing really annoys me. If TikTok wants to differentiate itself from rivals by keeping DMs readable, it should say that plainly. It should tell users: we will read DMs when we decide it is necessary, and we will keep them available for law enforcement requests.
That is the deal, and the rest is marketing.
The sharp part: stop calling encryption controversial
End-to-end encryption is the baseline now. Signal and WhatsApp do it, and iMessage has for years (Apple’s iMessage security overview). “We cannot read this” is the cleanest security promise a messaging service can make.
If TikTok wants to keep DMs in the “we can read it” category, fine. Users should know what they are signing up for.
But calling E2EE a safety risk flips the world upside down. The risk is not that criminals will hide from law enforcement. The risk is that regular people will keep being trained to accept surveillance as the default setting for communication.
If a platform tells you it needs to read private messages to keep you safe, believe the quiet part: it is building a system where somebody, somewhere, can read them. And once that door exists, it will get used.
Call it what it is: surveillance.