I can't imagine any justification for any government device that should be secure to have anything on it but the bare minimum software and the device in whatever hardened mode it has.
If they visit the White House, government facility ... should go in a locker.
I worked for a company that sent people onsite to government contractors. One contractor we rarely visited was at a facility where you arrived at the front gate in your rental car with your ID, keys, and the equipment you needed. You were told that if you brought anything else, you should expect to lose it.
They took your ID and keys at the gate, searched the car, blindfolded you, and escorted you to the location of the equipment. If you had to go to the bathroom you were escorted (all the way...). You left with the clothes on your back.
We went through a lot of laptops, but ... that place was secure.
Government: Zuck put a backdoor in WhatsApp or we will put you in a blacksite UFC ring and beat you up.
Also Government: WhatsApp has a backdoor. Don't use it.
Also government: installed special version of Signal that includes a backdoor (logs)
People: don't use Signal! It has a back door! Instead, use Telegram, which doesn't have encryption by default and is highly suspected of ties to a foreign adversary
Also people: "I'll just send copies of all my messages to the government because they have my data anyways"
[flagged]
Which is the fascist government?
Haven't you been paying attention?
I'm pretty sure they were making a joke that both are
[flagged]
Look at what Trump does. After the fall of the Syrian government, he continues the long-standing policy and proposes MIGA-style regime change in Iran. Since this is a decades-old goal of US foreign policy, I assume he'll go through with it. They are already promoting Reza Pahlavi on Fox News.
Unless you have proof that he made a secret deal with Putin (you get Ukraine, we get Syria and Iran), how is this pro Russian?
Also Government: uses Israel-backdoored custom Signal
Yeah but Israel is Israel, so there's no actual problem there. Now, if it was Iran...
Tell that to Jonathan Pollard.
What source do you have for that?
They used it in view of press cameras, many articles about this but here’s the first one from Google for me: https://www.404media.co/mike-waltz-accidentally-reveals-obsc...
Jeffrey Goldberg.
The Government is made up of a huge number of organizations with competing goals, budgets, capabilities, and interests.
[dead]
Grammar is really needed here cos:
Zuck put a backdoor
And
Zuck, put a backdoor
…are about as different as they could be
Explains why Zuck has been training Brazilian jiu-jitsu.
>Government: Zuck put a backdoor in WhatsApp or we will put you in a blacksite UFC ring and beat you up.
Source?
>Also Government: WhatsApp has a backdoor. Don't use it.
If "zuck" is really in the pocket of the US government, why should they worry about their own backdoors?
> If "zuck" is really in the pocket of the US government, why should they worry about their own backdoors?
Have you ever watched a Saturday morning cartoon? Minions betray their masters all the time. An effective evil overlord doesn’t underestimate their lackey’s capacity for duplicity and betrayal at a pivotal moment.
The most fun may even appreciate the gall: https://memory-alpha.fandom.com/wiki/The_Nagus_(episode)#:~:...
I have a movie for you: "Broken City" (2013). Great cast and constantly unexpected turns of events.
Once it's backdoored you don't know who's watching it.
It's the most hilarious thing about backdoors or collecting extensive covert intel on your own population, that any failure of opsec makes it much easier for all your adversaries to also spy on them in ways they would never otherwise be able to, then compromise them, and flip them.
Why would there be a source for a backdoor of a closed source application?
Usually when you make important claims it's expected you back them up with some sort of evidence.
Sources to back up the claim, not source code of the application.
> Source?
https://www.facebook.com/security/advisories/cve-2019-3568
Software frequently has bugs and sometimes they have security implications. In order to claim that a specific bug is a backdoor you need to have evidence beyond the existence of a bug.
House (legislative branch) staffers presumably don't want executive branch snoops reading their group chats. Doubly so for Democratic staffers not wanting specifically the Trump executive branch reading them.
WhatsApp on TV: “Trust us! It’s encrypted :) :) :)”
And on social media. Maybe I'm being too literal and pedantic, but it bugs me that they say "nobody" can read your messages. What's the point of using it if even the recipient can't read them (or the sender for that matter!).
I'm sorry, it's just flatly wrong to suggest Microsoft Teams is safer than WhatsApp and everyone here bandwagoning on this ridiculous decision should feel bad.
Perhaps you're unaware that there is a special, DoD-certified version of Teams called "Gov Teams", which can be used to share data at multiple impact levels securely. This version of Teams, and the entire Office365 suite, has undergone extensive security certification for use with high IL data.
[flagged]
Well it _has_, but sure you can bring the value of the certifications into question
> but sure you can bring the value of the certifications into question
Yes I do
Valuable to whom?
They're almost certainly not using the same version as the general public. Most major service providers have a specific version for government with additional controls and restrictions and have undergone certification through FedRAMP, including Microsoft:
https://www.microsoft.com/en-us/microsoft-365/government
Some other examples:
- AWS GovCloud https://aws.amazon.com/govcloud-us/
- Google Workspace for Government https://workspace.google.com/industries/government/
- GovSlack https://slack.com/solutions/govslack
- Atlassian Government Cloud https://www.atlassian.com/government
> it's just flatly wrong
The unwarranted confidence is stunning in a post that is so fundamentally incorrect. I don't like Teams, but your take is deeply unaligned with reality.
Teams absolutely has more compliance controls than WhatsApp. Encryption, compliance, data governance, security, etc are all related but very different things.
It doesn't mean that MS Teams is safer, it means that the government has tighter control on MS Teams.
Or maybe that Microsoft pays more than Meta.
MS products allow you to store data locally without any egress, so an IT team has access to it.
This is the sticking point, because WhatsApp has now integrated Meta AI into the app but (obviously) does not provide an on-prem data store. This is why Deepseek AI (the Deepseek app) and ChatGPT (the OpenAI app) are barred as well.
Data Stewardship and Zero Trust have been internal initiatives in the House for a couple of years now.
The fact that almost no one on this thread knows these (imo overused) terms and design patterns highlights one of the various major gaps in Software Dev I've been observing for several years now - especially in the North American market (given the hours at which this was posted). The inability to incorporate or understand some basic security architectures is a major gap.
Edit: Keep pushing the downvotes. The truth hurts, and it plays a role in jobs leaving and in funds like my employer backing cybersecurity startups in Israel, India, and Eastern Europe, because the ecosystem doesn't exist in the US anymore. A similar trend happened in data-layer work.
We don't need more SKLearn plumbers calling themselves "ML Engineers" or Angular monkeys calling themselves "Fullstack Engineers" - we need people who truly understand fundamentals (or - shudders - first principles), be they mathematical (optimization), systems (virtualization), or algorithms (efficient data structures)
Isn't deepseek 100% open source?
The model weights themselves are, but there's also the hosted SaaS.
Deepseek the model sure. Not Deepseek AI - the app [0] published by Hangzhou DeepSeek (the company that developed DeepSeek)
[0] - https://apps.apple.com/us/app/deepseek-ai-assistant/id673759...
> The fact that almost no one on this [thread] knows these
It's not that they aren't known; rather, we just came off a long trend of thin clients and cloud storage. Some companies merely stay in that ethereal space, while others had concerns about their data. Criticizing people for doing what experts were pushing for the past 20 years doesn't need to devolve into calling their expertise into question.
The downvotes are for that, not because "you're wrong".
I don't think I understand what you're saying here.
Teams doesn't require access to my entire contacts book on my phone to run smoothly. I can choose the individuals whose contact details I want to give it.
I ban Whatsapp but require Teams on company devices.
Can you explain why the thinking is wrong?
This is very reasonable if you have compliance needs or similar. That’s not what this office is saying - it’s saying teams is more secure. This is wrong. The nature of banning private messaging apps is trading security for legibility. If this office is interested in that (which it’s not - it allows Signal), they should say so.
Your Teams is not the government's Teams.
Microsoft maintains specific secure government versions of Teams that use their own special secure data centers. It's a full parallel extra secure set of infrastructure.
I do have a compliance need, similar to this office i imagine.
Teams is more secure in my opinion.
I as an admin can control who you can/can't talk to, what you can share with them, and when you can share it. Correctly configured, MS Teams is a pretty secure setup.
On the flip side, I'm not sure I can make someone else's WhatsApp not auto-download anything sent to it... The two apps aren't really comparable unless I've missed an entire 'WhatsApp for government/enterprise' business arm.
Not wrong.
MS Teams allows for offline/local storage of its video conferencing and chat data.
How is WhatsApp safer to use than Microsoft Teams?
WhatsApp is always end-to-end encrypted, Teams only in certain cases.
If you think end-to-end encryption is the only thing that matters in security, then yeah sure, WhatsApp is more secure.
Personally, I'd be embarrassed to let people know I thought that way, but to each their own.
So you would potentially prefer an app without end-to-end encryption to WhatsApp? What are these important security features?
E2EE is mostly useful for consumer applications, where you trust the endpoint (yourself), but not the intermediary servers (some megacorp that doesn't care about you).
The situation is entirely different when you are managing very large organizations.
In those situations, you don't necessarily need the data to be invisible to the intermediary servers, because you might be able to control them yourself, secure them with NDAs, etc. And if the server is controlled by you, then you might not even want the data to be invisible to yourself. But your primary risks may be the compromise of endpoint devices, mistakes or leaks by your users, or a lack of controls over data exchange. Also, many organizations may need to provide records of their internal communications in order to comply with legal requirements.
You might be surprised to know that enterprise offerings of many apps that otherwise support E2EE often have a way for administrators to intentionally turn those features off.
Lack of complete e2ee is a feature for many large organizations—they still want everything encrypted, they just want a master key to be able to audit communications for compliance/investigations/insider threat identification. They also want strict control over who does what with the app, and where all of the associated data lives. Teams is just a totally different product from WhatsApp in that regard, with all sorts of functionality that will never exist in WhatsApp—tons of control over user identity and access management, integration with all sorts of other security tooling, etc.
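To illustrate the "encrypted, but auditable" idea (a toy Python sketch only, not how Teams actually implements it), one common pattern is to wrap each per-message key for both the recipient and an org-held escrow key that compliance can use later:

    # Toy sketch of "encrypted, but auditable": each message gets a fresh data key,
    # wrapped for the recipient AND for an org-held escrow (audit) key.
    # Illustrative only; not any vendor's actual design.
    from cryptography.fernet import Fernet

    recipient_key = Fernet.generate_key()  # held by the recipient
    escrow_key = Fernet.generate_key()     # held by the org's compliance function

    def send(plaintext: bytes):
        data_key = Fernet.generate_key()                  # per-message key
        ciphertext = Fernet(data_key).encrypt(plaintext)
        wrapped_keys = {
            "for_recipient": Fernet(recipient_key).encrypt(data_key),
            "for_audit": Fernet(escrow_key).encrypt(data_key),
        }
        return ciphertext, wrapped_keys

    def audit_read(ciphertext: bytes, wrapped_keys: dict) -> bytes:
        data_key = Fernet(escrow_key).decrypt(wrapped_keys["for_audit"])
        return Fernet(data_key).decrypt(ciphertext)

    ct, keys = send(b"quarterly numbers attached")
    assert audit_read(ct, keys) == b"quarterly numbers attached"

Everything stays encrypted in transit and at rest, but the organization retains a path to read it for audits, investigations, or legal holds.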
The threat model of an organisation is almost the opposite of you as an individual.
For you, you trust yourself the most, followed by your device, and the intermediate servers are a threat. For an organisation, the servers are the most trusted entity, followed by the org-provided device, and a certain percentage of users are an active threat.
Message retention, audit logging, SSO to name a few off the top of my head.
> WhatsApp is always end-to-end encrypted, Teams only in certain cases
Which is an anti-feature given this application: you want a certain level of oversight and control over what staffers communicate.
Their statement doesn't sound like what you said at all:
> The Office of Cybersecurity has deemed WhatsApp a high-risk to users due to the lack of transparency in how it protects user data, absence of stored data encryption, and potential security risks involved with its use
(Of course that statement seems to be highly confused overall. What "stored data encryption"?)
Does WhatsApp encrypt the data on the device after it’s received and decrypted at your phone’s end (then stored indefinitely)? I thought the term of art was “encrypted at rest,” but “stored data encryption” makes sense to me too.
I was of the impression that Whatsapp’s messages (and its backups, photos, etc) kind of just hung around in plaintext once they reached the device.
Which would seem to be a problem should the device be stolen, or observed by other applications on the phone or a tethered device, or twiddled with sneaky hardware (e.g. [0]) that might use physical means to access the device’s file system.
Although as I understand it, the privacy claims are kind of window dressing anyway, and Meta has been more than willing to share plenty of WhatsApp’s data with all and sundry… even before AI-in-the-same-search-bar came along [1]
[0] https://shop.hak5.org/products/omg-cable
[1] https://www.propublica.org/article/how-facebook-undermines-p...
> "Messages on WhatsApp are end-to-end encrypted by default, meaning only the recipients and not even WhatsApp can see them."
The handling and metadata around encrypted messages are nearly as exploitable as the actual message contents. End-to-end encryption is necessary but not sufficient. The infrastructure has to be designed to minimize the risk of other forms of exploitative analysis as well, but in the case of WhatsApp that is essentially their business model.
If the network controls the endpoints, then E2EE is meaningless.
What implementation of end to end encryption doesn't involve this?
OTR for IRC/XMPP, PGP for email, and Olm/Megolm, provided by Element, for Matrix operators.
Essentially the software creating the keys is not controlled by the same entity controlling the transmission method.
In email/Matrix you have an additional protection in that you can host your own server; the best protection is one where traffic never has the possibility of being diverted, and even if it were, it would be encrypted so that the server doesn't leak anyway. Security is like an onion, after all.
If you think WhatsApp leaves a lot of metadata on the table for analysis, try doing a Matrix chat. You get a plaintext view of which device used which key to send which message ID to which room/person. If the message is a reply, you get the message ID your new message is a reply to in plaintext as well.
Without even looking at things like HTTP headers, this is what the metadata an E2EE-encrypted message (with verified+cross-signed keys) looks like, with specific identifiers censored just in case:
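Roughly, per the Matrix client-server spec, the envelope of an m.room.encrypted event has the following shape; every identifier below is a made-up placeholder rather than a real one:

    # Approximate shape of a Matrix "m.room.encrypted" event as the homeserver
    # sees it (per the Matrix client-server spec); all values are placeholders.
    # Only content["ciphertext"] is opaque to the server.
    encrypted_event = {
        "type": "m.room.encrypted",
        "sender": "@alice:example.org",        # who sent it
        "room_id": "!someroom:example.org",    # which room / which conversation
        "event_id": "$this_message_id",        # this message's ID
        "origin_server_ts": 1750000000000,     # when it was sent (ms since epoch)
        "content": {
            "algorithm": "m.megolm.v1.aes-sha2",
            "device_id": "ABCDEFGHIJ",         # which device sent it
            "sender_key": "<curve25519 key>",  # which key was used
            "session_id": "<megolm session>",
            "ciphertext": "AwgAEn...",         # the only part the server can't read
            "m.relates_to": {                  # reply metadata stays in the clear
                "m.in_reply_to": {"event_id": "$replied_to_message_id"}
            },
        },
    }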
Unlike on platforms like Whatsapp, these message envelopes are available to anyone with access to either a session token or the user's password. The E2EE keys require a bit of extra verification, but you don't need those to build a pretty solid who-talks-to-who-when network even in encrypted chatrooms.
I understand why they implemented some of the metadata this way, but the encryption-stapled-to-unencrypted-messaging approach just leaves a lot to be desired. Signal, on the other hand, leaks pretty much nothing.
When I was at an unnamed major financial institution, we were ordered to stop using WhatsApp, but it had nothing to do with security and everything to do with avoiding even the possibility of the appearance of backroom dealing or production avoidance in the event of subpoena. Maybe the truth has more to do with that, or maybe not; what do I know, who are all you people anyway, and why am I posting here?
WhatsApp also feels... tonally weird to use at a serious company, like in the same way it would feel weird to be using snapchat for team meetings.
WhatsApp is already the de facto communication channel in a lot of countries.
In Brazil even subpoenas can be sent via WhatsApp.
Heh. I have a friend here in the US. His father passed away in his home country. No will. The whole family needed to show up in court for probate, but he could not travel at that time.
The court: "No problem, just join the session on video using WhatsApp"
Really?
Remote court sessions are usually on Google Meet or Zoom
It sounds like the court they are referring to is in the "home country". The friend whose father passed is in the US but the "home country" is where the father passed.
Totally agree. Now let me go play with this model I got off of Hugging Face
i feel the same way about so many government departments switching to X as a primary public communications platform instead of... you know, the open web (with distribution to downstream closed platforms), as they always have. it just reeks of unseriousness.
> nothing to do with security and everything to do with avoiding even the possibility of the appearance of backroom dealing or production avoidance in the event of subpoena
But that is a concern of information security.
Compliance is often part of this calculus, and many on this forum get wrapped around the axle thinking it's always about cryptography or something. Encryption is only a small part of the broader practice of information security.
Makes sense; there are lots of requirements for communication retention in financial institutions. If I recall correctly, the phone lines on trading desks are permanently recorded for regulators, so if anything does happen they have all the info... it's why socializing in person is such a big part of being a trader.
i heard (anecdotally) that wall street used to run on Yahoo IM - fascinating. do you know if that extended into your previous employer?
I mean, regardless of any argument about Whatsapp, shouldn't installing any app on a government phone that's not allowed be impossible? Sheesh. This shouldn't even be a discussion in the first place.
Are they allowed to have X installed on them though? ;)
Man, politics and finance are a trainwreck enabled by apathetic voters who think democracy is about picking a sports team.
This is due to the addition of Meta AI in WhatsApp [0].
Unsurprisingly, data egress to third parties is a major security vector - especially for mission critical jobs like working in the House. MS apps incorporating Copilot have faced similar blocks as well.
This requirement for data stewardship is called out in HITPOL8 as well [1][2] (the AI tool standards set by the House CAO).
[0] - https://faq.whatsapp.com/203220822537614/?cms_platform=iphon...
[1] - https://cha.house.gov/_cache/files/4/2/42dca19e-194b-481e-b1...
[2] - https://cha.house.gov/_cache/files/0/8/08476380-95c3-4989-ad...
Signal would be the obvious choice here - open source, no AI integration, minimal metadata collection, and recommended by security professionals for sensitive communications.
Signal lacks other compliance features, e.g. message archiving.
It might be good if you're a journalist, but it's not as good if you have compliance requirements beyond confidentiality.
Source for reason?
The article as well as HITPOL8 [0][1]. WhatsApp has been blocked for the same reason Deepseek AI (the Deepseek app) is blocked - "Stewardship of Legislative Branch Data".
[0] - https://cha.house.gov/_cache/files/4/2/42dca19e-194b-481e-b1...
[1] - https://cha.house.gov/_cache/files/0/8/08476380-95c3-4989-ad...
People seem to be missing the point here.
I think it is fair to assume that the US intelligence apparatus has inside knowledge on how compromised or otherwise different platforms are. They are the experts in compromising apps, so I'm going to take their word for it.
We learned from Snowden how this is achieved, have people forgotten all of that already?
So, to recap how I assume this is done: a combination of "legal" American routes to gain access to data and embedding agents in the actual organizations to do your technical bidding.
This is speculation, but if I were compromising WhatsApp I'd leave a bug in there that allowed me to compromise accounts on demand. Something like being able to reduce the randomness of the RNG for a particular account. Then I could decrypt the messages easily (because I already know the range of RNG seeds that work) and it would look to everyone like it was encrypted.
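For intuition, here is a toy Python sketch (nothing to do with WhatsApp's actual key generation) of why a weakened RNG is fatal: if keys are derived from a seed drawn from a deliberately small space, an eavesdropper who knows that space just enumerates it:

    # Toy illustration only: session keys derived from a low-entropy seed can be
    # brute-forced by anyone who knows the (deliberately tiny) seed space.
    import base64
    import hashlib
    from cryptography.fernet import Fernet, InvalidToken

    SEED_SPACE = range(2**16)  # the weakened "randomness"

    def key_from_seed(seed: int) -> bytes:
        # Fernet expects a 32-byte key, urlsafe-base64 encoded.
        return base64.urlsafe_b64encode(hashlib.sha256(str(seed).encode()).digest())

    # The victim's weakened client "randomly" picks a seed and encrypts a message.
    victim_seed = 31337
    token = Fernet(key_from_seed(victim_seed)).encrypt(b"meet at 0600")

    # The attacker who planted the weakness just tries every possible seed.
    for seed in SEED_SPACE:
        try:
            print(Fernet(key_from_seed(seed)).decrypt(token))  # b'meet at 0600'
            break
        except InvalidToken:
            continue

To outside observers the traffic still looks properly encrypted; only the party who knows the seed space gets the cheap decryption path.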
So, who is the chief culprit for doing this? If I were a guessing man (and I am), I would probably say Israel has compromised WhatsApp, the US gov knows it, and would like Israel not to know everything that White House staffers are saying.
> "We know members and their staffs regularly use WhatsApp and we look forward to ensuring members of the House can join their Senate counterparts in doing so officially," Stone said.
Go on...
This seems sensible.
>Andy Stone, a spokesperson for WhatsApp parent company Meta, said in a statement to Axios, "We disagree with the House Chief Administrative Officer's characterization in the strongest possible terms."
(..)
"Messages on WhatsApp are end-to-end encrypted by default, meaning only the recipients and not even WhatsApp can see them. This is a higher level of security than most of the apps on the CAO's approved list that do not offer that protection."
Maybe they should use Meshtastic
Serious question: who else takes for granted that Zuck gets a daily summary of all high-level federal governmental communications, as harvested via backdoors or simply from non-end-to-end encrypted traffic on any Meta property?
I assume he does. I assume moreover that most people aware of this at Meta consider this due diligence in defending shareholder value. What's that line from Dune 2, a wise hunter climbs the tallest hill? _You need to see._
Official press release, https://www.army.mil/article/286317/army_launches_detachment...
The U.S. Army is establishing Detachment 201: The Army’s Executive Innovation Corps, a new initiative designed to fuse cutting-edge tech expertise with military innovation. On June 13, 2025, the Army will officially swear in four tech leaders.
Det. 201 is an effort to recruit senior tech executives to serve part-time in the Army Reserve as senior advisors. In this role they will work on targeted projects to help guide rapid and scalable tech solutions to complex problems. By bringing private-sector know-how into uniform, Det. 201 is supercharging efforts like the Army Transformation Initiative, which aims to make the force leaner, smarter, and more lethal.
The four new Army Reserve Lt. Cols. are
Shyam Sankar, Chief Technology Officer for Palantir;
Andrew Bosworth, Chief Technology Officer of Meta;
Kevin Weil, Chief Product Officer of OpenAI; and
Bob McGrew, advisor at Thinking Machines Lab and former Chief Research Officer for OpenAI.
So yes, Meta's CTO is now a high-ranking Army officer.
What would Meta get out of spying on their own government? That's a "life in secret jail" kind of risk for a sickeningly rich CEO with a private island. We haven't even found any evidence of backdoors used against foreign governments; they'd be pretty stupid to attack the American government.
Plus, when it comes to important communications, the weird, hacked, Israeli Signal fork already has access to these documents anyway, even when they don't accidentally add a journalist to the group chat.
If we're talking summaries of government communications, that's more Microsoft territory, who don't even bother adding proprietary E2EE implementations to their chat software.
Good. Another point to be made when my friends push me to install bloated spyware just to plan a pizza party.
Use Signal.
> Use Signal.
And preferably not a hacked version of Signal that sends your messages in plain text to another country and its spy agencies.
See https://archive.ph/oXYXe for more info about TeleMessage version of Signal approved for use by government offices.
Are "paid for" and "properly approved for classified information" being conflated here? I may have missed something.
Also don't willfully send that info to your wife, lawyer ... friends ... for fun.
> Use signal.
... but not for planning strikes into other countries.
He did say last year he was going to make this the most open and transparent administration in US history. What other administration would grant a hostile journalist an inside look at the planning and execution of an airstrike? Promises made, promises kept.
"hostile"
The article itself commented on how ironic it is that, of all the journalists they could have invited to the chat, it was one who has been highly critical of the president and not some sycophant who might have kept it a secret or turned it into a puff piece, like what I just did, except without the sarcasm.
That wasn't signal's fault. They accidentally invited a journalist to the chat.
While it is correct that this was a PEBKAC error rather than Signal's error, I would like to suggest that, in general, all mobile phone apps are poor choices for anything as sensitive as planning a missile strike.
I think one could design a procedure involving a mobile phone and Signal that would be reasonably secure for that kind of use case. The number one point on that procedure would be that the phone in question isn't used for anything other than secure communication.
Of course, the US government already has approved procedures and devices for secure communication, so senior officials making up their own is reckless and unprofessional.
I wouldn't disagree with you, here.
I agree in principle, but this was (probably) a result of somebody fat-fingering the wrong contact, and I do think there's some culpability on either the app or the phone for making that possible to do by mistake. Touch screens are an inherently clumsy interface, and Android in general has a lot of problems with UI elements suddenly moving around without warning as you're clicking on things. And then there's autocorrect, the UI hanging for several seconds at a time only to suddenly wake up and replay everything you tried to do while it was unresponsive, phantom button presses caused by the device getting too warm, etc.
None of this is meant to excuse these officials for not authenticating everybody in that group or for using highly informal text messages to plan an airstrike of all things.
Ultimately there's no excuse for leaking information when you're at that level of government; I just feel like the app industry needs to take responsibility and fix several obvious, well-known and common UI issues.
>but this was (probably) a result of somebody fat-fingering the wrong contact...
Supposedly, it was the result of a helpful Apple feature getting the wrong phone number for one of the intended group participants. Then Signal cheerfully used that wrong phone number to add the reporter to the group.
* https://www.theguardian.com/us-news/2025/apr/06/signal-group...
I don’t think there’s any culpability or responsibility for the app, it doesn’t really bill itself as a good platform to do the high-level planning of military strikes.
If there are UI issues, they should be fixed because they are also annoying when planning somebody a surprise birthday party. (Or all the other stuff an encrypted chat app might be good for).
On the other hand, PGP just calling itself “pretty good” was pretty funny. Maybe that’s the level of active humbleness that everybody should aim for.
I thought the latest on this was that the journalist's number was in an internal email from spokesman Brian Hughes, and software or human error led to his phone number being associated with Hughes in Waltz's phone contacts.
Yeah, but Signal really didn't help them at all with that. As with most of these phone oriented encrypted messengers, Signal is pretty sloppy with identity management. It would be hard to find a better example of this than SignalGate 1.0.
* https://articles.59.ca/doku.php?id=em:sg End to End Encrypted Messaging in the News: An Editorial Usability Case Study (my article)
It wasn't Signal's identity management that proved to be a problem: https://www.theguardian.com/us-news/2025/apr/06/signal-group...
When it comes to practical cryptography, nobody is doing signing parties anyway. It's all TOFU unless someone forces people's hands, and when you force people to do security you can assume they won't bother checking if the QR code they're scanning is coming from a real app or a livestream of someone else's app, they just want to get the scanning done. The whole key scan thing is probably only of any use to people keeping contact after meeting with journalists.
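For what it's worth, the idea behind that key-scan/safety-number step is just comparing a short code derived from both parties' public identity keys over a separate channel; a toy Python sketch of the concept (not Signal's actual algorithm) looks like this:

    # Toy sketch of out-of-band key verification: both parties derive the same
    # short code from the pair of public identity keys and compare it in person
    # or over a call. Not Signal's real safety-number construction.
    import hashlib

    def safety_number(key_a: bytes, key_b: bytes) -> str:
        # Sort so both sides compute the same value regardless of argument order.
        digest = hashlib.sha256(b"".join(sorted([key_a, key_b]))).hexdigest()
        # Render as short groups, similar in spirit to what messengers display.
        return " ".join(digest[i:i + 5] for i in range(0, 30, 5))

    alice_pub = b"alice-identity-key-placeholder"  # hypothetical keys
    bob_pub = b"bob-identity-key-placeholder"
    print(safety_number(alice_pub, bob_pub))  # both users should see the same code

If the codes match, you know you're talking to the holder of that key and not a man in the middle; the catch, as noted above, is that almost nobody bothers.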
If you blame the incorrect phone number in the Apple address book then sure, but that implies you think a smartphone address book should be responsible for identity management in an end-to-end encrypted messenger. Oh, and the telephone number to identity mapping is the responsibility of:
* Signal
* Twilio
* The phone company
That's all OK as far as it goes, but the root problem here is that a typical Signal user is made aware of none of this. Sure it's legit to take convenience over security, but it is not OK to leave this tradeoff completely unknown to the people affected.
The federal government uses a third-party Signal client that saves their conversations in clear text to a database, which has been breached before. Clearly user error, not Signal's fault.
Well, if you just cannot be bothered to drive to the SCIF, and if you are best buds with the man in charge, do whatever least impacts your workout schedule.
What, you don't bring your SCIF wherever you go?
https://www.theemcshop.com/benchtop-faraday-tents/select-fab...
Or just call, email or txt.
Signal is only as secure as the device it runs on. Cell phones are not secure. They are black boxes that probably track you and may have built-in backdoors (only to be used to catch 'real' criminals), etc.
The idea that you can turn a device like that into some form of secure communication platform by installing an app is not realistic.
Yeah, but the location of your next pizza party probably isn’t a state secret either.
It is if the party's in the Situation Room at 3am.
https://www.fastcompany.com/91352935/pentagon-pizza-index-th...