Meredith Whittaker, president of the Signal Foundation, said something precise at Davos this January. Not a prediction, not a warning about hypotheticals. A description of what is already happening.
Speaking to Bloomberg on January 20, she described what AI agents embedded at the operating system level actually do: "Agents basically means it has access to data and can act independently based on that data. They need access to your calendar, access to your credit card, access to your Signal, bringing that all back into a shared context window where you have kind of a data slurry."
She called the architectural choices behind this "reckless." She said the OS-level integration is being done "in ways that are very reckless and insensitive to cybersecurity and privacy basics." She used a vivid phrase for it: breaking the blood-brain barrier between applications and the operating system.
She's right. But I think "reckless" gives the architects more credit than they deserve.
The Slurry Is Not a Side Effect
When Microsoft builds Copilot into Windows, when Apple builds Intelligence into iOS, when Google folds Gemini into Android, they are not primarily building productivity tools. They are building data aggregation infrastructure with a productivity UI on top.
The "data slurry" Whittaker describes is not an unfortunate byproduct of agentic AI. It is the architecture's purpose.
Training better models requires more data. Not just more volume, but more granular data: your calendar context, your spending patterns, the content of your messages before they get encrypted. An AI agent with OS-level permissions and access to every app is, from the infrastructure owner's perspective, a maximally efficient data collection mechanism.
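The collapse Whittaker describes can be made concrete. Below is a minimal sketch, with entirely invented names (`App`, `OSAgent`, `build_context_window`), of the structural point: once one OS-level agent holds read grants to every app, the sandbox boundaries between apps stop mattering, because everything flows into one shared context.

```python
# Illustrative only; all class and method names are hypothetical.
# The point: per-app data silos collapse once a single OS-level
# agent holds read access to each one.
from dataclasses import dataclass, field


@dataclass
class App:
    """A sandboxed app: other apps cannot read its records."""
    name: str
    records: list[str] = field(default_factory=list)


@dataclass
class OSAgent:
    """An OS-level agent granted read access to every app."""
    grants: list[App] = field(default_factory=list)

    def build_context_window(self) -> list[str]:
        # The "shared context window": every app's data, merged.
        slurry = []
        for app in self.grants:
            slurry.extend(f"[{app.name}] {r}" for r in app.records)
        return slurry


calendar = App("calendar", ["Dr. appointment, Tue 09:00"])
wallet = App("wallet", ["$84.20 at pharmacy"])
signal = App("signal", ["'meet you after the appointment'"])

agent = OSAgent(grants=[calendar, wallet, signal])
context = agent.build_context_window()
# Three separately sandboxed facts now sit in one queryable blob.
```

No single app in this sketch knows anything sensitive on its own; the sensitivity emerges from the join, which is exactly what the shared context window performs.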
Calling this reckless implies it could have been designed differently. The real question is whether a different design would serve the same business model.
You Can Break Up a Company. You Cannot Break Up a Baked-In Architecture.
In 1911, after a federal suit filed in 1906, the U.S. Supreme Court broke Standard Oil into 34 independent companies. The logic was straightforward: Standard Oil controlled the pipelines, so it controlled the market. Dismantle the company, dismantle the monopoly.
Software architecture doesn't work that way.
If Apple Intelligence has been installed on a billion iPhones by the time any regulatory action concludes, breaking up Apple doesn't change what those devices do. The architecture is already in place. The pipes don't get reassigned — they become the floor. The EU's antitrust scrutiny of Big Tech's AI operations, announced in March 2026, is exactly the kind of ex-post enforcement that arrives after the terrain has already been reshaped.
This is the critical asymmetry. Antitrust operates on a timeline of years. Architecture operates in real time, at scale, silently. By the time regulators formulate the right questions, the answer is already installed on billions of devices.
The Question That Isn't Being Asked
Signal built the most rigorous end-to-end encrypted messaging system available. The mathematics are sound. The implementation has been audited extensively. And it is being rendered practically irrelevant — not by breaking the encryption, but by moving the attack surface below the application layer.
You don't need to crack Signal if you control the OS. The plaintext exists before it ever reaches the app.
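That claim can be sketched in a few lines. This is a conceptual illustration, not Signal's actual stack: the function names are invented, and the "cipher" is a toy stand-in. What matters is the ordering: typed text transits an OS input layer before the app ever encrypts it, so an agent living at that layer sees plaintext no matter how strong the app's cryptography is.

```python
# Conceptual sketch; all names are hypothetical and the XOR "cipher"
# is a toy stand-in for real encryption. The point is where in the
# stack the plaintext is visible, not the cryptography itself.

captured_by_os: list[str] = []  # what an OS-level agent could log


def os_text_input(text: str) -> str:
    """Stand-in for an OS input/accessibility layer. Every keystroke
    passes through here before any app sees it."""
    captured_by_os.append(text)  # interception below the app layer
    return text


def app_send_encrypted(text: str) -> bytes:
    """Stand-in for a Signal-style app: encrypts before transmission."""
    plaintext = os_text_input(text)  # the app receives input via the OS
    return bytes(b ^ 0x5A for b in plaintext.encode())  # toy cipher


wire = app_send_encrypted("meet at the usual place")
# The ciphertext on the wire is unreadable; the OS layer's copy is not.
```

The app's encryption is never broken in this sketch. It simply runs one step too late.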
"If you give a system like that root access permissions, it can be hijacked," Whittaker said. She's describing not a theoretical risk but an architectural certainty.
This is a form of infrastructure capture without precise historical precedent. Not control of oil, or railroads, or broadcast spectrum. Control of the computational layer that sits beneath every application, every communication, every transaction, on every device most people on earth carry in their pocket.
Whittaker is right to raise the alarm. She's also, I think, being too generous in calling it reckless. The people making these architectural decisions are not insensitive to the security basics. They understand exactly what they're building. The question is whether anyone with regulatory power understands it fast enough to matter.