Published last Thursday to the website of West Virginia’s Office of the Attorney General (CNet’s coverage):
West Virginia Attorney General JB McCuskey today filed a lawsuit against Apple Inc., alleging the company knowingly allowed its iCloud platform to be used as a vehicle for distributing and storing child sexual abuse material (CSAM) — and for years chose to do nothing about it.
I understand that the purpose of a press release like this is for the Attorney General to position the case as favorably as possible, but “allowed” and “do nothing about it” are awfully strong. Yet that’s not nearly as assertive as this:
The lawsuit reveals that Apple, in its own internal communications, described itself as the “greatest platform for distributing child porn” — yet took no meaningful action to stop it. Rather than implement industry-standard detection tools used by its peers, Apple repeatedly shirked their responsibility to protect children under the guise of user privacy.
The press release doesn’t cite its source for the quote, but the filing does, linking to a 2021 article in The Verge that highlighted “All the best emails from the Apple vs. Epic trial.” From entry 71, “Apple’s head of fraud suggests Apple may be unwittingly providing ‘the greatest platform for distributing child porn.’”:
Halfway through an iMessage conversation about whether Apple might be putting too much emphasis on privacy and not enough on trust and safety, [Eric] Friedman comments that “we are the greatest platform for distributing child porn,” adding that “we have chosen to not know in enough places where we really cannot say” and referencing a New York Times article where, he suspects, Apple is “underreporting” the size of the issue.
It’s a pretty damning quote, and, as you might suspect, it’s taken somewhat out of context and presented as though Apple, as a company, was content with this assessment. That framing is more salacious and headline-grabbing than the reality: a Trust and Safety executive expressing frustration over his efforts to resolve an inherent tension in Apple’s staunch privacy stance. As Friedman says in that iMessage chat, just ahead of that quote:
We’re committed to doing the work so that we can maximize all three: features AND safety AND privacy. But it requires real commitment to get there.
In context, Friedman isn’t confessing; he’s lamenting the obvious privacy tradeoffs. I’m confident in saying that Apple was (and is) aware that its strong stance on privacy—which benefits most people—also has the unfortunate side effect of “protecting” a small group of bad actors who leverage that privacy to perform nefarious acts. The alternative is to reduce everyone’s privacy, which is something Apple has (rightfully) been loath to do. And make no mistake, that is very much what’s at stake here.
Back to A.G. McCuskey’s press release:
Federal law requires all technology companies based in the U.S. to report detected CSAM to the National Center for Missing and Exploited Children (NCMEC). In 2023, Apple made just 267 such reports. By contrast, Google filed 1.47 million reports and Meta filed more than 30.6 million.
The key word here is “detected.” Apple made 267 reports because Apple doesn’t have access to your files; Google and Meta have no such restrictions and thus can detect and report files they—or anyone—have deemed inappropriate.
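To make the weight of “detected” concrete, here’s a minimal sketch of the server-side hash matching that produces numbers like Google’s and Meta’s. Everything in it is illustrative, not anyone’s actual pipeline: real deployments match perceptual hashes such as Microsoft’s PhotoDNA against lists coordinated with NCMEC, not the SHA-256 stand-in used here, and the names are hypothetical.

```python
import hashlib

# Hypothetical list of hashes of known CSAM, of the kind industry
# programs distribute to providers. Real systems use perceptual hashes
# (e.g., PhotoDNA) that survive re-encoding and resizing; SHA-256 is
# used here only to keep the sketch self-contained and runnable.
KNOWN_HASHES: set[str] = {
    "0" * 64,  # placeholder entry, not a real hash
}

def scan_upload(file_bytes: bytes) -> bool:
    """Return True if an uploaded file matches the known-hash list.

    The precondition is the whole point: this check can only run if
    the provider can read the file's plaintext on its servers. Content
    the provider cannot decrypt never reaches this code path.
    """
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_HASHES

if __name__ == "__main__":
    print(scan_upload(b"example upload"))  # False: no match in the list
```

The hash function isn’t the interesting part; the precondition is. Google and Meta hold their users’ plaintext and can run a check like this on every upload. For content Apple can’t read, there’s nothing to feed into the scan, and therefore nothing to “detect” or report.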
The lawsuit alleges that Apple (knowingly) harbors large amounts of unreported CSAM, based on the assumption that because its peers host it, Apple must too (and on a passing comment by one executive). Apple is being targeted because it doesn’t report enough CSAM, not because it has been shown to host it. The lawsuit seeks to weaken privacy under the guise of protecting children: the exact opposite of Apple’s stance.
Also: Holy smokes, what is going on over at Meta? 30.6 million reports of CSAM? What percentage of the total CSAM on Meta’s platforms does that represent? I truly hope it’s substantially all. Yet even that many reports—and all the information Meta has on its users with which to make them—are evidently insufficient to stem the tide of CSAM coursing through its platforms. Meta knows it’s there. It reports it. And yet it continues to proliferate. Meta reports the material, but seemingly does nothing to actually stop it.
Which may be enough to satisfy the West Virginia Attorney General.