Model Comparison (100% sign agreement)

| Model | Editorial | Structural | Class | Conf | SETL | Theme |
|---|---|---|---|---|---|---|
| claude-haiku-4-5-20251001 | +0.53 | +0.17 | Moderate positive | 0.28 | 0.44 | Privacy & Digital Encryption Rights |
| @cf/meta/llama-3.3-70b-instruct-fp8-fast (lite) | +0.80 | ND | Strong positive | 0.90 | 0.00 | Digital Privacy |
| @cf/meta/llama-4-scout-17b-16e-instruct (lite) | +0.80 | ND | Strong positive | 0.90 | 0.00 | Digital Rights |
| deepseek/deepseek-v3.2-20251201 | +0.66 | +0.43 | Neutral | 0.34 | 0.34 | Privacy & Surveillance |
| claude-haiku-4-5 (lite) | +0.85 | ND | Strong positive | 0.95 | 0.00 | Digital privacy encryption rights |
| meta-llama/llama-3.3-70b-instruct:free | ND | ND | ND | ND | ND | ND |
Section claude-haiku-4-5-20251001 @cf/meta/llama-3.3-70b-instruct-fp8-fast lite @cf/meta/llama-4-scout-17b-16e-instruct lite deepseek/deepseek-v3.2-20251201 claude-haiku-4-5 lite meta-llama/llama-3.3-70b-instruct:free
Preamble 0.50 ND ND 0.62 ND ND
Article 1 0.27 ND ND 0.92 ND ND
Article 2 0.34 ND ND ND ND ND
Article 3 0.30 ND ND 0.58 ND ND
Article 4 ND ND ND ND ND ND
Article 5 ND ND ND ND ND ND
Article 6 ND ND ND ND ND ND
Article 7 0.36 ND ND ND ND ND
Article 8 0.30 ND ND ND ND ND
Article 9 ND ND ND 0.36 ND ND
Article 10 ND ND ND ND ND ND
Article 11 0.35 ND ND ND ND ND
Article 12 0.63 ND ND 0.95 ND ND
Article 13 ND ND ND ND ND ND
Article 14 ND ND ND ND ND ND
Article 15 ND ND ND ND ND ND
Article 16 0.25 ND ND ND ND ND
Article 17 ND ND ND 0.89 ND ND
Article 18 0.50 ND ND 0.58 ND ND
Article 19 0.59 ND ND 1.00 ND ND
Article 20 0.39 ND ND 0.74 ND ND
Article 21 0.35 ND ND 0.40 ND ND
Article 22 ND ND ND ND ND ND
Article 23 ND ND ND ND ND ND
Article 24 ND ND ND ND ND ND
Article 25 ND ND ND ND ND ND
Article 26 ND ND ND ND ND ND
Article 27 ND ND ND 0.64 ND ND
Article 28 0.43 ND ND 0.56 ND ND
Article 29 0.25 ND ND 0.23 ND ND
Article 30 ND ND ND 0.68 ND ND
+0.53 Apple's plan to “think different” about encryption opens a backdoor to your life (www.eff.org · S: +0.17)
2260 points by bbatsell · 1669 days ago · 824 comments on HN | Moderate positive · Contested Editorial · v3.7 · 2026-02-28 10:14:08
Summary · Privacy & Digital Encryption Rights Advocates
This EFF article advocates strongly for encryption privacy rights and warns against Apple's proposed client-side scanning systems, arguing that even well-intentioned backdoors inevitably enable censorship and surveillance. The piece exemplifies human rights advocacy focused on Articles 12 (privacy), 19 (free expression), and 28 (international governance), engaging extensively with risks to vulnerable populations including LGBTQ+ individuals, minors, and domestic abuse victims while calling readers to action through organized protest.
Article Heatmap
Preamble: +0.50 — Preamble
Article 1: +0.27 — Freedom, Equality, Brotherhood
Article 2: +0.34 — Non-Discrimination
Article 3: +0.30 — Life, Liberty, Security
Article 4: ND — No Slavery
Article 5: ND — No Torture
Article 6: ND — Legal Personhood
Article 7: +0.36 — Equality Before Law
Article 8: +0.30 — Right to Remedy
Article 9: ND — No Arbitrary Detention
Article 10: ND — Fair Hearing
Article 11: +0.35 — Presumption of Innocence
Article 12: +0.63 — Privacy
Article 13: ND — Freedom of Movement
Article 14: ND — Asylum
Article 15: ND — Nationality
Article 16: +0.25 — Marriage & Family
Article 17: ND — Property
Article 18: +0.50 — Freedom of Thought
Article 19: +0.59 — Freedom of Expression
Article 20: +0.39 — Assembly & Association
Article 21: +0.35 — Political Participation
Article 22: ND — Social Security
Article 23: ND — Work & Equal Pay
Article 24: ND — Rest & Leisure
Article 25: ND — Standard of Living
Article 26: ND — Education
Article 27: ND — Cultural Participation
Article 28: +0.43 — Social & International Order
Article 29: +0.25 — Duties to Community
Article 30: ND — No Destruction of Rights
Aggregates
Editorial Mean: +0.53 · Structural Mean: +0.17
Weighted Mean: +0.41 · Unweighted Mean: +0.39
Max: +0.63 (Article 12) · Min: +0.25 (Article 16)
Signal: 15 articles · No Data: 16
Volatility: 0.12 (Medium)
Negative: 0 · Channels: E 0.6 / S 0.4
SETL: +0.44 (Editorial-dominant)
FW Ratio: 64% (32 facts · 18 inferences)
Evidence: 28% coverage
Confidence: 2 high · 10 medium · 3 low · 16 no data
Theme Radar
Foundation: 0.37 (3 articles) · Security: 0.30 (1 article) · Legal: 0.34 (3 articles) · Privacy & Movement: 0.63 (1 article) · Personal: 0.38 (2 articles) · Expression: 0.44 (3 articles) · Economic & Social: 0.00 (0 articles) · Cultural: 0.00 (0 articles) · Order & Duties: 0.34 (2 articles)
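As a sanity check on the aggregates above, a minimal sketch that recomputes them from the per-article heatmap scores. Two assumptions about the dashboard's (undocumented) definitions: "Unweighted Mean" is the simple average of the 15 scored articles, and "Volatility" is their population standard deviation.

```python
from statistics import pstdev

# Per-article editorial scores from the heatmap above (15 articles with signal).
scores = {
    "Preamble": 0.50, "Art 1": 0.27, "Art 2": 0.34, "Art 3": 0.30,
    "Art 7": 0.36, "Art 8": 0.30, "Art 11": 0.35, "Art 12": 0.63,
    "Art 16": 0.25, "Art 18": 0.50, "Art 19": 0.59, "Art 20": 0.39,
    "Art 21": 0.35, "Art 28": 0.43, "Art 29": 0.25,
}

values = list(scores.values())
unweighted_mean = sum(values) / len(values)   # dashboard reports +0.39
volatility = pstdev(values)                   # dashboard reports 0.12
hi = max(scores, key=scores.get)              # dashboard reports Article 12
lo = min(scores, key=scores.get)              # dashboard reports Article 16

print(f"mean={unweighted_mean:.2f} vol={volatility:.2f} max={hi} min={lo}")
```

Under those assumptions the numbers reproduce; note the Min is a tie (Articles 16 and 29 both score +0.25), and the dashboard apparently reports the first in article order.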
HN Discussion 20 top-level · 30 replies
trangus_1985 2021-08-05 20:30 UTC link
I've been maintaining a spare phone running LineageOS exactly in case something like this happened - I love the Apple Watch and Apple ecosystem, but this is such a flagrant abuse of their position as Maintainers Of The Device that I have no choice but to switch.

Fortunately, my email is on a paid provider (Fastmail), my photos are on a NAS, and I've worked hard to get all of my friends onto Signal. While I still use Google Maps, I've been trialing OSM alternatives for a minute.

The things they've described are, in general, reasonable and probably good in the moral sense. However, I'm not sure that I support what they are implementing for child accounts (as a queer kid, I was terrified of my parents finding out). On the surface it seems good - but I am concerned about the other snooping features this portends.

However, with iCloud Photos CSAM scanning, it is also a horrifying precedent that the device I put my life into is scanning my photos and reporting on bad behavior (even if the initial dataset is the most reprehensible behavior).

I'm saddened by Apple's decision, and I hope they recant, because it's the only way I will continue to use their platform.

triska 2021-08-05 20:39 UTC link
I remember an Apple conference where Tim Cook personally assured us that Apple is fully committed to privacy, that everything is so secure because the iPhone is so powerful that all necessary calculations can happen on the device itself, and that we are "not the product". I think the Apple CEO said some of this in the specific context of speech processing, yet it seemed a specific case of a general principle upheld by Apple.

I bought an iPhone because the CEO seemed to be sincere in his commitment to privacy.

What Apple has announced here seems to be a complete reversal from what I understood the CEO saying at the conference only a few years ago.

Shank 2021-08-05 21:05 UTC link
I really love the EFF, but I also believe the immediate backlash is (relatively) daft. There is a potential for abuse of this system, but consider the following too:

1. PhotoDNA is already scanning content from Google Photos and a whole host of other service providers.

2. Apple is obviously under pressure to follow suit, but they developed an on-device system, recruited mathematicians to analyze it, and published the results, as well as one in-house proof and one independent proof showing the cryptographic integrity of the system.

3. Nobody, and I mean nobody, is going to successfully convince the general public that a tool designed to stop the spread of CSAM is a "bad thing" unless they can show concrete examples of the abuse.

For one and two: given the two options, would you rather that Apple implement serverside scanning, in the clear, or go with the on-device route? If we assume a law was passed to require serverside scanning (which could very well happen), what would that do to privacy?

For three: It's an extremely common trope to say that people do things to "save the children." Well, that's still true. Arguing against a CSAM scanning tool, which is technically more privacy preserving than alternatives from other cloud providers, is an extremely uphill battle. The biggest claim here is that the detection tool could be abused against people. And that very well may be possible! But the whole existence of NCMEC is predicated on stopping the active and real danger of child sex exploitation. We know with certainty this is a problem. Compared to a certainty of child sex abuse, the hypothetical risk from such a system is practically laughable to most people.

So, I think again, the backlash is daft. It's been about two days since the announcement became public (via leaks). The underlying mathematics behind the system has barely been published [0]. It looks like the EFF rushed to make a statement here, and in doing so, it doesn't look like they took the time to analyze the cryptography system, to consider the attacks against it, or to consider possible motivations and outcomes. Maybe they did, and they had advance access to the material. But it doesn't look like it, and in the court of public opinion, optics are everything.

[0]: https://www.apple.com/child-safety/pdf/Alternative_Security_...

hncurious 2021-08-05 21:31 UTC link
Apple employees successfully pressured their employer to fire a new hire and are petitioning to keep WFH.

https://www.vox.com/recode/2021/5/13/22435266/apple-employee...

https://www.vox.com/recode/22583549/apple-employees-petition...

Will they apply that energy and leverage to push back on this?

How else can this be stopped before it goes too far? Telling people to "Drop Apple" is even less effective than "Delete Facebook".

c7DJTLrn 2021-08-05 21:37 UTC link
Catching child pornographers should not involve subjecting innocent people to scans and searches. Frankly, I don't care if this "CSAM" system is effective - I paid for the phone, it should operate for ME, not for the government or law enforcement. Besides, the imagery already exists by the time it's been found - the damage has been done. I'd say the authorities should prioritise tracking down the creators but I'm sure their statistics look much more impressive by cracking down on small fry.

I've had enough of the "think of the children" arguments.

nicetryguy 2021-08-05 21:48 UTC link
I'm looking forward to this platform being expanded to facially ID against more databases such as criminals, political dissenters, or anyone with an undesirable opinion so that SWAT teams can barge into the homes of false positive identifications to murder them and their dogs.
iamleppert 2021-08-05 21:55 UTC link
It’s pretty trivial to iteratively construct an image that has the same hash as another, completely different image if you know what the hash should be.

All one needs to do, in order to flag someone or get them caught up in this system, is to gain access to this list of hashes and construct an image. This data is likely to be sought after as soon as this system is implemented, and it will only be a matter of time before a data breach exposes it.

Once that is done, the original premise and security model of the system will be completely eroded.

That said, if this does get implemented I will be getting rid of all my Apple devices. I’ve already switched to Linux on my development laptops. The older I get, the less value Apple products have to me. So it won’t be a big deal for me to cut them out completely.
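An editorial note on the collision claim above: it holds for perceptual hashes, which by design map similar images to the same hash, not for cryptographic ones. A toy sketch with a hypothetical 12-bit "average hash" over a 12-pixel grayscale strip shows how quickly a second preimage falls to blind brute force (this is an illustration only, far simpler than Apple's actual NeuralHash):

```python
import random

def avg_hash(pixels):
    # Toy perceptual hash: one bit per pixel, set when the pixel is
    # brighter than the image mean. Similar images yield similar hashes,
    # which is the property that makes collision search feasible.
    mean = sum(pixels) / len(pixels)
    return tuple(p > mean for p in pixels)

random.seed(42)
original = [random.randrange(256) for _ in range(12)]
target = avg_hash(original)

# Brute-force an unrelated 12-pixel "image" with the identical hash.
attempts = 0
while True:
    attempts += 1
    candidate = [random.randrange(256) for _ in range(12)]
    if candidate != original and avg_hash(candidate) == target:
        break

print(f"collision after {attempts} random tries")
```

A 12-bit space has at most 4096 patterns, so even random guessing finds a match almost instantly; real perceptual hashes are much larger, but targeted search (rather than blind guessing) is what published collision attacks exploit.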

roody15 2021-08-05 21:59 UTC link
My two cents: I get the impression this is related to NSO's Pegasus software. Once the Israeli firm's leaks were made public, Apple had to respond, and it has patched some security holes that were exposed publicly.

NSO used exploits in iMessage to enable them to grab photos, texts among other things.

Now shortly after Apple security patches we see them pivot and now want to “work” with law enforcement. Hmmm almost like once access was closed Apple needs a way to justify “opening” access to devices.

Yes I realize this could be a stretch based on the info. Just seems like an interesting coincidence… back door exposed and closed…. now it’s back open… almost like governments demand access

geraneum 2021-08-05 22:21 UTC link
Didn’t they [Apple] make the same points that EFF is making now, to avoid giving FBI a key to unlock an iOS device that belonged to a terrorist?

“Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.”

“… We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.”

“The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.”

Tim Cook, 2016

cwizou 2021-08-05 22:27 UTC link
The FT article mentioned it was US only, but I'm more afraid of how other governments will try to pressure Apple to adapt said technology to their needs.

Can they trust a random government to give them a database of only CSAM hashes and not insert some extra politically motivated content that they deem illegal?

Because once you've launched this feature in the "land of the free", other countries will require their own implementation for their own needs and demand (through local legislation, which Apple will need to abide by) to control said database.

And how long until they also scan browser history for the same purpose? Why stop at pictures? This is opening a very dangerous door that many here will be uncomfortable with.

Scanning on their premises (assuming they can, as far as we know?) would be a much better choice; this is anything but privacy-forward, whatever the linked "paper" tries to say.

skee_0x4459 2021-08-05 22:38 UTC link
wow. in the middle of reading that, i realized that this is a watershed moment. why would apple go back on their painstakingly crafted image and reputation of being staunchly pro privacy? its not for the sake of the children (lol). no, something happened that has changed the equation for apple. some kind of decisive shift has occurred. maybe apple has finally caved in to the chinese market, like everyone else in the US, and is now making their devices compatible with chinese surveillance. or maybe the US government has finally managed to force apple to crack open its shell of encryption in the name of a western flavored surveillance. but either way, i think it is a watershed moment because securing privacy will from this moment onward be a fringe occupation in the west. unless a competitor rises up -- but thats impossible because there arent enough people who care about privacy to sustain a privacy company. thats the real reason why privacy has died today.

if you really want to save the children, why not build the scanning into safari? scan the whole phone! just scan it all. its really no different than what they are doing. its not like they would have to cross the rubicon to do it, not anymore anyway.

and also i think its interesting how kids will adjust to this. i think a lot of kids wont hear about this and will find themselves caught up in a child porn case.

im so proud of the responses that people seem to generally have. it makes me feel confident in the future of the world.

isnt there some device to encrypt and decrypt messages with a separate device that couples to your phone? like a device fit into a case and that has a keyboard interface built into a screen protector with indium oxide electrodes.

strogonoff 2021-08-05 22:54 UTC link
If Mallory gets a lawful citizen Bob to download a completely innocuous looking but perceptual-CSAM-hash-matching image to his phone, what happens to Bob? I imagine the following options:

- Apple sends Bob’s info to law enforcement; Bob is swatted or his life is destroyed in some other way. Worst, but most likely outcome.

- An Apple employee (or an outsourced contractor) reviews the photo, comparing it to CSAM source image sample used for the hash. Only if the image matches according to human vision, Bob is swatted. This requires there to be some sort of database of CSAM source images, which strikes me as unlikely.

- An Apple employee or a contractor reviews the image for abuse without comparing it to CSAM source, using own subjective judgement. Better, but implies Apple employees could technically SWAT Apple users.

haskaalo 2021-08-05 23:32 UTC link
At this point, I think phones can be compared to a home in terms of privacy.

In your house you might have private documents, or do things you don't want other people to see - just like what we keep on our phones nowadays.

The analogy I'm trying to make: if the government suddenly decided to install cameras in every house, on the premise of making sure no pedophile is abusing a child, with the cameras never sending data unless on-device AI detects something - that, I believe, would shock everyone.

farmerstan 2021-08-05 23:34 UTC link
Police routinely get drug sniffing dogs to give false positives so that they are allowed to search a vehicle.

How do we know Apple or the FBI won't do this? If they want to search someone's phone, all they need to do is enter a hash of a photo they know is on the target's phone and voila, instant access.

Also, how is this not a violation of the Fourth Amendment? I know Apple isn't part of the government, but they are basically acting as a de facto agent of the police by scanning for crimes. Using child porn as a completely transparent excuse to start scanning all our material for anything they want makes me very angry.

Wowfunhappy 2021-08-05 23:36 UTC link
This isn't the biggest issue at play, but one detail I can't stop thinking about:

> If an account held by a child under 13 wishes to send an image that the on-device machine learning classifier determines is a sexually explicit image, a notification will pop up, telling the under-13 child that their parent will be notified of this content. [...] For users between the ages of 13 and 17, a similar warning notification will pop up, though without the parental notification.

Why is it different for children under 13, specifically? The 18-year cutoff makes sense, because turning 18 carries legal weight in the US (as decided via a democratic process), but 13?

13 is an age when many parents start granting their children more freedom, but that's very much rooted in one's individual culture—and the individual child. By giving parents fewer options for 13-year-olds, Apple—a private company—is pushing their views about parenting onto everyone else. I find that a little disturbing.

---

Note: I'm not (necessarily) arguing for greater restrictions on 13-year-olds. Privacy for children is a tricky thing, and I have mixed feelings about this whole scheme. What I know for sure, however, is that I don't feel comfortable with Apple being the one to decide "this thing we've declared an appropriate invasion of privacy for a 12-year-old is not appropriate for a 13-year-old."

Waterluvian 2021-08-05 23:45 UTC link
If I go on 4chan and an illegal image loads and caches into my phone before moderators take it down or I hit the back button, will Apple’s automated system ruin my life?

This kind of stuff absolutely petrifies me because I’m so scared of getting accidentally scooped up for something completely unintentional. And I do not trust police one bit to behave like intelligent adult humans.

Right now I feel like I need to stop doing ANYTHING that goes anywhere outside the velvet ropes of the modern commercial internet. That is, anywhere that cannot pay to moderate everything well enough that I don’t run the risk of having my entire life ruined because some #%^*ing algorithm picks up on some content I didn’t even choose to download.

shrimpx 2021-08-06 02:01 UTC link
From Apple's original text[0]:

> Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching [...]

It's incredible that Apple arrived at the conclusion that client-side scanning that you cannot prevent is more private than cloud-scanning.

Since they claim they're only scanning iCloud content, why not scan in the cloud?

They decided the most private way is to scan iCloud content before it's uploaded to the cloud... Because if they scanned in the cloud it would be seen as a breach of privacy and is bad optics for a privacy-focused company? But scanning on the physical device that they have described as "personal" and "intimate" has better optics? That's amazing.

This decision can only be read as Apple paving the way to scanning all content on the device, to bypass the pesky "Backup to iCloud" options being turned off.

[0] https://www.apple.com/child-safety/

blintz 2021-08-06 02:02 UTC link
One disappointing development from a larger perspective is that many privacy-preserving technologies (multi-party computing, homomorphic encryption, hardware enclaves, etc) are actually getting used to build tools that undermine once-airtight privacy guarantees. E2E starts to become… whatever this is.

A more recent example is how private set intersection became an easy way to get contact tracing tech everywhere while maintaining an often perfunctory notion of privacy.

I wonder where large companies will take this next. It behooves us cryptography/security people who actually care about not walking down this slippery slope to fight back with tech of our own.

This whole thing also somewhat parallels the previous uses of better symmetric encryption and enclaves technologies for DRM and copyright protection.

lovelyviking 2021-08-06 07:24 UTC link
- Apple: Dear User, We are going to install Spyware Engine in your device.

- User: Are you out of your f... mind?

- Apple: It's for children protection.

- User: Ah, ok, no problem, please install spyware and do later whatever you wish and forget about any privacy, the very basis of rights, freedom and democracy.

This is by the way how Russia started to filter the web from political opponents. All necessary controls were put in place under the same slogan: "to protect children"

Yeah, right.

Are modern people that naive and dumb that they can't think two steps ahead? Is that why it's happening?

Edit: Those people would still need to explain how living in society without privacy, freedom and democracy with authoritarian practices when those children will grow up will make them any 'safer' ...

_red 2021-08-05 20:45 UTC link
Yes, my history was Linux 95-04, Mac 04-15, and now back to Linux from 2015 onwards.

It's been clear Tim Cook was going to slowly harm the brand. He was a wonderful COO under a visionary CEO-type, but he holds no particular "Tech Originalist" vision. He's happy to be part of the BigTech aristocracy, and probably feels really at home in the powers it affords him.

Anyone who believes this is "just about the children" is naive. His Chinese partners will use this to crack down on "Winnie the Pooh" cartoons and the like... before long, questioning any Big Pharma product will result in being flagged. Give it 5 years at max.

Klonoar 2021-08-05 20:57 UTC link
I think the EFF is probably doing good by calling attention to the issue, but let's... actually look at the feature before passing judgement, e.g:

https://twitter.com/josephfcox/status/1423382200880439298/ph...

- It's run for Messages in cases where a child is potentially viewing material that's bad.

- It's run _before upload to iCloud Photos_ - where it would've already been scanned anyway, as they've done for years (and as all other major companies do).

To me this really doesn't seem that bad. Feels like a way to actually reach encrypted data all around while still meeting the expectations of lawmakers/regulators. Expansion of the tech would be something I'd be more concerned about, but considering the transparency of it I feel like there's some safety.

https://www.apple.com/child-safety/ more info here as well.

nerdponx 2021-08-05 21:01 UTC link
The cynical take is that Apple was never committed to privacy in and of itself; they are committed to privacy only as long as it improves their competitive advantage, whether through marketing or by making sure that only Apple can extract value from its customers' data.

Hanlon's razor does not apply to megacorporations that have enormous piles of cash and employ a large number of very smart people, who are either entirely unscrupulous or for whom scruples are worth less than their salaries. We probably aren't cynical enough.

I am not arguing that we should always assume every change is always malicious towards users. But our index of suspicion should be high.

echelon 2021-08-05 21:14 UTC link
> There is a potential for abuse of this system, but consider the following too

> I think again, the backlash is daft.

Don't apologize for this bullshit! Don't let your love of brand trump the reality of what's going on here.

Machinery is being put in place to detect what files are on your supposedly secure device. Someone has the reins and promises not to use it for anything other than "protecting the children".

How many election cycles or generations does it take to change to an unfavorable climate where this is now a tool of great asymmetrical power to use against the public?

What happens when the powers that be see that you downloaded labor union materials, documents from Wikileaks, or other files that implicate you as a risk?

Perhaps a content hash on your phone puts you in a flagged bucket where you get pat downs at the airport, increased surveillance, etc.

The only position to take here is a full rebuke of Apple.

edit: Apple apologists are taking a downright scary position now. I suppose the company has taken a full 180 from their 1984 ad centerpiece. But that's okay, right, because Apple is a part of your identity and it's beyond reproach?

edit 2: It's nominally iCloud only (a key feature of the device/ecosystem), but that means having to turn off a lot of settings. One foot in the door...

edit 3: Please don't be complicit in allowing this to happen. Don't apologize or rationalize. This is only a first step. We warned that adtech and monitoring and abuse of open source were coming for years, and we were right. We're telling you - loudly - that this will begin a trend of further erosion of privacy and liberty.

feanaro 2021-08-05 21:18 UTC link
> that a tool designed to stop the spread of CSAM is a "bad thing"

It's certainly said to be designed to do it, but have you seen concerns raised in the other thread (https://news.ycombinator.com/item?id=28068741)? There have been reports from some commenters of the NCMEC database containing unobjectionable photos because they were merely found in a context alongside some CSAM.

Who audits these databases? Where is the oversight to guarantee only appropriate content is included? They are famously opaque because the very viewing of the content is illegal. So how can we know that they contain what they are purported to contain?

This is overreach.

throwaway888abc 2021-08-05 21:24 UTC link
Point 1 was new to me.

TIL - (2014) PhotoDNA Lets Google, FB and Others Hunt Down Child Pornography Without Looking at Your Photos

https://petapixel.com/2014/08/08/photodna-lets-google-facebo...

cle 2021-08-05 21:27 UTC link
Unfortunately with SafetyNet, I feel like an investment into Android is also a losing proposition...I can only anticipate being slowly cut off from the Android app ecosystem as more apps onboard with attestation.

We've collectively handed control of our personal computing devices over to Apple and Google. I fear the long-term consequences of that will not be positive...

shivak 2021-08-05 21:43 UTC link
> recruited mathematicians to analyze it, and published the results, as well as one in-house proof and one independent proof showing the cryptographic integrity of the system.

Apple employs cryptographers, but they are not necessarily acting in your interest. Case in point: their use of private set intersection, to preserve privacy..of law enforcement, not users. Their less technical summary:

> Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.

> Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection..

The matching is performed on device, so the user’s privacy isn’t at stake. But, thanks to PSI and the hash preprocessing, the user doesn’t know what law enforcement is looking for.

lijogdfljk 2021-08-05 21:53 UTC link
I doubt this will be as clean. A large swath of people will defend this "for the children".
JumpCrisscross 2021-08-05 21:56 UTC link
> with icloud photos csam, it is also a horrifying precedent

I'm not so bugged by this. Uploading data to iCloud has always been a trade of convenience at the expense of privacy. Adding a client-side filter isn't great, but it's not categorically unprecedented--Apple executes search warrants against iCloud data--and can be turned off by turning off iCloud back-ups.

The scanning of children's iMessages, on the other hand, is a subversion of trust. Apple spent the last decade telling everyone their phones were secure. Creating this side channel opens up all kinds of problems. Having trouble as a controlling spouse? No problem--designate your partner as a child. Concerned your not-a-tech-whiz kid isn't adhering to your house's sexual mores? Solved. Bonus points if your kid's phone outs them as LGBT. To say nothing of most sexual abuse of minors happening at the hands of someone they trust. Will their phone, when they attempt to share evidence, tattle on them to their abuser?

Also, can't wait for Dads' photos of their kids landing them on a national kiddie porn watch list.

bambax 2021-08-05 21:59 UTC link
> probably good in the moral sense

How, how is it even morally good?? Will they start taking pictures of your house to see if you store drugs under your couch? Or cook meth in your kitchen??

What is moral is for society to be in charge of laws and law enforcement. This vigilante behavior by private companies who answer to no one is unjust, tyrannical and just plain crazy.

jjtheblunt 2021-08-05 22:09 UTC link
Cryptographic hashes are exactly not trivial to "dupe".

https://en.wikipedia.org/wiki/Cryptographic_hash_function

that said, it's not clear to me from

https://www.apple.com/child-safety/pdf/Apple_PSI_System_Secu...

how collision resistant what's to be used will be.
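The distinction jjtheblunt is drawing can be seen directly: a true cryptographic hash exhibits the avalanche effect (any input tweak scrambles the whole digest), which is exactly the property a perceptual matcher cannot have, since it must tolerate resizing and recompression. A quick standard-library illustration (the two byte strings are arbitrary examples):

```python
import hashlib

# Two inputs differing in a single character.
a = hashlib.sha256(b"holiday_photo_v1").hexdigest()
b = hashlib.sha256(b"holiday_photo_v2").hexdigest()

# For SHA-256, a one-character change flips about half the output bits,
# so roughly 15 of every 16 hex digits differ -- the avalanche effect.
differing = sum(x != y for x, y in zip(a, b))
print(f"{differing}/64 hex digits differ")
```

Apple's system necessarily uses a perceptual hash rather than something like SHA-256, which is why the collision-resistance question in the linked PSI paper is the right one to ask.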

bambax 2021-08-05 22:11 UTC link
Yes. I'm not interested in catching pedophiles, or drug dealers, or terrorists. It's the job of the police. I'm not the police.
randcraw 2021-08-05 22:18 UTC link
You presume Apple and the DoJ will implement this with human beings at each step. They won't. Both parties will automate as much of this clandestine search as possible. With time, the external visibility and oversight of this practice will fade, and with it, any motivation to confirm fair and accurate matches. Welcome to the sloppiness inherent in clandestine law enforcement intel gathering.

As with all politically-motivated initiatives that boldly violate the Constitution (consider the FISA Court, and its rubber stamp approval of 100% of the secret warrants put before it), the use and abuse of this system will go largely underground, like FISA, and its utility will slowly degrade due to lack of oversight. In time, even bad matches will log the IDs of both parties in databases that label them as potential sexual predators.

Believe it. That's how modern computer-based gov't intel works. Like most law enforcement policy recommendation systems, Apple's initial match algorithm will never be assessed for accuracy, nor be accountable for being wrong at least 10% of the time. In time it will be replaced by other third party screening software that will be even more poorly written and overseen. That's just what law enforcement does.

I've personally seen people suffer this kind of gov't abuse and neglect as a result of clueless automated law enforcement initiatives after 9/11. I don't welcome more, nor the gradual and willful tossing of everyone's basic Constitutional rights that Apple's practice portends.

The damage to personal liberty inherent in conducting secret searches without cause or oversight is exactly why the Fourth Amendment requires a warrant before conducting a search. NOW is the time to disabuse your sense of 'daftness'; not years from now, after the Fourth and Fifth Amendments become irreversibly passé. Or should I say, 'daft'?

tcoff91 2021-08-05 22:24 UTC link
What is the hashing scheme? I assume it must not be a cryptographically secure hashing scheme if it's possible to find a collision. It's not something like SHA-256?
mrits 2021-08-05 22:30 UTC link
There isn't any reason to believe the CSAM hash list is only images. The government now has the ability to search for anything in your iCloud account with this.
rubatuga 2021-08-05 22:31 UTC link
Think of the children!!!
2OEH8eoCRo0 2021-08-05 22:38 UTC link
Why is it always "think of the children"? It gets people emotional? What about terrorism, murder, or a litany of other heinous violent crimes?
aalam 2021-08-05 22:42 UTC link
The initial rollout is limited to the US, with no concrete plans reported yet on expansion.

“The scheme will initially roll out only in the US. […] Apple’s neuralMatch algorithm will continuously scan photos that are stored on a US user’s iPhone and have also been uploaded to its iCloud back-up system.”

Researchers interviewed for the article would agree with your analysis. “Security researchers [note: appears to be the named security professors quoted later in the article], while supportive of efforts to combat child abuse, are concerned that Apple risks enabling governments around the world to seek access to their citizens’ personal data, potentially far beyond its original intent.”

Article link for ease of access: https://www.ft.com/content/14440f81-d405-452f-97e2-a81458f54...

zionic 2021-08-05 22:54 UTC link
I’m furious. My top app has 250,000 uniques a day.

I’m considering a 24h blackout with a protest link to Apple’s support email explaining what they’ve done.

I wonder if anyone else would join me?

burself 2021-08-05 22:57 UTC link
The algorithms and data involved are too sensitive to be discussed publicly, and the reasoning is acceptable enough to even the most knowledgeable people. They can't even be pressured to prove that the system is effective at its primary purpose.

This is the perfect way to begin opening the backdoors.

zionic 2021-08-05 23:00 UTC link
You can’t “save the children” by building a dystopia for them to grow up in.
divbzero 2021-08-05 23:02 UTC link
A sibling comment speculates that this is related to Pegasus [1] which sounds wild to me but maybe, just maybe, it’s not.

[1]: https://news.ycombinator.com/item?id=28080539

cronix 2021-08-05 23:09 UTC link
As soon as Cook became CEO, he let the NSA's Prism program into Apple. Everything since then has been a fucking lie.

> Andrew Stone, who worked with Jobs for nearly 25 years, told the site Cult of Mac last week that Steve Jobs resisted letting Apple be part of PRISM, a surveillance program that gives the NSA access to records of major Internet companies. His comments come amid speculation that Jobs resisted cooperating. “Steve Jobs would’ve rather died than give into that,” Stone told the site.

> According to leaked NSA slides about PRISM, Apple was the last tech behemoth to join the secret program — in October 2012, a year after Jobs died. Apple has said that it first heard about PRISM on June 6 of this year, when asked about it by reporters.

https://www.huffpost.com/entry/apple-nsa-steve-jobs_n_346132...

I mean, maybe they didn't call it "PRISM" when talking about it with Cook, so it could technically be true that they didn't hear of PRISM until media stories. Everyone knows the spy agency goes around telling all of their project code names to companies they're trying to compromise. Hello, sir. We're here to talk to you about our top secret surveillance program we like to call PRISM where we intercept and store communications of everyone. Would you like to join? MS did. So did Google. Don't you want to be in our select cool club?

bitexploder 2021-08-05 23:17 UTC link
Do we know that they are using perceptual hashing? I am curious about the details of the hash database they are comparing against, but I assumed perceptual hashing would be pretty fraught with edge cases and false positives.

e: It is definitely not a strict/cryptographic hash algorithm: "Apple says NeuralHash tries to ensure that identical and visually similar images — such as cropped or edited images — result in the same hash." They are calling it "NeuralHash" -- https://techcrunch.com/2021/08/05/apple-icloud-photos-scanni...
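To make the distinction concrete, here is a minimal sketch of an "average hash" (aHash), one of the simplest perceptual hashes. NeuralHash is a learned and far more sophisticated relative; the 8x8 grids below are hypothetical stand-ins for downscaled grayscale images.

```python
def average_hash(pixels):
    """Map an 8x8 grayscale grid to a 64-bit hash: bit is 1 if the
    pixel is brighter than the image's mean brightness, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = ''.join('1' if p > mean else '0' for p in flat)
    return int(bits, 2)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count('1')

# A hypothetical 8x8 brightness gradient standing in for a real image.
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]

# A lightly edited copy: nudge one pixel, as a re-encode might.
edited = [row[:] for row in original]
edited[0][0] += 3

d = hamming(average_hash(original), average_hash(edited))
print(f"hamming distance after a small edit: {d}")
```

Because the hash depends on coarse brightness structure rather than exact bytes, small edits leave it unchanged or within a small Hamming distance, which is what lets the system match cropped/edited images, and also what creates the room for false positives and engineered collisions discussed above.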

dang 2021-08-05 23:25 UTC link
Thanks! Macroexpanded:

Expanded Protections for Children - https://news.ycombinator.com/item?id=28078115 - Aug 2021 (291 comments)

Apple plans to scan US iPhones for child abuse imagery - https://news.ycombinator.com/item?id=28075021 - Aug 2021 (349 comments)

Apple enabling client-side CSAM scanning on iPhone tomorrow - https://news.ycombinator.com/item?id=28068741 - Aug 2021 (680 comments)

vineyardmike 2021-08-05 23:26 UTC link
> as a queer kid, I was terrified of my parents finding out

I think many queer people have a completely different idea of the concept of "why do you want to hide if you're not doing anything wrong" and the desire to stay private. Especially since anything sexual and related to queerness is way more aggressively policed than hetero-normative counterparts.

Anything "think of the children" always has a second-order effect of damaging queer people because lots of people still think of queerness as dangerous to children.

It is beyond likely that lots of this monitoring will catch legal/safe queer content - especially the parental-controls focused monitoring (as opposed to the gov'ment db of illegal content)

anonuser123456 2021-08-05 23:56 UTC link
Downloading an image to your phone is different than uploading it to iCloud.

Downloaded images are not uploaded to iCloud w/out user intervention.

anonuser123456 2021-08-06 00:01 UTC link
> How do we know Apple or the FBI don’t do this?

Because it requires Apple and law enforcement, two separate organizations, to collude against you.

The false positive would have to be affirmed to a court and entered into evidence. If the false positive were found by the court not to match the true image, any warrant would be found invalid, and the fruit of any search would be invalid as well.

Apple is a private company. By agreeing to use iCloud Photos you agree to their terms, so there is no 14th Amendment violation.

rajacombinator 2021-08-06 00:08 UTC link
> How do we know Apple or the FBI don’t do this?

You can almost certainly know that they will do this. Or just text the target a bad photo.
Editorial Channel
What the content says
+0.85
Article 12 Privacy
High Advocacy Framing Coverage
Editorial
+0.85
SETL
+0.68

Core theme. Article extensively defends privacy rights and critiques Apple's encryption backdoor as a direct violation of privacy expectations. Provides technical analysis of surveillance mechanisms and warns of privacy erosion for all users.

+0.82
Article 19 Freedom of Expression
High Advocacy Framing Coverage
Editorial
+0.82
SETL
+0.68

Core theme. Article extensively documents how encryption backdoors enable censorship and chilling effects on freedom of expression, particularly for marginalized speech (LGBTQ+, protest, satire). Directly challenges Apple's claim of maintaining end-to-end encryption.

+0.70
Preamble Preamble
Medium Advocacy Framing Coverage
Editorial
+0.70
SETL
+0.59

Article invokes human dignity, freedom, and fundamental rights as grounds for protecting privacy and encryption. Frames digital surveillance as a threat to foundational human rights.

+0.70
Article 18 Freedom of Thought
Medium Framing Coverage
Editorial
+0.70
SETL
+0.59

Article warns extensively that Apple's classifier could suppress diverse viewpoints by identifying and restricting LGBTQ+ content, political satire, and protest materials. Documents how ML filters create chilling effects through over-blocking.

+0.62
Article 28 Social & International Order
Medium Framing Coverage
Editorial
+0.62
SETL
+0.54

Article discusses international governance issues, citing India and Ethiopia as examples of how encryption backdoors could enable authoritarian control. Criticizes GIFCT database for operating without external oversight.

+0.55
Article 20 Assembly & Association
Medium Framing Coverage
Editorial
+0.55
SETL
+0.47

Article warns that Apple's system could be weaponized to identify and suppress protest-related content, assembly materials, and protest flyers in authoritarian jurisdictions.

+0.50
Article 2 Non-Discrimination
Medium Framing Coverage
Editorial
+0.50
SETL
+0.45

Article warns that Apple's system could enable discrimination against LGBTQ+ content in jurisdictions with anti-LGBTQ+ laws, citing India as a concrete example.

+0.50
Article 7 Equality Before Law
Low Framing
Editorial
+0.50
SETL
+0.42

Article emphasizes unequal treatment of children vs. adults in Apple's system, raising fairness concerns about differential application of monitoring.

+0.48
Article 21 Political Participation
Medium Advocacy Coverage
Editorial
+0.48
SETL
+0.40

Article emphasizes user agency and self-determination, arguing that users and minors have the right to control their own communications and make decisions about encryption.

+0.45
Article 11 Presumption of Innocence
Medium Framing Coverage
Editorial
+0.45
SETL
+0.34

Article critiques the system for monitoring minors' communications without presumption of innocence and for restricting minors' ability to delete flagged content without parental consent.

+0.40
Article 3 Life, Liberty, Security
Low Framing
Editorial
+0.40
SETL
+0.32

Privacy is framed as a component of personal security in digital communications, connecting data protection to personal safety.

+0.40
Article 8 Right to Remedy
Medium Advocacy Coverage
Editorial
+0.40
SETL
+0.32

Article provides explicit calls to action for accessing remedy: nationwide protest coordination and contact information for Apple leadership.

+0.35
Article 1 Freedom, Equality, Brotherhood
Low Coverage
Editorial
+0.35
SETL
+0.26

Article identifies vulnerable populations (minors, LGBTQ+, abuse victims) as rights-holders affected by Apple's policy, suggesting concern for equal dignity.

+0.35
Article 16 Marriage & Family
Medium Framing Coverage
Editorial
+0.35
SETL
+0.30

Article warns that parental monitoring features could be misused by abusive partners as stalkerware, identifying domestic abuse and intimate partner surveillance as risks to family privacy.

+0.35
Article 29 Duties to Community
Medium Framing
Editorial
+0.35
SETL
+0.30

Article acknowledges the legitimate goal of protecting children from exploitation while arguing that Apple's technical approach violates privacy rights, reflecting a nuanced understanding of competing duties.

ND
Article 4 No Slavery

No data.

ND
Article 5 No Torture

No data.

ND
Article 6 Legal Personhood

No data.

ND
Article 9 No Arbitrary Detention

No data.

ND
Article 10 Fair Hearing

No data.

ND
Article 13 Freedom of Movement

No data.

ND
Article 14 Asylum

No data.

ND
Article 15 Nationality

No data.

ND
Article 17 Property

No data.

ND
Article 22 Social Security

No data.

ND
Article 23 Work & Equal Pay

No data.

ND
Article 24 Rest & Leisure

No data.

ND
Article 25 Standard of Living

No data.

ND
Article 26 Education

No data.

ND
Article 27 Cultural Participation

No data.

ND
Article 30 No Destruction of Rights

No data.

Structural Channel
What the site does
Element Modifier Affects Note
Legal & Terms
Privacy +0.25
Article 12 Article 17
Domain mission centers on privacy protection. EFF maintains Privacy Badger and Surveillance Self-Defense tools. Strong track record of privacy advocacy.
Terms of Service +0.05
Article 29
Standard TOS language; no significant human rights restrictions observed.
Identity & Mission
Mission +0.28
Article 1 Article 19 Article 20
EFF explicitly champions free speech, privacy, and digital rights. Mission statement aligned with UDHR values.
Editorial Code +0.12
Article 19
Editorial independence evident; no editorial policy discovered that undermines human rights discourse.
Ownership +0.08
Article 19 Article 25
Nonprofit 501(c)(3) structure; no profit-driven ownership conflicts observed.
Access & Distribution
Access Model +0.15
Article 19 Article 26
Content freely accessible; no paywall or access restrictions.
Ad/Tracking -0.08
Article 12 Article 17
Piwik analytics tracking present (anon-stats.eff.org); minor privacy concern despite anonymization claims.
Accessibility +0.10
Article 26 Article 27
Site appears functional and navigable; no apparent accessibility barriers detected.
+0.30
Article 12 Privacy
High Advocacy Framing Coverage
Structural
+0.30
Context Modifier
ND
SETL
+0.68

Content is freely accessible without paywall or login barriers, supporting open access to privacy information.

+0.25
Article 19 Freedom of Expression
High Advocacy Framing Coverage
Structural
+0.25
Context Modifier
ND
SETL
+0.68

Article is publicly accessible and openly advocates for freedom of expression, modeling the rights it defends.

+0.20
Preamble Preamble
Medium Advocacy Framing Coverage
Structural
+0.20
Context Modifier
ND
SETL
+0.59

Open access to article supports freedom of expression and information dissemination without barriers.

+0.20
Article 11 Presumption of Innocence
Medium Framing Coverage
Structural
+0.20
Context Modifier
ND
SETL
+0.34

Article provides accessible information about due process concerns in the system design.

+0.20
Article 18 Freedom of Thought
Medium Framing Coverage
Structural
+0.20
Context Modifier
ND
SETL
+0.59

Open-access article itself models freedom of expression by openly discussing censorship risks.

+0.15
Article 1 Freedom, Equality, Brotherhood
Low Coverage
Structural
+0.15
Context Modifier
ND
SETL
+0.26

No specific structural signals regarding dignity recognition.

+0.15
Article 3 Life, Liberty, Security
Low Framing
Structural
+0.15
Context Modifier
ND
SETL
+0.32

No specific structural engagement with security-related provisions.

+0.15
Article 7 Equality Before Law
Low Framing
Structural
+0.15
Context Modifier
ND
SETL
+0.42

No specific structural signals regarding equality before law.

+0.15
Article 8 Right to Remedy
Medium Advocacy Coverage
Structural
+0.15
Context Modifier
ND
SETL
+0.32

Website provides functional access to action resources and links to further information.

+0.15
Article 20 Assembly & Association
Medium Framing Coverage
Structural
+0.15
Context Modifier
ND
SETL
+0.47

No specific structural engagement with assembly/association rights.

+0.15
Article 21 Political Participation
Medium Advocacy Coverage
Structural
+0.15
Context Modifier
ND
SETL
+0.40

Call-to-action structure empowers readers to participate in decisions affecting their rights.

+0.15
Article 28 Social & International Order
Medium Framing Coverage
Structural
+0.15
Context Modifier
ND
SETL
+0.54

Article provides accessible information about international human rights concerns.

+0.10
Article 2 Non-Discrimination
Medium Framing Coverage
Structural
+0.10
Context Modifier
ND
SETL
+0.45

Open discourse about discrimination risks supports non-discriminatory information access.

+0.10
Article 16 Marriage & Family
Medium Framing Coverage
Structural
+0.10
Context Modifier
ND
SETL
+0.30

No specific structural engagement with family privacy.

+0.10
Article 29 Duties to Community
Medium Framing
Structural
+0.10
Context Modifier
ND
SETL
+0.30

No specific structural engagement with Article 29.

ND
Article 4 No Slavery

No data.

ND
Article 5 No Torture

No data.

ND
Article 6 Legal Personhood

No data.

ND
Article 9 No Arbitrary Detention

No data.

ND
Article 10 Fair Hearing

No data.

ND
Article 13 Freedom of Movement

No data.

ND
Article 14 Asylum

No data.

ND
Article 15 Nationality

No data.

ND
Article 17 Property

No data.

ND
Article 22 Social Security

No data.

ND
Article 23 Work & Equal Pay

No data.

ND
Article 24 Rest & Leisure

No data.

ND
Article 25 Standard of Living

No data.

ND
Article 26 Education

No data.

ND
Article 27 Cultural Participation

No data.

ND
Article 30 No Destruction of Rights

No data.

Supplementary Signals
How this content communicates, beyond directional lean. Learn more
Epistemic Quality
How well-sourced and evidence-based is this content?
0.76 medium claims
Sources
0.8
Evidence
0.8
Uncertainty
0.7
Purpose
0.9
Propaganda Flags
2 manipulative rhetoric techniques found
2 techniques detected
appeal to fear
Repeatedly frames Apple's system as 'opening the door' to abuse: 'Apple Is Opening the Door to Broader Abuses,' 'Opens a Backdoor to Your Private Life,' repeated warnings about surveillance and censorship risks.
loaded language
Uses charged terms: 'backdoor,' 'bend its privacy-protective stance,' 'shocking about-face,' 'tectonic shift,' 'chill expression,' 'stalkerware.'
Emotional Tone
Emotional character: positive/negative, intensity, authority
urgent
Valence
-0.6
Arousal
0.8
Dominance
0.7
Transparency
Does the content identify its author and disclose interests?
1.00
✓ Author
More signals: context, framing & audience
Solution Orientation
Does this content offer solutions or only describe problems?
0.68 mixed
Reader Agency
0.8
Stakeholder Voice
Whose perspectives are represented in this content?
0.45 4 perspectives
Speaks: institution, marginalized
About: corporation, government, individuals
Temporal Framing
Is this content looking backward, at the present, or forward?
present short term
Geographic Scope
What geographic area does this content cover?
global
United States, India, Ethiopia
Complexity
How accessible is this content to a general audience?
moderate · medium jargon · general
Longitudinal · 5 evals
Audit Trail 25 entries
2026-02-28 10:14 model_divergence Cross-model spread 0.44 exceeds threshold (5 models) - -
2026-02-28 10:14 eval Evaluated by claude-haiku-4-5-20251001: +0.41 (Moderate positive)
2026-02-28 01:41 dlq Dead-lettered after 1 attempts: Apple's plan to “think different” about encryption opens a backdoor to your life - -
2026-02-28 01:40 dlq Dead-lettered after 1 attempts: Apple's plan to “think different” about encryption opens a backdoor to your life - -
2026-02-28 01:39 rate_limit OpenRouter rate limited (429) model=llama-3.3-70b - -
2026-02-28 01:38 rate_limit OpenRouter rate limited (429) model=llama-3.3-70b - -
2026-02-28 01:38 rate_limit OpenRouter rate limited (429) model=llama-3.3-70b - -
2026-02-28 01:37 rate_limit OpenRouter rate limited (429) model=llama-3.3-70b - -
2026-02-28 01:36 rate_limit OpenRouter rate limited (429) model=llama-3.3-70b - -
2026-02-28 01:36 dlq_replay DLQ message 97677 replayed to LLAMA_QUEUE: Apple's plan to “think different” about encryption opens a backdoor to your life - -
2026-02-28 01:36 dlq_replay DLQ message 97658 replayed to LLAMA_QUEUE: Apple's plan to “think different” about encryption opens a backdoor to your life - -
2026-02-28 00:21 eval_success Light evaluated: Strong positive (0.80) - -
2026-02-28 00:21 eval Evaluated by llama-3.3-70b-wai: +0.80 (Strong positive)
2026-02-27 21:09 dlq Dead-lettered after 1 attempts: Apple's plan to “think different” about encryption opens a backdoor to your life - -
2026-02-27 21:08 dlq Dead-lettered after 1 attempts: Apple's plan to “think different” about encryption opens a backdoor to your life - -
2026-02-27 21:07 rate_limit OpenRouter rate limited (429) model=llama-3.3-70b - -
2026-02-27 21:06 rate_limit OpenRouter rate limited (429) model=llama-3.3-70b - -
2026-02-27 21:06 rate_limit OpenRouter rate limited (429) model=llama-3.3-70b - -
2026-02-27 21:06 rate_limit OpenRouter rate limited (429) model=llama-3.3-70b - -
2026-02-27 21:05 rate_limit OpenRouter rate limited (429) model=llama-3.3-70b - -
2026-02-27 21:05 rate_limit OpenRouter rate limited (429) model=llama-3.3-70b - -
2026-02-27 21:05 dlq_auto_replay DLQ auto-replay: message 97555 re-enqueued - -
2026-02-27 16:33 eval Evaluated by llama-4-scout-wai: +0.80 (Strong positive)
2026-02-27 13:59 eval Evaluated by deepseek-v3.2: +0.69 (Neutral) 11,112 tokens
2026-02-27 12:32 eval Evaluated by claude-haiku-4-5: +0.85 (Strong positive)