Follow-up to last week’s story:
https://lemmy.ml/post/16672524
EDIT1: Politicians expect to be exempt.
EDIT2: Good news: Vote has been postponed due to disagreements.
Privacy has become a very important issue in modern society. With companies and governments constantly abusing their power, more and more people are waking up to the importance of digital privacy.
In this community everyone is welcome to post links and discuss topics related to privacy.
How about the false positives? You want your name permanently associated with child porn because someone fucked up and ruined your life? https://www.eff.org/deeplinks/2022/08/googles-scans-private-photos-led-false-accusations-child-abuse
The whole system is so flawed that it has something like a 20-25% success rate.
Or how about this system being adopted for anything else? Guns? Abortion? LGBT related issues? Once something gets implemented, it’s there forever and expansion is inevitable. And each subsequent government will use it for their personal agenda.
They say the images are merely matched to pre-determined images found on the web. You’re talking about a different scenario where AI detects inappropriate content in an image.
Matched using perceptual hash algorithms that have an accuracy between 20% and 40%.
Is there a source stating that they’re going to require these?
Unfortunately, I couldn’t find a source stating it would be required. AFAIK it’s been assumed that they would use perceptual hashes, since that’s what various companies have been suggesting/presenting. Like Apple’s NeuralHash, which was reverse engineered. It’s also the only somewhat practical solution, since exact matches could easily be circumvented by changing one pixel or mirroring the image.
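To see why exact matching is so easy to evade: a cryptographic hash of an image changes completely when a single byte of pixel data changes. A minimal sketch with fake pixel bytes (the "image" here is just illustrative data, not any real scanner's input format):

```python
import hashlib

# A toy "image": raw pixel bytes. Flipping a single pixel value
# completely changes a cryptographic digest, which is why exact-match
# scanning is trivial to circumvent.
image = bytes(range(256)) * 4          # 1024 bytes of fake pixel data
tweaked = bytearray(image)
tweaked[0] ^= 1                        # change one pixel by one unit

h1 = hashlib.sha256(image).hexdigest()
h2 = hashlib.sha256(bytes(tweaked)).hexdigest()

print(h1 == h2)   # False: the two digests share no usable similarity
```

This is the avalanche property of cryptographic hashes, and it’s exactly why proposals lean on perceptual hashing instead.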
Patrick Breyer’s page on Chat Control has a lot of general information about the EU’s proposal.
Stupid regulation, honestly. Exact matches are implementable, but beyond that… Aren’t they basically banning E2EE at this point?
Now I see why Signal said it would pull out of the EU.
Change one pixel and suddenly it doesn’t match. Do the comparison based on similarity instead, and now you’re back to false positives.
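The trade-off above can be sketched with a toy average-hash (aHash): the hash records which pixels are brighter than the image mean, and matching is done by Hamming distance under some threshold. The function names and the tiny 3x3 "image" here are purely illustrative, not any vendor’s actual algorithm:

```python
# Toy average-hash sketch. A one-pixel tweak barely moves the hash
# (robust matching), but the same looseness is what lets unrelated
# images fall within a similarity threshold, i.e. false positives.

def average_hash(pixels):
    """pixels: flat list of grayscale values (0-255). Returns a bit list."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

original = [10, 200, 30, 220, 15, 210, 25, 230, 12]   # tiny 3x3 "image"
tweaked = original[:]
tweaked[0] = 11                                       # change one pixel

h_orig = average_hash(original)
h_tweak = average_hash(tweaked)

# Unlike a cryptographic hash, the perceptual hash is unchanged here:
print(hamming(h_orig, h_tweak))   # 0 — still a "match" at any threshold
```

The whole accuracy debate comes down to where that Hamming-distance threshold is set: too strict and trivial edits evade it, too loose and innocent images collide with the blocklist.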
My guess was that this law would permit something as simple as pixel matching. Honestly, I can’t imagine they could codify anything more sophisticated into law. Companies don’t want false positives either, at the very least because of profits.