Controversial firm that acts as a search engine for faces wins appeal against UK privacy watchdog.

"A company which enables its clients to search a database of billions of images scraped from the internet for matches to a particular face has won an appeal against the UK’s privacy watchdog.

Last year, Clearview AI was fined more than £7.5m by the Information Commissioner’s Office (ICO) for unlawfully storing facial images."

Privacy International (who, I believe, helped bring the original case) responded to this on Mastodon:

"The first 33 pages of the judgment explain with great detail and clarity why Clearview falls squarely within the bounds of GDPR. Clearview’s activities are entirely “related to the monitoring of behaviour” of UK data subjects.

In essence, what Clearview does is large-scale processing of a highly intrusive nature. That, the Tribunal agreed.

BUT in the last 2 pages the Tribunal tells us that because Clearview only sells to foreign governments, it doesn’t fall under UK GDPR jurisdiction.

So Clearview would have been subject to GDPR if it sold its services to UK police or government authorities or commercial entities, but because it doesn’t, it can do whatever the hell it wants with UK people’s data - this is at best puzzling, at worst nonsensical."

hiddengoat

Just using the information you have posted publicly in various places, someone who has access to the right sources could pick your rather unique mobile device out of a haystack with very little trouble. Doing so would give them location data that, combined with the hobbies you mention, narrows down a few places you could be found in a given area. From that point it’s down to either obtaining surveillance video or, more readily, just trawling the backgrounds of photos tagged with that location and using the physical descriptors you’ve used to determine which individual is you.

And from there it’s just a matter of tracing other appearances you made in other people’s photos and surveillance video.

They already have you, whether you want them to or not.
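
To make the photo angle concrete: a single geotagged picture already hands over precise coordinates, because most phone cameras write GPS data into the EXIF metadata. Here is a minimal sketch of reading it, assuming Pillow is installed; "photo.jpg" is just a placeholder path, not anything from the post above.

    # Minimal sketch: pull GPS coordinates out of a photo's EXIF metadata.
    # Assumes Pillow is installed; "photo.jpg" is a placeholder path.
    from PIL import Image
    from PIL.ExifTags import GPSTAGS

    def gps_coordinates(path):
        """Return (lat, lon) in decimal degrees if the photo carries GPS EXIF, else None."""
        exif = Image.open(path).getexif()
        gps_ifd = exif.get_ifd(0x8825)  # 0x8825 is the standard GPSInfo tag
        if not gps_ifd:
            return None
        gps = {GPSTAGS.get(tag, tag): value for tag, value in gps_ifd.items()}

        def to_degrees(dms, ref):
            # EXIF stores degrees, minutes, seconds as rationals; fold into decimal degrees.
            deg = float(dms[0]) + float(dms[1]) / 60 + float(dms[2]) / 3600
            return -deg if ref in ("S", "W") else deg

        return (to_degrees(gps["GPSLatitude"], gps["GPSLatitudeRef"]),
                to_degrees(gps["GPSLongitude"], gps["GPSLongitudeRef"]))

    print(gps_coordinates("photo.jpg"))  # prints (lat, lon), or None if no GPS data is present

Anyone scraping tagged photos at scale gets this for free, which is exactly the "location data" step described above.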

Indeed: spend enough time and effort and anybody can be deanonymized and fully documented. The point is that privacy-conscious individuals should make it as difficult to automate as possible.

Clearview - and to a large extent all the other corporate surveillance players - goes primarily for the low-hanging fruit: people who post selfies with their names attached or with the EXIF data still embedded, tagged group photos and the like. Bots can easily scrape those. If you go out of your way either to not provide that data in the first place, or to pollute the well with fake photos and/or fake names, you make it harder for the big-data players to exploit your data.

It’s still possible, just less likely unless you’re a high-value target - and realistically, most people aren’t.
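
For the not-volunteering-data part, removing EXIF before uploading is easy to automate. A minimal sketch, again assuming Pillow; the file names are placeholders:

    # Minimal sketch: re-save only the pixel data so EXIF (camera model, timestamps,
    # GPS coordinates) and other metadata blocks are dropped. File names are placeholders.
    from PIL import Image

    def strip_metadata(src, dst):
        img = Image.open(src)
        clean = Image.new(img.mode, img.size)
        clean.putdata(list(img.getdata()))  # copy pixels only; metadata is not carried over
        clean.save(dst)

    strip_metadata("selfie.jpg", "selfie_clean.jpg")

Scrubbing the metadata doesn’t hide your face from a scraper, of course - it just stops the location and device details from riding along with the image.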
