"A company which enables its clients to search a database of billions of images scraped from the internet for matches to a particular face has won an appeal against the UK's privacy watchdog.

Last year, Clearview AI was fined more than £7.5m by the Information Commissioner's Office (ICO) for unlawfully storing facial images."

Privacy International (who, I believe, helped bring the original case) responded to this on Mastodon:

"The first 33 pages of the judgment explain with great detail and clarity why Clearview falls squarely within the bounds of GDPR. Clearview's activities are entirely "related to the monitoring of behaviour" of UK data subjects.

In essence, what Clearview does is large-scale processing of a highly intrusive nature. On that, the Tribunal agreed.

BUT in the last 2 pages the Tribunal tells us that because Clearview only sells to foreign governments, it doesn't fall under UK GDPR jurisdiction.

So Clearview would have been subject to GDPR if it sold its services to UK police or government authorities or commercial entities, but because it doesn't, it can do whatever the hell it wants with UK people's data - this is at best puzzling, at worst nonsensical."

[-] hiddengoat@kbin.social 3 points 1 year ago

Just using the information you have posted publicly in various places, someone who has access to the right sources could pick your rather unique mobile device out of a haystack with very little effort. Doing so would give them location data that, combined with a number of hobbies you mention, would yield a reasonable guess at a few different places you could be found in a given area. From that point it's down to either obtaining surveillance video or, more readily, just trawling the backgrounds of photos tagged with that location and using physical descriptors you've mentioned to determine which individual is you.

And from there it's just a matter of tracing other appearances you made in other people's photos and surveillance video.

They already have you, whether you want them to or not.

[-] ExtremeDullard@lemmy.sdf.org -1 points 1 year ago

Indeed: spend enough time and effort and anybody can be deanonymized and fully documented. The point is that privacy-conscious individuals should make it as difficult to automate as possible.

Clearview - and, to a large extent, all the other corporate surveillance players - goes primarily for the low-hanging fruit: people who post selfies with their names attached or don't remove the EXIF data, tagged group photos and the like. Bots can easily scrape those. If you go out of your way either to not provide that data in the first place, or to pollute the well by attaching fake photos and/or fake names, you make it harder for big data to exploit yours.
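To illustrate the EXIF point above: a minimal sketch of stripping metadata from a photo before posting it, using the third-party Pillow library (assumed installed; the function name `strip_metadata` is my own, not anything from the thread):

```python
# Hypothetical sketch: re-save an image with pixel data only,
# discarding EXIF (camera model, GPS coordinates, timestamps)
# that scrapers can harvest. Assumes Pillow is installed.
from PIL import Image

def strip_metadata(src_path: str, dst_path: str) -> None:
    """Copy an image's pixels into a fresh file, leaving metadata behind."""
    img = Image.open(src_path)
    # A brand-new image starts with no EXIF; copying only the
    # pixel data means none of the original tags carry over.
    clean = Image.new(img.mode, img.size)
    clean.putdata(list(img.getdata()))
    clean.save(dst_path)
```

Note this re-encodes the file, so for JPEGs there is a small quality cost; tools like `exiftool` can strip tags in place instead.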

It's still possible, just less likely unless you're a high value target - and realistically, most people aren't.

this post was submitted on 20 Oct 2023
124 points (97.7% liked)
