The UK's data protection regulator has announced its intention to fine Clearview AI, the controversial facial recognition company, £17 million (approximately $23 million).
If you've read any of our previous articles about the company, you'll know that Clearview AI essentially positions itself as a giant social network contact lookup service, even though no one in its huge facial recognition database ever signed up to "belong to" the "service".
In short, the company crawls the internet looking for facial images from what it describes as "only public sources, including news media, mugshot websites, public social media and other open sources."
The company claims to have a database of more than 10 billion facial images, and positions itself as a friend of law enforcement, able to search for matches against mugshots and crime scene footage to help track down alleged offenders who might otherwise never be found.
That's the theory, anyway: find criminals who would otherwise escape identification and justice.
Of course, in practice, any photos of you that have ever appeared on a social media site such as Facebook could end up being used to "identify" you as a suspect or other person of interest in a criminal investigation.
Importantly, this "identification" not only happens without your consent, but also without you knowing that the system has claimed a connection between you and criminal activity.
Any expectations you had about how your likeness would be used, and whose permission would be needed, when you uploaded it to the service concerned (if you even knew it had been uploaded at all) are simply ignored.
Understandably, this attitude has provoked a significant privacy backlash, including from big social media brands such as Facebook, Twitter, YouTube and Google.
You can’t do that!
At the start of 2020, those behemoths firmly told Clearview AI, "Stop scraping image data from our services."
You don't have to like any of those companies, or their own data-handling terms and conditions, to sympathise with their position.
Uploaded images, however publicly they may be displayed, don't suddenly stop being personal information just because they've been published, and the terms and conditions that apply to their ongoing use don't magically evaporate the moment they appear online.
Clearview doesn't seem to see it that way, with its confident and unapologetic founder Hoan Ton-That claiming that:
There is […] a First Amendment right to public information. So the way we have built our system is to only take publicly available information and index it that way.
The other side of the coin, as a commenter on the CBS video from which the quote above is taken pointed out, is the observation that:
You are so focused on whether you can, you don’t stop to think about whether you should.
In the 22 months since that video aired, Clearview AI has clearly kept scraping internet images in earnest, given that it claimed to have processed 3 billion images at the time, but now claims more than 10 billion images in its database.
That's despite the obvious public opposition implied by the lawsuits filed against it, including a class action in Illinois, which has some of the strictest biometric data processing regulations in the United States, and a suit brought by the American Civil Liberties Union (ACLU) and four community organizations.
UK and Australia join the fray
Claiming First Amendment protection is an interesting gambit in the United States, but it's meaningless in other jurisdictions such as the UK and Australia, whose constitutions are entirely different from America's (and, in the case of the UK, whose constitutional arrangements are entirely different).
Those two countries decided to pool their resources and conduct a joint investigation into Clearview, and the privacy regulators of both have recently published reports on their findings and set out their conclusions in their own terms.
The Office of the Australian Information Commissioner (OAIC) decided that Clearview had "interfered with the privacy of Australian individuals" because the company:
- collected sensitive information without consent;
- collected information by unlawful or unfair means;
- failed to notify individuals that their data was being collected; and
- failed to ensure that the information it held was accurate and up to date.
Their counterparts at the UK's ICO (Information Commissioner's Office) came to similar conclusions, including that Clearview:
- had no lawful reason for collecting the information in the first place;
- did not process the information in a way that people would be likely to expect;
- had no process in place to stop the data being retained indefinitely;
- did not meet the "higher data protection standards" required for biometric data;
- did not tell anyone what was happening to their data.
Loosely speaking, both the OAIC and the ICO concluded unequivocally that personal privacy rights outweigh any "fair use" or "free speech" considerations, and both regulators explicitly denounced Clearview's data collection as unlawful.
The ICO has now decided what it actually intends to do about the situation, as well as what it thinks of Clearview's business model.
The proposed interventions include: the £17 million ($23 million) fine mentioned above; a requirement that Clearview no longer acquire data on UK residents; and a notice to delete all data on UK residents that Clearview already holds.
The Australians don't appear to have proposed a financial penalty, but likewise required Clearview: not to scrape Australian data in future; to delete all data it has already collected from Australians; and to confirm in writing within 90 days that it has done both of those things.
What’s next?
Clearview CEO Hoan Ton-That has reportedly reacted to these clearly unfavourable findings with an opening remark that would not be out of place in a tragic love song:
It breaks my heart that Clearview AI has been unable to assist when receiving urgent requests from UK law enforcement agencies seeking to use this technology to investigate cases of severe sexual abuse of children in the UK.
However, Clearview AI may well find that many of its opponents reply with song lyrics of their own:
Cry me a river. (Don't act like you don't know.)
What do you think?
Is Clearview AI providing a genuinely useful and acceptable service to law enforcement, or simply taking the proverbial? (Let us know in the comments. You may remain anonymous.)