Clearview AI Is So Broke It’s Now Offering Lawsuit Plaintiffs A Cut Of Its Extremely Dubious Future Fortunes

DATE POSTED: June 17, 2024

Clearview was probably at its healthiest when it was still flying under the radar. It courted billionaires with a new facial recognition tech plaything — one capable of searching millions (and, ultimately, billions) of images for a match for any uploaded photo. The dirty secret? All of the images and data had been scraped from the web without the consent of any of its millions of “participants.”

Then it decided there just weren’t enough billionaires to go around. To get big in the surveillance world, a company needed to start courting governments. Clearview started pitching its tech to law enforcement agencies. The problem with doing that is that it creates the sort of paper trail public records requesters can obtain.

That was the beginning of the end. Kashmir Hill’s exposé of the company for the New York Times clued in millions to its existence, its web-scraping tactics, and its unseemly marketing efforts. Lawsuits followed in quick succession. So did orders from foreign governments forbidding Clearview from doing business on their turf. And those orders came with hefty price tags attached for multiple violations of local privacy laws.

But it wasn’t just Europe trying to collect cash from Clearview. As the company continued to tout the billions of images at its disposal, lawsuits filed in the US proved successful. One of those lawsuits alleged violations of Illinois privacy laws. And it’s this class action lawsuit that’s finally paying off, although it seems unlikely the proposed payoff will result in anything of value for the class action suit’s plaintiffs. Here’s Kashmir Hill again with the latest for the New York Times:

Anyone in the United States who has a photo of himself or herself posted publicly online — so almost everybody — could be considered a member of the class. The settlement would collectively give the members a 23 percent stake in Clearview AI, which is valued at $225 million, according to court filings. (Twenty-three percent of the company’s current value would be about $52 million.)

If the company goes public or is acquired, those who had submitted a claim form would get a cut of the proceeds. Alternatively, the class could sell its stake. Or the class could opt, after two years, to collect 17 percent of Clearview’s revenue, which it would be required to set aside.

Well, that’s all well and good if you’re hoping Clearview’s fortunes improve and that it finds plenty of public and private customers to sell access to its database of billions of scraped images. But if you’re that sort of person, you’re probably not engaged in litigation with the company. If you’re a plaintiff hoping the company will find itself pariah-ed out of the market, holding a 23 percent stake in a failing company isn’t going to get you much more than a rather unenjoyable form of schadenfreude.

Clearview’s lawyer, Jim Thompson, told the New York Times the company was “pleased” with this agreement, which is perhaps all you need to know about the company’s view of its future prospects. If it expected to be making headway towards being a billion-dollar company in the next few years, it might not have been so willing to give nearly a quarter of that away in a single lawsuit settlement. But if it sees years of financial struggle ahead, handing a peculiar I.O.U. to class action plaintiffs is a pretty good way to keep some cash in the coffers and halt the financial bleeding that is always part of protracted litigation.

Clearview may have lost the war, but it seems to have won this particular battle. It won’t have to pay anything now. And, given the numerous issues it still faces in other countries, it likely won’t have any cash lying around to pay anything in the future either.