Facial Recognition Search Engine Pulls Up “Potentially Explicit” Photos of Kids

Abusive parents searching for kids who have fled to shelters. Governments targeting the sons and daughters of political dissidents. Pedophiles stalking the victims they encounter in illicit child sexual abuse material.

The online facial recognition search engine PimEyes allows anyone to search for images of children scraped from across the internet, opening the door to a host of alarming uses, an Intercept investigation has found.

PimEyes, often called the Google of facial recognition, returns search results that include images the site labels as “potentially explicit,” which could lead to further exploitation of children at a time when the dark web has sparked an explosion of abuse imagery.

“There are privacy issues raised by the use of facial recognition technology writ large,” said Jeramie Scott, director of the Surveillance Oversight Project at the Electronic Privacy Information Center. “But it’s particularly dangerous when we’re talking about children, when someone may use that to identify a child and to track them down.”

Over the past few years, several child victim advocacy groups have pushed for police use of surveillance technologies to fight trafficking, arguing that facial recognition can help authorities locate victims. One child abuse prevention nonprofit, Ashton Kutcher and Demi Moore’s Thorn, has even developed its own facial recognition tool. But searches on PimEyes for 30 AI-generated children’s faces yielded dozens of pages of results, showing how easily those same tools can be turned against the people they’re designed to help.

While The Intercept searched for fake faces due to privacy concerns, the results contained many images of actual children pulled from a wide range of sources, including charity groups and educational sites. PimEyes previously came under fire for including photos scraped from major social media platforms. It no longer includes those images in search results. Instead, searches churn up a welter of images that feel plucked from the depths of the internet. Some come from personal websites that parents created anonymously or semi-anonymously to feature photos of their children, likely not anticipating that they could one day be pulled up by strangers taking snapshots of kids on the street.

One search for an AI-generated child turned up images of a real boy in Delaware, where a photographer had taken portraits of his family on a sunny spring day. When she posted the portraits in her online portfolio, the photographer omitted the boy’s name and other identifying details. But a determined person might theoretically be able to find such information. (The photographer did not respond to requests for comment.)

Another search turned up a girl displaying a craft project at an after-school program in Kyiv, Ukraine, in a photo taken just before the war. A second page on the same website showed the girl at home this spring; by then, Kyiv was under siege, the program had gone remote, and teachers were assigning kids craft projects to complete from their kitchen tables.

A third search turned up a photo of a 14-year-old British boy that had been featured in a video about the U.K. educational system. The commentator gave the boy’s first name and details about the school he attended.

Still another search turned up a photo of a toddler from an American home-schooling blog, where the girl’s mother had revealed her first name and, when the family was traveling, their rough whereabouts.

PimEyes is the brainchild of two Polish developers who created the site in 2017 on a whim. It reportedly passed through the hands of an anonymous owner who moved its headquarters to the Seychelles; in December 2021, the site was purchased by Georgian international relations scholar Giorgi Gobronidze, who had met its creators while lecturing in Poland.

In a wide-ranging video interview that stretched to nearly two hours, Gobronidze offered a vague and sometimes contradictory account of the site’s privacy protections.

He said that PimEyes was working to develop better safeguards for children, though he offered varying responses when asked what those might entail. “It’s a task that was given already to our technical group, and they have to bring me a solution,” he said. “I gave them several options.”

At the same time, he dismissed the argument that parents who post anonymous photos of their children have any expectation of privacy. “Parents should be more responsible,” he said. “I have never posted a photo of my child on social media or on a public website.”

“Designed for Stalkers”

On its website, PimEyes maintains that people should only use the tool to search for their own faces, claiming that the service is “not intended for the surveillance of others and is not designed for that purpose.” But the company offers subscriptions that allow people to perform dozens of unique searches a day; the least expensive package, at $29.99 a month, offers 25 daily searches. People who shell out for the premium service can set alerts for up to 500 different images or combinations of images, so that they are notified when a particular face shows up on a new site.

Gobronidze claimed that many of PimEyes’s subscribers are women and girls searching for revenge porn images of themselves, and that the site allows multiple searches so that such users can get more robust results. “With one photo, you can get one set of results, and with another photo you can get a totally different set of results, because the index combination is different on every photo,” he said. Sometimes, he added, people find new illicit images of themselves and need to set additional alerts to search for those images. He acknowledged that 500 unique alerts is a lot, though he said that, as of Thursday, 97.7 percent of PimEyes subscribers had a lighter account.

PimEyes’s previous owners marketed it as a way to pry into celebrities’ lives, the German digital rights site Netzpolitik reported in 2020. Following criticism, the company pivoted to claiming that the search engine was a privacy tool. Gobronidze said that fraught features were being overhauled under his ownership. “Previously, I can say that PimEyes was tailor-designed for stalkers, [in that] it used to crawl social media,” he said. “Once you dropped a photo, you could find the social media profiles for everyone. Now it is limited only to public searches.”

But many people clearly do not see PimEyes as an aid to privacy. The site has already been used to identify adults in a wide variety of cases, from so-called sedition hunters working to find perpetrators after the January 6 insurrection, to users of the notorious site 4chan seeking to harass women.

Nor do PimEyes’s marketing materials suggest much concern for privacy or ethics. In a version of the “people kill people” argument favored by the U.S. gun lobby, a blog post on the site blithely alludes to its many uses: “PimEyes just provides a tool, and the user is obliged to use the tool with responsibility. Everyone can buy a hammer, and everyone can either craft with this tool, or kill.”

“These things should only be instrumentalized with the clear and knowledgeable consent of users,” said Daly Barnett, a staff technologist at the Electronic Frontier Foundation. “This is just another example of the large overarching problem within technology, surveillance-built or not. There isn’t privacy built from the get-go with it, and users have to opt out of having their privacy compromised.”

“We Do Not Want to Be a Monster Machine”

Alarmingly, search results for AI-generated kids also include images that PimEyes labels as “potentially explicit.” The backgrounds in the labeled images are blurred, and since clicking through to the source URLs could contribute to the exploitation of children, The Intercept could not confirm whether they are, in fact, explicit. Gobronidze said that the labels are assigned in part based on images’ source URLs, and that often the photos are harmless. When PimEyes representatives do run across child sexual abuse images, he said, they report the material to law enforcement.
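Gobronidze did not detail the labeling mechanism beyond the role of source URLs. For illustration only, here is a speculative sketch of what URL-based labeling could look like; the function name and keyword list are invented and are not PimEyes’s actual code.

import re
from urllib.parse import urlparse

# Purely speculative: PimEyes has not published how its "potentially
# explicit" label works, only that source URLs play a part. This
# keyword pattern is invented for illustration.
FLAGGED_TERMS = re.compile(r"adult|nsfw|xxx", re.IGNORECASE)

def potentially_explicit(source_url: str) -> bool:
    """Flag a result when its source URL contains a flagged term."""
    parsed = urlparse(source_url)
    return bool(FLAGGED_TERMS.search(parsed.netloc + parsed.path))

print(potentially_explicit("https://example.com/gallery/photo1.jpg"))  # False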

But one example he gave shows how easily the site can be used to unearth abusive or illegal content. A 16-year-old girl had used her parents’ credit card to open an account, Gobronidze said. She soon found revenge porn videos that had been uploaded by an ex-boyfriend — images that likely fit the legal definition of child pornography. (He said PimEyes issued takedown notices for the websites and advised the girl to talk with authorities, her parents, and psychologists.)

Gobronidze was vague on how he might limit abuse of children on the site. Subscribing requires a credit card, PayPal, or Amazon Pay account, and users upload their IDs only when asking PimEyes to perform takedown notices on their behalf. By design, he said, the search engine only seeks matches in photos and does not guess at age, gender, race, or ethnicity. “We do not want to be a monster machine,” he said, dubbing a more heavy-handed approach “Big Brother.” But at another point in the interview, he said he was planning to exclude images of children from search results. Still later, he said that his technical team was figuring out how to balance these two conflicting goals.

PimEyes flags people who “systematically” use the engine to search for children’s faces, he said. Users who plug in one or two faces of children are typically assumed to be family members. If a PimEyes representative gets suspicious, he said, they might ask a subscriber for a document, like a birth certificate, proving that the user is a parent.

When asked how a birth certificate would rule out abuse or stalking by noncustodial parents, Gobronidze said that PimEyes might instead request a signed consent form, similar to what parents and legal guardians provide in some countries when crossing borders with a child, to show that the other parent consents. In a later email, he said that PimEyes had twice asked for “documents + verbal explanation” from people who uploaded images of children, and that the site had subsequently banned one of the accounts.

“The fact that PimEyes doesn’t have safeguards in place for children and apparently is not sure how to provide safeguards for children only underlines the risks of this kind of facial recognition service,” said Scott, of EPIC. “Participating in public, whether online or offline, should not mean subjecting yourself to privacy-invasive services like PimEyes.”

The inclusion of children’s faces in PimEyes search results underscores just how fraught the facial recognition landscape has become. For years, victim advocacy groups have pushed for expanded use of the technology by law enforcement. The Kutcher-Moore nonprofit, Thorn, has developed a facial recognition tool called Spotlight that it provides to investigators working on sex trafficking cases, as well as to the National Center for Missing and Exploited Children. In a recent report, the center said that in 2021, Spotlight helped it identify over 400 missing children in online sex trafficking advertisements.

Commercial providers of facial recognition have also gotten into trafficking prevention. The controversial facial recognition company Clearview AI sells its tools to police for identifying child victims.

But those same tools can also be used to target the vulnerable. Clearview AI promoted the use of its database for combating child trafficking after being sued by the American Civil Liberties Union for endangering survivors of domestic violence and undocumented immigrants, among others. Prostasia Foundation, a child protection group that supports sex workers’ rights and internet freedom, contends that an earlier Thorn tool sometimes flagged images of adults, leading to the arrest of sex workers.

This tension is even more extreme with PimEyes, which has virtually no guardrails and smashes long-standing expectations of privacy for both adults and children.

Gobronidze said that PimEyes had talked to Thorn about using Safer, Thorn’s image-hashing tool for detecting child sexual abuse material — a potentially odd pairing, given that PimEyes makes images of children searchable to the general public, while Thorn aims to protect children from stalkers and abusers.
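Hash-based detection of the kind Safer performs works, at base, by comparing fingerprints of files against a list of known abuse material. The sketch below is a generic, minimal illustration of that idea, not Safer’s implementation: the KNOWN_HASHES set is a hypothetical placeholder, and real systems also use perceptual hashes so that resized or re-encoded copies still match.

import hashlib
from pathlib import Path

# Hypothetical set of hex digests of known abuse images. Real hash
# lists are maintained by groups like NCMEC and are never published.
KNOWN_HASHES = {"placeholder_digest_1", "placeholder_digest_2"}

def sha256_of_file(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_match(path: Path) -> bool:
    # Exact matching only: flags byte-identical copies. Production
    # systems add perceptual hashing (e.g., PhotoDNA-style) to catch
    # altered copies; that step is omitted here.
    return sha256_of_file(path) in KNOWN_HASHES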

“There has been one exploratory call between our Safer team and PimEyes to show how Safer helps platforms detect, report and remove CSAM,” a Thorn spokesperson said, using the acronym for child sexual abuse material. “No partnership materialized after that single call and they are not users of Safer or any tools built by Thorn.”

Asked about concerns over its facial recognition tool, Thorn sent a statement through a spokesperson: “Spotlight is a highly targeted tool that was built specifically to identify child victims of sex trafficking and is only available to law enforcement officers who investigate child sex trafficking.”

In the United States, PimEyes could run up against the Children’s Online Privacy Protection Act, a 1998 law requiring the Federal Trade Commission to protect children’s online privacy. But so far, U.S. regulators have homed in on sites that store images or information, said Emma Llansó, director of the Free Expression Project at the Center for Democracy and Technology. PimEyes, by contrast, crawls images hosted on other sites. “PimEyes is just scraping whatever they can get their hands on on the web and isn’t making promises to users about what it will and won’t do with that data,” Llansó said. “So it’s something of a gray area.”

Gobronidze is keenly aware of the distinction. “We don’t store any photos,” he claimed. “We don’t have any.”

That is not entirely true. PimEyes’s privacy policy holds that for unregistered users — anyone who uses the site without a paid account — it retains facial images, along with the “fingerprint” of a face, for 48 hours and that data from the photos indexed in results is stored for two years. A sample PimEyes search showed thumbnail images of faces — photos returned in search queries that the site has edited to blur their backgrounds. A network traffic analysis showed that those photos are hosted on a PimEyes subdomain called “collectors.”
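An analysis like that can be reproduced with standard tools: save a HAR (HTTP Archive) export from the browser’s developer tools while running a search, then tally which hosts serve image responses. The sketch below assumes such an export saved as search.har (a hypothetical filename); it illustrates the general technique, not The Intercept’s exact methodology.

import json
from collections import Counter
from urllib.parse import urlparse

# "search.har" is a hypothetical filename for a HAR export captured
# in the browser's developer tools while performing a search.
with open("search.har", encoding="utf-8") as f:
    har = json.load(f)

hosts = Counter()
for entry in har["log"]["entries"]:
    mime = entry["response"]["content"].get("mimeType", "")
    if mime.startswith("image/"):
        hosts[urlparse(entry["request"]["url"]).hostname] += 1

# Hosts serving the most image responses; a thumbnail server such as
# a "collectors" subdomain would surface near the top.
for host, count in hosts.most_common(10):
    print(f"{count:5d}  {host}")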

In an email, Gobronidze said he had not previously heard or read about that subdomain and was “intrigued” to learn of it. He noted that he had forwarded the results of The Intercept’s analysis to PimEyes’s tech and data security units, adding that he could not “disclose [the] full technological cycle” because it is proprietary.

Scott, of EPIC, would rather not wait around for courts and regulators to consider the storage question. “Congress needs to act to not only protect our children, but all of us from the dangers of facial recognition technology,” he said. “Services like this should be banned. That’s how you should regulate it.”


This content originally appeared on The Intercept and was authored by Mara Hvistendahl.

