Nowadays, facial recognition is no longer confined to cops and spies in action movies. Countless smartphone users sign into their devices with the technology each day, and related software handles passport control at airports.

Facial recognition has made verifying identities and logging into our phones a lot easier. But some people are concerned that companies could use their online pictures to train the software.

If you’re a Flickr user, an online tool has emerged to help you check whether your pictures were used for facial recognition. And if so, is there anything you can do about it?

Why Would My Online Pictures Be Used for Facial Recognition Software?

To test facial recognition technology, developers need—well—faces. Often, they choose photos that have been published online to see whether or not their technology works.

Facial recognition systems need a wide variety of faces to improve; gender, skin color, age, and other characteristics all need to be represented in the training data.

As a photo-sharing website, Flickr is a clear target for companies and professionals looking for pictures. In 2019, it was widely reported that IBM used photos from Flickr for facial recognition training programs.

The company wanted to use a diverse range of faces to address bias-related issues in facial recognition. However, the people in the pictures were never asked whether they were okay with their images being used.

Related: How Facial Recognition Search Is Destroying Your Privacy

In an article published by NBC, one photographer with pictures in the database said:

“None of the people I photographed had any idea their images were being used in this way.

“It seems a little sketchy that IBM can use these pictures without saying anything to anybody.”

Problems With Facial Recognition

Racial bias has been a big talking point for facial recognition. Studies have been carried out to investigate the issue further, including one published in 2020 by Harvard University.

In the study, faces categorized as "darker female" had the lowest recognition accuracy for every provider tested.

The research also examined the history of racially biased law enforcement practices in the US and how the technology could perpetuate them.

IBM has expressed these concerns itself. In 2020, the company pulled out of the facial recognition market. At the time, CEO Arvind Krishna wrote a public letter to Congress.

The letter called for “a national dialogue on whether and how facial-recognition technology should be employed by domestic law enforcement agencies.”

IBM also said:

“IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and principles of trust and transparency.”

How Can I Check to See if My Photos Have Been Used?

If you have a Flickr account, you can use a tool called exposing.ai.

Exposing.ai searches through over three million photos on Flickr spread across six different datasets. You can use the platform to see if your pictures were used for various purposes, including:

  • Enhancing facial recognition technology
  • Training
  • Testing

To check if your photos were used for facial recognition software, go to the exposing.ai website. On the homepage, you'll see a search bar at the top.

In this search bar, you can enter any of the following:

  • Your Flickr username
  • A hashtag
  • A photo URL

[Image: the exposing.ai homepage with its search bar]

After entering your details and hitting the Search key, you'll find out within a few seconds whether your photos were used.

[Image: search results on exposing.ai]
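
Exposing.ai doesn't offer a public API as far as we know, so checking a large photo library means pasting entries into the search box one at a time. If you'd like to gather those entries programmatically first, the sketch below shows one way to list your public photo page URLs through the official Flickr API. It's a minimal sketch, not a definitive tool: the API key and NSID-style user ID are placeholders you'd replace with your own.

```python
# Minimal sketch: list your public Flickr photo page URLs so you can
# check them on exposing.ai one at a time.
# Placeholders: supply your own Flickr API key and NSID-style user ID.
import requests

API_KEY = "your-flickr-api-key"   # placeholder
USER_ID = "12345678@N00"          # placeholder NSID


def public_photo_urls(api_key, user_id, per_page=100):
    """Yield the photo page URL of every public photo on a Flickr account."""
    page = 1
    while True:
        resp = requests.get(
            "https://api.flickr.com/services/rest/",
            params={
                "method": "flickr.people.getPublicPhotos",
                "api_key": api_key,
                "user_id": user_id,
                "per_page": per_page,
                "page": page,
                "format": "json",
                "nojsoncallback": 1,
            },
            timeout=30,
        )
        resp.raise_for_status()
        data = resp.json()["photos"]
        for photo in data["photo"]:
            # Standard Flickr photo page URL: /photos/<user id>/<photo id>/
            yield f"https://www.flickr.com/photos/{user_id}/{photo['id']}/"
        if page >= int(data["pages"]):
            break
        page += 1


if __name__ == "__main__":
    for url in public_photo_urls(API_KEY, USER_ID):
        print(url)
```

You could then paste each URL into exposing.ai's search bar, or simply search once by username to cover the whole account at once.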

Is There Anything I Can Do if My Photos Were Used for Facial Recognition?

Not for the most part, no. Once your face is in a dataset that has already been distributed, you can't realistically get it removed from every copy.

Related: Is Facial Recognition Legal in Your Country?

You can try to stop the same thing from happening again, though, by requesting that your pictures not be included in future releases of the dataset.

As an example, the photos IBM used originally came from the YFCC100M dataset. Those photos were published online under Creative Commons licenses, which, barring some exceptions, allow them to be used freely.
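
If you're curious how many of your own Flickr photos are published under a Creative Commons license, and could therefore end up in a dataset like this, the Flickr API reports the license attached to each photo. The sketch below is illustrative only: it reuses the same placeholder API key and user ID as above and simply tallies your public photos by license name.

```python
# Minimal sketch: tally a Flickr account's public photos by license,
# to see how many are published under Creative Commons terms.
# Placeholders: supply your own Flickr API key and NSID-style user ID.
from collections import Counter

import requests

API_KEY = "your-flickr-api-key"   # placeholder
USER_ID = "12345678@N00"          # placeholder NSID
REST_URL = "https://api.flickr.com/services/rest/"


def call(method, **params):
    """Small helper for the Flickr REST endpoint (JSON responses)."""
    params.update(method=method, api_key=API_KEY, format="json", nojsoncallback=1)
    resp = requests.get(REST_URL, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()


# Map license IDs (e.g. "4") to names (e.g. "Attribution License").
licenses = {
    str(lic["id"]): lic["name"]
    for lic in call("flickr.photos.licenses.getInfo")["licenses"]["license"]
}

# "extras=license" adds each photo's license ID to the results.
counts = Counter()
page, pages = 1, 1
while page <= pages:
    data = call(
        "flickr.people.getPublicPhotos",
        user_id=USER_ID, extras="license", per_page=500, page=page,
    )["photos"]
    pages = int(data["pages"])
    for photo in data["photo"]:
        counts[licenses.get(str(photo["license"]), "Unknown")] += 1
    page += 1

for name, count in counts.most_common():
    print(f"{count:5d}  {name}")
```

Note that changing a photo's license later doesn't retroactively undo any dataset release that already included it; it only changes the terms for copies made from that point on.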

Has Your Face Been Used for Facial Recognition?

Many people don't want their faces plastered across social media pages, let alone used for research. While facial recognition software needs real faces to improve, it's also understandable that many users won't want their images distributed without consent.

While there's not much you can do if your face has already ended up in a dataset, exposing.ai at least sheds light on how companies source images for their artificial intelligence projects.

For the most part, all you can do right now is read up on image-sharing rights and licenses before publishing on any platform. Beyond that, you'll have to wait for regulatory changes to stop similar things from happening again.