This week, NBC reported that facial recognition researchers at companies like IBM often feed their algorithms photos from publicly available collections, frequently hosted on Flickr, without requesting permission from the people photographed. The report raised the question of whether such training is permitted under Creative Commons licenses. It looks like the answer is yes: copyright policies won’t protect people who just want companies like IBM to leave their photos alone. “Copyright is not a good tool to protect individual privacy,” Creative Commons writes in a blog post.
According to the NBC report from earlier this week, IBM took nearly a million photos from Flickr to train facial recognition programs. Many Flickr photographers told NBC that they and the people photographed had not been made aware that their images were being fed to facial recognition algorithms. “None of the people I photographed had any idea their images were being used in this way,” one person told NBC. “It seems a little sketchy that IBM can use these pictures without saying anything to anybody.”
In a statement, IBM told The Verge that it takes “the privacy of individuals very seriously and have taken great care to comply with privacy principles, including limiting the Diversity in Faces dataset to publicly available image annotations and limiting the access of the dataset to verified researchers.”
Under the most permissive Creative Commons licenses, almost any use is allowed. In fact, on its frequently asked questions page, under a new section discussing AI, Creative Commons explains that its licenses “have been carefully designed to work with all new technologies where copyright comes into play. No special or explicit permission regarding new technologies from a copyright perspective is required.”
That essentially means that even if someone invented a brand-new technology tomorrow, many Creative Commons photos would still be fair game. When it comes to ethical concerns and data protection, copyright simply can’t help. People who feel their photos have been misused have little legal recourse unless lawmakers enact regulations governing facial recognition. In IBM’s case, photographers can request that their photos be removed, provided they can verify that the photos were used in the first place. Helpfully, NBC has published a tool within its story where photographers can check.