Key Takeaways
- Online stock images can show racist stereotypes of minorities, observers say.
- A Lifewire review of stock images found caricatures of Jews.
- Last year, Pete Buttigieg’s presidential campaign was criticized over its use of a stock image of a Kenyan woman on a campaign webpage.
Nora Carol Photography
Some online stock images depicting women and minorities are coming under increasing scrutiny from critics who say they can perpetuate racist and misogynistic stereotypes.
Stock image sites have been accused of under-representing minorities and marginalized groups. Photos downloaded from these sites have gotten some politicians into trouble for depicting the wrong ethnicity in campaigns. And in some isolated instances, the images appear to demean the very people they are meant to represent.
“There is implicit bias in stock imagery, mostly due to the way the images are tagged and categorized,” Minal Bopaiah, Founder & Principal Consultant of Brevity & Wit, a design firm that focuses on diversity, said in an email interview. “For example, if you search for ‘attractive woman,’ most stock image databases return results that are predominantly White and of the same body size and shape. There are very few women of color who show up, and almost never images of women with any visible disabilities.”
A quick search of stock image sites found illustrations that seem biased. A review of Getty Images by Lifewire found some pictures that appear to reinforce anti-Semitic stereotypes. For example, one image shows a long-nosed man with devil wings holding a coin. The illustration is labeled “Making a deal with the devil, horned red demon flying and showing a Bitcoin Cryptocurrency to a man.”
Lifewire asked the Anti-Defamation League (ADL), an anti-bias organization, to review these images.
“The character depicted in this image, with his stereotypically large nose, dark clothing, and desire for money, may raise anti-Semitic tropes in viewers,” an ADL spokesperson said in an email interview. “There are over a dozen other images in this series in which this character is in situations that evoke similar anti-Jewish stereotypes. We do not know whether the artist intended to include these anti-Semitic implications or whether this is merely an unfortunate coincidence.”
alashi / Getty Images
A search of various terms for Jews and Judaism on Getty Images “returns some results which may give pause, including images of devils,” the ADL spokesperson added. “However, these images are also often tagged with labels such as ‘Christianity’ and ‘Religion,’ which suggests that they do not appear to be specifically targeting Jews, but rather western religions in general. Initial searches on Shutterstock and VectorStock found similarly limited results, with some images that may elicit anti-Semitic stereotypes even if not overtly anti-Semitic.”
The ADL spokesperson said the organization did not have any information that the issue of anti-Semitic images on stock photo sites is widespread, but added, “we are aware that various stock websites have at times included offensive images, some of which include anti-Semitic stereotypes, in their inventories.”
Anne Flanagan, Senior Director and Head of External Communications for Getty Images, said in an email interview that the company is “reviewing the content to make sure the images depicted are compliant” with existing content policies. She added that “Getty Images regularly reviews content to ensure that it is compliant with not only legal, but also its societal responsibilities, and we have strict policies and standards in place to govern our contributors in the submission of content and our content inspectors in the review and approval of content submitted for inclusion on the site.”
Issues involving politics and stock images have emerged around the growing racial tension in American politics. Last year, Pete Buttigieg’s presidential campaign was criticized over its use of a stock image of a Kenyan woman on a campaign web page promoting his plan to address racial inequality. Rep. Ilhan Omar, D-Minn., tweeted that the use of the Kenyan image was “not ok or necessary.”
Companies and some public figures have also been criticized for using stock images connected to the recent Black Lives Matter protests. For example, New Orleans Saints quarterback Drew Brees was criticized for using a six-year-old ‘handshake against racism’ stock image. It was part of a public apology issued after he said he will “never agree with anybody disrespecting the flag of the United States of America.”
The bias issue in stock images is common to many large data sets, experts say.
“Photographers upload images that can unconsciously reinforce social stereotypes,” Mikaela Pisani, AI company Rootstrap‘s Chief Data Scientist and Head of the company’s Machine Learning Practice Area, said in an email interview. “As users choose the same photos over and over again, the recommendation algorithms are skewed towards a social bias by surfacing ‘popular’ images.”
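The feedback loop Pisani describes can be sketched in a few lines of Python. This is a hypothetical simulation, not code from any stock-image site: the catalog split (90 images of one group, 10 of another) and the rich-get-richer ranking rule are illustrative assumptions chosen to show how repeated selection of “popular” images can compound an initial skew.

```python
import random

random.seed(0)  # fixed seed so the illustrative run is repeatable

# Hypothetical catalog skew: 90 images tagged "group_A", 10 tagged "group_B".
# These counts are invented for illustration only.
downloads = {"group_A": 90, "group_B": 10}

def recommend(counts):
    """Rich-get-richer ranking (an assumption): weight each group by the
    SQUARE of its download count, so already-popular images are surfaced
    superlinearly -- a simple stand-in for a popularity-driven recommender."""
    groups = list(counts)
    weights = [counts[g] ** 2 for g in groups]
    return random.choices(groups, weights=weights)[0]

# Each simulated user downloads whatever the recommender surfaces,
# and that download feeds back into the popularity signal.
for _ in range(1000):
    downloads[recommend(downloads)] += 1

share_b = downloads["group_B"] / sum(downloads.values())
print(f"group_B share: started at 10.0%, ended at {share_b:.1%}")
```

Under these assumptions, the minority group’s share of downloads shrinks well below its already-small share of the catalog, which is the skew-amplifying dynamic the quote points to.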
timsa / Getty Images
Default searches may contain implicit bias, experts say.
“A search for ‘man’ or ‘woman’ on iStock, for example, has a noticeable lack of Asian and South Asian people,” Pisani said. “Beyond racial stereotypes, other biases such as age should also be considered and their impacts on society as conveyed through the use of stock photography.
“The racism is not overt—there are still options, the question is how many options. When searching for ‘business people’ on Shutterstock, 15 million results surfaced. When filters of ‘Caucasian woman’ and ‘Black woman’ are applied, the discrepancy is stark: nearly 1.9 million results vs. just 25,000.”
Racism can also be subtle in some stock images, observers say. Bopaiah cites the example of “centering White people in a ‘multicultural’ image and putting people of color on the margins of the frame, and in failing to incorporate people of color in issues that concern other marginalized groups. There is a dearth of images of people of color with disabilities, which means the needs of people of color with disabilities are often ignored or erased.”
Dealing With the Issue
Education is key to combating the problem, experts say. “Stock image companies should educate their staff editors so that they are better equipped to identify and flag racist and anti-Semitic stereotypes and images,” the ADL spokesperson said. “Implicit bias and other anti-bias training for their staffs could help prevent potentially offensive images from making it into their catalogs.”
Stock photography sites have attempted to tackle bias issues by encouraging diversity. For example, Getty’s #ShowUs collection features images of purposefully diverse women who don’t conform to the ‘Instagram standard’ of women’s bodies. “It’s a step in the right direction, ensuring that users have to take fewer steps to access a wide variety of images that aren’t conforming to stereotypes,” Pisani said.
Companies should avoid using stock photos that exacerbate racial tensions, such as those that show police brutality, Wendy Melillo, an Associate Professor of Journalism in the School of Communication at American University, said in an email interview.
“From a strategic communications perspective, company officials in charge of messaging need to ask themselves ‘why am I choosing this stock photo image and what am I trying to say?’,” Melillo said. “If these officials are using the photo as a way to project some image that their company stands in solidarity against racism, they would be better off keeping their mouth shut. Such a strategy is not authentic and will only invite criticism rather than respect.”
In this fraught election year, the US appears more divided than ever. Stock images may be a small but ever-present part of the problem, as they reinforce stereotypes and biases.