Google Changes Image Search Yet Again

February 16, 2022

Google's 'CEO' image search gender bias hasn't actually been fixed

Two side by side screenshots of Google Image search results. One (a search for "CEO") shows at least four images that have women in them and the other (a search for "CEO United States") shows one.

Image search results in Google still reflect gender bias. A search for an occupation, such as "CEO," yielded results with a ratio of cis-male and cis-female presenting people that matches the current statistics. But when UW researchers added another search term — for example, "CEO United States" — the image search returned fewer photos of cis-female presenting people. University of Washington

We use Google's image search to help us understand the world around us. For instance, a search about a certain profession, "truck driver" for example, should yield images that show us a representative sampling of people who drive trucks for a living.

But in 2015, University of Washington researchers found that when searching for a variety of occupations — including "CEO" — women were significantly underrepresented in the image results, and that these results can change searchers' worldviews. Since then, Google has claimed to have fixed this issue.

A different UW team recently investigated whether the issue had really been fixed. The researchers showed that for four major search engines from around the world, including Google, this bias is only partially fixed, according to a paper presented in February at the AAAI Conference on Artificial Intelligence. A search for an occupation, such as "CEO," yielded results with a ratio of cis-male and cis-female presenting people that matches the current statistics. But when the team added another search term — for example, "CEO + United States" — the image search returned fewer photos of cis-female presenting people. In the paper, the researchers propose three potential solutions to this issue.

"My lab has been working on the issue of bias in search results for a while, and we wondered if this CEO image search bias had only been fixed on the surface," said senior author Chirag Shah, a UW associate professor in the Information School. "We wanted to be able to show that this is a problem that can be systematically fixed for all search terms, instead of something that has to be fixed with this kind of 'whack-a-mole' approach, one problem at a time."

The team investigated image search results for Google as well as for China's search engine Baidu, South Korea's Naver and Russia's Yandex. The researchers did an image search for 10 common occupations — including CEO, biologist, computer programmer and nurse — both with and without an additional search term, such as "United States."

"This is a common approach to studying machine learning systems," said lead author Yunhe Feng, a UW postdoctoral fellow in the iSchool. "Similar to how people do crash tests on cars to make sure they are safe, privacy and security researchers try to challenge computer systems to see how well they hold up. Here, we just changed the search term slightly. We didn't expect to see such different outputs."

For each search, the team collected the top 200 images and then used a combination of volunteers and gender detection AI software to identify each face as cis-male or cis-female presenting.

One limitation of this study is that it assumes that gender is a binary, the researchers acknowledged. But that allowed them to compare their findings to data from the U.S. Bureau of Labor Statistics for each occupation.
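As a concrete illustration, the comparison described above can be sketched as a simple calculation: the share of cis-female presenting faces among the top-ranked results versus the occupation's real-world share from labor statistics. The function and numbers below are hypothetical stand-ins, not the paper's actual metric or data.

```python
def bias_ratio(labels, real_female_share, k=200):
    """Hypothetical sketch: compare the fraction of cis-female presenting
    faces in the top-k ranked results against the occupation's share of
    women reported by labor statistics.

    labels: list of 'female'/'male' tags for ranked results, top first.
    real_female_share: e.g. the BLS fraction of women in the occupation.
    """
    top = labels[:k]
    search_share = sum(1 for g in top if g == "female") / len(top)
    # Positive value: women are underrepresented relative to the statistics.
    return real_female_share - search_share


# Illustrative example: 60 of the top 200 images are tagged cis-female
# presenting, while (assumed) labor statistics put the occupation at 45%.
labels = ["female"] * 60 + ["male"] * 140
print(round(bias_ratio(labels, 0.45), 2))  # 0.15
```

Under this framing, scrolling further down the results corresponds to increasing `k`, which is how one can ask whether diversity improves past the first page.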

The researchers were especially curious about how the gender bias ratio changed depending on how many images they looked at.

"We know that people spend most of their time on the first page of the search results because they want to find an answer very quickly," Feng said. "But perhaps if people scrolled past the first page of search results, they would start to see more diversity in the images."

When the team added "+ United States" to the Google image searches, some occupations had larger gender bias ratios than others. Looking at more images sometimes resolved these biases, but not always.

While the other search engines showed differences for specific occupations, overall the trend remained: The addition of another search term changed the gender ratio.

"This is not just a Google problem," Shah said. "I don't want to make it sound like we are playing some kind of favoritism toward other search engines. Baidu, Naver and Yandex are all from different countries with different cultures. This problem seems to be rampant. This is a problem for all of them."

The team designed three algorithms to systematically address the issue. The first randomly shuffles the results.

"This one tries to shake things up to keep it from being so homogeneous at the top," Shah said.

The other two algorithms add more strategy to the image-shuffling. One includes the image's "relevance score," which search engines assign based on how relevant a result is to the search query. The other requires the search engine to know the statistics bureau data, and then the algorithm shuffles the search results so that the top-ranked images follow the real trend.
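The three strategies described above could be sketched roughly as follows, assuming each search result is a simple (gender, relevance) tuple. These are illustrative stand-ins written for this article, not the paper's actual algorithms.

```python
import random

def shuffle_random(results):
    """Strategy 1: randomly shuffle so the top of the list
    isn't homogeneous."""
    out = results[:]
    random.shuffle(out)
    return out

def shuffle_by_relevance(results):
    """Strategy 2 (assumed variant): shuffle only among results with
    similar relevance scores, so diversity improves without demoting
    clearly more relevant images."""
    buckets = {}
    for r in results:
        # Group results whose relevance falls in the same 0.1-wide bucket.
        buckets.setdefault(round(r[1], 1), []).append(r)
    reranked = []
    for score in sorted(buckets, reverse=True):
        random.shuffle(buckets[score])
        reranked.extend(buckets[score])
    return reranked

def shuffle_to_match_statistics(results, female_share):
    """Strategy 3: interleave genders so any top-k prefix roughly
    matches the real-world share from the statistics bureau."""
    females = [r for r in results if r[0] == "female"]
    males = [r for r in results if r[0] == "male"]
    out, quota = [], 0.0
    while females or males:
        quota += female_share
        if females and (quota >= 1 or not males):
            out.append(females.pop(0))
            quota -= 1
        elif males:
            out.append(males.pop(0))
    return out
```

For example, with `female_share=0.5` the third sketch alternates cis-male and cis-female presenting images down the ranking, so even a short top-k prefix reflects the target ratio.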

The researchers tested their algorithms on the image datasets collected from the Google, Baidu, Naver and Yandex searches. For occupations with a large bias ratio — for example, "biologist + United States" or "CEO + United States" — all three algorithms were successful in reducing gender bias in the search results. But for occupations with a smaller bias ratio — for example, "truck driver + United States" — only the algorithm with knowledge of the actual statistics was able to reduce the bias.

Although the team's algorithms can systematically reduce bias across a variety of occupations, the real goal will be to see these types of reductions show up in searches on Google, Baidu, Naver and Yandex.

"We can explain why and how our algorithms work," Feng said. "But the AI model behind the search engines is a black box. It may not be the goal of these search engines to present information fairly. They may be more interested in getting their users to engage with the search results."

For more data, contact Shah at chirags@uw.edu and Feng at yunhe@uw.edu.

Tag(s): Chirag Shah • Information School • Yunhe Feng



Source: https://www.washington.edu/news/2022/02/16/googles-ceo-image-search-gender-bias-hasnt-really-been-fixed/
