A Google spokesperson said the discrepancies in results were not caused by censorship, and that content about the Tiananmen Square massacre is available through Google search in any language or regional setting. In some cases, the spokesperson said, travel imagery gains prominence when the search engine detects travel intent, which is more likely for searchers closer to Beijing or typing in Chinese. Searching for Tiananmen Square from Thailand or the United States with Google’s Chinese settings will likewise surface pristine images of the historic site.
“We localize results to your preferred region and language so that you can quickly access the most reliable information,” the spokesperson said. Users can adjust their results by changing their location and language settings.
Search Atlas’s creators also built maps and visualizations showing how search results differ around the globe. One shows how image searches for “God” yield bearded Christian imagery in Europe and the Americas, images of Buddha in some Asian countries, and the Arabic script for Allah in the Persian Gulf and northeastern Africa. A Google spokesperson said the results reflect how its translation service converts the English term “God” into words with more specific meanings in some languages, such as Allah in Arabic.
Other information boundaries the researchers charted do not map directly onto national or linguistic borders. Results for “how to deal with climate change” tend to divide island nations from continental ones. In European countries such as Germany, the most common terms in Google’s results concerned policy measures such as energy conservation and international agreements; for islands such as Mauritius and the Philippines, results were more likely to stress the seriousness and urgency of the threat posed by climate change and rising seas.
Search Atlas was presented at the Designing Interactive Systems academic conference last month; its creators are testing a private beta of the service and considering how to widen access to it.
Search Atlas cannot reveal why different versions of Google portray the world differently. The company’s lucrative ranking systems are closely guarded, and Google says little about how it tunes results based on geography, language, or personal activity.
Ye, the co-creator of Search Atlas, said that whatever the exact reasons Google shows or withholds particular results, the system has a power that is easy to overlook. “People ask search engines things they would never ask a person, and what they happen to see in Google’s results can change their lives,” Ye said. “It could be ‘how do I have an abortion?’ or a restaurant near you, or how to vote or get vaccinated.”
WIRED’s own experiments showed how people in neighboring countries can be led by Google to very different information on a contentious topic. When WIRED queried Search Atlas about the ongoing war in Ethiopia’s Tigray region, Google’s Ethiopian edition pointed to Facebook pages and blogs criticizing Western diplomatic pressure to de-escalate the conflict, suggesting that the United States and other countries are trying to weaken Ethiopia. Results for neighboring Kenya and for the US version of Google more prominently featured The New York Times.
Ochigame and Ye are not the first to point out that search engines are not neutral. Their project was partly inspired by the work of Safiya Noble, cofounder and codirector of UCLA’s Center for Critical Internet Inquiry. Her 2018 book, Algorithms of Oppression, explored how Google searches using words such as “Black” or “Hispanic” could produce results that reflected and reinforced societal biases against certain marginalized people.
Noble said the project could offer a way to explain the true nature of search engines to a wider audience. “It’s hard to see how search engines are undemocratic,” she said.