The internet has provided unprecedented access to information, particularly through search engines. But instead of trawling through such a vast compilation of sources, most internet users perform cursory searches that often don’t take them past the first page of results. This is understandable: most people aren’t doing rigorous research regularly, instead looking for direct answers of limited depth to questions that come up throughout daily life. But in trusting whatever results a search engine pushes out first, we are assigning a degree of objectivity to a decidedly subjective platform. Search engines are influenced by societal structures, financial interests and other factors which can affect what rises to the top of a page.
The issue of subjectivity and control of information is not a new one. Before the internet, we were already putting trust in corporations and government institutions—whether news media companies, book publishers or libraries—to give us accurate facts. But even if the facts were accurate, we were also subject to the choices of these institutions in terms of what knowledge was pushed to the fore. Which stories did a newspaper cover and which did they ignore? Which titles were placed at the front of a bookstore? In this manner, the status quo could be reinforced by centering information that supported the dominant narrative on any given subject. Preexisting societal structures dictated that the people in control of the flow of information skewed white, male and wealthy. The social status of these arbiters of truth meant that even absent malicious intent, information channels were passively infected with the biases of those in control.
Our current reliance on search engines presents the same issue in a new context. How can we trust results that are filtered by a huge corporation with interest in profits, social hierarchy and the stability of the status quo? How can results be unbiased when they reflect a systemically prejudiced society? Safiya Noble writes in Algorithms of Oppression:
“Part of the challenge of understanding algorithmic oppression is to understand that mathematical formulations to drive automated decisions are made by human beings. While we often think of terms such as ‘big data’ and ‘algorithms’ as being benign, neutral, or objective, they are anything but. The people who make these decisions hold all types of values, many of which openly promote racism, sexism, and false notions of meritocracy, which is well documented in studies of Silicon Valley and other tech corridors” (1).
These biases have an impact not only on searches surrounding race, as Noble goes on to demonstrate in her book, but on the representation of other politically loaded topics. In order to test the representation of environmental issues, I analyzed the results generated around the subject of the Great Barrier Reef, the huge coral reef off the coast of Australia. I specifically searched the phrase “Why is the Great Barrier Reef” on three different search engines (Google, Microsoft Bing and Yahoo Search) and analyzed both the autocompleted search suggestions and the first page of results.
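I gathered these suggestions by hand, but the collection step could also be sketched in code. The snippet below queries Google's unofficial autocomplete endpoint—an assumption on my part, since that endpoint is undocumented and could change or be rate-limited at any time, and Bing and Yahoo would each need their own; treat it as an illustration of the method, not a supported API.

```python
import json
import urllib.parse
import urllib.request

# Unofficial Google autocomplete endpoint (assumption: undocumented,
# may change or be rate-limited at any time).
SUGGEST_BASE = "https://suggestqueries.google.com/complete/search"

def build_suggest_url(phrase: str) -> str:
    """Build the suggest URL for a partial search phrase."""
    params = urllib.parse.urlencode({"client": "firefox", "q": phrase})
    return f"{SUGGEST_BASE}?{params}"

def fetch_suggestions(phrase: str) -> list[str]:
    """Fetch autocomplete suggestions for a phrase.

    The response is a JSON array of the form [phrase, [suggestion, ...]],
    so the second element holds the suggested completions.
    """
    with urllib.request.urlopen(build_suggest_url(phrase)) as resp:
        payload = json.load(resp)
    return payload[1]

# Example (requires network access):
# print(fetch_suggestions("Why is the Great Barrier Reef"))
```

A researcher repeating this study would run the same phrase through each engine's equivalent endpoint, ideally from a logged-out session, since personalization can change what appears.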
All three engines suggested I complete the phrase “Why is the Great Barrier Reef” with the words “famous” and “important.” While one could argue that this indicates people looking to either support or interrogate the reef’s iconic status, I personally don’t think there’s much more to this suggestion than curiosity. Of course this is all speculative, but I don’t see a clear moral judgment attached to these terms, merely interest in why the Great Barrier Reef is the only well-known coral reef in the world. Analogously, if I search “Why is Marcel Marceau important?” I am likely not questioning his importance, but trying to find out why I know his name but not the names of any other mimes.
All three engines also suggested I finish the search phrase with “in danger” or “dying.” These seem to be much more loaded terms to me. The Great Barrier Reef has experienced substantial damage from humans since the Industrial Revolution, so neither of these results is factually incorrect. But I find the connotation of “in danger” very different from “dying.” The latter implies a sort of hopeless inevitability, whereas the former feels more like a call to action: if it’s in danger we can protect it, but if it’s dying there’s not much we can do. I would expect corporations like Google or Microsoft to prefer the apathy induced by the idea that the planet is doomed, since activism that challenges the current state of affairs could hurt their bottom line. But these corporations include both the apathetic and activist suggestions in their auto-complete, perhaps because the Great Barrier Reef is a small and isolated enough issue (as opposed to, say, global fossil fuel consumption) that action to protect it doesn’t pose a broad threat to corporate interests.
Other results reflected the damage that’s been done to the Great Barrier Reef. Both Google and Yahoo included the suggestion “bleaching,” and both Bing and Yahoo included “endangered” and “under threat.” Yahoo also suggested “shrinking.” In total, Google had three suggested terms directly saying that the reef is at risk, Bing had four and Yahoo had six. However, I don’t know that this disparity indicates some grand environmentalist mission from Yahoo or some sinister plot by Google to cover up environmental damage, and I feel it would be absurdly speculative to draw either conclusion.
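That tally can be reproduced mechanically. The sketch below counts, per engine, how many suggestions fall into the “at risk” category; the suggestion lists are transcribed from my observations in this essay, and the categorization of terms is my own judgment, not anything the engines report.

```python
# Suggestions transcribed from my search results; "in danger" and
# "dying" appeared on all three engines.
AT_RISK_TERMS = {"in danger", "dying", "bleaching", "endangered",
                 "under threat", "shrinking"}

suggestions = {
    "Google": ["famous", "important", "in danger", "dying", "bleaching",
               "recovering", "a wonder of the world",
               "important to humans", "a natural wonder"],
    "Bing":   ["famous", "important", "in danger", "dying", "endangered",
               "under threat", "protected", "so popular", "significant",
               "special"],
    "Yahoo":  ["famous", "important", "in danger", "dying", "bleaching",
               "endangered", "under threat", "shrinking",
               "a wonder of the world", "so special"],
}

def count_at_risk(terms: list[str]) -> int:
    """Count suggestions that directly say the reef is at risk."""
    return sum(1 for t in terms if t in AT_RISK_TERMS)

for engine, terms in suggestions.items():
    print(engine, count_at_risk(terms))
# prints:
# Google 3
# Bing 4
# Yahoo 6
```

Making the categorization explicit like this also exposes how much interpretation is involved: a different reader might file “recovering” or “protected” under risk as well, and the counts would shift accordingly.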
Google suggested I complete my search with the word “recovering,” and Bing suggested I use “protected.” Both of these terms hint at efforts to conserve the Great Barrier Reef and halt or reverse the damage done to it. The implication that things are getting better evokes in me a trust that things will sort themselves out, which means I don’t have to take action. Thus, these suggestions imply that the issue is out of the searcher’s hands, just for the opposite reason from the suggestion that the reef is “dying.”
Both Google and Yahoo suggested I ask why the Great Barrier Reef is “a wonder of the world.” Google also included “important to humans” and “a natural wonder,” Bing included “so popular,” “significant,” and “special,” and Yahoo included “so special.” These results could either be taken as genuine curiosity or as someone questioning why they should care about the reef, so whether the reef is being represented as important or not is again pure speculation.
All in all, the autocomplete suggestions represent the Great Barrier Reef as a grand natural feature that is at risk but which people are trying to protect and restore. The first page of results on Google for “Why is the Great Barrier Reef” reiterates this representation. The first two results are information pages from UNESCO and Wikipedia which immediately make clear the sheer size of the reef. Then come three video links, two of which are simply explorations of the reef and one that is about efforts to revive the reef. The remainder of the initial results are informational pages, but two of them are primarily about why the reef is important, which supports the representation of the reef as something to be protected. One last result includes the monetary value of the Great Barrier Reef, which supports its protection by depicting it as an asset that provides jobs and boosts the economy, a new representation of why the reef is important.
Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press, 2018.