EARLY DRAFT
TK: note the various supposed sources of failures (searcher, search literacy, search engine (policy, ranking, indexing, advertising), missing content (lack of funding, lack of expectation of being found (connected w/ undone science)))
Journal: New Media & Society
Year: 2022
DOI: 10.1177/14614448221136505
Keywords: Complaint, Google, repair, search engines, sociotechnical imaginaries
In early 2017, a journalist and search engine expert wrote about “Google’s biggest ever search quality crisis.” Months later, Google hired him as the first Google “Search Liaison” (GSL). By October 2021, when someone posted to Twitter a screenshot of misleading Google Search results for “had a seizure now what,” users tagged the Twitter account of the GSL in reply. The GSL frequently interacts publicly with people who complain about Google Search on Twitter. This article asks: what functions does the GSL serve for Google? We code and analyze 6 months of GSL responses to complaints on Twitter. We find that the GSL serves three functions: (1) naturalizing the logic undergirding Google Search by defending how it works, (2) performing repair in responses to complaints, and (3) drawing boundaries to control critique. This advances our understanding of how dominant technology companies respond to critiques and resist counter-imaginaries.
TK: link to topical (etc.) pages for types of complaints (sexist, racist, climate, politics, advertising, autocomplete, policies, etc.); see also search audits
Publisher: NYU Press
Year: 2018
My goal in this book is to further an exploration into some of these digital sense-making processes and how they have come to be so fundamental to the classification and organization of information and at what cost. As a result, this book is largely concerned with examining the commercial co-optation of Black identities, experiences, and communities in the largest and most powerful technology companies to date, namely, Google. I closely read a few distinct cases of algorithmic oppression for the depth of their social meaning to raise a public discussion of the broader implications of how privately managed, black-boxed information-sorting tools have become essential to many data-driven decisions. I want us to have broader public conversations about the implications of the artificial intelligentsia for people who are already systematically marginalized and oppressed. I will also provide evidence and argue, ultimately, that large technology monopolies such as Google need to be broken up and regulated, because their consolidated power and cultural influence make competition largely impossible. This monopoly in the information sector is a threat to democracy, as is currently coming to the fore as we make sense of information flows through digital media such as Google and Facebook in the wake of the 2016 United States presidential election.
Journal: The Georgetown Law Technology Review
Volume: 2 | Pages: 557–584
Year: 2018
Search engines no longer merely shape public understanding of and access to the content of the World Wide Web: they shape public understanding of the world. Search engine results produced by secret, corporate-curated “search scripts” of algorithmic and human activity influence societies’ understanding of history and current events. Society’s growing reliance on online platforms for information about current and historical events raises the stakes of search engines’ content moderation practices for information providers, seekers, and society. Public controversies over the results returned by search engines to politically and morally charged queries evidence the growing importance, and politics, of corporations’ content moderation activities.
Despite public concern with the political and moral impact of search engine results, search engine providers have resisted requests to alter their content moderation practices, responding instead with explanations, directions, and assistance that place responsibility for altering search results on information providers and seekers.
This essay explores a public controversy around the results Google’s search engine returned to the query “did the holocaust happen” in order to understand how different imaginaries of the script of search contribute to the production of problematic results and shape perceptions of how to allocate responsibility for fixing it.
Note: TK indicates something “to come”.
Brereton, D. (2022). Google search is dying. DKB. https://dkb.io/post/google-search-is-dying [brereton2022google]
Cadwalladr, C. (2016a). Google, democracy and the truth about internet search. The Guardian. https://www.theguardian.com/technology/2016/dec/04/google-democracy-truth-internet-search-facebook [cadwalladr2016google]
Cadwalladr, C. (2016b). Google is not “just” a platform. It frames, shapes and distorts how we see the world. The Guardian. https://www.theguardian.com/commentisfree/2016/dec/11/google-frames-shapes-and-distorts-how-we-see-world [cadwalladr2016googleb]
Carr, N. (2008). Is Google making us stupid? The Atlantic. https://www.theatlantic.com/magazine/archive/2008/07/is-google-making-us-stupid/306868/ [carr2008google]
Carr, N. (2010). The shallows: What the internet is doing to our brains. W. W. Norton & Company. https://www.nicholascarr.com/?page_id=16 [carr2010shallows]
Chayka, K. (2022). What Google search isn’t showing you. The New Yorker. https://www.newyorker.com/culture/infinite-scroll/what-google-search-isnt-showing-you [chayka2022google]
Griffin, D., & Lurie, E. (2022). Search quality complaints and imaginary repair: Control in articulations of Google Search. New Media & Society, 0(0), 14614448221136505. https://doi.org/10.1177/14614448221136505 [griffin2022search]
Lewis, A. C. (2023). The people who ruined the internet. The Verge. https://www.theverge.com/features/23931789/seo-search-engine-optimization-experts-google-results [lewis2023ruined]
Mulligan, D. K., & Griffin, D. (2018). Rescripting search to respect the right to truth. The Georgetown Law Technology Review, 2(2), 557–584. https://georgetownlawtechreview.org/rescripting-search-to-respect-the-right-to-truth/GLTR-07-2018/ [mulligan2018rescripting]
Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press. https://nyupress.org/9781479837243/algorithms-of-oppression/ [noble2018algorithms]
Sparrow, B., Liu, J., & Wegner, D. M. (2011). Google effects on memory: Cognitive consequences of having information at our fingertips. Science, 333(6043), 776–778. https://doi.org/10.1126/science.1207745 [sparrow2011google]
Sullivan, D. (2017). A deep look at Google’s biggest-ever search quality crisis. Search Engine Land. https://searchengineland.com/google-search-quality-crisis-272174 [sullivan2017deep]