autocomplete in web search

    October 11th, 2023

    EARLY DRAFT

    Key Points

    Autocomplete can help users enter terms more quickly or identify alternative terms (with cost savings for search tools as well)

    Autocomplete can nudge users in particular research directions (for good and bad).

    Some people study autocompletions to research misinformation or as part of search engine optimization (SEO).

    Warnings
    • While it may seem so, autocompletions for a search are not the most popular searches.
    • Autocompletions, as search queries, do not reveal search intents.
    • The phrasing of autocompletions is not necessarily indicative of the content of the search results.

    Similar terms

    Google Suggest, search autocompletes, search autosuggestions, search predictions, search suggestions, suggested searches

    Posts

    Notes

    Predictions and suggestions?

    Exemption policies

    Graham (2023, p. 3):

    exemption policies ignore the fact that every suggestion is inherently political. Google Autocomplete still makes suggestions for topics that contribute to political attitudes such as: [are immigrants…], [abortion is…], and [legalising…] that can actively shape a user’s line of enquiry and influence their political decision-making. Due to how suggestions are personalised and localised it is difficult for researchers to provide a survey of such topics, let alone study the kinds of political attitudes that might be embedded in aggregate. In this regard, the ethical challenges raised by political suggestions cannot simply be addressed by Google’s current approach of preventing suggestions for the names of politicians. Instead, a wider public conversation is required about the consequences of exemption policies and what topics, if any, should be exempt

    Amplifying social bias and gaming

    Caulfield’s “Data Voids and the Google This Ploy: Kalergi Plan” (2019) engages with two questions he raises himself: “What can we do as educators? What should we encourage our students to do?”:

    A final note — for the moment, avoid auto-complete in searches unless it truly is what you were just about to type. Auto-complete often amplifies social bias and for niche terms it can be gamed in ways that send folks down the wrong path. It’s not so bad when searching for very basic how to information or the location of the nearest coffee shop, but for research on issues of social controversy or conflict it should be avoided. [emphases added]

    Note: “often amplifies social bias” in the original is a hyperlink to Safiya Noble (website | Twitter)’s “Algorithms of Oppression” (2018).

    Harms of autocomplete

    Boaz Miller and Isaac Record’s “Responsible epistemic technologies: A social-epistemological analysis of autocompleted web search” (2017) examines three harms of autocomplete and argues search engines “bear the bulk of responsibility to mitigate them.” They write:

    Autocomplete inevitably and irreparably induces changes in users’ epistemic actions, particularly their inquiry and belief formation. While these changes have potential epistemic benefits for searchers and society, they also have harms, such as generating false, biased, or skewed beliefs about individuals or members of disempowered groups.

    See also Gillespie (2016):

    Consider Google’s autocomplete function, where the site anticipates the search query you’re typing based on the first few letters or words, by comparing it to the corpus of past search queries. While the primary purpose of autocomplete is merely to relieve the user of typing the remainder of their query, the suggestions it makes are a kind of measure of popular activity and taste (at least as represented through searching on Google).
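The mechanism Gillespie describes — matching the typed prefix against a corpus of past search queries — can be illustrated with a toy sketch. This is a minimal model for illustration only, not Google’s actual system; the query log and ranking (by raw frequency) are assumptions:

```python
from collections import Counter

def build_suggester(query_log):
    """Return a function mapping a typed prefix to suggestions, ranked by
    how often each past query occurred — a toy model of 'comparing to the
    corpus of past search queries'."""
    counts = Counter(q.strip().lower() for q in query_log)

    def suggest(prefix, k=3):
        prefix = prefix.lower()
        matches = [q for q in counts if q.startswith(prefix)]
        # Most frequent first; ties broken alphabetically.
        return sorted(matches, key=lambda q: (-counts[q], q))[:k]

    return suggest

# Hypothetical query log:
log = ["cat videos", "cat food", "cat videos", "car insurance"]
suggest = build_suggester(log)
print(suggest("cat"))  # ['cat videos', 'cat food']
```

Even this toy version shows why autocompletions are “a kind of measure of popular activity”: the ranking is entirely a product of what was previously searched.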

    Context and texture

    Tripodi’s “The Propagandists’ Playbook” (2022, p. 145) suggests that it can be useful to follow traces in autocomplete.

    Studying the nuanced shifts in Google’s autocomplete suggestions can provide context, even texture, to what’s going on in the world. These automated suggestions can also provide a window into misinformation campaigns, which draw on the power of ideological dialects to game search engine optimization.

    Localization

    Some autocompletions rely more or less on localization. You can explore this by changing your apparent location: through a VPN (for example, see this demonstration from Ryan McBeth (YouTube Video)) or, in Chromium browsers, through developer tools (Documentation > Chrome DevTools > Sensors: Emulate device sensors > Override geolocation).

    Research, Reporting, and other Writing

    Reporting

    Teaching

    Google

    SEOs

    Tools

    See Ronald Robertson (website | Twitter)’s suggests, a tool for exploring Google autocompletes programmatically (Robertson et al., 2019).
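As a rough illustration of what querying autocomplete programmatically looks like, here is a sketch against Google’s unofficial suggest endpoint. Caveats: the endpoint (`suggestqueries.google.com/complete/search`) is undocumented and may change or be rate-limited, and this is not the `suggests` tool itself — Robertson et al. (2019) describe a far more careful, recursive auditing approach:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

# Unofficial, undocumented endpoint (assumption: subject to change).
SUGGEST_URL = "https://suggestqueries.google.com/complete/search"

def parse_suggestions(payload):
    """Parse the client=firefox JSON payload: [query, [suggestion, ...]]."""
    data = json.loads(payload)
    return list(data[1])

def fetch_suggestions(query, lang="en"):
    """Fetch autocomplete suggestions for a query (makes a network call)."""
    params = urlencode({"client": "firefox", "q": query, "hl": lang})
    with urlopen(f"{SUGGEST_URL}?{params}") as resp:
        return parse_suggestions(resp.read().decode("utf-8"))

# Example payload in the shape the endpoint returns:
sample = '["weather", ["weather today", "weather tomorrow"]]'
print(parse_suggestions(sample))  # ['weather today', 'weather tomorrow']
```

Note that results from such an endpoint are not personalized or localized the way an interactive browser session is, which is exactly the gap the auditing literature tries to account for.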

    This website

    Here is how this website does autocomplete.

    References

    Caulfield, M. (2019). Data voids and the Google This Ploy: Kalergi Plan. https://hapgood.us/2019/04/12/data-voids-and-the-google-this-ploy-kalergi-plan/. [caulfield2019data]

    Gillespie, T. (2016). #trendingistrending: When algorithms become culture. In Algorithmic cultures: Essays on meaning, performance and new technologies (pp. 64–87). Routledge. https://tarletongillespie.org/essays/Gillespie%20-%20trendingistrending%20PREPRINT.pdf [gillespie2016trendingistrending]

    Graham, R. (2023). The ethical dimensions of Google Autocomplete. Big Data & Society, 10(1). https://doi.org/10.1177/20539517231156518 [graham2023ethical]

    Haider, J., & Rödl, M. (2023). Google Search and the creation of ignorance: The case of the climate crisis. Big Data & Society, 10(1). https://doi.org/10.1177/20539517231158997 [haider2023google]

    Hazen, T. J., Olteanu, A., Kazai, G., Diaz, F., & Golebiewski, M. (2022). On the social and technical challenges of web search autosuggestion moderation. First Monday, 27(2). https://doi.org/10.5210/fm.v27i2.10887 [hazen2022social]

    Karapapa, S., & Borghi, M. (2015). Search engine liability for autocomplete suggestions: Personality, privacy and the power of the algorithm. Int J Law Info Tech, 23(3), 261–289. https://doi.org/10.1093/ijlit/eav009 [karapapa2015search]

    Leidinger, A., & Rogers, R. (2023). Which stereotypes are moderated and under-moderated in search engine autocompletion? Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency, 1049–1061. https://doi.org/10.1145/3593013.3594062 [leidinger2023stereotypes]

    Miller, B., & Record, I. (2017). Responsible epistemic technologies: A social-epistemological analysis of autocompleted web search. New Media & Society, 19(12), 1945–1963. https://doi.org/10.1177/1461444816644805 [miller2017responsible]

    Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press. https://nyupress.org/9781479837243/algorithms-of-oppression/ [noble2018algorithms]

    Robertson, R. E., Jiang, S., Lazer, D., & Wilson, C. (2019). Auditing autocomplete: Suggestion networks and recursive algorithm interrogation. Proceedings of the 10th ACM Conference on Web Science, 235–244. https://doi.org/10.1145/3292522.3326047 [robertson2019auditing]

    Tripodi, F. (2022). The propagandists’ playbook: How conservative elites manipulate search and threaten democracy (Hardcover, p. 288). Yale University Press. https://yalebooks.yale.edu/book/9780300248944/the-propagandists-playbook/ [tripodi2022propagandists]