“Examining bias perpetuation in academic search engines”

    Kacperski et al. (2023) | kacperski2023examining
    Examining bias perpetuation in academic search engines: an algorithm audit of Google and Semantic Scholar
    Authors: Celina Kacperski, Mona Bielig, Mykola Makhortykh, Maryna Sydorova, and Roberto Ulloa

    Year: 2023
    DOI: 10.48550/arXiv.2311.09969

    Keywords: academic search engines, confirmation bias, algorithmic audit, search bias


    Abstract
    Researchers rely on academic web search engines to find scientific sources, but search engine mechanisms may selectively present content that aligns with biases embedded in the queries. This study examines whether confirmation-biased queries prompted into Google Scholar and Semantic Scholar will yield skewed results. Six queries (topics across health and technology domains such as “vaccines” or “internet use”) were analyzed for disparities in search results. We confirm that biased queries (targeting “benefits” or “risks”) affect search results in line with the bias, with technology-related queries displaying more significant disparities. Overall, Semantic Scholar exhibited fewer disparities than Google Scholar. Topics rated as more polarizing did not consistently show more skewed results. Academic search results that perpetuate confirmation bias have strong implications for both researchers and citizens searching for evidence. More research is needed to explore how scientific inquiry and academic search engines interact.
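
    Illustrative sketch
    The abstract describes an algorithm-audit design: pairs of confirmation-biased queries ("benefits of X" vs. "risks of X") are issued to academic search engines and the returned results are compared for disparities. The Python sketch below is not the authors' pipeline; it only illustrates the general idea under stated assumptions, using the public Semantic Scholar Graph API paper-search endpoint and a crude, made-up keyword lexicon as a stand-in for whatever result-coding the study actually used. Topic list, lexicon, and scoring are illustrative assumptions.

        # Illustrative sketch of a confirmation-bias query audit (NOT the authors' pipeline).
        # Pairs "benefits of X" / "risks of X" queries, fetches results from the public
        # Semantic Scholar Graph API, and compares a crude keyword-based slant score.
        import requests

        SEARCH_URL = "https://api.semanticscholar.org/graph/v1/paper/search"
        TOPICS = ["vaccines", "internet use"]  # two of the six topics named in the abstract
        # Assumed lexicon, for illustration only:
        BENEFIT_TERMS = {"benefit", "benefits", "positive", "improve", "protective"}
        RISK_TERMS = {"risk", "risks", "negative", "harm", "adverse"}

        def search(query: str, limit: int = 20) -> list[dict]:
            """Fetch top results (title + abstract) for a query from Semantic Scholar."""
            resp = requests.get(
                SEARCH_URL,
                params={"query": query, "limit": limit, "fields": "title,abstract"},
                timeout=30,
            )
            resp.raise_for_status()
            return resp.json().get("data", [])

        def slant_score(text: str) -> int:
            """+1 per benefit-lexicon hit, -1 per risk-lexicon hit (crude placeholder)."""
            words = text.lower().split()
            return sum(w in BENEFIT_TERMS for w in words) - sum(w in RISK_TERMS for w in words)

        def audit_topic(topic: str) -> dict:
            """Compare mean slant of results for the 'benefits' vs. 'risks' framing of one topic."""
            scores = {}
            for framing in ("benefits", "risks"):
                results = search(f"{framing} of {topic}")
                texts = [(r.get("title") or "") + " " + (r.get("abstract") or "") for r in results]
                scores[framing] = sum(slant_score(t) for t in texts) / max(len(texts), 1)
            scores["disparity"] = scores["benefits"] - scores["risks"]
            return scores

        if __name__ == "__main__":
            for topic in TOPICS:
                print(topic, audit_topic(topic))

    In this toy setup, a large positive "disparity" value for a topic would mean the "benefits" framing surfaces noticeably more benefit-slanted results than the "risks" framing, which is the kind of query-aligned skew the study measures with its own, more rigorous instruments.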

    References

    Kacperski, C., Bielig, M., Makhortykh, M., Sydorova, M., & Ulloa, R. (2023). Examining bias perpetuation in academic search engines: An algorithm audit of Google and Semantic Scholar. http://arxiv.org/abs/2311.09969 [kacperski2023examining]