Hire me.

    June 20th, 2023

    The pitch below is out of date; please see tooling to support people in making hands-on and open evaluations of search for a write-up on the work I’m looking to do.

    See my resume here. See my CV here. See more about me here.


    Below is a pitch for a user research scientist position in industry. This position would focus on applying expertise from my academic training and research to building systems that nudge the search landscape away from private rent-seeking and toward supporting search rights and a flourishing of curiosity and new questions.


    If you are interested in hiring me for an academic research position, see this page on my research. Closely related to the pitch below, my goal in research is to study how people imagine, use, and make web search so that we might be better able to advocate for the appropriate role and shape of search in our work, lives, and society.

    Many people are now seeing that search is much more than ten blue links and much more than one company. We have a real chance to change search for the better. Will we?

    I have a Ph.D. in Information Science from the School of Information at the University of California, Berkeley. I’m a technically-skilled qualitative researcher focused on web search tools and practices. I use interviews and digital ethnography to research how we talk about, imagine, know, build, and practice different ways of searching. My dissertation looked at how data engineers effectively use general-purpose web search at work.

    I’m looking to find or create opportunities in industry where I can bring my expertise in search and research to better understand and improve search tools and practices amid the changes around generative AI.

    Generative search and search-like tools are shifting how people imagine, discuss, and perform search. This presents massive challenges due to the immaturity of both the models & interfaces and users’ conceptions & practices. Organizations introducing new generative tools not only need to keep making technical improvements but also need to work to better understand (and help support) how the tools are actually used. We know people are likely interpreting these tools and their outputs in many different ways, including some that may harm the users themselves. And it isn’t yet clear how to help users identify use cases or how to query, prompt, or otherwise work alongside these tools most effectively.

    I want to position organizations to design technology and policy for the responsible adoption of generative search. I want to leverage what I have learned about search, and conduct new research into similar practices around generative AI, to position organizations and people to engage effectively with these new tools: to know when to use them, how to use them, and how to talk about them with others, as well as perhaps when and how not to use them. My goal is to work in a role where I can talk to and observe people using these tools (as substitutes for or complements to their search practices) and bring insights back to development, design, and policy teams (and to the users themselves).

    Here is some of what I can help do:

    • Surfacing effective and responsible strategies of tool design and use, as well as ineffective and harmful ones
    • Identifying product design implications to better support users
    • Disseminating those findings and best practices internally & externally

    Qualitative research—including interviews, participant observation, and digital and trace ethnography—can help us explore where and how people use these tools. By better understanding how users talk about and use these tools, we can both better identify and address the various sources of harm and lower the barriers to effective use (including beneficial uses we haven’t yet imagined). We can learn how to prioritize model and interface changes and user education to better address concerns ranging from “hallucinations” to “automation bias”.

    I’m well positioned to do this research and help people and organizations get ahead of the challenges. In my dissertation research I used interviews and digital ethnography to better understand how data engineers effectively use general-purpose web search at work (amidst concerns about both misinformation and deskilling). I carefully situated their work practices within organizational and interactional contexts to understand how data engineers admit search into their work, extend searching across their tools and processes, collaboratively repair failed searches, and take or assign ownership for search. This required in-depth technical understanding, a grounding in science & technology studies, and close attention to individual search experiences. I built on my prior work looking at search engine responsibility, how different search tools support different values, and responses to search complaints.

    Despite the significant differences between a tool like ChatGPT and Google web search, there are many similarities: people are sometimes embarrassed to admit they rely on such tools; domain knowledge and the contextual factors of the topics searched shape the success of queries or prompts; automation bias may be addressed through organizational processes; and places to collaborate around failure and share examples can provide immense additional learning benefits.

    Thank you in advance for any connections, advice, or opportunities you can offer.

    See a variant of this in a LinkedIn post.

    daniel.griffin@berkeley.edu
    he/him
    /scholar_profiles
    GitHub
    Twitter
    Mastodon
    LinkedIn