
    5. Repairing searching

    tags: diss
    December 16th, 2022

    Successful practitioners establish ways of working around the limits of their tools. What happens when web searches fail? Descriptions of successful practices need to include how practitioners navigate failure.

    The sort of search failure I discuss in this chapter is that of someone confronting a problem, turning to web search, and not finding resolution. I am not here concerned with situations where a searcher leaves a search with a wrong answer (their methods of search evaluation are addressed in the Extending searching chapter), though sometimes the identification of a wrong answer is the impetus to seek help from peers. The failed searches I am concerned with are those where the searcher does not find anything useful. I am also not particularly concerned with identifying root causes of the search failures here. The causes may stem from any number of interacting individual, organizational, and other factors that affect:

    • the material available to be searched,
    • the design and performance of the search engine’s index, algorithms, and interface,
    • the knowledge, memory, and attention of the searcher.

    How do data engineers navigate failed searches?80 In particular, how do they do so within this environment where (1) professional credibility might be at risk (despite all the search confessions) and where (2) organizational and professional strategies have rendered the work searchable? In this chapter I discuss how data engineers bridge web search gaps and, in the process, demonstrate and share their skills and expertise.

    This chapter involves a Handoff analysis of the configurations of components and their modes of engagement in web search practices that take place well beyond the search box and the SERP (see Mulligan & Nissenbaum (2020) and Goldenfein et al. (2020) in The Handoff analytical lens).

    A key observation of data engineering work is that data engineers operate at the edge, working with tools, systems, or other ways of handling data that are new, or new to them. Navigating around these edges, or learning around the edge, necessitates constantly working to “keep up” (Kotamraju, 2002) and engaging in “intensive self-learning” (Avnoon, 2021) in a fluid, loosely defined field. They use web search to keep up. What do data engineers do when their searches fail?

    I answer that by looking at how they talk about asking and answering questions of or for colleagues. I show how data engineers package questions for colleagues, carefully explaining what they do or don’t know and have or haven’t tried. That includes due diligence. How they both do and communicate due diligence helps them learn across their search failures, to access or understand something new, different, or unexpected. I consider the valuation and legitimation work that is part of both asking and answering and the impetus these interactions with others provide for workplace web searching.

    Asking and answering

    Successful web search practices include articulation work—“work that gets things back ‘on track’ in the face of the unexpected, and modifies action to accommodate unanticipated contingencies” (Star & Strauss, 2004, p. 10)—that repairs failed searches. Web search itself may often be a sort of articulation work for the data engineers, and in the repairs of failed web searches the articulation work is turned back on itself. Jackson (2014) writes that repair work is “itself a facet or form of articulation work (and vice versa)” (p. 223). Jackson (2014) also provides an appropriate warning for the findings of this chapter, to keep in mind that “repair is not always heroic or directed toward noble ends, and may function as much in defense as in resistance to antidemocratic and antihumanist project” (p. 233).

    What we know comes from how we navigate not knowing. Data engineers are able to operate at the edge through the ways they navigate not knowing, uncertainty, and change. I describe here the practices around the asking and answering of questions.

    Data engineer practices for asking and answering questions of and for colleagues allow them to navigate search failures as they package questions with due diligence, find renewed search impetus from (anticipated) conversation, and jockey around valuations and legitimations. This section presents these three patterns with examples from interviews, interweaving analysis and references to related prior literature. Data engineers employ these practices to navigate discrete search failures and also to justify the nature of their search-reliant expertise. The practices allow for demonstrations of individual and occupational legitimacy in the face of search failures. The sites of supposed search failures become sites of shared learning that provide coordination for other work interactions. This cooperative problem solving is intrinsically valuable to the participants, as well as a site for constructing and presenting themselves as data engineers. As Sabel (1984) wrote, discussing “collaboration between labor and capital”, a craftsperson will put up with even assembly line work “or as feeders of automatic machines” if “at least occasionally they can test their craft knowledge against unforeseen problems” (p. 14).

    Conversations in the interviews ranged from web search to the complements, substitutes, and complications of web search. Most interviewees discussed searching internal systems (various enterprise search systems, including searching on their workplace chat platform, an enterprise version of Stack Overflow for Teams, or otherwise searching internal documentation) and asking colleagues questions. I also explicitly asked what people did when search (seemed to have) failed.

    First I will provide a high-level introduction to the conversations I had about asking and answering. Many discussions of how to ask colleagues questions (whether via a Slack direct message or on a dedicated Slack support or troubleshooting channel; rarely via email) delved into how one is expected to demonstrate prior searches or search attempts. The reasons interviewees gave for this ranged from demonstrating that one deserves to ask a question to simply doing what is necessary to get the fastest response. Some discussed this from the perspective of having to field questions, ranging from wanting askers to show they are invested in the answer to techniques for reducing the rate of questions (from not answering immediately to requiring ‘trouble tickets’82 to be completed).

    Some interviewees discussed heightened emotions or explicit concerns about how they might be judged (by colleagues or management) or treated that kept them from asking questions of colleagues until they’d exhausted their other resources (principally internal and web search). They don’t want a colleague to ask if they’ve searched yet or to suggest they ‘turn-it-off-and-back-on-again’. One interviewee provided some insight on an internal discussion around sharing guidelines for when and how to ask questions with those requesting help of their team, which I will expand on below. Individuals and teams seem to largely manage questions through informal routines, though generally in somewhat explicit channels83. This produced, for some participants, a way of subtly or crudely exercising power (Freeman, 2013).

    Highlighting some of that informality (and repeating some of what was discussed in Admitting searching), here is an overview from Noah on what he tells new colleagues:

    When I have those onboarding meetings with new joiners, I point them to some of the search tools we have informally and I always point them to the support channels that we have.

    And I tell them — when I started I was very hesitant to ask questions in these support channels kinda for some of the reasons that I’ve been talking about, oh I didn’t want someone to be a dick to me. Or like think that I hadn’t done enough leg work before asking the question and what I tell them is, you know, ‘just get over that, especially as a new employee, don’t sit around wasting your time and wondering if you phrased the question right, just ask it. Over time you will learn, from paying attention and watching other people ask, what the right way to phrase things is and the expectation around questions, but when you’re new just ask them, and people will realize that you’re new.’

    No one is going to be mean to anyone for violating these norms on question asking on one instance. They are going to notice it over time, ‘oh that person always has bad questions’, ‘that person never really does the leg work and wants people to do it for them’. And that’s when people will say something to you. But on a one-off instance people might say, oh, you could have searched here or they might just give you the answer.

    I will now present the three patterns.

    Due diligence and packaging questions


    ‘hey I did my own due diligence on this’


    …show that I tried and that I have some grasp of the problem space


    This is what I tried. This is what I tried to find a solution. And I’m now stuck.


    I’ve done all these other things, I’m aware of all these other things, and I’m coming to you as a last resort.

    This first pattern is focused on packaging questions, which includes doing and demonstrating due diligence. I asked data engineers about question-asking outside of search, initially hoping to surface examples of how people talked about search. Gradually I recognized the question-asking practices as repairing or completing searches, rather than simply substituting for web searching.

    After Noah shared the comments above, I started to ask a follow-up question and he jumped in to correct my framing of “asking too many questions” as rude or as the concern. Rather than “too many” being the concern, in his eyes, it is how the questions are presented. He said:

    I don’t think it’s “asking too many questions”. If you show that you have done, that you have put some effort into this, even if it’s just a little bit, “here’s what I tried”.

    I said, “Yeah, bought-in” and he quickly continued:

    And, obviously, this is all a very fine line. That is the exact same that allows some people who are real jerks on Stack Overflow to justify what a jerk they are being. Right? Like, ‘oh that person just asked a stupid question so I was being a jerk to them because they didn’t make enough of a time investment before bothering me.’ Right? It’s a spectrum, it’s a complicated thing.

    I asked Michael when he would decide to ask colleagues a question. He noted that different engineers would approach things differently and then provided an extended answer (below are excerpts book-ending ten minutes of our conversation). He described the packaging of the questions similarly to Noah:

    try it out on your own, try to solve it the best you can, and keep searching, and spend all your resources first, and then go to your team if you’re actually truly stuck and you can’t figure out something. As an engineer: don’t depend too much on hand holding. But if I am stuck on something then I go to more senior technical leaders.

    [ . . . ]

    OK, I’m going to try my best to solve something and when I go to this lead or this senior engineer or someone above me to help me answer it. I want to have available the findings that I found from my own research so that I can present to them and kinda validate, ‘hey I did my own due diligence on this. I’m not quite sure what the problem is but maybe you can see something I can’t or have an idea of something I can try. Or, altogether just scratch that because altogether this is an impediment and we don’t want to waste time on that.’

    Noah and Michael present the packaging of the question as intended, or read, to indicate investment or effort prior to the asking. Ross also mentioned due diligence, though added that its presentation also serves to show that he has “some grasp of the problem space”.

    Early in our conversation Ross said he’d do due diligence before he would “ping” someone with a question, saying “I try not to bother people until I’ve done my due diligence.”

    Yeah, it’s funny. You think that might be a time I post a question. I tend not to. I tend to loosen my search criteria. I tend to go more for something similar. This is a personal reflection of me.

    I don’t like bothering people. I kinda don’t like asking people to take time to respond to my questions until I’ve done a real true diligence search. Like, you know. ‘I tried. Here’s what I found and this encompasses what’s out there and it’s not me, my thing. So, here’s my question.’

    Soon after I asked directly about the presentation of “due diligence”84. He summarized here:

    I present some information. I try to present some summary of what I’ve done to show that I tried and that I have some grasp of the problem space. And then I present my question.

    He followed with, “I don’t really need to see due diligence but I feel like I owe it to others.” He also shared how he recollected coming to recognize the role of the presentation of due diligence:

    When you first get hired—oh, people are so helpful and everything—and then you’re expected to tow your own load. Then you ask a question and you get a response back:

    “Well, what have you tried?” Oooh. Oooh. OK. ‘Sorry. Here’s what I tried.’

    Maybe it doesn’t bother some people. But if I got that response, it’s like, OK. I didn’t do enough, I didn’t show I needed it. I think early, at some point when I started, I got that from someone somewhere. And you’re like, ‘OK’.

    Lauren also mentioned both prongs of the packaging of due diligence, the investment and the understanding. Her first mention of due diligence was what she would do before asking a question in an internal Slack support channel.

    Unless I was absolutely sure that my question was unanswerable some other way I wouldn’t just blast it out for a million people to see.

    (I checked, asking whether it was actually a million people and she clarified the channels she would ask questions in would have up to 100 people.) I asked her whether the context of being “absolutely sure” was provided when asking a question.85

    If you’re trying to get the quickest answer from somebody you’re like: I’ve done all these other things, I’m aware of all these other things, and I’m coming to you as a last resort.

    I would be very clear about that because especially when you’re talking to developers or people on the super back-end the first thing they’ll do is ask you, “oh, did you turn it off and turn it back on again?” Yes, I turned it off and back on again and I also did all these other things and that’s why I’m coming to you as a last resort.

    Or it’s an emergency. This page is down and I’ve refreshed it a million times.

    Charles talked about how people would get “pretty annoyed” at the lack of due diligence, or its presentation.

    I’m really conscious about making sure that I don’t ask a question that has already been asked. Because I feel like people on the forums, either general or even on our company’s [internal Stack Overflow] page would get pretty annoyed if somebody was asking a question that had already been asked and answered before. Um, and so would really go to pretty good lengths to look through the previous ones and make sure that my question hadn’t already been asked.

    Versus, you know, I’ve seen cases where somebody the second that they get an error from their IDE, they just take it and they type it into Stack Overflow and ask a new question. And I think people get pretty annoyed by that.

    [ . . . ]

    I would definitely try to convey to the person that I had been struggling with this for awhile and that it wasn’t just some error message that I just came across while I was messing around.

    He also spoke of people complaining about question-asking:

    Sometimes the ways that you learn is by learning what not to do. And so, I was good friends with a lot of my coworkers, most of whom were more experienced. And so, if we were at lunch or whatever, someone might say ‘this person on our team, he keeps pinging, asking me questions for this thing even though he hasn’t tried XYZ.’ A lot of times I would hear things like that so I would take mental notes, “don’t do that”. It was more of my own kind-of observations.

    This pattern allows for learning around search failures. This includes search failures that result in part from the occupational and organizational approach to learning at the edge. It also engages with concerns of “keeping up” as individuals likewise navigate the edges of their own knowledge (Avnoon, 2021; Kotamraju, 2002). Due diligence and packaging questions serve multiple functions. Demonstration of investment encourages reciprocation. Communicating known context allows for quicker resolution. This enables people with different skills and knowledge to communicate effectively without the rudeness of assuming the contours of the other’s understanding. The pattern also integrates with, or functionally supports, the two patterns discussed next. Packaging questions in particular ways works toward legitimating the question-asker as a responsible participant, a valid member of the community. Expectations of due diligence play a role in the impetus provided in conversation (though the expectations, and their backing in humiliation or other judgement, are not the whole impetus).

    Valuations and legitimations

    I focus here on the evaluation and legitimation work of jockeying. Due diligence and packaging questions both affect how others evaluate you as an engineer and are opportunities for the engineer to legitimate themselves, and not just their question. The asking & answering of questions broadly is a place for evaluation and legitimation.

    At the time of our interview Nisha was a data engineer at a contractor working at a large tech company. She provides a clear introduction to the presence of evaluation/valuation around question-asking. I had slipped concerns other interviewees had mentioned about people perhaps being mean into my question around due diligence.86 She said:

    I’ve seen that as well.

    That’s another reason why people try to go to search groups or Google search rather than posting it internally to a group of developers that may or may not know you and judge you.

    It’s a competitive world87, right? I mean, your peers, they probably want to help themselves before they help you, right? So—[pause] it also depends on the team dynamics. Yeah, I’ve seen that as well.

    I then asked about the packaging of questions in these large groups.

    That’s how you are expected to ask questions in these groups…

    You have a problem. You’ve tried searching for it. But you’ve not had any success. You’ve tried a few things. You want to list them down because, for example: Your speaker is not working. Someone will suggest restarting the machine. You’ve already restarted your machine. You don’t want them delivering the same information.

    So, it’s always helpful when you’re asking in a group, you ask very specific questions.

    When you’re doing a Google search it’s OK to see the same replies because you can always scroll down and see more.

    But when you’re interacting with people you want to give them very specific point to point information. So you give them exactly whatever you need to get the information out of them.

    [ . . . ]

    Q: Do you ever answer questions in these groups?

    Sometimes. If I’ve faced an issue, yes I try to answer. Because I know how it is to ask questions. It takes a lot of courage. People usually ask questions after they’ve done what they can. So I try to be helpful and mindful.

    I don’t want to be that person who does not respond when there is a problem and I know the solution.

    Q: You said it takes a lot of courage. Could you just say again why it takes a lot of courage.

    Because your managers would be on the chat group. They would be figuring out that you are not able to solve this problem. And your peers. You can also be judged by your peers.

    So, only after a person has searched, done what they can, tried things, is when they would normally post questions.

    The data engineers are not narrowly focused on demonstrating a need to ask a question, or their prior investment in trying to answer it and their knowledge. They are also looking to avoid negative judgement and present themselves as experts.

    This judgment is not evenly applied. Elsewhere in the interview Nisha shared about being a woman in her organization:

    Like for me, I know that for whatever reason, women are a minority in an engineering group. And they get picked on more. So we constantly have to watch our backs and ensure that, even if you’re putting a comment in code, which will not be checked into the code base, but it’s only going to be run as a temporary fix, you still have to ensure that your comment, even the comments, are perfect.

    Jamie said she would generally try to answer her questions on her own because she enjoys it but that she doesn’t hesitate to ask her coworkers questions. Then she paused, and provided a fuller picture:

    Let me rephrase that. My natural state is not to hesitate. There have been times in the past where I have hesitated to do that. And that is largely due to my minority status in the tech industry and what people’s assumptions are about a woman in the tech industry who is asking questions, right? Those same assumptions are not made of men who are asking questions.

    So definitely in those companies where that male-centric tech view has been more present, I have been more resistant to ask questions. Because, culturally, there has been this implication that if you’re asking questions then you don’t know what you are doing, which is ridiculous, totally absurd. Right?

    So it is not my natural state to behave that way. But in some companies, for people to think of me as the expert that I am I have had to change the way that I’ve done things.

    She went on to distinguish her current workplace from past experiences. Megan also distinguished different workplace experiences, though her experience was somewhat reversed. I talked to her in a new participant member check, describing my initial findings and asking for her reactions. She said:

    All the women I’ve ever worked with in technical roles have been at times either effectively sidelined or people have tried to dismiss them as non-technical. That fear of appearing non-technical is a real thing.

    I just started a new job. Once I’ve been at an organization for a while, I have established a reputation. I feel like I know people, people know me, and I’m very comfortable asking questions out in the open. And especially when I was an engineering manager, I tried to model that behavior of asking questions out in the open. But I can tell you right now, I just started the job and 100% I’m sticking my questions in private channels, because you don’t want to be perceived as struggling.

    … My horror of doing my job badly is worse than my horror of asking questions. But I will say it does push me into private channels as opposed to public ones.

    Fear of the consequences that may come from being judged stupid or lazy drives data engineers to search first and furiously before seeking advice from colleagues. Even if a question does not interrupt someone’s work, it might be seen as indicating a lack of knowledge or responsibility, or as wasting your colleagues’ time (which is undesirable given data engineers’ interest in speedy searching, next chapter). As Kari said, you want to “err on the side of: don’t waste people’s time.”

    The practice and place of search repair provides opportunities for data engineers to jockey for legitimation. Not just an opportunity, search repair demands such performances. As Goffman (1956) writes (p. 156):

    Audiences also accept the individual’s particular performance as evidence of his capacity to perform the routine and even as evidence of his capacity to perform any routine.

    In their answers and the quality of their due diligence the data engineers can present themselves to colleagues as capable, though some are required to demonstrate more to achieve equal esteem.

    Renewed impetus

    The web search practices of data engineers are shaped by the searcher being situated around others and anticipating those others’ perceptions of the searcher’s expertise or diligence. They are not normally anticipating interacting with others about a search, instead often expecting it will be quick. But when their searches start to show signs they might fail, conversations with their colleagues and the search repair practices come to mind. Search repair practices start before the question is asked. The expectation that a question for colleagues must include packaged due diligence and that it is a site for demonstrating legitimacy provides a renewed or refined impetus to search.

    Searching amidst or in anticipation of (even potential or speculated) conversation is very different from searching alone. Formulating a question to ask a colleague, or even a search strategy you might be willing to admit to a colleague, is distinct from formulating a query for a search engine.

    For help with working through problems, some even suggest formulating a question or talking through a problem with an inanimate object, like a rubber duck. “Rubber duck debugging” is a common reference point within coding work, even if it isn’t explicitly practiced.88 Hunt & Thomas (1999) call it “Rubber Ducking” in their book for practitioners on the craft of programming; here is how they describe it in a chapter on debugging (p. 95):

    A very simple but particularly useful technique for finding the cause of a problem is simply to explain it to someone else. The other person should look over your shoulder at the screen, and nod his or her head constantly (like a rubber duck bobbing up and down in a bathtub). They do not need to say a word; the simple act of explaining, step by step, what the code is supposed to do often causes the problem to leap off the screen and announce itself.

    It sounds simple, but in explaining the problem to another person you must explicitly state things that you may take for granted when going through the code yourself. By having to verbalize some of these assumptions, you may suddenly gain new insight into the problem. [internal footnote omitted]

    Rubber duck debugging achieves some of what anticipated, and rehearsed, conversations do for data engineers. Kari is a data platform engineer. Part of her responsibilities, besides systems design and implementation, includes responding to support requests from others in her organization. She said she and her colleagues will be “pretty busy” and get frustrated with the volume of requests for answers. Unless the question-asker shares enough of their due diligence she finds she has to respond first by asking for more information.

    I always suggest: ‘Hey, if you’re going to ask me a question, give me a bunch of context on it beforehand, so that I can actually answer your question. So don’t just send me a stack trace. Tell me what you were doing. What is the stack trace? Link to the code. All this stuff.’

    [ . . . ] It is totally fair to just send the question back and say, ‘hey we need more information’.

    I asked her if perhaps sometimes pushing people to provide more information will help them answer it on their own.

    Yeah, lots of times… [chuckling]

    I repeated: ‘Lots of times’, and she said:

    Part of the exercise in the first place, of getting people to give us context is that as soon as you have to pose it and ask the questions and give us information, half the time you are going to realize or come up with an idea towards answering your question in the first place. Part of that [the asking for context] is actually to just get people to answer their own questions.

    While data engineers do not generally talk with each other directly about their searching practices89, their conversations with each other and other colleagues shape and scaffold, or structure, their searching.90 Mokros & Aakhus (2002) propose conceptualizing information-seeking behavior as “socially grounded” meaning engagement practice. They write how an information need, here an impetus to search, is “generated through social connection and the circumstances that arise from engagement with others and efforts to realize or avoid such engagement” (p. 309). A search can be renewed or refined by the prompt of an interaction. The partially drafted question in a Slack chat interface, or a glance at the clock or the Zoom icon in anticipation of your next standup or one-on-one, is a socially grounded impetus to search.

    This is particularly the case for those who have received, perceived, or fear negative attention for their questions. These individuals may search excessively, rather than risk censure. Interviewees shared that even starting to ask a question helps them see their problem anew or see a new search to do. Orr (1996) describes technicians’ work of diagnosis. Orr, himself referencing Suchman (1987) regarding when “transparent activity becomes in some way problematic” (p. 50), writes:

    “Becoming problematic” may mean that the activity has been disrupted by failure, that one is perplexed about how to proceed, or merely that someone else has inquired about the activity, requiring an explanation. [emphasis added]

    This ‘starting to ask’ introduces another person with the potential to inquire about the search. Interactions with others, including the ‘merely’ anticipated and imagined indirect relations, prompt activity to become problematic.

    I can only speculate on the internal mechanisms involved.91 Perhaps by becoming problematic, the problem is shifted out of the automatic search solution space. Rather than jumping to search, to try incantations to teleport to the solution, imagining or mimicking interacting with others calls upon conversational repertoires that may be well suited to aid reflection. This also may be as simple as starting to ask a question suggesting a framework for due diligence. The reliance on search may induce a sort of automation bias, a bias-to-search that frames problems in delimited ways. Alternative framings, or repertoires, include preparing to address likely replies like ‘what are you trying to do?’92, ‘what have you tried?’, or ‘what do you know?’.

    Discussion: Visibility, friction, and valuing privacies


    Search repair practices provide a stage for performances that do four key things: legitimate the question asker and the answerer, legitimate web search reliance, enable participatory learning, and share knowledge and who-knows-what.

    First, the data engineers openly jockey within search repair to demonstrate to each other what they know. This is one way they present, construct, or perform their legitimacy and expertise. This jockeying secures or shifts status claims. How they ask & answer questions about search provides another place for narratives where data engineers show and shape their selves (and judgements of competence and expertise) and their work. Feldman & March (1981) write on “decision making”, but search repair functions similarly (pp. 177-178):

    decision making in organizations is more important than the outcomes it produces. It is an arena for exercising social values, for displaying authority, and for exhibiting proper behavior and attitudes with respect to a central ideological construct of modern western civilization: the concept of intelligent choice.

    Navigating failed searches in these public ways (and the work of search confessions) “provides a ritualistic assurance that appropriate attitudes about [searching the web] exist” and provides opportunity for “a representation of competence and a reaffirmation of social virtue” (p. 177). Following Feldman and March, Leonardi (2007) writes that “visible aspects of information [ . . . ] practices are used as implicit measures of one’s ability to make an informed decision” and “certain individuals come to have more power in the decision-making process due to the perceptions of their information-acquiring practices” (p. 814). The repair practices provide this opportunity.

    Second, the reliance on web search is also reaffirmed, further legitimated. Though sometimes a question may be answered by directing the searcher to an internal search tool or other resource not available on the open web, the use of web search itself is not made a cause for concern. The search repair practice joins search confessions in providing informal and implicit approval for searching the web for work.

    Third, search repair provides a place for participatory learning. At first the data engineers may only lurk and watch, gradually seeing how others ask and answer questions. Then they try asking questions and getting feedback. In the process, they may learn how their peers frame questions and problems, identify resources and search strategies, and position themselves to evaluate. It is not so much facts and verbalized procedures that they learn as how to be a data engineer who searches to do their work. As McDermott & Lave (2006) wrote (p. 108):

    The point: the product of laboring to learn is more than the school lessons learned. Over time, laboring to learn produces both what counts as learning and learners who know how to do it, learners who know how to ask questions, give answers, take tests, and get the best grades. Making what counts and making those who seek to be counted, these together compose the product of learning-labor. [emphasis in original]

    The visibility of the work involved in these repairs, the stories, and the visible judgments (though the judgments are not always visible) create learning opportunities for those providing help as well, providing space to practice, and to get implicit feedback on, problem solving and communicating (Perlow & Weeks, 2002).93

    Fourth, the repair practices allow the engineers (including those asking questions) to make knowledge and who-knows-what visible. This is beneficial for coordination and collaboration within and across teams—coordination being a core requirement for this engineering work (Aranda, 2010). It also addresses a concern about the solitariness of web search. One might view web search practices as wasteful.94 People repeatedly search for the same thing that someone else in their team had searched. But the speed and ease of most web searches make it possible to work without much interruption (of oneself or others) and reduce the need for sharing knowledge. The difficult searches that are visibly repaired then become opportunities for sharing how to search (mentioned just above, perhaps making more searches faster and easier), sharing some of that difficult-to-find information, and sharing who knows what across a team or organization.


    Even when questions are successfully received, and the data engineer isn’t found to be lacking, asking questions produces what Tsing (2005) calls “friction”. Tsing writes: “Cultures are continually co-produced in the interactions I call ‘friction’: the awkward, unequal, unstable, and creative qualities of interconnection across difference” (p. 4). The mismatches in conceptions and communicated understandings produce friction that both further reveals the edges of knowledge and creates opportunities to see, and to shift or shape, how those edges are engaged with.

    These sites for repair are components in the larger system of the work of data engineers. Within these sites the data engineers engage with each other in various ways, and the work of repair as a whole engages engineers, prompting the visible jockeying as well as perceptions of how much more there is to know. This space for friction challenges the data engineers to move beyond rigid evaluations of what they know as they work at the edge. Brown & Duguid (1998) write of “productive tension”, from Hirschhorn (1997), and “creative abrasion”, from Leonard-Barton (1995), arguing that such tension or friction can generate knowledge. For the purposes of the data engineers, it generates new search seeds (or identifies them) and evaluations of search results—supporting the search performance of the organization. Girard & Stark (2002) write of how “new media projects” at the start of the 21st century constantly reinterpreted their provisional “project settlements” (p. 1947):

    the unsettling activity of ongoing disputation makes it possible to adapt to the changing topography of the web across projects in time. Friction promotes reflection, exposing variation from multiple perspectives.

    A similar space for reflexive re-engagement around problem solving and search strategies appears around the search repair practices.

    A rigidly technocratized approach to search might redistribute friction to places less visible or with slower feedback loops; some friction in the repair work, arising from the different ways of problem solving and searching, may improve the search performance of the organization. Passi & Jackson (2018) cite Stark (2009) in describing functional benefits of friction, writing that “actors use diverse ‘evaluative and calculative practices’ to accomplish practical actions in the face of uncertainty” (p. 4). The friction comes from multiple forms of valuation—of searching, problems, and solutions—developing contextually in different situations. Conflict, Passi writes, is integral to organizational diversity in which the “productive friction” between multiple ecologies of valuation “helps accomplish justification and trust—(dis)trusting specific things, actions, and worlds” (p. 4). Stark (2009) also calls this “generative friction” (p. 16, p. 19), explicitly building on Leonard-Barton (1995) and Brown & Duguid (2001).

    Valuing privacies

    Successfully navigating failed searches is entangled with privacy.95 The data engineers do not, generally, ask for details of the question-asker’s search history. They do not force the question asker to show their recent browsing history. Both of those things, in the right setting, might greatly facilitate question answering. But the data engineers value privacy in searching and search repair. The valuing of privacy is an active result of visible practices. New data engineers see and participate in such valuing. This furthers their learning how to be, and search as, a data engineer.

    This respect for privacy includes a range of orientations or conceptions of privacy. I will not here analyze the privacy dimensions (Mulligan et al., 2016) involved in the search repair practices, but only suggest three broad areas relevant to search repair:

    • avoiding costly knowledge96
    • preserving individual autonomy in navigating failed searches, rather than a surveillance that dictates the value of various knowledge from the top
    • maintaining confidentiality at the occupational group level, where privacy conceals the workings of their expertise

    The data engineers I interviewed valued the lack of direct scrutiny of search practices. This was most visible in their strong objections and resistance to even imagining sharing their search history. This privacy over search activity is not only a matter of respecting or protecting web search as a place where they perform their expertise and professional judgement. Data engineers recognize that web search is where they regularly assemble and construct that expertise and judgement.

    Data engineers value privacy here because, in addition to the individualizing mythos of search and also of coding work (to be discussed further in the next chapter), this is a way of confronting what they do not know and navigating around failed searches—their inability to know. In that, privacy in the search practices protects data engineers from likely judgements of their knowledge, expertise, or skill, and affords the space to experiment and grow. Judgments of visible or perceived ignorance or lack of skill already lead to mistreatment and to poor assessments in performance evaluations. Revealing more details of searching practices, barring other changes in the conceptions of search and expertise, may lead to harsher penalties for some. The sites of search repair already involve a significant amount of openness and sharing, a place sometimes safe but often including conversation. There would likely be significant unknown tradeoffs from mandating more openness, as discussed by Turco (2016) and Star & Strauss (2004).

    The sociomaterial privacy of the web search practices of the data engineers makes space for “the play of everyday practice” at the edge of their knowledge, as “the motivating force behind creative practice, subject formation, and material practice” and a capability that promotes, with due consideration of other workplace factors, human flourishing (Cohen, 2012). Building on Cohen, Ohm & Frankle (2018) analyze a design principle, to make space for or to protect human values, that they call desirable inefficiency.
    Certain aspects of the existing configurations of data engineering work highlight apparent inefficiencies or frictions that provide opportunities for the data engineers. These include not just the broad values of some sense of privacy and autonomy, but also opportunities to learn more deeply than they might if taught principally through directives rather than through searching and experimenting, and to make serendipitous discoveries in their web search navigation.

    As discussed above, asking and answering questions promotes partially shared understandings and provides opportunities for jockeying (as well as the larger reproduction of workplace norms and values). The visibility mentioned above is fairly distinct from “forced representation.” Star & Strauss (2004) write that “Many studies over the years have cautioned that forced representation of work (especially that which results in computer support) may kill the very processes which are the target of support, by destroying naturally-occurring information exchange, stories, and networks” and that “it should be clear that we are not recommending ‘more visibility’” (p. 24). Cohen highlights the value of gaps97 in the space for play and that “maintaining those gaps requires interventions designed to counterbalance the forces that seek to close them”. Ohm & Frankle, following her, present a gap imposed “in pursuit of fairness” (p. 826). This play and these gaps are also relevant to the discussion of technocratization, as data engineers’ work and web search practices challenge forces such as the informating affordance of information technology and an apparent data imperative.

    Conclusion: Learning from search repair

    The search repair practices provide visibility and friction useful for the learning organization, and they respect various privacies. All three support data engineers in learning to be data engineers reliant on search, providing examples and spaces to participate.

    The search repair practices also do the articulation or repair work that sustains reliance on web search. The organization can push data engineers to search, and rely on that despite not training them in search itself, because the engineers are immersed in search repair’s collaborative problem solving. Search repair is practiced across multiple components within the data engineers’ configuration of work and is itself a key factor in their successful reliance on web searching.

    Search repair takes various shapes but could be performed differently. The packaging of a portion of due diligence could be automated, or given a required form. Questioning and answering could be logged by management for performance evaluations. Other incentives or constraints could be introduced with the goal of improving some facets of search repair. Some changes like these might improve the rate of search repair, but perhaps at the cost of limiting the opportunities for data engineers to demonstrate and test their knowledge, share their knowledge across the organization, or challenge accepted knowledge. Or at the cost of data engineers doing less searching on the fly, or being less willing to reveal what they do not know. Much more than search repair happens in these interactions. If some aspects of search repair were compelled, or the perceptions of constraints and affordances were distorted, these multiple values for the learning organization might no longer be achieved.

    As it is, each data engineer within the various organizations engages differently with the sites of search repair. Some may not perceive the benefits of the visibility and friction because of where they are situated. Some may face different tradeoffs in the risks of revealing something they do not know, because they are already under heightened scrutiny, perhaps because of their minority status within the field. Others, meanwhile, are able to draw on their experience and the vantage of seniority to use the sites of search repair to demonstrate what they see as their expertise.


    Aranda, J. (2010). A theory of shared understanding for software organizations [PhD thesis, University of Toronto]. https://tspace.library.utoronto.ca/bitstream/1807/26150/6/ArandaGarcia_Jorge_201011_PhD_thesis.pdf

    Avnoon, N. (2021). Data scientists’ identity work: Omnivorous symbolic boundaries in skills acquisition. Work, Employment and Society, 0(0), 0950017020977306. https://doi.org/10.1177/0950017020977306

    Bhatt, I., & MacKenzie, A. (2019). Just google it! Digital literacy and the epistemology of ignorance. Teaching in Higher Education, 24(3), 302–317. https://doi.org/10.1080/13562517.2018.1547276

    Brown, J., & Duguid, P. (2001). Knowledge and organization: A social-practice perspective. Organization Science, 12, 198–213.

    Brown, J. S., & Duguid, P. (1998). Organizing knowledge. California Management Review, 40(3), 90–111.

    Cohen, J. E. (2012). Configuring the networked self. Yale University Press. https://doi.org/10.12987/9780300177930

    Feldman, M. S., & March, J. G. (1981). Information in organizations as signal and symbol. Administrative Science Quarterly, 26(2), 171–186. http://www.jstor.org/stable/2392467

    Ferguson, A. M., McLean, D., & Risko, E. F. (2015). Answers at your fingertips: Access to the internet influences willingness to answer questions. Consciousness and Cognition, 37, 91–102. https://doi.org/10.1016/j.concog.2015.08.008

    Freeman, J. (2013). The tyranny of structurelessness. WSQ, 41(3-4), 231–246. https://doi.org/10.1353/wsq.2013.0072

    Girard, M., & Stark, D. (2002). Distributing intelligence and organizing diversity in new-media projects. Environment and Planning A, 34(11), 1927–1949.

    Goffman, E. (1956). The presentation of self in everyday life. University of Edinburgh.

    Goldenfein, J., Mulligan, D. K., Nissenbaum, H., & Ju, W. (2020). Through the handoff lens: Competing visions of autonomous futures. Berkeley Technology Law Journal, 35(IR), 835. https://doi.org/10.15779/Z38CR5ND0J

    Gross, M., & McGoey, L. (2015). Routledge international handbook of ignorance studies. Routledge. https://doi.org/10.4324/9781315867762

    Gross, M., & McGoey, L. (2022). Routledge international handbook of ignorance studies. Routledge. https://doi.org/10.4324/9781003100607

    Hirschhorn, L. (1997). Reworking authority: Leading and following in the post-modern organization. MIT Press.

    Hunt, A., & Thomas, D. (1999). The pragmatic programmer: From journeyman to master. Addison-Wesley Professional.

    Jackson, S. J. (2014). Rethinking repair (T. Gillespie, P. J. Boczkowski, & K. A. Foot, Eds.; pp. 221–240). The MIT Press. https://doi.org/10.7551/mitpress/9780262525374.003.0011

    Kotamraju, N. P. (2002). Keeping up: Web design skill and the reinvented worker. Information, Communication & Society, 5(1), 1–26. https://doi.org/10.1080/13691180110117631

    Leonard-Barton, D. (1995). Wellsprings of knowledge: Building and sustaining the sources of innovation. Harvard Business School Press.

    Leonardi, P. M. (2007). Activating the informational capabilities of information technology for organizational change. Organization Science, 18(5), 813–831. https://doi.org/10.1287/orsc.1070.0284

    McDermott, R., & Lave, J. (2006). Estranged labor learning. In P. Sawchuk, N. Duarte, & M. Elhammoumi (Eds.), Critical perspectives on activity: Explorations across education, work, and everyday life (pp. 89–122). Cambridge University Press. https://doi.org/10.1017/CBO9780511509568.007

    Mokros, H. B., & Aakhus, M. (2002). From information-seeking behavior to meaning engagement practice. Human Communication Research, 28(2), 298–312. https://doi.org/10.1111/j.1468-2958.2002.tb00810.x

    Mulligan, D. K., Koopman, C., & Doty, N. (2016). Privacy is an essentially contested concept: A multi-dimensional analytic for mapping privacy. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 374(2083), 20160118.

    Mulligan, D. K., & Nissenbaum, H. (2020). The concept of handoff as a model for ethical analysis and design. Oxford University Press. https://doi.org/10.1093/oxfordhb/9780190067397.013.15

    Natarajan, M. (2016). The making of ignorance: Epistemic design in self-tracking health [PhD thesis, University of California, Berkeley]. https://escholarship.org/uc/item/5pn009mw

    Ohm, P., & Frankle, J. (2018). Desirable inefficiency. Florida Law Review, 70(4), 777. https://scholarship.law.ufl.edu/flr/vol70/iss4/2

    Orr, J. E. (1996). Talking about machines: An ethnography of a modern job. ILR Press.

    Passi, S., & Jackson, S. J. (2018). Trust in data science: Collaboration, translation, and accountability in corporate data science projects. Proceedings of the ACM on Human-Computer Interaction, 2(CSCW), 1–28.

    Peels, R., & Pritchard, D. (2021). Educating for ignorance. Synthese, 198(8), 7949–7963. https://doi.org/10.1007/s11229-020-02544-z

    Perlow, L., & Weeks, J. (2002). Who’s helping whom? Layers of culture and workplace behavior. Journal of Organizational Behavior, 23(4), 345–361. https://doi.org/10.1002/job.150

    Proctor, R., & Schiebinger, L. (Eds.). (2008). Agnotology: The making and unmaking of ignorance. Stanford University Press. https://archive.org/details/agnotologymaking0000unse

    Sabel, C. F. (1984). Work and politics: The division of labor in industry.

    Shorey, S., Hill, B. M., & Woolley, S. (2021). From hanging out to figuring it out: Socializing online as a pathway to computational thinking. New Media & Society, 23(8), 2327–2344. https://doi.org/10.1177/1461444820923674

    Smith, C. L., & Rieh, S. Y. (2019, March). Knowledge-context in search systems. Proceedings of the 2019 Conference on Human Information Interaction and Retrieval. https://doi.org/10.1145/3295750.3298940

    Smithson, M. (1985). Toward a social theory of ignorance. Journal for the Theory of Social Behaviour, 15(2), 151–172. https://doi.org/10.1111/j.1468-5914.1985.tb00049.x

    Star, S. L., & Strauss, A. (2004). Layers of silence, arenas of voice: The ecology of visible and invisible work. Computer Supported Cooperative Work (CSCW), 8, 9–30.

    Stark, D. (2009). The sense of dissonance: Accounts of worth in economic life. Princeton University Press.

    Suchman, L. A. (1987). Plans and situated actions: The problem of human-machine communication. Cambridge University Press.

    Tsing, A. (2005). Friction: An ethnography of global connection. Princeton University Press. https://press.princeton.edu/books/paperback/9780691120652/friction

    Turco, C. J. (2016). The conversational firm: Rethinking bureaucracy in the age of social media. Columbia University Press.

    What is the XY problem? - Meta Stack Exchange. (2010). https://meta.stackexchange.com/questions/66377/what-is-the-xy-problem

    1. An earlier draft of this chapter focused on ‘ignorance’, a polysemous concept I was using to refer to knowledge gaps or the limits of knowledge of the data engineer, their organization, the search engine, and the Internet. I was relying on concepts developed in ‘ignorance studies’, partially around how identifying, naming, and navigating ignorance can create or spread knowledge and how ignorance is socially constructed (Bhatt & MacKenzie, 2019; Gross & McGoey, 2015, 2022; Natarajan, 2016; Peels & Pritchard, 2021; Proctor & Schiebinger, 2008; Smithson, 1985). My initial use of ‘ignorance’, a term often viewed and felt, viscerally, as a pejorative, unsteadily straddled mixed causes and effects. My focus now on “failed searches” highlights a particular sort of situation and how data engineers address it, without using ‘ignorance’ in multiple confusing registers and applying it to different types of actors. ↩︎

    2. Trouble tickets are formalized and explicit requests for assistance. Ostensibly used to facilitate the repair or resolution of some problem or issue, they are also used for tracking issues and quantifying the work performance of those tasked with their completion. They can also be used to raise the barrier to requesting help, at least formally. ↩︎

    3. I use the word ‘channel’ here to refer to various ‘communication channels’, though it is also the name in Slack and Microsoft Teams for what Internet Relay Chat services might call ‘rooms’, as in a ‘chat room’. ↩︎

    4. Direct ‘due diligence question’ of Ross:

      Q: Could we go to that a little bit? You had said ‘your due diligence’ before you ask someone. So you’ve described in part some of the due diligence that you do. But you’re also presenting to someone that you did your due diligence when you ask them a question?


      Q: Could you talk about that a little bit? How you frame different questions?

    5. Direct ‘due diligence question’ of Lauren:

      Q: When you reach out to an individual to ask this question that your initial searches were not fruitful with, do you make sure you provided this context? ‘I tried XYZ, now I don’t know.’


      Q: Were there strict rules or policies or—

      No. [continued above]

    6. A ‘due diligence question’ of Nisha:

      Q: Is there— I’m curious. Some of the people I’ve talked to have talked about all the work they do to prepare or package their question that they submit. Because they’re nervous people are going to be mean or mean in the past. Do you see that or do you experience that?

    7. Nisha had worked in multiple organizations as a data engineer and noted her current organization was particularly competitive. ↩︎

    8. I introduced my Python students to rubber duck debugging when I taught the summer ‘Python Boot Camp’ to incoming graduate students at the School of Information (bringing in enough miniature rubber ducks for each of them). ↩︎

    9. See Admitting searching: [don’t talk about it] ↩︎

    10. See Extending searching: immersed in linked conversation. ↩︎

    11. Being part of a transactive memory system may be involved. For instance, Ferguson et al.’s (2015) research showed that people with access to the internet reported a lower “feeling-of-knowing” when asked questions than people without access. This might suggest that the moment someone decides to stop searching and start to ask a colleague, they may shift into a different way of assessing their knowledge and gain confidence enabling a renewed search. Smith & Rieh (2019) provide an overview of several studies showing how web search is used as a transactive memory. ↩︎

    12. Some in coding-related work refer to this as the XY Problem (What Is the XY Problem? - Meta Stack Exchange, 2010). ↩︎

    13. There is a stream of research on ‘advice networks’ (see Leonardi (2007) for an overview; he remarks that “the research on technology and advice networks is sparse”) that also discusses the role of status in asking and answering questions. See also Shorey et al. (2021) for a discussion of the “participatory debugging”—or “collaborative technical troubleshooting”—of young people in online environments. ↩︎

    14. A data engineer, Christina, raised this concern in a member check. ↩︎

    15. A desire for secrecy in search practices is discussed in the next chapter, under Secretive searching. ↩︎

    16. There is much to learn that can only be learned at an unwelcome cost to the learner’s time and attention. Bhatt & MacKenzie (2019) say it can be good practice to be “highly selective in the things we know or seek to know in order to remain epistemically functional, particularly now that most of us are almost exclusively immersed in information-dense digital environments” (p. 306). ↩︎

    17. Cohen uses “semantic discontinuity” to refer to “gaps and inconsistencies within systems of meaning, and to a resulting interstitial complexity that leaves room for the play of everyday practice”. ↩︎