Information Power Plays

A big shift in my AI exploration is moving away from the technical aspects of AI and toward the implications for organizational dynamics. Consider the information power dynamics at play in each of these situations.


Example 1: A few months ago, a nonprofit executive told me about a long-tenured direct report—the lead data analyst on her team—who refused to provide transparency into her data. If the executive asked for information on a particular topic, the analyst would send a PDF containing only the information needed to answer the question, omitting details about the underlying analysis or access to the data. When pressed, the analyst said that respecting her position meant recognizing her ownership of the data.

Example 2: A former colleague at Capital One once described feeling confused in meetings during his first few weeks. He later realized that much of his confusion stemmed from people using jargon rather than plain language when describing analyses. They'd say something like, “We ran a cohort-based segmentation analysis to assess differential outcomes across customer subpopulations,” when they really just meant, “I separated customers into groups and compared outcomes.” In this case, people were sharing their data and analysis openly, but the language made it less accessible nonetheless.

Example 3: In No Sense of Place, Joshua Meyrowitz provides an example from the world of auto repair. He writes, “auto parts are often marked with code numbers that must be ‘broken’ with a code book; mechanics sometimes use one catalogue to decode the part number, another to check its list price and repair shop discount, and yet another to locate a dealer.” By protecting information and “encoding it in jargon,” the specialist can protect his status. Meyrowitz compares the mechanic to the surgeon who brings her car to the auto shop. The surgeon may have a higher social status, but the mechanic’s privileged information gives him power in their interaction. 

AI complicates each of these dynamics in interesting ways. 

First, it provides tools for decoding specialized language. Anyone can ask ChatGPT to translate and summarize a specialist’s technical report, leveling the power that comes with the specialist’s differential knowledge. That’s good for the group’s ability to share ideas and have robust conversations, but not so great for the specialist.

Moreover, maximizing the benefits of AI likely requires organizations to build systems that make their data as discoverable and usable as possible. They’d store data where almost everyone can access it. They’d mandate that people describe what the data means in plain language. And they’d adopt policies close to “share data unless given a reason not to,” rather than allowing an “only share when asked” regime. Those policies also enable the group while weakening individuals’ power. They create a paradigm in which individuals may be responsible for stewarding data, but no one “owns” any data or has power based on a unique ability to access or interpret it.

These kinds of changes may also challenge individuals’ professional identities. What does it mean to be “the finance guy” if everyone in the organization can access finance data anytime they want and leverage AI to analyze it? What does it mean for the “great presenter” when everyone else in the meeting arrives having already asked their favorite AI tool to create a five-minute audio clip summarizing the meeting materials? It’s not simply a job security issue; it’s a status issue. 

These power dynamics of AI adoption have been on my mind because I’ve been talking with leaders who are wrestling with the best way to drive AI adoption. One of the key questions is whether they should focus on getting everyone to use AI or take a targeted approach, empowering enthusiasts and working around skeptics. There’s a logical approach to the choice based on what they’re trying to accomplish and the individuals involved. But if leaders don’t attend to the complex questions of power and emotion, they're more likely to misread the resistance they encounter and be underprepared to respond to it.
