Research: What Companies Don’t Know About How Workers Use AI

Leaders who are exploring how AI might fit into their business operations must not only navigate a vast and ever-changing landscape of tools; they must also facilitate a significant cultural shift within their organizations. But research shows that leaders do not fully understand their employees’ use of, and readiness for, AI. In addition, a significant number of Americans do not trust businesses’ use of AI. This article offers three recommendations to help leaders find the right balance of control and trust around AI: measure how employees currently use AI, cultivate trust by empowering managers, and adopt an AI strategy led by the company’s purpose rather than a rules-heavy strategy driven by fear.

If you’re a leader who wants to shift your workforce toward using AI, you need to do more than manage the implementation of new technologies. You need to initiate a profound cultural shift. At the heart of this cultural shift is trust. Whether the use case for AI is brief and experimental or sweeping and significant, a level of trust must exist between leaders and employees for the initiative to have any hope of success.

The topic of trust surfaced in three Gallup studies conducted in 2023, each presenting different perspectives on important trends in AI adoption: the CHRO Roundtable Survey of large company (average size 80,000 employees) chief HR officers (CHROs); the Gallup Quarterly Workforce Study of nearly 19,000 U.S. employees and leaders; and the Bentley-Gallup Business in Society Report.

What follows are three key insights that we learned from this research.

1. Leaders Do Not Fully Understand Their Employees’ Use of, and Readiness for, AI

Gallup asked its global CHRO roundtable members, whose departments support most culture transformations, how often their company’s employees were using AI to do their jobs. Was it daily? Weekly? Monthly? Annually? Close to half (44%) of these leaders did not know.

This blind spot is a major factor in the erosion of trust between leaders and employees. In fact, it’s prompting many leaders to deploy a rules-heavy approach, rather than a purpose-led one, to control AI usage more tightly. After all, you can’t properly manage what you don’t understand or can’t measure.

Gallup’s Workforce Study discovered that most U.S. employees — seven in 10 — never use AI in their job, with only one in 10 saying they use AI on a weekly basis or more often.

[Chart: A Gallup study of nearly 19,000 U.S. employees and leaders found that seven in 10 workers never use AI in their work, while only one in 10 use AI on a weekly basis or more. Source: Gallup Q2 Quarterly Workforce Study (May 2023)]


Among the 10% of employees who frequently use AI, four in 10 use it to carry out “routine tasks,” three in 10 to “learn new things,” and one-quarter to “identify problems.”

[Chart: When asked how they currently use AI in their jobs, 43% of employees in a Gallup study of nearly 19,000 U.S. individuals say they use it to carry out routine tasks, 32% to learn new things, 25% to identify problems, 19% to set up, operate, or monitor complex equipment or devices, 15% to interact or transact with customers, and 11% to collaborate with coworkers. Source: Gallup Q2 Quarterly Workforce Study (May 2023)]


As Gallup data show, leaders are often unaware of when and why their employees use AI. This knowledge gap places leaders in a precarious position: managing the unknown rather than leveraging what they know.

Imagine a captain navigating a ship through uncharted waters. Without accurate maps or knowledge of the currents, the captain must rely on intuition and guesswork. Similarly, leaders find themselves steering their teams, and the culture of the company, without a clear understanding of what AI adoption currently looks like within their organization. This blind spot hinders effective decision-making and undermines trust between leaders and employees.

Not only are many leaders unaware of how their employees use AI, but they also harbor doubts about their employees’ readiness for such transformative technologies. Gallup data show leaders believe their employees are far less ready to use AI than employees themselves feel they are.

Employees stand on one side of this divide, fairly confident in their ability to wield AI technologies effectively. According to the Workforce Study, nearly half (47%) of employees feel prepared to use AI in their roles, yet only 16% of CHROs perceive their workforce as ready for AI adoption. This is a significant perception gap.

Here lies the crux of the matter: Some employees will use AI tools independently, regardless of their leaders’ assessments of overall company readiness. If leaders remain skeptical about AI, then employees may forge ahead without their guidance. Organizations must bridge this perception gap as quickly as possible, actively working to foster trust and ensure that both leaders and employees are navigating the AI landscape together.

The net effect of these differences in perception around AI is that many leaders are implementing stricter controls over AI usage. More than half of the CHROs (57%) in Gallup’s global roundtable say that their organization has implemented safeguarding policies for using AI. This is not in and of itself a bad thing — after all, leaders have a duty to manage how and where AI is deployed and ensure it is used responsibly to benefit the organization and its customers.

But starting with barriers to AI usage sends employees a mixed message: On one hand, leaders may be keen to promote a culture of agility, collaboration, and innovation. Yet, when these cutting-edge tools become accessible to the broader workforce, leadership’s instinct shifts toward skepticism, control, and protective measures.

When fear-based, rules-heavy strategies take root, innovation can be inadvertently stifled. What begins as reasonable safeguarding can curb the very creativity leaders seek.

2. Americans Do Not Trust Businesses’ Use of AI

In the 2023 Bentley-Gallup Business in Society Report, we discovered that only 10% of U.S. adults think AI does more good than harm. The report also found a staggering 79% of U.S. adults report low or no trust that businesses will use AI responsibly.

This skepticism underscores a profound trust deficit around AI at a societal level. When individuals envision AI’s impact, their minds often gravitate toward the negative. Fear and apprehension color their perceptions. Why? Because the unknown looms large. AI, with its transformative potential, remains an enigma — an uncharted territory where both promise and peril coexist.

It comes as no surprise that the report found 75% of U.S. adults believe AI will reduce the overall number of jobs in the next decade. The fear of automation displacing human workers is real. Incidentally, many leaders have this same fear but think it will happen even sooner — 72% of CHROs in the roundtable strongly agree that AI will lead to job reductions at their organizations within the next three years.

3. There Is Common Ground to Build Trust

Despite these concerning findings, there is a brighter side to the story. CHROs in Gallup’s roundtable overwhelmingly believe AI technologies will drive productivity, enhance creativity and innovation, and enable their organizations to operate with greater efficiency. Nearly all (93%) CHROs anticipate that AI will reduce workloads and 61% foresee that AI adoption will enable employees to spend more of their time on strategic activities.

Some employees are also optimistic about AI’s potential: Roughly four in 10 white-collar and Millennial employees believe AI could help improve how their work gets done.

[Chart: Data from a Gallup study of nearly 19,000 U.S. individuals. By job role, 39% of leaders, 38% of project managers, 36% of managers, and 30% of individual contributors feel AI could be used in their current role to improve their work. By job category, 42% of white-collar workers believe AI can improve their work, followed by 27% of healthcare/social assistance workers, 25% of production/frontline workers, and 25% of administrative/clerical workers. By generation, 42% of Millennials (born 1980 to 1996) believe AI can improve their work, followed by 38% of Gen Z (born 1997 or after), 25% of Gen X (born 1965 to 1979), and 20% of Baby Boomers (born 1946 to 1964). Source: Gallup Q2 Quarterly Workforce Study (May 2023)]


Even though Gallup found a significant portion of CHROs and U.S. adults believe that AI will reduce jobs in the coming years, Gallup’s Quarterly Workforce Study — which polled leaders and employees more broadly — found only 12% of U.S. business leaders and 14% of U.S. employees believe the job they have now will be eliminated within the next five years due to new technology, automation, robots, or AI. Because U.S. employees make up the majority of this dataset, that 14% segment is visible in the figure below — just 11% of workers feel their current job is “somewhat likely” to be eliminated, while only 3% of workers feel their job is “very likely” to be eliminated.

These data imply that when employees focus on the known (their own jobs) rather than the unknown (what might happen to someone else), there is perhaps less to fear and more to look forward to when it comes to AI adoption. These data also represent common ground on which to build trust around AI adoption.

[Chart: When nearly 19,000 U.S. individuals were asked how likely it is that the job they have now will be eliminated within the next five years as a result of new technology, automation, robots, or AI, 58% said not at all likely, 27% not too likely, 11% somewhat likely, and 3% very likely. Source: Gallup Q2 Quarterly Workforce Study (May 2023)]


Results from the Workforce Study also show that when workers feel adequately prepared to embrace AI, they are 67% more likely to believe it could improve how their work gets done. Nearly half (47%) of all employees in the Workforce Study feel this sense of preparedness, reporting they’ve been adequately trained to use AI technologies. For this cohort, a solid foundation for trust between leaders and employees is beginning to take root.

How to Strike a Balance Between Control and Trust

Given recent findings, how do leaders properly guide the cultural shift that’s needed to successfully adopt AI into their organization’s workflows? How do they achieve the optimal balance between control and trust with respect to employees’ usage of AI? Here are three recommendations:

1. Measure and manage AI usage across your organization.

Expand what you know about the current state of AI usage in your organization. Gather information about which AI technologies and applications are actively deployed, and how employees are using them to do their jobs. Measure things like AI usage frequency and AI tool effectiveness throughout your organization — this will help paint a complete picture of how AI is already being leveraged by your teams. Managing the knowns (not the unknowns) may reveal a real need for safeguarding, or it may reveal a need for greater empowerment. The goal is to remove blind spots rather than manage through them.

2. Create mutual trust by empowering managers.

Managers account for 70% of the variance in team engagement, and they can ensure that your AI strategy supports the organization’s goals and expectations in terms of innovation, agility, and productivity. Managers are also in the best position to identify where applications of AI can help their teams work more efficiently or achieve better results, as well as what training and technical support is needed.

While 47% of workers in the Workforce Study feel adequately prepared to work with AI, about 53% of workers feel unprepared and say they need more training. Therefore, it’s essential that leaders actively engage with managers to understand where their teams are doing well in terms of adequate AI training and where they need more support. Are managers aware of the company’s AI training programs and other resources that may help employees understand how to deploy AI to do their work effectively? Do managers have frequent team meetings and one-on-one conversations to identify specific development needs and discuss how AI technologies can be used in various roles?

3. Use a purpose-led AI strategy, not a rules-heavy one.

Companies tend to perform better when they can establish a meaningful connection between their purpose and their employees. Gallup research shows that just a 10% improvement in employees’ connection with the mission or purpose of their organization leads to a 33% improvement in quality of work, an 8% decrease in employee attrition, and a roughly 4% increase in profitability. An AI strategy that is aligned with and driven by the organization’s purpose, rather than by a more fear-based and rules-heavy approach, will have a better chance of delivering the desired outcomes of efficiency and effectiveness.

• • •

There is a notable gap in perception between what leaders know about their employees’ usage of and readiness for AI, and reality. This gap underscores the urgent need for leaders to prepare their cultures for the imminent AI revolution. The workplace of the future is here. Leaders need to be well informed about what’s really happening in their organizations in order to cultivate trust across the company and develop AI strategies that align with high-level goals.


Jeremie Brecheisen
Jeremie Brecheisen is a partner and managing director of The Gallup CHRO Roundtable.

Illustration by Debora Szpilman