How to keep employee distrust from limiting your company’s AI strategy


With one in five Americans reporting symptoms of anxiety or depressive disorders, the country is in an “unprecedented mental health crisis,” according to the White House and reporting from the CDC. While the origins of this challenge vary from person to person, some say that work-related factors are adding to mental health concerns.


Nearly nine out of 10 employees suffer from work-related fears, and most say these fears have negatively affected their jobs, according to a study of nearly 1,400 respondents conducted by resume-building solution LiveCareer.

The findings indicated that making decisions, accepting responsibility and public speaking top the list of work-related fears. These fears can easily intersect with directives to apply new technology at work, especially when the tech is promoted as groundbreaking. This is supported by new research from Gartner, which found that many employees fear one of today’s most-discussed topics: artificial intelligence.

‘Not a fear of the technology itself’

Duncan Harris, research director in the Gartner HR practice, says that this fear can be a barrier to tapping the full potential of AI strategy—for both employees and employers.

Gartner has found that the fear isn’t simply a generalized apprehension about new technology; instead, employees are anxious about outcomes and risks they believe might surface when AI is incorporated into their workday.


Harris identified the following five areas of employee distress:

  1. Job displacement due to AI making their jobs harder, more complicated or less interesting
  2. Inaccurate AI creating incorrect or unfair insights that negatively impact them
  3. Lack of transparency around where, when and how the organization is using AI, and how it will impact them
  4. Reputational damage if the organization uses AI irresponsibly
  5. Data insecurity because the implementation of AI solutions puts personal data at risk

Harris notes that such employee fears can impact engagement and performance, slowing the gains AI implementations are intended to deliver. However, employers and HR leaders shouldn’t mistake employee fear for wholesale resistance to AI adoption—people are using artificial intelligence despite their anxieties. “Employee concerns are not fear of the technology itself, but fear about how their company will use the new technology,” says Harris.

Another large-scale research project confirms that employees are enthusiastic about the benefits of AI. According to the 2024 Work Trend Index Annual Report from Microsoft and LinkedIn, “AI is being woven into the workplace at an unexpected scale.” This report, which surveyed 31,000 people across 31 countries, found that 75% of knowledge workers use AI at work today. The study indicates that users say AI helps them save time (90%), focus on their most important work (85%), be more creative (84%) and enjoy their work more (83%).

Addressing employee fears

“AI has the potential to create high business value for organizations, but employee distrust of the technology is getting in the way,” says Harris.

He found that ethics, fairness and trust in AI models are barriers business leaders face when implementing the technology in the workplace.

“Pervasive fear typically indicates a lack of open communication and the presence of unrealistic expectations,” says organizational psychologist and author Brian Smith.

He says these issues are often exacerbated by a “hierarchical structure filled with leaders who lead by intimidation and lack empathy.”

So, how to build trust?

According to Gartner’s Harris, organizational leaders can address fears with the following AI strategy practices:

Become a partner in AI education to alleviate concerns about job loss

Employees are worried about losing their jobs to AI, and even more believe their roles could be significantly redesigned due to AI. To address these concerns, offer training and development on various topics, such as how AI works, how to create prompts and use AI effectively, and how to evaluate AI output for biases or inaccuracies.

Co-create solutions with employees to reduce fears about inaccuracy

Employees are concerned that inaccuracies or biases created by AI could negatively impact their roles or performance. Companies can alleviate these fears by demonstrating how AI works, offering guidance on its potential benefits and drawbacks, and rigorously testing solutions for accuracy.

Communicate context to avoid fear of the unknown

Few organizations are fully transparent about how AI will impact their workforce. It’s not enough to simply provide information about AI—organizations need to offer context and details about the risks and opportunities shaping their AI policy. They should also explain how AI aligns with key priorities and company strategy.

Democratize accountability for AI ethics to minimize reputational risk

Organizations must formalize accountability through new governance structures to show they take threats seriously. Some companies have appointed AI ethics representatives at the business unit level. These representatives oversee the implementation of AI policies and practices within their departments.

Operationalize employee data rights to ensure privacy

Companies should establish an employee data bill of rights as a foundation for their policies. This bill of rights should outline the purpose of data collection, limit data collection to this defined purpose, commit to using data in ways that promote equal opportunity and recognize employees’ right to be informed about the data collected on them.

AI strategy starts with people

Brian Smith, Ph.D. in organizational psychology

It is crucial to remain mindful of the human experience during any AI integration, according to Smith. To foster a healthy work environment, the organizational psychologist says AI should be used to augment—not replace—human capabilities, ensuring that social connectivity and team cohesion are maintained.

A secure, confident workforce can bring benefits beyond AI projects alone, Harris says. Employees with high trust levels exhibit greater inclusion, engagement, effort and enterprise contribution.

On the other hand: “Neglecting [employee fears] carries a high risk of isolating individuals interacting with AI, leading to workplace social isolation and threatening individual wellbeing and organizational culture,” says Smith.

The post How to keep employee distrust from limiting your company’s AI strategy appeared first on HR Executive.