AI ethics: How far should companies go to retain employees?
This article was originally published in the Globe and Mail on June 12, 2018
Canada, we have a problem.
Our companies are having trouble retaining their employees. According to a study by LinkedIn, Canada has the fourth-highest employee turnover rate, at 16% (versus a global average of 11%).
As companies struggle to predict how likely their employees are to leave, organizations are turning to data at the source: employees themselves. While organizations are legally allowed to monitor our internet, email and instant-messaging content, AI-driven sentiment analysis now takes that monitoring to a new level.
With the GDPR putting privacy and data regulation in news feeds around the world, a multitude of questions arise. How much data should companies be able to capture about their employees? To what extent are employees aware of the monitoring?
Broadly speaking, how far should companies go to retain their employees and where do ethics in AI fit in?
Access employee insights through sentiment analysis
One of the predominant workplace communication tools is Slack, which has grown to 70,000 paying organizations and 8 million daily active users, competing directly with Microsoft’s Skype for Business and Facebook’s Workplace.
Enter Vibe, a Slack plug-in that brands itself as a morale meter for teams.
Vibe applies natural language processing (NLP) algorithms to scan messages across individuals, teams and the company. It identifies five key emotions (happiness, irritation, disapproval, disappointment and stress) to gauge the overall mood in the workplace. By analyzing employee sentiment on public channels through keywords and emojis, Vibe can help companies identify employees who may be flight risks.
Vibe isn’t the only one though.
Emotion detection company Affectiva offers Emotion as a Service, which analyzes images, video and audio to provide facial and vocal emotion metrics. Veriato tracks workplace productivity and insider threats by monitoring internet and email content.
With a wide range of sentiment analysis available, organizations can keep track of employee stress levels through their use of positive or negative words.
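To make the idea concrete, here is a minimal sketch of keyword- and emoji-based sentiment scoring of the kind described above. This is not Vibe's actual algorithm (commercial tools use trained NLP models); the word lists and weights are purely illustrative.

```python
# Crude lexicon-based sentiment scoring: count positive tokens minus
# negative tokens per message, then average across a channel.
# The lexicons below are illustrative assumptions, not a real model.

POSITIVE = {"great", "thanks", "awesome", "love", ":)"}
NEGATIVE = {"frustrated", "blocked", "stressed", "annoyed", ":("}

def sentiment_score(message: str) -> int:
    """Return positive-token count minus negative-token count."""
    score = 0
    for token in message.lower().split():
        word = token.strip(".,!?")  # drop trailing punctuation
        if word in POSITIVE:
            score += 1
        elif word in NEGATIVE:
            score -= 1
    return score

def team_mood(messages: list[str]) -> float:
    """Average sentiment over a list of channel messages."""
    if not messages:
        return 0.0
    return sum(sentiment_score(m) for m in messages) / len(messages)

messages = [
    "Thanks for the help, this is awesome!",
    "I'm so frustrated, the build is blocked again :(",
]
print(team_mood(messages))  # -> -0.5
```

Real systems replace the hand-built lexicons with trained models and handle negation, sarcasm and context, but the underlying signal is the same: the balance of positive and negative language over time.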
If managers can track their employees a bit better, perhaps they can make a concerted effort to reach out and check in when necessary. After all, understanding employees better is the first step to improving employee retention.
Sentiment analysis is not new
Personal journal reflection sites like 750 Words have provided NLP insights for users who write daily as an introspection tool. In the background, 750 Words’ software examines keywords and phrases to identify a writer’s mindset, feelings and concerns.
The insights from 750 Words are extremely valuable for users to take an outside view of their own thoughts. The application of 750 Words is intentionally transparent; users seek to better understand themselves.
However, the contrast between 750 Words, Vibe and Veriato lies in the privacy and use of data.
With applications like Vibe and Veriato, users may not fully understand that their messages are being collected and analyzed, and employers recognize that this carries risks.
In Deloitte’s Global Human Capital Trends for 2018, ‘six out of ten [respondents] said that they were concerned with employee perceptions of how their data is being used.’ If used recklessly, companies risk fallout such as public backlash over data abuse or privacy violations playing out in a PR spotlight.
Ethics in workforce analytics
There is no denying that employers must better understand employees in order to better support their work lives. However, workforce analytics programmers should take three things into account in applying artificial intelligence: data collection/analysis, data use and data security.
Data Collection/Analysis: AI programs are highly dependent on the type of data that is collected, how that data is cleaned and how algorithms are coded. As a result, there is a high level of reliance on the programmers who build them. When using third-party software, companies should be conscious of personal biases and put controls in place around data collection and analysis. To what extent should companies have to ask for permission to collect and analyze data? How much data should they be allowed to collect?
Data Use: Ginni Rometty, IBM’s CEO, took a stand last year in discussing IBM’s use of data, promising transparency in terms of when, where and how AI insights will be used. Transparency for employees is a great start to having corporate policies across the board ensuring proper use of data. To what extent are employees aware of tracked information? How is the data specifically being used to engage or retain employees?
Data Security: In light of the recent Facebook-Cambridge Analytica data scandal, companies need to be accountable for securing personal data. Although that scandal centred on the leak of user data, the same lessons apply to how employee data is stored. If third parties are analyzing company data, how can companies ensure that personal and sensitive personal data is not leaked?
The use of AI in workforce analytics should not be discounted completely. Instead, ethics should play a key role in its development and use, and companies should expect to be held accountable for its implementation.
Looking ahead
In this conversation surrounding employee retention, AI-enabled workforce analytics may be more of a distraction than a benefit.
In fact, organizations may already have the answer: a LinkedIn survey notes that 45% of employees left because they were ‘concerned about the lack of opportunities for advancement.’
When it comes to retention, perhaps companies don’t need AI after all.
They just need to pay closer attention to what their employees are already saying.
About the writer
Jay Kiew is a management consultant in the human capital space, leading organizations through change in the age of AI. He holds an MBA from the Ivey Business School and a bachelor’s degree in political science from the University of British Columbia.