A wide range of tools and technologies now in use by HR organizations around the world are collecting volumes of employee-related data. Outside of HR, companies are using systems that assess employee emails for network analysis, record conference calls and video meetings, and monitor employee activities through badges. While such information can certainly be useful, it can also be a huge liability when misused. Moreover, what data a company collects, and how those data are used, significantly affect employee trust.
In this article, I will lay out the four dimensions of trust that underpin the ethical and fair use of employee-related data and analytics.
1. Privacy
Every day we read about companies that have gotten into trouble because of compromised data privacy. As we bring more and more data-collecting technology into the HR function, data privacy must be an area of focus for HR leaders. Are we protecting the privacy and confidentiality of our employees? Who has access to the various types of information collected automatically? Are employees informed about what data are collected and how they are used?
As the amount and type of data being captured keeps increasing, the issues around privacy become increasingly complex. For instance, the use of virtual reality in learning is rapidly growing. VR programs capture all kinds of individual performance data: attention span, eye movement, one's ability to deal with stress. Such data can certainly be useful for training and for assessing job fit, but they could also be misused if not kept private.
I see two equally important components to privacy. The first is establishing rigorous privacy policies that govern all employee-related data across the enterprise. The second is being completely transparent with employees about what data are collected, how they are collected and how they are used.
2. Security
The sister of privacy is security. Are the data stored in a way that protects them from hacking, theft or misuse? Do you have password policies, encryption and other data-protection practices in place so an employee can’t take the data home, send them to a third party or accidentally release them onto the internet?
While IT has long dealt with the security of financial, customer and competitive information, companies now must implement security practices around employee information such as pay, job history, healthcare data and performance.
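To make the encryption point concrete, here is a minimal sketch of protecting an employee-data export at rest, assuming Python and the third-party cryptography package (pip install cryptography). Key management, such as storing the key in a secrets vault and rotating it, is deliberately out of scope.

```python
# Minimal sketch: encrypt an employee-data export at rest so a copied
# file is unreadable without the key. Assumes the "cryptography" package.
from cryptography.fernet import Fernet

# In practice the key would live in a secrets manager, never next to the data.
key = Fernet.generate_key()
fernet = Fernet(key)

payload = b"employee_id,salary,performance_rating\n1042,98000,exceeds"
encrypted = fernet.encrypt(payload)    # ciphertext safe to store or transfer
decrypted = fernet.decrypt(encrypted)  # requires the key; fails loudly if tampered with

assert decrypted == payload
```

Controls like this do not replace password policies or access controls, but they mean a lost laptop or misdirected file does not automatically become a breach.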
3. Bias
The newest and most complex problem related to people analytics is bias. Whether you are analyzing the data yourself or buying an AI tool from a vendor, it is important to remember that all algorithmic systems learn from existing data. If the existing data are biased, the predictions and recommendations they generate will be biased too.
Eliminating bias is very difficult. The following are examples of how AI-based technology can actually reinforce or introduce bias:
- A system may compare an employee’s pay to that of peers, but may ignore factors such as race, location and age when assessing fair pay.
- Systems that predict retention may inadvertently discriminate against minorities or other groups who leave the company because of corporate culture.
- Assessments for job fit may institutionalize old, discriminatory hiring practices that are embedded into hiring history.
- Systems that use organizational network analysis to assess performance may not factor in the role gender or age can play in work relationships.
- Identification of high performers could be biased toward individuals previously rated highly.
To counter such built-in bias, you need to monitor and retrain your analytics systems. In other words, look at the predictions and recommendations they are making, and inspect the results for bias.
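As an illustration of what such an inspection might look like, the sketch below applies the four-fifths rule, a common screening heuristic: if any group's rate of favorable predictions falls below 80 percent of the highest group's rate, the result deserves investigation. The column names here are hypothetical stand-ins for whatever fields your own system produces.

```python
# Minimal sketch of one bias check on a system's predictions, using pandas.
# Column names ("group", "predicted_high_performer") are hypothetical.
import pandas as pd

def disparate_impact(df: pd.DataFrame, group_col: str, outcome_col: str) -> pd.Series:
    """Favorable-outcome rate per group, scaled by the highest group's rate.
    By the four-fifths rule, values below 0.8 are a red flag worth investigating."""
    rates = df.groupby(group_col)[outcome_col].mean()
    return rates / rates.max()

predictions = pd.DataFrame({
    "group": ["A", "A", "A", "B", "B", "B", "B"],
    "predicted_high_performer": [1, 1, 0, 1, 0, 0, 0],
})
print(disparate_impact(predictions, "group", "predicted_high_performer"))
# Group A scores 1.00, group B about 0.38: well below 0.8, so investigate.
```

A passing check is not proof of fairness, of course; it is one of several audits worth running every time a model is retrained.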
4. People Impact
How the use of analytics impacts employees is perhaps the most important dimension. If employees believe they are being monitored for the wrong reasons, the impact will certainly be negative. Therefore, your team should document how the data collected by every system that captures employee-related data are used; one way to keep such a register is sketched after the list below. In particular:
- Do not use monitoring data to surreptitiously inform performance reviews. Such activities will damage your employees’ sense of trust and almost always lead to poor decisions.
- Do not use any form of wellbeing data for succession planning, performance reviews, coaching or any other form of employee evaluation.
- Do not use training data for performance evaluation. Doing so not only reduces trust but could put you in legal jeopardy.
- Do not cross boundaries between personal and professional data.
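One way to honor these boundaries in practice is to keep a machine-readable register of every data-collecting system, as mentioned above. The sketch below is illustrative only; the fields and example values are assumptions, not a standard.

```python
# Hypothetical sketch of a data-usage register entry; adapt the fields
# and values to your own systems and policies.
from dataclasses import dataclass

@dataclass
class DataUseRecord:
    system: str                # which tool collects the data
    data_collected: list[str]  # categories of employee data captured
    approved_uses: list[str]   # purposes employees have been told about
    prohibited_uses: list[str] # explicit red lines, per the list above
    access: list[str]          # roles allowed to see the raw data

badge_system = DataUseRecord(
    system="Badge readers",
    data_collected=["entry/exit times", "location zone"],
    approved_uses=["building security", "space planning"],
    prohibited_uses=["performance reviews", "attendance discipline"],
    access=["facilities team", "security team"],
)
```

A register like this makes the gap between what employees were told and what a system is actually used for immediately visible, and it gives legal and ethics reviewers something concrete to audit.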
In fact, it would be wise to do a legal review before you start capturing data, to ensure you are adhering to GDPR and HIPAA guidelines, as well as other confidentiality protections. Focus your analytics program on strategies that positively impact people. If you're tracking productivity in order to make work better, then you're moving in the right direction. If you're using the data to weed out low performers, you're probably violating your company's management principles.
Bottom Line: Use Good Sense
More and more companies are hiring chief ethics officers and other staff to help with analytics projects. Every time you start a new analytics program, just ask yourself, "How would it look if this program appeared on the front page of the New York Times? Would it damage the company's reputation or enhance it?"
My overarching recommendation is to use recent consumer experiences with data as a guide. Companies that have exposed massive amounts of consumer data have suffered serious reputational and financial damage. Today, trust is one of the most important business assets we have. Take it seriously, and make sure your efforts to make management more data-driven move in the right direction.