As You Embrace Predictive Analytics, Consider These Issues

Nov 20, 2017

Think, in the most personal sense, about the data you share at work. You share with your employer, and sometimes with potential employers, many aspects of your life: details about your professional path, your personal status, your health care, and your socioeconomic, legal, and geographic background. You may also share information about what you do at different times and places, whom you meet, what information you consume, and so on. Moreover, you leave digital footprints on the web, on social networks, and in various apps, where the data reveals a lot about you to employers.

Have you ever considered how data might affect you at work? How does your employer actually use the data about you? What is your employer allowed to do with your data, and what crosses a red line, in terms of ethics and regulation?

AI (Artificial Intelligence) and ML (Machine Learning) are two buzzwords that dominate the HR tech world today. We don’t yet know whether this field is a bubble or whether it will have a huge influence on management practices. Nevertheless, the expectation is that because of AI, managers will make better, more informed workforce decisions. Predictive algorithms may enable them to fit candidates to jobs more accurately, or flag which employees are a flight risk.

The benefits to an organization are many. But from the perspective of employees, how concerned should they be about this?

Should Employees Worry?

If you have tried to land a job lately, perhaps you had a video interview or were asked to play some mobile games. These technologies, which probably offer you a pleasant candidate experience, actually enable organizations to predict your performance in certain roles, essentially by comparing your results to those of high and low performers in the same roles.
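
To make that comparison idea concrete, here is a minimal sketch of one way such a prediction could work. The features, numbers, and nearest-centroid rule are my own illustrative assumptions, not any vendor’s actual model:

    import numpy as np

    # Hypothetical game-derived features per person:
    # [reaction_time_score, risk_taking, memory_score]
    high_performers = np.array([[0.8, 0.4, 0.9],
                                [0.7, 0.5, 0.8],
                                [0.9, 0.3, 0.85]])
    low_performers = np.array([[0.4, 0.7, 0.5],
                               [0.3, 0.8, 0.4],
                               [0.5, 0.6, 0.45]])
    candidate = np.array([0.75, 0.45, 0.8])

    # Nearest-centroid rule: whose average profile is the candidate closer to?
    dist_high = np.linalg.norm(candidate - high_performers.mean(axis=0))
    dist_low = np.linalg.norm(candidate - low_performers.mean(axis=0))
    print("Predicted:", "high performer" if dist_high < dist_low else "low performer")

Real vendor models are far more elaborate, but the underlying logic is the same: your results are scored against profiles of people who already succeeded or failed in the role.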

As a candidate, you’ll probably consent to these practices, even though you don’t know exactly what data these machines collect about you or what secret predictive model they use behind the scenes.

I’m not saying that predictive models are bad. On the contrary, I believe that, in general, a machine that fits the right person to the right job, and does so better than a human whose perceptions may be biased, is a positive thing. And not only for organizations but also for employees, since they may have a better chance to thrive in the right role. However, anyone with some general knowledge of ML can point to a confusion matrix and demonstrate that algorithms are not perfect, or more precisely, just how imperfect they are.
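
A confusion matrix makes the imperfection visible by counting each kind of mistake. Here is a minimal sketch with invented labels (in practice these would come from a validation set of real hiring outcomes):

    from sklearn.metrics import confusion_matrix

    # 1 = high performer, 0 = low performer
    actual    = [1, 1, 1, 0, 0, 0, 1, 0, 1, 0]
    predicted = [1, 0, 1, 0, 1, 0, 1, 0, 0, 0]

    tn, fp, fn, tp = confusion_matrix(actual, predicted).ravel()
    print("True positives (high performers the model caught):", tp)
    print("False negatives (high performers it wrongly rejected):", fn)
    print("False positives (low performers it wrongly recommended):", fp)
    print("True negatives:", tn)

The false negatives are the sobering row: qualified people the machine screened out, who will likely never know why.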

Garbage In, Garbage Out

There are many technical and statistical reasons for this, but the one that concerns me, in this context, is the possibility that human biases affect seemingly unbiased machines. The promise of ML and AI was that the more information we feed these sophisticated computer algorithms, the better they perform. Unfortunately, when the input data reflects the history of an unequal workplace, we are, in effect, asking a robot to learn our own biases. Garbage in, garbage out, right?

Such unfortunate effects can easily occur in the workplace. For instance, if an analyst looks at everyone who was promoted in the organization over the last decade and uses their data to predict high performance, the result might be a model that excludes minorities. Why? Because minorities may have only rarely been promoted in the past due to social biases or discrimination. This example may be extreme, but it illustrates other, more subtle potential occurrences.
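
The mechanism is easy to demonstrate. In this purely synthetic sketch, promotions in the training data depend on skill but were withheld from a minority group; a standard classifier trained on that history learns the discrimination as if it were a real signal:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 1000
    skill = rng.normal(size=n)             # genuine, job-relevant signal
    minority = rng.integers(0, 2, size=n)  # 1 = member of a minority group

    # Synthetic biased history: promotions tracked skill,
    # but minorities were almost never promoted regardless of skill.
    promoted = ((skill > 0.5) & (minority == 0)).astype(int)

    model = LogisticRegression().fit(np.column_stack([skill, minority]), promoted)
    print("weight on skill:   ", round(model.coef_[0][0], 2))  # positive, as expected
    print("weight on minority:", round(model.coef_[0][1], 2))  # strongly negative: the learned bias

The model’s large negative weight on group membership is not a bug in the algorithm; it is a faithful summary of a biased past, which is exactly the problem.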

Employee Data and Privacy

Defense (and self-defense) starts with awareness. Indeed, awareness of data protection and privacy is increasing and influencing society in general, particularly through regulation. Employee rights are broadening these days to include workforce data, although not evenly in every corner of the world. In the EU, a new privacy regulation, the General Data Protection Regulation (GDPR), was recently published (and becomes enforceable on May 25, 2018). It has serious implications for every company doing business in the EU.

Among many other provisions, the GDPR gives employees specific rights to control their personal data, e.g., extended access rights and the right to be informed about how their data is used, where it is transferred, and how long it is stored. As a consequence, a new organizational stakeholder is emerging in Europe: the Data Protection Officer (DPO), who will be involved in analytics projects that use employee and customer data.

In my opinion, however, compliance with the GDPR is only a starting point. It will surely make HR analytics teams more sensitive to privacy issues. I believe it will also influence employees’ behavior, and HR analytics practitioners will have to respond: when people start exercising their rights and requesting access to their data, people analysts must be prepared to give them comprehensive information about how their personal data is being used.

When employees start asking to correct or erase their data, employers will demand more transparency and security from HR software providers. Organizations will have to ensure that they process only the personal data necessary for the specific purpose they wish to accomplish, which will require long-term planning and more serious consideration. This will move the field of people analytics forward. The implication for employees and candidates: transparency!

Changing Nature of HR Analytics

I believe that eventually, even if it takes a few years, the people analyst role will take on some aspects of procurement. Analysts will do less programming and instead become more expert in HR tech and analytics solutions. Because of increasing regulation, they will be the ones asking vendors hard questions, and they will become more astute about model accuracy and data privacy. They’ll therefore contribute not only to creating a data-driven culture, but also to promoting data safety. Employees and candidates, for their part, will judge employers not only by the employee experience but also by their ethics in data management, and will decide to what extent they’ll participate and cooperate in allowing machines and algorithms to influence their career paths.
