Written by Natalie Cramp
First posted on HRreview
With the dust still settling on 2020, it’s difficult to know what all the long-term impacts on the business world will be. What we do know is that remote working is likely to remain widespread and that, following last summer’s protests, equality, diversity and inclusion will stay near the top of the corporate culture agenda.
For HR teams, these issues pose a range of complex challenges. How do you accurately monitor, assess and safeguard the wellbeing of employees who are no longer in the same office, and may not even be in the same country? Similarly, how do you devise a recruitment, retention and promotion system that ensures a level playing field and provides more opportunities to underrepresented groups? The answer could be provided by another trend that emerged in 2020: the rise in prominence of data science.
For the uninitiated, data science is all about extracting knowledge and insights from structured and unstructured data. You may not realise it, but you experience the outputs of data science every day – the Netflix recommendation engine being a prime example. It takes complex behavioural data, such as the types of shows you watch and how long you watch them for, and runs it through an algorithm to predict what else you might be interested in. Throughout 2020, data science played a key role in predicting the spread of the coronavirus and the public’s potential response to it. It was also, less illustriously, involved in helping to determine A-Level and GCSE grades (more on that later).
So what does any of this have to do with HR? Well, HR is a profession that uses a lot of structured and unstructured data to hire and assess the performance and morale of staff. A huge number of factors are taken into account, from dry statistics on productivity and qualifications to the views of colleagues and managers.
All of this information is applied and contextualised by the current business situation and commercial needs – not to mention the individual preferences, ambitions and expectations of the staff member or candidate. Perhaps without realising it, HR professionals are performing incredibly complex analysis to make decisions or recommendations. Naturally, with so many moving parts and, let’s be honest, a lot of subjectivity at play, it is difficult to always make the right and fair decision. A more data-driven approach offers a way to counter this subjectivity by creating a more objective, fair and all-encompassing approach to HR.
Many of the above assessment factors are predicated on HR people, team members and managers interacting with each other in person on a daily basis. With remote working, the HR function and line managers are essentially shorn of a huge number of data points. It is much harder to know how well someone gets on with their colleagues, presents at a meeting, adds to the culture of a company or provides useful insights if you are limited to observations via Slack or Zoom. The only real answer is to gather every data point that is available and apply a consistent, objective methodology from which these traditional factors can be predicted or inferred. In short, the insights generated by using data science techniques can provide the answer. And this is where these seemingly disparate trends of remote working and inclusion intertwine.
Numerous studies and reports have been conducted into unconscious bias in HR and recruitment. One of the best-known pieces of research, which I’m sure many of you are aware of, shows how changing BAME-sounding names on a CV to names more commonly associated with white people resulted in a higher success rate.
However, bias comes in a huge range of forms. For example, according to the CIPD, 51 per cent of HR professionals were found to be biased against overweight women – and were unaware of it. Another form of bias is the so-called “beer test” – the tendency to like and reward people who seem part of the ‘in crowd’. ‘Confirmation’, ‘affinity’, ‘similarity’ and ‘attribution’ bias, along with the ‘horn’ and ‘halo’ effects, are all revealed and analysed in a huge tranche of reports into why diversity and inclusion are a challenge for HR. We do not need to delve into each type of bias to know that what unites them is the subjective, human factor in HR decision making. This is not to say that HR workers are themselves prejudiced, just that we all have unconscious biases, and the structures and cultures in which HR teams work can inadvertently influence decisions – decisions that seem to disproportionately affect underrepresented groups.
The commercial imperative of giving HR teams more data tools to manage a workforce presents an opportunity to remove this unconscious bias by applying a more analytical approach to HR. This will mean creating systematic processes for collecting data points on individual workers or candidates, and then using data science to create algorithms that will provide the answers needed to make decisions. Excitingly, data science isn’t restricted to numerical inputs: 360 reviews and other assessments can also be analysed and incorporated, as can wider, big-picture factors such as the commercial considerations of the business. Not only will this make decisions fairer – it will also make them faster.
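To make this concrete, here is a minimal sketch of what such an algorithm might look like. Every field name, weight and number below is invented for illustration – a real system would be designed with HR professionals and validated carefully – but it shows the key property of the approach: the criteria and their weights are explicit and auditable, rather than hidden in someone’s head.

```python
def score_candidate(record: dict, weights: dict) -> float:
    """Combine normalised assessment metrics into a single weighted score.

    Both the metrics and the weights are written down explicitly, so they
    can be inspected, debated and audited - unlike a gut-feel judgement.
    """
    return sum(weights[metric] * record[metric] for metric in weights)

# Hypothetical candidate data, each metric already scaled to 0-1.
candidate = {
    "productivity": 0.82,   # e.g. output against agreed objectives
    "peer_review": 0.74,    # e.g. average 360-review rating
    "skills_match": 0.90,   # e.g. fit against the role requirements
}

# The weights encode what the business has decided matters most.
weights = {"productivity": 0.4, "peer_review": 0.3, "skills_match": 0.3}

print(round(score_candidate(candidate, weights), 2))  # 0.82
```

Because the weights are written down, a reviewer can challenge them – for instance, asking whether the peer-review scores themselves carry the unconscious biases discussed above.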
At this point, you may baulk at the thought of removing the ‘human’ aspect from HR. To be clear, however, this is not what I’m advocating at all. Indeed, HR workers and line managers need to play an essential role. This is because, although an algorithm is itself unbiased, it is only as good as the data it is fed and the parameters within which it operates. This means that HR professionals need to ensure that the factors they assess provide a level playing field and do not actually perpetuate discrimination. A very good example of this in action involved Amazon using an algorithm to both speed up recruitment and address its gender imbalance. Unfortunately, because the algorithm was trained on historic data – i.e. mostly white, male candidates who went on to perform well at Amazon – it actually overly favoured this group and continued to discriminate against women. Because Amazon had total faith in the objectivity of its algorithm, it failed to monitor it adequately, and the algorithm operated in this way for an inordinate amount of time. Similarly, we saw with the fiasco around exam results how taking into account factors such as the size of school classes or the location of the school inadvertently discriminated against students from poorer backgrounds.
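The mechanism behind the Amazon-style failure is simple enough to show in a few lines. The numbers below are entirely invented, but they illustrate the point: a model that learns only from who was hired in the past simply reproduces the make-up of those past hires.

```python
# Toy illustration: "learning" from historic hires just reproduces them.
# All numbers are invented for illustration only.
historic_hires = {"men": 900, "women": 100}

def learned_preference(group: str) -> float:
    """A naive 'hire people like our past hires' model ends up scoring each
    group by its share of historic hires - the bias is baked into the data."""
    return historic_hires[group] / sum(historic_hires.values())

print(learned_preference("men"))    # 0.9
print(learned_preference("women"))  # 0.1 - past discrimination, automated
```

No malice is required at any step: the algorithm faithfully optimises for “looks like our previous successes”, which is exactly the wrong objective when previous successes reflect a biased process.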
Therefore, for a truly data-driven HR function to work, HR professionals need to be at its heart, working hand in hand with data scientists to design it. Naturally, this does mean that they need to upskill and educate themselves on the basics of data analysis. This will help to create guardrails that ensure HR workers do not have to rely on black-box solutions, can question and verify outputs, and, crucially, can recognise how certain data points can inadvertently discriminate against certain groups.
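As one example of such a guardrail, HR teams can routinely check an algorithm’s outputs for adverse impact. The sketch below applies the ‘four-fifths rule’ often used in recruitment analytics: if any group’s selection rate falls below 80 per cent of the best-performing group’s, the result is flagged for human review. The group names and figures are invented for illustration.

```python
def adverse_impact_ratios(selected: dict, applied: dict) -> dict:
    """Compare each group's selection rate with the best group's rate."""
    rates = {g: selected[g] / applied[g] for g in applied}
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}

# Hypothetical outcomes from an algorithmic shortlisting round.
ratios = adverse_impact_ratios(
    selected={"group_x": 40, "group_y": 15},
    applied={"group_x": 100, "group_y": 60},
)

# The four-fifths rule: any ratio below 0.8 warrants human investigation.
for group, ratio in ratios.items():
    status = "FLAG for review" if ratio < 0.8 else "ok"
    print(group, round(ratio, 2), status)
```

A check like this does not replace judgement – a flag is the start of a conversation, not a verdict – but it means no one has to take the algorithm on faith.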
Whether wholesale data-driven HR will provide the silver bullet that tackles the inclusion crisis remains to be seen. What we do know is that it has the capacity to provide HR professionals with a raft of new tools to provide fairer assessment in recruiting, monitoring, developing and supporting workers. With the workplace in one of its greatest periods of transition in living memory, now is the perfect time to embrace innovation to build a more equal and hopefully happier world of work.