The Psychology of People Analytics

Attention to people analytics has increased enormously over the last few years. Many organisations have established people analytics teams, and several promising start-ups have developed software that can help HR with people analytics.
The assumption is that if we have access to the right data, if we have the right analysis tools and clever people to interpret the data, we will be able to predict human behaviour – and that these predictions will be used in a sensible way in organisations. I have some doubts.
It is time to have a closer look at the psychology of people analytics.

Inspiration

Two books were a great inspiration, and both are must-reads for HR professionals and people analytics specialists.
Number one is “Thinking, Fast and Slow” by Nobel Prize winner Daniel Kahneman. Looking at the number of copies sold, you would expect almost everybody to have read this book (or at least to have bought it). When I studied experimental psychology (from 1975 to 1981), Kahneman was already famous. I still remember the famous article he published in 1974 with his colleague Amos Tversky: Judgment under Uncertainty: Heuristics and Biases. I quote from this article: “The reliance on heuristics and the prevalence of biases are not restricted to laymen. Experienced researchers are also prone to the same biases when they think intuitively. For example, the tendency to predict the outcome that best represents the data, with insufficient regard for prior probability, has been observed in the intuitive judgments of individuals who have had extensive training in statistics”.
Number two is “The Art of Thinking Clearly” by Rolf Dobelli. His book is less scientific, but certainly a worthwhile read with many good lessons. In 99 chapters, he describes the most common thinking errors, with interesting examples.
I also used the list of cognitive biases on Wikipedia, a great and extensive list. It inspired Buster Benson to cluster these cognitive biases into categories, which he describes in his excellent article Cognitive Bias Cheat Sheet. Based on this article, John Manoogian made a very interesting and informative infographic, the Cognitive Bias Codex.

Why is scientific knowledge poorly used in organisations?

Many of the lessons I learned at university a long time ago (from Kahneman and others) are hardly applied in organisations. Two examples.
I learned that an interview is generally a very poor selection instrument. You can improve this poor instrument a bit by conducting structured interviews, and you can improve the selection process a lot by adding tests and assessments. Even so, in most organisations today, candidates are still selected based on the outcomes of a series of unstructured interviews.
The second example concerns the use of bonus systems. There is very little evidence that, for most jobs, bonus systems help to change people’s behaviour in the right direction. Still, many HR teams in many organisations spend a lot of time on the design of sophisticated bonus systems.
If scientific knowledge is poorly used, what will happen with all the analysis and recommendations of the newly established people analytics teams?
Let’s see how some of the cognitive biases are important for people analytics, and how we can improve the impact of people analytics by taking them into account.

Some cognitive biases that are important for people analytics

From the various sources, I selected 15 cognitive biases that are relevant when you are working in people analytics or with the results of people analytics. I will cover them in alphabetical order. For the short descriptions, I mainly used Wikipedia and The Art of Thinking Clearly.

1. Action bias

Description: When faced with uncertainty or a problem, particularly an ambiguous problem, we prefer to do something. In fact, we are happier doing anything, even if it is counterproductive, rather than doing nothing, even if doing nothing is the best course of action.
Implications for people analytics: There are several. The people analytics team must work fast: managers want to DO things, and they had better base their actions on facts. It also helps to implement a rigorous process of A/B testing, and to discuss and communicate that actions should be based on facts and evidence.
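To make the A/B-testing point concrete, here is a minimal sketch of a two-proportion z-test in Python. The scenario, sample sizes and retention figures are entirely hypothetical; the point is only that an observed difference between two variants should be checked against chance before anyone acts on it.

```python
# A minimal sketch of a two-proportion z-test for a hypothetical HR A/B test:
# variants A and B of an onboarding programme, outcome = retained after 6 months.
from math import sqrt
from scipy.stats import norm

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Two-sided test of whether two observed proportions differ."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)   # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    return z, 2 * norm.sf(abs(z))                    # z statistic, two-sided p-value

# Hypothetical data: 78 of 100 retained under A, 64 of 100 under B.
z, p = two_proportion_ztest(78, 100, 64, 100)
print(f"z = {z:.2f}, p = {p:.3f}")  # p < 0.05 here, so the gap is unlikely to be noise
```

Acting only when such a check supports the difference is one concrete way to replace the urge to “just do something” with evidence.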

2. Algorithm aversion

Description: People tend to trust human judgment over algorithms. Even when an algorithm consistently beats human judgment, people prefer to go with their gut.
Implications for people analytics: This is a bias that will be difficult to overcome quickly. It is also related to the ‘Overconfidence effect’. It might be wise to use algorithms inconspicuously, so that the human factor cannot intervene. If you have ever sat next to someone in a Tesla, you probably know the experience: if the ‘driver’ does not keep his or her hands on the steering wheel, you feel very uncomfortable. If you cannot see the driver, you care less (as in a metro without a driver). Algorithms are increasingly used in recruitment. If the final human selectors only get to see candidates who are suitable for the job anyway, the human decision will have less impact (only on which suitable candidate is selected).

3. Assuming a normal distribution

Description: People often assume that people data follows a normal distribution.
Implications for people analytics: The assumption that human behaviour and characteristics follow a normal distribution has done a lot of harm. Many HR instruments are built around this assumption. Example: if the performance of people follows a normal distribution, it makes a lot of sense to design a salary system and a performance rating system around normal distributions. Unfortunately, there is evidence that performance often follows a power-law distribution, not a normal distribution. People analytics teams can do a lot of good work here, by unveiling the real distributions of relevant people data. When presenting the facts, people’s (lack of) imagination must be considered: a normal distribution is easier to imagine than a power-law distribution, and when people assume a relationship, they generally assume a linear one, as exponential relationships are very difficult to grasp.
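A small simulation makes the difference tangible. The sketch below draws hypothetical performance figures from a Pareto (power-law) distribution; the parameters are illustrative, not taken from any real HR dataset.

```python
# Simulated power-law "performance" data versus the normal-distribution intuition.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical output per employee, heavy-tailed rather than bell-shaped.
performance = rng.pareto(a=2.0, size=10_000) + 1.0

mean, median = performance.mean(), np.median(performance)
print(f"mean = {mean:.2f}, median = {median:.2f}")
print(f"share of employees below the mean: {(performance < mean).mean():.0%}")
# Under a normal distribution, mean == median and 50% sit below the mean.
# Here roughly 75% do: a forced bell-curve rating mislabels many solid
# performers and hides the few stars in the tail.

top_cut = np.quantile(performance, 0.90)
top_share = performance[performance >= top_cut].sum() / performance.sum()
print(f"top 10% of employees produce {top_share:.0%} of total output")
```

If real performance data looks like this, a rating system that forces a symmetric distribution around the average is measuring the wrong shape.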

4. Authority bias

Description: People tend to believe and follow authorities.
Implications for people analytics: Are HR and the people analytics team seen as an authority when it comes to data and data analysis? Will managers believe HR when it comes with surprising insights? It might be better to put the responsibility for people analytics outside HR, with people who have more credibility when it comes to data analysis.

5. Availability bias

Description: We create a picture of the world using the examples that most easily come to mind.
Implications for people analytics: The availability bias is at work in most organisations. Carla is an excellent trainee; she studied at Delft University, so we must hire more students from Delft, as they are good. Most talent detection processes in organisations suffer from this bias (read: Finding hidden leaders).
People analytics can help to create an unbiased view of the world, and by doing so increase the effectiveness of many HR programs.

6. Confirmation bias

Description: The tendency to interpret new information so that it becomes compatible with our existing theories, beliefs and convictions. We filter out any new information that contradicts our existing views.
Implications for people analytics: Everybody is vulnerable to this bias, including people working in people analytics. This is an argument for a truly professional people analytics team, staffed with people with a good scientific education. Amateurs can do a lot of harm in people analytics, because they are probably more prone to some of the cognitive biases, and because they often lack solid knowledge of methodology and statistics. People analytics professionals must take the confirmation bias into account when presenting analyses to their clients.

7. Hindsight bias

Description: In retrospect, everything seems clear and inevitable. It makes us believe that we are better predictors than we are, causing us to be arrogant about our knowledge and consequently to take too much risk.
Implications for people analytics: The hindsight bias is connected to the ‘story bias’ and to the ‘overconfidence effect’, because we like good coherent stories, and we very much like to believe that we are very good at making predictions. Managers like to be in control, and if you can predict the future, you are in control. The people analytics team should also be conscious of this bias, as they will be very keen to show that predictive analytics can really be predictive.

8. Information bias

Description: The tendency to seek information, even when it cannot affect action.
Implications for people analytics: “We need more information” is a phrase heard in many boardrooms. Often this is a delaying tactic: while new information is gathered, nothing happens. The people analytics team should not always be happy when new information is requested, but should point out, when appropriate, that adding new information will not add value and will only delay the decision-making process.

9. Fallacy of the single cause

Description: The assumption that a phenomenon has merely one cause, while other possibly contributing causes go undetected, are ignored or are illegitimately minimised.
Implications for people analytics: Life in organisations would be easier if things had only one cause. The questions asked of HR and the people analytics team often hint in this direction: What is causing talent to leave? What is the main characteristic of good leaders? What is causing the low engagement of our people in the USA? The people analytics team should dig deeper, and should find ways to present a complex reality in a simple way.

10. False causality

Description: The argument generally looks like this: Event A happened. Event B happened after A. Therefore, A caused B. The false cause fallacy is sometimes summarized and presented under the slogans “correlation is not causation” and “sequence is not causation”.
Implications for people analytics: This is another good reason not to compromise on the quality of HR and the people analytics team. We can see false causality in action all the time. Our company was doing well. A new CEO arrived. The results got worse. It must be the CEO. So, if we appoint a new CEO, the results will get better.
False causality, small samples and HR professionals who are poor at statistics are a very dangerous mix.
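A tiny simulation shows how a confounder manufactures a correlation out of nothing. Everything below is synthetic: a hypothetical “site size” drives both the rollout of a training programme and attrition, so the two correlate without any causal link between them.

```python
# Spurious correlation through a hidden confounder (all data synthetic).
import numpy as np

rng = np.random.default_rng(1)
n = 5_000

site_size = rng.normal(0.0, 1.0, n)              # hidden confounder
training = site_size + rng.normal(0.0, 1.0, n)   # larger sites roll out training first
attrition = site_size + rng.normal(0.0, 1.0, n)  # larger sites also lose more people

print(f"corr(training, attrition) = {np.corrcoef(training, attrition)[0, 1]:.2f}")  # ~0.5

# Conditioning on the confounder makes the association vanish:
resid_t = training - site_size
resid_a = attrition - site_size
print(f"after removing site size:  {np.corrcoef(resid_t, resid_a)[0, 1]:.2f}")      # ~0.0
```

Whoever sees only the first correlation will conclude that training drives attrition; only measuring the confounder reveals that neither causes the other.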

11. Observer-expectancy effect

Description: When a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it.
Implications for people analytics: Many people still want to please their boss, and people analytics professionals are probably no different. Be aware!

12. Framing effect

Description: Drawing different conclusions from the same information, depending on how that information is presented.
Implications for people analytics: The framing effect (of which “how to lie with statistics” is a part) is of course very relevant for people analytics. You can assume that people, including people analytics professionals, unconsciously interpret the facts in a way that confirms their views, and that this affects the way they present those facts. What is the use of a thorough analysis if the results leave a lot of room for different interpretations? The people analytics team should be able to present the facts in a clear, unambiguous way.

13. Generalising one’s personal experience

Description: The tendency to assume that one’s personal experience is also the experience of other people.
Implications for people analytics: One of my former bosses was very money-driven. He could not imagine that there are people who are not so money-driven.
An executive I worked with had ample experience in the oil and gas industry, and she assumed every industry was like the oil and gas industry. People analytics adds a lot of value here, e.g. by gathering and analysing the preferences of employees and of people in the market.

14. Insensitivity to sample size

Description: The tendency to under-expect variation in small samples.
Implications for people analytics: People analytics teams must often work with small data, unfortunately, rather than big data. It is difficult to draw conclusions from small samples, but conclusions are often drawn anyway (even when the sample size is one; see bias number 13, ‘Generalising one’s personal experience’). In the HR domain samples are often small (a group of ten trainees, a course with 24 participants), and the pressure to draw conclusions from the evaluation of interventions is understandable (“80 percent of the trainees are happy with the program”; if only ten trainees out of a bigger group completed the questionnaire, this is a poor basis for any conclusion). The ability to explain basic statistical principles is important, as is a very professional approach.
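A quick simulation shows how wildly small samples can swing. The “true” satisfaction rate below is fixed at 60% by assumption; the sample sizes are hypothetical.

```python
# Sampling variation at different sample sizes (true satisfaction fixed at 60%).
import numpy as np

rng = np.random.default_rng(7)
true_rate = 0.60

for n in (10, 100, 1000):
    # 10,000 repeated surveys of n trainees each
    observed = rng.binomial(n, true_rate, size=10_000) / n
    print(f"n={n:>4}: observed satisfaction ranges "
          f"{observed.min():.0%} to {observed.max():.0%} (sd {observed.std():.2f})")
# With n=10, single surveys reporting 20% or 100% satisfaction are routine;
# with n=1000, results cluster tightly around the true 60%.
```

The “80 percent of trainees are happy” headline above could easily read 50 or 100 percent on a resurvey of another ten people; it says little until the sample grows.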

15. Overconfidence effect

Description: People systematically overestimate their knowledge and abilities. “Be skeptical of predictions, especially if they come from so-called experts”.
Implications for people analytics: This bias can be seen a lot in the workplace (see also the entry on ‘Algorithm aversion’), and it can be a real hurdle for people analytics. Senior leaders are often appointed because they are confident, and some of them need a lot of convincing before they believe new facts that are not in line with their thinking. Presenting and selling facts and analyses is an integral part of the job of people working in people analytics.
There are many more interesting cognitive biases that are important for HR and people analytics. The Cognitive Bias Codex gives an excellent overview.
Cognitive bias codex
Ignoring psychology is not an option, and assuming that ‘the facts will speak for themselves’ is naïve. People analytics professionals benefit from learning more about psychology and the important cognitive biases, because no one is exempt from them. Awareness is a first step; using and neutralising some of the biases is an important condition for the success of people analytics.
Images: Original article, cover image Shutterstock

This article was first published on Analytics in HR.
