Your Data is Lying to You: Pitfalls in People Analytics
People analytics is undeniably a hot topic today. Few would dispute that the availability of HR data, and the tools to manipulate it, enables stronger decision making, better people strategies, and increased influence with business leaders. But alongside the many benefits data can offer, users should watch for mistakes that lead to bad decisions. Here are five common issues that anyone working with people data should understand and avoid.
False causation. Consider your latest employee engagement survey. Engagement ticked up over the survey you ran two years ago, and when you look at your financial data, you see that profitability also increased. Engagement caused profitability, right? Or did the healthier financial situation cause the increase in engagement? Or is some third factor driving both? The tricky thing about one-to-one comparisons is that it is often difficult to tell which is the driving factor and which is the dependent variable. To reduce the likelihood of this error, work to eliminate other possible causes. Regarding profitability: has the general economic climate improved? Were there one or two big contract wins? Was there a reorganization or process reengineering effort? Any new vendor renegotiations? Be sure you’re looking at the complete picture of your situation.
Hasty generalization. Consider exit interviews. You review your data for last year and see that 3 of the 8 people leaving under a specific manager cited “poor management” as a reason for leaving. Should you be concerned? Probably. Should you use this data point to decide that this manager needs training, demotion, or firing? Certainly not. When organizations are just starting to gather data, they are often in a hurry to act on it. Big mistake. Especially when few data points are available, it’s important to gather enough data before using it to make big decisions. Consider additional factors that might be at play – location, life stage, commuting distance, performance ratings – and test additional hypotheses.
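One way to see how little 3 of 8 actually tells you is to put a confidence interval around that proportion. The sketch below computes a Wilson score interval (a standard small-sample interval; the 95% confidence level is an assumption for illustration) in Python:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a proportion (small-sample friendly)."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# 3 of 8 exits cited "poor management" -- what's the plausible true rate?
lo, hi = wilson_interval(3, 8)
print(f"Plausible true rate: {lo:.0%} to {hi:.0%}")
```

With only eight observations, the plausible range runs from well under a fifth to well over two thirds: far too wide to justify a decision about someone's career.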
Confirmation bias. So you’ve got a new baby – congratulations, it’s a sales training program you helped create. It has all the bells and whistles – blended learning, role plays, interactive video. You pilot it with a group of ten salespeople. In the next month, their numbers improve by 20%. You show the data to everyone. Your training was a massive success! Not so fast. What you ignored was that across the company, the average salesperson’s increase during the same time period was 18%, seven of the ten people who took the training had just come off a bad month, and the cost to develop and administer the training was incredibly high. You made the mistake of picking only the data that supported the conclusion you wanted to reach while ignoring the rest. Confirmation bias plagues us all in all aspects of our lives (not just work). To overcome it, seek out independent opinions, particularly when you have an emotional interest in reaching a specific conclusion.
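A rough way to sanity-check a pilot like this: assuming (hypothetically) that individual monthly improvements average 18% company-wide with a person-to-person spread of 10 points, simulate how often a randomly chosen group of ten would average 20% or better with no training at all.

```python
import random
import statistics

random.seed(1)

# Assumed numbers for illustration: company-wide improvement averages 18%,
# with a 10-point standard deviation from person to person.
trials = 10_000
hits = sum(
    statistics.fmean(random.gauss(18, 10) for _ in range(10)) >= 20
    for _ in range(trials)
)
share = hits / trials
print(f"Chance an untrained 10-person group averages >= 20%: {share:.0%}")
```

Under these assumptions, roughly a quarter of untrained ten-person groups clear the 20% bar by luck alone, so the pilot result is far weaker evidence than it first appears.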
Regression to the mean. You’ve recently implemented a monthly pulse survey, and you’re measuring participation. Good news! Response rate is 50% after 3 months, so you shift your attention elsewhere. Bad news. The initially high response rate will likely taper off over time. Performance and participation tend to regress to the mean, so whether it’s an average performer having a killer year or a new initiative that has a great deal of early enthusiasm, keep measuring and prepare yourself for a return to expected levels.
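The effect is easy to demonstrate with made-up numbers. In the sketch below, each person's measured performance is a stable skill plus period-to-period luck; pick this period's stars and their next-period average falls back toward the overall mean, with no change in anyone's actual ability.

```python
import random
import statistics

random.seed(2)

# Hypothetical model: performance = stable skill + period-to-period luck.
skill = [random.gauss(50, 5) for _ in range(1000)]
period1 = [s + random.gauss(0, 10) for s in skill]
period2 = [s + random.gauss(0, 10) for s in skill]

# Select the top 10% from period 1 and track them into period 2.
stars = sorted(range(1000), key=lambda i: period1[i], reverse=True)[:100]
p1_avg = statistics.fmean(period1[i] for i in stars)
p2_avg = statistics.fmean(period2[i] for i in stars)
print(f"Top 10% averaged {p1_avg:.1f} in period 1, then {p2_avg:.1f} in period 2")
```

The stars' follow-up average lands much closer to the population mean of 50: part of what made them stars was luck, and luck does not repeat.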
Data dredging. You have a massive HRIS export. You have detailed financial results by division and geography. You run a massive correlation analysis and find a statistically significant relationship between last year’s salary increase and this year’s absenteeism. You deliver the finding to your head of compensation and proudly report that “There’s only a 5% chance this is random!” Ah, but let’s do an experiment. Try an Excel coin-flip test: type “=RANDBETWEEN(1,2)” in cell A1 and copy it through G100, then sum each row in column H. Treating a 1 as heads, how many rows sum to 7 (the equivalent of 7 tosses all coming up heads)? Statistically, there’s a 1 in 128 chance of this occurring in any given row, but do it a hundred times and your odds of seeing at least one all-heads row increase to about 55%. The lesson: if you randomly test hundreds of variables for statistically significant relationships, you’re going to find some somewhere, so be careful. Data dredging is useful for generating hypotheses, but don’t make the mistake of believing that the relationships you find are, in and of themselves, meaningful. Find additional ways to test your hypotheses.
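If you’d rather not build the spreadsheet, the same experiment can be simulated in a few lines of Python: generate many 100-row sheets of 7 coin flips each and count how often at least one row comes up all heads.

```python
import random

random.seed(3)

def has_all_heads_row(rows=100, flips=7):
    """One 'spreadsheet': does any of the 100 rows of 7 flips come up all heads?"""
    return any(
        all(random.randint(0, 1) == 1 for _ in range(flips))
        for _ in range(rows)
    )

trials = 10_000
rate = sum(has_all_heads_row() for _ in range(trials)) / trials
print(f"Share of 100-row sheets with an all-heads row: {rate:.0%}")
```

The simulated share lands near the theoretical value of 1 − (127/128)^100 ≈ 55%: individually rare patterns become near-certainties when you look for them in enough places.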
People analytics is an exciting field, filled with possibility for driving decision making at the organizational level. Keeping in mind common statistical and analytical errors and how to avoid them will help you steer the organization in the right direction.
Are you interested in learning more about People Analytics? Feel free to get in touch with us to learn how Aspirant can help you achieve your organizational goals.
Judy partners with executives and leadership teams to engage and inspire employees in a way that delivers sustainable strategic results. She brings deep expertise and creative ideas to solve organizational effectiveness issues and closely collaborates in a way that builds internal capabilities. Judy has spent over 25 years consulting in a variety of industries, bringing her expertise in behavior to a wide range of organizational issues including organizational behavior change, leadership, change management, culture and engagement.