Is it time to kill engagement surveys? Some prominent figures in the space would say so.

The Deloitte Human Capital Trends 2017 report describes a transition toward the employee experience, a more holistic view of what it feels like to work in your job every day. Engagement is framed as a silo that is often disconnected from that actual experience. In a recent episode of the HR Happy Hour podcast, Josh Bersin even went so far as to say that the engagement industry and concept simply need to go away. An episode of the Human Capital Institute’s Nine-to-Thrive podcast (episode link; full podcast feed) from about a year ago spoke to the same issue, through an interview with renowned author Jacob Morgan.

I tend to come from the place that all data is good data – it’s more about what you do with it and how much you need to invest given what it can do. I still find value in the engagement survey work done in my company. That said, there are a few underlying methodology problems with employee engagement surveys and related work that are worth understanding as we consider a better way forward.

Surveys are based on a point in time.

The normal cycle of an engagement survey: it takes at least a month to plan the survey and communicate that it’s coming, a month to administer it, a month to get results back, a few more months to dig in and understand the data, a month to determine how to talk about the results, and then at least a few more months to (maybe) make some changes in response to the data. At which point you are almost ready to start planning next year’s survey, so you feel compelled to report back on what you’ve done to respond to last year’s. In essence, the survey cycle takes a whole year. It’s no wonder the typical engagement survey occurs annually, and many organizations have moved to, or are considering, running them even less often.

Running the survey even less often makes interpretation even harder. Here’s the problem: to accurately reflect an employee’s experience, an engagement survey would require the employee to think broadly about everything they feel… over time… disconnected from any short-term emotions. Unfortunately, humans simply don’t work that way. We are prone to short-termism, and how we describe our feelings can vary wildly from one day to the next.

My favorite comparison example comes from my home life. On a (relatively small) number of days through the year, my wife can be heard telling me “I love you, but I don’t like you very much.” If she were to fill out a survey on those days, the conclusions and implications would be very different than if she were to be surveyed even a day later.

It’s this same phenomenon that drives why engagement vendors will advise you to complete the survey at a time separated from major events like bonuses, raises, major announcements, reorganizations… you know, when things are calm and not changing. I think that may have been possible in a different millennium.

We tend to focus on levels rather than change.

The focus on the “level” of engagement starts with our fascination with benchmarks. The rise of the engagement industry came from a few vendors building cross-company benchmarks by asking the same questions everywhere. This sounded great – “you mean I can see whether my employees are more engaged than those at other companies? Fantastic!”

Well, that was the promise. But it does not deliver.

The makeup of companies is very different, but benchmarks typically don’t reflect this. Even within the same industry and country, the diversity of the workforce, the purpose of the company, and the mix of job types can vary widely. Research has found that such characteristics lead to different response patterns in employee surveys, so we should expect engagement results to differ for reasons that have nothing to do with the actual employee experience. A benchmark that does not control for demographics is therefore far less relevant than it appears.
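As a rough sketch of what controlling for workforce mix could look like (the file and column names are hypothetical): fit a model of engagement on demographics across the benchmark pool, then compare a company’s observed average to what the model predicts for its particular mix.

```python
# A sketch of mix-adjusted benchmarking. Hypothetical file and column names.
import pandas as pd
import statsmodels.formula.api as smf

# One row per benchmark respondent, with demographics and an engagement score.
benchmark = pd.read_csv("benchmark_responses.csv")

# Model how much of the engagement score is explained by workforce makeup.
mix_model = smf.ols(
    "engagement ~ C(industry) + C(country) + C(job_family) + tenure_years",
    data=benchmark,
).fit()

# Compare our observed average to what the model predicts for our mix.
ours = pd.read_csv("our_responses.csv")
observed = ours["engagement"].mean()
expected = mix_model.predict(ours).mean()
print(f"Observed {observed:.2f} vs. mix-adjusted expectation {expected:.2f}")
```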

Beyond benchmarks, though, the analysis of engagement still tends to focus on “where is it high” and “where is it low.” Analysis is also done to identify drivers of engagement, which is really code for finding which other questions correlate with how people answer a set of index questions. Those other questions, now labeled “drivers,” suffer from the same constraint of being a point-in-time view, and may have demographic bias built in.

I argue that a better lens is “how has it changed” since this naturally controls for the underlying characteristics of the employee. It also provides more clarity into what changes lead to changes in engagement. We could observe that when someone changes their response about feeling supported by their manager, their response on level of engagement also tends to change, controlling for all other changes known about the employee. That more rigorous analysis requires a time series and more data… but would be notably more actionable.
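To make the “change” lens concrete, here is a minimal sketch of a first-difference (change-on-change) model, assuming repeated pulse responses keyed by employee and survey wave; the file and column names are hypothetical.

```python
# A sketch of a first-difference ("change on change") model.
# Hypothetical columns: employee_id, wave, engagement, and driver scores.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pulse_responses.csv").sort_values(["employee_id", "wave"])
drivers = ["mgr_support", "career_growth", "workload"]

# Within-employee differences remove stable traits (role, tenure, disposition),
# so only wave-to-wave changes remain.
deltas = df.groupby("employee_id")[["engagement"] + drivers].diff().dropna()

fit = smf.ols("engagement ~ " + " + ".join(drivers), data=deltas).fit()
print(fit.params)  # how a one-point shift in each driver moves engagement
```

The design choice doing the work here is the differencing itself: it controls for the stable characteristics of the employee without having to measure them all.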

We oversimplify the “importance” of an issue.

I love it when engagement surveys ask an employee how important an issue is.

No I don’t. I can’t stand it.

The main challenge is that nearly every issue ends up being cited as important or very important. Who is going to say that being paid fairly is not important? Or that trusting your manager is not important? And what does “important” mean compared to “very important?” Further, important in what way? Important for me to not quit, or important for me to feel like I want to run through a wall for my company? Because of these ambiguities, the average ratings come back nearly flat, with every issue looking similarly important.

But more importantly, we have the data to discover importance – so why do we need to ask about it? We can see how responses and patterns lead to changes in engagement or changes in attrition, and thus can determine what responses are actually important to an outcome measure. Let’s use the data we have, and then efforts to take action can focus on what actually matters.
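As a minimal sketch of that idea, assuming survey responses have been joined to a later attrition flag (the file and question names are hypothetical), a simple logistic regression ranks which responses actually move the outcome:

```python
# A sketch of inferring importance from outcomes. Hypothetical names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Survey responses joined to whether the employee left within the next year.
df = pd.read_csv("survey_with_outcomes.csv")

fit = smf.logit(
    "left_within_12mo ~ pay_fairness + mgr_trust + career_growth + workload",
    data=df,
).fit()

# Odds ratios far from 1.0 mark the responses that actually matter.
print(np.exp(fit.params).sort_values())
```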

We ask about and review results “by manager” – not recognizing how teams really work.

Today’s organizations are really networks, not traditional hierarchies. Members of my team interact constantly with people outside our reporting structure. Leadership on projects and initiatives comes from within the team, and that leadership and coaching is not done by the traditional manager. The shift to agile practices pushes even more of how people work outside the formal org chart.

So when we ask about how someone feels about “their manager” who should they be talking about? And when we review results for a manager, how certain are we that the manager is actually the one with any influence or impact on the underlying sentiment?

We read comments without context.

Some companies largely ignore the comment questions. Others do a word cloud and move on. And yet others read every single comment and attach even more meaning to them than the other survey questions.

Out of context, comments often mean nothing. Simple word counts fail to recognize sentiment (the word “challenging” can be a good thing or a bad thing).
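A toy illustration of the point, with invented comments and word lists: a frequency count sees “challenging” twice and stops there, while even a crude look at neighboring words recovers the opposite tones.

```python
# Toy example: both comments mention "challenging," but context flips the tone.
from collections import Counter

comments = [
    "The work is challenging and rewarding",
    "Management makes everything pointlessly challenging",
]

# A word cloud or frequency count treats both mentions identically.
counts = Counter(word.lower() for c in comments for word in c.split())
print(counts["challenging"])  # 2

# Even a crude context rule does better: look at the neighboring words.
positive_cues = {"rewarding", "exciting", "fun"}
for c in comments:
    words = c.lower().split()
    i = words.index("challenging")
    neighbors = set(words[max(0, i - 2): i + 3]) - {"challenging"}
    tone = "positive" if neighbors & positive_cues else "negative"
    print(f"{tone}: {c}")
```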

Survey comments cannot replace feedback and dialogue. This is why so many engagement efforts end up with focus groups as a follow-up task.

So what’s the better way?

I do not write this with a silver bullet. I do see a flood of pulse-survey and experience platforms coming to market looking to address these problems, and the traditional vendors (as well as the platform HRIS companies) are investing in the same underlying challenges. We will see how the market takes shape and how smart people do smart things in response. That said, a few features or concepts I do see as part of a better way forward:

  • More frequent, simple, globally relevant data gathering
  • Expansion of data gathering beyond surveys to incorporate text-based suggestions and concerns, using natural language processing techniques to identify sentiment
  • Tighter integration with other data about the employee to provide more context and enable more robust analytics
  • Usage of modern analytics to more specifically identify “what matters”
  • Added focus on employee-level changes over time

I will say that I’ve personally experimented with other methods, and have talked to many who have killed the annual engagement survey. I’d love to hear from more of you. Comment below or send me an email with your thoughts.