Managing Data Stakeholders


One of the most common conversations I have in my work with small businesses is how we manage our stakeholders.

More specifically:

  • How do we decide what to work on?
  • How do we make sure it’s driving value?
  • How do we handle the projects we don’t have time for?

These are common problems for almost any team in any organisation, but a few compounding factors make them particularly challenging, and a common point of conflict, for analytics and data teams in small organisations.

Context & status quo

Data teams in small businesses are often staffed, by design, entirely with analysts: people hired to be great logical problem solvers, who get satisfaction from helping others. They almost always come from analytical individual contributor roles, and they’re almost always in the earlier part of their career, still learning both the soft and the hard skills. Their motivation for being in these roles is the freedom and engagement that lets them learn and develop faster. Being at a small business is an investment in their career (that’s why they accepted the slightly smaller pay package 😉).

The other hallmark of smaller businesses is that the rest of the organisation is usually time-poor: there is more work to do than there is time available. In a startup environment, they’re also likely to be ideas people, always thinking about new ways of doing things and curious about what their customers and users are doing.

The combination of these two groups often results in a very common pattern:

  • The non-data people in the team have a never-ending list of questions, which change regularly. Plans change quickly, and they’re often led by different people “hustling” to get their projects moving.
  • The data people like saying yes: they don’t like disappointing people and, importantly, they’re looking for validation and to prove that they know what they’re doing. They want to sell people on using data.
  • The combination is that the data people say yes to too many things. In the worst cases, they don’t say “no” to anything; whatever isn’t prioritised at the top of the list goes on “the backlog”, and the stakeholders are sold “soon”.
  • The data people then try really hard to deliver on what they’ve committed to. They start to burn out from doing too many things. Many of the things they deliver aren’t used (which is demoralising), and a lot isn’t delivered at all.
  • The stakeholders lose confidence, because most of the things they were told would happen don’t. Even the things that do get delivered feel like small potatoes in comparison.

Nobody is happy.

If this feels familiar, it’s because it is so common. And what makes it so damning is that most of the people involved know it’s happening: the analysts know that most of what they do isn’t that important, the stakeholders don’t feel engaged, and nobody is happy with the results.

What is tricky is how to break out of the loop.

What does good look like?

Before we talk about how we get there, let’s take a moment to think about where we want to get to.

Fundamentally, you all work for the same business. In theory, everyone would agree that scarce resources should be spent on the things which drive the best outcomes for the whole. This usually isn’t a hard sell. The hard bits are:

  1. Do we agree on what will deliver the most value for the organisation?
  2. Unless you’re in a very small organisation, most people are incentivised not just by the collective goals, but by individual goals. Are we all evaluating outcomes by the same criteria?
  3. Are we aligned on the size and scale of the options available?
  4. Are we aligned on what else needs to happen around the rest of the organisation to deliver on that opportunity?

If everyone were aligned on each of these factors, prioritisation discussions would be easy. But given that they’re not, there must be some uncomfortable truths that we’re not talking about.

The uncomfortable truths

1. We’re not all pulling in the same direction

Even in businesses with very meaningful incentives to try and get people to pull in the same direction (like share options or similar), it is impossible for everyone to be working in precisely the same direction. Factors like career growth, pet projects, information asymmetry and individual goal setting mean that everyone is always at least slightly misaligned.

This isn’t a bug, it’s a feature. A diversity of viewpoints and goals is good for creativity and innovation because it stokes debate. We only get that bonus, however, if we have the conversation.

If you’re responsible for building a data roadmap, make sure you have that conversation with each of your stakeholders individually. What does success look like for them? What are they trying to achieve?

2. Your stakeholders don’t ask for what they need

…they ask for what they think is the first step (if you’re lucky).

Most of your commercial stakeholders (even the senior ones) don’t know precisely what they want most of the time. Even the fairly data-savvy ones, like finance leads, usually don’t know how to get what they want.

This is a fundamental feature of working in data. Because the work is exploratory and the answers are unknown, they can’t pin down precisely what they want and write it into a scope: they simply don’t know yet.

Compounding that problem is the fact that, because they aren’t professional data people, they often don’t have the language or understanding to phrase what they want precisely.

If you’re responsible for building a data roadmap, don’t expect your stakeholders to be precise (because they’ll just narrow down onto the wrong thing). Ask them to talk about what they want to do with your work, ask them what good looks like, or how they’ll evaluate success. Listen to what they say, and then you have to do the work of synthesising that into a meaningful goal.

3. People think that pipelines are hard, and analysis is easy

All data teams are, by definition, resource-constrained. There are always more questions than can be answered, so your stakeholders will always have to accept that some things aren’t going to get done.

As a result, they’ll pre-filter to the things they can’t do themselves. That usually means asking for pipelines and dashboards. Things that they don’t have the skills or tooling to build themselves. They won’t ask for (or at least they’re less likely to ask for) analysis.

The unfortunate truth is that they think they can do the analysis themselves. They’d rather have twice as many data sources and do their own analysis than have less data and have you do it.

In some cases, this is true. Some stakeholders are great problem solvers, with good critical thinking skills and a genuine understanding of statistics. Most, however, don’t have those skills, even though they think they do. This is a great example of the Dunning-Kruger effect, where people who know a little about a subject tend to vastly overstate their actual knowledge of it. Put simply: they were taught basic statistics at school, but have no sense of how much they don’t know, so they assume they know basically everything they need.

As seasoned data professionals, we know that while data pipelines can be time-consuming, the place where experience and judgement really matter is the analysis. That’s where you make the real decisions about what the information means, how much you trust it, how representative it is, and with what confidence you can reach certain conclusions.

If you’re responsible for building a data roadmap, you need to have discussions about how much time will be spent getting data and how much analysing it. If your stakeholders are trying to minimise the time spent on analysis, have a frank conversation with them about how important it is for that analysis to be good, and whether someone on their team has the skills to do it reliably.

4. Analysis which doesn’t have an effect is a waste of time

(this is my favourite)

For most of our stakeholders, asking questions is cheap. It looks good too: they appear curious, supporters of data-driven decision making. They show lots of graphs and charts in their presentations, and they keep large teams of analysts busy.

…but…

How much of the analysis their teams consume actually matters? If you took it away, what would happen?

  • Would any decisions change?
  • Would the pace of decision making change?
  • Would the overall results suffer?

I’m not going to argue that analysis which doesn’t change a decision is never valuable. There is value in confirming a course of action, or in triangulating one result against another dataset. But if a significant proportion of the analysis done for a team appears to be busy work, and is ultimately inconsequential, then there’s a problem.

This can be quite an insidious problem. These stakeholders can help you justify larger teams, and they can sometimes help you out in other ways to keep the data flowing, but unless they have an incentive to deliver the right work, they will only ever care about quantity. They will also sometimes fight tooth and nail to keep data flowing, even when that data has absolutely no effect.

What is necessary is an understanding that analysis is done on the expectation that it will be used. If you can’t pinpoint how it will be used when planning a piece of work, there is a good argument that it’s not yet well scoped. If there was a good plan but the team still isn’t acting on the results, there should be a good case for not doing that kind of work again.

Importantly, this takes the relationship from an “assumed service” (something which a team takes for granted, and therefore puts relatively little value on) to one of mutual respect: one where both parties are invested in making sure the right questions are answered, because there is an understanding that there is a cost to getting it wrong (easy to see for the data team: wasted time; but also true for the requesting team: their judgement about what is valuable is called into question).

If you’re responsible for building a data roadmap, you need to have proactive conversations with your stakeholders about how data & analytics is being used: at the planning stage, how it will be used; and after delivery, what effect it had. Favour the demands of the teams that actually hold up their end of the bargain.

5. They’re not responsible for your team’s effectiveness. You are

Finally, and most simply: this is your job.

If your team is working on the wrong things, is demoralised because it’s overworked, and is missing deadlines across the board, that’s your problem. Any attempt to blame this on “bad stakeholders” is just an admission that you’re letting others plan what your team does, and letting them do it badly. It’s not their responsibility, and it’s not their fault.

We already saw above that it’s unlikely all of your stakeholders will agree on the roadmap without your input. Even if they did, there is absolutely no guarantee that they’d agree on the highest-leverage items. And there’s almost no possibility that they’d have planned an appropriate amount of work, even if it were prioritised perfectly.

A good principle in management more generally is that you can’t hold someone responsible without also giving them decision-making power. If you feel like you don’t have the power to say “no”, but someone is holding you responsible for the effects of that, you need to have a frank conversation with them:

  • If I genuinely don’t have decision making power, why are you holding me responsible? It’s not my decision.
  • If you’re holding me responsible, let me make decisions, so that there’s something to hold me responsible for.

In most of the conversations I have, data professionals don’t like stepping into this decision-maker role, because they’re so used to advising on other people’s decisions. Yet in almost all of those cases, everyone else was expecting them to make the decisions; it was only because they weren’t that others had started stepping in.

Step into your power

So if the fundamental problem is that data teams aren’t taking ownership of their roadmaps, and the solution is to step into that decision making power, how do you actually do that? How do you have these uncomfortable conversations with stakeholders who may not want to hear them? How do you say “no” without burning bridges, or negotiate priorities without alienating the people you need to work with?

The answer lies in understanding that this isn’t just about being right about what should be prioritised. It’s about influencing people to come along with you. That’s a different skill set, and one that most analysts haven’t been trained in. The good news is that there are well-established frameworks from other fields which we can borrow.

With all this in mind, the call is clear: step into this power. Nobody will do it for you, and unfortunately, they may not see the issue clearly enough to tell you to do it. They will, however, almost always thank you for doing it when you do.

What we haven’t covered here are the pointers and tools to use when doing this, in particular negotiation styles, which I’ve written about in the sister post to this one:

Negotiation Styles for Data People

If you (or someone in your team) want help working on this as a skill, you might want to check out the Mentoring section of the A14K site.
