Soft Skills Matter
Soft skills are essential, even for the most data-centric professionals. At the start of our conversation, Bruce shared his views on some of the most common missteps data analysts make.
“I think the important thing to understand is that analytics, data science, and machine learning mean nothing unless we’re producing something actionable or informing decision-making at some level. There’s naturally a disconnect between the people who are really good at gathering data and doing smart things with it, and those who can leverage modeling in the greater business context. The real experts in the domain can translate data into next steps.
“We, as analysts, need to ask things like:
- What does this newly identified pattern mean for the business?
- Can the data be meaningful to the VP of Marketing or Sales?
- What does this data mean to a CEO?
“The ability to translate data into next steps happens to be the hardest part of the job for any analyst. A level of curation needs to occur to make insights mean something compelling to the rest of the business. It takes an understanding of the greater business context, the ability to ask the right questions to the right people, and experience.
“How you present is probably as important as the results of your analysis. If you want career growth:
- Focus on visual and complementary storytelling.
- Make sure people listen to you when you’re talking and not just nod along to a bunch of words on a document.
- Invest your time understanding how to tell a complicated story in a simple visual manner.
“Taking the time to learn those skills will pay dividends.
“Presentation skills and clear communication lend more credence to your capabilities than the exact same message delivered without those soft skills.”
Requirements Gathering Is a Two-Way Street
No one can be an expert at everything. Even brilliant analysts can’t be expected to understand every element of the backstory behind why people are trying to answer a specific question and what they want to do with the data. Bruce pointed out that delivering the right insights at the right time depends on both parties asking the right questions.
“I wouldn’t blame a junior analyst for coming back to the person asking for insights with data that doesn’t really address the underlying issue we’re trying to solve for. That information should be presented upfront by the person wanting or needing the insights. There’s a responsibility on their end to ensure that whoever is doing this work, whatever team or management structure is at the core of it, has the core need broken down and understood. At the end of the day, I should be able to go to a junior data analyst who might not have the full picture of where they sit in the business world and communicate the business drivers in a way that makes sense and informs their discovery process.
“The junior analyst may not need the strategic vision behind the question we’re trying to answer. What they do need to know is how the business will use the statistical breakdown. Often, leaders are bad at doing that. They say, ‘Go use data, go find data, there’s value in data!’ But, then, they don’t give a crumb of guidance on what problems would be interesting to solve with the data they’re asking the analyst to analyze.
“If you let someone take a run at analyzing the data without a backstory, it can still yield great things. You may uncover insights you didn’t even think to ask for. However, expecting that to happen quickly and efficiently is asking someone to be a business expert, a domain expert, and a data expert all at once. That expertise comes over time. But if I’m sitting in an executive role and want the answer to something, it would help if the people responsible for finding the answer know enough context. I also need to make sure they truly understand what I’m asking for.”
Often, data analysts are given a very broad direction without understanding the intent behind it. The responsibility should be carried by both parties involved in the discussion. An essential skill analysts need to develop is asking the right questions to get at the heart of the issue. They need to understand the why. At the same time, it’s the leader’s responsibility to gracefully accept those questions and not treat the analyst like a minion. In other words, they shouldn’t expect analysts to already know the answers and blindly do as they’re asked.
For example, the question “Is marketing working?” isn’t particularly helpful. A more helpful question is, “Which channels are converting into opportunities and how well are marketing and sales working together on that hand-off?”
Even more helpful? Tell the analyst that you’re concerned about how effective marketing and sales are at working together. You want to understand the quality of the leads being passed from one team to the other, how quickly those leads are followed up on, how each channel converts, and if there are any other indicators within that data set that might be interesting to look at.
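To make that sharper question concrete, here is a minimal pandas sketch of what the first pass at an answer might look like. It assumes a hypothetical CRM export with columns named channel, became_opportunity, and handoff_hours; swap in whatever fields your own systems actually provide.

```python
import pandas as pd

# Hypothetical CRM export: one row per lead handed from marketing to sales.
# Column names here are illustrative, not tied to any specific tool.
leads = pd.read_csv("leads.csv")  # channel, became_opportunity (0/1), handoff_hours

summary = (
    leads.groupby("channel")
         .agg(
             leads=("became_opportunity", "size"),            # leads passed over
             opportunities=("became_opportunity", "sum"),     # leads that converted
             conversion_rate=("became_opportunity", "mean"),  # conversion per channel
             median_followup_hours=("handoff_hours", "median"),  # speed of follow-up
         )
         .sort_values("conversion_rate", ascending=False)
)

print(summary)
```

Even a rough table like this answers the leader’s real concern far better than a generic “Is marketing working?” report, because each column maps directly to something the leader said they care about: lead quality, follow-up speed, and conversion by channel.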
Sometimes leadership doesn’t know what to ask for. But if the leader is transparent about their originating concern (“I want to make sure marketing and sales are working together as efficiently as possible”), the analyst runs less of a risk of analyzing too many data points and coming back with a garbled mass of insights that don’t really address the underlying concern.
“I need to know how success is measured. What is the baseline stat that indicates efficiency? It’s really easy to get caught up in nuanced conversations, but there are so many different ways to measure ‘marketing performance.’
“One of the quotes I’ll repeat throughout the entire time we work together is a famous one from George Box: ‘All models are wrong, but some are useful.’ We don’t let the perfect be the enemy of the good.
“Start really simple and then use simplicity as the basis for expansion. If you give the analyst a broad directive, they’re going to be lost, and you aren’t going to get something that’s useful. What usually happens is that a couple of weeks later, you meet with the analyst, and they’ve pulled out their hair trying to give you something. Maybe it’s good, maybe it isn’t. It’s not the most efficient way to realize ROI for either party.”
How Can We Better Prepare Analysts for Executive Meetings?
Analysts fresh out of college, or even experienced analysts new to a company, are understandably nervous when asked to present findings to an executive team. It’s important to understand that leaders are people: they don’t know everything, and they’re not there to tear you down (and if they do, maybe it’s time to look for a new place to work). When you present insights, you should expect executives to apply business context to the data and decide whether or not to use your insights to drive business decisions.
It’s critical to understand that an executive ignoring your advice isn’t a reflection of your ability to analyze data. It may, however, be a reflection of how you framed your findings.
“After my Ph.D., I did a bunch of job talks. I had an hour to talk about my research, which was essentially four years of my life. I look back at that presentation, and it’s so crammed with math and graphs and defenses for the methods I chose to use.
“That’s not a good presentation. Instead of highlighting my findings, I created an argument for why I deserved to be standing there.
“It’s critical to learn how to give an impactful presentation. There’s a level of coaching that goes with learning that skill. It’s the manager’s job to help tee up what the presentation should look like. For example, if I had a junior analyst presenting to the executive team, I would boil it down to five minutes and one slide to make a point. Then we would expand on the presentation from there.”
Two of the most vital steps to take prior to presenting to executives are validating your findings and communicating anything out of the ordinary to the impacted teams. Both should happen before the meeting itself!
There should never be any surprises in an executive meeting, and the courtesy of sharing results with impacted teams before a presentation can go a long way in preserving a working relationship.
For example, if you find a marketing tactic that didn’t have the desired results, talk to the marketing team before escalating the findings to management. Get the context around the data (perhaps it’s a tactic that’s hard to measure, so they’ve developed additional means for gathering input). And before circulating your findings, triple-check that the numbers are correct.
“Being critical of your own results is always important. If I do something and the results are unexpected, my first inclination is to assume I did something wrong or I broke something. That’s an important layer for an analyst because it’s very easy to tell a story and watch it spiral out of control as people run to fix the issue. People move fast, so be certain you’re right.
“I like to treat unexpected findings like experimental results. I need to prove that the numbers have merit with some degree of certainty. That might mean being a little more introspective about my methodology, getting the numbers and approach vetted by others, and seeing if others come to the same findings. This is especially true when conflicting ideas or stories have been told about a perceived problem.”
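One lightweight way to apply that “experimental results” mindset is to ask whether a surprising shift could plausibly be noise before sounding the alarm. Below is a minimal sketch using a two-proportion z-test from statsmodels; the conversion and lead counts are made up purely for illustration.

```python
from statsmodels.stats.proportion import proportions_ztest

# Made-up counts for illustration: conversions and total leads
# for last quarter vs. this quarter.
conversions = [180, 150]   # converted leads in each period
leads = [2000, 1900]       # total leads in each period

# Two-proportion z-test: is the apparent drop in conversion rate
# larger than random variation alone could explain?
z_stat, p_value = proportions_ztest(count=conversions, nobs=leads)

print(f"rates: {conversions[0]/leads[0]:.1%} vs {conversions[1]/leads[1]:.1%}")
print(f"p-value: {p_value:.3f}")
# A large p-value suggests the "drop" may just be noise; a small one
# means the finding is worth vetting further and sharing with the
# impacted team before it reaches the executive meeting.
```

A quick check like this is not a substitute for having your methodology vetted by others, but it keeps a noisy blip from spiraling into a story that sends people scrambling to fix a problem that may not exist.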
Finally, provide a document as a pre-read for your meeting that gives an executive summary, details the process for uncovering your findings, and recommends next steps or action items based on the findings. For an in-depth example of such a document, we’ve provided a template for you here.
For more on introducing a more human element to data-driven presentations, listen to the full Revenue Marketing Report episode at the top of the article or wherever you listen to podcasts.