
Marketing Data Mythbusting 101

Posted May 18, 2022
Sasha Samoilov

What’s With the Data Rumor Mill?

Whether they sit in a single department or on a cross-functional team, operations professionals inevitably field a lot of questions that stem from a gut feeling or suspicion. Cycles are spent tracking the issue down or trying to replicate a bug, and then a PR campaign ensues to correct public opinion.

Where do these wasteful cycles start, and why do they take up so much of our time? Why is it so hard to consistently speak the same data language in a company?

“Honestly, it’s usually a discrepancy in filters. It’s something benign. It’s never like the movies. It’s never a spy or a ploy to make my numbers look better.”

Okay. People do (rarely!) distort the results to look better, but it never ends well.

“There’s more transparency now. There’s more data available, so it’s harder to hide the facts.”

While the reason for different numbers being reported for the same metric is benign 95% of the time, the impact is not.

“People on the board often have a scary ability to point things out. If you have a good partnership with your board, they will be the first to say when a slide doesn’t make sense. They may grow nervous about investing more money or spending their time giving advice.

“What’s painful about these deltas is that even if we’re off by 10 or 100 in the scope of tens of thousands, people still get fixated and want to throw the entire analysis out of the window. It just doesn’t make sense to me.

“People get so fixated on a tiny detail. Yes, we should strive for accuracy, get good data, and follow best practices, but perfect shouldn’t be the enemy of good.

“I could understand if it’s a large variance, like high double digits. We should investigate because a delta that size could drastically change the overall story. It could be the difference between doubling down on a tactic or reducing the investment.”

These data disagreements aren’t just frustrating for the analyst. They can impact the reputation of the executive reporting the numbers. 

“There should be a culture of flexibility and forgiveness with respect to small discrepancies. We should understand why some of these variations occur. Come with some ammo to explain some of these variations.”

Avoiding Data Debacles: Leaders Should Set Up Analysts for Success

You wouldn’t believe how many arguments over data in executive forums could be avoided if the executive presenting the data asked the analyst clearer questions.

“What is your end goal? We always come back to, ‘Is marketing working?’ That’s a nebulous question. At the top of the strategy pyramid, we ask ourselves whether a department is working well, but I can tell you that when you bring that question down to the person who’s trying to pull the numbers, you’re setting them up for failure. Unless they’re experienced enough to ask why you doubt the department’s effectiveness and dig deeper into what you’re actually trying to show, you might not get the correct answer.

“Leaders expect analysts to translate the meaning of numbers. Leaders should also be able to translate their questions for analysts.”

In order for a marketing leader to be prepared to field questions about a presentation, they need to have the humility to ask questions and be certain they understand the implications of the data. At the very least, they need to coach their analyst to anticipate questions that may come up.

“Be curious about why there is a variation in the numbers. It’s okay to have a collaborative conversation with your analyst. As a leader, I need to at least have some cursory understanding of why numbers can change. 

“Also, realize that simple terms have different meanings depending on which company and department you’re sitting in. For example, when I say, ‘Let’s email all customers,’ that’s not an easy task. I have never known a finance team, a marketing team, and a CS team to define ‘customer’ the same way. And if we have multiple users at an account, do we consider all of them to be our customer? What if we have never talked to them? What if they’re not opted in?

“If you’re new and don’t know to ask those questions, it’s hard. It’s hard because there’s so much context that has to be given, and CMOs just don’t know how to translate down, either. They don’t know what they need to ask for. They don’t have the time, or don’t prioritize learning from the ops folks who are in the weeds.”
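To make that point concrete, here is a minimal, purely illustrative sketch — the table, column names, and filters are hypothetical, not pulled from any team’s actual systems — of how two departments applying different definitions of “customer” to the same contact list end up reporting different numbers:

```python
import pandas as pd

# Hypothetical contact data; the columns and values are made up for illustration.
contacts = pd.DataFrame({
    "email": ["a@example.com", "b@example.com", "c@example.com", "d@example.com"],
    "account_status": ["active", "active", "churned", "active"],
    "opted_in": [True, False, True, True],
    "last_contacted": pd.to_datetime(["2022-01-10", "2022-02-02", "2021-06-01", "2022-03-15"]),
})

# Marketing's definition: anyone at an active account who has opted in to email.
marketing_customers = contacts[
    (contacts["account_status"] == "active") & contacts["opted_in"]
]

# CS's definition: anyone at an active account we have actually talked to.
cs_customers = contacts[
    (contacts["account_status"] == "active") & contacts["last_contacted"].notna()
]

# Same table, different filters, different "customer" counts.
print(len(marketing_customers), len(cs_customers))  # 2 vs. 3
```

Neither count is wrong; each reflects a reasonable filter. That gap is exactly the kind of benign discrepancy described above, and it compounds quickly once you add more teams and more fields.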

Avoiding Data Debacles: Invest in Data

It’s easier to make the right decisions with clean data. That may seem like an obvious statement, but only 10% of B2B marketers have prioritized investing in technology that can create a unified account and contact timeline, like a customer data platform. 

Why is this so important? According to Dun & Bradstreet research, 91% of CRM data is stale or missing. Combine that with dozens (by some counts, 91) of marketing applications that aren’t built to integrate, and you’ve got a hot mess.

So why are so few companies investing in data projects?

“I think it comes down to a few factors. The attitude towards data comes from the top down. The executive suite, the CEO, or your board of directors are the people who decide whether or not it makes sense to allocate budget to data projects. Unless you understand and see value in clean data, these projects are hard to sell. People need to see the value of data in driving a business forward before they can become data-driven.

“Another big hurdle is the volume and complexity of data that we are now dealing with; the volume, variety, and frequency of this data have grown exponentially. Wrangling a massive amount of data is difficult, even if we’re talking about first-party data. Think of the number of visitors, sessions, et cetera that could be collected from your website. That’s a massive amount of information.”

People who haven’t been responsible for gathering data for a presentation can’t know how much work goes into stringing together information from an array of systems. Even when they are told how much time data compilation takes, they tend to see the lost time as an acceptable, or even expected, part of the job.

We need to stop communicating the cost of not investing in data as an accumulation of hours wasted by analysts and operations. Instead, we should frame the argument around what those people could uncover or accomplish with all of that time. How many efficiencies could be gained? How much revenue could be uncovered?

It’s much easier to incentivize people to invest in a project when you can point to revenue gains than when you point to inefficiency in something considered part of the job.



For more insights on how businesses get their data wires crossed and how to fix it, listen to the full Revenue Marketing Report episode at the top of the article or wherever you listen to podcasts.