Without data analysts and advocates, critical avenues for success may be easily overlooked or ignored.
As any organization that has worked with data knows, helpful findings cannot do much if they are simply discovered and then stored away in a document without strategic discussion followed by action. This article is about that strategic discussion. It’s about data interpretation – and getting the word out so folks within an organization fully understand what the findings actually mean.
Last week, we started a series about the four steps to becoming a data-informed cultural organization. We started with data collection. After all, it’s difficult to become a data-informed cultural entity without data to help make high-confidence strategic decisions.
The link between collecting data and accepting it as a valid basis for making strategic decisions lies in data interpretation. These roles are carried out by leadership teams in some cases, whole departments in others, or – sometimes – one tireless and nerdy data champion who probably deserves a raise.
(Please give them a raise.)
Here’s why data needs analysts and advocates:
1) Data needs an insider (who knows what the findings mean)
For people to accept data findings, someone often needs to present those findings and explain what the data means. For readers of this website, that’s me and my team at IMPACTS. (Hi there.) We aim to provide analysis that puts the findings into context and explains their broader implications.
Similarly, individual organizations benefit by having someone who understands the data they’ve collected and can easily share the findings with others. This person – or these people – know how the data was collected; how big the sample size was; how the participants were found and contacted; what questions were asked; what key insights the organization was aiming to uncover; and the like.
But equally important is having someone who explains clearly what the data doesn’t mean, and what still needs to be measured. For instance, we see folks translate their own “length of stay” data to mean a whole host of things that overlook the complexity of how people engage with cultural organizations. How long somebody stays at an organization or within an exhibit is helpful information for traffic flow and determining where to fill out experiences… but it doesn’t necessarily tell the whole story of visitor satisfaction.
Single metrics tell important stories, but it is difficult for them to tell full stories if your organization isn’t continuously in-market collecting a whole bunch of information (and most aren’t). A data set may show that entertainment value motivates a visit (it does), but that doesn’t mean education value is unimportant. You need to have more information to understand that relationship. Some data sets require insider information as to the back-end of the data in order to understand what the heck is going on.
Data helps answer important questions that allow us to move forward with confidence, and it also raises additional questions to help us get even better over time. That’s a good thing! Understanding something about our audiences often leads to understanding even more about them. (And, critically, having more questions need not mean an entity should stall in acting on the answers it does have!)
We’re often asked to “show” big topics using only one data set, and the results are composite metrics like the one showing that organizations that highlight their missions outperform those marketing themselves primarily as attractions. We can certainly do it… but we’re also a big data company tracking 224 organizations, surveying 132,000 individuals, and conducting research using a multiplicity of advanced technologies and open-ended questions.
So be nice to your own in-house data detectives.
…And be willing to listen and ask questions while they gather the clues to help inform your strategy.
2) Data needs a storyteller (because data often tells a story)
Data does not always speak for itself because people often see what they want to see. (Yes, people even see what they want to see in data.)
I learned this early in my career with IMPACTS while working with a cultural organization client that was market testing possible new exhibits. The organization’s exhibits team conducted the brainstorming work and developed four concepts for us to test, in order to assess how likely people would be to visit each one.
What came back loudly and clearly was that none of the concepts were going to increase attendance or even have much impact on the organization’s reputation. Our team presented this bad news to executive leadership and the exhibits team. Though there were some helpful hints for a new direction, it was clear that none of these four concepts would be worth the cost. It was time to take what we’d learned and come up with new ideas to test that would be successful in meeting our goals!
Of course, one of the four concepts tested least-badly. Let’s call it Concept A. You can imagine our surprise – and the CEO’s surprise, amazingly – when we returned several weeks later to collect concepts for a new round of testing, only for the exhibits team to tell us it wasn’t necessary. As it turns out, they’d spent those weeks fleshing out Concept A – the “least bad” option. “We are so glad that this concept tested the best,” the director of exhibits said. “We’d hoped it would. Here are our detailed renderings. We’re excited to share them with you.”
The exhibit team was in the room when we went over the data. They saw the same data as the CEO and other executive leaders. But something human happened: They only heard what they wanted to hear. Because sometimes it’s the most helpful data that also stinks the most.
The director was half right, after all. Concept A did test the best… It tested the best of the unsuccessful concepts, all of which were shown to have little payoff for the organization.
Data tells a story. It needs somebody to tell it and, sometimes, to reinforce it.
3) Data needs a translator (because it may be easily misunderstood)
People can “go blind” to critical aspects of data – especially when it’s inconvenient or about something they personally care about and may have feelings toward. I’ve previously shared three dangerous misunderstandings about data among cultural executives.
A common issue is thinking that sample size is more important than having a representative sample. Indeed, sample size is very important, but consider this: Data from 20,000 Republicans in the US may yield different opinions of Donald Trump than a sample of 2,000 representative individuals in the US. A representative sample closely matches the characteristics of the population as a whole, or the characteristics of the group you’re trying to learn more about. The first group has a much bigger sample size, but if the goal is to understand US sentiment, those 20,000 folks do not provide a helpful snapshot. Similarly, when cultural organizations ask folks on their email lists or onsite about their reputations and how welcoming they are, they are collecting a skewed sample – even if it’s a large one. If the aim is to figure out how an organization is viewed overall, its reputation is determined as much by non-visitors and their perceptions as it is by the folks who actually go.
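For the statistically inclined, this point can be demonstrated with a minimal simulation sketch. The numbers here are all hypothetical (an invented population where overall favorability is about 40%, and an “email list” subgroup that skews 85% favorable): a huge sample drawn only from the list badly misestimates overall sentiment, while a much smaller representative sample lands close to the truth.

```python
import random

random.seed(0)

# Hypothetical population of 1,000,000 people. About 10% are on an
# organization's email list; list members skew heavily favorable
# (~85%) while everyone else sits near 35%, for a true overall
# favorability of roughly 40%.
POP_SIZE = 1_000_000
population = []
for _ in range(POP_SIZE):
    on_list = random.random() < 0.10
    favorable = random.random() < (0.85 if on_list else 0.35)
    population.append((on_list, favorable))

true_rate = sum(f for _, f in population) / POP_SIZE

# Large but skewed sample: 20,000 people, all drawn from the email list.
list_members = [p for p in population if p[0]]
skewed_sample = random.sample(list_members, 20_000)
skewed_rate = sum(f for _, f in skewed_sample) / len(skewed_sample)

# Smaller but representative sample: 2,000 people drawn from everyone.
rep_sample = random.sample(population, 2_000)
rep_rate = sum(f for _, f in rep_sample) / len(rep_sample)

print(f"True favorability:          {true_rate:.1%}")
print(f"Skewed sample (n=20,000):   {skewed_rate:.1%}")
print(f"Representative (n=2,000):   {rep_rate:.1%}")
```

The skewed estimate lands near 85% no matter how many more list members you survey; sampling more of the wrong group only makes you more confidently wrong.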
Folks also tend to go blind to key takeaways when there are comparisons, as the exhibits team did in the story I told in the last section. Here’s another common example: Sometimes folks will see that millennials are extremely connected to social media compared to older generations, but overlook the fact that all generations right now are extremely connected to social media and the web – and that’s the real point. Yes, millennials are even more connected, but digital platforms are still primary sources for baby boomers as well!
Data often needs someone who can competently “speak data” and also “speak leadership” in order to help folks focus on the things that matter.
4) Data needs a champion (because it is best kept front-and-center)
There are other strange, data-related cognitive tendencies as well – like the urge to disregard any data that’s not a “magic bullet” for success. I dive into market research on perceptions and behaviors surrounding cultural organizations for a living, and here’s some data-based bad news: There isn’t a single magic bullet.
Running a successful cultural organization relies upon effectively managing a multiplicity of factors. Don’t write off clues to be more effective simply because they aren’t the elusive magic bullet of the cultural sector that is going to singlehandedly triple tourism to your city. Cultural executives are not dummies. If that single, universal unicorn solution existed, our visitor bases would not be shrinking – but they are.
To that end, data needs a champion. It needs someone to make sure that we don’t forget about important findings, because forgetting about them is often much easier than tackling them head-on. Data needs someone to keep findings top-of-mind, someone to say, “don’t forget…” and someone to keep asking the hard questions.
Being a data advocate, champion, translator, or storyteller – whether you are a CEO, board member, staff member, or volunteer – is tough work. It’s work at the front of the boat, cutting through the strongest tides as executive leaders aim to turn big ships around to better educate and inspire people in a more intelligent and connected world. It’s rough and it’s challenging, and it’s too often overlooked.
Data needs advocates – or it risks being misunderstood, unused, or forgotten.
And your organization risks getting lost in the tide.
I am a data interpreter myself. Though I speak from experience on these rough waters, I may be biased. Speaking of biases, we’ll be talking about them next week when we cover the topic of “data acceptance.” Subscribe here so that you don’t miss it.