Gathering data does not make an organization a data-informed entity.
There seems to be a great deal of buzz within the visitor-serving industry about data collection. And for good reason: being a data-informed cultural organization is a smart idea. Where many executive leaders in the past were forced to make “I think” decisions, there’s now more data available to help ensure that these leaders can increasingly make “I know” decisions.
My colleague and I received an interesting question along these lines as we were carrying out a strategic workshop for executive leaders at a museum last week: Are organizations that collect data more successful than those that do not?
I want to say, “Yes, hands-down!” But I have worked with enough organizations that sit on gobs of their own data while still battling the biggest challenges facing cultural organizations – from attendance not keeping pace with population growth to inclusion difficulties – to know that saying “yes” to this question would be a lie.
Understandably, part of being data-informed may mean having data to inform strategies and operations in the first place. Here’s the problem: Simply collecting data is useless.
In fact, it may be worse than useless. Data collected simply for the sake of collecting data – with no aim to challenge assumptions, test strategies, or integrate change – can be a waste of time and money. It risks being an entirely symbolic and defensive gesture that allows leaders to say with sincerity, “We have data,” without actually having anything meaningful to help the organization move forward.
Collecting data is not the same thing as understanding audiences.
Collecting data means gathering information, but it doesn’t mean that an organization is necessarily collecting helpful information to inform a strategic way forward, and it certainly doesn’t mean that an organization is actually acting upon the findings.
Some organizations with robust data collection and evaluation processes are still flailing or making poor audience engagement decisions. Despite their data collection efforts, some of these organizations still manage to know miraculously little about how their audiences think and behave, what those audiences expect from a visit, the behavioral economics at work within their institutions, or how to spot trends in order to make smarter engagement decisions. Data is best utilized to spot opportunities, challenge assumptions, and inform decisions because it tells us something about the living, breathing humans whom we aim to educate and inspire.
A mistake organizations make is focusing on data collection without attributing greater or equal institutional weight to data interpretation, acceptance, or integration.
Becoming a data-informed cultural organization has four distinct parts – three of which may be even more important than data collection, but are comparatively rarely discussed.
1) Data collection
Let’s start here.
The first step in becoming a data-informed cultural organization may be having data to inform the organization’s direction in the first place! More important than collecting your own data may be having reliable data sources at all. They may come from your own institution, a reliable publication (oh hey, there!), community partners, universities, or the convention and visitors bureau. Data collection may involve all of these kinds of resources and several others. It doesn’t always mean deploying on-site intercepts to ask about the visitor experience, for instance – but it could!
In the “data-informed cultural organization” conversation, data collection is only one step.
Data collection itself is a complicated topic – and perhaps more complicated than we care to admit.
For starters, data collection is a science best carried out by experts who understand things like sampling biases and how to create effective survey instruments that do not lead respondents toward particular findings. A well-meaning jack- or jane-of-all-trades (a common type of person within the nonprofit, visitor-serving industry) may unintentionally skew findings or carry out protocols that accidentally render entire data sets useless. We humans are biased in our assessments of what works and what doesn’t work within cultural organizations, which can make creating sound survey instruments difficult. Our data collection capabilities also tend to get better over time (hallelujah!), but that can mean we end up comparing new results against outdated baselines if we don’t adjust for the improvement.
This is all workable! But working with it requires institutions to recognize that data collection needs constant consideration. It’s someone’s job, and it’s a serious and important one.
Another inconvenient reality is that audience research and market research are not the same thing. Cultural organizations often aim to collect audience research, but our industry’s reliance on audience research may exacerbate some urgent problems.
Consider this: The US market is growing increasingly diverse, and many of the people driving that growth do not profile as being interested in visiting cultural organizations. The people that these organizations most need to reach are not following them on social media, and they aren’t onsite to fill out a survey. They are not on an organization’s email list. They are not a cultural organization’s audience, so they are not included in audience research.
Instead, we may benefit by capturing the perceptions and behaviors of these potential audiences through market research – which is often difficult for organizations to carry out cost-effectively on their own.
Today, cultural organizations are not succeeding in diversifying their audiences. Rather, they are reaching traditional audiences better than ever. This may be the outcome of an over-reliance on audience research instead of market research. We need them both.
When some organizations consider becoming a data-informed institution, they may consider only the issues under this heading – and forget the remaining three parts of this important process.
2) Data interpretation
If you’ve worked with me or if you’re a client of IMPACTS, then you know that we do not send or distribute our data decks more than an hour before we go over the findings. Why not? Because we’ve learned that people believe data speaks for itself – and once they’ve drawn their own conclusions from a deck, they tune out during the analysis and can miss the entire point. Indeed, data does speak for itself – after context has been provided.
I remember sharing data related to the favorability of exhibit concepts for one organization when I first started working with IMPACTS. The concept testing came back loud and clear: none of the tested options were favorable. Resources spent on any of the tested ideas would not be worth the investment, and they would not move the needle on any meaningful metric aside from draining the budget. We shared the findings with executive leaders, and then they distributed the data without interpretation to the exhibits team, thinking – of course – that the data spoke for itself.
When we checked back two weeks later, the lead exhibit designer thanked us profusely for the data and declared that he’d already made major moves forward on his pet project among the exhibit ideas – which, to his credit, was the least unfavorable of the bunch, but only slightly. “The data is clear!” he said, smiling. “This idea was the best!”
The CEO was dumbfounded. We were baffled. Indeed, that idea was “best,” but it certainly wasn’t good.
Even when looking at data, people may see what they want to see and ignore inconvenient information. This isn’t humans being stupid. Sometimes it’s simply humans being human.
Uninterpreted data is misinterpreted data
Data needs a translator. When it’s divorced from context, it can be meaningless – or worse, misleading. The data on this site comes with analysis, and yet leaders make common mistakes and sometimes take the data very far out of its own lane on the grounds of wishful thinking. Sometimes, folks will see data and think the take-away is the exact opposite! I’ve seen people use this data showing that all generations are connected to the web as justification that baby boomers don’t use the internet. (Huh?) I’ve seen this data about museums being trusted sources used as a case for an entity taking polarizing political action. (What the…?) I’ve seen this data showing that free days do not attract underserved audiences used as justification for more free days so that organizations may attract underserved audiences. (I cannot make this up.)
Here’s some of what’s going on when these mistakes are made. Mostly, it’s human to seek out findings that defend our existing beliefs (confirmation bias). When data challenges our assumptions, it can be difficult to even process those realities.
Data needs a person or people or a division or an internal culture to interpret it fully and state directly what the data means. Data needs a champion. It needs people to keep it in its own lane.
Too often, organizations forget that data cannot simply be collected and then distributed without presentation. When it is, the data risks creating not only confusion – but confusion masked as confidence.
3) Data acceptance
While all four parts of becoming data-informed are important, this one takes it home because it’s frequently forgotten: Data can be difficult. If you’re doing it right, there’s bound to be bad news.
And that’s good news!
Data informs strategic decisions. If you’re collecting data to affirm strategic decisions, then you may be collecting skewed vanity metrics, or simply not maximizing the power of data. The good stuff gets us questioning. It makes us think. Sometimes, it makes us feel stupid for not knowing something before seeing the data. If the data’s not difficult, then perhaps it’s not helping your organization to grow. And then what’s the point?
If you cannot hear the data, then you cannot use the data.
The data-informed topics that I write about that receive the most hate mail don’t receive it because they are untrue! Data is data. They get hate mail precisely because they are true and especially inconvenient.
Good data can make leaders defensive – no doubt about it. There are telltale phrases that leaders commonly say when they feel defensive about data. If you hear them or say them, think twice. It means that a finding is hitting a nerve. It means that someone is being challenged. It means that an institution may be on the brink of necessary evolution – that is, of course, if the leader can recognize these phrases as personal defenses!
I’m often asked how to help cultural organizations create a culture of questioning – one that overcomes these defenses and encourages leaders not to say, “That doesn’t apply to this organization,” but instead to ask, “To what extent does this apply to this organization?” It’s done by building a culture that values data-informed decision-making: not by sharing data once a year, but by revisiting findings and discussing their implications on an ongoing basis at every level of leadership.
4) Data integration
Only after data is gathered, interpreted, and accepted can the findings be integrated into strategic decision-making.
Data collection does not magically transform into data integration. This is a process that requires intention and – for more and more organizations – a cultural shift in how decisions are made.
What if my colleague and I had been asked last week, “Are organizations that integrate data (instead of simply gathering or seeing data) more successful than those that do not?” Then the answer would be, “Yes.” Data needs to be acted upon in order for positive change to take place – not simply gathered, presented, and discussed. Although, indeed, it needs to be gathered, presented, and discussed in order to be acted upon.
Data can be integrated once. For instance, an organization might take into account the digitally connected nature of cultural organization visitors and integrate changes by hiring or supporting social media managers.
To be a data-informed organization on the whole, however, data integration happens more than once. It might happen every day. Being a data-informed cultural organization means having a culture that values data-informed decision-making.
Some organizations symbolically cross “data collection” off of their “to-do” list and forget that it is just as important to put data into context, communicate it, integrate it, and make curiosity a cornerstone of institutional culture. Data needs to be collected, but it also needs to be clearly interpreted, presented, and integrated into strategy in order to provide value.
Cultural professionals educate and inspire people. I like to think that our secret is that, at our best, we are all curious. To share a famous quote that I admire…
“Do what you love. Know your own bone. Gnaw at it, bury it, unearth it, and gnaw at it still.” – Henry David Thoreau.