Oh, being human…it means having opposable thumbs, the capacity to love, and a whole host of cognitive biases that mess up nearly everything we try to accomplish.
In our four-week series on the key elements to becoming a data-driven organization, we’ve covered data collection and data interpretation, but the topic we’re covering today – data acceptance – may be the most difficult. Luckily for us, it’s also probably the most fun to shine a light upon! Simply put, our brains play tricks on us to avoid effortful thought, and those tricks are called cognitive biases. When data is overlooked or misunderstood, cognitive biases are a common culprit.
This week’s Fast Facts Video For Cultural Executives sheds light on three common cognitive biases that can make data-driven decision making more difficult.
In order for an entity to become data-driven, it must first accept the data as valid – even if the findings are inconvenient. While some may think data collection is the trickiest part of becoming data-informed, our experience is that this is far from true. Accepting uncomfortable truths is just that – uncomfortable. Our brains want to avoid the discomfort associated with challenging preconceived notions – even if those preconceived notions are baseless.
But an organization can’t shift its culture to be data-informed until its people get comfortable being uncomfortable.
My colleagues and I are increasingly asked to address cognitive biases in speaking engagements and workshops. We’ve written about cognitive biases many times before (though without a snazzy video), so today we’d like to revisit three especially sneaky biases undermining efforts to lead successful cultural organizations.
Confirmation bias is the tendency to interpret evidence as confirmation of one’s preexisting beliefs or hypotheses. Essentially, we see what we want to see, paying special attention to “evidence” that seems to prove what we already believe. You may notice it most easily in political stances: we tend to believe claims that align with our party and disregard those from the other side. We find what we seek out, and we tune out what makes us uncomfortable. Confirmation bias is one of the most common biases within the cultural industry – and perhaps in general.
This is my favorite example of prevalent confirmation bias within the visitor-serving industry: data shows that broadly publicized free admission days result in attendees who are more educated and have higher household incomes than attendees on regular, full-price admission days – yet many organizations consider free days to be affordable access programs aimed at attracting low-income individuals!
IMPACTS collected data from 48 cultural organizations that have a regularly scheduled free admission day and found that the average household income of free-day visitors is $4,668 higher than that of visitors on a paid admission day. The average level of education is also higher on free days than on full-price admission days. There are also more repeat visitors on free days, meaning that free days generally do not encourage visitation from new audiences, either.
If you’re thinking, “No way! I see plenty of low-income folks at our free days,” hit pause.
Since when do these folks wear signs? If you’re thinking this, it may be confirmation bias at work. In this case, confirmation bias may lead professionals to believe their organization is exempt from this well-documented finding because, on a free day, they spotted some people who looked “low income” – while actively looking for people who fit that image. The same person might spot a similar number of “low income”-looking visitors on a regular admission day.
Embedded in this example of confirmation bias may be another dangerous bias: stereotyping.
Looking at the bigger picture, the fact that broadly publicized free admission days do not attract lower-income visitors shouldn’t be surprising at all. Organizations tend to publicize these free days on the same communication channels they use for everything else (reaching a generally wealthier bunch), and who doesn’t love a deal? Moreover, the type of people who go to cultural organizations are the type of people who go to cultural organizations, and being perceived as unwelcoming or as unworthy of a person’s time is a bigger barrier to visitation than cost.
This is but one example of confirmation bias in action, but it commonly takes place when we evaluate the success of programs without hard data. We may believe a program was engaging, perceived as welcoming, or successful because “we saw it with our own eyes.” In reality, our own eyes often see what they want to see and collect evidence to underscore what we already believe to be true – even if it’s not.
Before we get into availability cascades, I’d like to introduce their close cousin: availability heuristics. These are mental shortcuts that rely on the immediate examples that come to mind. The media play an interesting role here: they are one reason people overestimate the frequency of shark attacks, for example. Attacks are relatively uncommon, but the idea of one is graphic, and cases are widely reported by news outlets. Thanks to availability heuristics, you may find yourself steering clear of the water to avoid a shark attack. In short, availability heuristics cause us to misjudge the frequency and magnitude of events.
In the world of visitor-serving organizations, availability heuristics are a big reason cultural organizations often overestimate the rate of membership fraud they experience. (They remember the person they caught trying to sneak in without a membership, and not the hundreds or thousands of people who came in with active memberships who were not trying to defraud the institution.)
An availability cascade is another mental shortcut based on how easily something comes to mind: a self-reinforcing cycle that explains how certain kinds of collective beliefs develop. Essentially, the more often we hear or say something, the more likely we are to believe it is true – even if it is false.
The thing is, even if insider professionals tell themselves over and over that something is true about audience behaviors and perceptions, it doesn’t actually make it true. There are enough examples of this within our industry to fill an article on its own. (“If we build it, they will come” is based upon an excellent movie quote, but tickets to cultural organizations are not generally bought. They are sold.)
One once-popular availability cascade (saying something so often that we believe it is true) is the idea that mobile apps are a cure for all engagement ills. There was a time when leaders told one another at conferences that mobile applications were the key to success. On the whole, they aren’t. This is an example of an availability cascade, but it’s also an example of the bandwagon effect in action – another cognitive bias in which the uptake of ideas or beliefs increases the more they have been adopted by others.
Conservatism bias refers to the tendency to revise one’s beliefs insufficiently when presented with new evidence. As humans, we tend to over-weigh past perceptions and under-weigh new information. This can make decision-makers slow to react to new information and place too much weight on past ideas.
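One common way to picture “revising one’s beliefs insufficiently” is as a dampened Bayesian update: a rational thinker scales their prior odds by the full weight of new evidence, while a conservative thinker applies only a fraction of that weight. Here is a minimal sketch – the probabilities and the dampening exponent are illustrative assumptions, not figures from IMPACTS data:

```python
def bayes_update(prior, likelihood_ratio):
    """Rational update: prior odds are scaled by the full likelihood
    ratio of the evidence (here, evidence AGAINST the belief)."""
    odds = prior / (1 - prior)
    new_odds = odds / likelihood_ratio
    return new_odds / (1 + new_odds)

def conservative_update(prior, likelihood_ratio, weight=0.25):
    """Conservatism bias: the same evidence receives only a fraction
    of its proper weight, so the belief barely moves."""
    odds = prior / (1 - prior)
    new_odds = odds / (likelihood_ratio ** weight)
    return new_odds / (1 + new_odds)

# Illustrative numbers: a leader is 90% sure a belief holds, then sees
# evidence that is 4x more likely if that belief is wrong.
prior, lr = 0.90, 4.0
print(f"Rational update:     {bayes_update(prior, lr):.2f}")         # drops to ~0.69
print(f"Conservative update: {conservative_update(prior, lr):.2f}")  # barely moves, ~0.86
```

Under-weighting the likelihood ratio (the hypothetical `weight` exponent) is one standard way to model conservatism in belief updating; the point of the sketch is simply that the biased thinker ends up far closer to their prior than the evidence warrants.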
One example is the weirdly stubborn misconception that only millennials use the web and social media as a primary information source. This has been disproven multiple times by multiple sources. Social media, web, and mobile web are the top information sources for likely visitors to cultural organizations – regardless of age!
Still, many organizations have been slow to embrace the importance of web-based platforms among non-millennial audiences.
And, from a conservatism bias standpoint, this makes sense. Consider this: before the web, professionals targeted Baby Boomers on traditional media channels for decades. Entities sank or swam based on how effectively they attracted Baby Boomer audiences (the then-largest generation) on channels such as television, radio, and print advertising. Social media and digital platforms are still comparatively new. We may associate success in reaching Baby Boomer audiences with traditional media channels, which may be why the fact that they now use the web as a primary source of information is still proving difficult for some leaders to wrap their minds around. It forces people to evolve their thinking after decades of “knowing” that these folks primarily use traditional platforms (because that’s all that existed), and that new idea is hard to accept.
But it doesn’t change today’s reality.
Cognitive biases are human. We all have them. They’re arguably especially relevant for cultural leaders given that our own opinions generally don’t represent those of potential visitors. As insider experts, we have skewed perspectives. We know what our institutions are aiming to accomplish, we tend to know our missions by heart, we generally know a whole lot about our content areas, and – critically – we generally already think those subjects are cool and relevant.
Shining a light on our own blind spots is challenging precisely because they are blind spots. In fact, not recognizing your own biases is itself a bias: the blind spot bias.
It takes a lot of evidence and effort for us to change even our own minds! When you think you know something based on your own, anecdotal evidence, look into it. You might be onto something…
…or it might be a cognitive bias.
Nerd out with us every other Wednesday! Subscribe here to get the most recent data and analysis on cultural organizations in your inbox.
This is Part 3 in the series on becoming a data-informed organization.
For Part 1, see Let’s Talk Nerdy – Collecting Good Data.
For Part 4, see Seven Things Data-Informed Organizations Do Differently.