What I learned from CADA’s 2018 Accountability Report
Last year, when Calgary Arts Development released its annual Accountability Report, I found that the numbers told an interesting story that wasn’t necessarily reflected in the communication around the report. You can read the whole thing if you like, but the main takeaway was this:
While Cornerstone organizations get the bulk of the attention (and the bulk of the funding), the other organizations in Calgary's arts landscape aren't just filler. They hold more events with fewer staff and less funding, and bring in a larger audience per dollar invested by CADA. The arts landscape is complex, and I wasn't saying that one scale of organization is more worthy of funding than another, only that it's worth digging behind the big, bold red numbers of the report for a more complete picture of what's happening in Calgary.
Well, it’s been a year, and CADA has released its newest numbers around what’s been funded, along with enough information about events, audiences, staff and volunteers to put together a pretty interesting picture of what’s been happening. I encourage you to dig into it yourself, but here are a few things that jumped out at me.
(Also, I'm just one person who did this over a pair of lunch hours, so if any of my numbers, interpretations, etc. are blatantly wrong, please let me know and I'll happily fix them.)
Everyone did more with less — but especially festivals
Talk to any arts organization and they’ll tell you they’re being stretched thin. It’s the nature of the field. But Calgary’s festivals have a pretty compelling case to make. Their overall CADA funding dropped by 12% (an average decrease of about $2,000 per organization), and so did their staffing. But they held 25% more events than they did in 2017, and drew in about 10,000 more Calgarians to their events. That’s only about a 2% increase in attendance, which means attendance per event dropped overall, but still.
Festivals were the only one of CADA's four categories to see a decrease in average FTE staff, and the only one to see a significant drop in overall CADA funding (although Cornerstones also saw a 1% drop in that category). But the picture changes slightly when you look at the average operating grant per organization. CADA added a net 20 organizations to its operating grant pool in 2018, which shifted the amounts available per organization and meant that all four streams saw a drop in average per-organization funding. In percentage terms, Community organizations were hit the hardest, with a 16% drop for the average organization, while Cornerstones saw the smallest drop at 1%. But in absolute terms, Festivals' $2,092 drop was by far the largest, especially considering that the total number of festivals competing for funds actually fell. It's easy to see how a 28% increase in the number of Community organizations would lead to a lower share per group, but less obvious why that would happen when there was one fewer festival in 2018 than in 2017.
(It's worth mentioning here that the hit to Cornerstones in total funding is probably understated in these numbers. In 2017, a number of project grants and bridge funds disproportionately went to Cornerstones and weren't offered again in 2018, which means their total funding almost certainly dropped by more than 1%. But for this comparison, I'm only looking at operating funding.)
Fewer people came to events — but especially Community events
*Since I posted this, the always-astute Claudia Bustos pointed out that 2017 would've included Canada150 celebrations, which explains some of the shifts seen below. It especially explains why there'd be 1,300 fewer total events, and why overall attendance would drop year-over-year. It doesn't entirely explain the drop in Community attendance, since those organizations held the same number of activities, but it makes sense that Canada150 might've inspired more attention and attendance at Community events in particular. (This is why it's always nice to have other people check your work.)
Total attendance at arts events in Calgary dropped by about 430,000 people, or about 13% of total attendance. Here's the crazy thing, though: roughly 410,000 of that drop came from Community organizations, whose overall attendance fell from 1,009,990 in 2017 to 600,229 in 2018. Cornerstone organizations also saw a drop of around 146,000 total attendees, while Professional orgs and Festivals had increases of 109,000 and 10,000 respectively to make up some of that gap.
Part of that drop might’ve been from the fact that there were about 1,300 fewer arts activities held in 2018 than 2017, although that doesn’t really explain the drop for Community organizations, as they held basically the same number of events in both years. Part of it might be related to the economy, or really any number of external factors. It’s also worth noting that two years of data really doesn’t show a trend, as either year could be an anomaly [as noted above, 2017 was almost definitely the anomaly]. In any case, it’ll be interesting to see how any funding increases in the 2019/20 year affect those numbers.
(Since it’s been asked on Twitter, Community orgs are organizations that are mostly volunteer run, with a focus on community engagement over professional arts presentation. You can see the full list of CADA-funded Community organizations here.)
Volunteers are the lifeblood of the sector — but especially outside the Cornerstones
Every organization intuitively knows this, but putting it into numbers paints a stark picture. The average non-Cornerstone has a staff of just 2.6 full-time equivalents (FTEs), down from 2.9 FTEs last year. Professional orgs and Festivals come in at just over 3 FTEs, and Community organizations average just 1.6. Cornerstones are a massive outlier here, with an average staff of 41 FTEs per organization, or almost 16x the average non-Cornerstone.
Counting staff alone, the average non-Cornerstone holds about 50 events per staff member, with Festivals pulling up that average at a whopping 98 events per staff member. For Cornerstones, it's just five events per staff member. In terms of audience size, non-Cornerstones average one staff member per 8,000 attendees, and Festivals again pull this up with 13,143 attendees per staff member.
Volunteers are the only way to make those numbers work. When you add them in, the shift is dramatic. Instead of 50 events per staff member, it's 0.6 events per person in the combined staff-and-volunteer pool. Instead of 8,000 attendees per worker, it's 107. For Cornerstones the difference is less dramatic, but still significant: events per worker go from 5 to 0.5, and audience per worker from 2,400 to 218. Volunteers are the only reason organizations can take on this many events, and their contribution absolutely cannot be overstated.
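If you want to check this kind of math yourself, here's a minimal sketch of the per-worker arithmetic. The per-organization event and attendance totals are back-solved from the averages above (50 events and 8,000 attendees per staff member, at 2.6 FTEs), and the volunteer count is an illustrative assumption on my part, not a figure from the CADA report.

```python
# Sketch of the per-worker math. The event/attendance totals are back-solved
# from the post's averages; the volunteer count is an illustrative assumption.

def per_worker(events, attendees, staff, volunteers=0):
    """Return (events, attendees) per person, counting staff plus volunteers."""
    workforce = staff + volunteers
    return events / workforce, attendees / workforce

# Implied average non-Cornerstone: 2.6 FTE staff, ~130 events, ~20,800 attendees
events_pp, attendees_pp = per_worker(130, 20_800, 2.6)
print(round(events_pp), round(attendees_pp))   # → 50 8000

# Assume a volunteer pool of ~215 people per organization (hypothetical)
events_pp, attendees_pp = per_worker(130, 20_800, 2.6, 215)
print(round(events_pp, 1), round(attendees_pp))   # → 0.6 96
```

Under that assumed volunteer pool, the attendee figure (~96 per worker) lands in the same ballpark as the report's 107; the gap just reflects that the averages I started from are rounded.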
Non-Cornerstones are still underfunded in terms of outputs
This one is admittedly more subjective, and my biases could be coming into play. As I discussed last year, with their funding, Cornerstones generate more economic activity and create more employment. They attract a larger audience per event, and create works on a scale that attracts more media coverage and more attention overall.
But there are some stats that don't sit right with me. CADA invests $810.61 per Cornerstone event, but only $327 per Professional event, $123 per Community event, and $120 per Festival event. Even factoring in audience, that's $1.80 per Cornerstone attendee, compared to $1.67 for Professional orgs, $0.83 for Festivals, and $0.63 for Community organizations. Adding in attendance at educational activities (it's not entirely clear from the CADA report whether those are included in the total attendance numbers) doesn't change much: $1.53 per attendee at Cornerstones, versus $1.47 for Professionals, $0.81 for Festivals, and $0.59 for Community organizations.
Does it make sense that the average Cornerstone, at $178,172 in CADA funding, gets 5x the funding of the average Festival ($33,672), 7x the average Professional org ($25,020), and nearly 30x the average Community organization ($6,322)? There seems to be a chicken-and-egg situation, where more funding is attracted by larger audiences and more prominent marketing, which are themselves a direct result of (among other factors) significantly higher budgets. But when the largest organizations receive 4x the funding per event compared to other organizations, that absolutely has an impact on their ability to hire artists and attract audiences, which in turn affects their ability to attract sponsors and other funding.
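Those multiples are just simple divisions of the average grants quoted above; here's a quick sketch for anyone who wants to recompute or extend them (the grant figures are the post's averages, and the rounding is mine):

```python
# Average operating grant per organization, using the figures quoted above
avg_grant = {
    "Cornerstone": 178_172,
    "Festival": 33_672,
    "Professional": 25_020,
    "Community": 6_322,
}

# How many times each stream's average grant fits into the Cornerstone average
cornerstone = avg_grant["Cornerstone"]
for stream, grant in avg_grant.items():
    print(f"{stream}: {cornerstone / grant:.1f}x")
# → Cornerstone: 1.0x, Festival: 5.3x, Professional: 7.1x, Community: 28.2x
```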
On the other hand, are non-Cornerstones shooting themselves in the foot by holding too many events, rather than putting more resources into fewer, higher-impact activities? And if they tried that approach, would they still receive the same funding they are now, or does the current model encourage overreach by all but the biggest organizations (and even by some of the biggest)? What part of these discrepancies is organic, what is systemic, and what is just an illusion caused by different organizational goals that aren’t accurately reflected by things like attendance, audience and funding?
Not every organization needs more funding to achieve its goals. Not every organization is aiming to attract larger audiences or hold more events. Not everyone defines success through a narrow, economic lens — in fact, I’d argue that no organization defines success that way, and no funder wants them to. It’s just the easiest lens to discuss, because it has concrete numbers that look objective, where most other measures are much more abstract.
Last takeaway: I’m really curious how this will change
Here's the thing about all the stats, all the numbers, all the questions I've just asked: they're all a bit moot now that CADA's funding has doubled, the Cornerstone program has been scrapped, and the future of that funding is being sorted out. Sure, if the proportions of funding that organizations receive don't change, the disparities I pointed out will still exist, even if the program title doesn't. But the scale of the changes taking place means there's a unique opportunity here to actually reflect on those questions, and to think about how different approaches might make them better — and what other complications that would cause.
This post is just one take on those numbers, filtered through my own biases of which numbers to combine, what jumps out as interesting, and what I’d like to know more about. I’d be really curious what other members of the community think — do these numbers reflect your experience? Do you have a different interpretation of them, or an explanation for some of the bigger changes? What would a better system look like, or is this one just about right? And maybe most importantly, should more of us be talking about this? Not in terms of public advocacy, just in terms of sharing, within the community, what our organizations are going through, and making sure we’re telling our stories properly? There’s a lot more happening than can ever be seen in adding up averages from a community report, and we won’t fix our problems until we actually understand them.