No More Magic Algorithms: Cultural Policy in an Era of Discoverability

Fenwick McKelvey
Data & Society: Points
May 9, 2016

Call it Discoverability. That’s what Canadian cultural institutions have labelled the new paradigm of broadcasting. Discoverability will be the focus of a summit happening in Toronto this week. While the venue might be local, the summit deserves international attention as a point of intersection between cultural policy and algorithms.

Discoverability refers to the “technological changes [that] have transformed how audiences consume content and made it possible for content from any source to bypass the regulated broadcasting system.” Bypassing the regulated system poses significant challenges to the historic institutions of Canadian cultural policy, chief among them the country’s main media regulator, the Canadian Radio-television and Telecommunications Commission (CRTC). The CRTC has a mandate to “safeguard, enrich and strengthen the cultural, political, social and economic fabric of Canada,” according to one of the policy objectives of Canada’s Broadcasting Act.

How do algorithms relate to enriching Canada’s social fabric? CRTC regulation actively promotes Canadian content. Canadian content, or CanCon, is a secret language spoken by Canadians and protected by the CRTC. In its first blockbuster decision on the future of broadcasting, the CRTC briefly touched on the influence of algorithms. These computational routines running in the background of clouds and computers “may be used by services to suggest content that may be of interest to their viewers based on their historical preferences” and may influence “how Canadian programming is discovered and promoted.”

Algorithms influence the discovery of Canadian content as they decide what is trending or make suggestions on online content platforms. Their influence subtly guides how we discover content. Eli Pariser, CEO of Upworthy, famously voiced a concern that online we each live in our own personalized media filter bubbles. What we can see, what’s brought to our attention, and what we discover now largely depend on the work of algorithms mining our habits and making recommendations. Our search results in Google or Facebook vary depending on our user profiles. Our Netflix and Amazon homepages display lists of tailored recommendations. Finish a YouTube video, and Google is ready with bespoke suggestions about what we should watch next.

As the CRTC examines its role in an era of Discoverability, algorithms may matter more to the regulator because they manage attention much as television and radio broadcasters traditionally have.

Our traditional broadcasting system is just one solution to what cultural theorist Jonathan Crary describes as an “ongoing crisis of attentiveness” where “the changing configurations of capitalism continually push attention and distraction to new limits and thresholds… and then respond with new methods of managing and regulating perception.” Canadian broadcasting is seeking a way to manage attention. Ads, programs, and Canadian content share the schedule to create the uninterrupted flow we watch habitually. The paying of our attention, literally, has been a profitable business in Canada, especially with genre protection and other ways to limit distractions.

The Discoverability Summit offers a chance to imagine an algorithmic cultural policy and ask: What could one look like? The CRTC, in the past, used quotas as a way to ensure Canadian content had a place in prime time television. The Let’s Talk TV decisions deliberately shifted the CRTC away from quotas, so we are unlikely to see a call for one in five algorithmic recommendations to be Canadian in origin. But quotas, for all their faults, struck a balance between technical transparency and regulatory accountability. Quotas recognized the influence of the television schedule and provided clear guidelines for how the schedule had to be designed to promote certain Canadian content.
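To see why a quota translates awkwardly into this new setting, consider a purely illustrative sketch of what a one-in-five rule might look like if it were ever imposed on a ranked recommendation list. Nothing here reflects actual CRTC policy or any platform’s system; the `is_cancon` flag, the titles, and the one-in-five ratio are assumptions made for the sake of the example.

```python
# Hypothetical sketch: reserve every fifth slot of a ranked list for Canadian
# content. The "is_cancon" field and the ratio are illustrative assumptions,
# not an existing rule or any platform's API.

def apply_cancon_quota(ranked_items, every_nth=5):
    """Re-order a ranked list so every nth slot holds Canadian content, if any remains."""
    cancon = [item for item in ranked_items if item["is_cancon"]]
    other = [item for item in ranked_items if not item["is_cancon"]]
    result = []
    for slot in range(len(ranked_items)):
        if (slot + 1) % every_nth == 0 and cancon:
            result.append(cancon.pop(0))  # reserved slot: take the next Canadian title
        else:
            result.append(other.pop(0) if other else cancon.pop(0))
    return result

recommendations = [
    {"title": "House of Cards", "is_cancon": False},
    {"title": "Narcos", "is_cancon": False},
    {"title": "Breaking Bad", "is_cancon": False},
    {"title": "Mad Men", "is_cancon": False},
    {"title": "Murdoch Mysteries", "is_cancon": True},
    {"title": "Orphan Black", "is_cancon": True},
]

for item in apply_cancon_quota(recommendations):
    print(item["title"])
```

Even this toy version shows the problem: a scheduling quota was visible in the TV guide, but a ranking rule like this would live inside a platform’s code, invisible to viewers and regulators alike.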

Nothing as simple appears in an era of Discoverability. The Canada Media Fund has admitted that “there exists no magical algorithm to make English Canadians prefer Canadian content”.

While algorithms might be influential, they’re apparently not influential enough to make anyone watch another season of Murdoch Mysteries (CanCon). New policies, instead, should focus on the link between algorithms and content discovery.

Could policies exist to guide what algorithms find relevant? Algorithms depend on data to make recommendations. Could certain cultural metadata be embedded to ensure Canadian content has other pathways of discovery beyond a platform’s usual recommendations? Or could the Canadian industry agree on relevance standards that would help it qualify influence, just as Upworthy and NPR have developed new ways to identify the stories that matter to their readers?
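One way to read the metadata question is as a scoring problem. The sketch below, again purely hypothetical, imagines a platform whose relevance scores could be nudged by a `country_of_origin` field attached to each title; the field name, the scores, and the boost weight are all assumptions, since real platforms disclose neither their data nor their ranking formulas.

```python
# Hypothetical sketch: cultural metadata feeding a recommender's ranking.
# The "country_of_origin" field and the boost weight are illustrative
# assumptions, not an existing standard or any platform's actual formula.

def score(item, cancon_boost=0.15):
    """Combine a platform's own relevance score with a metadata-based boost."""
    boost = cancon_boost if item.get("country_of_origin") == "CA" else 0.0
    return item["relevance"] + boost

catalogue = [
    {"title": "Murdoch Mysteries", "country_of_origin": "CA", "relevance": 0.62},
    {"title": "House of Cards",    "country_of_origin": "US", "relevance": 0.70},
    {"title": "Orphan Black",      "country_of_origin": "CA", "relevance": 0.58},
]

ranked = sorted(catalogue, key=score, reverse=True)
for item in ranked:
    print(item["title"], round(score(item), 2))
```

Whether such a boost sits at 0.15 or 0.0015 is exactly the kind of question a metadata or relevance standard would have to answer, and exactly the kind of detail that today stays locked inside the platforms.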

While we debate Canadian content algorithms, we also have to take seriously the much broader accountability issues raised by algorithms. It’s not just a matter of whether algorithms should promote Canadian content, but how we can understand this black box of new media power in the first place.

Algorithmic recommendations can have troubling outcomes. Mike Ananny, Assistant Professor at the University of Southern California, wondered what Google’s Android app store was suggesting when it recommended a Sex Offender Search tool to someone browsing the page of Grindr, a gay men’s dating app. Google later called it a bug.

Facebook sparked public outrage after revealing that it had conducted experiments on its users without informing them. Facebook tweaked about 700,000 users’ News Feeds to be more positive or negative to find out whether a happier News Feed meant happier users.

The popular dating site OkCupid ran a small study where it altered the compatibility or match rating between users. Only after a few bad interactions did users find out they were part of an experiment. It’s OK, explained Christian Rudder, OkCupid’s co-founder and data scientist, on the company’s blog: “if you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site. That’s how websites work.” That might be true, but it does not mean we can’t expect greater democratic oversight in the future. Could the CRTC identify inequities of power in the same way it tracks concentration in the Canadian media landscape? Could it help make algorithms more accountable?

The CRTC does have some experience addressing algorithmic accountability. Its Internet Traffic Management Practices hearings in 2009 dealt with how Internet service providers configured their routing algorithms to discriminate against traffic or to manage congestion (a choice of terms that largely signals which side of the debate you’re on). In the Let’s Talk TV decision, the CRTC started to look into how broadcasters should share data collected by their next-generation set-top boxes. Tackling data sharing would be a good start, since it is data that drives these new technical systems and their algorithms. What would the CRTC’s annual Monitoring Report look like if it included descriptions of recommendation algorithms from major content platforms or reports on changes to major social media platforms’ trending criteria? Going forward from the Summit, could the CRTC expand its inquiry beyond traditional broadcasters and into the data analytics driving algorithmic recommendations?

Solutions abound, and finding ways to make algorithms accountable has attracted major research attention internationally. The Discoverability Summit could provide a chance for the CRTC to raise these issues domestically. The CRTC has to find new means to understand the influence of algorithms in a Discoverability regime or else democratic oversight will evaporate into the cloud.

For anyone at the Summit, please attend the panel I’m moderating on May 11 at 9am entitled “Algorithms: On How Content Finds ‘You.’”

Thanks to Dwayne Winseck for inspiring this article.

Points/spheres: In “No More Magic Algorithms,” Fenwick McKelvey unpacks the impetus for and challenges of the Discoverability Summit, co-hosted by the Canadian Radio-television and Telecommunications Commission and the National Film Board of Canada: What could an algorithmic cultural policy look like? Fenwick’s piece was also prompted by the Who Controls the Public Sphere in the Era of Algorithms? workshop, which Data & Society hosted in February 2016 as part of our developing Algorithms and Publics project.

Fenwick McKelvey is an assistant professor of communication studies at Concordia University, Montréal. He frequently appears in the media as an expert on information and communication technology policy in Canada.

