Data, the Sharing Economy and Canadian Federal Regulation

Katherine Reilly
Feb 17, 2018

Theory tells us that data is central to digital platform capitalism, so there is growing interest in creating policies that protect consumers and foster innovation given new data-driven business practices. But data is not new to digital capitalism, and there are existing policy regimes in place that address privacy, intellectual property, competition, and consumer and human rights. Establishing new data policies is not as simple as updating the old data policies. This article explains why, with reference to Canadian federal regulations.

Data and the Digital Platform Economy in Theory

Data is central to the sharing economy. Srnicek (2017) explains that digital platform business models are distinguished from analogue platforms by their intensive use of data. A digital platform positions itself “as the ground upon which [users’] activities occur, which gives it privileged access to record them” (ibid., p. 43). Platforms also grow rapidly and establish market dominance through network effects: “the more numerous the users who use a platform, the more valuable that platform becomes” (ibid., p. 44). Companies then assert “control and governance over the rules of the game” (ibid., p. 47) by manipulating the algorithms used to manage user interactions (Grossman 2015; Just & Latzer 2016), a phenomenon known as ‘design-based regulation’ (Yeung 2017). Finally, the data is a commodity that can be leveraged or sold (Schiller 2007).
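The network-effects claim can be given a stylized quantitative form. The sketch below uses Metcalfe's law (a common stylization of network value, not Srnicek's own formula): a platform's potential value grows with the number of possible user-to-user connections, n(n-1)/2, so value grows far faster than the user base itself.

```python
# Stylized sketch of network effects. Metcalfe's law is one common
# stylization (an assumption here, not taken from the text above):
# platform value scales with the number of possible user pairs.

def potential_connections(n_users: int) -> int:
    """Number of distinct user pairs a platform can intermediate."""
    return n_users * (n_users - 1) // 2

for n in (10, 100, 1000):
    print(n, potential_connections(n))
```

Growing the user base 100x (from 10 to 1,000 users) grows the possible connections roughly 11,000x (from 45 to 499,500), which is one way to see why early scale can translate into durable market dominance.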

This means that data and algorithms are key to understanding how the digital platform economy (DPE) is reshaping participation in economic and social processes. Emerging research has identified several DPE digital rights issues including discrimination (Ert et al. 2016; Edelman et al. 2017; Rosenblat et al. 2017), growth in inequality (Schor 2017), anti-competitive outcomes in markets (Reilly 2017), and a new balance of power between platform owners and platform users (Calo & Rosenblat 2017), including workers (Rosenblat & Stark 2016; Becerril 2017).

Of particular concern, the reputation management systems that govern users’ participation in digital platforms decide who gets to take up jobs, unlock cars, or access accommodations (Hearn 2010). These findings raise questions about the transparency of reputation systems, access to personal data, and the portability of reputational data across platforms (Stewart 2014). In addition, data may be used for anti-competitive service and pricing practices that discriminate against new market entrants or reduce consumers’ freedom of choice in the marketplace.

As a result, governance of the DPE “is increasingly seen as an important regulatory issue” (ITforChange 2017, n.p.). The new business practices introduced by the digital platform economy are putting stress on existing policies, but it is not clear how to update them.

Policy Context for the Digital Platform Economy in Canada

You would be forgiven for thinking that the sharing economy is a new ‘sector’ of the economy, given how digital platform enterprises are talked about. This is likely an effect of early research on the sharing economy, which has tended to focus on case studies of iconic new companies. But conversations with experts at Statistics Canada convinced me that we are actually facing new types of business practices that cut across the economy at different scales of activity, to different extents, and with subtle variations in form. This makes regulation of new data-intensive practices very complex.

In Canada, ‘data intensive sharing practices’ interact with a variety of different regulatory frameworks.

Canada’s most obvious data policy was established in the early 2000s to support our trading relationship with the European Union. The Personal Information Protection and Electronic Documents Act (PIPEDA) addresses the collection, use and disclosure of personal information by private sector actors in the course of commercial activities. PIPEDA made Canada a ‘safe harbour’ for European data. Basically, it establishes a set of principles for the use of private data, particularly the need to obtain consent for specific uses.

Canadian competition policy is also relevant to data-intensive platform-based practices. Companies have long gathered intelligence about their client base. But platform-based practices may make it possible for companies to gather or acquire information about their clients in new and concerning ways. For example, the knowledge that companies amass about their clients may enable them to engage in anti-competitive pricing practices, or forced loyalty schemes.

Surprisingly, intellectual property law is also relevant to digital platform practices. In 2016, a Canadian court upheld a decision to apply copyright protections to a compilation of data, as if this data were a cultural product (such as a book or movie). The application of copyright law to databases of consumer data would provide companies with a tool to avoid public scrutiny of their data practices.

There are also signs that federal actors are taking an interest in updating Canada’s innovation, labour and taxation policies to accommodate new business practices. Statistics Canada recently published the results of a Labour Force Survey on this topic. StatsCan is also considering how to better capture the economic value of these activities. And finally, the 2018 Canadian Internet Use Survey will include questions about the use of platforms to supplement or replace traditional labour activities. This research raises new questions: What is the value of data assets held by Canadian companies? Should they be included in corporate income tax filings? What supports or protections should the federal government offer to Canadian workers? How is innovation changing, and should incentives change to reflect new and different approaches to production?

Finally, we cannot forget about international and trade law. The EU’s new privacy rules, plus ongoing NAFTA and TPP negotiations, will have implications here as well. In particular, the two trade deals would prohibit any form of data localization or any restrictions on the transfer of data between countries. This would allow Canadian companies to hire foreign firms to work with their data, and it would also allow foreign companies to hold Canadian data in foreign cloud services or databases.

Note that Canada does not have data localization laws except in the case of health data. There are also some small pockets of data localization policy, for example as it relates to research and some legal practices. Instead, Canada uses the accountability principle — data can be transferred anywhere, but the ‘owner’ is accountable for that data, wherever it is.

In sum, updating federal policies for the sharing economy is not as simple as re-writing PIPEDA. In fact, there is a quiet movement afoot to engage in a more meaningful review of Canada’s data policies. In January 2018, Jim Balsillie of RIM fame said: “It’s critical that Canada designs and implements a National Data Strategy to protect our prosperity, security and values.” And in February 2018, Rohinton P. Medhora, President of the Centre for International Governance Innovation, joined Chrystia Freeland, Minister of Foreign Affairs, and David Lametti, Parliamentary Secretary to the Minister of Innovation, Science and Economic Development, to spark interest in the development of just such a strategy.

Regulatory Considerations

If it even gets off the ground, we don’t yet know what Canada’s National Data Strategy will look like. But after interviewing a dozen Canadian experts on the theme, I’ve identified four issues that should be taken into consideration in the discussion.

Data versus Algorithms — Algorithms are fundamentally changing the game where data is concerned. The PIPEDA framework was created to protect private information, and is based on a fundamental distinction between private and public information. Unfortunately, it has been relatively simple for private sector actors to circumvent PIPEDA by anonymizing data. If the data is anonymous, they argue, then its use no longer infringes on privacy. The difficulty here is that algorithms make it relatively easy to de-anonymize data. This means that algorithms take us into an entirely new realm, one requiring new approaches to privacy law and digital rights.
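To make the de-anonymization point concrete, here is a minimal sketch of a linkage attack. All of the data, names and fields below are entirely hypothetical; the point is only that "anonymized" records which retain quasi-identifiers (postal code, birth year, sex) can be re-joined to a public dataset that carries names.

```python
# Illustrative sketch with hypothetical data: re-identifying
# "anonymized" records by joining on quasi-identifiers.

# "Anonymized" platform records: names stripped, but postal code,
# birth year and sex remain.
anonymized_trips = [
    {"postal": "V5K 0A1", "birth_year": 1984, "sex": "F", "trip": "airport run"},
    {"postal": "V6B 1A1", "birth_year": 1990, "sex": "M", "trip": "late-night ride"},
]

# A separate public dataset (e.g. a registry) that includes names
# alongside the same quasi-identifiers.
public_registry = [
    {"name": "A. Smith", "postal": "V5K 0A1", "birth_year": 1984, "sex": "F"},
    {"name": "B. Jones", "postal": "V6B 1A1", "birth_year": 1990, "sex": "M"},
]

def reidentify(trips, registry):
    """Join the two datasets on (postal, birth_year, sex)."""
    index = {(p["postal"], p["birth_year"], p["sex"]): p["name"] for p in registry}
    matches = []
    for t in trips:
        key = (t["postal"], t["birth_year"], t["sex"])
        if key in index:
            matches.append((index[key], t["trip"]))
    return matches

print(reidentify(anonymized_trips, public_registry))
# Each "anonymous" trip is now linked back to a named individual.
```

Nothing in this join is exotic: it is an ordinary database operation, which is precisely why anonymization alone is a weak privacy guarantee once auxiliary datasets exist.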

Types of Data — When creating data policy, we need to categorize data according to its functional purpose. Based on my observations about the Canadian case, I believe it is important to distinguish between data that is used to influence people’s preferences and data that is used to intermediate the exchange of goods and services.

In the Canadian case, the networked media economy is the most vocal when it comes to regulating private data. This group includes both traditional ‘broadcast distribution undertakings’ (Bell, Rogers, Corus/Shaw, Quebecor) and new media actors such as Facebook and Google. What unites them is a desire to influence people’s preferences so that they will be more inclined to engage in particular types of buying or voting behaviours. But the final choice is left up to the consumer or citizen, in a marketplace that offers many different options.

To my knowledge, businesses that engage in platform-based intermediation of goods and services have stayed quiet in Canadian policy circles that address data-related issues. Why is this? These actors view their data as a source of competitive advantage, so they are not inclined to share it with others. In addition, the data is collected under conditions in which users have already given their consent, so PIPEDA is a generally agreeable policy framework for these actors.

This does not mean that the data-intensive practices of these companies are beyond regulatory scrutiny. It’s just that the regulations would need to be quite different in this case. Trade policy is a more likely target for data-intensive market intermediaries, because they are deeply affected by data localization. And since digital platform intermediaries use data and algorithms to govern the exchange of goods and services in the economy, the primary concern is not privacy, but ensuring competition and digital rights. As one expert eloquently put it in a recent conversation, we need policies that prevent digital actors from “thinking that they can construct the marketplace just because they own the network.”

One last word here — it may seem strange to combine media actors with intermediaries in this section, but I think it is essential to look at their data use practices in tandem. Data service companies may be capable of creating a bridge between data for intermediation and data for persuasion. Also, it is somewhat artificial to separate persuasion from market intermediation; the two processes may be more interlinked than I have suggested.

Purpose, Scale and Reach — Given that novel business practices can be adopted throughout the economy, purpose, scale and reach are also important considerations in establishing new regulatory frameworks. Consider just the goods-sharing services active in the Vancouver area. They range from initiatives that support new forms of community economic development to online marketplaces that help people capitalize on the unused capacity in their consumer goods. Purpose matters because it influences how initiatives collect, evaluate and leverage data.

It has been suggested to me that scale is essential to this discussion, because scale creates new or additional affordances for companies. I think that the effects of scale will depend on a number of factors. Clearly there are network effects attached to the size of a platform enterprise (as discussed in section 1 above). Larger networks may be better positioned to achieve market dominance, which would give them special kinds of influence. But the value of a data set is not necessarily proportional to its size. And small companies are just as likely as larger ones to engage in third-party authentication, or to work with credit card companies or cloud platforms. This means that their data footprint may be much larger than the number of people in their client base would suggest.

The same set of considerations also applies to reach, particularly in reference to data localization. Larger companies will often reach across regulatory jurisdictions; however, they are also in a better position to adjust their practices to meet local regulatory demands. Smaller companies, on the other hand, are just as likely to work with third-party service suppliers located in other markets, making data localization a significant issue for them. Very local companies, however, may do everything in house, letting them fly under the radar where scale and reach are concerned. Clearly this set of considerations is ripe for empirical investigation.

Types of Regulation — A final consideration concerns the enactment of data regulations for the platform economy. Are data-intensive platform-based activities best regulated by a government authority, through industry self-regulation, or some other approach? The current approach in Canada favours industry self-regulation overseen by a government watchdog within a particular set of legal parameters. For example, Canada’s Federal Privacy Commissioner (Daniel Therrien) can investigate privacy concerns that fall within the ambit of PIPEDA, and he can recommend criminal investigations, but he has no powers of enforcement.

A similar approach seems to be taking shape with regards to the data-intensive activities of platform businesses. According to Grossman (2015; also Hazenberg & Zwitter 2017) platform data could be regulated through data audits, mandated open access to one’s own data, protections for data pooling efforts, and the like. However, this is a major challenge since most private enterprises eschew data transparency, given that data and algorithms are central to their competitive advantage (Schiller 2007). Workarounds to this problem are emerging. Several Canadian experts I spoke with suggested that companies could be required to inform clients about the scope and intent of their algorithmic practices, without revealing the contents of their codes. And regular data audits by government approved third party practitioners could certify compliance with federal legal frameworks. Audits would then provide the basis for oversight, and if necessary, disciplinary measures.

Most Canadian digital rights watchdogs would decry this approach, however. They have been dissatisfied with the limited powers of the Privacy Commissioner, and feel that the case law solution puts too heavy a burden on public ‘Davids’ to pursue private sector ‘Goliaths’ in court. Instead, they would favour greater enforcement powers, plus regulations that empower citizens and consumers to access, share and audit private data themselves.

Concluding Thoughts

In sum, establishing new approaches and frameworks for regulating data use in the sharing economy will almost certainly open up a new policy cycle in Canada, one that cuts across a range of federal institutions and issues. The results of this policy cycle will establish the legal, cultural and social foundations for the digital economy over the next 20 years.

Thank you to those who generously supported this work.

Bibliography

Becerril, Josemaria. 2017. “The Sharing Economy Comes to Mexico.” Jacobin, January.

Calo, Ryan, and Alex Rosenblat. 2017. “The Taking Economy: Uber, Information, and Power.” SSRN Electronic Journal.

Edelman, Benjamin Gordon, Michael Luca, and Daniel Alejandro Svirsky. 2017. “Racial Discrimination in the Sharing Economy: Evidence from a Field Experiment.”

Ert, Eyal, Aliza Fleischer, and Nathan Magen. 2016. “Trust and Reputation in the Sharing Economy: The Role of Personal Photos in Airbnb.” Tourism Management 55:62–73.

Grossman, Nick. 2015. “Regulation, the Internet Way: A Data-First Model for Establishing Trust, Safety, and Security.” White Paper. Regulatory Reform for the 21st Century. Data-Smart City Solutions, Harvard University.

Hazenberg, Jules L. J., and Andrej Zwitter. 2017. “Network Governance in the Big Data and Cyber Age.” Zeitschrift Fur Evangelische Ethik 61 (3):184–209.

Hearn, Alison. 2010. “Structuring Feeling: Web 2.0, Online Ranking and Rating, and the Digital ‘Reputation’ Economy.” Ephemera: Theory and Politics in Organization 10 (3).

ITforChange. n.d. “The Emerging Power of Platforms.” Decoding Digital Rights: A Primer for Activists and Practitioners. Accessed November 8, 2017.

Just, Natascha, and Michael Latzer. 2016. “Governance by Algorithms: Reality Construction by Algorithmic Selection on the Internet.” Media, Culture & Society 39 (2):238–58.

Reilly, Katherine M. A., and Juan P. Alperin. 2016. “Intermediation in Open Development: A Stewardship Approach.” Global Media Journal — Canadian Edition 9 (1):51–71.

Rosenblat, Alex, Karen E. C. Levy, Solon Barocas, and Tim Hwang. 2017. “Discriminating Tastes: Uber’s Customer Ratings as Vehicles for Workplace Discrimination.” Policy & Internet 9 (3):256–79.

Rosenblat, Alex, and Luke Stark. 2016. “Algorithmic Labor and Information Asymmetries: A Case Study of Uber’s Drivers.” International Journal of Communication 10:3758–84.

Schiller, Dan. 2007. How to Think about Information. 1st ed. Urbana, Ill.; Chesham: University of Illinois Press.

Schor, Juliet B. 2017. “Does the Sharing Economy Increase Inequality within the Eighty Percent?: Findings from a Qualitative Study of Platform Providers.” Cambridge Journal Of Regions, Economy And Society 10 (2):263–79.

Srnicek, Nick. 2017. Platform Capitalism. Cambridge, UK: Polity Press.

Stewart, Patrick J. 2014. “Why Uber Should Let People See Their Own Passenger Ratings.” Business Insider. October 23, 2014.

Yeung, Karen. 2017. “‘Hypernudge’: Big Data as a Mode of Regulation by Design.” Information Communication & Society 20 (1):118–36.

Katherine Reilly is an Associate Professor in Communications at Simon Fraser University, Canada. She researches ICT4D, Open Development and the Collaborative Economy.