Corporate Social Responsibility for a Digital World

Corporate Social Responsibility is not just about managing, reducing and avoiding risk, it is about creating opportunities, generating improved performance, making money and leaving the risks far behind. — Sunil Misser, former Head of Global Sustainability Practice, PwC

The ideals of Corporate Social Responsibility (CSR) are grounded in the civil rights movement, environmentalism and consumer protection. Not too long ago, questions about where products were made, by whom, for how much, and who harmed whom or what were not center stage, nor part of societal dialogue. Forbes did not rank companies on the basis of their CSR reputations. Fast forward a bit and we see companies like Unilever, purpose-driven to reduce their environmental footprint and increase their social impact simply because sustainable development goals are the best long-term way to grow the business. Unilever is not alone, of course; McDonald’s, Microsoft, Starbucks and Google all have formal CSR programs prioritizing what is culturally important to their stakeholders (whether volunteerism, fundraising, sourcing, reducing waste or diversity). These policies are geared to benefit customers, employees, communities, society and the Earth, and, critically, to grow the business.

Corporate Data Responsibility is Awesome

A new and relatively unused term, Corporate Data Responsibility, is a bit different, yet very much a part of CSR. The 21st century is riddled with big data, analytics, machine learning and millennial stereotypes. Increasingly, companies are recognizing or reimagining themselves as technology companies. They understand that they are data-driven enterprises, even if they aren’t typically part of the tech sector. Putting aside traditional trade secrets and proprietary data, the most valuable commodity upon which to base business decisions is consumer data: personal information, behavioral histories and customer intent. Indeed, an abundance of information and innovation hubs are leveraging data to provide services in personal finance, healthcare and retail, and to change our daily habits and lifestyles. For example, Apple’s new Wake Alarm is designed to help improve your sleep schedule through data monitoring and reminders, an artifact of the quantified self movement.

With such an abundance of data, what responsibilities go along with the feast? If businesses are stewards of our communities and ecology, are they also stewards of the data they collect? Yes, we should expect them to be.

Creating a Data Culture

In January, Accenture published “Guarding and Growing Personal Data Value,” identifying five principles to guide responsible data management: stewardship, transparency, empowerment, equity and inclusion. While Accenture does a good job of combining these ideas into a single framework, it does not go far enough. Creating an organizational culture of data ethics and responsibility requires that the right stakeholders be empowered with the correct mission. Specifically, the Chief Information Officer function, or the top-level executive responsible for information governance, must be integrated into the organization’s CSR function, and the business must recognize that Corporate Data Responsibility is, in fact, a CSR issue.

Expanding the Principles to Create Business Value

Stewardship should go beyond legal compliance to craft a formal process in which the ethics of data management are paramount. Simply honoring the representations made to consumers on data security and privacy is not enough. Data stewardship should look at the full lifecycle of data in the enterprise, implement data minimization in product development processes, and focus on the expectations of privacy, confidentiality, exploitation and trade that a reasonable consumer would have in each specific transaction. Companies will increasingly be stewards of the underlying value of data, not just the bits and bytes. Each exchange of data has a value, even if no money changes hands.
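To make the lifecycle idea concrete, here is a minimal sketch of what field-level data minimization might look like in a collection pipeline. The field names, purposes and transaction are entirely hypothetical, not any particular company's schema; the point is simply that each field carries a declared purpose, and anything the current transaction does not need is never stored.

```python
# Hypothetical sketch: tag each field with the purposes that justify
# collecting it, then strip anything the current transaction doesn't need.
FIELD_PURPOSES = {
    "email": {"account", "billing"},
    "shipping_address": {"fulfillment"},
    "birth_date": {"age_verification"},
    "browsing_history": {"personalization"},
}

def minimize(record: dict, purposes: set) -> dict:
    """Keep only fields whose declared purposes overlap the transaction's."""
    return {
        field: value
        for field, value in record.items()
        if FIELD_PURPOSES.get(field, set()) & purposes
    }

order = {
    "email": "a@example.com",
    "shipping_address": "1 Main St",
    "birth_date": "1990-01-01",
    "browsing_history": ["/shoes", "/hats"],
}

# A fulfillment-and-billing transaction has no need for birth date or
# browsing history, so they are dropped before anything is persisted.
stored = minimize(order, {"fulfillment", "billing"})
print(stored)  # {'email': 'a@example.com', 'shipping_address': '1 Main St'}
```

A real pipeline would also attach retention periods and consent records to each field, but even this simple purpose-overlap check forces the lifecycle question to be answered at design time rather than after a breach.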

Without trust, people share less information, bad information, or no information at all. They become anxious, bewildered, and suspicious. They lie or self-censor otherwise beneficial information. If people don’t trust a company, they are more likely to switch to a competitor or resist or fail to become fully invested in the commercial relationship. — Taking Trust Seriously in Privacy Law, Stanford Technology Law Review (2016).

Transparency only comes with good stewardship. It is incredibly difficult to be transparent with consumers about internal business processes and information governance when the data lifecycle of different products and marketing strategies is poorly understood or undocumented. Consumers should have easy access to clear policies on what data is collected and how it is used. The goal is consumer trust that inures to the benefit of the company’s bottom line: more trust means access to more relevant data, which leads to more revenue opportunities.

The bar is on us to give enough value that people trust us. — Sundar Pichai, CEO of Google

Empowerment should focus on turning consumers into stakeholders in a value transaction. Data exchanged for a comprehensible and fair return on investment is a good deal, and consumers will provide accurate information if they are incentivized. Be additive, not subtractive. A name, social media profiles and where a particular person ate on Taco Tuesday all have value, but only if the data is correct and up to date. Returning value to consumers means different things across industries: healthcare (premiums, diagnoses or outcomes), finance (savings, planning and security), shopping (recommendations, discounts and tracking). There is a real fear among consumers that their data will be used against them in some manner. Address the fear, return value and create a channel for consumers to participate and confirm that they are getting the correct value. Fail to do so and face the consequences; advertisers are reported to have lost $24 billion in revenue globally to ad-blocking in 2015.

Equity simply means that if you ask for more data, you need to return more value. It runs hand in hand with transparency and empowerment. Similarly, inclusion goes to the traditional concept of social responsibility: companies should be stakeholders in local and global communities. Big data analytics offer incredible opportunities in urban planning, identifying under-utilized resources, epidemiology and global research efforts in medicine. Open Data initiatives can make companies stakeholders in and contributors to global innovation. Incorporating an opt-in social benefit strategy may be exactly the return on investment the consumer wants, which in turn incentivizes information accuracy.

Machine Bias, Ethics, Retaining Talent and Product Quality

Ultimately, human bias creates machine bias. Algorithms make decisions about the news we read, the ads we see and, potentially, the jobs we get and the opportunities for which we are deemed ‘qualified’ or ‘suited’. Algorithms are not free of human bias simply because they are designed to apply cold, sound logic; human errors, biases and flawed assumptions color their logic and behavior. For example, take a look at ProPublica’s excellent piece on Machine Bias. Their analysis focused on software used in the criminal justice system to calculate the likelihood that a particular person will commit a future crime. The software has real-world impact, including denial of bond and parole. The algorithm, however, was determined to be racially discriminatory, remarkably unreliable in forecasting violent crimes and, thus, ineffective in meeting its big data objective.

As part of a data responsibility strategy, companies must formally acknowledge the risk of machine bias in their products and empower designers and engineers to flag logic that incorporates biases or creates a discriminatory or disparate impact. Of course, this is much easier said than done. At its core, CSR focuses on changing human behavior in organizations to achieve a larger goal. We need to use the knowledge from CSR programs to help people change data-driven companies. For example, it is crucial to rely on a diverse workforce that brings together different perspectives and experiences, making it more likely that flawed algorithmic logic is identified. Diversity (across all dimensions, including gender, race, religion, nationality and economics) is the key driver required to identify and challenge algorithmic disparate impact.
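One simple screen engineers can run when flagging potentially discriminatory outcomes is a selection-rate comparison across groups, along the lines of the "four-fifths rule" used in US employment law. The sketch below is illustrative only; it is not ProPublica's methodology, and the groups, decisions and 0.8 threshold are assumptions for the example.

```python
from collections import defaultdict

def disparate_impact_ratio(outcomes):
    """outcomes: iterable of (group, favorable) pairs, favorable a bool.
    Returns (min rate / max rate across groups, per-group rates)."""
    totals, favorable = defaultdict(int), defaultdict(int)
    for group, ok in outcomes:
        totals[group] += 1
        favorable[group] += ok
    rates = {g: favorable[g] / totals[g] for g in totals}
    return min(rates.values()) / max(rates.values()), rates

# Made-up decisions: group_a is favored 60% of the time, group_b only 30%.
decisions = (
    [("group_a", True)] * 60 + [("group_a", False)] * 40 +
    [("group_b", True)] * 30 + [("group_b", False)] * 70
)

ratio, rates = disparate_impact_ratio(decisions)
print(rates)   # {'group_a': 0.6, 'group_b': 0.3}
print(ratio)   # 0.5
if ratio < 0.8:  # common four-fifths screening threshold
    print("flag: potential disparate impact, route for human review")
```

A low ratio does not prove discrimination, and a high one does not disprove it, but making the check a routine part of model review gives engineers a concrete, defensible trigger for escalating concerns.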

All companies are competing for the same millennial talent to design and develop their algorithms and systems. According to the 2014 Millennial Impact Report, millennials who stay at their jobs for more than five years are passionate about their work and feel bonded with their co-workers by a belief in their company’s mission and purpose. CSR programs, and Corporate Data Responsibility with them, are tools to get the best millennial engineering talent to stick around and develop the next fantastic product. Millennial talent goes to companies with strong values and ethics. Declare your Charter of Data Ethics to all candidates and employees: We do not accept Machine Bias in our products, and neither should you! Remember, if an algorithm is unintentionally discriminatory, then it is at the very least a poorly designed product.

Data responsibility is a strategy for better design, innovation, thoughtfulness, talent retention and also improving product quality. If the system is intentionally discriminatory — well, that’s a different conversation altogether and ripe for another posting.

Organizational View on Privacy, Security and Ownership

At the heart of any data responsibility strategy is a perspective on privacy and security. While privacy and security are often conflated into a single concept, they are two different terms with very different meanings. Privacy is the degree of control that a company gives a consumer over the data the company collects and stores. Security, on the other hand, is the philosophical view of access control (who should see the data) and the real-world implementation of that view (how we keep others from seeing it). Instead of focusing solely on regulatory risk, business choices about privacy and security should be outputs of a cultural approach to data.
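The distinction can be made concrete in code: privacy as consumer-granted permissions over how data may be *used*, security as company-enforced access control over who may *see* it. This is a hypothetical sketch; all fields, roles and purposes are invented for illustration.

```python
# Privacy: the consumer decides what the company may do with each field.
consumer_prefs = {
    "email": {"billing"},                    # use permitted for billing only
    "purchase_history": {"recommendations"},
    "location": set(),                       # collected, but no use permitted
}

# Security: the company decides which internal roles can read each field.
role_access = {
    "billing": {"email"},
    "marketing": {"purchase_history"},
    "support": {"email", "purchase_history"},
}

def may_use(field: str, purpose: str) -> bool:
    """Privacy check: did the consumer permit this use?"""
    return purpose in consumer_prefs.get(field, set())

def may_read(field: str, role: str) -> bool:
    """Security check: is this role allowed to see the field at all?"""
    return field in role_access.get(role, set())

# Both checks must pass: marketing can read purchase_history (security),
# but only for a purpose the consumer permitted (privacy).
print(may_read("purchase_history", "marketing"))       # True
print(may_use("purchase_history", "recommendations"))  # True
print(may_use("location", "marketing"))                # False
```

Separating the two tables keeps the organizational question honest: the consumer owns the first, the company owns the second, and neither check can substitute for the other.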

How does enabling privacy grow new business value? What level of security is expected by the consumer? These are tough questions that are industry specific and require a lot of fact-finding and study. There is a balance to be struck between opportunity, cost and specific consumer expectations. There is plenty of room for companies to take leadership roles in this dialogue (like Apple and Fitbit).

Closely related to privacy and security is the less often discussed concept of ownership. Do consumers have legal rights and complete control over a single piece of data? Can consumers control the distribution of data sets across systems, companies and borders? As governments and financial institutions continue to research blockchain and distributed ledger technologies, we will see data ownership take a more central role over privacy and security.

We Need to Change Our View of Customer Data

Corporate Data Responsibility is a part of CSR and a value creator for businesses. Forward-thinking strategies require the breaking down of organizational silos that restrict creation of a new data culture. Chief Information Officers must be integrated into the CSR function to cause change and become stakeholders in the discussion. Businesses are becoming largely data businesses — and governments are becoming data governments. If consumers and employees measure companies by their CSR, impact and care of people and the environment, then establishing a data ethics culture is the next hurdle.

Successful Corporate Data Responsibility requires a new way of thinking that combines business, legal and technical understanding of the internal and societal impact of data governance. Over the next five years, the building blocks and best practice guides for Corporate Data Responsibility will be established. Companies that work hard now to define better data ecosystems will claim more market share with less investment. By analogy, as critical as it was for Nike to incorporate CSR, it will be just as critical to define a data culture of stewardship, ethics and reducing the risk of machine bias. Corporate Data Responsibility cuts across the entire enterprise, from product development teams to marketing, from legal to the C-suite and from the CSR function to the Board of Directors.

This is the first of a series of posts on Corporate Data Responsibility and organizational data ethics impacting society and business.

Khurram Nasir Gore is former General Counsel and Chief Strategy Officer of Personal BlackBox Company, a public benefit corporation focused on putting data ownership in consumers’ hands to build mutually beneficial data relationships with brands and increase consumer engagement. In Spring 2016, Khurram was recognized by the National Law Journal as one of “America’s 50 Outstanding General Counsel” for his innovation and leadership work, and also as a “Rising Star” by the Minority Corporate Counsel Association. He is a frequent speaker on Corporate Data Responsibility, privacy and cybersecurity, intellectual property and diversity.