Why Data Ethics and Auditing Won’t Solve Everything

Lauren Toulson
Published in CARRE4 · 8 min read · May 17, 2021

These experts discuss the problems, challenges and benefits of auditing AI and of creating ethical principles to guide tech development toward fairer outcomes, as well as the lack of competency in many smaller businesses when it comes to spotting risk.

In Conversation with…

The panel is also joined by our host Bill Mew, a leading data-ethics campaigner, and Humayun Qureshi, founder of Digital Bucket Company.

Many businesses want to work ethically but don’t know how to translate principles into practice.

Merve: A lot of companies are committing themselves to ethical policies, and many of these commitments have come from scandals in the media, though some companies are thinking ahead. GDPR has definitely shaped this conversation, but we are at the very beginning of mainstream practice and application inside organisations.

Shea: A lot of organisations don’t have the attention, time or resources for this space. They need two things: external help from field experts to get them started, and established owners of this effort within the organisation. Without these it is difficult to ask, “are we doing the right thing with the data?”

Ilana: I agree we are in the early days of awareness. Over 100 companies have released ethical principles, but we need concrete practices to back them up.

There’s a push for ethical principles with non-revenue-focused metrics that we can hold ourselves accountable to.

Humayun: From an organisational point of view, we should approach this not just top down but also bottom up, so that everyone gains an understanding of what data and AI are and what the ethical challenges entail. Bring all stakeholders in the organisation into developing the principles, so that everyone knows what they are.

Photo by Leon on Unsplash

What advice would you give to businesses new to the industry to address the ethical issues?

Merve: Take a wide perspective, starting with company principles and responsibilities, and translate those into workforce composition. How are employees encouraged, incentivised and trained to challenge those propositions? Translate these principles into front-end design, but also implement them through continuous feedback, ensuring it flows both top down and bottom up.

Widen development and principles to the people it impacts and get their involvement in the feedback for the data practices.

Be transparent about how you are collecting, processing and using data inside the company for product development — document the process.

Sophie: There’s a need for openness and transparency. In terms of working with a business, it’s about developing a shared understanding throughout the organisation of the ethics central to that specific company, building this into the company’s DNA, and constantly reviewing it. This includes privacy by design and also inclusive design. Companies need to review and adapt on an ongoing basis.

Ilana: One element that needs clarifying is who owns these tasks; developers don’t always know what is expected of them. We’ve been emphasising the three-lines-of-defence method: the first line is the developers, the second is the management overseeing the development, and the third is the audit function checking the management’s checks. Everyone must know their responsibilities in that process.

How do we ensure transparency and enable auditing to be effective?

Merve: First, don’t prepare for an audit; embed the principles in everyday practices and products.

It’s not about ticking a box to pass an audit; it’s about practising responsibility every day.

Educating and training the workforce to understand this is essential: it builds the capacity to self-regulate, and responsibility becomes an embedded practice.

Shea: I advocate tackling what we can already do: take law that already exists, like GDPR, and translate it into principles you can audit against. Existing law is there to protect human rights, so it can be translated into auditable criteria. That might not work everywhere in the world, but it could work in places where those laws already exist and are enforced.

AI seems to exist in a landscape where these laws don’t work, because the systems are black boxes.

Photo by Markus Spiske on Unsplash

Should companies, public and private, be required to submit annual reports on their social responsibilities and ethical compliance policies?

Ilana: Annual reports are already happening at a lot of companies, but they can have a positive glow effect, showcasing the things that are going well and omitting those that didn’t.

Can auditors protect society? How do we know we can have confidence in these reports?

Shea: Auditing is just one tool, and it won’t solve all the problems. There are opportunities for bad actors to subvert the process, and that’s hard to avoid. It’s important to separate the people who conduct the audits from the services that prepare companies for audits; the two can’t overlap. Those doing the audits can’t be making the rules: the rules need to be consensus-driven and transparent, made by people other than the auditors. That helps minimise subversion of the process.

Sophie: There will always be a risk, even with attempts to minimise it. With ESG, there’s a financial and investment hit if you get caught out, for example if you’ve been greenwashing. The costs of getting caught need to be so great that companies don’t take the risk in the first place.

Humayun: Auditing has to happen at every stage of development, not just at the end; that’s how you get that safety net, and it’s one way AI auditing can differ from financial auditing.

Bill: There’s definitely a place for transparency, but there isn’t enough maturity, education and awareness in most organisations’ use of AI at this moment in time.

I think companies are unintentionally failing to address bias and failing to secure their algorithms, which are open to external manipulation without the companies even being aware of it.

We are assuming a level of competency that perhaps doesn’t exist at this moment in time.

Merve: Auditing is one of the best governance tools in the toolbox, but it is not a solution to anything and everything. When an audit meets the legal definition of independence, the auditor carries liability and works for the public, not for the company hiring them. It’s done for public benefit. An audit won’t solve all your problems, but it will point you towards the mechanisms, accountability and responsibility you should be aware of inside the organisation.

Photo by JESHOOTS.COM on Unsplash

Bill: If we are talking about companies moving towards transparency and accountability, we are talking about organisations that are sufficiently competent.

As AI becomes a packaged product, a lot of the companies using it won’t be as competent.

To what extent is a lack of skills preventing us from reporting at an adequate level?

Merve: If you don’t have that competence yet, work towards establishing it in the organisation or get external support, and don’t bring in products when you have no idea how they work or what results and outcomes they produce. Don’t open yourself up to fraud risks.

How do lay people push for tech regulation?

Cansu: When we talk about the ivory tower of AI, we should also think about how to bring the ethical conversation to the public. The debate is not just about the complexity of AI but also the complexities of ethics. We need to decide what we want to push for, and consider the consequences of potential regulation. When the public pushes for certain outcomes, it’s important to think about how to have a deliberate and detailed conversation about what we actually want.

What is the best outcome for the society that we should push for?

Sophie: Engage people from the outset, so you can develop systems where, throughout that process, you don’t end up with outcomes that are unfair or inequitable.

Ilana: There’s a big push to decolonize AI tech, especially to include emerging economies in Africa and the Global South as contributors in the development process, and consumers should continue to support organisations advocating for that kind of global community building.

Closing comments:

Merve: There is legislation out there that requires companies to report on things, but it allows companies to cherry-pick disclosures and doesn’t provide much guidance.

It becomes a PR practice rather than transparent reporting. How do we ensure the things that matter are reported?

While there are some common principles across the EU, companies are harvesting everything they can from data subjects in areas that aren’t regulated. How do we ensure consistency of practice and hold companies liable, not just in select jurisdictions?

Sophie: There’s a need to consider how to report a company’s approach to data and AI. There’s always room for improvement and a need to respond to new and emerging environments.

Ilana: I want to raise again the issue of ethics-washing. We need to build on systems that already exist within the organisation and ensure there are clear roles and responsibilities and mechanisms for accountability, going all the way from principles to policy to concrete practices that are evaluated periodically. And consumers should be a significant part of this and help shape the company’s product.

Photo by LOGAN WEAVER on Unsplash

Shea: One thing smaller companies can focus on is slowing down: take your time, reflect on core values, and be open within the organisation about why you are doing the things you are doing. Think about the ways your work could potentially harm people. This builds the capacity needed for when regulations eventually come online.

Cansu: It’s not just about how to follow the principles but about which principle is the right one to follow, because these principles clash. So in addition to regulation and audits, we can’t forget that ethics is a problem-solving discipline; what we are doing here is design work, from the perspectives of justice, autonomy and society.

Humayun: Ethics alone won’t solve all the problems, and nor will auditing. The challenge moving forward is finding a uniform approach so that the ethical principles can be applied to any industry, country or framework. It needs more discussion.

This discussion was hosted by Digital Bucket Company at their Bucket List Summit on 30th March 2021. You can watch the discussion here.

Studying Digital Culture, Lauren is an MSc student at LSE and writes about Big Data and AI for Digital Bucket Company. Tweet her @itslaurensdata