Is “Design” enough for privacy- and ethics-by-design?
The DEVELOP project is researching how to apply privacy- and ethics-by-design to the development of the project’s learning platform. Our objective is to make data privacy a key component of the learning environment, and to help balance the privacy interests of employees with the talent management desires of employers. We want to avoid nightmare scenarios that could scare away potential users, and thus prevent future use of the learning platform. We also want to make sure that the learning platform, including its use of social network analysis and artificial intelligence, corresponds to the project’s ethical commitments. The new legal environment created by the EU’s General Data Protection Regulation (GDPR), and in particular the obligations to data protection by design and data protection by default, makes this an opportune time to engage with these challenges.
Privacy by design is widely advocated and comes with a set of high-level principles, but these need to be put into practice in highly context-dependent ways. Through this process in DEVELOP, we’ve learnt a few key lessons about doing so in practice:
First, the importance of sensitising the team to ethical and privacy issues. Whilst we hope everyone we work with acts ethically, understanding the ethical impacts that can emerge from technology development is a skill in itself. It is therefore important to create a shared understanding of privacy and ethics within a project team, so that these topics become part of the regular conversation and are not siloed away with one “privacy expert”. Within DEVELOP we’ve worked on this through dedicated workshops, regular team calls on ethics and privacy matters, the internal Privacy Impact Assessment reports, and the type of accessible scenarios set out in our previous whitepaper.
Second, getting involved in the design process on an ongoing basis. This includes being an advocate for privacy and ethics at the point, and in the place, where design decisions are being made. In DEVELOP we’ve had to adapt privacy impact assessment approaches to the demands of an Agile software development workflow. We’ve taken high-level legal and ethical commitments and helped to transpose them into software requirements, then design decisions, and eventually into real features you can see in the DEVELOP software (see image below for some examples). This requires a responsive way of working, as privacy and ethical issues can emerge from within the development process and need quick answers.
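As a concrete illustration of what transposing such commitments into software can look like, consider the GDPR’s obligation of data protection by default: every data-sharing option should start in its most privacy-protective state, with processing enabled only after an explicit opt-in. The sketch below is hypothetical and simplified; the setting names and the `PrivacySettings` class are illustrative assumptions, not the DEVELOP platform’s actual code.

```python
from dataclasses import dataclass

# A minimal sketch of "data protection by default": all sharing options
# are off unless the user makes an explicit, affirmative choice.
# Setting names here are illustrative, not taken from DEVELOP.

@dataclass
class PrivacySettings:
    share_profile_with_managers: bool = False   # opt-in, not opt-out
    include_in_network_analysis: bool = False   # analysis excluded by default
    retention_days: int = 30                    # minimal retention by default

    def opt_in(self, setting: str) -> None:
        """Record an explicit user choice to enable a sharing feature."""
        if not isinstance(getattr(self, setting, None), bool):
            raise ValueError(f"Unknown or non-boolean setting: {setting}")
        setattr(self, setting, True)

# A new user starts with the most restrictive defaults:
settings = PrivacySettings()
assert settings.share_profile_with_managers is False

# Sharing only happens after an explicit opt-in:
settings.opt_in("share_profile_with_managers")
assert settings.share_profile_with_managers is True
```

The design choice this encodes is that privacy is the zero-effort state: a user who never touches the settings screen is never included in profiling or analysis.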
Third, we’ve needed to consider the wider political and ethical environment and get involved in conversations about business models and eventual exploitation plans. We’ve needed to understand how the context of use for the software might pivot or change in the future. As a research project, we’re not under pressure to commercially exploit the personal data which DEVELOP uses, meaning we have the opportunity to innovate and test some new approaches to privacy and ethics by design; but if these are to have impact beyond the project, we need to be aware of how they might be applied in industry. It has also been important to keep an eye on emerging legislation and case law that could affect the product.
In the next year, Trilateral Research will publish a guide to good practice in developing data-enhanced teaching and learning platforms, which will include more detailed accounts of these findings.
Interested in staying current with the topic? Sign up to Trilateral Research’s mailing list.
For more information about the DEVELOP Project, sign up to our newsletter.