Watch Out.

Angela Adams
Aug 7, 2018


I remember my first car accident well. I was in my freshman year at Florida State. Running late to something, though I don’t remember what. I turned right, and the next thing I knew, I’d plowed into a line of parked cars that should not have been there.

I made my case to the officers: The hedges were out of control, blocking my view. It was a blind turn. How could I possibly be held responsible for the wreck when I couldn’t see what was up ahead of me? There were never cars there! I couldn’t predict the future!

Their response was something along the lines of “Yes, it is a blind turn, so you should take extra precaution. You don’t know what’s up ahead, you don’t know what other people are doing, so you need to be in control of your vehicle in a way that doesn’t cause harm. Watch out.”

In my technology consulting practice, I am aware of the power — and the responsibility — that comes with my role. I am responsible for knowing, understanding, and following best practices when making recommendations to my clients. I am responsible for the architecture decisions and configuration choices I make. I am responsible for training my clients on the solutions I’ve provided. My advice, after all, may directly impact their operations.

But am I responsible for what my clients ultimately do with my advice or solutions?

I am fortunate enough to work for a consulting firm in the social impact sphere. We often say we help our clients do “more good better.” Our consultants, former non-profit practitioners themselves, usually connect deeply with our clients, rallying behind their missions and supporting them through tech. If we know right off the bat that there is no missional alignment, we will typically recommend another consultant. We want to work with clients doing good in the world, and help them do it better.

What if my clients use my solutions for harm?

What if the solution I provide enables them to raise funds in what I would consider an immoral way? What if my strategy for effective delivery of services means a client is able to more effectively deliver services that do not align with my ethics?

Am I responsible for that? And if so, what are my choices? Should I walk away? Can I?

This is not merely a thought experiment.

Activists and employees alike are asking tech companies including Google, Amazon, Microsoft, and Salesforce to be accountable for the role of their technology in separating families at the US-Mexico border. Perhaps the technology is not being used at the border itself, but even if it is only running in a back office somewhere, are the companies ultimately responsible for what clients do with their offerings? Who is watching out for ethical use?

Where do they draw the line and say, “Not this. Not on our watch”?

Behaving in a way that is consistent with our values is the very definition of ethics. As the Biblical adage goes, “You’ll know a tree by its fruit.” If tech companies consistently act in ways that are inconsistent with their values, or do not act to interrupt customers who use their products in ways that are inconsistent with their values, one must ask: are those their values after all?

Each tech company must define its values, act in accordance with them, and adopt a framework for compliance and accountability. (Consider this article regarding the Social Solution Design (SSD) framework developed by Ayori Selassie.)

We are increasingly using tech in personal ways, inviting it into our private lives via social media, the Internet of Things, biotech, and AI.

Knowing what tech companies truly value and knowing to what lengths they will go — or will not go — in pursuing ethical use of their products is a deeply personal concern.

As stakeholders and as consumers of tech, it is our role to let tech companies know when they get it right, and to hold them to account when they do not. This is especially true for folks like myself, those of us who work at the intersection of social impact and tech. We need to watch out, and speak up, for ethical use.


Angela Adams

Wayfinder | Social Impact Alchemist | Guerrilla Editor | Executive Vice President of Now IT Matters. All words are my own.