Ethical AI and Data Trust by Design Practices — Part 2

Our curated recap of the February 2019 Bots & AI Meetup plus videos

Alec Lazarescu
Bots + AI
4 min read · Mar 7, 2019


Over 120 enthusiastic attendees joined us for this one. Join our group to be notified of our next event.

Joe Toscano

The following is a summary of the presentation by Joe Toscano, founder of BEACON. It’s preceded by a Part 1 article covering Nathan Kinch’s presentation on designing for data trust.

Joe creates a much richer and more compelling story arc for how we reached this dubious place in ethics with technology companies, and we’d highly encourage you to view his presentation video linked at the end of this article. The material is approachable enough to watch with your family and friends, not just to share amongst your tech peers.

Privacy is what allows us to have dissenting opinions and be free.

It’s a federal offense to open postal mail, yet online email providers have analyzed or even modified email.

Large technology companies are skirting notions of privacy while influencing users via indirectly manipulative methods, or dark patterns.

Dark Patterns

Surveys

Statements made about survey results are hollow without digging into the data capture methods.

“At least 80% fair or better”

At face value that sounds compelling, but when digging into the data collection details and seeing this rating scale it’s clearly biased.

Embedded ads

Ads today are also less obvious than the big banners or clearly highlighted sponsorships of the past. They are interspersed in our search results and timelines, mixed in between news from our actual connections and dizzying jumps in time.

Engagement or addiction?

Psychology-based methods to drive higher engagement, including those popularized by the book Hooked, can be used for positive change like building healthy habits, or for creating cycles of distraction that are valuable not to the user but to the company.

Where is antitrust?

Google and Facebook each grew their memberships orders of magnitude faster than any organization in history, including the most widespread global religions.

Much US antitrust focus has been on price. Since many services are free (Facebook/Google) or often lower priced (Amazon), there hasn’t been any meaningful action.

We’re in an attention marketplace where, despite fiat currency costs being zero, the attention costs are rising. Systems are designed to be addicting and engaging to capture attention, which, unlike a financial price, is hard to measure or regulate.

The risk is that once all the data and attention is owned, competition is thoroughly hampered. Over half of Bay Area startups have a goal to be acquired and only 16% believe they have a chance to IPO.

Due to the indirect path to harming competition, regulation hasn’t caught up, since in an attention economy:

Attention -> Data -> Money

Potential Solutions

Data collection is the modern tax in our world

The American Revolution shouted “No taxation without representation.” Joe wants us to consider “No implementation without representation.”

What if there were an actual data collection and processing tax?

Higher costs for capturing more data than absolutely critical could encourage more mindfulness about data collection needs. Large-scale data collection would be proportionately taxed, and building the giant data moats that can stifle new entrants would carry financial costs.

The funds from this tax could help fund retraining and safety nets for the many displaced by job automation.

Incentivizing Ethical Behavior

Under current market conditions, at least in the US, it’s difficult to willingly forgo data troves and the potential profits of leveraging ethically gray growth and engagement hacks. Careful regulation could introduce clearer lines for consumer-safety baselines without disadvantaging companies already trying to be more transparent and mindful with their data practices.

Much like tobacco companies had to fund PSAs on the dangers of tobacco and provide clear and unmistakable warnings, companies leveraging engagement driving systems and collecting user data could fund public education and transparency.

Creating and publicizing watchdog organizations for privacy and data concerns can offer another outlet for those concerned to report fears and be heard.

You can find Joe’s presentation second in this playlist. Be sure to also check out Nathan Kinch’s quick overview of the business value of designing for trust, as well as a panel with both Joe and Nathan.
