In response to: "The analogy is great, but I do not see the personalities as exclusive but as traits a future AI…" (joako)

Firstly, thank you for your response and perspective, Joako!

I’ve been wrestling for a long while with how to balance levels of empathy and the relationship between technology and people, and for sure my overview is simplistic in its categorization in order to make a point. You’re completely correct that if we went deeper and truly considered the real ‘Personalities’ of the products and services around us, we would be looking at potentially multiple layers and agendas, the kind of complexity we see every day in our own moods and emotions.

The next challenge lies in the arenas of trust and ethics: the barrier of giving away enough trust to allow my vacuum (to build on the analogy further) to make decisions and ‘grow’ into something more valuable, versus the dilemma of creating an ‘aware’ or intelligent servant constrained by my own will. This also leads to the later point you raise about a more humanized language and dialogue: is it hungry or full, rather than reporting battery levels?

More to come for sure but thanks for the comment — I really appreciate it ;).
