“Tell her what you want.”
Siri. Alexa. The voice in your GPS system. Consider the practice of asking a female-gendered object for assistance and receiving a warm, kind, immediate response: a response designed to consider and tend to the user’s emotional needs.
No matter how trivial or abusive or joking the question, or the tone of voice in which it is asked. . . the response stays within a narrow range of emotional care-taking.
There is a hidden emotional labor built into technology.
You know how back in the day there were those little saucer seats you could put your pre-walking kid in? The kid would be suspended over the floor, but their feet could touch it, and they could scoot around?
Turns out that’s not a great idea.
What if Siri, Alexa, the GPS voice, and all the female-gendered voices that are installed in objects. . . are also impeding or degrading development of one’s emotional awareness and ability to relate to other humans?
What if they are eliminating the desire to communicate or connect with female-gendered humans?
Much has been written about the practice of regarding a female human and objectifying her.
What is the consequence of the reverse?
What is the effect of taking a non-gendered object and casting it as female-gendered?
Might it reinforce a belief that it is “natural” to disregard, devalue, and ascribe low status to that which is coded as “female”?
Might it foster an expectation of emotional care-taking from that which is coded as female, given that the female voices in our devices are built to be consistently warm, kind, and immediate in their response?
Might these female-gendered objects become curious role models, as kids observe the role these objects play in their homes and families?
Perhaps not. Or perhaps so.
In any case, regardless of one’s opinion or speculation, I am confident that we will find out just what effect this has on us, in due time.
And likely sooner than expected.