Exploring Siri technology as Artificial Intelligence and its impact on people

Katarzyna Pohorecka
Published in DesignStudies1
Jul 17, 2019 · 5 min read

Not everyone knows that Siri existed well before its introduction with the release of the iPhone 4s in 2011. The company behind it was founded in 2007, and the project kicked into higher gear when it launched as an iOS application in the App Store in 2010. Even though it was still in an early phase of development at the time, it quickly gained both supporters and opponents. While some raved about its spectacular voice recognition and personal context awareness, others criticised its lack of flexibility and its trouble understanding atypical English accents. So what exactly is this Siri we all hear so much about?

Siri is a voice-controlled personal assistant designed to improve human interaction with Apple products. It is built on Artificial Intelligence, which allows the software to learn automatically from combined and analysed data. With access to the applications built into or installed on a device, it helps the user manage that device with less effort. Every vocal command you issue to Siri is sent to Apple’s servers for interpretation, and the result comes back to trigger an action on the device (see the sketch below). Based on your orders, it can manage your calls and texts without you touching the device, organise your calendar and alarms, and send reminders about the plans you have made. When connected to smart appliances in your home, it can control them the way you like. It also provides entertainment to suit your needs, from playing your favourite music to answering questions.

Siri’s functioning relies on the constant machine learning characteristic of AI. According to Apple, its improvement is based mostly on data collected directly from your device and processed offline, rather than on your online search history. Yet even though Apple claims it does not share any personal information with outside organisations, the PwC Consumer Intelligence Series found in 2018 that 28% of consumers who refuse to use voice assistants cite privacy concerns. A study by Microsoft Market Intelligence and Bing Ads Marketing found similar concerns among 41% of digital assistant users.

This is understandable, considering that people today are aware of how social media gathers data for personalised advertising. Algorithms pick the adverts and information you might be interested in, building on the machine learning mentioned above: from the data we share with a platform such as Facebook, the algorithms infer a person’s likes, dislikes and interests, and the platform then serves the content, adverts, articles and photos most likely to keep you spending time on its services. In the era of voice assistants, people fear that companies like Apple might use their devices to spy on them whenever they want. While the algorithms’ purpose today is predominantly to target the right commercials at the right people, they could become far more influential over time. In her TED Talk, Zeynep Tufekci cites the admission of Donald Trump’s social media manager that the campaign used Facebook posts to discourage certain groups of people from voting in the presidential election.

There are legitimate reasons to be careful about privacy, and the fact that people try to look after their rights is very promising. The way companies employ personal data for their own gain must change, and with voice assistants in particular, tech builders must ensure that their products overcome the trust issues users report. This matters especially given the benefits this kind of Artificial Intelligence offers. Used wisely, the technology could expand our possibilities and let us accomplish things we can now only dream of. Devices currently in use already offer a wide range of facilities for people with physical impairments. Voice assistants support human memory and enhance our working and social lives; they are intelligent helpers serving as agents in human-phone interaction.
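To make that hand-off concrete, here is a minimal sketch of how an app can receive a command that Siri has already interpreted, using Apple’s SiriKit Intents framework. This illustrates only the app-side plumbing, not Apple’s server-side interpretation, and the commented-out sendMessage call is a hypothetical stand-in for the app’s own logic.

```swift
import Intents

// A minimal SiriKit handler: once Apple's servers have interpreted
// "Send a message to Anna" into a structured INSendMessageIntent,
// iOS passes that intent to the app through a handler like this one.
class SendMessageHandler: NSObject, INSendMessageIntentHandling {

    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // intent.content and intent.recipients hold the text and
        // contacts Siri resolved from the spoken command.
        // sendMessage(intent.content, to: intent.recipients)  // hypothetical app logic
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}
```

The division of labour is the point: the voice recognition and interpretation happen on Apple’s side, and the app only ever sees a structured intent it can act on.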

So how far can the development of voice assistants go? We can easily notice that they get more intuitive with time, so the technology of the future should be just as human as it is artificial. It should better recognise emotions and promptly give adequate responses. That is hard to achieve given the psychological and behavioural differences between people: creating an algorithm that captures every pattern of being human seems impossible today. But as machine learning advances, more of that complexity may eventually be translated into code. For now, such systems distinguish only basic emotions, and, as the example of Siri shows, the generated responses are not always accurate. A 2016 study by Stanford University and the University of California compared several voice assistants’ responses to questions about mental health and physical violence, and found the answers inconsistent and incomplete. Fortunately, Apple has since improved its systems and worked on the responses, so the current version of Siri can offer hotline numbers and direct users to specialised websites where they can get advice. In an era when people turn to their phones with all kinds of issues, it is important that devices provide those in crisis with the right means of help.
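As a rough illustration of how coarse today’s built-in emotion detection still is, here is a minimal sketch using Apple’s NaturalLanguage framework, which reduces the affect of a whole passage to a single sentiment score between -1 (negative) and 1 (positive). This is a generic on-device API, not Siri’s actual internals.

```swift
import NaturalLanguage

// Score the sentiment of a sentence with Apple's on-device tagger.
let text = "I had a terrible day and I feel alone."
let tagger = NLTagger(tagSchemes: [.sentimentScore])
tagger.string = text

// The tagger returns one tag per paragraph whose rawValue is a
// number between -1.0 (most negative) and 1.0 (most positive).
let (sentiment, _) = tagger.tag(at: text.startIndex,
                                unit: .paragraph,
                                scheme: .sentimentScore)
let score = Double(sentiment?.rawValue ?? "0") ?? 0
print("Sentiment score: \(score)")  // a negative value for this text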

A different question is how far the development of this kind of Artificial Intelligence should go. As the technology advances, we may reach a point where we lose our sense of what is ethically right. The rapid growth of AI raises the threat of manipulation and of degraded human interaction. Stephen Hawking once warned that Artificial Intelligence could end mankind. However, it is essential to remember that its development rests in human hands. It is up to us what the AI behind technologies such as voice assistants will look like in the future. We can protect ourselves from unfavourable consequences and use the technology to expand our experience in many profound ways.

[1]: Beebom. (February 2014). The Story Behind Apple’s Voice Assistant Siri. https://beebom.com/story-behind-apples-voice-assistant-html/

[3]: Apple. Siri. https://www.apple.com/uk/siri/

[4]: TED. Zeynep Tufekci. (September 2017). We are building a dystopia just to make people click on ads. https://www.ted.com/talks/zeynep_tufekci_we_re_building_a_dystopia_just_to_make_people_click_on_ads/transcript

[5]: TED. Sam Harris. (June 2016). Can we build AI without losing control over it? https://www.ted.com/talks/sam_harris_can_we_build_ai_without_losing_control_over_it

[6]: TED. Margaret Mitchell. (October 2017). How we can build AI to help humans not hurt us. https://www.ted.com/talks/margaret_mitchell_how_we_can_build_ai_to_help_humans_not_hurt_us/transcript#t-173822

[7]: TED. Raphael Arar. (December 2017). How we can teach computers to make sense of our emotions? https://www.ted.com/talks/raphael_arar_how_we_can_teach_computers_to_make_sense_of_our_emotions/transcript

[8]: Pocket-lint. Britta O’Boyle. (December 2018). What is Siri and how does Siri work? https://www.pocket-lint.com/apps/news/apple/112346-what-is-siri-apple-s-personal-voice-assistant-explained

[9]: PwC. Prepare for the voice revolution. https://www.pwc.com/us/en/services/consulting/library/consumer-intelligence-series/voice-assistants.html

[10]: Microsoft. Christi Olson. (April 2019). New report tackles tough questions on voice and AI. https://about.ads.microsoft.com/en-us/blog/post/april-2019/new-report-tackles-tough-questions-on-voice-and-ai

[11]: CNN. Emanuella Grinberg. (March 2016). ‘Siri, I was raped’: Study compares smartphone responses in crises. https://edition.cnn.com/2016/03/14/health/smartphone-responses-rape-violence/index.html

Katarzyna Pohorecka

Edinburgh Napier University | Product Design student | Poland, Kraków & Scotland, Edinburgh