The Age of the Emotional Machines

Before affective computing, intelligent systems were blind to human emotions. Now, armed with the right technologies, affective computing could disclose our innermost feelings to businesses eager to exploit them for profit.

In recent years, a fledgling Emotion Economy has been emerging as human emotions are increasingly collected, evaluated and monetised for various purposes. Equipped with technologies such as natural language processing, speech, facial and gesture recognition software, machine learning, big data, automated reasoning and emotion analytics, an affective computing system can analyse complex data sets and interpret human emotions in real time (BioPortfolio, 2015).

There are three main approaches in affective computing. The first is to track where an individual’s eyes are pointing, in order to spot what the individual is most attentive to, and hence finds most engaging (The Economist Intelligence Unit Perspectives, 2015). It is frequently used in the advertising and user experience sectors to ascertain the best way to hold consumers’ attention.
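As a toy illustration of this first approach, the attention logic can be sketched in a few lines of Python: given raw gaze samples and named screen regions (both entirely hypothetical here), count which region receives the most gaze samples. Real eye-tracking pipelines add fixation detection and calibration, so treat this as a sketch of the idea only.

```python
from collections import Counter

def most_attended_region(gaze_points, regions):
    """Count gaze samples falling inside each named screen region.

    gaze_points: (x, y) gaze coordinates sampled at a fixed rate, so the
    sample count is a rough proxy for time spent looking at a region.
    regions: dict mapping region name -> (x0, y0, x1, y1) bounding box.
    Returns the (region, sample_count) pair with the most samples, or None.
    """
    counts = Counter()
    for x, y in gaze_points:
        for name, (x0, y0, x1, y1) in regions.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                counts[name] += 1
    return counts.most_common(1)[0] if counts else None

# Hypothetical two-region page layout and a short burst of gaze samples.
regions = {"headline": (0, 0, 800, 100), "product_image": (0, 100, 400, 500)}
gaze = [(120, 300), (150, 320), (400, 50), (130, 310)]
print(most_attended_region(gaze, regions))  # ('product_image', 3)
```

An advertiser running this over many viewers would learn which page element actually holds attention, rather than which one viewers claim to notice.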

(Source: Affectiva, 2015)

Another approach is simply to observe facial expressions. Affdex, a product of Affectiva, is a cloud-based SaaS offering that uses an ordinary webcam to perform automated analysis of facial expressions. A free online demonstration of Affdex is available on Affectiva’s website.

The third approach is to infer an individual’s mood from the conductivity of his or her skin, which increases as the individual’s stress level rises (The Economist Intelligence Unit Perspectives, 2015).
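A minimal sketch of this third approach, with invented numbers: compare a window of skin-conductance readings against a per-user resting baseline and report the relative rise. The baseline value and the readings below are hypothetical; real systems also account for drift, motion artefacts and individual variability.

```python
def stress_index(gsr_samples, baseline):
    """Relative rise of mean skin conductance over a resting baseline.

    gsr_samples: a window of skin-conductance readings in microsiemens.
    baseline: the user's resting conductance, calibrated per person, since
    absolute conductance varies widely between individuals.
    Returns 0.0 when conductance is at or below the baseline.
    """
    mean = sum(gsr_samples) / len(gsr_samples)
    return max(0.0, (mean - baseline) / baseline)

calm = [1.9, 2.0, 2.0, 1.9]      # at or below the resting baseline
stressed = [3.2, 3.5, 3.4, 3.3]  # conductance rises as stress rises
print(stress_index(calm, 2.0))      # 0.0
print(stress_index(stressed, 2.0))  # ~0.675
```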

How will Affective Computing change the world?

Provide personalized products and services

Traditional techniques like surveys and interviews, being prone to biases, fail to capture truthful and spontaneous feelings. With affective computing at their disposal, businesses can track human emotions and use the insights garnered to improve products and services (n.d.). By interpreting consumers’ emotions, online products can be adapted in real time depending on the user’s responses.

Revolutionize education

(Source: Alamy, n.d.)

By the end of 2016, Massive Open Online Courses (MOOCs) had garnered over 50 million registered learners, giving learners easy access to diverse, high-quality learning materials at low cost (Marsh, 2017). However, educators and researchers have highlighted concerns about “the low completion rates, high in-session interruptions, and lack of interactions among students and instructors” (Pham and Wang, 2015). Such concerns could effectively be addressed by affective computing.

Because affective computing systems address the emotional domain, they could potentially offer an online companion that feels more human. For instance, students facing learning difficulties can use machine tutors that offer conversational feedback and learning plans personalised according to the students’ emotional states, and not merely the adequacy of their knowledge and skills (Educause, 2016).

With affective computing, learning problems could also be diagnosed more accurately. Quantitative information collected from affective computing systems can offer more detailed insight into students’ learning progress and attentiveness, allowing faculty to work towards a more effective curriculum redesign. Intelligent tutoring systems (ITS) would also be more effective in helping students meet their learning goals if the virtual tutor could analyse the learner’s gestures, nods and facial expressions in real time and respond by altering its teaching style or materials when the student shows negative signals towards the lesson (Hoque, Kaliouby and Picard, n.d.).
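The adaptation step of such a tutor can be pictured as a simple policy over estimated affect signals. The rules, labels and the 0.6 threshold below are illustrative, not taken from any published ITS; a real system would learn these mappings from data.

```python
def next_action(signals):
    """Choose a tutoring adjustment from estimated affect probabilities.

    signals: dict of hypothetical affect estimates in [0, 1], e.g. produced
    by a classifier watching the learner's face and gestures.
    """
    if signals.get("confusion", 0.0) > 0.6:
        return "re-explain with a worked example"
    if signals.get("frustration", 0.0) > 0.6:
        return "offer a hint and encouragement"
    if signals.get("boredom", 0.0) > 0.6:
        return "raise difficulty or switch activity"
    return "continue current lesson"

print(next_action({"confusion": 0.8, "boredom": 0.1}))  # re-explain with a worked example
print(next_action({"boredom": 0.7}))                    # raise difficulty or switch activity
```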

(Source: Evan Kafka, 2012)

Predict Political Elections

One unique application of affective computing is predicting political elections. During the 2012 US presidential election, Affdex was used to track more than two hundred people watching clips of the debates between Obama and Romney (Connor, 2015). The software was eventually able to predict the election results with 73% accuracy.

Evaluate entertainment material

Television producers such as CBS and the BBC have also used Affdex to gauge audience reactions to new television shows, while Sony has used it to assess movie trailers. Volunteers were recruited to watch video clips via their webcams, and their emotional responses were aggregated to pinpoint the funniest characters. “One particular sitcom for CBS had six characters who were all supposed to be funny. But there was this one couple who, every time they showed up, annoyed people; they were just not funny. CBS ended up swapping the characters out,” Affectiva CEO Rana el Kaliouby said in an interview (Murgia, 2016).


Improve healthcare

“Affective wearables” that collect social and emotional data as well as sleep patterns could potentially prevent the onset of conditions such as depression. Patterns in children’s basic emotional cues could even be analysed to detect the onset of diseases (Frost & Sullivan, 2016).

Barriers to Adoption

Compromise of Personal Information

Personalisation in products and services fundamentally adds value for customers. However, such customisation can only be attained by transferring personal information to the infrastructure, storing the user’s preferences in an individual profile, and delivering services according to those preferences. As sensor technology improves, highly personal information, from galvanic skin response to brain activity, will be traded in return for such services built on affective computing.

Galvanic Skin Response (Source: MbientLab, 2016)

Although users may be willing to trade personal information in return for value-added services, doing so opens up the possibility of every piece of behavioural and emotional data being recorded digitally. The fact that this data could lead to invasions of privacy, reduction or removal of rights, unwanted advertising and even unexpected loss of the data compounds the barriers to adoption of affective computing technologies.

To overcome this barrier, companies working with affective computing should prioritise user privacy and consider ways to share data appropriately, prevent unintentional leakage of personal data, and ensure accountability when collected data is compromised (Daily et al., 2013). Collected data should be encrypted and anonymised, and users should be given the choice to opt in before any of their information is fed into the system.
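The opt-in and anonymisation requirements can be sketched as follows: consent is checked before anything is stored, and a keyed hash stands in for the raw user ID so stored records cannot be linked back to a person without the secret. The salt, names and storage layout are hypothetical; encryption in transit and at rest is assumed to be handled by the platform or a vetted cryptography library.

```python
import hashlib
import hmac

# Placeholder secret; in practice keep it out of source control and rotate it.
SECRET_SALT = b"example-secret-salt"

def record_affect(user_id, consented, emotion_scores, store):
    """Store emotion scores only with consent, keyed by a pseudonymous ID.

    An HMAC of the user ID replaces the raw identifier, so the stored
    records cannot be re-linked to a person without the secret salt.
    """
    if not consented:
        return False  # opt-in first: no consent, nothing is collected
    pseudo_id = hmac.new(SECRET_SALT, user_id.encode(), hashlib.sha256).hexdigest()
    store.setdefault(pseudo_id, []).append(emotion_scores)
    return True

store = {}
record_affect("alice@example.com", True, {"happiness": 0.9}, store)
record_affect("bob@example.com", False, {"sadness": 0.4}, store)
print(len(store))  # 1 -- only the consenting user's pseudonymised record
```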

Barriers in Affective Computing Research: Costly and Obtrusive Data Collection Methods

The speed at which affective computing can automatically recognise spontaneous and fleeting facial movements in real time is pivotal to its wider adoption. Because users come from many nations and cultures, the physical, verbal and other cues used to express emotion vary considerably. In other words, an affective computing sensor may be feasible for detecting spontaneous emotional states in real time, yet remain insensitive to social and cultural differences (Wagner, Kim and Andre, 2005). This variability can therefore make affective computing tools inaccurate in assessing emotions in real time.

To reduce this variability, affective databases should be spontaneous, large and comprehensive. However, the lack of ‘big’ labelled affective databases in the field hinders the creation of deep models for research (ACII 2017, 2017).

Collecting large-scale, credible and well-organised data on human affective expressions is an important prerequisite for designing real-time emotion recognizers.

Even though crowdsourcing provides a promising method to generate large-scale affect databases, it is either prone to errors or very costly, because willing participants need to be compensated financially. Moreover, because participants know they are being monitored, whether in person or by webcam, the affective events of daily life go uncaptured, resulting in less spontaneous affect databases.

Present spontaneous emotional expression databases are built by manually labelling movies, photos and voice recordings, which is time-consuming and expensive. The obtrusive ways of collecting data also limit the effectiveness of the research. Present technologies should be explored further, with various possible data sources assessed, to open up opportunities for collecting affect data unobtrusively.

One solution would be to install software on smartphones that runs continuously in the background to monitor users’ moods and emotional states. With properly designed mechanisms for collecting user feedback, large-scale spontaneous affect databases can be established cost-efficiently.

Field studies that gathered keystrokes as users performed their daily computer tasks have been conducted by various researchers, underscoring the opportunity to detect cognitive or physical stress from keyboard interaction data (Vizer, Zhou and Sears, 2009). Businesses interested in adopting affective computing could use keystroke dynamics, a method of inferring the emotions of computer users by analysing their typing patterns, to build less obtrusive affective computing systems that continuously monitor stress in a spontaneous setting. Recognising the user’s emotional state through keystroke dynamics thereby addresses the cost of data collection while remaining nonintrusive to the user.
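Two of the standard keystroke-dynamics features can be computed from nothing more than key press and release timestamps. The event format and numbers below are hypothetical; studies such as Vizer, Zhou and Sears (2009) combine many more timing and linguistic features.

```python
def keystroke_features(events):
    """Extract dwell and flight times from key press/release timestamps.

    events: list of (key, press_ms, release_ms) tuples in typing order.
    Dwell time = how long each key is held down; flight time = the gap
    between releasing one key and pressing the next. Shifts in these
    timings have been associated with cognitive or physical stress.
    """
    dwell = [release - press for _, press, release in events]
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return {
        "mean_dwell": sum(dwell) / len(dwell),
        "mean_flight": sum(flight) / len(flight) if flight else 0.0,
    }

# Hypothetical timestamps (milliseconds) for typing "hi!"
events = [("h", 0, 90), ("i", 150, 230), ("!", 300, 410)]
print(keystroke_features(events))  # mean_dwell ~93.3, mean_flight 65.0
```

Because these features come from ordinary typing, they can be collected continuously without interrupting the user, which is precisely what makes keystroke dynamics attractive as an unobtrusive data source.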


(Source: REUTERS/Francois Lenoir)

Ethical Concerns

There is also a heated ongoing discussion over the use of affective systems as ‘companions’ for the elderly and children, since friends and family could treat them as a way to evade their responsibilities (Cowie, 2014). Ethical questions rarely admit tidy solutions, but, much as with Asimov’s Three Laws of Robotics, companies venturing into affective computing ought to stay informed and build ethical principles into the way they innovate and design affective computing systems.

Companies that commercialise affective computing

Microsoft is one of the large corporations that adopted affective computing early. In October 2012, the company filed a patent for an emotion detection and feedback system that reads the emotions of people in a room and projects feedback onto the lens of the HoloLens (Mizroch, 2015). Based on this real-time feedback, the wearer can then adapt his or her style of communication accordingly. If implemented, the HoloLens could change the lives of many, especially those with social anxiety or conditions such as autism that make socialising difficult (Kumar, 2015).

Additionally, Microsoft has released a cloud-based Emotion Application Programming Interface (API) that has drawn interest from developers, Fortune 500 companies and start-ups alike, who are eager to tap into its capabilities and start training their systems to identify emotions from human faces but lack a team of machine learning and AI experts (Murphy, 2015). The pricing details are shown below.

Figure 1 Microsoft Emotion API Pricing Details (Source: Microsoft Azure, 2017)
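To illustrate what consuming such a service looks like, the sketch below parses a response in the shape the Emotion API returned at the time: one entry per detected face, each with a bounding box and a 0-to-1 score per emotion. The payload here is invented for illustration, not real API output, and the live call (HTTP POST with a subscription key) is omitted so the example is self-contained.

```python
import json

# Illustrative payload mimicking the Emotion API's response shape.
sample_response = json.dumps([{
    "faceRectangle": {"left": 68, "top": 97, "width": 64, "height": 97},
    "scores": {"anger": 0.01, "contempt": 0.0, "disgust": 0.0, "fear": 0.0,
               "happiness": 0.93, "neutral": 0.05, "sadness": 0.01,
               "surprise": 0.0}
}])

def dominant_emotions(response_text):
    """Return the highest-scoring emotion label for each detected face."""
    return [max(face["scores"], key=face["scores"].get)
            for face in json.loads(response_text)]

print(dominant_emotions(sample_response))  # ['happiness']
```

This is the kind of thin client-side logic that lets a team without in-house machine learning expertise add emotion awareness to an application.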

Other than Microsoft, there are various early start-ups working in this fledgling field.

(Source: KQED, 2015)

Brain Power, based in Cambridge, Massachusetts, specializes in a range of Google Glass compatible software and hardware solutions to assist children with autism spectrum disorder. Its unique software suite, the “Empowered Brain”, contains powerful data collection and analytic tools allowing for customized feedback for the child, and production of quantitative reports on their social behaviours (Crunchbase, n.d.).

Another affective computing start-up, NuraLogix, founded in 2015, has developed a patent-pending Transdermal Optical Imaging™ technique that uses “a conventional video camera to extract facial blood flow information from the human face” (Thomson, 2016). Aided by machine learning algorithms and neuroscience, it can detect human emotions regardless of the presence of facial expressions.

MediaCom, the media agency with clients like Volkswagen, has formed a strategic partnership with Realeyes, a London-based start-up that specialises in emotional analytics and in testing adverts before they are broadcast on television or online. This move suggests that algorithmic analysis of facial expressions has become accurate enough to permit digital measurement of an advertisement’s quality. Realeyes examines anonymised recordings in the cloud and, as shown in the diagram below, a dashboard then displays the results using “a matrix of six core emotions: happiness, surprise, sadness, disgust, fear and confusion” (Newton, 2016).

Figure 2 “In the Volkswagen commercial The Force, a child dressed as Darth Vader uses his magical powers on the new VW Passat. The dominant emotion is the neutral state (light green curve at the top of the picture below) i.e. there is no obvious emotional reaction and the software had classified the facial expression as neutral.” (Source: decode, 2013)
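The aggregation behind such a dashboard can be sketched as follows: for each video frame of a viewer recording, find the highest-scoring core emotion, and treat frames where nothing clears a minimal threshold as neutral, like the dominant neutral state in the Volkswagen example above. The frame scores and the 0.2 threshold are invented for illustration.

```python
from collections import Counter

CORE_EMOTIONS = ["happiness", "surprise", "sadness", "disgust", "fear", "confusion"]

def dominance_share(frames, threshold=0.2):
    """Fraction of frames in which each core emotion scores highest.

    frames: one dict per video frame mapping emotions to 0-1 scores.
    Frames where no emotion clears the threshold count as 'neutral'.
    """
    counts = Counter()
    for scores in frames:
        top = max(CORE_EMOTIONS, key=lambda e: scores.get(e, 0.0))
        counts[top if scores.get(top, 0.0) >= threshold else "neutral"] += 1
    return {emotion: n / len(frames) for emotion, n in counts.items()}

frames = [
    {"happiness": 0.7, "surprise": 0.1},
    {"happiness": 0.6, "surprise": 0.3},
    {"sadness": 0.1},  # nothing clears the threshold -> neutral
]
print(dominance_share(frames))  # happiness in 2/3 of frames, neutral in 1/3
```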

Competences required to commercialise affective computing

To acquire the most value from affective computing technology, companies wanting to adopt it would first need to assess the business problem they want to solve.

Most importantly, given the limited affective data sets available, they would have to research which available affective data best suits their business needs. Existing sources of gathered data may already be mined efficiently, and inexpensive integration with additional data sources may already be possible (Noga, Saravana and Overby, 2017). If companies intend to amass affective data sets of their own, they should consider cost-effective methods of collecting the data spontaneously, as discussed in the earlier section on data collection methods. Solutions such as keystroke dynamics show interested adopters how everyday computer interactions can be leveraged to establish a cost-effective, spontaneous affective database.

Companies should determine how accurately the inferences they wish to make can be drawn from the available data inputs. Certain non-emotional data may also be required to get value from an emotionally aware system.

Additionally, technical challenges would need to be scoped out. Companies’ technology departments would need to work out the challenges of integrating with existing systems to assimilate and analyse large volumes of emotional data (Noga, Saravana and Overby, 2017).


Competition

Competition for these affective computing start-ups is already coming from big corporations making acquisitions in their efforts to diversify and keep up with the latest technologies. For instance, in 2016 Apple acquired a San Diego-based company called Emotient, which uses artificial intelligence to detect emotion from facial expressions.

Due to cost and technology adoption barriers, companies may opt for more affordable ways to perform consumer research online. Moreover, affective computing has its sceptics: some marketers are not convinced that using facial analysis to understand how consumers think is significantly more accurate than traditional consumer research techniques (decode, 2013).

“The emotions that, for example, our partner feels are interpreted by us as the observer and are not in any way objective information.” (decode, 2013)

Hence, competition might come from the traditional forms of market research methods like social media analytics, web analytics, surveys and focus groups.

Neuroscience based Market Research — Neuromarketing Companies

Although one or two affective computing start-ups like Affectiva are sometimes classified as neuroscience-based companies, neuroscience fundamentally differs from affective computing in that it involves neural measures, tapping into fMRI imaging to improve predictions of consumer behaviour. Competitors to affective computing companies would thereby include Buyology Inc., a leading neurological marketing company, which uses its global marketing neuroscience database “to develop rigorous frameworks and tools that bridge science and business” in order “to provide a provocative and proprietary understanding of consumer decision-making and brand relationships” (Neuromarketing, n.d.).

(Source: Neuro-Insight, 2015)

Neuro-Insight is another competitor; it uses Steady-State Topography (SST) to record and measure the scalp’s electrical signals “to build a second by second picture of activity in the brain”. The technique is effectively an improvement on the EEG used extensively in hospitals globally, and has been corroborated by research and used in clinical applications for over fifteen years (Neuromarketing, n.d.). While affective computing mostly revolves around analysing facial expressions, neuroscience-based market research companies may analyse emotions more reliably, making them strong competitors to affective computing companies.

What existing companies or industries might this technology disrupt?

The market growth in commercial affective computing is mainly attributed to the rise in demand to measure human emotions and transform them into actionable insights by the consumer market research industry.

Such actionable insights could lead to highly personalised products and greater overall consumer satisfaction, making affective computing a formidable disruptive force in a market research industry worth US$68 billion in 2015 (ESOMAR, 2016). Wearables equipped with emotion-aware algorithms could enable mood-targeted marketing, possibly disrupting the marketing industry as well. They could also allow the gaming industry to construct more immersive gaming environments, as attempted in 2013 by Microsoft Kinect for Xbox One, which used its time-of-flight camera to track players’ physiological conditions and eye movements (Mok, 2015).

The NMC Horizon Report 2016 Higher Education Edition lists affective computing as one of six major developments in higher education technology and predicts that it will find its way into higher education within four to five years (Johnson et al., 2016).

Figure 3 Five-year Horizon for Higher Education Institutions (Source: The NMC Horizon Report 2016: Higher Education Edition, 2016)

Autonomous e-learning systems could be implemented in which the presentation style of an affective digital tutor adapts according to the learner’s responses.

Technology Enabled Learning is thereby a collaborative learning process that fundamentally transforms teaching-learning pedagogy, yet the role emotions play in it is frequently neglected (Ray and Chakrabarti, 2016).

With over 100 papers in the ScienceDirect database including keywords like “affective computing in education” or “affective computing in learning”, it is clear that this evolving technology is already being applied to education (Wu, Huang and Hwang, 2015).


Despite the limited size, accuracy and spontaneity of the affective data sets presently available, and the ethical concerns that still need to be worked out, the “age of emotional machines” is certainly arriving.

Although affective computing was listed as one of six innovations that would revolutionise healthcare at the Future of Health and Wellness Conference held at the Massachusetts Institute of Technology in 2013, the field is slowly moving beyond its origins as an assistive technology in healthcare towards wider applications in consumer market research, education, politics and even entertainment (Eastwood, 2013).

The emergence of big data and machine learning has no doubt propelled affective computing research forward. This push toward the age of emotional machines has been accelerated by greatly enhanced sensors that handheld devices now commonly have.

With its broader commercial applications and increasingly affordable equipment needed to support affective computing (e.g. Tobii’s eye-tracking sensors, Intel’s RealSense 3D cameras, and Myo’s myoelectric sensors), it is no wonder that the affective computing market is expected to grow from USD 9.35 billion in 2015 to USD 42.51 billion in 2020 (The Economist Intelligence Unit Perspectives, 2015; MarketsandMarkets, 2015).


References

ACII 2017. (2017). Utilising Big Unlabelled and Unmatched Data for Affective Computing. [online] Available at: [Accessed 18 Feb. 2017].

BioPortfolio. (2015). Affective Computing Market by Technology (Touch-based & Touchless), Software (Speech, Gesture, & Facial Expression Recognition, and others), Hardware (Sensor, Camera, Storage Device & Processor), Vertical, & Region — Forecast to 2020. [online] Available at: [Accessed 18 Feb. 2017].

Connor, M. (2015). Required Reading: Empathy & Disgust. [online] Rhizome. Available at: [Accessed 18 Feb. 2017].

Cowie, R. (2014). Ethical Issues in Affective Computing. Oxford Handbooks Online. [online] Available at: [Accessed 18 Feb. 2017].

Crunchbase. (n.d.). Brain Power. [online] Available at: [Accessed 18 Feb. 2017].

Daily, S., Meyers, D., Darnell, S., Roy, T. and James, M. (2013). Understanding Privacy and Trust Issues in a Classroom Affective Computing System Deployment. Distributed, Ambient, and Pervasive Interactions, [online] pp.414–423. Available at: [Accessed 18 Feb. 2017].

decode, (2013). Does facial expression = emotion? A scientific perspective on new methods of measuring emotions. decode Science Update. [online] decode Marketing. Available at: [Accessed 18 Feb. 2017].

decode, (2013). What does a result look like? [image] Available at: [Accessed 18 Feb. 2017].

Eastwood, B. (2013). 6 Innovations That Will Change Healthcare. [online] CIO. Available at: [Accessed 18 Feb. 2017].

Educause, (2016). 7 Things You Should Know About Affective Computing. ELI 7 Things You Should Know About … ™. [online] Educause Learning Initiative. Available at: [Accessed 18 Feb. 2017].

ESOMAR, (2016). Global Market Research 2016. [online] ESOMAR. Available at: [Accessed 18 Feb. 2017].

(n.d.). Affective computing and its applications in the apparel retail industry. [online] Available at: [Accessed 18 Feb. 2017].

Frost & Sullivan. (2016). Affective Computing’s Ability to Read Emotional Patterns Poised to Disrupt Multiple Industry Sectors. [online] Available at: [Accessed 18 Feb. 2017].

Hoque, M., Kaliouby, R. and Picard, R. (n.d.). When Human Coders (and Machines) Disagree on the Meaning of Facial Affect in Spontaneous Videos. [online] Available at: [Accessed 18 Feb. 2017].

Johnson, L., Brown, M., Becker, S., Cummins, M. and Diaz, V. (2016). NMC Horizon Report: 2016 Higher Education Edition. [online] Austin, Texas: The New Media Consortium. Available at: [Accessed 18 Feb. 2017].

Kumar, A. (2015). Microsoft Hololens’ Emotion Detection and Feedback System. [online] TWCN Tech News. Available at: [Accessed 18 Feb. 2017].

MarketsandMarkets. (2015). Affective Computing Market worth 42.51 Billion USD by 2020. [online] Available at: [Accessed 18 Feb. 2017].

Marsh, N. (2017). MOOC users reach 58 million globally. [online] The PIE News. Available at: [Accessed 18 Feb. 2017].

Microsoft Azure. (2017). Pricing — Emotion API | Microsoft Azure. [online] Available at: [Accessed 18 Feb. 2017].

Mizroch, A. (2015). Microsoft Awarded Patent for Emotion Detecting Eyeglasses. [online] WSJ. Available at: [Accessed 18 Feb. 2017].

Mok, K. (2015). The Rise of Emotionally Intelligent Machines That Know How You Feel. [online] The New Stack. Available at: [Accessed 18 Feb. 2017].

Murgia, M. (2016). Affective computing: How ‘emotional machines’ are about to take over our lives. [online] The Telegraph. Available at: [Accessed 18 Feb. 2017].


Murphy, M. (2015). Microsoft opens machine learning API that detects if a person is happy or sad. [online] Techworld. Available at: [Accessed 18 Feb. 2017].

Neuromarketing. (n.d.). Buyology, Inc. [online] Available at: [Accessed 18 Feb. 2017].

Neuromarketing. (n.d.). Neuro-Insight. [online] Available at: [Accessed 18 Feb. 2017].

Newton, R. (2016). Realeyes: testing emotional reactions to adverts. [online] FT. Available at: [Accessed 18 Feb. 2017].

Noga, M., Saravana, C. and Overby, S. (2017). Empathy: The Killer App for Artificial Intelligence. [online] Digitalist Magazine by SAP. Available at: [Accessed 18 Feb. 2017].

Pham, P. and Wang, J. (2015). AttentiveLearner: Improving Mobile MOOC Learning via Implicit Heart Rate Tracking. Ph.D. University of Pittsburgh.

Ray, A. and Chakrabarti, A. (2016). Design and Implementation of Technology Enabled Affective Learning Using Fusion of Bio-physical and Facial Expression. Educational Technology & Society, [online] 19(4), pp.112–125. Available at: [Accessed 18 Feb. 2017].

The New Media Consortium, (2016). Five-year Horizon for Higher Education Institutions. [image] Available at: [Accessed 18 Feb. 2017].

The Economist Intelligence Unit Perspectives. (2015). Computing with emotions. [online] Available at: [Accessed 18 Feb. 2017].

Thomson, A. (2016). 15 Leading Affective Computing Companies You Should Know. [online] VentureRadar. Available at: [Accessed 18 Feb. 2017].

Vizer, L., Zhou, L. and Sears, A. (2009). Automated stress detection using keystroke and linguistic features: An exploratory study. International Journal of Human-Computer Studies, [online] 67(10), pp.870–886. Available at: [Accessed 18 Feb. 2017].

Wagner, J., Kim, J. and Andre, E. (2005). From Physiological Signals to Emotions: Implementing and Comparing Selected Methods for Feature Extraction and Classification. [online] Amsterdam: 2005 IEEE International Conference on Multimedia and Expo. Available at: [Accessed 18 Feb. 2017].

Wu, C., Huang, Y. and Hwang, J. (2015). Review of affective computing in education/learning: Trends and challenges. British Journal of Educational Technology, [online] 47(6), pp.1304–1323. Available at: [Accessed 18 Feb. 2017].