Revolutionizing Emotional Analysis and Feedback Capabilities with AIGC

Keqin Zhao
b8125-spring2024
Apr 23, 2024

Traditional Approaches To Emotional Analysis

Traditionally, emotional analysis relied heavily on sentiment dictionaries and statistical algorithms. These dictionaries contain four major categories: negation words, positive sentiment words, negative sentiment words, and adverbs of degree.

Analyzing and classifying emotions typically involves eight steps:

  1. Data Reading and Sentence Segmentation: Utilizing specific punctuation marks as sentence delimiters.
  2. Tokenization: Employing tokenization dictionaries and sentiment dictionaries for segmentation.
  3. Identifying Sentiment Words: Scanning sentences to identify positive or negative sentiment words and recording their positions.
  4. Adverb Handling: Analyzing adverbs of degree preceding sentiment words to assign varying weights based on intensity.
  5. Negation Handling: Detecting negation words before sentiment words and adjusting sentiment values based on the parity of negation word occurrences.
  6. Punctuation Handling: Paying particular attention to exclamation marks and question marks, which influence sentiment value computation.
  7. Calculating Sentiment Values: Computing sentiment values for each sentence and comparing the aggregate positive and negative sentiment values to determine the overall emotional inclination.
  8. Summarizing Emotional Inclination: Deriving the emotional inclination of the entire comment based on the sum of sentence sentiment values.
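The core of steps 2–7 can be sketched in a few dozen lines. The word lists and weights below are tiny illustrative stand-ins, not a real sentiment lexicon, and the three-word lookback window for modifiers is an assumption for brevity:

```python
# Minimal sketch of the dictionary-based pipeline described above.
# POSITIVE/NEGATIVE/NEGATIONS/DEGREE are toy stand-ins for real lexicons.

POSITIVE = {"happy", "great", "satisfied"}
NEGATIVE = {"sad", "terrible", "broken"}
NEGATIONS = {"not", "never", "no"}
DEGREE = {"slightly": 0.5, "very": 2.0, "extremely": 3.0}

def sentence_score(sentence: str) -> float:
    """Score one sentence: sentiment words weighted by degree adverbs,
    flipped by an odd number of preceding negations, and boosted by a
    trailing exclamation mark."""
    words = sentence.rstrip("!?.").lower().split()
    score = 0.0
    for i, word in enumerate(words):
        polarity = 1.0 if word in POSITIVE else -1.0 if word in NEGATIVE else 0.0
        if polarity == 0.0:
            continue
        weight = 1.0
        negations = 0
        # Scan up to three words before the sentiment word for modifiers.
        for prev in words[max(0, i - 3):i]:
            if prev in DEGREE:
                weight *= DEGREE[prev]
            elif prev in NEGATIONS:
                negations += 1
        if negations % 2 == 1:  # odd negation count flips polarity (step 5)
            polarity = -polarity
        score += polarity * weight
    if sentence.endswith("!"):  # punctuation intensifies the value (step 6)
        score *= 1.5
    return score

def comment_score(comment: str) -> float:
    """Sum sentence scores to get the comment's overall inclination (step 8)."""
    marked = comment.replace("!", "!|").replace("?", "?|").replace(".", ".|")
    sentences = [s.strip() for s in marked.split("|") if s.strip()]
    return sum(sentence_score(s) for s in sentences)
```

For example, `sentence_score("I am very happy")` returns `2.0`, while `sentence_score("I am not happy")` returns `-1.0` — the negation flips the polarity but the specific emotion is lost, which is exactly the limitation discussed below.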

Although these methods could already achieve roughly a 73% accuracy rate in identifying emotional inclination, they capture only a simplified view of emotion: a rough assessment of polarity (positive/negative) and intensity, without precise insight into the specific emotions involved.

For instance, when I stated, “Today is the anniversary of my grandfather’s passing,” these methods could only determine a negative emotional state without discerning whether I was feeling sad, angry, or anxious.

AIGC’s Transformative Impact on Emotional Analysis

However, following the explosion of AIGC, accurate sentiment discernment has transitioned from "impossible" to "possible." Trained on extensive data, AIGC models can effectively assess the exact emotional states conveyed by users.

1. Text Model: LLM-based text models already showcase strong emotional analysis capabilities.

  1. Direct Emotional Words: When users make direct emotional statements or include emotional words in a sentence, the model can accurately identify the conveyed emotion. For instance, if I tell an LLM, “It seems that every time when I feel satisfied and happy, things start to break,” the model would interpret my emotion as “sad and worried.”
  2. Tone Indicators: AIGC models can also recognize the intended emotion conveyed through tone, even detecting sarcasm.
  3. Implicit Emotional Expression: LLMs can infer underlying emotions from long passages with no overt emotional vocabulary, where the feeling must be read from context. They can also accurately analyze the emotions depicted in literary works.
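One common way to get fine-grained labels like these out of an LLM is to constrain its answer to a fixed emotion set and parse the reply. The sketch below shows only the prompt construction and parsing; the actual model call is left out, and the `EMOTIONS` label set is an illustrative assumption that can be wired to any chat API:

```python
# Sketch of prompting an LLM for fine-grained emotion labels.
# The model call itself is omitted; plug in any chat-completion API.

EMOTIONS = ["joy", "sadness", "anger", "fear", "surprise", "worry", "neutral"]

def build_prompt(text: str) -> str:
    """Constrain the model to a fixed label set so the reply is parseable."""
    return (
        "Classify the emotion(s) expressed in the text below.\n"
        f"Answer with one or more of: {', '.join(EMOTIONS)}.\n\n"
        f"Text: {text}"
    )

def parse_labels(reply: str) -> list[str]:
    """Keep only labels from the allowed set, ignoring case and filler."""
    reply = reply.lower()
    return [e for e in EMOTIONS if e in reply]
```

With the article's example sentence, a model reply such as "Sadness and worry." would parse to `["sadness", "worry"]` — a far more specific result than the positive/negative verdict of the dictionary approach.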

2. Multi-modal Model: Beyond textual analysis, the recently launched product Hume.AI can analyze emotions from audio, images, and videos.

  1. Audio Analysis: Hume.AI, trained on millions of human interactions, uses language modeling and text-to-speech technology to capture subtle voice variations, including non-verbal cues such as tone, rhythm, pitch, laughter, sighs, acknowledgments, and cries. From these variations, the model can accurately identify the emotions users express through voice. It can also converse directly with users in a notably human-like tone, with humor, colloquialisms, emotional inflections, and filler phrases such as “Ah, I see!” rather than flat, direct responses.
  2. Image/Video Analysis: Hume.AI can analyze users’ facial expressions from images or videos, discerning subtle facial movements conveying love, admiration, awe, disappointment, or empathetic pain.

Conclusion

Identifying human emotions was once a daunting task, but with AIGC it has become achievable. While this field is still evolving, it points toward a future where AI will not only understand but also respond to human emotions in nuanced and empathetic ways. As we continue to explore and refine these technologies, the potential for enhancing human-computer interactions, mental health support systems, and personalized user experiences becomes increasingly tangible.
