Detecting Virtual Social Contexts: A Peek into the World of Socially Anxious Individuals

Zhiyuan Wang
ACM UbiComp/ISWC 2023
4 min read · Aug 27, 2023

Co-authors: Maria A. Larrazabal, Mark Rucker, Emma R. Toner, Katharine E. Daniel, Shashwat Kumar, Mehdi Boukhechba, Bethany A. Teachman, Laura E. Barnes

In today’s digitized world, information routinely collected by our smartphones and smartwatches provides insight into our health and behavior, including how well we’re sleeping, how much we’re moving around, and in some cases, how our body (e.g., heart rate) is responding to different activities. This increased capability of mobile sensing has allowed researchers to much more efficiently detect when individuals are engaging in certain behaviors, including those related to mental health. As these technologies become more ubiquitous, we are also better equipped to understand their implications for individuals with conditions like social anxiety.

Exploring the Tools: Ubiquitous Sensing for Mental Health Insights

Mobile devices are omnipresent. Beyond their role in daily communication and health tracking, mobile sensing has the potential to help us understand mental health in real-time. Whether it’s predicting daily mood shifts, moments of distress, or changes in behavior, passively sensed data is an invaluable resource. Beyond this, passive sensing can help us map patterns related to social interactions, whether they are happening virtually or in-person. This is particularly useful and relevant in today’s era, given the increasing demand for tailored mental health services.

Detecting Social Context: Why It Matters

A compelling application of mobile sensing is its ability to detect social contexts. This understanding can help us develop timely interventions that are well matched to the person’s context (e.g., suggesting that a person seek social support if they are around others and do a brief relaxation exercise if they are alone). This may be especially useful for socially anxious individuals, given their anxiety is likely to fluctuate alongside their social interaction patterns.

A New Era: Virtual Social Interactions

The world has swiftly embraced online platforms like Zoom and FaceTime for everyday communication. As this shift occurred rapidly, it left a gap in our understanding of how individuals experience virtual social interactions, especially if they tend to feel socially anxious. Although these virtual platforms mirror in-person interactions, they do not fully capture non-verbal cues (e.g., full body posture), and thus may be experienced differently by individuals. As our interactions migrate to these platforms, understanding the nuances of virtual chats will be increasingly important.

Study Highlights: Biobehavioral Mobile Sensing Data

Figure 1: The social context categories manipulated in this Social Interaction Monitoring study via Zoom.

To bridge this knowledge gap, this paper explored socially anxious individuals’ experience during various virtual social situations. We recruited a group of socially anxious undergraduates, set them up with wearable devices that collect biobehavioral data, and asked them to complete diverse virtual scenarios via Zoom.

Figure 2: Feature distributions during the current phase of non-social and social events. *** indicates a statistically significant difference on the Kruskal–Wallis test at p < .001. (PPG=Photoplethysmography, HRV= Heart Rate Variability, EDA=Electrodermal Activity, Std=Standard Deviation)
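The group comparisons in Figure 2 rely on the Kruskal–Wallis H-test, a nonparametric check for whether feature distributions differ across contexts. A minimal sketch of that kind of test, using synthetic stand-in data (the heart-rate values below are illustrative, not the study's measurements):

```python
# Kruskal-Wallis H-test comparing a feature distribution across
# two contexts (non-social vs. social). Data are synthetic
# stand-ins for the per-window heart-rate features in Figure 2.
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(42)
# Hypothetical per-window mean heart rate (bpm) in each context
hr_non_social = rng.normal(loc=72, scale=5, size=200)
hr_social = rng.normal(loc=78, scale=6, size=200)

h_stat, p_value = kruskal(hr_non_social, hr_social)
print(f"H = {h_stat:.2f}, p = {p_value:.2e}")
if p_value < 0.001:
    print("*** significant at p < .001")
```

The test makes no normality assumption, which suits skewed physiological features like HRV and EDA.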

By utilizing a robust machine learning pipeline, this work studied the feasibility of detecting whether participants were:

  • Engaged in a social situation.
  • Part of a group of a specific size.
  • Experiencing varied degrees of social evaluation.
  • Navigating through different temporal phases of social interactions.

Using this method, we found differences in physiological responses (e.g., heart rate) across diverse contexts. Together with the results of deep learning models, this study shows it is feasible to accurately differentiate between most (assessed) virtual social contexts, such as recognizing whether an individual was in a social situation or not, or talking with one other person versus a group of people.

Figure 3: Proposed machine learning pipeline for social context detection.
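A minimal sketch in the spirit of the Figure 3 pipeline: scale biobehavioral features, then classify social vs. non-social windows. The feature columns and data below are hypothetical illustrations; the paper's actual pipeline (including its deep learning models) differs.

```python
# Generic social-context classification sketch: standardize
# wearable features, fit a classifier, estimate accuracy via
# cross-validation. All data here are synthetic.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 400
# Columns (hypothetical): mean HR, HRV NN-interval std, EDA mean
X = rng.normal(size=(n, 3))
y = rng.integers(0, 2, size=n)   # 0 = non-social, 1 = social
X[y == 1, 0] += 1.0              # social windows: elevated heart rate

clf = make_pipeline(StandardScaler(), RandomForestClassifier(random_state=0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

In a real deployment, cross-validation would be done across participants (leaving whole users out) so the reported accuracy reflects generalization to unseen people.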

The Study’s Innovations: Do More Than Detect

This study stands out for its depth and breadth, bringing us closer to building context-aware Just-In-Time Adaptive Interventions (JITAIs):

Key Takeaways:

  • Performance in Social Context Detection: Our models showcased promising accuracy levels (78.01% differentiating non-social from social contexts). With personalized model tuning, our study achieved up to 85.95% accuracy in distinguishing social situations.
  • Temporal Insights into Interactions: Our models adeptly identified the various phases of a social interaction with a 73.81% accuracy rate when pinpointing active conversation moments, showcasing potential for real-world applications.
  • Challenges with Evaluative Detection: Determining evaluative contexts presented hurdles, with only a 50% accuracy rate. This emphasizes the intricate nature of such contexts and the need for further research.
  • Variability in Sensor and Feature Importance: While the combined use of all sensors typically provided optimal results, individual sensors like PPG and EDA showed varying levels of effectiveness across tasks. Certain features, such as the PPG-derived HRV NN-interval, emerged as consistently valuable across contexts.
  • Individual Differences: Our data reveal significant variability among participants, emphasizing the need for personalized detection systems. Using even a fraction of an individual’s own data to personalize models boosts prediction accuracy.
  • Audio’s Potential: Incorporating audio features helps distinguish between types of social interactions. While audio boosts accuracy, privacy remains paramount.
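One simple form of the personalization idea above can be sketched as follows: a generic model trained on other users can fail when a target user's physiological baseline differs, while a model given a small labeled fraction of that user's own data adapts. Everything below (the single feature, the baseline shift, the 20% split) is a synthetic illustration, not the study's method or data.

```python
# Personalization sketch: generic model (other users' data) vs.
# a model trained on a small fraction of the target user's data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def make_user(shift, n=300):
    """Synthetic user: one feature, with a user-specific baseline shift."""
    y = rng.integers(0, 2, size=n)
    X = rng.normal(size=(n, 1)) + y[:, None] + shift
    return X, y

X_others, y_others = make_user(shift=0.0)
X_target, y_target = make_user(shift=2.0)  # target user's baseline differs

# Generic model: trained only on other users
generic = LogisticRegression().fit(X_others, y_others)

# Personalized model: trained on the first 20% of the target's data
k = len(y_target) // 5
pers = LogisticRegression().fit(X_target[:k], y_target[:k])

test_X, test_y = X_target[k:], y_target[k:]
print("generic:     ", generic.score(test_X, test_y))
print("personalized:", pers.score(test_X, test_y))
```

The generic model's decision threshold is calibrated to the wrong baseline, so even a small personal sample moves the boundary to where it belongs for that user.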

Future Implications: What Lies Ahead

The results of this study pave the way for future research in context detection. By understanding the contexts and triggers of social anxiety in virtual interactions, there’s potential for context-aware Just-In-Time Adaptive Interventions.

If you’re intrigued by the interplay of technology, virtual social interactions, and social anxiety, dive deeper into our research paper. We thank you for joining us on this exciting journey into the digital psyche of socially anxious individuals. See you at UbiComp 2023!

Zhiyuan Wang & Maria A. Larrazabal
