An Experimental Video Conferencing Platform to Bridge the Gap in Communication

Chloe Eghtebas
ACM UbiComp/ISWC 2023
5 min read · Aug 21, 2023

Co-authors: Alexander Liebald, Maria Pospelova, Ashika M, @JulianGeheeb, Norma Puspitasari, Jamieaward, @GudrunKlinker

Hello ISWC/Ubicomp Community! 🤩

We are students from TU Munich (supervised by very smart people like Prof. Jamie Ward and Prof. Gudrun Klinker), and we think we’ve built something you might be interested in: an open-source, self-hosted experimental-hub on which researchers (that’s you!) can host and conduct customizable online experiments. Already hooked? Check it out on our GitHub page.

Demo trailer of our experimental-hub showing an experiment being created and conducted. In this gif, the hub is hosted locally, and the same participant has joined twice with different “filters” for demonstration purposes.

Why did we build this?

During Covid, many labs began conducting remote experiments. They adopted a variety of remote solutions, from repurposing existing tools that weren’t built for the researcher workflow to developing their own tools for their unique experimental requirements. Think about designing an experiment on Zoom to test various WebAR filters you’ve implemented, and the overhead and coordination with participants that would require! 🤔 Or how certain aspects of a video conference’s UI, over which we have little control, could be confounding factors for certain experimental hypotheses.

The opportunity to extend some of these “one-off prototypes” from our own research culminated in the idea of a more generalized experimental-hub that seemed beneficial to other researchers as well. Even after Covid, we are excited about the potential of this open-source research tool to enable a multitude of online experiments, affording more rigorous laboratory control in remote settings and giving researchers access to a wider participant pool.

What makes the experimental-hub so special? 🌈️✨

TL;DR: 1) Customizable Researcher and Participant Workflows, 2) Ecological Validity and Robustness of Remotely Collected Data (e.g. video data), 3) Control over Data Privacy.

As part of our vision for the experimental-hub, we hope to bring more laboratory control to at-home video-conferencing studies through the experimenter and participant workflows we’ve designed. The workflows are built with safeguards before an experiment takes place and with a flexible UI. For example, in the Lobby room, participants’ exposure to others in the call is limited, and experimenters can check whether an applied filter is working properly or whether participants need to adjust their lighting, webcam position, or attire (e.g. glasses or hair covering their face).

Participant in Lobby being prepared before experiment starts.

On the Session Creation page, experimenters have some control over the UI: they can currently set each participant’s video position, size, and order, as well as the filters applied per participant. These values are stored in the experimental template (see here for a template example), which can be published along with results to 1) increase repeatability across experiments, or 2) serve as a building block for other researchers’ experimental designs that build on top of the existing work.
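To give a flavor of what such a template might contain, here is a minimal sketch in Python. The field names (`participants`, `position`, `size`, `filters`) are illustrative assumptions for demonstration only, not the hub’s actual schema; see the template example linked above for the real format.

```python
import json

# Illustrative sketch of a session template.
# All field names here are assumptions, not the hub's actual schema.
session_template = {
    "title": "Smile study, condition A",
    "participants": [
        {
            "name": "Participant 1",
            "position": {"x": 0, "y": 0},              # video tile position
            "size": {"width": 640, "height": 480},     # video tile size
            "filters": ["openface_au"],                # filters for this stream
        },
        {
            "name": "Participant 2",
            "position": {"x": 640, "y": 0},
            "size": {"width": 640, "height": 480},
            "filters": [],                             # control: no filter
        },
    ],
}

# Publishing this JSON alongside results makes the exact UI layout
# and filter configuration reproducible by other researchers.
print(json.dumps(session_template, indent=2))
```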

Gif shows the experimental session being modified and the corresponding changes to the JSON template.

We also hope that the experimental templates and filters our platform uses will improve ecological validity and repeatability across disciplines, e.g. ML/computer vision, communication/psychology, and HCI. Filters are at the heart of our hub’s architecture and design. A filter can analyze video/audio data in real time, manipulate it, or both, and filters can be chained together into unique pipelines where the order in which they are applied affects the output.
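As a rough illustration of the chaining idea, here is a toy sketch. The class and function names (`Filter`, `process`, `run_pipeline`) are invented for this example and are not the hub’s actual filter API; frames are stand-in lists of tags rather than real video frames.

```python
from abc import ABC, abstractmethod

class Filter(ABC):
    """Toy filter: takes a frame, returns a (possibly modified) frame."""
    @abstractmethod
    def process(self, frame):
        ...

class Grayscale(Filter):
    def process(self, frame):
        # A real filter would transform pixels; we just tag the frame.
        return frame + ["grayscale"]

class EdgeDetect(Filter):
    def process(self, frame):
        return frame + ["edges"]

def run_pipeline(filters, frame):
    # Filters are applied in order -- so the order affects the output.
    for f in filters:
        frame = f.process(frame)
    return frame

print(run_pipeline([Grayscale(), EdgeDetect()], []))  # ['grayscale', 'edges']
print(run_pipeline([EdgeDetect(), Grayscale()], []))  # ['edges', 'grayscale']
```

Swapping the two filters yields a different result, which is exactly why pipeline order is part of the experimental template.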

We know that some analysis filters, like OpenFace, are particularly useful in research, and we also know that some of you may already be developing your own to use in MR experiments (e.g. avatars, rPPG, motion prediction/inference). We have designed our filter class to be easily extensible, so you can use your own solutions in remote experiments: all the setup happens on the researcher’s computer (by selecting which filters should be applied to which participants), and participants only have to open the invite link sent to them. By the way, we think this can be a low barrier to entry for potential collaborations. If you have filters you have been wanting to try in remote settings and want to stop juggling complicated setups that use OBS plus three other software tools just to make your remote experiment happen, please reach out to us using the contact information below!
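Plugging in your own model might look roughly like the sketch below. Again, the base class (`VideoFilter`), the subclass, and the per-participant mapping are hypothetical placeholders, not the actual classes in our repo.

```python
# Hypothetical sketch of extending a filter base class with your own model.
# Class and method names are assumptions; see the repo for the actual API.

class VideoFilter:
    """Stand-in for the hub's filter base class."""
    def process(self, frame):
        return frame

class MyAvatarFilter(VideoFilter):
    """Example: replace the participant's video with an avatar rendering."""
    def __init__(self, model):
        self.model = model  # your own pre-trained model / renderer

    def process(self, frame):
        # A real filter would run inference and redraw the frame here.
        return self.model(frame)

# Researcher-side setup: choose which filters apply to which participant.
per_participant_filters = {
    "participant-1": [MyAvatarFilter(model=lambda f: f)],  # identity stub
    "participant-2": [],  # control condition: unmodified video
}
```

Participants never see any of this setup; they only open their invite link.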

Shown in this gif are some filters on our connection test page (one of our developer testing tool pages). The first part shows a filter that connects to OpenFace and prints the AU6 and AU12 values (which make up a Duchenne smile, a prerequisite from our previous research) onto the video stream in real time (notice how the values are lower when a smile is *less* present). The second part of the gif shows some other “mock filters”, perhaps not as practical, but good for demonstration purposes. Do you have a compelling filter to use with the hub?

Additionally, we hope to foster more open-science collaborations by encouraging the sharing of anonymized data, e.g. data from the real-time OpenFace model. Finally, because our experimental-hub is self-hosted, researchers retain more control over data privacy and over where potentially sensitive video and audio data is stored.

Curious to try it out?

At the moment, running the experimental-hub requires a bit of technical know-how. Improving the hub’s setup process is the main topic we evaluate in our UbiComp’23 poster paper. If you are a student, researcher, or professor in HCI (or psychology, communication, computer vision, etc.) and are considering trying out the experimental-hub for your next remote experiment, check out our GitHub setup instructions wiki and don’t hesitate to get in touch, either through our GitHub discussion forum or by email (eghtebas at in dot tum dot de). We would be eager to hear from you and possibly collaborate, as we are continually developing and evaluating our platform with you in mind.

We hope that one day the experimental-hub will grow into a community-led effort 🌱 and help us understand the gap between in-person and online communication, a gap so fundamental to designing the future of remote communication applications (e.g. MR telepresence or the “metaverse”). Until then, we look forward to meeting you at UbiComp in Mexico!
