Detecting user states in video meetings with sensor data

  • Type: Bachelor and Master
  • Supervisor:

    Julia Seitz

  • Add on:

    Available

Motivation

The use of video meeting systems is ubiquitous in our daily lives, both at work and during leisure. Especially over the past three years, virtual meetings have become an integral part of our lives. One important challenge of virtual meetings is that a variety of negative user states can arise. One example is the uncomfortable feeling during silent moments, often called "awkward silence".

To make these moments of silence less awkward and thereby help people reach a more pleasant and productive state, this project should enhance an existing open source video meeting system to automatically detect silence as a contextual element and integrate a notification showing the sensor signal currently collected from the user (e.g. heart rate data from an ECG via a wearable device). In a first step, an approach to detect silence in the video meeting should be implemented. In a second step, the collected sensor signal should be integrated into the video meeting system, e.g. via a notification tile that shows the collected signal. Afterwards, the system should be briefly evaluated to gather user feedback.
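As a rough illustration of the first step, the sketch below shows one possible way to detect prolonged silence from the local microphone signal. It is only a minimal example under assumed choices: it uses the sounddevice library for audio capture, defines silence as a sustained drop in RMS energy, and the threshold, frame length, silence duration, and the notification hook are placeholders that would need to be adapted to the chosen video meeting system.

```python
import numpy as np
import sounddevice as sd

SAMPLE_RATE = 16_000    # Hz, common rate for speech processing
FRAME_SECONDS = 0.5     # length of each analysis window
RMS_THRESHOLD = 0.01    # energy below this counts as "silent" (tune per microphone)
SILENCE_SECONDS = 4.0   # how long silence must last before it is flagged


def frame_is_silent(frame: np.ndarray) -> bool:
    """Return True if the root-mean-square energy of the frame is below the threshold."""
    rms = np.sqrt(np.mean(np.square(frame)))
    return rms < RMS_THRESHOLD


def monitor_silence() -> None:
    """Continuously read microphone audio and flag prolonged silence."""
    silent_frames_needed = int(SILENCE_SECONDS / FRAME_SECONDS)
    consecutive_silent = 0
    frame_len = int(SAMPLE_RATE * FRAME_SECONDS)

    with sd.InputStream(samplerate=SAMPLE_RATE, channels=1, dtype="float32") as stream:
        while True:
            frame, _overflowed = stream.read(frame_len)
            if frame_is_silent(frame[:, 0]):
                consecutive_silent += 1
            else:
                consecutive_silent = 0
            if consecutive_silent == silent_frames_needed:
                # Placeholder: in the envisioned system this would trigger the
                # notification tile showing the wearable's sensor signal.
                print("Prolonged silence detected - show sensor notification")


if __name__ == "__main__":
    monitor_silence()
```

In the actual project, the detection would likely run inside or alongside the video meeting client (e.g. on the mixed meeting audio rather than only the local microphone), but the basic energy-threshold logic would be similar.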


Goal of the thesis
  • Develop the system
  • Conduct a proof-of-concept study and briefly analyze the user feedback


Recommended Skills
  • Strong time management and communication skills
  • Strong analytical and English skills
  • Programming experience