Week 15–16

Adil Mufti
3 min read · Mar 11, 2023


20th February — 6th March

This is the final development blog! This project has definitely been a learning journey. There were many setbacks with the technology we used, but the project was still so enjoyable.

Before getting into our progress, I would like to thank Ayca Bas and Lee Scott from Microsoft. They have been supportive throughout and have always given appropriate feedback. They have been especially helpful within the last few weeks!

Teams Application

First we will talk about what we have managed to do with our Microsoft Teams app. I have integrated around 8 games into the application, all of which run inside a Teams meeting. There is a variety of multiplayer and single-player games; some use the mouse cursor and some do not. For example, Tetris uses the arrow keys on your keyboard. Integrating a selection of games with different controls will allow a wide range of motion-input gestures to be utilised in our application. The more gestures that can be used with our app, the more muscles can be exercised! We will continue to work in collaboration with other teams to achieve this. However, the blocker of Live Share SDK not working is still there, so games still cannot be played collaboratively inside a Teams meeting. This is unfortunately out of our control :(

For Teams, we want to polish the UI before testing begins. We will look into unit tests and UI tests, as well as human testing.

Movement Data Collection

Data collection working in real-time

We have also begun working on data recording for gestures. This is such an important aspect for us, because measuring motion data builds the physiotherapy capabilities of the software. Right now we calculate shoulder and elbow angles, as well as arm extensions, and we count repetitions of a motion. For the angles, extensions and distances, we calculate a value every second and store it in an array; the values saved are the maximum and minimum in each array. At the end, when the video has stopped running, all of the saved data is written into a text file stored on the computer of the person running the video. This solution is built on MediaPipe and can immediately be shipped into MotionInput; however, we are just awaiting the green light from seniors before this happens.
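To give a flavour of how this works, here is a minimal sketch of the angle calculation and min/max recording described above. It assumes 2-D landmark coordinates such as those produced by MediaPipe Pose; the names `joint_angle` and `AngleRecorder` are illustrative, not taken from our actual codebase.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at point b, formed by the segments b->a and b->c.
    For the elbow angle, a/b/c would be the shoulder/elbow/wrist landmarks."""
    ang = math.degrees(
        math.atan2(c[1] - b[1], c[0] - b[0])
        - math.atan2(a[1] - b[1], a[0] - b[0])
    )
    ang = abs(ang)
    return 360 - ang if ang > 180 else ang

class AngleRecorder:
    """Collects one sample per second; only the max and min are saved."""
    def __init__(self):
        self.samples = []

    def record(self, angle):
        # Called roughly once a second while the video is running.
        self.samples.append(angle)

    def save(self, path):
        # Called once the video stops; writes the summary to a text file.
        with open(path, "w") as f:
            f.write(f"max: {max(self.samples):.1f}\n")
            f.write(f"min: {min(self.samples):.1f}\n")
```

In the real pipeline the points would come from the pose-landmark output each frame; keeping only the extremes per session is what makes the text file a compact record of the range of motion.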

Summary:

NDI — Done ✅

Data Collection — Done ✅ (As much as we can for now)

Teams — Nearly Done ⏳

We now need to integrate a few more games, then we will be done with the Teams aspect of our project. If Live Share SDK starts working, though, we will have to try to integrate Live Share separately into every page of the application, in limited time. We are due to meet Mr Bob German from Microsoft, who has worked on Live Share SDK, so he will hopefully help us fix the issue!

Thank you to whoever is reading this for keeping up with our blogs!

Written by Adil

(At the time of uploading this blog, Live Share SDK has started working as of the 9th of March. It will be a sprint to integrate it into our Teams app by the end of the project!)
