Augmented Reality Human Performance and Workplace Modelling using Sensor Fusion Data
| Thesis type | |
|---|---|
| Student | Rizwan Ali |
| Status | Finished |
| Submitted in | 2018 |
| Proposal on | 15 Feb 2018, 15:00 |
| Proposal room | Bibliothek I5 |
| Presentation on | 12 Sep 2018, 14:30 |
| Presentation room | Seminarraum I5 |
| Supervisor(s) | |
| Advisor(s) | |
An immersive Augmented Reality (AR) training development framework needs to support combinations of a wide range of devices appropriate to different use cases. In a surgical training scenario, this can be an AR headset combined with a hand-tracking device that captures exact movements so that accuracy can be evaluated. It is therefore necessary to gather data from different sensors and interpret it in a common sensor fusion framework in order to record the trainee's actions.

The goal of the thesis is twofold. First, the WEKIT.one framework will be extended: new sensor hardware that can improve the training experience of apprentices will be implemented, tested, and evaluated against the given requirements. The implementation is based on the Unity SDK in combination with the Microsoft HoloLens and other AR-relevant devices. Second, we are interested in visual learning analytics of the gathered data. To this end, means for storing learner traces both locally and externally will be evaluated, with the intention of using the stored data for long-term analytics. Contributions will be specified and implemented with the help of the upcoming IEEE standard on Augmented Reality Learning Experience Models (ARLEM). As a Bachelor thesis, the scope can be adjusted.
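The core fusion problem above — sensors reporting at different rates being combined into one common record per time step — can be sketched roughly as follows. This is a minimal illustration in JavaScript, not the WEKIT.one implementation: the stream names, record layout, and zero-order-hold strategy are all assumptions made for the example.

```javascript
// Pick the most recent sample at or before time t (zero-order hold):
// a simple way to align a slow sensor with a faster master clock.
function latestBefore(samples, t) {
  let candidate = null;
  for (const s of samples) {
    if (s.t <= t && (candidate === null || s.t > candidate.t)) candidate = s;
  }
  return candidate;
}

// Fuse several per-sensor streams into one record per tick of a master clock.
// Each stream is an array of { t, value } samples; the result is one flat
// record per tick, suitable for storing as a learner trace and replaying.
function fuseStreams(streams, ticks) {
  return ticks.map((t) => {
    const record = { t };
    for (const [name, samples] of Object.entries(streams)) {
      const s = latestBefore(samples, t);
      record[name] = s ? s.value : null; // null until the sensor has reported
    }
    return record;
  });
}

// Example: a fast head-pose stream and a slower hand-tracking stream
// (hypothetical names and values, timestamps in milliseconds).
const streams = {
  headPose: [{ t: 0, value: "p0" }, { t: 16, value: "p1" }, { t: 33, value: "p2" }],
  handPose: [{ t: 10, value: "h0" }, { t: 30, value: "h1" }],
};
const fused = fuseStreams(streams, [0, 16, 33]);
// fused[1] → { t: 16, headPose: "p1", handPose: "h0" }
```

A real recorder would also need to reconcile the clocks of the individual devices before alignment; the sketch assumes all timestamps already share one time base.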
Prerequisites
Ideally, you are familiar with IoT and/or Unity development. For the analytics part, JavaScript skills are an advantage.