
Guided Knowledge Presentations with Mixed Reality Agents

August 29th, 2024

Mixed reality agents are simulated humans that are displayed in a mixed reality environment, e.g., in augmented reality or virtual reality. They provide the opportunity to support teaching activities with automation. Possible use cases include general presentations in one place, e.g., of lecture content, and station-based routes as seen in museums or with tourist guides. If mixed reality agents can take over these repetitive tasks, a consistent quality of the conveyed content is ensured, as each run is the same. The agent has the potential to be superior to recorded video presentations, as it could possess interactive capabilities, e.g., for answering common questions. Moreover, with the right data structure to encode the presented content, the presentation of an agent can be changed and updated by quickly adjusting the underlying configuration file, whereas videos require re-recording the content.
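As a rough illustration of this idea, the sketch below shows how presented content could be encoded in a serializable C# structure and loaded from a JSON file with Unity's JsonUtility. The class and field names (PresentationConfig, PresentationSlide, etc.) are hypothetical assumptions for this example and not part of the provided framework.

```csharp
using System;
using UnityEngine;

// Hypothetical sketch: presentation content encoded as a serializable data
// structure, so updating the presentation means editing the JSON file
// instead of re-recording a video.
[Serializable]
public class PresentationSlide
{
    public string title;          // heading shown next to the agent
    public string audioClipName;  // narration audio to play for this slide
    public string gestureName;    // animation trigger, e.g. "Point" or "Wave"
}

[Serializable]
public class PresentationConfig
{
    public PresentationSlide[] slides;

    // Load a presentation from a JSON text asset in the Resources folder.
    public static PresentationConfig Load(string resourcePath)
    {
        var json = Resources.Load<TextAsset>(resourcePath);
        return JsonUtility.FromJson<PresentationConfig>(json.text);
    }
}
```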

Thesis Type: Bachelor
Student: Julia Tscholakov
Status: Running
Presentation room: Seminar room I5 6202
Supervisor(s): Stefan Decker
Advisor(s): Benedikt Hensen
Contact: hensen@dbis.rwth-aachen.de

Therefore, the goal of this thesis is to realize a mixed reality agent which is able to present content to an audience and to investigate what effects an agent as the presenter has on students. For this task, we provide a framework for creating virtual agents which can be extended and utilized to create a mixed reality learning application. As a conceptual output of the thesis, a data model needs to be devised which instructs the agent how to perform the presentation, e.g., to play gestures at the right moments according to the audio track. To support the tour-guide use case, the data model should also be able to specify walking commands so that the agent can walk from one point of interest to another. In the application, one or two example presentations should be set up to demonstrate the functionality and to prepare the evaluation. The application should preferably be realized on Android smartphones in augmented reality, but a realization in virtual reality with the Meta Quest 2 is also possible.

In the subsequent evaluation, user interactions, engagement, and learning impact can be measured. For instance, a comparative user study could let participants experience both a human-guided presentation and one with the mixed reality agent as the presenter. During the experience, data about how participants interact with the guide/presenter can be collected. Moreover, users can be asked afterwards about their perceived motivation and workload, and a test can be administered to assess whether their understanding of the presented topic changed due to the agent.
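To make the intended data model more concrete, the following sketch outlines how an agent script in Unity might step through such a presentation: walking to a point of interest with a NavMeshAgent, playing the narration audio, and triggering a gesture at a configured moment. All class, field, and trigger names are illustrative assumptions rather than part of the existing framework.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.AI;

// Hypothetical execution loop: the agent walks to a point of interest,
// plays the narration audio, and triggers the configured gesture.
public class AgentPresenter : MonoBehaviour
{
    [System.Serializable]
    public class PresentationStep
    {
        public Transform pointOfInterest;  // where the agent should stand
        public AudioClip narration;        // audio track for this step
        public string gestureTrigger;      // Animator trigger, e.g. "Point"
        public float gestureDelay;         // seconds into the audio to gesture
    }

    public PresentationStep[] steps;
    public NavMeshAgent navAgent;
    public Animator animator;
    public AudioSource audioSource;

    private IEnumerator Start()
    {
        foreach (var step in steps)
        {
            // Walk to the next point of interest and wait until it is reached.
            navAgent.SetDestination(step.pointOfInterest.position);
            yield return new WaitUntil(() =>
                !navAgent.pathPending && navAgent.remainingDistance < 0.2f);

            // Start the narration and fire the gesture at the configured time.
            audioSource.clip = step.narration;
            audioSource.Play();
            yield return new WaitForSeconds(step.gestureDelay);
            animator.SetTrigger(step.gestureTrigger);

            // Wait for the rest of the audio before moving on.
            yield return new WaitWhile(() => audioSource.isPlaying);
        }
    }
}
```

In a comparative evaluation as described above, the same step list could drive every run of the agent-guided condition, which keeps the presented content identical across participants.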


Prerequisites:

Required knowledge: C# or Java
Beneficial knowledge: mixed reality, Unity 3D engine