Abstract
Although immersive virtual reality (IVR) technology is becoming increasingly accessible, head-mounted displays with eye-tracking capability are more costly and therefore rarely used in educational settings outside of research. This is unfortunate, since combining IVR with eye tracking can reveal crucial information about learners’ behavior and cognitive processes. To overcome this issue, we investigated whether the positional tracking of learners during a short teaching exercise in IVR (i.e., microteaching) can predict their actual eye fixations on a given set of classroom objects. We analyzed the positional data of pre-service teachers from 23 microlessons by means of a random forest and compared it to two baseline models. The algorithm predicted the correct eye fixation with an F1-score of .8637, an improvement of .5770 over inferring eye fixations from the forward direction of the IVR headset (head gaze). The head gaze itself yielded a .1754 improvement over predicting the most frequent class (i.e., Floor). Our results indicate that positional tracking data can successfully approximate eye gaze in an IVR teaching scenario, making it a promising candidate for investigating pre-service teachers’ ability to direct students’ and their own attentional focus during a lesson.
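For readers who want a concrete picture of the modeling setup, the sketch below shows how a random forest classifier and a majority-class baseline of this kind could be set up with scikit-learn. Everything in it is an illustrative assumption rather than the authors’ actual pipeline: the feature columns, the synthetic data (which will not reproduce the reported F1-scores), the weighted F1 averaging, and all hyperparameters. The paper’s stronger head-gaze baseline would instead be computed inside the IVR engine by ray-casting the headset’s forward vector against the classroom objects, which is not shown here.

```python
# Minimal sketch, assuming six positional/orientation features per frame and a
# small set of classroom objects as fixation targets. All names and parameters
# here are hypothetical; real data would come from the IVR headset's tracking logs.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.dummy import DummyClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-in for positional tracking samples:
# headset position (x, y, z) and orientation (yaw, pitch, roll) per frame.
n_samples = 5000
X = rng.normal(size=(n_samples, 6))

# Synthetic fixation labels; "Floor" is made the most frequent class,
# mirroring the abstract. Random labels mean the scores below are not
# comparable to the paper's results.
objects = np.array(["Floor", "Blackboard", "StudentA", "StudentB", "Desk"])
y = rng.choice(objects, size=n_samples, p=[0.4, 0.2, 0.15, 0.15, 0.1])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# Random forest predicting the fixated object from positional features.
rf = RandomForestClassifier(n_estimators=100, random_state=42)
rf.fit(X_train, y_train)
rf_f1 = f1_score(y_test, rf.predict(X_test), average="weighted")

# Majority-class baseline: always predict the most frequent class ("Floor").
majority = DummyClassifier(strategy="most_frequent").fit(X_train, y_train)
base_f1 = f1_score(y_test, majority.predict(X_test), average="weighted")

print(f"Random forest F1:  {rf_f1:.4f}")
print(f"Majority-class F1: {base_f1:.4f}")
```

In this framing, each tracking frame is one sample and the fixated object is the class label, so comparing against `DummyClassifier(strategy="most_frequent")` directly mirrors the abstract’s weakest baseline of always predicting Floor.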
Original language | English |
---|---|
Pages (from-to) | 272-283 |
Number of pages | 12 |
Journal | CEUR Workshop Proceedings |
Volume | 3667 |
Publication status | Published - 2024 |
Event | 2024 Joint International Conference on Learning Analytics and Knowledge Workshops (LAK-WS 2024), Kyoto, Japan, 18 Mar 2024 → 22 Mar 2024 |
Keywords
- eye gaze
- eye tracking
- microteaching
- multimodal learning analytics
- positional tracking
- teacher education
- virtual reality
ASJC Scopus subject areas
- General Computer Science