This repository contains the software components developed for an academic paper on the HINTS (Head Impulse, Nystagmus, Test of Skew) examination using the Microsoft HoloLens 2 mixed reality headset. The system consists of two main components:
- Unity Application for HoloLens 2: A dedicated application that records eye and head tracking data during the HINTS examination protocol, and can also serve as a recording tool when the HINTS exam is performed by experts.
- Python Processing Pipeline: Scripts and notebooks for processing, analyzing, and visualizing the raw data collected by the HoloLens 2 application.
This system enables clinicians and researchers to conduct standardized HINTS examinations while capturing precise eye and head movement data for later analysis.
The Unity application (`unity_app_hl2`) is designed to run on Microsoft HoloLens 2 and guides the examiner through the standardized HINTS protocol while recording eye tracking, head movement, and audio data.
- Structured Examination Protocol: Guides the examiner through each step of the HINTS examination with timed audio prompts
- Real-time Eye Tracking: Captures detailed eye gaze data using HoloLens 2's eye tracking capabilities
- Head Movement Tracking: Records head position and rotation during examination
- Audio Recording: Captures audio for clinical notes and observations
- Data Persistence: Saves all recorded data in structured format for later analysis
- Developed with Unity (optimized for HoloLens 2)
- Utilizes Mixed Reality Toolkit (MRTK) for HoloLens 2 integration
- Implements `ExtendedEyeGazeDataProvider` for enhanced eye tracking capabilities
- Built with C# following component-based architecture principles
The Python component (`python_processing_raw_data`) provides a complete pipeline for processing, analyzing, and visualizing the raw data collected by the HoloLens 2 application.
- Raw Data Processing: Converts raw .txt files into structured CSV format
- Data Organization: Organizes data by patient and measurement actions
- Visualization Tools: Provides comprehensive data visualization with Plotly
- Analysis Framework: Supports feasibility analysis and feature extraction
- Containerized Environment: Includes Docker configuration for reproducible analysis
- Python-based processing pipeline
- Jupyter notebooks for interactive analysis
- Dockerized environment for easy setup and reproducibility
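As a concrete illustration of the raw-data conversion step, the sketch below parses a hypothetical raw log line into a flat record and writes records to CSV. The line layout (`timestamp;origin;direction` with comma-separated coordinates) and the function names are assumptions for illustration only; the actual format is defined by the HoloLens 2 application and the notebooks.

```python
import csv
import io

# Hypothetical raw line layout (the real HoloLens log format may differ):
#   timestamp;origin_x,origin_y,origin_z;dir_x,dir_y,dir_z
def parse_raw_line(line):
    """Split one raw log line into a flat record (illustrative only)."""
    timestamp, origin, direction = line.strip().split(";")
    ox, oy, oz = (float(v) for v in origin.split(","))
    dx, dy, dz = (float(v) for v in direction.split(","))
    return {
        "timestamp": timestamp,
        "origin_x": ox, "origin_y": oy, "origin_z": oz,
        "dir_x": dx, "dir_y": dy, "dir_z": dz,
    }

def raw_to_csv(lines, out):
    """Write parsed records to a CSV stream with a header row."""
    records = [parse_raw_line(l) for l in lines if l.strip()]
    writer = csv.DictWriter(out, fieldnames=list(records[0]))
    writer.writeheader()
    writer.writerows(records)

buf = io.StringIO()
raw_to_csv(["1690000000;0.0,1.6,0.0;0.1,-0.2,0.97"], buf)
```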
**Prerequisites:**
- Unity 2021.3 LTS or later
- Mixed Reality Toolkit (MRTK) for Unity
- Visual Studio 2019 or later with UWP development tools
- Windows 10 with Windows SDK 18362 or later
**Installation:**
- Clone this repository
- Open the project in Unity (`unity_app_hl2/SampleEyeTrackingHL2hints`)
- Configure your development environment for HoloLens 2 using MRTK
- Known issues:
  - Import the 'Newtonsoft' NuGet package manually if the Newtonsoft namespace is missing.
  - The 'Extended Eye Gaze Data Provider' reference inside the 'ExtendedEyeTrackerHLhints' game object can be missing; simply drag the object into the field that reports it as missing.
- Build the solution for ARM64 architecture
- Deploy to HoloLens 2 using Visual Studio or the Device Portal
**Usage:**
- Launch the application on HoloLens 2
- Perform calibration when prompted
- Follow audio instructions for each examination step
- Data will be automatically saved to the device for later export
**Option 1: Using Docker (Recommended):**
```bash
# Navigate to the python_processing_raw_data directory
cd python_processing_raw_data

# Build and start the Docker container
docker-compose up --build -d

# Access the notebooks via Visual Studio Code or your browser
# (follow the README.md in python_processing_raw_data for details)
```
**Option 2: Local Setup:**
```bash
# Navigate to the python_processing_raw_data directory
cd python_processing_raw_data

# Install dependencies
pip install --no-cache-dir -r requirements.txt

# Run the notebooks using Jupyter
jupyter notebook
```
**Processing Data:**
- Place raw data files in the `app/RawData/` directory
- Run `1_PreProcessRawData.ipynb` to convert and organize the data
- Run `2_plot4feasibilityStudy.ipynb` to generate visualizations and analysis
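The "organize by patient" part of the preprocessing step can be sketched as follows. The `<patientID>_<action>.txt` naming scheme and the function name are assumptions for illustration; the notebooks define the actual organization logic.

```python
from pathlib import Path

def organize_by_patient(raw_dir, out_dir):
    """Copy raw .txt files into per-patient subfolders keyed by file-name prefix.

    Assumes a hypothetical naming scheme <patientID>_<action>.txt; the
    notebooks define the real scheme used by the pipeline.
    """
    raw_dir, out_dir = Path(raw_dir), Path(out_dir)
    for f in sorted(raw_dir.glob("*.txt")):
        patient_id = f.stem.split("_")[0]  # text before the first underscore
        dest = out_dir / patient_id
        dest.mkdir(parents=True, exist_ok=True)
        (dest / f.name).write_bytes(f.read_bytes())
```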
- `.txt` files containing eye tracking and head movement data
- `.wav` files for audio recordings
- Structured CSV files organized by patient and examination action
- Generated features and labels for analysis
- Visualization outputs and images
The application guides clinicians through the following examination steps:
- Initial setup and calibration
- Room examination (scanning the environment)
- Stationary gaze fixation
- Nose examination (convergence test)
- Left and right gaze tests
- Ceiling and floor gaze tests
- Head movement test (for vestibulo-ocular reflex)
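For analysis purposes, the protocol above can be modeled as an ordered sequence of timed steps, so that recorded samples can be attributed to the step active at their timestamp. The step names and durations below are illustrative placeholders, not the timings used by the Unity application.

```python
from dataclasses import dataclass

# Illustrative names and durations only; the Unity app defines the real
# prompt sequence and timing.
@dataclass
class ExamStep:
    name: str
    duration_s: float

HINTS_PROTOCOL = [
    ExamStep("calibration", 30.0),
    ExamStep("room_examination", 15.0),
    ExamStep("stationary_gaze", 10.0),
    ExamStep("nose_convergence", 10.0),
    ExamStep("gaze_left", 10.0),
    ExamStep("gaze_right", 10.0),
    ExamStep("gaze_ceiling", 10.0),
    ExamStep("gaze_floor", 10.0),
    ExamStep("head_impulse", 20.0),
]

def step_at(elapsed_s):
    """Return the name of the protocol step active at a given elapsed time."""
    t = 0.0
    for step in HINTS_PROTOCOL:
        t += step.duration_s
        if elapsed_s < t:
            return step.name
    return None  # past the end of the protocol
```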
The Python notebooks demonstrate:
- Data preprocessing and organization
- Visualization of eye tracking patterns
- Feature extraction for potential machine learning tasks
- Quality assessment of collected data
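As one example of the kind of feature such notebooks can extract, the sketch below computes angular gaze velocity between consecutive gaze direction samples, a quantity commonly used to detect saccades. This is an illustrative feature under assumed input conventions (normalized direction vectors with timestamps in seconds), not the feature set defined in the notebooks.

```python
import math

def angular_velocity(samples):
    """Angular speed (deg/s) between consecutive unit gaze direction vectors.

    samples: list of (timestamp_s, (dx, dy, dz)) with normalized directions.
    Illustrative feature only; the notebooks define the actual features.
    """
    speeds = []
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        # Clamp the dot product to guard acos against rounding error.
        dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(v0, v1))))
        angle_deg = math.degrees(math.acos(dot))
        speeds.append(angle_deg / (t1 - t0))
    return speeds
```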
This software was developed as part of academic research investigating the use of mixed reality and eye tracking for vestibular assessment. The HINTS exam is a critical tool for differentiating central from peripheral causes of acute vestibular syndrome.
This implementation allows for:
- Standardized examination protocol
- Objective measurement of eye and head movements
- Data collection for further analysis and potential diagnostic assistance
If you use this code or data in your research, please cite our paper: [Citation information will be added upon publication]
This project is licensed under the terms included in the LICENSE file.
- Microsoft Mixed Reality Team for HoloLens 2 and MRTK
- Contributors to the research and development of this system
- All volunteers who participated in data collection