Database Open Access
Eye Tracking Dataset for the 12-Lead Electrocardiogram Interpretation of Medical Practitioners and Students
Mohammed Tahri Sqalli , Dena Al-Thani , Mohamed Elshazly , Mohammed Al-Hijji
Published: March 16, 2022. Version: 1.0.0
When using this resource, please cite:
Tahri Sqalli, M., Al-Thani, D., Elshazly, M., & Al-Hijji, M. (2022). Eye Tracking Dataset for the 12-Lead Electrocardiogram Interpretation of Medical Practitioners and Students (version 1.0.0). PhysioNet. https://doi.org/10.13026/gsr5-8b11.
Please include the standard citation for PhysioNet:
Goldberger, A., Amaral, L., Glass, L., Hausdorff, J., Ivanov, P. C., Mark, R., ... & Stanley, H. E. (2000). PhysioBank, PhysioToolkit, and PhysioNet: Components of a new research resource for complex physiologic signals. Circulation [Online]. 101 (23), pp. e215–e220.
Abstract
This dataset contains eye tracking data collected from medical practitioners and students with different expertise levels to understand their electrocardiogram (ECG) interpretation behavior. These medical practitioners encounter electrocardiograms during their daily medical practice. The end goal of collecting this dataset was to analyze it to uncover insights about key best practices as well as pitfalls in the ECG interpretation process. The data consists of quantitative eye tracking measurements of different eye gaze features for 63 participants. These participants belong to six expertise categories: medical students, nurses, technicians, residents, fellows, and consultants. Each participant contributed to the data collection by interpreting ten different electrocardiograms. The eye tracking data includes fixation counts and durations, gaze durations, and fixation revisitations. It was collected at 60 frames per second using a Tobii Pro X2-60 eye tracker. Each participant was allowed 30 seconds to interpret each of the ten ECGs. The collected data was processed using two defined area-of-interest distributions, which enabled the extraction of metrics specifically tailored to the 12-lead electrocardiogram.
Background
The electrocardiogram (ECG) is among the most widely used medical diagnostic tests globally, with over 300 million ECG tests conducted per year in the United States alone [1]. The electrocardiogram is a graph that shows the electrical activity of the human heart. The 12-lead ECG observes this activity from 12 different perspectives called leads. Accurately interpreting an ECG contributes to correctly diagnosing the arrhythmia being investigated. The current challenge, in addition to the lack of skilled ECG interpreters in the medical workforce, is the shortage of guidelines and educational material explaining how skilled interpreters accurately interpret an ECG [2]. This leads to a growing expertise gap between skilled interpreters and novices who are beginning to develop the interpretation skill.
As the medical field moves towards the digitalization of most of its domains, this dataset was collected with the aim of contributing to that trend. It was collected as part of an exploratory eye tracking study to understand the ECG interpretation behavior of medical students and practitioners with different levels of expertise in ECG interpretation. These medical practitioners encounter ECGs during their daily medical practice. The end goal of collecting this dataset was to analyze it to uncover insights about key best practices as well as pitfalls in ECG interpretation. A secondary aim was to explore novel pathways for delivering ECG interpretation training to medical students and practitioners, mainly by incorporating eye tracking and machine learning into that training. The resulting eye tracking data served as input for published work that performed a quantitative analysis of the medical students' category [3] as well as the remaining medical practitioners' categories [4,5].
Methods
The collected data comes from an experimental study that used eye tracking to measure the ECG interpretation behavior of medical practitioners. Eye tracking data was collected using a Tobii Pro X2-60 eye tracker and iMotions version 8.1 software [6] to track eye movements at a frequency of 60 Hz (±1 Hz). iMotions calculates gaze and fixation durations according to the eye micro-saccades. Micro-saccades are movements that cover a much shorter distance than normal saccades, around 15 arcminutes [6]. The fixation algorithm used by iMotions is the I-VT (Velocity-Threshold Identification) fixation filter [8], with the gap fill-in (interpolation) option disabled and noise reduction disabled. Regarding the fixation filter parameters, the window length is 20 milliseconds and the velocity threshold is 30 degrees per second. Merging of adjacent fixations and discarding of short fixations are both disabled. The data was collected in real time while medical practitioners performed the ECG interpretation. A total of 630 ECG interpretations were collected: 63 medical practitioners from the six expertise categories each interpreted ten different ECGs. Each participant was allowed a limited time of 30 seconds per ECG. Table 2 in the Tables folder lists the different ECG abnormalities that the participants were asked to interpret, while the ECGs folder contains all 10 ECG images used in the experiment.
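For readers who want to apply a comparable fixation classification to their own raw gaze data, the sketch below illustrates the velocity-threshold idea behind the I-VT filter with the parameters reported above (30 degrees per second threshold, no gap fill-in, no merging of adjacent fixations, no discarding of short fixations). It is not the iMotions/Tobii implementation; the function name is ours, and it assumes the gaze coordinates have already been converted to degrees of visual angle.

```python
# Minimal sketch of an I-VT style fixation classifier (illustration only,
# not the iMotions/Tobii implementation).
import numpy as np

def ivt_fixations(gaze_deg, sample_rate_hz=60.0, velocity_threshold=30.0):
    """Classify gaze samples into fixations using a velocity threshold.

    gaze_deg : (N, 2) array of gaze positions in degrees of visual angle.
               Converting pixel coordinates to degrees requires the screen
               geometry and viewing distance, which are omitted here.
    Returns a list of (start_index, end_index, duration_seconds) tuples.
    """
    gaze_deg = np.asarray(gaze_deg, dtype=float)
    dt = 1.0 / sample_rate_hz                      # ~16.7 ms between samples at 60 Hz
    # Point-to-point angular velocity in degrees per second.
    step = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1) / dt
    velocity = np.concatenate([[0.0], step])
    is_fixation = velocity < velocity_threshold    # I-VT decision rule

    fixations, start = [], None
    for i, fix in enumerate(is_fixation):
        if fix and start is None:
            start = i                              # fixation run begins
        elif not fix and start is not None:
            fixations.append((start, i - 1, (i - start) * dt))
            start = None
    if start is not None:                          # close a trailing run
        n = len(is_fixation)
        fixations.append((start, n - 1, (n - start) * dt))
    return fixations
```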
Data Description
The table below lists all the figures and tables referenced in this section, along with their file names and locations in the files repository. An extended version of the same table, including the figures used in the eye tracking experiment, was uploaded to the files repository.
Figure or Table | Number | File Name | File Location Path | Description
---|---|---|---|---
Figure | 1 | Grid_AOI.png | <base>/AOI_Distributions/ | Grid-based distribution applied to the normal sinus rhythm ECG.
Figure | 2 | Short_AOIs.png | <base>/AOI_Distributions/ | Long versus short areas of interest distribution applied to the normal sinus rhythm ECG.
Figures | 3 - 12 | ECGs | <base>/ECGs/ | All 10 ECG images used in the eye tracking experiment.
Table | 1 | Table_1_Demographics.pdf | <base>/Tables/ | Demographics of participants in each expertise category.
Table | 2 | Table_2_ECG_abnormalities.pdf | <base>/Tables/ | Different ECG abnormalities that the participants were asked to interpret.
Table | 3 | Table_3_Eye_Tracking_Features.pdf | <base>/Tables/ | Definitions of all the collected eye tracking features.
The data is intended to be used to uncover additional eye tracking insights about the interpretation process of medical students and practitioners. It contains quantitative eye tracking measurements for the interpretation of 10 different ECGs. Table 2 in the Tables folder showcases the different heart abnormalities used in the experiment, and Table 3 defines the quantitative measurements for the collected eye tracking features for all the participants across two defined area-of-interest (AOI) distributions.

AOIs are eye tracking tools for selecting regions of a displayed stimulus; in our case, the stimulus is the displayed 12-lead ECG. They enable the extraction of metrics specifically for those regions, which here are the 12 leads composing the ECG. Although not strictly a metric by itself, an AOI defines the area over which other eye tracking metrics are calculated, and AOIs allow visual attention to be quantified at both the aggregate and the individual level. Figures 1 and 2 showcase the two AOI distributions, applied to the normal sinus rhythm ECG, along with the name of each AOI used to process the data. The first distribution is a grid distribution in which the ECG is divided into 24 equal areas, taking the ECG leads into consideration. The second distribution divides the ECG image into two equal AOIs: the first groups the 12 leads composing the electrocardiogram, each spanning 2.5 seconds across the x-axis, while the second groups the three lower rhythm strips that span 10 seconds across the x-axis.

The data is in the format of frames captured by the eye tracker during the 30 seconds each participant spent on each of the ten selected ECGs. Each frame is a line in the dataset file in which the eye tracking metrics detailed in Table 3, including fixation and gaze data, are recorded for each specified AOI. The grid-based AOI file contains 15373 lines, and the long vs. short AOI file contains 1891 lines.
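As a quick worked example of the sampling arithmetic above (the 60 Hz frame rate and the 30-second limit are taken from the Methods section; the 45-frame fixation is an invented illustrative value):

```python
# At 60 frames per second, a 30-second interpretation yields at most
# 30 * 60 = 1800 frames, and a fixation spanning k consecutive frames
# lasts roughly k / 60 seconds.
SAMPLE_RATE_HZ = 60            # Tobii Pro X2-60 sampling frequency
TIME_LIMIT_S = 30              # time allowed per ECG

max_frames_per_ecg = SAMPLE_RATE_HZ * TIME_LIMIT_S           # 1800 frames
frames_in_fixation = 45                                      # illustrative value
fixation_duration_s = frames_in_fixation / SAMPLE_RATE_HZ    # 0.75 seconds
print(max_frames_per_ecg, fixation_duration_s)
```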
The first of the two dataset files, Grid_Anonymized.csv, can be used to analyze the eye tracking data according to the grid-based structure, as defined in the grid sample normal sinus rhythm ECG in the AOI_Distributions folder.
The second file, Long_Short_Anonymized.csv, can be used to analyze the eye tracking data according to the long versus short leads structure, as defined in the long versus short sample normal sinus rhythm ECG in the AOI_Distributions folder.
Usage Notes
The data has been used to understand the visual expertise of the different health practitioners described in Table 1 in relation to their ECG interpretation accuracy. The dataset contributed to the published papers referenced below [3-5]. The dataset has the potential to be re-used by:
- Uncovering further visual expertise insights within the set of medical practitioners who contributed to the experiment, by analyzing eye tracking features other than the ones covered in the referenced work [3-5] (a minimal sketch follows this list).
- Comparing eye movement interpretation patterns with a different set of medical practitioners by replicating the same study settings. All the details of the methodology are described in our referenced work [3-5].
- This will eventually contribute to enlarging the dataset by including more diverse demographics of medical practitioners from different health institutions.
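Below is a hedged sketch of the first re-use idea, comparing a single eye tracking feature across expertise categories. The column names "Expertise" and "Fixation Count" are placeholders for the actual headers, and the Kruskal-Wallis test is just one possible choice, not necessarily the analysis used in the referenced papers [3-5].

```python
# Compare one eye tracking feature across expertise categories.
# "Expertise" and "Fixation Count" are hypothetical column names; if the
# anonymized files do not carry an expertise column, it would need to be
# joined in from the demographics in Table 1.
import pandas as pd
from scipy import stats

df = pd.read_csv("Grid_Anonymized.csv")
groups = [g["Fixation Count"].dropna() for _, g in df.groupby("Expertise")]

# Kruskal-Wallis H-test: do the expertise categories differ on this feature?
h_stat, p_value = stats.kruskal(*groups)
print(f"H = {h_stat:.2f}, p = {p_value:.4f}")
```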
It is important to note the known limitations that users should be aware of when using this dataset:
- Each participant was allowed only 30 seconds to look at each of the ten ECGs used in the experiment.
- Participants were not given any prior information about the patient history related to the ECG to be interpreted.
- The standard deviation of the eye tracker calibration did not exceed 0.19 inches on a 13.3-inch laptop monitor. The eye tracker was re-calibrated for each participant. All the details of the experiment design can be found in our published study [3].
We suggest below some code and datasets that relate to this one and could be of interest to the user community.
- The ECG analysis conducted and the dataset collected by Davies A. at the University of Manchester, available in the laboratory's GitHub repository [9]. That dataset uses different ECGs but similar heart abnormality categories. The analysis is also published by the author in [2].
Ethics
This study received institutional review board approval from the ethical board of the Qatar Biomedical Research Institute at Hamad bin Khalifa University under the research protocol number QBRI-IRB-2020-01-009. Approvals were granted before the start of the experiment. The Institutional review board approval guarantees that all study methods were conducted following the guidelines and recommendations of international regulatory agencies. All the participants in the study gave written informed consent.
Acknowledgements
MTS would like to thank Dr. Yahya Sqalli Houssaini and Ahmed Kachkach for the discussion and insights. MTS would also like to thank Kiara Heide from iMotions for the onboarding training. The authors would like to thank the Industry Development and Knowledge Transfer Unit at Qatar Foundation. Finally, the authors would like to thank all the volunteering participants who contributed with their electrocardiogram interpretations.
Conflicts of Interest
The authors have no conflicts of interest to declare.
References
- Cadet JV (2009). "Report: Cost savings will drive ECG global market to nearly $160M by 2015". Cardiovascular Business.
- Davies A, Mueller J, Horseman L, Splendiani B, Hill E, Vigo M, Harper S, Jay C (2019). "How do healthcare practitioners read electrocardiograms? A dual-process model of electrocardiogram interpretation". Vol. 14, British Journal of Cardiac Nursing. Mark Allen Group. p. 1–19.
- Sqalli MT, Al-Thani D, Elshazly MB, Al-Hijji M (2021). "Interpretation of a 12-Lead Electrocardiogram by Medical Students: Quantitative Eye-Tracking Approach". Vol. 7, JMIR Medical Education. JMIR Publications Inc. p. e26675.
- Sqalli MT, Al-Thani D, Elshazly MB, Al-Hijji M, Alahmadi A, Sqalli Houssaini Y (2022). "Understanding Cardiology Practitioners’ Interpretations of Electrocardiograms: An Eye-Tracking Study". Vol. 9, JMIR Human Factors. JMIR Publications Inc. p. e34058.
- Sqalli MT, Al-Thani D, Elshazly MB, Al-Hijji M, Houssaini YS (2021). "The Journey Towards an Accurate Electrocardiogram Interpretation: An Eye-tracking Study Overview". 8th International Conference on Behavioral and Social Computing (BESC). IEEE.
- Farnsworth B (2020). "10 Most Used Eye Tracking Metrics and Terms." iMotions. Available from: https://imotions.com/blog/10-terms-metrics-eye-tracking/
- Farnsworth B (2019). "Types of Eye Movements [Saccades and Beyond]." iMotions. Available from: https://imotions.com/blog/types-of-eye-movements/
- Anneli O (2012). "The Tobii I-VT Fixation Filter Algorithm Description". Available from: http://www.vinis.co.kr/ivt_filter.pdf
- Davies A (2018). "ECG Analysis". Interaction Analysis and Modelling Laboratory. GitHub repository. Available from: https://github.com/IAM-lab/ECG-analysis.git
Access
Access Policy:
Anyone can access the files, as long as they conform to the terms of the specified license.
License (for files):
Open Data Commons Open Database License v1.0
Discovery
DOI (version 1.0.0):
https://doi.org/10.13026/gsr5-8b11
DOI (latest version):
https://doi.org/10.13026/v5w5-n117
Topics:
human vision
medical students
ecg interpretation
medical image interpretation
medical practice
medical education
human-computer interaction
eye-tracking
medical practitioners
visual expertise
ecg
electrocardiogram
Files
Total uncompressed size: 30.5 MB.
Access the files
- Download the ZIP file (25.9 MB)
- Download the files using your terminal:
  wget -r -N -c -np https://physionet.org/files/eye-tracking-ecg/1.0.0/
- Download the files using AWS command line tools:
  aws s3 sync --no-sign-request s3://physionet-open/eye-tracking-ecg/1.0.0/ DESTINATION
Name | Size | Modified
---|---|---
Table_1_Demographics.pdf | 112.5 KB | 2021-12-12
Table_2_ECG_abnormalities.pdf | 93.7 KB | 2021-12-12
Table_3_Eye_Tracking_Features.pdf | 64.7 KB | 2021-12-12