Article: Tracking gaze position from EEG: Exploring the possibility of an EEG-based virtual eye-tracker

Title: Tracking gaze position from EEG: Exploring the possibility of an EEG-based virtual eye-tracker
Authors: Sun, Rui; Cheng, Andy SK; Chan, Cynthia; Hsiao, Janet; Privitera, Adam J; Gao, Junling; Fong, Ching hang; Ding, Ruoxi; Tang, Akaysha C
Keywords: BSS; eye movement; high-density EEG; ICA; saccade; smooth pursuit; SOBI
Issue Date: 1-Oct-2023
Publisher: Wiley Open Access
Citation: Brain and Behavior, 2023, v. 13, n. 10
Abstract: Introduction: Ocular artifact has long been viewed as an impediment to the interpretation of electroencephalogram (EEG) signals in basic and applied research. Today, the use of blind source separation (BSS) methods, including independent component analysis (ICA) and second-order blind identification (SOBI), is considered an essential step in improving the quality of neural signals. Recently, we introduced a method consisting of SOBI and a discriminant and similarity (DANS)-based identification method, capable of identifying and extracting eye movement–related components. These recovered components can be localized within ocular structures with a high goodness of fit (>95%). This raised the possibility that such EEG-derived SOBI components may be used to build predictive models for tracking gaze position. Methods: As proof of this new concept, we designed an EEG-based virtual eye-tracker (EEG-VET) for tracking eye movement from EEG alone. The EEG-VET is composed of a SOBI algorithm for separating EEG signals into different components, a DANS algorithm for automatically identifying ocular components, and a linear model to transform ocular components into gaze positions. Results: The prototype of EEG-VET achieved an accuracy of 0.920° and precision of 1.510° of visual angle in the best participant, and an average accuracy of 1.008° ± 0.357° and a precision of 2.348° ± 0.580° of visual angle across all participants (N = 18). Conclusion: This work offers a novel approach that readily co-registers eye movement and neural signals from a single EEG recording, thus increasing the ease of studying neural mechanisms underlying natural cognition in the context of free eye movement.
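The abstract describes a three-stage pipeline: SOBI separates EEG into components, DANS flags the ocular ones, and a linear model maps those components to gaze position. The final stage can be sketched as an ordinary least-squares fit, as below; the function names, synthetic data, and intercept handling are illustrative assumptions, not the authors' actual implementation, which is detailed only in the paper itself.

```python
import numpy as np

def fit_gaze_model(ocular_components, gaze_xy):
    """Fit a linear map from ocular component amplitudes to (x, y) gaze.

    ocular_components: (n_samples, n_components) SOBI/DANS component time courses
    gaze_xy: (n_samples, 2) ground-truth gaze from a conventional eye-tracker
    Returns W: (n_components + 1, 2) weights, last row is the intercept.
    """
    X = np.column_stack([ocular_components, np.ones(len(ocular_components))])
    W, *_ = np.linalg.lstsq(X, gaze_xy, rcond=None)
    return W

def predict_gaze(ocular_components, W):
    """Apply the fitted linear model to new component amplitudes."""
    X = np.column_stack([ocular_components, np.ones(len(ocular_components))])
    return X @ W

# Synthetic sanity check: gaze that is exactly linear in two component
# amplitudes should be recovered by the least-squares fit.
rng = np.random.default_rng(0)
comps = rng.normal(size=(500, 2))
true_W = np.array([[3.0, 0.5], [-1.0, 2.0], [0.1, -0.2]])  # incl. intercept row
gaze = np.column_stack([comps, np.ones(500)]) @ true_W
W = fit_gaze_model(comps, gaze)
err = np.abs(predict_gaze(comps, W) - gaze).max()
```

In practice the model would be calibrated once against a conventional eye-tracker, after which gaze is predicted from EEG alone.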
Persistent Identifier: http://hdl.handle.net/10722/347987
ISSN: 2162-3279
2023 Impact Factor: 2.6
2023 SCImago Journal Rankings: 0.908

 

DC Field | Value | Language
dc.contributor.author | Sun, Rui | -
dc.contributor.author | Cheng, Andy SK | -
dc.contributor.author | Chan, Cynthia | -
dc.contributor.author | Hsiao, Janet | -
dc.contributor.author | Privitera, Adam J | -
dc.contributor.author | Gao, Junling | -
dc.contributor.author | Fong, Ching hang | -
dc.contributor.author | Ding, Ruoxi | -
dc.contributor.author | Tang, Akaysha C | -
dc.date.accessioned | 2024-10-04T00:30:46Z | -
dc.date.available | 2024-10-04T00:30:46Z | -
dc.date.issued | 2023-10-01 | -
dc.identifier.citation | Brain and Behavior, 2023, v. 13, n. 10 | -
dc.identifier.issn | 2162-3279 | -
dc.identifier.uri | http://hdl.handle.net/10722/347987 | -
dc.description.abstract | Introduction: Ocular artifact has long been viewed as an impediment to the interpretation of electroencephalogram (EEG) signals in basic and applied research. Today, the use of blind source separation (BSS) methods, including independent component analysis (ICA) and second-order blind identification (SOBI), is considered an essential step in improving the quality of neural signals. Recently, we introduced a method consisting of SOBI and a discriminant and similarity (DANS)-based identification method, capable of identifying and extracting eye movement–related components. These recovered components can be localized within ocular structures with a high goodness of fit (>95%). This raised the possibility that such EEG-derived SOBI components may be used to build predictive models for tracking gaze position. Methods: As proof of this new concept, we designed an EEG-based virtual eye-tracker (EEG-VET) for tracking eye movement from EEG alone. The EEG-VET is composed of a SOBI algorithm for separating EEG signals into different components, a DANS algorithm for automatically identifying ocular components, and a linear model to transform ocular components into gaze positions. Results: The prototype of EEG-VET achieved an accuracy of 0.920° and precision of 1.510° of visual angle in the best participant, and an average accuracy of 1.008° ± 0.357° and a precision of 2.348° ± 0.580° of visual angle across all participants (N = 18). Conclusion: This work offers a novel approach that readily co-registers eye movement and neural signals from a single EEG recording, thus increasing the ease of studying neural mechanisms underlying natural cognition in the context of free eye movement. | -
dc.language | eng | -
dc.publisher | Wiley Open Access | -
dc.relation.ispartof | Brain and Behavior | -
dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | -
dc.subject | BSS | -
dc.subject | eye movement | -
dc.subject | high-density EEG | -
dc.subject | ICA | -
dc.subject | saccade | -
dc.subject | smooth pursuit | -
dc.subject | SOBI | -
dc.title | Tracking gaze position from EEG: Exploring the possibility of an EEG-based virtual eye-tracker | -
dc.type | Article | -
dc.identifier.doi | 10.1002/brb3.3205 | -
dc.identifier.pmid | 37721530 | -
dc.identifier.scopus | eid_2-s2.0-85171327219 | -
dc.identifier.volume | 13 | -
dc.identifier.issue | 10 | -
dc.identifier.eissn | 2157-9032 | -
dc.identifier.issnl | 2162-3279 | -
