
Article: Humans have idiosyncratic and task-specific scanpaths for judging faces

Title: Humans have idiosyncratic and task-specific scanpaths for judging faces
Authors: Kanan, C; Bseiso, DN; Ray, NA; Hsiao, JHW; Cottrell, GW
Keywords: Eye movements; Face perception; Machine learning; Scanpath routines
Issue Date: 2015
Citation: Vision Research, 2015. DOI: 10.1016/j.visres.2015.01.013
Abstract: Since Yarbus's seminal work, vision scientists have argued that our eye movement patterns differ depending upon our task. This has recently motivated the creation of multi-fixation pattern analysis algorithms that try to infer a person's task (or mental state) from their eye movements alone. Here, we introduce new algorithms for multi-fixation pattern analysis, and we use them to argue that people have scanpath routines for judging faces. We tested our methods on the eye movements of subjects as they made six distinct judgments about faces. We found that our algorithms could detect whether a participant is trying to distinguish angriness, happiness, trustworthiness, tiredness, attractiveness, or age. However, our algorithms were more accurate at inferring a subject's task when only trained on data from that subject than when trained on data gathered from other subjects, and we were able to infer the identity of our subjects using the same algorithms. These results suggest that (1) individuals have scanpath routines for judging faces, and that (2) these are diagnostic of that subject, but that (3) at least for the tasks we used, subjects do not converge on the same "ideal" scanpath pattern. Whether universal scanpath patterns exist for a task, we suggest, depends on the task's constraints and the level of expertise of the subject.
Persistent Identifier: http://hdl.handle.net/10722/212302
ISI Accession Number ID: WOS:000350781100008
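
The abstract describes multi-fixation pattern analysis only at a high level, so a small illustration may help. The sketch below is not the algorithm from the paper; it is a minimal example of the general idea: each scanpath is reduced to a fixation-density histogram over a coarse spatial grid, and a standard classifier is trained to decode which of the six judgment tasks produced it. The grid size, the synthetic scanpaths, and the scikit-learn classifier are illustrative assumptions only.

# Minimal sketch of multi-fixation pattern analysis (NOT the algorithm from the
# paper): represent each trial's scanpath as a fixation-density histogram over a
# coarse spatial grid, then train a standard classifier to predict the task.
# All data below are synthetic; the grid size, features, and classifier are
# illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
GRID = 8                 # 8x8 spatial bins over the face image
N_TASKS = 6              # angry, happy, trustworthy, tired, attractive, age
TRIALS_PER_TASK = 50
FIXATIONS_PER_TRIAL = 10

def scanpath_to_histogram(fixations, grid=GRID):
    """Bin (x, y) fixations (normalized to [0, 1]) into a flattened grid histogram."""
    hist, _, _ = np.histogram2d(fixations[:, 0], fixations[:, 1],
                                bins=grid, range=[[0, 1], [0, 1]])
    return (hist / hist.sum()).ravel()

# Synthetic scanpaths: each task biases fixations toward a different face region.
X, y = [], []
for task in range(N_TASKS):
    center = rng.uniform(0.3, 0.7, size=2)   # fake task-specific region of interest
    for _ in range(TRIALS_PER_TASK):
        fix = rng.normal(center, 0.15, size=(FIXATIONS_PER_TRIAL, 2)).clip(0, 1)
        X.append(scanpath_to_histogram(fix))
        y.append(task)
X, y = np.array(X), np.array(y)

# Cross-validated accuracy of inferring the task from the scanpath histogram.
clf = LogisticRegression(max_iter=1000)
print("task-decoding accuracy:", cross_val_score(clf, X, y, cv=5).mean())

In the same spirit, comparing a within-subject train/test split against a leave-one-subject-out split would mirror the paper's contrast between subject-specific and cross-subject task decoding.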

 

DC Field: Value
dc.contributor.author: Kanan, C
dc.contributor.author: Bseiso, DN
dc.contributor.author: Ray, NA
dc.contributor.author: Hsiao, JHW
dc.contributor.author: Cottrell, GW
dc.date.accessioned: 2015-07-21T02:31:17Z
dc.date.available: 2015-07-21T02:31:17Z
dc.date.issued: 2015
dc.identifier.citation: Vision Research
dc.identifier.uri: http://hdl.handle.net/10722/212302
dc.description.abstract: (same as the Abstract above)
dc.language: eng
dc.relation.ispartof: Vision Research
dc.subject: Eye movements
dc.subject: Face perception
dc.subject: Machine learning
dc.subject: Scanpath routines
dc.title: Humans have idiosyncratic and task-specific scanpaths for judging faces
dc.type: Article
dc.identifier.email: Hsiao, JHW: jhsiao@hku.hk
dc.identifier.authority: Hsiao, JHW=rp00632
dc.identifier.doi: 10.1016/j.visres.2015.01.013
dc.identifier.scopus: eid_2-s2.0-84922440390
dc.identifier.hkuros: 245536
dc.identifier.isi: WOS:000350781100008
