Article: Multiview multitask gaze estimation with deep convolutional neural networks

Title: Multiview multitask gaze estimation with deep convolutional neural networks
Authors: Lian, Dongze; Hu, Lina; Luo, Weixin; Xu, Yanyu; Duan, Lixin; Yu, Jingyi; Gao, Shenghua
Keywords: Convolutional neural networks (CNNs); gaze tracking; multitask learning (MTL); multiview learning
Issue Date: 2019
Citation: IEEE Transactions on Neural Networks and Learning Systems, 2019, v. 30, n. 10, p. 3010-3023
Abstract: Gaze estimation, which aims to predict gaze points from given eye images, is an important task in computer vision because of its applications in human visual attention understanding. Many existing methods are based on a single camera, and most of them focus only on either gaze point estimation or gaze direction estimation. In this paper, we propose a novel multitask method for gaze point estimation using multiview cameras. Specifically, we analyze the close relationship between gaze point estimation and gaze direction estimation, and we use a partially shared convolutional neural network architecture to simultaneously estimate the gaze direction and gaze point. Furthermore, we introduce a new multiview gaze tracking data set that consists of multiview eye images of different subjects. As far as we know, it is the largest multiview gaze tracking data set. Comprehensive experiments on our multiview gaze tracking data set and existing data sets demonstrate that our multiview multitask gaze point estimation solution consistently outperforms existing methods.
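The partially shared multitask idea described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' network: the shared trunk is a single linear-plus-ReLU stand-in for the shared convolutional layers, the feature dimensions and the 0.5 loss weight are assumptions, and the targets are dummy zeros. The point is only the structure: one shared representation feeding two task-specific heads (gaze direction and gaze point), trained with a weighted sum of the two task losses.

```python
import numpy as np

rng = np.random.default_rng(0)

def shared_trunk(x, w):
    # Stand-in for the partially shared convolutional layers:
    # one linear map followed by ReLU.
    return np.maximum(x @ w, 0.0)

# Hypothetical dimensions: 64-dim eye-image features, 32-dim shared representation.
w_shared = rng.standard_normal((64, 32)) * 0.1
w_dir = rng.standard_normal((32, 2)) * 0.1    # gaze-direction head (yaw, pitch)
w_point = rng.standard_normal((32, 2)) * 0.1  # gaze-point head (screen x, y)

x = rng.standard_normal((4, 64))       # a batch of 4 per-view eye features
h = shared_trunk(x, w_shared)          # representation shared by both tasks
gaze_dir = h @ w_dir                   # task 1: gaze direction
gaze_point = h @ w_point               # task 2: gaze point

def mse(pred, target):
    return float(np.mean((pred - target) ** 2))

# Multitask objective: weighted sum of the two task losses
# (the 0.5 weight and the zero targets are placeholders).
loss = mse(gaze_dir, np.zeros((4, 2))) + 0.5 * mse(gaze_point, np.zeros((4, 2)))
```

Sharing the trunk lets the two related tasks regularize each other, while the separate heads keep the output spaces (angles vs. screen coordinates) distinct.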
Persistent Identifier: http://hdl.handle.net/10722/345231
ISSN: 2162-237X
2023 Impact Factor: 10.2
2023 SCImago Journal Rankings: 4.170


DC Field: Value
dc.contributor.author: Lian, Dongze
dc.contributor.author: Hu, Lina
dc.contributor.author: Luo, Weixin
dc.contributor.author: Xu, Yanyu
dc.contributor.author: Duan, Lixin
dc.contributor.author: Yu, Jingyi
dc.contributor.author: Gao, Shenghua
dc.date.accessioned: 2024-08-15T09:26:03Z
dc.date.available: 2024-08-15T09:26:03Z
dc.date.issued: 2019
dc.identifier.citation: IEEE Transactions on Neural Networks and Learning Systems, 2019, v. 30, n. 10, p. 3010-3023
dc.identifier.issn: 2162-237X
dc.identifier.uri: http://hdl.handle.net/10722/345231
dc.description.abstract: Gaze estimation, which aims to predict gaze points from given eye images, is an important task in computer vision because of its applications in human visual attention understanding. Many existing methods are based on a single camera, and most of them focus only on either gaze point estimation or gaze direction estimation. In this paper, we propose a novel multitask method for gaze point estimation using multiview cameras. Specifically, we analyze the close relationship between gaze point estimation and gaze direction estimation, and we use a partially shared convolutional neural network architecture to simultaneously estimate the gaze direction and gaze point. Furthermore, we introduce a new multiview gaze tracking data set that consists of multiview eye images of different subjects. As far as we know, it is the largest multiview gaze tracking data set. Comprehensive experiments on our multiview gaze tracking data set and existing data sets demonstrate that our multiview multitask gaze point estimation solution consistently outperforms existing methods.
dc.language: eng
dc.relation.ispartof: IEEE Transactions on Neural Networks and Learning Systems
dc.subject: Convolutional neural networks (CNNs)
dc.subject: gaze tracking
dc.subject: multitask learning (MTL)
dc.subject: multiview learning
dc.title: Multiview multitask gaze estimation with deep convolutional neural networks
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/TNNLS.2018.2865525
dc.identifier.pmid: 30183647
dc.identifier.scopus: eid_2-s2.0-85052799761
dc.identifier.volume: 30
dc.identifier.issue: 10
dc.identifier.spage: 3010
dc.identifier.epage: 3023
dc.identifier.eissn: 2162-2388
