Conference Paper: Visual psychophysics on the web: open-access tools, experiments, and results using online platforms

Title: Visual psychophysics on the web: open-access tools, experiments, and results using online platforms
Authors: Rajananda, S; Peters, MAK; Lau, HW; Odegaard, B
Issue Date: 2018
Publisher: Association for Research in Vision and Ophthalmology. The Journal's web site is located at http://www.journalofvision.org/
Citation: 18th Annual Meeting of Vision Sciences Society (VSS 2018), St. Pete Beach, FL, 18-23 May 2018. In Journal of Vision, 2018, v. 18, n. 10, p. 299-299
Abstract: In the last several years, web-based experiments with visual stimuli have become increasingly common as researchers have utilized online paradigms to facilitate fast data collection with large samples. However, few open-access tools exist for conducting rigorous visual psychophysical studies on the internet. Here, we present new tools to enable vision science in web browsers, as well as sample experiments and results that demonstrate their viability. These tools include several methods to estimate psychophysical threshold parameters that run entirely in JavaScript/CSS/HTML, including the PEST adaptive staircase procedure and the Confidence Signal Detection model (Yi & Merfeld, 2016), which leverages confidence judgments to estimate thresholds with a small number of trials. We also present the first open-access random-dot kinematogram that runs entirely in web browsers and includes parameters to customize coherence levels, aperture shape, dot size, and other features. Our initial experiments on human motion perception demonstrate three important findings: (1) with our tools, motion threshold parameters estimated from online subjects are comparable to those estimated in controlled laboratory environments; (2) our web-based implementation of new methods facilitates faster threshold estimation than traditional methods; (3) data from online subjects indicate that these participants are much more demographically diverse than those in university laboratory studies. We have also developed new paradigms for testing peripheral color perception online, and results show that observers often overestimate how saturated parafoveal visual stimuli truly are. Finally, we will discuss results from recent investigations of differences between foveal and parafoveal motion perception. Together, these experiments demonstrate that despite sacrificing a degree of experimental control, rigorous web-based psychophysics is quite possible, as our initial results provide promising evidence to motivate future development of online tools for vision science.
Description: abstract
Persistent Identifier: http://hdl.handle.net/10722/276274
ISSN: 1534-7362
2021 Impact Factor: 2.004
2020 SCImago Journal Rankings: 1.126
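
The abstract above names the PEST adaptive staircase as one of the threshold-estimation methods implemented in JavaScript. As a rough illustration of how such a variable-step procedure works, here is a minimal sketch: a Wald-style run count decides when to change the tested level, the step size halves on a reversal, and it doubles after repeated same-direction steps. This is a simplified sketch of the general technique, not the published tool's code; the class name, parameter names, and default values (PestStaircase, startLevel, wald, and so on) are assumptions for illustration.

```javascript
// Minimal PEST-style adaptive staircase (illustrative sketch only; not
// the tool described in the abstract, names and defaults are assumed).
class PestStaircase {
  constructor({ startLevel = 0.5, startStep = 0.16, minStep = 0.01,
                targetP = 0.75, wald = 1 } = {}) {
    this.level = startLevel;   // stimulus intensity, e.g. motion coherence in [0, 1]
    this.step = startStep;     // current step size (PEST varies this over trials)
    this.minStep = minStep;    // stop once steps are this small
    this.targetP = targetP;    // proportion correct the procedure tracks
    this.wald = wald;          // deviation limit of the Wald-style run test
    this.nTrials = 0;          // trials run at the current level
    this.nCorrect = 0;         // correct responses at the current level
    this.lastDir = 0;          // -1 = last step was down, +1 = up, 0 = none yet
    this.runLength = 0;        // consecutive steps in the same direction
  }

  // Record one trial's outcome; returns the level for the next trial.
  update(correct) {
    this.nTrials += 1;
    if (correct) this.nCorrect += 1;
    const expected = this.targetP * this.nTrials;
    let dir = 0;
    if (this.nCorrect - expected > this.wald) dir = -1;      // too easy: go harder
    else if (expected - this.nCorrect > this.wald) dir = +1; // too hard: go easier
    if (dir === 0) return this.level;  // keep sampling the current level

    // Simplified PEST step-size rules: halve the step on a reversal,
    // double it after three or more steps in the same direction.
    if (this.lastDir === 0) {
      this.runLength = 1;              // first step: keep the starting step
    } else if (dir === this.lastDir) {
      this.runLength += 1;
      if (this.runLength >= 3) this.step *= 2;
    } else {
      this.step /= 2;
      this.runLength = 1;
    }
    this.lastDir = dir;
    this.level = Math.min(1, Math.max(0, this.level + dir * this.step));
    this.nTrials = 0;                  // restart counts at the new level
    this.nCorrect = 0;
    return this.level;
  }

  done() { return this.step < this.minStep; }  // convergence criterion
}
```

Because the step size shrinks only once the procedure hovers near the tracked performance level, the current level when done() returns true can serve as the threshold estimate.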
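The abstract also describes a browser-based random-dot kinematogram with adjustable coherence, aperture, and dot size. A minimal sketch of the core per-frame update on an HTML canvas might look like the following; the parameter names (nDots, coherence, apertureRadius, and so on) are assumptions for illustration, not the published plugin's API.

```javascript
// Minimal random-dot kinematogram frame update on an HTML canvas
// (illustrative sketch only; parameter names are assumed).
function createRdk({ nDots = 100, coherence = 0.5, speed = 2,
                     direction = 0, apertureRadius = 150, dotRadius = 2 } = {}) {
  // Place dots uniformly inside a circular aperture centred at (0, 0).
  const dots = Array.from({ length: nDots }, () => {
    const r = apertureRadius * Math.sqrt(Math.random());
    const a = 2 * Math.PI * Math.random();
    return { x: r * Math.cos(a), y: r * Math.sin(a) };
  });

  // Draw one frame; (cx, cy) is the aperture centre in canvas coordinates.
  function step(ctx, cx, cy) {
    ctx.clearRect(cx - apertureRadius, cy - apertureRadius,
                  2 * apertureRadius, 2 * apertureRadius);
    for (const d of dots) {
      // Each dot moves in the signal direction with probability
      // `coherence`, otherwise in a fresh random direction (noise dots).
      const theta = Math.random() < coherence
        ? direction
        : 2 * Math.PI * Math.random();
      d.x += speed * Math.cos(theta);
      d.y += speed * Math.sin(theta);
      // Re-insert dots that drift outside the circular aperture.
      if (Math.hypot(d.x, d.y) > apertureRadius) {
        const r = apertureRadius * Math.sqrt(Math.random());
        const a = 2 * Math.PI * Math.random();
        d.x = r * Math.cos(a);
        d.y = r * Math.sin(a);
      }
      ctx.beginPath();
      ctx.arc(cx + d.x, cy + d.y, dotRadius, 0, 2 * Math.PI);
      ctx.fill();
    }
  }
  return { dots, step };
}

// Usage (assuming a <canvas id="stim"> element on the page):
// const ctx = document.getElementById('stim').getContext('2d');
// const rdk = createRdk({ coherence: 0.3, direction: Math.PI / 2 });
// (function loop() { rdk.step(ctx, 200, 200); requestAnimationFrame(loop); })();
```

Driving step from a requestAnimationFrame loop produces the moving-dot display; coherence then directly sets the fraction of dots carrying the motion signal on each frame, which is the quantity the staircase above would adjust.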
DC Field: Value
dc.contributor.author: Rajananda, S
dc.contributor.author: Peters, MAK
dc.contributor.author: Lau, HW
dc.contributor.author: Odegaard, B
dc.date.accessioned: 2019-09-10T02:59:35Z
dc.date.available: 2019-09-10T02:59:35Z
dc.date.issued: 2018
dc.identifier.citation: 18th Annual Meeting of Vision Sciences Society (VSS 2018), St. Pete Beach, FL, 18-23 May 2018. In Journal of Vision, 2018, v. 18, n. 10, p. 299-299
dc.identifier.issn: 1534-7362
dc.identifier.uri: http://hdl.handle.net/10722/276274
dc.description: abstract
dc.description.abstract: In the last several years, web-based experiments with visual stimuli have become increasingly common as researchers have utilized online paradigms to facilitate fast data collection with large samples. However, few open-access tools exist for conducting rigorous visual psychophysical studies on the internet. Here, we present new tools to enable vision science in web browsers, as well as sample experiments and results that demonstrate their viability. These tools include several methods to estimate psychophysical threshold parameters that run entirely in JavaScript/CSS/HTML, including the PEST adaptive staircase procedure and the Confidence Signal Detection model (Yi & Merfeld, 2016), which leverages confidence judgments to estimate thresholds with a small number of trials. We also present the first open-access random-dot kinematogram that runs entirely in web browsers and includes parameters to customize coherence levels, aperture shape, dot size, and other features. Our initial experiments on human motion perception demonstrate three important findings: (1) with our tools, motion threshold parameters estimated from online subjects are comparable to those estimated in controlled laboratory environments; (2) our web-based implementation of new methods facilitates faster threshold estimation than traditional methods; (3) data from online subjects indicate that these participants are much more demographically diverse than those in university laboratory studies. We have also developed new paradigms for testing peripheral color perception online, and results show that observers often overestimate how saturated parafoveal visual stimuli truly are. Finally, we will discuss results from recent investigations of differences between foveal and parafoveal motion perception. Together, these experiments demonstrate that despite sacrificing a degree of experimental control, rigorous web-based psychophysics is quite possible, as our initial results provide promising evidence to motivate future development of online tools for vision science.
dc.language: eng
dc.publisher: Association for Research in Vision and Ophthalmology. The Journal's web site is located at http://www.journalofvision.org/
dc.relation.ispartof: Journal of Vision
dc.title: Visual psychophysics on the web: open-access tools, experiments, and results using online platforms
dc.type: Conference_Paper
dc.identifier.email: Lau, HW: oldchild@hku.hk
dc.identifier.authority: Lau, HW=rp02270
dc.identifier.doi: 10.1167/18.10.299
dc.identifier.hkuros: 304560
dc.identifier.volume: 18
dc.identifier.issue: 10
dc.identifier.spage: 299
dc.identifier.epage: 299
dc.publisher.place: United States
dc.identifier.issnl: 1534-7362
