dc.contributor.author: Lokka, Ismini E.
dc.contributor.author: Çöltekin, Arzu
dc.contributor.editor: Kiefer, Peter
dc.contributor.editor: Giannopoulos, Ioannis
dc.contributor.editor: Göbel, Fabian
dc.contributor.editor: Raubal, Martin
dc.contributor.editor: Duchowski, Andrew T.
dc.date.accessioned: 2017-12-20T11:05:32Z
dc.date.available: 2017-12-19T07:45:00Z
dc.date.available: 2017-12-19T09:21:24Z
dc.date.available: 2017-12-20T11:05:32Z
dc.date.issued: 2018-01-14
dc.identifier.uri: http://hdl.handle.net/20.500.11850/222473
dc.identifier.doi: 10.3929/ethz-b-000222473
dc.description.abstract: In its broader scope, this paper is concerned with understanding how (visualization) designs of virtual environments (VEs) interact with navigational memory. We optimized the design of a VE for route learning following specific visualization guidelines derived from previous literature, and tested it with a typical navigational recall task with 42 participants. Our participants' recall accuracies vary widely. We hypothesize that by comparatively analyzing the eye movements of high- and low-performing participants, we can better understand this variability and identify whether these two groups rely on different visual strategies. Such efforts inform visualization designs, and in turn, these designs can better assist people. Those who perform poorly in navigational tasks, for reasons such as lack of training or differences in visuospatial abilities, might especially benefit from such assistance. In this paper, we present our concept for a work-in-progress study and provide the relevant background. (en_US)
dc.format: application/pdf
dc.language.iso: en (en_US)
dc.publisher: ETH Zurich (en_US)
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: navigation (en_US)
dc.subject: virtual environments (en_US)
dc.subject: visual strategies (en_US)
dc.title: A virtual reality experiment for improving the navigational recall: What can we learn from eye movements of high- and low-performing individuals? (en_US)
dc.type: Conference Paper
dc.rights.license: Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International
dc.date.published: 2017-12-19
ethz.book.title: Eye Tracking for Spatial Research, Proceedings of the 3rd International Workshop (en_US)
ethz.size: 6 p. (en_US)
ethz.event: 3rd International Workshop on Eye Tracking for Spatial Research (en_US)
ethz.event.location: Zurich, Switzerland (en_US)
ethz.event.date: January 14, 2018 (en_US)
ethz.publication.place: Zurich (en_US)
ethz.publication.status: published (en_US)
ethz.date.deposited: 2017-12-19T07:45:00Z
ethz.source: FORM
ethz.eth: yes (en_US)
ethz.availability: Open access (en_US)
ethz.rosetta.installDate: 2017-12-19T09:21:27Z
ethz.rosetta.lastUpdated: 2022-03-28T18:42:59Z
ethz.rosetta.versionExported: true