ACM MobileHCI 2015: International Conference on Human-Computer Interaction with Mobile Devices and Services

LEaD: Utilizing Light Movement as Peripheral Visual Guidance for Scooter Navigation

Hung-Yu Tseng, Rong-Hao Liang, Liwei Chan, Bing-Yu Chen

National Taiwan University

ACM Digital Library

Abstract

This work presents LEaD, a helmet-based visual guidance system that utilizes light movement in scooter drivers’ peripheral vision for turn-by-turn navigation. A linear light strip mounted on the helmet guides scooter drivers using simple 1D light movement, which can be easily acquired and identified by peripheral vision while foveal vision remains on the ongoing driving task. User studies suggest that this novel system can effectively direct scooter drivers without introducing visual distractions in route-guided experiences.
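For illustration only (not taken from the paper), below is a minimal sketch of how a 1D light-movement cue could be rendered on a linear strip: a single lit pixel sweeps from the center of the strip toward the turn side, simulated here in the terminal rather than on helmet hardware. The strip length, frame timing, and sweep pattern are assumptions, not details of the LEaD prototype.

# Illustrative sketch only -- not the authors' implementation.
# Simulates a 1D "light movement" cue on a linear strip of pixels:
# a bright dot sweeps from the center toward the turn direction,
# the kind of peripheral motion cue the abstract describes.
# Strip length, frame rate, and sweep behavior are all assumptions.

import time

NUM_PIXELS = 16          # assumed strip length
FRAME_DELAY_S = 0.08     # assumed animation speed

def render(active_index):
    """Return a text rendering of the strip with one lit pixel."""
    return "".join("O" if i == active_index else "." for i in range(NUM_PIXELS))

def sweep(direction, cycles=3):
    """Animate a dot sweeping from the strip's center toward one end.

    direction: 'left' or 'right' turn cue.
    """
    center = NUM_PIXELS // 2
    end = NUM_PIXELS - 1 if direction == "right" else 0
    step = 1 if direction == "right" else -1
    for _ in range(cycles):
        for i in range(center, end + step, step):
            print(render(i), end="\r", flush=True)
            time.sleep(FRAME_DELAY_S)
    print()

if __name__ == "__main__":
    sweep("right")   # e.g., cue an upcoming right turn

In a real deployment the same sweep logic would drive an addressable LED strip on the helmet brim instead of printing frames, but the hardware interface is outside the scope of this sketch.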

Keywords

Wearable Display, Navigation, Peripheral Visualization

Cite this work (ACM)

Hung-Yu Tseng, Rong-Hao Liang, Liwei Chan, and Bing-Yu Chen. 2015. LEaD: Utilizing Light Movement as Peripheral Visual Guidance for Scooter Navigation. In Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '15). Association for Computing Machinery, New York, NY, USA, 323–326. DOI:https://doi.org/10.1145/2785830.2785831

Cite this work (Bibtex)

@inproceedings{10.1145/2785830.2785831,
author = {Tseng, Hung-Yu and Liang, Rong-Hao and Chan, Liwei and Chen, Bing-Yu},
title = {LEaD: Utilizing Light Movement as Peripheral Visual Guidance for Scooter Navigation},
year = {2015},
isbn = {9781450336529},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/2785830.2785831},
doi = {10.1145/2785830.2785831},
abstract = {This work presents LEaD, a helmet-based visual guidance system utilizing light movement in scooter drivers' peripheral vision for turn-by-turn navigation. A linear light strip mounted on a helmet navigates for scooter drivers using simple 1D light movement, which can be easily acquired and identified by peripheral vision with the on-going foveal vision task. User studies suggest that this novel system can effectively direct scooter drivers without introducing visual distractions in route-guided experiences.},
booktitle = {Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices and Services},
pages = {323--326},
numpages = {4},
keywords = {Wearable Display, Navigation, Peripheral Visualization},
location = {Copenhagen, Denmark},
series = {MobileHCI '15}
}