An Interactive Multimodal Guide to Improve Art Accessibility for the Blind - School of Computing Seminar Series

Start date and time

Wednesday 26 September 2018


Location

CoRe44, room C44, Merchiston Campus

The development of 3D printing technology has contributed to improving the engagement of visually impaired people with two-dimensional visual artworks. 3D-printed tactile representations have spatial and texture dimensions that convey an artwork's visual information. However, it is often still challenging for the visually impaired to explore, experience, and gain a clear understanding of visual artworks on their own. We introduce an interactive multimodal guide in which a 3D-printed 2.5D representation of a painting can be explored by touch: touching designated features triggers localized verbal, audio, wind, and light/heat feedback events that convey spatial and semantic information. In this talk we present a working prototype developed through a participatory design approach and pose open questions for future directions.

Jun-Dong Cho received his Ph.D. in Computer Science from Northwestern University, Evanston, IL, in 1993. He founded the Department of Human ICT Convergence at Sungkyunkwan University in September 2013 and has since published more than 40 papers at renowned HCI conferences such as CHI, CSCW, TEI, and UbiComp. His current research topics include wearable computing and user experience design.