Research explorer tool

jamTable: Can Physical Interfaces Support the Collaboration between Novice and Experienced Musicians?

Book Chapter
Esteves, A., Quintal, F., & Oakley, I. (2013)
jamTable: Can Physical Interfaces Support the Collaboration between Novice and Experienced Musicians? In Haptic and Audio Interaction Design, 99-108. Springer. https://doi.org/10.1007/978-3-642-41068-0_11
This paper introduces jamTable, a system that enables the collaboration between users playing a standard musical instrument and users interacting with a tangible musical seque...

Orbits: Gaze Interaction for Smart Watches using Smooth Pursuit Eye Movements

Conference Proceeding
Esteves, A., Velloso, E., Bulling, A., & Gellersen, H. (2015)
Orbits: Gaze Interaction for Smart Watches using Smooth Pursuit Eye Movements. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology, (457-466). https://doi.org/10.1145/2807442.2807499
We introduce Orbits, a novel gaze interaction technique that enables hands-free input on smart watches. The technique relies on moving controls to leverage the smooth pursuit ...
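The matching mechanism described in this abstract (and shared, in different forms, by AmbiGaze, TraceMatch and SmoothMoves below) amounts to correlating a window of recent input samples against the trajectory of each moving control and selecting the best match. The following is a minimal illustrative sketch of that idea in Python, not the authors' implementation: the function names, the per-axis Pearson correlation, and the 0.8 acceptance threshold are assumptions made here for illustration only.

import numpy as np

def pearson(a, b):
    # Pearson correlation between two equal-length 1-D signals;
    # returns 0 when either signal is constant to avoid a divide-by-zero.
    if a.std() == 0 or b.std() == 0:
        return 0.0
    return float(np.corrcoef(a, b)[0, 1])

def select_control(gaze_xy, controls_xy, threshold=0.8):
    # gaze_xy:     (N, 2) array of the last N gaze samples.
    # controls_xy: dict mapping a control name to its (N, 2) on-screen
    #              positions over the same time window.
    # threshold:   minimum per-axis correlation to accept a selection
    #              (0.8 is an illustrative placeholder, not a value from the paper).
    best_name, best_score = None, threshold
    for name, traj in controls_xy.items():
        score = min(pearson(gaze_xy[:, 0], traj[:, 0]),
                    pearson(gaze_xy[:, 1], traj[:, 1]))
        if score > best_score:
            best_name, best_score = name, score
    return best_name  # None means no control was followed closely enough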

AmbiGaze: Direct Control of Ambient Devices by Gaze

Conference Proceeding
Velloso, E., Wirth, M., Weichel, C., Esteves, A., & Gellersen, H. (2016)
AmbiGaze: Direct Control of Ambient Devices by Gaze. In DIS '16 Proceedings of the 2016 ACM Conference on Designing Interactive Systems, (812-817). https://doi.org/10.1145/2901790.2901867
Eye tracking offers many opportunities for direct device control in smart environments, but issues such as the need for calibration and the Midas touch problem make it impract...

Beats: Tapping Gestures for Smart Watches

Conference Proceeding
Oakley, I., Lee, D., Islam, M. R., & Esteves, A. (2015)
Beats: Tapping Gestures for Smart Watches. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, (1237-1246). https://doi.org/10.1145/2702123.2702226
Interacting with smartwatches poses new challenges. Although capable of displaying complex content, their extremely small screens poorly match many of the touchscreen interact...

Supporting offline activities on interactive surfaces

Conference Proceeding
Esteves, A., Scott, M., & Oakley, I. (2013)
Supporting offline activities on interactive surfaces. In TEI 2013 - Proceedings of the 7th International Conference on Tangible, Embedded and Embodied Interaction, 147-154. https://doi.org/10.1145/2460625.2460648
This paper argues that inherent support for offline activities -- activities that are not sensed by the system -- is one of the strongest benefits of tangible interaction over mor...

Physical games or digital games?: comparing support for mental projection in tangible and virtual representations of a problem-solving task

Conference Proceeding
Esteves, A., van den Hoven, E., & Oakley, I. (2013)
Physical games or digital games?: comparing support for mental projection in tangible and virtual representations of a problem-solving task. In TEI '13 Proceedings of the 7th International Conference on Tangible, Embedded and Embodied Interaction, 167-174. https://doi.org/10.1145/2460625.2460651
This paper explores how different interfaces to a problem-solving task affect how users perform it. Specifically, it focuses on a customized version of the game of Four-in-a-r...

Fall of Humans: Interactive tabletop games and transmedia storytelling

Conference Proceeding
Dionisio, M., Gujaran, A., Pinto, M., & Esteves, A. (2015)
Fall of Humans: Interactive tabletop games and transmedia storytelling. In Proceedings of the 2015 International Conference on Interactive Tabletops & Surfaces, (401-404). https://doi.org/10.1145/2817721.2823477
This paper illustrates how transmedia storytelling can help introduce players to interactive tabletop games. To do so, we developed Fall of Humans (FoH), an experience that ta...

TraceMatch: a computer vision technique for user input by tracing of animated controls

Conference Proceeding
Clarke, C., Bellino, A., Esteves, A., Velloso, E., & Gellersen, H. (2016)
TraceMatch: a computer vision technique for user input by tracing of animated controls. In UbiComp '16 Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing, (298-303). https://doi.org/10.1145/2971648.2971714
Recent works have explored the concept of movement correlation interfaces, in which moving objects can be selected by matching the movement of the input device to that of the ...

SmoothMoves: smooth pursuits head movements for augmented reality

Conference Proceeding
Esteves, A., Verweij, D., Suraiya, L., Islam, R., Lee, Y., & Oakley, I. (2017)
SmoothMoves: smooth pursuits head movements for augmented reality. In UIST '17 Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology, (167-178). https://doi.org/10.1145/3126594.3126616
SmoothMoves is an interaction technique for augmented reality (AR) based on smooth pursuits head movements. It works by computing correlations between the movements of on-scre...

Remote control by body movement in synchrony with orbiting widgets: an evaluation of TraceMatch

Journal Article
Clarke, C., Bellino, A., Esteves, A., & Gellersen, H. (2017)
Remote control by body movement in synchrony with orbiting widgets: an evaluation of TraceMatch. PACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 1(3), 1-22. https://doi.org/10.1145/3130910
In this work we consider how users can use body movement for remote control with minimal effort and maximum flexibility. TraceMatch is a novel technique where the interface di...
