Research explorer tool

25 results

Viewing speech in action: speech articulation videos in the public domain that demonstrate the sounds of the International Phonetic Alphabet (IPA)

Journal Article
Nakai, S., Beavan, D., Lawson, E., Leplâtre, G., Scobbie, J. M., & Stuart-Smith, J. (2016)
Viewing speech in action: speech articulation videos in the public domain that demonstrate the sounds of the International Phonetic Alphabet (IPA). Innovation in Language Learning and Teaching, 1-9. https://doi.org/10.1080/17501229.2016.1165230
In this article, we introduce recently released, publicly available resources, which allow users to watch videos of hidden articulators (e.g. the tongue) during the production...

Visualising the soundfield and soundscape: extending Macaulay and Crerar’s 1998 method.

Presentation / Conference
McGregor, I., Crerar, A., Benyon, D., & LePlâtre, G. (2008, June)
Visualising the soundfield and soundscape: extending Macaulay and Crerar’s 1998 method. Paper presented at ICAD 2008: The 14th annual International Conference on Auditory Display, Paris, France
The introduction of effective auditory warnings into a shared environment requires a prior understanding of the existing soundfield and soundscape. Reifying the physical and p...

CymaSense: a real-time 3D cymatics-based sound visualisation tool.

Conference Proceeding
McGowan, J., Leplâtre, G., & McGregor, I. (2017)
CymaSense: a real-time 3D cymatics-based sound visualisation tool. In DIS '17 Companion Proceedings of the 2017 ACM Conference Companion Publication on Designing Interactive Systems, (270-274). https://doi.org/10.1145/3064857.3079159
What does music look like? Representation of music has taken many forms over time, from musical notation [16], through to random algorithm-based visualisations based on the am...

AniMorph: animation driven audio mosaicing.

Conference Proceeding
Tsiros, A., & LePlâtre, G. (2015)
AniMorph: animation driven audio mosaicing. In Proceedings of Electronic Visualisation and the Arts, (115-116).
This paper describes AniMorph, a system for animation-driven Concatenative Sound Synthesis (CSS). We can distinguish between two main application domains of CSS in the context ...

Workplace soundscape mapping: a trial of Macaulay and Crerar’s method.

Presentation / Conference
McGregor, I., Crerar, A., Benyon, D., & LePlâtre, G. (2006, June)
Workplace soundscape mapping: a trial of Macaulay and Crerar’s method. Paper presented at ICAD 2006: The 12th International Conference on Auditory Display, London, UK
This paper describes a trial of Macaulay and Crerar’s method of mapping a workplace soundscape to assess its fitness as a basis for an extended soundscape mapping method. Twel...

Establishing key dimensions for reifying soundfields and soundscapes from auditory professionals.

Presentation / Conference
McGregor, I., Crerar, A., Benyon, D., & LePlâtre, G. (2007, June)
Establishing key dimensions for reifying soundfields and soundscapes from auditory professionals. Paper presented at ICAD 2007: The 13th International Conference on Auditory Display, Montreal, Canada
This paper presents a unique insight into the way acousticians, computing specialists and sound designers describe the dimensions of sound they use. Seventy-five audio profess...

An evaluation of a sketching interface for interaction with concatenative sound synthesis.

Presentation / Conference
Tsiros, A., & LePlâtre, G. (2016, September)
An evaluation of a sketching interface for interaction with concatenative sound synthesis. Paper presented at 42nd International Computer Music Conference, Utrecht, The Netherlands
This paper presents the evaluation of Morpheme, a sketching interface for the control of sound synthesis. We explain the task that was designed in order to assess the effective...

The effects of syllable and utterance position on tongue shape and gestural magnitude in /l/ and /r/

Conference Proceeding
Lawson, E., Leplatre, G., Stuart-Smith, J., & Scobbie, J. M. (2019)
The effects of syllable and utterance position on tongue shape and gestural magnitude in /l/ and /r/. In Proceedings of the 19th International Congress of Phonetic Sciences, (432-436).
This paper is an ultrasound-based articulatory study of the impact of syllable-position and utterance position on tongue shape and tongue-gesture magnitude in liquid consonant...

Exploring Musical Creativity through Interactive Cymatics

Conference Proceeding
McGowan, J., Leplatre, G., & McGregor, I. (2018)
Exploring Musical Creativity through Interactive Cymatics. https://doi.org/10.14236/ewic/HCI2017.67
What does music look like? Representation of sound has taken many forms over time, from sinusoidal wave patterns, through to random algorithm-based visualisations based on the...

Utilising natural cross-modal mappings for visual control of feature-based sound synthesis

Conference Proceeding
Tsiros, A., & Leplâtre, G. (2017)
Utilising natural cross-modal mappings for visual control of feature-based sound synthesis. In Proceedings of the 19th ACM International Conference on Multimodal Interaction - ICMI 2017, (128-136). https://doi.org/10.1145/3136755.3136798
This paper presents the results of an investigation into audio-visual (AV) correspondences conducted as part of the development of Morpheme, a painting interface to control a ...

6 results

Learning to cope with non-discretionary use of digital technologies

- 2016
The thesis describes an investigation into how people learn to cope with non-discretionary use of digital technologies. This includes...
Dr Emilia Sobolewska | Director of Studies: Prof David Benyon | Second Supervisor: Dr Michael Smyth

Real-time 3D graphic augmentation of music therapy sessions for people on the autism spectrum

2015 - 2019
Autism Spectrum Disorder (ASD) is a lifelong neurodevelopmental disorder that can affect people in a number...
Dr John McGowan | Director of Studies: Dr Gregory Leplatre | Second Supervisor: Dr Iain McGregor

Intermediated Reality

2016 - 2020
Real-time solutions to reducing the gap between virtual and physical worlds for photorealistic interactive Augmented Reality (AR) are presented. First, a method of texture deforma...
Dr Llogari Casas Cambra | Director of Studies: Prof Kenny Mitchell | Second Supervisor: Dr Kevin Chalmers

A multidimensional sketching interface for visual interaction with corpus-based concatenative sound synthesis

2009 - 2016
The present research sought to investigate the correspondence between auditory and visual f...
Dr Augoustinos Tsiros | Director of Studies: Dr Gregory Leplatre | Second Supervisor: Dr Michael Smyth

Appropriating interaction

2009 - 2016
This investigation examines various methods for evaluating interactivity and engagement with technology. By using a static model, namely the interactive gallery at The Public...
Dr Tom Flint | Director of Studies: Dr Phil Turner | Second Supervisor: Dr Gregory Leplatre

An investigation of olfactory display technology for the enhancement of presence within virtual reality experiences

2018 - date
Andrew McKelvey | Director of Studies: Dr Emilia Sobolewska | Second Supervisor: Dr Michael Smyth
