Dr Balandino Di Donato

Lecturer

Biography

Balandino Di Donato is Lecturer in Interactive Audio at Edinburgh Napier University (ENU).
He currently leads the funded BSL in Music Embodied Interaction project and collaborates on a number of ENU projects.

He received a PhD in Designing Embodied Human-Computer Interactions in Music Performance from Royal Birmingham Conservatoire, Birmingham City University.

Before joining ENU, Balandino was Lecturer in Creative Computing at the University of Leicester. He was a Research Assistant at Goldsmiths, University of London's Embodied AudioVisual Interaction (EAVI) unit, working on the ERC-funded BioMusic project led by Prof Atau Tanaka; on this project, he was the interaction and sound designer for a wireless EMG board and AI-driven software for musical applications.
Balandino collaborated with De Montfort University on the Creative AI Dataset for Live Co-operative Music Performance project, funded by the EPSRC HDI Network, and with the University of Leicester on the Innovating and Crafting Sustainable Digital Ecosystems (INCITE) project. He worked on the development of Integra Live, and at Centro Ricerche Musicali di Roma he assisted the composer Michelangelo Lupone and the artist Licia Gallizia in the realisation of interactive sound art installations.

Alongside his academic career, Balandino is a Sound Artist. He has realised award-winning sound art installations (Biennale of Contemporary Art and Design 2018, Biennale ArteScienza 2019) and performed at international conferences (NPAPW, ICMC, Audio Mostly, EmuFest and ElectroAQustica). From 2007 to 2014, he was a freelance Sound Engineer and Audio Technician supporting international music productions (Katy Perry, Backstreet Boys, McBusted, The Voice, Carl Palmer, etc.).

Esteem

Conference Organising Activity

  • Audio Mostly 2022 Best Short Paper Award Jury
  • Audio Mostly 2022 Paper session chair
  • Audio Mostly Workshop Chair


External Examining/Validations

  • MA Music & Sound Design programme validation at University of Greenwich
  • BA Music & Sound Production programme validation at University of Greenwich
  • Examiner for Creative Technologies MA/MSc programme at De Montfort University


Invited Speaker

  • Talk at Mixed Reality Meetup
  • Human-Sound Interaction (HSI)
  • Human-Sound Interaction workshop


Public/Community Engagement

  • Mixed Reality Scotland


Reviewing

  • Audio Developer Conference 2022
  • SoniHED Reviewer
  • XXIII Colloquium of Musical Informatics
  • ACM CHI
  • Human Technology Journal
  • Digital Creativity - Taylor & Francis Online


Publications

Designing Gestures for Continuous Sonic Interaction

Conference Proceeding
Tanaka, A., Di Donato, B., Zbyszynski, M., & Roks, G. (2019)
Designing Gestures for Continuous Sonic Interaction. In M. Queiroz, & A. X. Sedó (Eds.), Proceedings of the International Conference on New Interfaces for Musical Expression (180-185). https://doi.org/10.5281/zenodo.3672916
This paper presents a system that allows users to quickly try different ways to train neural networks and temporal modeling techniques to associate arm gestures with time vary...

Improvising through the senses: a performance approach with the indirect use of technology

Journal Article
Michailidis, T., Dooley, J., Granieri, N., & Di Donato, B. (2018)
Improvising through the senses: a performance approach with the indirect use of technology. Digital Creativity, 29(2-3), 149-164. https://doi.org/10.1080/14626268.2018.1511600
This article explores and proposes new ways of performing in a technology-mediated environment. We present a case study that examines feedback loop relationships between a dan...

Myo Mapper: a Myo armband to OSC mapper

Conference Proceeding
Di Donato, B., Bullock, J., & Tanaka, A. (2018)
Myo Mapper: a Myo armband to OSC mapper. In D. Bowman, & L. Dahl (Eds.), Proceedings of the International Conference on New Interfaces for Musical Expression (138-143). https://doi.org/10.5281/zenodo.1302705
Myo Mapper is a free and open source cross-platform application to map data from the gestural device Myo armband into Open Sound Control (OSC) messages. It represents a ‘quick...

MyoSpat: A hand-gesture controlled system for sound and light projections manipulation

Conference Proceeding
Di Donato, B., Dooley, J., Hockman, J., Bullock, J., & Hall, S. (2017)
MyoSpat: A hand-gesture controlled system for sound and light projections manipulation. In Proceedings of the 2017 International Computer Music Conference (335-340).
We present MyoSpat, an interactive system that enables performers to control sound and light projections through hand-gestures. MyoSpat is designed and developed using the Myo...

Current Postgraduate projects