imagining & sensing

my PhD work surrounding the vocalist-voice relationship, vocal embodiment and perception

2019 - 2023

My PhD at Queen Mary University of London (QMUL) was completed in the Augmented Instruments Lab (AIL) with Prof. Andrew McPherson. The work culminated in my thesis, Imagining & Sensing: Understanding and Extending the Vocalist-Voice Relationship Through Biosignal Feedback.

🏆 This PhD was awarded the ACM SIGCHI Outstanding Dissertation Award in 2024 for contributions to the HCI community.

abstract

The voice is body, instrument, and identity. To explore and unpack the relationship that vocalists have with their voices and vocal practice, this work combined autoethnographic research into my own singing with in-depth work with other vocalists and their practices.

This work also explores the tensions and opportunities between the third-person view of the voice held by listeners, sensors, and digital agents and the critical, embodied, first-person relationship of the vocalist. The vocalist's understanding of their multi-sensory experiences comes through tacit knowledge of the body: this knowledge is difficult to articulate, yet awareness and control of the body are innate. As technology that quantifies or interprets physiological processes becomes ever more prevalent, we must also remain conscious of embodiment and human perception of these processes. Focusing on the vocalist-voice relationship, this project expands knowledge of human interaction and how technology influences our perception of our bodies.



Designing a vocal electromyography instrument, the VoxEMG. The original prototype, worn here over my suprahyoid muscles (left), used wired electrodes, conductive paste, and kinetic tape. The current PCB (right) is designed for textile incorporation, with castellated inputs for fabric electrode connections.

contributions

  • VoxEMG electromyography platform: a novel vocal interaction method that measures laryngeal muscle activations through surface electromyography (sEMG).
  • Long-term first-person/autoethnographic research and in-depth work with other vocalists, reflecting on how incorporating biosignal feedback shapes understanding of body movements and vocal practice, and how such feedback can function as a metaphor.
  • Comprehensive examination of how technology and the feedback we receive in human-computer interaction (HCI) can shape our perception and understanding of our bodies and our actions.
  • Strategy for adopting technologies from other practices into traditional arts and other contexts through the use of soft wearables, e.g., the Singing Knit EMG wearable.
  • Analysis of how metaphors function in fundamental vocal pedagogy and how metaphorical communication between humans works, proposing novel ways to structure interaction with technology to aid sensory communication.
  • Biofeedback-based reflections on the ways in which vocalists are both in control of and controlled by their voices, working with and against their bodies.
  • Nuanced account of human interaction and perception of the body through vocal practice, as an example of how technological intervention enables exploration and influence over embodied understanding.
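To illustrate the kind of signal processing that sEMG biofeedback typically involves (this is a generic sketch, not the VoxEMG's actual pipeline), here is a minimal Python example that turns a raw sEMG voltage stream into a muscle-activation envelope, assuming a hypothetical sampling rate of 1 kHz:

```python
import numpy as np

def emg_envelope(samples, fs=1000, window_ms=100):
    """Estimate a muscle-activation envelope from raw sEMG samples.

    A common minimal pipeline: remove the DC offset, full-wave
    rectify, then smooth with a moving-average window (an
    alternative to RMS smoothing). fs is the sample rate in Hz.
    """
    x = np.asarray(samples, dtype=float)
    x = x - x.mean()              # remove DC offset
    rectified = np.abs(x)         # full-wave rectification
    n = max(1, int(fs * window_ms / 1000))
    kernel = np.ones(n) / n       # moving-average smoothing kernel
    return np.convolve(rectified, kernel, mode="same")

# Synthetic demo: low-amplitude baseline noise with a simulated
# burst of muscle activation in the middle of the recording.
rng = np.random.default_rng(0)
signal = rng.normal(0.0, 0.05, 2000)
signal[800:1200] += rng.normal(0.0, 0.5, 400)  # simulated activation burst
env = emg_envelope(signal)
```

The resulting envelope rises over the burst region and stays near zero elsewhere, which is what makes it usable as a real-time feedback signal for a performer.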

Integrating the VoxEMG into the Singing Knit, with designs by Sophie Skach (top). The VoxEMG board can be affixed to textiles and accepts conductive thread inputs (left); the conductive thread traces to the fabric electrodes are woven into the knit (centre) and stretch with the garment (right).

theory & methods

  • mixed-methods
  • electromyography (EMG) biofeedback
  • entanglement theory
  • agential realism
  • somaesthetics and soma design
  • cognitive science and music psychology
  • first-person methods & autoethnography
  • micro-phenomenology
  • thematic analysis
  • contemporary metaphor theory
  • vocal organology

The Singing Knit from the inside, with conductive fabric electrode pads (top left), and from the outside, with affixed VoxEMG boards for 8 channels of EMG data (top right). As modelled by me, the garment stretches for flexible performance wear (bottom row).

2024

  1. Auditory imagery ability influences accuracy when singing with altered auditory feedback
    Courtney N. Reed, Marcus Pearce, and Andrew McPherson
    Musicae Scientiae, Feb 2024

2023

  1. Negotiating Experience and Communicating Information Through Abstract Metaphor
    Courtney N. Reed, Paul Strohmeier, and Andrew P. McPherson
    In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, Apr 2023
  2. As the Luthiers Do: Designing with a Living, Growing, Changing Body-Material
    Courtney N. Reed
    In ACM CHI Workshop on Body X Materials, Apr 2023
  3. The Body as Sound: Unpacking Vocal Embodiment through Auditory Biofeedback
    Courtney N. Reed and Andrew P. McPherson
    In Proceedings of the Seventeenth International Conference on Tangible, Embedded, and Embodied Interaction, Feb 2023
  4. Imagining & Sensing: Understanding and Extending the Vocalist-Voice Relationship Through Biosignal Feedback
    Courtney N. Reed
    PhD Computer Science, Queen Mary University of London, Feb 2023

2022

  1. Exploring Experiences with New Musical Instruments through Micro-phenomenology
    Courtney N. Reed, Charlotte Nordmoen, Andrea Martelloni, and 6 more authors
    In Proceedings of the International Conference on New Interfaces for Musical Expression, Jun 2022
  2. Communicating Across Bodies in the Voice Lesson
    Courtney N. Reed
    In ACM CHI Workshop on Tangible Interaction for Well-Being, Apr 2022
  3. Sensory Sketching for Singers
    Courtney N. Reed
    In ACM CHI Workshop on Sketching Across the Senses, Apr 2022
  4. Singing Knit: Soft Knit Biosensing for Augmenting Vocal Performances
    Courtney N. Reed, Sophie Skach, Paul Strohmeier, and 1 more author
    In Proceedings of the Augmented Humans International Conference 2022, Mar 2022
  5. Examining Embodied Sensation and Perception in Singing
    Courtney N. Reed
    In Proceedings of the Sixteenth International Conference on Tangible, Embedded, and Embodied Interaction, Feb 2022

2021

  1. Surface Electromyography for Sensing Performance Intention and Musical Imagery in Vocalists
    Courtney N. Reed and Andrew P. McPherson
    In Proceedings of the Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction, Feb 2021

2020

  1. Surface Electromyography for Direct Vocal Control
    Courtney N. Reed and Andrew McPherson
    In Proceedings of the International Conference on New Interfaces for Musical Expression, Jul 2020