PCB versions of the VoxEMG. The v3.1 eTextile Configuration (left) features castellated pads for conductive thread connections and loops for textile integration. The v3.1.2 Bela Mini Capelet (middle, right) slots directly onto the A0 and A1 analogue inputs, power, and ground of the Bela Mini.
about
The VoxEMG circuit is an extension of the open-source EMG Circuit v7.1 (Advancer Technologies), from which other EMG platforms such as the MyoWare are derived. The VoxEMG is specifically designed to detect activation of the extrinsic laryngeal muscles in both vocalised and subvocalised singing. High-precision and trimmable resistors reduce noise, and the circuit is flexible enough to be used with different types of electrodes, depending on the desired implementation. The EMG signals can then be used in a variety of ways, for instance to relay feedback about the singer’s movements during practice.
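As an illustrative sketch only (not the VoxEMG firmware itself), a common way to turn a raw sEMG signal into a feedback-ready activation measure is full-wave rectification followed by moving-average smoothing. The example below assumes NumPy and a simulated signal; the function name and parameters are hypothetical:

```python
import numpy as np

def emg_envelope(signal, window=50):
    """Estimate muscle activation: remove DC offset, full-wave rectify,
    then smooth with a moving-average filter of `window` samples."""
    rectified = np.abs(signal - np.mean(signal))
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

# Simulated sEMG: a burst of noise-like activation around t = 0.5 s
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1000)
burst = rng.normal(0, 1, 1000) * (np.abs(t - 0.5) < 0.1)
env = emg_envelope(burst)
# The envelope peaks inside the activation burst and could drive
# visual or sonic feedback for the singer.
```

The smoothing window trades responsiveness against stability; shorter windows track onsets faster but produce a noisier feedback signal.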
There are currently two open-source versions of the VoxEMG available on GitHub. Both use the same circuit, implemented in different PCB configurations:
This paper discusses the design of the Singing Knit, a wearable knit collar for measuring a singer’s vocal interactions through surface electromyography. We improve the ease and comfort of multi-electrode bio-sensing systems by adapting knit e-textile methods. The goal of the design was to preserve the capabilities of rigid electrode sensing while addressing its shortcomings, focusing on comfort and reliability during extended wear, practicality and convenience for performance settings, and aesthetic value. We use conductive, silver-plated nylon jersey fabric electrodes in a full rib knit accessory for sensing laryngeal muscular activation. We discuss the iterative design and the material decision-making process as a method for building integrated soft-sensing wearable systems for similar settings. Additionally, we discuss how the design choices through the construction process reflect its use in a musical performance context.
@inproceedings{Reed_AHs22_SingingKnit,
  title     = {{Singing Knit: Soft Knit Biosensing for Augmenting Vocal Performances}},
  author    = {Reed, Courtney N. and Skach, Sophie and Strohmeier, Paul and McPherson, Andrew P.},
  year      = {2022},
  month     = mar,
  booktitle = {{Proceedings of the Augmented Humans International Conference 2022}},
  location  = {Kashiwa, Chiba, Japan},
  publisher = {Association for Computing Machinery},
  address   = {New York, NY, USA},
  series    = {AHs '22},
  pages     = {170--183},
  numpages  = {14},
  doi       = {10.1145/3519391.3519412},
  isbn      = {9781450396325},
  url       = {https://doi.org/10.1145/3519391.3519412},
}
2021
Surface Electromyography for Sensing Performance Intention and Musical Imagery in Vocalists
Courtney N. Reed, and Andrew P. McPherson
In Proceedings of the Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction, Feb 2021
Through experience, the techniques used by professional vocalists become highly ingrained, and much of the fine muscular control needed for healthy singing is executed using well-refined mental imagery. In this paper, we provide a method for observing intention and embodied practice using surface electromyography (sEMG) to detect muscular activation, in particular in the laryngeal muscles. By sensing the electrical neural impulses that cause muscular contraction, sEMG provides a unique measurement of user intention, whereas other sensors reflect only the results of movement. In this way, we are able to measure movement in preparation, during vocalised singing, and in the use of imagery during mental rehearsal, where no sound is produced. We present a circuit developed for use with the low-voltage activations of the laryngeal muscles; in sonifying these activations, we further provide feedback for vocalists to investigate and experiment with their own intuitive movements and intentions for creative vocal practice.
@inproceedings{Reed_TEI21_sEMGPerformance,
  title     = {{Surface Electromyography for Sensing Performance Intention and Musical Imagery in Vocalists}},
  author    = {Reed, Courtney N. and McPherson, Andrew P.},
  year      = {2021},
  month     = feb,
  booktitle = {{Proceedings of the Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction}},
  location  = {Salzburg, Austria},
  publisher = {Association for Computing Machinery},
  address   = {New York, NY, USA},
  series    = {TEI '21},
  articleno = {22},
  numpages  = {11},
  doi       = {10.1145/3430524.3440641},
  isbn      = {9781450382137},
  url       = {https://doi.org/10.1145/3430524.3440641},
}
2020
Surface Electromyography for Direct Vocal Control
Courtney N. Reed, and Andrew McPherson
In Proceedings of the International Conference on New Interfaces for Musical Expression, Jul 2020
This paper introduces a new method for direct control using the voice via measurement of vocal muscular activation with surface electromyography (sEMG). Digital musical interfaces based on the voice have typically used indirect control, in which features extracted from audio signals control the parameters of sound generation, for example in audio-to-MIDI controllers. By contrast, focusing on the musculature of the singing voice allows direct muscular control, or alternatively, combined direct and indirect control in an augmented vocal instrument. In this way we aim to preserve both the intimate relationship a vocalist has with their instrument and key timbral and stylistic characteristics of the voice, while expanding its sonic capabilities. This paper discusses other digital instruments which effectively utilise a combination of indirect and direct control, as well as a history of controllers involving the voice. Subsequently, a new method of direct control from physiological aspects of singing through sEMG is discussed, along with its capabilities. Future developments of the system are outlined, including use in performance studies, interactive live vocal performance, and educational and practice tools.
@inproceedings{Reed_NIME20_VocalsEMG,
  title     = {{Surface Electromyography for Direct Vocal Control}},
  author    = {Reed, Courtney N. and McPherson, Andrew},
  year      = {2020},
  month     = jul,
  booktitle = {{Proceedings of the International Conference on New Interfaces for Musical Expression}},
  publisher = {Birmingham City University},
  address   = {Birmingham, UK},
  pages     = {458--463},
  doi       = {10.5281/zenodo.4813475},
  issn      = {2220-4806},
  url       = {https://www.nime.org/proceedings/2020/nime2020_paper88.pdf},
  editor    = {Michon, Romain and Schroeder, Franziska},
  presentation-video = {https://youtu.be/1nWLgQGNh0g},
}