
Gest-O: Performer gestures used to expand the sounds of the saxophone

John Melo                         Daniel Gómez                      Miguel Vargas
Universidad Icesi                 Universidad Icesi                 Instituto Tecnológico Metropolitano
Cali, Colombia                    Cali, Colombia                    Medellín, Colombia
[email protected]                   [email protected]                   [email protected]

ABSTRACT
This paper describes the conceptualization and development of an open-source tool for controlling the sound of a saxophone through the gestures of its performer. The motivation behind this work is the need for easily accessible tools to explore, compose and perform electroacoustic music in Colombian music schools and conservatories. The work led to the adaptation of common hardware into a sensor attached to an acoustic instrument, and to the development of software applications to record, visualize and map performer gesture data onto signal-processing parameters. The scope of the work suggested focusing on a specific instrument, so the saxophone was chosen. Gestures were selected in an iterative process with the performer, although a more ambitious strategy for identifying the main gestures of an instrument's performance was defined first. Detailed gesture-to-sound-processing mappings are presented in the text. An electroacoustic musical piece was successfully rehearsed and recorded using the Gest-O system.

Keywords
Electroacoustic music, saxophone, expanded instrument, gesture.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
NIME'11, May 21-23, 2011, University of Michigan, Ann Arbor.
Copyright remains with the author(s).

1. INTRODUCTION
In the Colombian context, there is an evident gap between the musical community and the techniques of electroacoustic music. On one hand, the curricula of most music schools, from junior to senior level, do not encourage the use of technology; on the other, instrumental groups remain mostly traditional. Daily life in music schools is therefore far removed from the use of technological devices in the assembly, performance or production of electroacoustic works.

This project aims to create useful open-source tools to be shared with the musical community, particularly through the expansion of acoustic musical instruments. It is expected that, in the mid term, new composers and musicians will become familiar with electroacoustic pieces and composition through performance and experimentation.

The first stage of the project focused on the expansion of wind instruments, since most of their performers have had little contact with sound processing or with synthesizers. After tests with saxophone and trumpet players, the investigation narrowed its focus to the saxophone.

This article describes the development of a hardware system attached to the saxophone and connected via Bluetooth to a DSP system developed in Pure Data [1]. Several user tests were carried out with professional saxophone players, using diverse approaches to find the right gestures. This first stage of the process ended with the composition, staging, performance and recording of a piece for solo saxophone and electronics.

The project aimed to find gestures that could be used to control a real-time sound-processing system, and diverse strategies were tried to define the right set of gestures. The final strategy consisted of agreements between the performer and the designers, using an electroacoustic piece as a framework; mappings between gestures and the different processing algorithms were derived from those agreements. Although this strategy proved successful and the piece was rehearsed and recorded, a partial conclusion is that such a system should in the future let the musician select his or her own gestures and adapt to them automatically.

2. RELATED WORK
Several works have aimed at expanding or augmenting instruments by technological means. Such tools augment the possibilities of interpretation and composition, either by expanding the techniques used to approach the instrument or by changing the instrument's sound through real-time processing.

Among systems that use sensors to translate the performer's actions, we find Tod Machover and Joe Chung's work on the Hyperinstruments project [2]. Hyperinstruments built solid ground both for instrument expansion and for bringing the musical community closer to electroacoustic music. "Hyperinstrument" has become one of the names for an expanded instrument; the Hyper-bow [3] and Hyper-flute [4] represent interesting advances in the refinement of gesture capture and in the use of musical gestures to carry out sound expansion, taking into account the physical capabilities of the performer.

Along the same line as the Hyper-bow we find the K-Bow, an instrument developed by KMI (Keith McMillen Instruments) which successfully brings the performer and the composer closer to electronically expanded techniques. Much like the Hyper-bow, the K-Bow uses pressure sensors and accelerometers in a wireless device that communicates the performer's gestures to the computer.
Today a K-Bow quartet piece exists, composed by Douglas Quin and premiered by the Kronos Quartet in November 2011 [5].

The Bass Sleeve is another project that uses a great variety of sensors to build a real-time augmented-bass multimedia controller [6]. It has controllers located on the knee, the foot and the instrument itself, and it adds computer-vision image analysis. The device as a whole relies on auxiliary movements, different from the traditional movements of a bass player; such movements can prove invasive for the performer, which is why direct work with performers is fundamental.

In the brass family, [17] is a starting point for expanding brass instruments while respecting the performer's traditional technique. Several augmented brass instruments have been designed and implemented in recent years, more or less invasive depending on the control device used for sound processing; pressure and position sensors, plus in some cases push buttons, seem to be a common choice [16][18], to mention two examples. Augmenting brass instruments has even reached the point of developing entirely new interfaces, in which the original concept of the instrument may become blurry through changes that are not only physical but also affect the way the instrument is played [19]. Although we have nothing against this approach, our own research showed that traditional music performers, who are our main interest, do not feel comfortable with new controls built into their instruments.

3. GEST-O
3.1 About musical gesture
The musical experience, meaning the moment in which we listen to music, whether at a concert, on the radio or on an mp3 player, is deeply related to corporal movement [8]. Since its origins, music has been closely related to dance. Moreover, musical terminology uses metaphorical terms such as allegro, dolce and vivace so that the performer's expressiveness carries the intention the composer conceived [9]. Human beings are fully capable of recognizing gestures and movements as violent, joyful or loving [9], and of distinguishing between a mechanical movement and the reasons behind it [10].

A gesture can be considered from the perspectives of communication, control and metaphor [8]. When we use gestures to communicate, they serve our intentions, for instance in social interaction. From the control point of view, gestures become elements for manipulating some parameter. A metaphorical gesture is one whose movement is led by a concept. This project focuses on the analysis of gesture as a control mechanism for communicating concepts conceived by the composer.

A wind instrument performer usually focuses his attention on the control of his fingers and on blowing into the instrument, leaving pedals, limited to one control at a time, as the only way to interact with hardware. However, there is a great number of body gestures, additional to those needed for interpretation, that can be measured by digital systems and used for real-time sound control.

3.2 The sensor
To capture the performer's gestures, the electronic board of a Wiimote controller was used. Since only movement data from the accelerometer and gyroscope is needed, none of the buttons were used. A cable was also soldered to the power input so that battery weight would not affect the performer's playing (see Image 1).

Image 1: sensor

The batteries were fitted onto the performer's belt with a two-meter cable, and a case was designed so the sensor could be mounted on the saxophone in the least invasive way possible. A Bluetooth connection with the computer is made via the OSCulator software, which passes the data to Pure Data through OSC [11].

3.3 Tests with different instruments
The initial tests consisted of measuring musical interpretations to detect fluctuations, and possible patterns, in the performers' movements across different musical moods.

If such movement patterns could be found, they could be used as gestures to control an instrument-processing system in real time. The idea was that, if such natural gestures existed, they would be candidates for control inputs to an audio-processing system: gestures present in any kind of expressive mood and common to all musicians would require no new learning from the performer.

To get data from the sensor in the early tests, a Pure Data patch was developed to store audio and sensor data from the performances. The recorded sensor variables are shown in Table 1.

Table 1. Variables the sensor sends while fixed onto the instrument

Variable      Meaning
Acceleration  acceleration
pitch         Y rotation
yaw           Z rotation
roll          X rotation
pitchAngle    Y tilt
rollAngle     Z tilt
yawAngle      X tilt
Audio         audio signal

Initial tests were carried out with two professional musicians: a saxophonist and a trumpet player (see Images 2.a and 2.b).

a. Test with saxophone   b. Test with trumpet   c. Data in the system
Image 2. User tests' record
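The recording step described above can be sketched in a few lines. The sketch below is a minimal Python stand-in for the Pure Data patch, under stated assumptions: the OSC address paths are hypothetical (the paper does not list the paths OSCulator exposes), and only the routing and time-stamped storage of the Table 1 variables is shown.

```python
import csv
import time

# Hypothetical OSC address -> Table 1 variable name; the real paths
# exposed by OSCulator are an assumption, not given in the paper.
ADDRESS_MAP = {
    "/wii/1/accel": "Acceleration",
    "/wii/1/pitch": "pitch",
    "/wii/1/yaw": "yaw",
    "/wii/1/roll": "roll",
}


class GestureRecorder:
    """Stores time-stamped sensor readings for later offline analysis,
    analogous to the data-logging role of the Pure Data patch."""

    def __init__(self):
        self.rows = []

    def on_message(self, address, value, t=None):
        # Route an incoming OSC message; buttons/unknown paths are ignored,
        # since only accelerometer and gyroscope data is needed.
        name = ADDRESS_MAP.get(address)
        if name is None:
            return
        self.rows.append((t if t is not None else time.time(), name, float(value)))

    def save(self, path):
        # Dump the recorded (timestamp, variable, value) rows as CSV.
        with open(path, "w", newline="") as f:
            csv.writer(f).writerows(self.rows)
```

A usage sketch: register `on_message` as the handler of any OSC server library, let a performance run, then call `save()` and inspect the rows against the audio recording.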

3.5 An approach to gesture analysis
The data gathered from the tests was loaded into a custom application (see Image 2.c) for analysis. While striving to find patterns between audio and gesture data, the work took another road: it was decided to concentrate on the saxophone alone, so that the objectives of the project could be achieved in the allotted time. The project leaned toward reducing the gap between the performer and a specific musical piece. The tests in this second stage were done with a saxophone student, and attention focused on the search for gestures independent of the main performance gestures, such as blowing and fingering. After several exploratory sessions with the performer, a group of independent gestures that the performer felt comfortable with was defined (see the left column of Table 2).

As for the musical piece, "Perla" by Colombian composer Jose Gallardo was selected: a piece for solo saxophone and real-time electronics. It uses five different sound effects: grain sampling, ring modulation, reverberation, delay and multiphonics.

3.6 Rehearsal and recording of the piece
Once the gestures and control requirements were defined and the piece chosen, development began on the Pure Data application used in the prototype's final test. The application's inputs are the real-time sensor data and the instrument's sound; its output is the transformed sound. The critical part of the development lies in mapping the movements, gestures and actions of the user/performer onto the processing system's parameters in the best possible way: depending on the style and approach of the gesture mapping, it can considerably alter the way the instrument behaves [12]. In this ongoing investigation all the technical factors of traditional instrument performance are considered, in order to give the performer accurate and fluent control.

The left column of Table 2 lists the processes and the different variables involved in their control. The difference between discrete binary variables and continuous variables was key to the development of the application, as each requires a different type of gesture: a continuous variable requires an equally continuous gesture, so the performer can control it with a certain precision within a range of movement that defines a maximum and a minimum point; a discrete binary variable (i.e., on/off) requires another type of gesture, one that points or indicates but has no defined dimension.

The mapping was made by agreement between the design team and the performer. It consisted, first, of defining the possible relations between the types of variables and gestures; second, of programming algorithms that identify each gesture; and finally, of testing the mapping. This process is repeated many times while playing the piece, so that the gesture, its relation to the instrument's interpretation and, above all, the calibration of the range of movement can be optimized. As an additional constraint on the mapping, the score of the piece requires that at some points several effects be available to the performer at the same time. The right column of Table 2 shows the gestures finally defined with the performer, corresponding to the variables needed for sound processing in the left column.

The variables Grain Amplitude, Grain Size, Modulation Index, Reverb Mix and Delay Mix are all continuous variables controlled with continuous gestures. Grain Sampling on/off, Ring Modulation on/off and Multiphonics on/off are discrete and controlled with discrete gestures.
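The distinction between continuous and discrete gesture variables can be illustrated with a short sketch. This is not the project's Pure Data code but a hedged Python illustration, in which the calibration range and the shake threshold are invented for the example: a calibrated tilt range is normalized to a 0..1 parameter (the continuous case), and a burst of acceleration variance toggles an effect (the discrete case).

```python
def continuous_control(angle, angle_min, angle_max):
    """Map a calibrated tilt range (degrees) onto a 0..1 parameter,
    clamping outside the range; e.g. an upwards-leaning tilt could
    drive a mix parameter. The calibration bounds are set per
    performer during rehearsal."""
    x = (angle - angle_min) / (angle_max - angle_min)
    return max(0.0, min(1.0, x))


def is_shake(accel_window, threshold=0.5):
    """Discrete on/off gesture: report a 'shake' when the variance of a
    short window of acceleration samples exceeds a threshold (the
    threshold value here is an assumption, also tuned per performer)."""
    mean = sum(accel_window) / len(accel_window)
    variance = sum((a - mean) ** 2 for a in accel_window) / len(accel_window)
    return variance > threshold
```

The clamping in `continuous_control` matches the idea of a movement range with a defined maximum and minimum point, while `is_shake` only indicates, with no defined dimension.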

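Of the effects used in "Perla", ring modulation is the simplest to sketch. The fragment below is an illustrative Python version, not the Pure Data implementation used in the project: it multiplies the input signal sample by sample with a sine carrier; in Gest-O the carrier settings would be driven by the Modulation Index gesture (the exact parameter wiring is an assumption).

```python
import math


def ring_modulate(signal, carrier_freq, sample_rate):
    """Ring modulation: multiply each input sample by a sine carrier
    of frequency carrier_freq (Hz) at the given sample rate."""
    return [s * math.sin(2.0 * math.pi * carrier_freq * n / sample_rate)
            for n, s in enumerate(signal)]
```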
Table 2. Gestures and their corresponding effects

Effect                   Gesture
Grain Sampling On/Off    Jump
Grain Amplitude          Upwards leaning
Grain Size               Sideways leaning (right)
Ring Modulation On/Off   Shaking the instrument sideways
Modulation Index         Average between upwards and sideways (right) tilt
Reverb Mix               Downwards leaning
Multiphonics On/Off      Shaking the instrument forward
Delay Mix                Upwards leaning

Image 3. PureData application

The Pure Data application was completed with a GUI specifically designed to monitor, in real time, the processing values and the changes produced by the performer's movements (see Image 3). This GUI shows all the information about the system's status at the moment of performance, so it is intended to be used as a real-time guide. A video of the system running in a performance of the musical piece can be found online [15].

4. CONCLUSIONS AND FUTURE WORK
Finding gestural patterns shared by different performers of the same instrument is a complex task, even with a small number of performers. Instrument players have different backgrounds, tastes and habits that lead to different gestures during performance. Although this was Gest-O's main objective, limitations of time and scope led us to adjust it: to interpret the gestures of a specific performer to meet the needs of a specific piece.

Despite taking this different path, it seems probable that the search for common gestures is not the right choice for projects of this type. An alternative is a system that allows users to choose their own gestures and where each gesture is mapped. The development of such adaptive systems is a direction in which this project might evolve.

The first goal, creating a system for transforming the saxophone's sound through performer gestures, was completed. The rehearsal and recording of the piece went well and the results are musically satisfactory. The performer, although this was his first work with electroacoustic music, was very pleased to find that such alterations of the saxophone's sound could be achieved.

Universal gestures (greetings, surprise, anger) vary between societies depending on existing cultural codes. Although there are advanced studies on gesture and speech [13] and on gesture and music [14], it is known that the hand movements and facial expressions accompanying speech are an inextricable aspect of communication [8]. Music, to be communicated to a listener, has to go through at least two important steps: a composer writes in the language of sound, so that a performer can communicate to the listener the work the composer conceived. All these social, cultural and personal factors, added to complex musical abstractions, put the idea of a series of standardized gestures almost out of reach. If the technical factors of each instrument's performance are added, the possibilities of generalization are reduced even further.

5. ACKNOWLEDGEMENTS
To the ACORDE research group at the ITM Medellin for their preliminary discussions. To Jose Gallardo for composing "Perla". To the performers Paulo Sanchez, Diego Murillo and Oscar Plaza for their collaboration in the tests. To Juan Carlos Perez for the documentation. Finally, to the Leonardo research group at Icesi University for providing the space and resources to develop the project.

6. REFERENCES
[1] Puckette, M. (1997, May 7). Pure Data: Another integrated computer music environment. Second Intercollege Computer Music Concerts. Tachikawa.
[2] Machover, T., & Chung, J. Hyperinstruments: Musically Intelligent and Interactive Performance and Creativity Systems. In Proc. Intl. Computer Music Conference (1989).
[3] Young, D. The Hyper-Bow Controller: Real-Time Dynamics Measurement of Violin Performance. In Proceedings of the Conference on New Instruments for Musical Expression (NIME-02), Dublin, 2002.
[4] Palacio-Quintin, C. The Hyper-Flute. In Proceedings of the Conference on New Instruments for Musical Expression (NIME-03), Montreal, 2003.
[5] http://www.keithmcmillen.com/k-bow/overview
[6] Ramkissoon, I. (2011). The Bass Sleeve: A Real-time Multimedia Gestural Controller for Augmented Electric Bass Performance. NIME'11. Oslo, Norway.
[7] Godøy, R. I., & Leman, M. Musical Gestures: Sound, Movement, and Meaning. Routledge, 2010.
[8] Cienki, A., & Müller, C. Metaphor and Gesture. John Benjamins B.V., 2008.
[9] Camurri, A., & Moeslund, T. B. Visual Gesture Recognition: From Motion Tracking to Expressive Gesture. In Musical Gestures: Sound, Movement, and Meaning. Routledge, 2010, 238-263.
[10] Schneider, A. Music and Gestures: A Historical Introduction and Survey of Earlier Research. In Musical Gestures: Sound, Movement, and Meaning. Routledge, 2010, 69-100.
[11] Wright, M., & Freed, A. (1997). Open Sound Control: A New Protocol for Communicating with Sound Synthesizers. Proceedings of the International Computer Music Conference. International Computer Music Association.
[12] Hunt, A., Wanderley, M. M., & Paradis, M. (2002). The importance of parameter mapping in electronic instrument design. NIME'02. Dublin.
[13] McNeill, D. Language and Gesture. Cambridge University Press, 2000.
[14] Dahl, S., & Friberg, A. Visual Perception of Expressiveness in Musicians' Body Movements. Music Perception, vol. 24 (5). University of California Press, 2007.
[15] Video of the performance: http://youtu.be/8xQ6nmxMNqs
[16] Burtner, M. The Metasaxophone: concept, implementation, and mapping strategies for a new computer music instrument. Organised Sound, Volume 7, Issue 2, pp. 201-213. Cambridge University Press, 2002.
[17] Cook, P., & Morrill, D. Hardware, software, and compositional tools for a real-time improvised solo trumpet work. Presented at the Int. Computer Music Conf. (ICMC), Columbus, OH, 1989.
[18] Schiesser, S., & Traube, C. In Proceedings of the Conference on New Instruments for Musical Expression (NIME-06), Paris, 2006.
[19] Favilla, S., Cannon, J., Hicks, T., Chant, D., & Favilla, P. Gluisax: Bent Leather Band's Augmented Saxophone Project. In Proceedings of the Conference on New Instruments for Musical Expression (NIME-08), Italy, 2008.
