Mapping: Introductory References


Mapping is the connection between a set of extracted musical/physical gestures and a set of control parameters of a sound synthesis algorithm (Verfaille). The word “mapping” has its roots in mathematics, where it denotes a particular kind of function. When creating a digital music instrument (DMI), a significant, if not the most important, task is determining the mapping. This does not mean that mappings must be complicated to be successful: some DMIs may need a mapping with a clear correlation between the musical/physical gestures and the resulting sound, while the aesthetic of other DMIs may call for a deliberately unintuitive mapping.

There are three main kinds of mapping:

  1. one-to-one (such as volume or panning potentiometer on a controller)
  2. mappings between unequal numbers of parameters, which include
    • Convergent Mapping (many-to-one)
    • Divergent Mapping (one-to-many)
  3. many-to-many
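These kinds of mapping can be illustrated with a minimal sketch. The gesture inputs and synthesis parameter names below are hypothetical, chosen only to make the three shapes concrete; they are not taken from any of the cited systems.

```python
# Minimal sketch of the main mapping kinds, using hypothetical
# gesture inputs (0.0-1.0) and synthesis parameters.

def one_to_one(fader):
    """One gesture controls exactly one synthesis parameter."""
    return {"volume": fader}

def convergent(pressure, velocity):
    """Many-to-one: several gestures combine into one parameter."""
    return {"brightness": 0.7 * pressure + 0.3 * velocity}

def divergent(bow_speed):
    """One-to-many: a single gesture drives several parameters."""
    return {"volume": bow_speed,
            "vibrato_rate": 4.0 + 2.0 * bow_speed}

print(one_to_one(0.5))   # {'volume': 0.5}
print(divergent(0.5))    # {'volume': 0.5, 'vibrato_rate': 5.0}
print(convergent(0.8, 0.4))
```

A many-to-many mapping would simply compose these shapes: several gesture inputs feeding several parameters through shared intermediate functions.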

Many acoustic instruments can serve as analogies for convergent and divergent mappings. A violin bow controls timbre, pitch (selection of string), and volume; conversely, the volume is not controlled by one parameter alone (Hunt). This knowledge is useful if the intended goal of a DMI is to build on gestures already known to acoustic instrumentalists. Mapping can be explicit, e.g. designed from mathematical models, or implicit, as in the “black box” model (Verfaille 4).
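The explicit/implicit distinction can be sketched in code. In this hypothetical example (not Verfaille's formulation), an explicit mapping is a formula written by the designer, while an implicit mapping is inferred from example gesture/parameter pairs; a one-variable least-squares fit stands in for the "black box", which in practice is often a neural network.

```python
# Hypothetical sketch: explicit vs. implicit mapping.

def explicit_mapping(pressure):
    """Explicit: the designer writes the rule down as a formula."""
    return 100.0 + 900.0 * pressure  # e.g. a filter cutoff in Hz

# Implicit: (pressure, cutoff) examples collected from a performer.
examples = [(0.0, 100.0), (0.5, 560.0), (1.0, 1010.0)]

# Least-squares fit of cutoff = intercept + slope * pressure.
n = len(examples)
mean_x = sum(x for x, _ in examples) / n
mean_y = sum(y for _, y in examples) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in examples)
         / sum((x - mean_x) ** 2 for x, _ in examples))
intercept = mean_y - slope * mean_x

def implicit_mapping(pressure):
    """Implicit: the rule was recovered from data, not designed."""
    return intercept + slope * pressure
```

The trade-off the literature discusses follows directly: the explicit rule is legible and editable, while the implicit one can capture relationships the designer never articulated.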


Open research topics in mapping include:

  • Strategies for interpolation in n-to-m mappings where n ≠ m
  • Visualization of mappings
  • Interfaces for creating mappings
  • Implicit vs. explicit mappings
  • Techniques for generating implicit mappings (machine learning)
  • Complexity of mappings: insufficient control vs. cognitive load
  • Limitations of the term “mapping” vs. the importance of mapping

Reading Suggestion

Here's a list of basic references on mapping selected by the subgroup's coordinators with input from participants. They are mostly related to general issues of mapping controller variables to synthesis variables in real-time performance systems. Other references are indicated below as extra material for the interested reader. A complete bibliography is available from the mapping subgroup's bibliography page. As an initial reading we suggest the paper "Towards a Model for Instrumental Mapping in Expert Musical Interaction": it is reasonably short, reviews various previous works, and calls for mapping to be studied as a subject in its own right.

Basic References

  • Bowler, I., A. Purvis, P. Manning, and N. Bailey. 1990. “On Mapping N Articulation onto M Synthesiser-Control Parameters.” In Proceedings of the 1990 International Computer Music Conference. San Francisco, International Computer Music Association, pp. 181-184.
  • Favilla, S. 1997. “Real-time Control of Synthesis Parameters for LightHarp MIDI Controller.” In Proceedings of the 1997 ACMA Conference. Auckland, New Zealand.
  • Garnett, G., and C. Goudeseune. 1999. “Performance Factors in Control of High-Dimensional Spaces.” In Proceedings of the 1999 International Computer Music Conference. San Francisco, International Computer Music Association, pp. 268-271.
  • Hunt, A., and R. Kirk. 2000. “Mapping Strategies for Musical Performance.” In M. Wanderley and M. Battier, eds. Trends in Gestural Control of Music. Ircam - Centre Pompidou.
  • Hunt, A., M. Wanderley, and R. Kirk. 2000. “Towards a Model for Instrumental Mapping in Expert Musical Interaction.” In Proceedings of the 2000 International Computer Music Conference. San Francisco, International Computer Music Association, pp. 209-212.
  • Kramer, G. 1996. “Mapping a single data stream to multiple auditory variables: A subjective approach to creating a compelling design.” In Proceedings of International Conference on Auditory Display - ICAD'96.
  • Mulder, A., S. Fels, and K. Mase. 1997. “Mapping Virtual Object Manipulation to Sound Variation.” In T. Rai and R. Basset, eds. IPSJ SIG notes 97(122):63-68.
  • Rovan, J., M. Wanderley, S. Dubnov, and P. Depalle. 1997. “Instrumental Gestural Mapping Strategies as Expressivity Determinants in Computer Music Performance.” Kansei, The Technology of Emotion. Proceedings of the AIMI International Workshop, A. Camurri, ed. Genoa: Associazione di Informatica Musicale Italiana, October 3-4, 1997, pp. 68-73.
  • Winkler, T. 1995. “Making Motion Musical: Gestural Mapping Strategies for Interactive Computer Music.” In Proceedings of the 1995 International Computer Music Conference. San Francisco, International Computer Music Association, pp. 261-264.

Complementary References

  • Choi, I., R. Bargar, and C. Goudeseune. 1995. “A Manifold Interface for a High Dimensional Control Space.” In Proceedings of the 1995 International Computer Music Conference. San Francisco, International Computer Music Association, pp. 385-392.
  • Favilla, S. 1996. “Non-Linear Controller Mapping for Gestural Control of the Gamaka.” In Proceedings of the 1996 International Computer Music Conference. San Francisco, International Computer Music Association, pp. 89-92.
  • Fels, S. 1994. Glove-Talk II: Mapping Hand Gestures to Speech Using Neural Networks - An Approach to Building Adaptive Interfaces. PhD Thesis, University of Toronto, Canada.
  • Goudeseune, C. 2001. Composing with Parameters for Synthetic Instruments. PhD Thesis, University of Illinois at Urbana-Champaign.
  • Hunt, A. 1999. Radical User Interfaces for Real-time Musical Control. DPhil Thesis, University of York, UK.
  • Lee, M., and D. Wessel. 1992. “Connectionist Models for Real-Time Control of Synthesis and Compositional Algorithms.” In Proceedings of the 1992 International Computer Music Conference. San Francisco, International Computer Music Association, pp. 277-280.
  • Marrin Nakra, T. 2000. Inside the Conductor's Jacket: Analysis, Interpretation and Musical Synthesis of Expressive Gestures. PhD Thesis. MIT Media Lab.
  • Metois, E. 1996. Musical Sound Information: Musical Gestures and Embedding Systems. PhD Thesis. MIT Media Lab.
  • Modler, P. 1998. “Interactive Control of Musical Structures by Hand Gestures.” In Proceedings of the Fifth Brazilian Symposium on Computer Music, pp. 143-150.
  • Mulder, A. 1998. Design of Virtual Three-Dimensional Instruments for Sound Control. PhD Thesis. Burnaby, BC, Canada: Simon Fraser University.
  • Rovan, J., and R. Wechsler. 2000. “The Multi-Dimensional Mapping of Movement-to-Sound within an Integrated Eyecon and Max Environment: Performance of ‘Seine Hohle Form …’.” In Proceedings of the 10e Symposium International des Arts Electroniques, ISEA 2000. Paris.
  • Wanderley, M., N. Schnell, and J. B. Rovan. 1998. “Escher - Modeling and Performing Composed Instruments in Real-Time.” In Proceedings of the 1998 IEEE International Conference on Systems, Man and Cybernetics (SMC'98), pp. 1080-1084.
  • Wanderley, M. 2001. Performer-Instrument Interaction: Applications to Gestural Control of Music. PhD Thesis. Paris, France: University Pierre et Marie Curie - Paris VI.
  • Wessel, D. 1979. “Timbre Space as a Musical Control Structure.” Computer Music Journal 3(2):45-52.
  • Wright, M., A. Freed, A. Lee, T. Madden, and A. Momeni. 2001. “Managing Complexity with Explicit Mapping of Gestures to Sound Control in OSC.” In Proceedings of the 2001 International Computer Music Conference. San Francisco, International Computer Music Association.

More references are available on the bibliography page.

Special Issue

The Volume 7, Number 2 issue of Organised Sound was dedicated to mapping strategies in real-time computer music and guest edited by Marcelo Wanderley. A collection of ten articles describes several approaches to mapping and gives a general overview of the research in this field:

  • Wanderley, M. 2002. “Mapping Strategies in Interactive Computer Music.” Organised Sound, 7(2):83-84.
  • Goudeseune, C. 2002. “Interpolated Mappings for Musical Instruments.” Organised Sound, 7(2):85-96.
  • Hunt, A., and M. M. Wanderley. 2002. “Mapping Performer Parameters to Synthesis Engines.” Organised Sound, 7(2):97-108.
  • Fels, S., A. Gadd, and A. Mulder. 2002. “Mapping Transparency through Metaphor: towards more expressive musical instruments.” Organised Sound, 7(2):109-126.
  • Arfib, D., J. M. Couturier, L. Kessous, and V. Verfaille. 2002. “Strategies of Mapping between Gesture data and Synthesis Model Parameters using Perceptual Spaces.” Organised Sound, 7(2):127-144.
  • Doornbusch, P. 2002. “Composers' Views on Mapping in Algorithmic Composition.” Organised Sound, 7(2):145-156.
  • Myatt, T. 2002. “Strategies for Interaction in Construction 3.” Organised Sound, 7(2):157-169.
  • Levitin, D., S. McAdams, and R. Adams. 2002. “Control Parameters for Musical Instruments: a foundation for new mappings of gesture to sound.” Organised Sound, 7(2):171-189.
  • Ng, K. 2002. “Sensing and Mapping for Interactive Performance.” Organised Sound, 7(2):191-200.
  • Burtner, M. 2002. “The Metasaxophone: concept, implementation, and mapping strategies for a new computer music instrument.” Organised Sound, 7(2):201-213.
  • Nichols, C. 2002. “The vBow: a virtual violin bow controller for mapping gesture to synthesis with haptic feedback.” Organised Sound, 7(2):215-220.

Importance vs. Limitations of Mapping

Of related interest, the NIME02 conference presented two (conflicting?) articles on mapping: the first by Joel Chadabe, "The Limitations of Mapping as a Structural Descriptive in Electronic Instruments", and the second by Andy Hunt, Marcelo Wanderley, and Matt Paradis, "The Importance of Parameter Mapping in Electronic Instrument Design".