Interaction & Performance: Bibliography

A

Alexandraki, C., and D. Akoumianakis. 2010. “Exploring New Perspectives in Network Music Performance: The DIAMOUSES Framework.” Computer Music Journal 34(2): 66-83.

Andean, J. 2011. “Ecological Psychology and the Electroacoustic Concert Context.” Organised Sound 16(2): 125-133.

Anderson, S. 2008. “Microsound in Public Space: Compositional Methods to Enhance Site-Specific Sound.” Organised Sound 13(1): 51-60.

Arfib, D., and L. Kessous. 2000. “From 'Music V' to 'Creative Gestures in Computer Music'.” Accessed at http://gsd.ime.usp.br/sbcm/2000/papers/arfib.pdf

B

Balin, W., and J. Loviscach. 2011. “Gestures to Operate DAW Software.” Proceedings of the 130th Audio Engineering Society Convention. New York: Audio Engineering Society. Paper no. 8456.

Barbosa, A., and M. Kaltenbrunner. 2002. “Public Sound Objects: A Shared Musical Space on the Web.” Proceedings of the Second International Conference on Web Delivering of Music.

Bayreuther, R. 2007. “Contemporary Composition as Musical Disciplina: The Example of Sound Installations of Robin Minard.” Musiktheorie 22: 357-364.

Belet, B. 2003. “Live Performance Interaction for Humans and Machines in the Early Twenty-First Century: One Composer's Aesthetics for Composition and Performance Practice.” Organised Sound 8: 305.

Bertini, G., and P. Carosi. 1992. “The Light Baton: A System for Conducting Computer Music Performance.” Proceedings of the International Computer Music Conference, San Jose, CA, pp. 73-76. San Francisco: International Computer Music Association.

Black, D., K. Gohlke, and J. Loviscach. 2010. “Foley Sonic: Placing Sounds on a Timeline Through Gestures.” Proceedings of the 128th Audio Engineering Society Convention. New York: Audio Engineering Society. Paper no. 8004.

Blaine, T., and T. Perkis. 2000. “The Jam-O-Drum Interactive Music System: A Study in Interaction Design.” Proceedings of the 3rd Conference on Designing Interactive Systems (DIS '00). New York: ACM, pp. 165-173.

Boltz, M. G., B. Ebendorf, and B. Field. 2009. “Audiovisual Interactions: The Impact of Visual Information on Music Perception and Memory.” Music Perception 27(1): 43-59.

Bongers, B. 1994. “The Use of Active Tactile and Force Feedback in Timbre Controlling Electronic Instruments.” Proceedings of the 1994 International Computer Music Conference, pp. 171-174.

Bongers, B. 1998. “An Interview with Sensorband.” Computer Music Journal 22(1): 13-24.

Bown, O., A. Eldridge, and J. McCormack. 2009. “Understanding Interaction in Contemporary Digital Music: From Instruments to Behavioural Objects.” Organised Sound 14(2): 188-196.

Brown, A. R., and A. Sorensen. 2009. “Interacting with Generative Music through Live Coding.” Contemporary Music Review 28: 17-29.

Brown, N. G. 2011. “Being Among the Living: On the Interrelations between Performers, Witnesses and Environment in As I Have Now Memoyre.” Organised Sound 16(2): 176-183.

Buxton, W., et al. 1979. “A Computer-Based System for the Performance of Electroacoustic Music.” Proceedings of the 64th Audio Engineering Society Convention. New York: Audio Engineering Society. Paper no. 1529.

C

Camurri, A., et al. 1987. “Interactions between Music and Movement: A System for Music Generation from 3D Animations.” Proceedings of the Fourth International Conference on Event Perception and Action, Trieste, Italy.

Camurri, A. 1995. “Interactive Dance/Music Systems.” Proceedings of the 1995 International Computer Music Conference, pp. 245-252.

Camurri, A. 1999. “Music Content Processing and Multimedia: Case Studies and Emerging Applications of Intelligent Interactive Systems.” Journal of New Music Research 28: 351-363.

Camurri, A., et al. 2000. “EyesWeb: Toward Gesture and Affect Recognition in Interactive Dance and Music Systems.” Computer Music Journal 24(1): 57.

Canazza, S., et al. 2003. “An Abstract Control Space for Communication of Sensory Expressive Intentions in Music Performance.” Journal of New Music Research 32(3): 281-294.

Cascella, D. 2008. “Carl Michael von Hausswolff.” Organised Sound 13(1): 21-29.

Cerana, C. 2000. “Gesture Control of Musical Processes: A MAX Environment for Buchla's Lightning.” Organised Sound 5: 3-7.

Choi, I. 1998. “From Motion to Emotion: Synthesis of Interactivity with Gestural Primitives.” Emotional and Intelligent: The Tangled Knot of Cognition, AAAI Fall Symposium, Orlando, FL (October): 22-25.

Coulter, J. 2010. “Electroacoustic Music with Moving Images: The Art of Media Pairing.” Organised Sound 15(1): 26-34.

Cronkite, D. 2000. “Mediating Gesture: An Interactive Approach to Computer-Assisted Music.” Musicworks: Explorations in Sound 78: 20.

Cseres, J. 2009. “In Between as a Permanent Status: Milan Adamciak's Version of Intermedia.” Leonardo Music Journal 19: 31-34.

D

Dipper, G. 2009. “Interactive Interfaces: Installations Produced at the ZKM | IMA.” Organised Sound 14(3): 286-298.

Downes, P. 1987. “Motion Sensing in Music and Dance Performance.” Proceedings of the 5th International Audio Engineering Society Conference. New York: Audio Engineering Society. Paper no. 105.

Drummond, J. 2009. “Understanding Interactive Systems.” Organised Sound 14(2): 124-133.

E

Eigenfeldt, A. 2011. “Real-time Composition as Performance Ecosystem.” Organised Sound 16(2): 145-153.

Essl, G., and M. Rohs. 2009. “Interactivity for Mobile Music-Making.” Organised Sound 14(2): 197-207.

F

Ferguson, S., and M. M. Wanderley. 2009. “The McGill Digital Orchestra: Interdisciplinarity in Digital Musical Instrument Design.”

Fléty, E. 2000. “3D Gesture Acquisition Using Ultrasonic Sensors.” Trends in Gestural Control of Music, pp. 193-208.

Frank, M., L. Mehr, A. Sontacchi, and F. Zotter. 2010. “Flexible and Intuitive Pointing Method for 3D Auditory Localization Experiments.” Proceedings of the 38th International Audio Engineering Society Conference. New York: Audio Engineering Society.

Frengel, M. 2010. “A Multidimensional Approach to Relationships between Live and Non-live Sound Sources in Mixed Works.” Organised Sound 15(2): 96-106.

Friberg, A. 1991. “Generative Rules for Music Performance: A Formal Description of a Rule System.” Computer Music Journal 15(2): 56-71.

Friberg, A., et al. 1991. “Performance Rules for Computer-Controlled Contemporary Keyboard Music.” Computer Music Journal 15(2): 49-55.

G

Garnett, G. E. 2001. “The Aesthetics of Interactive Computer Music.” Computer Music Journal 25(1): 21-33.

Gibet, S., and P.-F. Marteau. 1990. “Gestural Control of Sound Synthesis.” Proceedings of the International Computer Music Conference, Glasgow, UK, pp. 387-391. San Francisco: International Computer Music Association.

Gluck, R. J. 2005. “Sounds of a Community: Cultural Identity and Interactive Art.” Leonardo Music Journal 15: 37-43.

Goldstein, M. 1998. “Gestural Coherence and Musical Interaction Design.” Proceedings of the IEEE Systems, Man and Cybernetics Conference, San Diego, CA.

Goudeseune, C. 2002. “Interpolated Mappings for Musical Instruments.” Organised Sound 7(2): 85-96.

Green, O. 2011. “Agility and Playfulness: Technology and Skill in the Performance Ecosystem.” Organised Sound 16(2): 134-144.

Grollmisch, S., E. Cano Ceron, and C. Dittmar. 2011. “Songs 2 See: Learn to Play by Playing.” Proceedings of the 131st Audio Engineering Society Convention. New York: Audio Engineering Society, pp. 2-3.

Grossmann, R. 2008. “The Tip of the Iceberg: Laptop Music and the Information-Technological Transformation of Music.” Organised Sound 13(1): 5-11.

Gurevich, M., and A. C. Fyans. 2011. “Digital Musical Interactions: Performer-System Relationships and Their Perception by Spectators.” Organised Sound 16(2): 166-175.

H

Harada, T., A. Sato, S. Hashimoto, and S. Ohteru. 1992. “Real Time Control of 3D Sound Space by Gesture.” Proceedings of the 1992 International Computer Music Conference. San Francisco: International Computer Music Association, pp. 85-88.

Hashimoto, S., and H. Sawada. 2005. “A Grasping Device to Sense Hand Gesture for Expressive Sound Generation.” Journal of New Music Research 34(1): 115.

Hoadley, R. 2010. “Implementation and Developments of Interfaces for Music Performance Through Analysis of Improvised Dance Movements.” Proceedings of the 128th Audio Engineering Society Convention. New York: Audio Engineering Society. Paper no. 8002.

Hunt, A., M. Wanderley, and M. Paradis. 2002. “The Importance of Parameter Mapping in Electronic Instrument Design.” Proceedings of the 2002 Conference on New Interfaces for Musical Expression. Singapore: National University of Singapore, pp. 1-6.

Hunt, A., and M. Wanderley. 2002. “Mapping Performer Parameters to Synthesis Engines.” Organised Sound 7(2): 97-108.

I

J

Jean, F., and A. B. Albu. 2008. “The Visual Keyboard: Real-Time Feet Tracking for the Control of Musical Meta-Instruments.” Signal Processing: Image Communication 23(7): 505-515.

Johnston, A., L. Candy, and E. Edmonds. 2008. “Designing and Evaluating Virtual Musical Instruments: Facilitating Conversational User Interaction.” Design Studies 29(6): 556-571.

Jordanous, A., and A. Smaill. 2009. “Investigating the Role of Score Following in Automatic Musical Accompaniment.” Journal of New Music Research 38(2): 197-209.

K

Kapur, A., et al. 2005. “Wearable Sensors for Real-Time Musical Signal Processing.” Proceedings of the 2005 IEEE Pacific Rim Conference on Communications, Computers and Signal Processing.

Karjalainen, M., T. Maki-Patola, A. Kanerva, and A. Huovilainen. 2006. “Virtual Air Guitar.” Journal of the Audio Engineering Society 54(10): 964-980.

Katayose, H., and S. Inokuchi. 1993. “Learning Performance Rules in a Music Interpretation System.” Computers and the Humanities 27(1): 31-40.

Katayose, H., and K. Okudaira. 2004. “iFP: A Music Interface Using an Expressive Performance Template.” Entertainment Computing - ICEC 2004. Lecture Notes in Computer Science 3166: 529-540.

Kim-Boyle, D. 2009. “Network Musics: Play, Engagement and the Democratization of Performance.” Contemporary Music Review 28(4-5): 363-375.

Kirlik, A., and S. Maruyama. 2004. “Human-Technology Interaction and Music Perception and Performance: Toward the Robust Design of Sociotechnical Systems.” Proceedings of the IEEE 92(4): 616-631.

Krefeld, V. 1990. “The Hand in the Web: An Interview with Michel Waisvisz.” Computer Music Journal 14(2): 28-33.

L

Laird, I., D. Murphy, R. Chapman, and S. Jouan. 2011. “Development of a Virtual Performance Studio with Application of Virtual Acoustic Recording Methods.” Proceedings of the 130th Audio Engineering Society Convention. New York: Audio Engineering Society. Paper no. 8358.

Lee, M., and D. Wessel. 1992. “Connectionist Models for Real-Time Control of Synthesis and Compositional Algorithms.” Proceedings of the International Computer Music Conference, San Jose, CA, pp. 277-280. San Francisco: International Computer Music Association.

Lehrman, P. D. 2009. “The Wii Remote as a Musical Instrument: Technology and Case Studies.” Proceedings of the 127th Audio Engineering Society Convention. New York: Audio Engineering Society. Paper no. 7888.

Leman, M., and A. Camurri. 2005. “Understanding Musical Expressiveness Using Interactive Multimedia Platforms.” Musicae Scientiae: 209-233.

Loviscach, J. 2007. “Music at Your Fingertips: An Electrotactile Fader.” Proceedings of the 123rd Audio Engineering Society Convention. New York: Audio Engineering Society. Paper no. 7306.

Lusted, H. S., and R. B. Knapp. 1996. “Controlling Computers with Neural Signals.” Scientific American (October): 58-63.

M

Machover, T., and J. Chung. 1989. “Hyperinstruments: Musically Intelligent and Interactive Performance and Creativity Systems.” Proceedings of the International Computer Music Conference, Columbus, OH. San Francisco: International Computer Music Association.

Maes, M., et al. 1996. “Dance-Music Interface Based on Ultrasound Sensors and Computers.” Proceedings of the 3rd Brazilian Symposium on Computer Music. Accessed at http://recherche.ircam.fr/equipes/analyse-synthese/wanderle/Gestes/Externe/sbcm96.pdf

Maestre, E., et al. 2007. “Acquisition of Violin Instrumental Gestures Using a Commercial EMF Tracking Device.” Proceedings of the 2007 International Computer Music Conference, Copenhagen, Denmark, pp. 386-393.

Magnusson, T. 2009. “Of Epistemic Tools: Musical Instruments as Cognitive Extensions.” Organised Sound 14(2): 168-176.

Malloch, J., D. Birnbaum, E. Sinyor, and M. M. Wanderley. 2006. “Towards a New Conceptual Framework for Digital Musical Instruments.” Proceedings of the 9th International Conference on Digital Audio Effects. Montreal, Canada, pp. 49-52.

Malloch, J., S. Sinclair, and M. M. Wanderley. 2008. “A Network-Based Framework for Collaborative Development and Performance of Digital Musical Instruments.” Computer Music Modeling and Retrieval: Sense of Sounds, pp. 401-405.

McAlpine, K., M. Bett, and J. Scanlan. 2009. “Approaches to Creating Real-Time Adaptive Music in Interactive Entertainment: A Musical Perspective.” Proceedings of the 35th International Audio Engineering Society Conference. New York: Audio Engineering Society.

Merrill, D., H. Raffle, and R. Aimi. 2008. “The Sound of Touch: Physical Manipulation of Digital Sound.” Proceedings of CHI 2008, Florence, Italy, pp. 739-742.

Mion, L., G. D'Inca, A. de Gotzen, and E. Rapana. 2010. “Modeling Expression with Perceptual Audio Features to Enhance User Interaction.” Computer Music Journal 34(1): 65-79.

Mulder, A. 1991. “Viewing Dance as Instrumental to Music.” Interface 4(2): 15-17. Columbus, OH: ACCAD, Ohio State University.

Mulder, A. 1994. “Virtual Musical Instruments: Accessing the Sound Synthesis Universe as a Performer.” Proceedings of the First Brazilian Symposium on Computer Music, Caxambu, Minas Gerais, Brazil, pp. 243-250.

Mulder, A. 1996. “Hand Gestures for HCI.” Technical Report 96-1. Simon Fraser University.

Mulder, A. 2000. “Toward a Choice of Gestural Constraints for Instrumental Performers.” Trends in Gestural Control of Music, pp. 315-335.

Murray-Browne, T., et al. 2010. “The Serendiptichord: A Wearable Instrument for Contemporary Dance Performance.” Proceedings of the 128th Audio Engineering Society Convention. New York: Audio Engineering Society. Paper no. 8139.

N

Nagashima, Y. 1998. “Biosensorfusion: New Interfaces for Interactive Multimedia Art.” Proceedings of the International Computer Music Conference. San Francisco: International Computer Music Association, pp. 129-132.

Newton, D., and M. T. Marshall. 2011. “The Augmentalist: Enabling Musicians to Develop Augmented Musical Instruments.” Proceedings of the 5th International Conference on Tangible, Embedded and Embodied Interaction, pp. 249-252.

Ng, K. C. 2004. “Music via Motion: Transdomain Mapping of Motion and Sound for Interactive Performances.” Proceedings of the IEEE 92(4): 645-655.

O

Obrenovic, Z. 2005. “A Flexible System for Creating Music While Interacting with the Computer.” Proceedings of the 13th Annual ACM International Conference on Multimedia. New York: ACM, pp. 996-1004.

O'Sullivan, L., and F. Boland. 2011. “Visualizing and Controlling Sound with Graphical Interfaces.” Proceedings of the 131st Audio Engineering Society Convention. New York: Audio Engineering Society, pp. 2-4.

Overholt, D., E. Berdahl, and R. Hamilton. 2011. “Advancements in Actuated Musical Instruments.” Organised Sound 16(2): 154-165.

Overholt, D., J. Thompson, L. Putnam, B. Bell, J. Kleban, B. Sturm, and J. Kuchera-Morin. 2009. “A Multimodal System for Gesture Recognition in Interactive Music Performance.” Computer Music Journal 33(4): 69-82.

P

Paine, G. 2009. “Gesture and Morphology in Laptop Music Performance.” In R. T. Dean, ed. The Oxford Handbook of Computer Music and Digital Sound Culture. Oxford: Oxford University Press, pp. 299-329.

Paine, G. 2009. “Towards Unified Design Guidelines for New Interfaces for Musical Expression.” Organised Sound 14(2): 142-155.

Paradiso, J. 1999. “The Brain Opera Technology: New Instruments and Gestural Sensors for Musical Interaction and Performance.” Journal of New Music Research 28(2): 130.

Paradiso, J., and S. Ó Modhrain. 2003. “Current Trends in Electronic Music Interfaces.” Journal of New Music Research 32(4): 345.

Petersen, K., J. Solis, and A. Takanishi. 2008. “Development of a Real-Time Instrument Tracking System for Enabling the Musical Interaction with the Waseda Flutist Robot.” Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Nice, France, pp. 313-318.

Petersen, K., J. Solis, and A. Takanishi. 2009. “Development of an Aural Real-Time Rhythmical and Harmonic Tracking to Enable the Musical Interaction with the Waseda Flutist Robot.” Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, St. Louis, MO, pp. 2303-2308.

Petersen, K., J. Solis, and A. Takanishi. 2010. “Musical-Based Interaction System for the Waseda Flutist Robot.” Autonomous Robots 28(4): 471-488.

Pope, S. T. 1993. “Real-Time Performance via User Interfaces to Musical Structures.” Interface 22(3): 195-212.

Pressing, J. 1990. “Cybernetic Issues in Interactive Performance Systems.” Computer Music Journal 14(1): 12-25.

Putnam, W. 1993. “The Use of the Electromyogram for the Control of Musical Performance.” Master's thesis, San Jose State University, CA.

Q

R

Rasamimanana, N., et al. 2011. “Modular Musical Objects Towards Embodied Control of Digital Music.” Proceedings of the International Conference on Tangible, Embedded and Embodied Interaction, pp. 9-12.

Richards, J. 2008. “Getting the Hands Dirty.” Leonardo Music Journal 18: 25-31.

Risset, J., and S. van Duyne. 1996. “Real-Time Performance Interaction with a Computer-Controlled Acoustic Piano.” Computer Music Journal 20(1): 62-75.

Rovan, J., M. M. Wanderley, S. Dubnov, and P. Depalle. 1997. “Instrumental Gesture Mapping Strategies as Expressivity Determinants in Computer Music Performance.” Presented at the KANSEI - The Technology of Emotion Workshop, Genova, Italy.

Rowe, R. 1993. Interactive Music Systems: Machine Listening and Composing. Cambridge, MA: MIT Press.

Rubine, D., and P. McAvinney. 1990. “Programmable Finger Tracking Instrument Controllers.” Computer Music Journal 14(1): 26-41.

S

Sapir, S. 2002. “Gestural Control of Digital Audio Environments.” Journal of New Music Research 31(2): 119.

Schedel, M., and A. Rootberg. 2009. “Generative Techniques in Hypermedia Performance.” Contemporary Music Review 28(1): 57-73.

Schumacher, M., and J. Bresson. 2010. “Spatial Sound Synthesis in Computer-Aided Composition.” Organised Sound 15(3): 271-289.

Selfridge, R., and J. D. Reiss. 2011. “Interactive Mixing Using Wii Controller.” Proceedings of the 130th Audio Engineering Society Convention. New York: Audio Engineering Society. Paper no. 8396.

Spasov, M. 2011. “Music Composition as an Act of Cognition: ENACTIV - Interactive Multi-modal Composing System.” Organised Sound 16(1): 69-86.

Suzuki, K., and S. Hashimoto. 2004. “Robotic Interface for Embodied Interaction via Dance and Musical Performance.” Proceedings of the IEEE 92(4): 656-671.

T

Tanaka, A. 1993. “Musical Technical Issues in Using Interactive Instrument Technology with Applications to the BioMuse.” Proceedings of the 1993 International Computer Music Conference, pp. 124-126.

Tanaka, A. 2000. “Musical Performance Practice on Sensor-Based Instruments.” Trends in Gestural Control of Music, pp. 389-406.

Tindale, A. R., A. Kapur, G. Tzanetakis, and W. A. Schloss. 2005. “Indirect Acquisition of Percussion Gestures Using Timbre Recognition.” Proceedings of the Conference on Interdisciplinary Musicology, Montreal, Canada.

Tobenfeld, E. 1992. “A System for Computer Assisted Gestural Improvisation.” Proceedings of the 1992 International Computer Music Conference. Accessed at http://quod.lib.umich.edu/cgi/p/pod/dod-idx?c=icmc;idno=bbp2372.1992.025

U

Ungvary, T., and R. Vertegaal. 2000. “Designing Musical Cyberinstruments with Body and Soul in Mind.” Journal of New Music Research 29(3): 245-255.

V

Vertegaal, R., T. Ungvary, and M. Kieslinger. 1996. “Towards a Musician's Cockpit: Transducers, Feedback and Musical Function.” Proceedings of the International Computer Music Conference, pp. 308-311.

W

Wanderley, M. M. 2001. “Performer-Instrument Interaction: Applications to Gestural Control of Sound Synthesis.” Ph.D. dissertation, Université Pierre et Marie Curie (Paris VI), Paris, France.

Wanderley, M., and N. Orio. 2002. “Evaluation of Input Devices for Musical Expression: Borrowing Tools from HCI.” Computer Music Journal 26(3): 62-76.

Weinberg, G. 2005. “Voice Networks: The Human Voice as a Creative Medium for Musical Collaboration.” Leonardo Music Journal 15: 23-26.

Weinberg, G., and T. Thatcher. 2006. “Interactive Sonification: Aesthetics, Functionality and Performance.” Leonardo Music Journal 16: 9-12.

Whalley, I. 2009. “Software Agents in Music and Sound Art Research/Creative Work: Current State and a Possible Direction.” Organised Sound 14(2): 156-167.

X

Y

Yamaguchi, T., and S. Hashimoto. 2009. “Grasping Interface with Photo Sensor for a Musical Instrument.” Proceedings of the 13th International Conference on Human-Computer Interaction, Part II: Novel Interaction Methods and Techniques, San Diego, CA, pp. 542-547.

Z

Zannos, I., P. Modler, and K. Naoi. 1997. “Gesture Controlled Music Performance in a Real-Time Network.” Proceedings of the KANSEI - The Technology of Emotion Workshop, pp. 60-63.