Akamatsu, M. (n.d.). akalogue.

Bacot, B., & Féron, F.-X. (2016). The Creative Process of Sculpting the Air by Jesper Nordin: Conceiving and Performing a Concerto for Conductor with Live Electronics. Contemporary Music Review, 35(4-5), 450-474. doi:10.1080/07494467.2016.1257569

Bagenal, H. (1951). Musical Taste and Concert Hall Design. Proceedings of the Royal Musical Association, 78, 11-29. 

Bergsland, A., & Wechsler, R. (2017). Issues and Strategies of Rhythmicality for MotionComposer. Paper presented at the 4th International Conference on Movement Computing, London, United Kingdom.

Bradshaw, D., & Ng, K. (2008, November 17-19). Tracking Conductors' Hand Movements Using Multiple Wiimotes. Paper presented at the 2008 International Conference on Automated Solutions for Cross Media Content and Multi-Channel Distribution.

Brandslet, S. (2019). Digital glove returns control to conductor.

Dart, T. (1954). The interpretation of music. London.

De Prisco, R., Sabatino, P., Zaccagnino, G., & Zaccagnino, R. (2011). A Customizable Recognizer for Orchestral Conducting Gestures Based on Neural Networks. Paper presented at Applications of Evolutionary Computation, Berlin, Heidelberg.

Dick, R. (1975). The Other Flute. New York, Toronto: Oxford University Press.

Flø, A. B. (2014). Doppelgänger Exhibition.

Halmrast, T., Guettler, K., Bader, R., & Godøy, R. I. (2010). Musical gestures: Sound, movement, and meaning. New York: Routledge.

Höfer, A., Hadjakos, A., & Mühlhäuser, M. (2009). Gyroscope-Based Conducting Gesture Recognition. In: Zenodo.

Ircam. (n.d.). MuBuForMax.

Jensenius, A. R. (2014). From experimental music technology to clinical tool. In: Norges musikkhøgskole.

Karg, M., Samadani, A. A., Gorbet, R., Kühnlenz, K., Hoey, J., & Kulić, D. (2013). Body Movements for Affective Expression: A Survey of Automatic Recognition and Generation. IEEE Transactions on Affective Computing, 4(4), 341-359. doi:10.1109/T-AFFC.2013.29

Knussen, O. (1994). Oliver Knussen on Arthur Nikisch: The art of conducting – Great conductors of the past.

Kolesnik, P., & Wanderley, M. M. (2004). Recognition, Analysis and Performance with Expressive Conducting Gestures. Paper presented at the ICMC.

Lee, K., M. J. J., & Garnett, G. E. (2016). A Review of Interactive Conducting Systems: 1970-2015. Paper presented at the 42nd International Computer Music Conference (ICMC'16).

Lacroix, M. (2018, June 20-23). Deux Ex Machina: Methods, Processes and Analysis of Mixed Music. Paper presented at the Electroacoustic Music Studies Network Conference (EMS), Florence, Italy.

Lee, E., Kiel, H., Dedenbach, S., Grüll, I., Karrer, T., Wolf, M., & Borchers, J. (2006). iSymphony: An adaptive interactive orchestral conducting system for digital audio and video streams (pp. 259-262).

Marrin, T., & Picard, R. (1998). The “Conductor’s Jacket”: A Device for Recording Expressive Musical Gestures. Paper presented at the International Computer Music Conference.

Mathews, M., & Moore, F. (1970). GROOVE-a program to compose, store, and edit functions of time. Communications of the ACM, 13(12), 715-721. doi:10.1145/362814.362817

Mathews, M. V. (1991). The Radio Baton and Conductor Program, or: Pitch, the Most Important and Least Expressive Part of Music. Computer Music Journal, 15(4), 37-46. doi:10.2307/3681070

May, A. (1999). [Jupiter, Philippe Manoury]. Computer Music Journal, 23(3), 103-104.

Meyer, J., & Hansen, U. (2009). Acoustics and the performance of music: Manual for acousticians, audio engineers, musicians, architects and musical instrument makers (5th ed.). New York: Springer Science+Business Media.

Morita, H., Hashimoto, S., & Ohteru, S. (1991). A computer music system that follows a human conductor. Computer, 24(7), 44-53. doi:10.1109/2.84835

Norwegian Academy of Music. (2018). Aural Sonology: Emergent Musical Forms.

Nakra, T. M., Ivanov, Y., Smaragdis, P., & Ault, C. (2009). The UBS Virtual Maestro: An Interactive Conducting System. In: Zenodo.

Nakra, T. M., Machover, T., & Picard, R. W. (1999). Inside the “Conductor’s Jacket”: Analysis, interpretation and musical synthesis of expressive gesture (PhD dissertation). Dept. of Media Arts and Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA.

Nikoladze, K. (2018). Beat Machines ++ | Koka Nikoladze | TEDxOslo.

Ninke, W. H. (1965). Graphic 1: A remote graphical display console system. Paper presented at the Fall Joint Computer Conference, Part I, November 30-December 1, 1965, Las Vegas, Nevada.

NOTAM. (n.d.). Controller suit.

Novitz, A. (2019). Monsters I Love: On Multivocal Arts (PhD, artistic research). Stockholm University of the Arts.

Nunzio, A. D. (2013). 4X | Computer music history and more.

Oliver La Rosa, J. E. (2011). A computer music instrumentarium. University of California.

Paradiso, J. A. (1999). The Brain Opera Technology: New Instruments and Gestural Sensors for Musical Interaction and Performance. Journal of New Music Research, 28(2), 130-149. doi:10.1076/jnmr.

Peng, L., & Gerhard, D. (2009). A Wii-Based Gestural Interface For Computer Conducting Systems. In: Zenodo.

New York Philharmonic. (1999). A short history of the symphony orchestra.

Puckette, M. (1991). Something Digital. Computer Music Journal, 15(4), 65-69. doi:10.2307/3681075

Puckette, M., & Lippe, C. (1992). Score following in practice. Paper presented at the Proceedings of the International Computer Music Conference.

Roads, C. (1996). The computer music tutorial. Cambridge, Mass: MIT Press.

Rowe, R. (1992). [Interactor 4.0.8, Mark Coniglio, Morton Subotnick]. Leonardo Music Journal, 2(1), 122-123. doi:10.2307/1513229

Rowe, R. (1993). Interactive music systems: Machine listening and composing. Cambridge, MA: MIT Press.

Rowe, R. (1999). The aesthetics of interactive music systems. Contemporary Music Review, 18(3), 83-87. doi:10.1080/07494469900640361

Sarasua, A., Caramiaux, B., & Tanaka, A. (2016). Machine Learning of Personal Gesture Variation in Music Conducting. Paper presented at the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, California, USA.

Sarasúa Berodia, Á. (2017). Musical interaction based on the conductor metaphor (E. Gómez Gutiérrez & E. Guaus, Eds.). Universitat Pompeu Fabra, Departament de Tecnologies de la Informació i les Comunicacions.

Schramm, R., Jung, C. R., & Miranda, E. R. (2015). Dynamic Time Warping for Music Conducting Gestures Evaluation. IEEE Transactions on Multimedia, 17(2), 243-255. doi:10.1109/TMM.2014.2377553

Sonami, L. (n.d.). Laetitia Sonami.

Stenslie, S. (n.d.). EROTOGOD.

Visentin, P., Staples, T., Wasiak, E. B., & Shan, G. (2010). A Pilot Study on the Efficacy of Line-of-Sight Gestural Compensation While Conducting Music. Perceptual and Motor Skills, 110(2), 647-653.

Waisvisz, M. (1985). The Hands: A Set of Remote MIDI-Controllers. Paper presented at the International Computer Music Conference (ICMC), Burnaby, BC, Canada.

Wallin, R. (n.d.). Cabinet of Curiosities.

Wechsler, R. (n.d.). YOU MAKE THE MUSIC WE MAKE YOU MOVE.