« Gesture Cutting through Textual Complexity: a Model and a Tool for the Embodied Navigation of Complex Piano Notation »

29 October 2016
From 4:00 pm to 4:30 pm
ICLA (Yamanashi Gakuin University), 2-7-17 Sakaori, Kofu-shi, Yamanashi-ken 400-0805, Japan

Paper given by Pavlos ANTONIADIS (member of GREAM) as part of the symposium « Music + the Brain », at the International College of Liberal Arts of Yamanashi Gakuin University in Kofu-shi (Japan), on 29 October 2016 at 4:00 pm.

The proposed paper and demo introduce a model of embodied interaction with complex piano notation and a prototype interactive system for the gestural processing and control of musical scores.

In the first part, we present the post-Cartesian foundations of a model of embodied interaction with symbolic notation as complex as Iannis Xenakis's and Brian Ferneyhough's. The performance of such complex notation is conceptualized as embodied navigation in a non-linear space of notational affordances. These affordances are representable as annotations of the score, which takes the form of a multi-layered tablature (fig. 1). The performer moves through and between the tablature's layers and manipulates notation as if it had tangible properties, as if it formed part of the musical instrument. As opposed to the timeline of a single performance, the concept of the score as a non-linear space allows for the representation of diachronic learning processes and of interpretational variations across series of performances. The act of navigating this space is claimed to form an indispensable part of the cognitive processes involved in learning and performing: it dynamically transforms the notation as an external information-bearing structure, and it constitutes an example of mediation between symbolic signification, action-oriented descriptors and physical energy. In this sense, gesture acts as an interface for notation processing, and notation forms part of the dynamic system "body-instrument-notation" rather than of the composer's "brain in a vat". Concepts from Gibson's ecological psychology, Rowlands's externalism, Lakoff's metaphor theory, dynamic systems theory and, last but not least, Leman's embodied mediation theory are mapped onto Xenakis's and Ferneyhough's ideologies of notation and performance, offering an embodied and extended supplement, or even alternative, to traditional interpretation models.
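To make the notion of a multi-layered tablature more concrete, the sketch below shows one possible data structure: layers of affordance annotations keyed to score positions, with a navigation operation that retrieves a layer's annotations over an arbitrary region of the score rather than along a fixed performance timeline. This is a purely hypothetical illustration, not taken from the paper; all class names, fields and example labels are assumptions.

```python
# Hypothetical sketch of a multi-layered tablature: annotation layers of
# notational affordances keyed to positions in the score.
from dataclasses import dataclass, field

@dataclass
class Annotation:
    position: float   # position in the score, e.g. a bar or beat offset
    label: str        # e.g. "fingering 3-1-4", "forearm rotation", "dynamic peak"

@dataclass
class Layer:
    name: str
    annotations: list[Annotation] = field(default_factory=list)

@dataclass
class Tablature:
    layers: dict[str, Layer] = field(default_factory=dict)

    def annotate(self, layer_name: str, position: float, label: str) -> None:
        """Add an affordance annotation to a given layer."""
        layer = self.layers.setdefault(layer_name, Layer(layer_name))
        layer.annotations.append(Annotation(position, label))

    def navigate(self, layer_name: str, start: float, end: float) -> list[Annotation]:
        """Non-linear navigation: query one layer over an arbitrary region
        of the score, independently of the performance timeline."""
        layer = self.layers.get(layer_name, Layer(layer_name))
        return [a for a in layer.annotations if start <= a.position <= end]

# Example: two layers over the same passage, queried out of order.
tab = Tablature()
tab.annotate("fingerings", 12.0, "3-1-4 redistribution")
tab.annotate("gesture", 12.0, "forearm rotation")
print(tab.navigate("gesture", 10.0, 14.0))
```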

The second part proposes a technological application of the above model. It introduces a prototype interactive system for the real-time processing and control of complex piano notation through the pianist's gesture. This system, named GesTCom, draws on recent developments in the fields of computer music representation (augmented and interactive musical scores via Fober's INScore) and gesture modeling (the motionfollower by Bevilacqua and the ISMM team at IRCAM). Gestural, video, audio and MIDI data are captured, qualitatively correlated to the musical score (fig. 2) and mapped back into it, turning it into a personalized, dynamic, multimodal tablature (fig. 3). This tablature can be used for performance analysis and documentation and for learning through augmented feedback, and it can contribute to the design of interactive multimodal systems.
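As a rough illustration of the mapping step, the sketch below sends gesture-derived markers into a score displayed by INScore over OSC. It is a minimal, assumption-laden sketch, not the GesTCom implementation: it assumes INScore is listening on its default UDP port 7000, that messages of the form "/ITL/scene/<object> set ..." are accepted as in INScore's documented OSC message space, and that a Guido Music Notation file score.gmn is available; the gesture_markers data and marker names are invented for the example.

```python
# Hypothetical sketch: annotating an INScore score with gesture-derived markers over OSC.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 7000)   # assumed: INScore's default UDP port

# Load the symbolic score into the INScore scene (assumed message form).
client.send_message("/ITL/scene/score", ["set", "gmnf", "score.gmn"])

# Invented gesture-analysis output: normalized score positions paired with
# an intensity value extracted from motion capture.
gesture_markers = [(0.12, 0.8), (0.35, 0.4), (0.61, 0.95)]

for i, (pos, intensity) in enumerate(gesture_markers):
    addr = f"/ITL/scene/marker{i}"
    # Draw a small rectangle over the score, positioned and colored according
    # to the gesture data (a crude stand-in for the multimodal tablature of fig. 3).
    client.send_message(addr, ["set", "rect", 0.02, 0.1])
    client.send_message(addr, ["x", 2 * pos - 1])   # map [0,1] onto the assumed [-1,1] scene space
    client.send_message(addr, ["color", int(255 * intensity), 0, 0, 150])
```

In the system described above, such markers would come from the qualitative correlation of captured gestural, video, audio and MIDI data with the score rather than being hard-coded as here.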

In conclusion, we wish to present a performer's perspective on the osmosis between contemporary performance practice, embodied cognition and computer music interaction, by way of a theoretical model of embodied navigation of complex notation and an interactive system dedicated to it. This presentation affirms the centrality of gesture as an interface between physical energy and symbolic representations, and it hopes to contribute to the discussion concerning the ontological status of gesture and notation in a digitally mediated world.