The “Т|А” Project: the Duo of the Human Being and the Computer

Alexei V. Krasnoskulov

Abstract

The aim of the “Т|А” project is to investigate the potential of a computer system as a “virtual” performing musician
(“agent”): its capacity to create and transform musical material under artistic and procedural conditions based on
the perception and analysis of the sounding realization of an artistic conception and on the outward expression of
emotions by the human being, the “real” performer. Software created especially for the project realizes an interactive
duo of the “real” and the “virtual” musicians, in which the latter perceives the sound of the human performer’s part
and, by means of a genetic algorithm, assembles it into the sonic “landscape” of its own musical part. The “real”
musician directs the “agent’s” performance by changing the emotions expressed on his or her face. Each change of
emotional state is reflected in an adjustment of the timbral and reverberation characteristics of the sound elements
used by the computer system and in a transformation of the sound of the entire “virtual” part. Drawing both on
scholarly studies of the correlations between sounds of particular pitch and/or timbre and the emotional states they
evoke, and on auditory tests carried out within the project itself, the article presents a scheme correlating the basic
emotions with the frequency and spatial parameters of sound. Using two specific musical compositions as examples,
it describes the algorithmic and creative processes of human and machine and discusses the problems and prospects
of this kind of interactive ensemble.
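The article does not publish the project’s code. As a minimal, purely illustrative sketch of the two mechanisms the abstract describes — a genetic algorithm that assembles the “agent’s” part from the material it hears, and a mapping from a recognized emotion to timbral and reverberation parameters — one might imagine something like the following. All names, numeric values, and emotion-to-parameter pairings here are assumptions for illustration, not the project’s actual scheme:

```python
import random

# Hypothetical emotion -> sound-parameter map (illustrative values only;
# the article derives its own correlation scheme from auditory tests).
EMOTION_PARAMS = {
    "joy":     {"brightness": 0.9, "reverb_time": 0.8},
    "sadness": {"brightness": 0.3, "reverb_time": 2.5},
    "anger":   {"brightness": 1.0, "reverb_time": 0.4},
    "fear":    {"brightness": 0.5, "reverb_time": 3.0},
}

def fitness(candidate, heard_pitches):
    """Closeness of a candidate pitch sequence to the material heard
    from the human performer (smaller distance -> higher fitness)."""
    return -sum(abs(c - h) for c, h in zip(candidate, heard_pitches))

def evolve(heard_pitches, pop_size=30, generations=50, seed=0):
    """Minimal genetic algorithm: evolve a pitch sequence resembling
    the performer's material, as raw material for the 'agent's' part."""
    rng = random.Random(seed)
    length = len(heard_pitches)
    pop = [[rng.randint(36, 84) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(ind, heard_pitches), reverse=True)
        parents = pop[: pop_size // 2]           # selection: keep best half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, length)       # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:               # occasional mutation
                child[rng.randrange(length)] += rng.choice([-2, -1, 1, 2])
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda ind: fitness(ind, heard_pitches))

heard = [60, 62, 64, 65, 67]        # MIDI pitches heard from the human's part
best = evolve(heard)                # the "agent's" evolved pitch material
params = EMOTION_PARAMS["sadness"]  # parameters for the current facial emotion
```

In the real system the evolved material would be rendered as sound, with `params` controlling synthesis and reverberation; here both stages are reduced to plain data structures.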

Keywords: human–computer interaction, emotion recognition, genetic algorithm, interactive music.

Article Details

How to Cite
Krasnoskulov, A. V. (2017). The “Т|А” Project: the Duo of the Human Being and the Computer. Music Scholarship / Problemy Muzykal’noj Nauki, (2), 22–26. https://doi.org/10.17674/1997-0854.2017.2.022-026
Section
Horizons of Musicology
Author Biography

Alexei V. Krasnoskulov, Rostov State S. V. Rachmaninoff Conservatory

Ph.D. (Arts), Head of the Department of Sound Engineering and Information Technologies, Professor at the Piano Major Department

References

1. Krasnoskulov A. V. Ansamblevoe muzitsirovanie v tsifrovom mire [Ensemble Performance in a Digital World]. Muzykal’noe iskusstvo v sovremennom sotsiume: sb. nauch. st. [The Art of Music in the Modern Society: A Compilation of Scholarly Articles]. Rostov-on-Don: Publishing House of the Rostov Conservatory, 2014, pp. 278–288.
2. Krasnoskulov A. V. Evolyutsionnoe modelirovanie muzyki: printsipy, podkhody, problemy [The Evolutional Modeling of Music: Principles, Approaches, Issues]. Yuzhno-Rossiyskiy muzykal’nyy al’manakh [South-Russian Musical Anthology]. 2016. No. 1 (22), pp. 24–30.
3. Krasnoskulov A. V. Evolyutsionnye vychisleniya v interaktivnoy muzyke [Evolutionary Computations in Interactive Music]. Vestnik muzykal’noy nauki [Herald of Musical Scholarship]. 2016. No. 2 (12), pp. 54–59.
4. Chau C. J., Mo R., Horner A. The Correspondence of Music Emotion and Timbre in Sustained Musical Instrument Sounds. Journal of the Audio Engineering Society. 2014. Vol. 62, no. 10, pp. 663–675.
5. Chau C. J., Mo R., Horner A. The Emotional Characteristics of Piano Sounds with Different Pitch and Dynamics. Journal of the Audio Engineering Society. 2016. Vol. 64, no. 11, pp. 918–932.
6. Chau C. J., Wu B., Horner A. Timbre Features and Music Emotion in Plucked String, Mallet Percussion, and Keyboard Tones. Proceedings of the 40th International Computer Music Conference (ICMC). Michigan, 2014, pp. 982–989.
7. Miranda E. R., Biles J. A. (Eds.). Evolutionary Computer Music. London: Springer, 2007. 249 p.
8. Mo R., Wu B., Horner A. The Effects of Reverberation on the Emotional Characteristics of Musical Instruments. Journal of the Audio Engineering Society. 2015. Vol. 63, no. 12, pp. 966–979.
9. Winters R. M., Hattwick I., Wanderley M. M. Emotional Data in Music Performance: Two Audio Environments for the Emotional Imaging Composer. Proceedings of the 3rd International Conference on Music & Emotion (ICME 3). Jyväskylä, Finland, 11th–15th June 2013. URL: https://jyx.jyu.fi/dspace/bitstream/handle/123456789/41617/R.%20Michael%20Winters%20-%20Emotional%20Data%20in%20Music%20Performance%20-%20Two%20Audio%20Environments%20for%20the%20Emotional%20Imaging%20Composer.pdf?sequence=1.
10. Wu B., Horner A., Lee C. Musical Timbre and Emotion: The Identification of Salient Timbral Features in Sustained Musical Instrument Tones Equalized in Attack Time and Spectral Centroid. Proceedings of the 40th International Computer Music Conference (ICMC). Michigan, 2014, pp. 928–934.