Music composition is an intellectually demanding human activity that engages a wide range of cognitive faculties. Although several domain-general integrated cognitive architectures (ICAs) exist—ACT-R, Soar, Icarus, etc.—the use of integrated models for solving musical problems remains virtually unexplored. In designing MusiCOG, we sought to carry forward ideas from our previous work and combine them with principles from music perception and cognition and from ICA design, in an initial attempt at an integrated model. Here we introduce MusiCOG, outline the operation of its modules, and share some initial musical results.
ManuScore is a music notation-based, interactive music composition application, backed by a cognitively inspired music learning and generation system. In this paper we outline its various functions, describe an applied composition study using the software, and report results from a listener evaluation of the music composed during that study. The listener study was conducted at a chamber music concert featuring a mixed programme of human-composed, machine-composed, and computer-assisted works.
Maxwell, James B., Arne Eigenfeldt, Philippe Pasquier, and Nicolas Gonzalez Thomas. “MusiCOG: A cognitive architecture for music learning and generation.” In Proceedings of the Sound and Music Computing Conference. 2012.
Maxwell, James B., Arne Eigenfeldt, and Philippe Pasquier. “ManuScore: Music Notation-based Computer-Assisted Composition.” In Proceedings of the International Computer Music Conference. Ljubljana, 2012.