OUR PROJECTS
Apollo: An Interactive Environment for Generating Symbolic Musical Phrases using Corpus-based Style Imitation
Interactive Web Framework for Interactive Machine Learning (IML) as Computer-Assisted Composition
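Apollo's model isn't detailed here; purely as an illustration of what corpus-based style imitation can look like, the hypothetical sketch below trains a second-order Markov chain on symbolic phrases (lists of MIDI pitch numbers) and samples new phrases in a similar style.

```python
import random
from collections import defaultdict

def train(corpus, order=2):
    """Count which pitch follows each context of `order` pitches across the corpus."""
    table = defaultdict(list)
    for phrase in corpus:
        for i in range(len(phrase) - order):
            table[tuple(phrase[i:i + order])].append(phrase[i + order])
    return table

def generate(table, seed, length=16, order=2):
    """Extend the seed by sampling continuations observed in the corpus."""
    phrase = list(seed)
    while len(phrase) < length:
        continuations = table.get(tuple(phrase[-order:]))
        if not continuations:        # no continuation seen in the corpus: stop early
            break
        phrase.append(random.choice(continuations))
    return phrase

# Toy corpus of symbolic phrases (MIDI pitch numbers).
corpus = [[60, 62, 64, 65, 67, 65, 64, 62, 60],
          [60, 64, 67, 72, 67, 64, 60]]
print(generate(train(corpus), seed=[60, 62]))
```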
MASOM: Musical Agent based on Self-Organizing Maps
MASOM learns how to play music by listening to it.
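As a rough sketch of the self-organizing-map idea behind such an agent (the map size, learning rate, and features below are assumptions, not MASOM's actual parameters): audio feature frames are clustered onto a 2-D grid, and incoming sound is indexed by its best-matching unit.

```python
import numpy as np

# Minimal self-organizing map over audio feature vectors (e.g. MFCC frames).
class SOM:
    def __init__(self, rows=8, cols=8, dim=13, seed=0):
        rng = np.random.default_rng(seed)
        self.weights = rng.normal(size=(rows, cols, dim))
        self.rows, self.cols = rows, cols

    def best_matching_unit(self, x):
        # Grid cell whose weight vector is closest to the input frame.
        d = np.linalg.norm(self.weights - x, axis=2)
        return np.unravel_index(np.argmin(d), d.shape)

    def train(self, data, epochs=20, lr=0.3, radius=3.0):
        grid = np.stack(np.meshgrid(np.arange(self.rows), np.arange(self.cols),
                                    indexing="ij"), axis=2)
        for epoch in range(epochs):
            decay = np.exp(-epoch / epochs)
            for x in data:
                bmu = np.array(self.best_matching_unit(x))
                dist = np.linalg.norm(grid - bmu, axis=2)
                h = np.exp(-(dist ** 2) / (2 * (radius * decay) ** 2))
                self.weights += lr * decay * h[..., None] * (x - self.weights)

# "Listening": cluster feature frames, then index a sound by its best-matching unit.
features = np.random.rand(500, 13)           # stand-in for extracted audio features
som = SOM()
som.train(features)
print(som.best_matching_unit(features[0]))    # map cell representing this sound
```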
P.O.E.M.A @Oi Futuro, Rio de Janeiro, Brazil @Olympics 2016
A VR and dance performance with generative music and generative video
Mova: Using Aesthetic Interaction to Interpret and View Movement Data
The Lab's movement visualisation tool
Walknet: Affective Movement Recognition and Generation
Machine learning models that recognize the valence and arousal of movement data
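A minimal sketch of estimating valence and arousal from movement features, assuming a generic multi-output regressor and synthetic data; the feature set and model choice are illustrative, not Walknet's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Each row stands in for a movement segment described by simple kinematic
# statistics; targets are valence and arousal ratings in [-1, 1].
rng = np.random.default_rng(0)
X = rng.random((200, 6))           # e.g. mean speed, acceleration, jerk, bounding volume, ...
y = rng.uniform(-1, 1, (200, 2))   # columns: valence, arousal

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)

valence, arousal = model.predict(X_test[:1])[0]
print(f"valence={valence:.2f}, arousal={arousal:.2f}")
```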