Abstract
The field of Musical Metacreation (MuMe) has produced impressive results for both autonomous and interactive creativity, recently aided by modern deep learning frameworks. However, there are few examples of these systems crossing over to the “mainstream” of music creation and consumption. We tie together existing frameworks (Electron, TensorFlow.js, and Max For Live) to develop a system whose purpose is to bring the promise of interactive MuMe to the realm of professional music creators. Combining compelling applications of deep learning-based music generation with a focus on ease of installation and use in a popular DAW, we hope to expose more musicians and producers to the potential of using such systems in their creative workflows. Our suite of plug-ins for Ableton Live, named Magenta Studio, is available for download at http://g.co/magenta/studio along with its open source implementation.
Keywords
Deep Learning, Machine Learning, Musical Metacreation, TensorFlow
Institute(s)
Google, University of California
Year
2019
Author(s)
Adam Roberts, Jesse Engel, Jon Gillick, Yotam Mann, Claire Kayacik, Signe Nørly, Monica Dinculescu, Carey Radebaugh, Curtis Hawthorne, Douglas Eck