A four-armed, marimba-playing robot can now write and play its own compositions with the aid of artificial intelligence and deep learning.
Researchers fed the robot nearly 5,000 complete songs—from Beethoven to the Beatles to Lady Gaga to Miles Davis—and more than 2 million motifs, riffs, and licks of music. Aside from giving the machine a seed, or the first four measures to use as a starting point, no humans are involved in either the composition or the performance of the music.
The first two compositions are roughly 30 seconds in length. See and hear the robot, named Shimon, play in the videos below.
Doctoral student Mason Bretan is the man behind the machine. He’s worked with Shimon for seven years, enabling it to “listen” to music played by humans and improvise over pre-composed chord progressions. Now Shimon is a solo composer for the first time, generating the melody and harmonic structure on its own.
“Once Shimon learns the four measures we provide, it creates its own sequence of concepts and composes its own piece,” says Bretan, who will receive his doctorate in music technology this summer at the Georgia Institute of Technology.
“Shimon’s compositions represent how music sounds and looks when a robot uses deep neural networks to learn everything it knows about music from millions of human-made segments,” he says.
Bretan says this is the first time a robot has used deep learning to create music. And unlike its days of improvising, when it played monophonically, Shimon is able to play harmonies and chords. It’s also thinking much more like a human musician, focusing less on the next note, as it did before, and more on the overall structure of the composition.
“When we play or listen to music, we don’t think about the next note and only that next note,” says Bretan. “An artist has a bigger idea of what he or she is trying to achieve within the next few measures or later in the piece. Shimon is now coming up with higher-level musical semantics. Rather than thinking note by note, it has a larger idea of what it wants to play as a whole.”
“This is a leap in Shimon’s musical quality because it’s using deep learning to create a more structured and coherent composition,” says Gil Weinberg, Bretan’s advisor, a professor in the School of Music and Shimon’s creator.
“We want to explore whether robots could become musically creative and generate new music that we humans could find beautiful, inspiring, and strange,” he says.
Shimon will create more pieces in the future. As long as the researchers feed it a different seed, the robot will produce something different each time—music that the researchers can't predict. For the first piece, Bretan fed Shimon a melody composed of eighth notes. The second time, it received a sixteenth-note melody, which led it to generate faster note sequences.
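The seed-and-continue idea described above can be illustrated with a toy sketch. Shimon's actual deep neural networks are not public, so the snippet below stands in with a simple transition model learned from a tiny hypothetical note corpus; the point is only the workflow: learn from human-made material, then extend a four-bar seed into a new, seed-dependent sequence.

```python
import random
from collections import defaultdict

def train_note_model(sequences):
    """Count note-to-note transitions in a corpus of melodies.

    A toy stand-in for Shimon's deep network: it 'learns' which notes
    tend to follow which from human-made material.
    """
    transitions = defaultdict(list)
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            transitions[a].append(b)
    return transitions

def generate(model, seed_notes, length, rng):
    """Continue a seed melody; different seeds yield different pieces."""
    piece = list(seed_notes)
    for _ in range(length):
        choices = model.get(piece[-1])
        if not choices:
            break
        piece.append(rng.choice(choices))
    return piece

# Hypothetical training melodies (the real system used ~5,000 songs
# and 2 million musical segments).
corpus = [
    ["C", "E", "G", "E", "C", "D", "E", "F", "G"],
    ["G", "F", "E", "D", "C", "E", "G", "C"],
]
model = train_note_model(corpus)
rng = random.Random(0)
print(generate(model, ["C", "E"], 6, rng))
```

Changing the seed notes (or the random state) changes the continuation, mirroring how a different four-measure seed sends Shimon's composition in a different direction.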
Bretan acknowledges that he can't pick out individual songs that Shimon is referencing. He can, however, recognize classical chord progressions and the influence of artists such as Mozart.
“They sound like a fusion of jazz and classical,” says Bretan, who plays the keyboards and guitar in his free time. “I definitely hear more classical, especially in the harmony. But then I hear chromatic moving steps in the first piece—that’s definitely something you hear in jazz.”
Shimon’s debut as a solo composer appeared in a video clip in the Consumer Electronics Show (CES) keynote, and the robot will give its first live performance at the Aspen Ideas Festival.
Source: Georgia Institute of Technology