Google’s AI lets users create their own melodies in a virtual music room

The room is a two-dimensional, pixelated drawing displayed in a web browser, where clicking on different objects, such as the clock and the piano, allows the user to adjust different tracks. | Photo Credit: Google


Google Magenta’s latest project, Lo-Fi Player, lets users mix several music tracks in a virtual music room without any musical expertise.

The customised music room can be shared with others as well. The entire experience is powered by models from Magenta, Google’s research project for making music and art using machine learning.

“We’re able to create something more like a ‘music generating room’ than a musical instrument or composition tool,” said Vibert Thio, the summer 2020 Magenta intern who designed the player.

The Lo-Fi Player is simply a virtual room in the browser where a user can tinker with the objects in the room, play with the beat and change the music in real time. The view outside the room’s window corresponds to the background sound in the track; both the visual and the music can be changed by clicking on the window.

The TV in the centre represents MusicVAE, which lets a user create new melodies by recombining existing ones. The radio beside the TV represents MelodyRNN, which generates new melodies.
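The recombination MusicVAE performs rests on a simple idea: a variational autoencoder encodes each melody into a latent vector, and blending two such vectors before decoding yields a melody that mixes traits of both. The helper below is a conceptual sketch of that blending step on toy vectors, not Magenta’s actual API.

```javascript
// Conceptual sketch of latent-space interpolation, the idea behind
// MusicVAE's melody recombination (toy vectors, not the real model).
function lerpLatent(a, b, t) {
  // Linearly interpolate between latent vectors a and b, with 0 <= t <= 1.
  return a.map((v, i) => (1 - t) * v + t * b[i]);
}

// The midpoint of two toy "latent vectors" blends features of both:
console.log(lerpLatent([0, 2], [4, 6], 0.5)); // → [2, 4]
```

In the real model, the blended vector would then be decoded back into a note sequence; here the arrays simply stand in for those latent codes.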

“We want to show that something as simple as applying MusicVAE to short melodies can produce pleasing results when done in a creative, fun context,” Thio said.

To let people inhabit the same music together, Lo-Fi Player will be streamed on YouTube for a few weeks. The interaction mode, however, differs from the browser version: users type commands in the chat rather than clicking on elements in the room. The commands can change the colour of the room, change the melody, switch the instruments, and so on.

Every time the beat loops, the system randomly selects comments from the live chat to modify the music, and those comments are highlighted with a conversation bubble. Even users who don’t interact will be able to hear how the music evolves as it is modified by chat commands.
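The loop-synced selection described above can be sketched as a small helper: on each loop boundary, one pending chat comment is drawn at random and applied. This is a hypothetical illustration, not the player’s actual code; `pendingComments` and `applyToRoom` are assumed names.

```javascript
// Hypothetical sketch: each time the beat loops, draw one pending chat
// comment at random; return null when the chat queue is empty.
function pickChatCommand(comments) {
  if (comments.length === 0) return null;
  const i = Math.floor(Math.random() * comments.length);
  return comments[i];
}

// At every loop boundary the stream would do something like:
//   const cmd = pickChatCommand(pendingComments);
//   if (cmd !== null) applyToRoom(cmd); // applyToRoom is assumed, not real
```

Picking uniformly at random (rather than first-come-first-served) is what lets every viewer's comment have a chance of being heard, even in a busy chat.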

“This is very much a first attempt at ML-powered interactive YouTube streaming,” Thio said.

He chose lo-fi hip hop because it is a popular genre with a relatively simple musical structure, and its limited flexibility helps ensure that the music always makes sense. For developers, the company has also built a JavaScript tutorial called “Play, Magenta!” that lets them edit the sounds and canvas live in the browser.


Feb 26, 2021
