Google Assistant can now help you identify that song

The feature, previously available only on Pixel phones, is being rolled out across all Android devices

November 07, 2017 05:03 pm | Updated 05:36 pm IST

Google Assistant did not have music recognition when first rolled out. This is now being rectified with the latest Android update. Representational image. | AP

Now, musicophiles will no longer need to download the Shazam app to help them track every piece of music they come across. Nor do you need to splurge on a Pixel 2 to get access to a built-in song identifier. Google Assistant, available by default on Android 6 and above, now has that superpower — music recognition.

This means that if you hear a song playing around you, all you have to do is say, "Ok Google, what song is this?" The Assistant will then serve up complete information about the song, including its title, artist and lyrics, along with links through Google Play Music, YouTube and any third-party apps installed on your device.

The feature was rolled out in an update to Android devices today. It had been released in a limited way last month, exclusively for the Google Pixel and Pixel 2 smartphones. Google Now did have the ability to identify music, but the early version of the Assistant had dropped the functionality.

It is not available in all countries just yet, however.

How does song recognition work?

Technically, gadgets or apps don't identify songs as much as they identify sounds.

Shazam uses a technique called spectrography, wherein it takes a database of thousands of popular songs and creates a unique spectral signature, or "acoustic fingerprint", for each one. This fingerprint captures sound data along three parameters (frequency, amplitude and time) and condenses it into a numeric signature for the song.
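As a rough illustration of what such a fingerprint might look like, here is a toy sketch in Python, loosely modelled on the peak-pairing idea described in Avery Wang's paper. It is not Shazam's or Google's actual code; the function name, the prominence threshold and the `fan_out` parameter are invented for illustration.

```python
# A toy acoustic-fingerprint sketch, loosely modelled on the peak-pairing
# idea in Avery Wang's paper. All names and thresholds are illustrative.
import numpy as np
from scipy.signal import spectrogram

def fingerprint(samples, sample_rate, fan_out=5):
    """Turn raw audio samples into a list of (hash, time) pairs."""
    # 1. Frequency/amplitude/time: compute a spectrogram of the clip.
    freqs, times, magnitudes = spectrogram(samples, fs=sample_rate, nperseg=1024)

    # 2. Keep only prominent spectral peaks ("constellation points").
    peaks = []
    for t_idx in range(magnitudes.shape[1]):
        column = magnitudes[:, t_idx]
        f_idx = int(np.argmax(column))
        if column[f_idx] > column.mean() * 10:      # crude prominence test
            peaks.append((t_idx, f_idx))

    # 3. Pair each peak with a few later peaks and hash the combination:
    #    this is the "numeric signature" for that slice of the song.
    hashes = []
    for i, (t1, f1) in enumerate(peaks):
        for t2, f2 in peaks[i + 1 : i + 1 + fan_out]:
            dt = t2 - t1
            h = hash((f1, f2, dt))
            hashes.append((h, t1))
    return hashes
```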

Then, when you hold your phone up to a musical playback or performance, the app captures a sample of the song through the device's microphone, runs the audio through the same spectral algorithm, and looks for a match in its database.
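Continuing the toy sketch above, the matching step can be illustrated as follows: the same fingerprinting routine is run on the microphone sample, and candidate songs are scored by how many of their hashes line up at a consistent time offset, since a genuine match is just a time-shifted slice of the original track. Again, this is an illustrative sketch with invented names and an arbitrary confidence cut-off, not the actual Shazam or Google implementation.

```python
from collections import Counter, defaultdict

def build_index(catalogue):
    """catalogue: {song_title: [(hash, time_in_song), ...]} from fingerprint()."""
    index = defaultdict(list)
    for title, hashes in catalogue.items():
        for h, t in hashes:
            index[h].append((title, t))
    return index

def identify(sample_hashes, index):
    """Vote for the song whose hashes align at a consistent time offset."""
    votes = Counter()
    for h, t_sample in sample_hashes:
        for title, t_song in index.get(h, []):
            # Hashes from the true song all share roughly the same offset,
            # because the sample is a time-shifted slice of the full track.
            votes[(title, t_song - t_sample)] += 1
    if not votes:
        return None
    (title, _offset), score = votes.most_common(1)[0]
    return title if score >= 5 else None            # arbitrary confidence cut-off
```

Voting on the time offset, rather than simply counting shared hashes, is what makes this kind of scheme robust to background noise: stray hashes from chatter or traffic rarely line up at one consistent offset.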

Shazam has a catalogue of over 11 million songs, or acoustic fingerprints. Google's database, for now, holds over 10,000 popular songs that you can expect it to identify.

The algorithm, which was first developed by Shazam's chief scientist Avery Wang, can "correctly identify music in the presence of voices, traffic noise, dropout, and even other music" and other forms of external interference. You can find the research document here.

However, the algorithm was designed to match sound samples against existing signatures in the database. So, if you want to identify a song from a live performance and avoid a false positive, the artiste or band had better be fairly faithful in reproducing the studio version of their song.
