
Can’t Remember the Song? Hum to Search With Google’s New Feature!

BLOG by

CitrusBits
October 16, 2020
#XR #VR #UX #UI

What’s that Queen song that starts with a bang of drums and claps and more drum-bangs and claps, something like ‘dun-dun tach, dun-dun tach’? It’s a bummer. I can’t seem to recall the lyrics either. Never mind, I’ll just hum it to my Google Assistant.

Ba-dum-tss!

Yep. Google Search now lets you search for songs by humming the tunes, similar to Apple’s Shazam. What’s more, Google also announced that Search uses machine learning to better understand and overcome misspellings and typos.

Here’s the best part: Google says you don’t have to be pitch-perfect to get the desired results. So your tone-deafness won’t get in the way of your song-searching.

This new ‘hum to search’ feature – announced on October 15 – is available for both iOS and Android via Google Assistant.

Isn’t Google truly magnificent? Magical? Well, of course, it’s all AI and machine learning down to its very core, but that only makes it all the more magical. Perhaps even better than magic.

Hum to Search with Google: How it works

There’s no rocket science here, mostly just machine learning. Google is all about convenience: all a smartphone user has to do is tap the microphone button in the Google app and ask, “What’s this song?”

If that doesn’t seem to work, you can just tap the “Search a song” button instead.

It takes approximately 10 to 15 seconds of humming before Google returns results for what it thinks the song is. Google will share the most relevant matches based on the tune you hum. While Google clearly says you don’t have to be pitch-perfect, in my opinion it may still require you to be somewhat accurate with your humming.

Krishna Kumar, Senior Product Manager of Google Search, explained in a blog post that when you hum a melody into Google Search, the machine learning models at play convert your humming audio into a number-based sequence characterizing the song’s tune. These models are trained to identify songs from a wide array of sources, including human singing, studio recordings, whistling, and humming.

He added that the algorithms peel away all other information, such as accompanying instruments as well as the timbre and pitch of your voice. What we’re left with is the number-based sequence of the song. In simpler words, a song’s melody is like its fingerprint.
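If you’re curious what that “melody as a fingerprint” idea might look like in practice, here’s a rough, hypothetical Python sketch. It is not Google’s model: it simply pulls a pitch contour out of a hum with the open-source librosa library, shifts it so the absolute key (and your vocal timbre) stops mattering, and compares two contours with dynamic time warping. The file names and candidate list are made up for illustration.

```python
# A minimal, illustrative sketch of melody fingerprinting -- NOT Google's
# actual model. Compares the key-invariant pitch contour of a hum against
# reference melodies; lower distance means a closer melodic match.
import numpy as np
import librosa

def melody_contour(path, sr=16000):
    """Extract a rough pitch contour and normalize it so the comparison
    ignores absolute key and the timbre of the voice."""
    y, _ = librosa.load(path, sr=sr)
    f0, voiced, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr
    )
    midi = librosa.hz_to_midi(f0[voiced])      # keep only voiced frames
    return midi - np.nanmean(midi)             # the "number-based sequence"

def melody_distance(hum_path, reference_path):
    """Dynamic time warping absorbs tempo differences between hum and song."""
    a = melody_contour(hum_path).reshape(1, -1)
    b = melody_contour(reference_path).reshape(1, -1)
    D, _ = librosa.sequence.dtw(X=a, Y=b, metric="euclidean")
    return D[-1, -1] / (a.shape[1] + b.shape[1])

# Hypothetical usage: rank candidate songs by how close they are to the hum.
# candidates = {"Stayin' Alive": "stayin_alive.wav", "Hotline Bling": "hotline.wav"}
# print(sorted((melody_distance("my_hum.wav", p), name) for name, p in candidates.items()))
```

In a real system the matching would of course run against an enormous indexed catalogue rather than a handful of files, but the core idea is the same: reduce both the hum and the song to comparable melodic sequences.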

Google tested it on the Bee Gees’ Stayin’ Alive and Drake’s Hotline Bling and got the desired results; however, the real struggle came with identifying more challenging songs like Peter Gabriel’s Solsbury Hill. I can only imagine what my search results for an AC/DC or Pink Floyd track would be like. Nevertheless, a search attempt is mandatory, I say.

Here are a couple of songs, each from a different genre, that I looked for using ‘hum to search’. I whistled as well as hummed, and it took me a few attempts for certain tracks like Metallica’s Enter Sandman. By contrast, songs like Joan Osborne’s One of Us were considerably easier to find by humming.

Meanwhile, Google says it’s still refining Search’s ability to comprehend misspelled queries, noting that one in every 10 searches is misspelled.

The new spelling algorithm leverages a deep neural net to better recognize misspelled words. Google says it’s a modification that “allows for better spelling than all of the other changes over the last five years”.
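To give a feel for what query spelling correction does, here’s a toy sketch. It deliberately uses simple fuzzy string matching from Python’s standard library rather than the deep neural net Google describes, and the vocabulary and queries are invented, so treat it purely as an illustration of snapping a typo-ridden query onto the closest known phrase.

```python
# Toy illustration of misspelled-query correction via fuzzy matching.
# This is NOT Google's neural spelling model; it only shows the general idea.
import difflib

KNOWN_QUERIES = [
    "stayin alive bee gees",
    "hotline bling drake",
    "solsbury hill peter gabriel",
    "enter sandman metallica",
]

def correct_query(query, cutoff=0.6):
    """Return the closest known query, or the original if nothing is close."""
    matches = difflib.get_close_matches(query.lower(), KNOWN_QUERIES, n=1, cutoff=cutoff)
    return matches[0] if matches else query

print(correct_query("hotlin blingg drake"))   # -> "hotline bling drake"
```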

Here’s another interesting update powered by Google’s Live Transcribe technology
The Big G’s on fire. Just a while back, Google updated its Android OS as well. Now it will send users notifications when their smartphone hears certain household sounds, e.g., a doorbell, running water, or a barking dog. This functionality, powered by Google’s Live Transcribe technology, was originally leveraged for converting speech to text; now, it will also detect sounds around you.

Although the household sound notifications were initially developed for people with hearing loss, I believe most of us will be using this feature to stay alert, especially when we’re wearing headphones, Netflixing, or listening to music or podcasts.

Other than your Android phone, you can also receive these alerts on your Wear OS watch.

The Big G further adds that this is an offline feature that works through on-device machine learning and can identify as many as 10 types of sounds, including running water, baby sounds, smoke and fire alarms, and knocks on a door. However, to try out this feature, you need a device running Android 5.0 or above.
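For a rough sense of how on-device sound detection of this kind can work, here’s a hypothetical sketch. It is not Google’s Live Transcribe model; it uses YAMNet, an open audio-event classifier from TensorFlow Hub, to flag household sounds in a short recording. The file name is made up, and the class names in the alert list come from the AudioSet label set, so check them against the model’s class map before relying on them.

```python
# Rough sketch of household sound detection with an open audio classifier.
# NOT Google's implementation -- just the general pattern: classify short
# audio on-device and alert on a whitelist of household sounds.
import csv
import numpy as np
import soundfile as sf
import tensorflow_hub as hub

ALERT_SOUNDS = {"Dog", "Bark", "Water", "Doorbell", "Knock",
                "Smoke detector, smoke alarm", "Baby cry, infant cry"}

model = hub.load("https://tfhub.dev/google/yamnet/1")

def detect_alerts(wav_path, threshold=0.3):
    # YAMNet expects mono 16 kHz float32 audio in [-1, 1].
    waveform, sr = sf.read(wav_path, dtype="float32")
    assert sr == 16000, "resample to 16 kHz mono before running the model"
    scores, _, _ = model(waveform)                    # (frames, 521) class scores
    with open(model.class_map_path().numpy()) as f:
        class_names = [row["display_name"] for row in csv.DictReader(f)]
    mean_scores = np.mean(scores.numpy(), axis=0)     # average over time frames
    return [(class_names[i], float(mean_scores[i]))
            for i in np.argsort(mean_scores)[::-1]
            if class_names[i] in ALERT_SOUNDS and mean_scores[i] > threshold]

# Hypothetical usage:
# detect_alerts("living_room.wav")  # -> e.g. [("Dog", 0.62), ("Bark", 0.55)]
```

The production feature runs continuously and entirely offline on the phone, which is why a compact on-device model and a fixed list of roughly 10 sound types makes sense.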

No Tricks, Just Treats!

It appears Halloween came early this year with Google showering us with fun treats. Although the ‘Hum to Search’ feature may not seem like Google’s best work thus far, it’s an interesting algorithmic development and shows how far AI and machine learning have come. Besides, why download a separate app when you can have it all in one place?

CitrusBits is a leading custom software development company that specializes in creating innovative solutions tailored to meet unique business needs. With a team of expert developers and designers, they deliver high-quality mobile app development services, empowering businesses to engage their audience and drive digital transformation. CitrusBits’ commitment to excellence and customer satisfaction makes them a trusted partner for companies seeking cutting-edge software solutions.

About the Author

CitrusBits

Content Writer

