How Does Google Deploy Machine Learning?

Pradeep Kumar
4 min read · Oct 20, 2020

Hey Everyone…

Do you guys know how Google uses Machine Learning to enhance the performance of its products? Let's find out.

In this blog, we will learn:

  • What is Machine Learning
  • What is Artificial Intelligence
  • How Google uses Machine Learning

Machine Learning:

Machine Learning is an emerging technology with applications in Data Science, Artificial Intelligence (AI), and Deep Learning. Machine Learning gives a machine the ability to learn without being explicitly programmed or monitored. The basic application of Machine Learning is to implement algorithms that receive input data, analyze it, and use statistics to predict outcomes.
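To make the "learn from data, then predict" idea concrete, here is a minimal sketch: fitting a one-variable linear model by least squares in pure Python. The data (hours studied vs. test score) is made up for illustration.

```python
# Fit a straight line y = slope * x + intercept to data by
# least squares, then use it to predict an unseen input.

def fit_line(xs, ys):
    """Return (slope, intercept) minimizing squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return slope, intercept

def predict(model, x):
    slope, intercept = model
    return slope * x + intercept

# "Training" data: hours studied vs. test score (invented numbers)
xs = [1, 2, 3, 4, 5]
ys = [52, 55, 58, 61, 64]

model = fit_line(xs, ys)
print(predict(model, 6))  # extrapolate to 6 hours → 67.0
```

The "learning" here is just computing two numbers from the data, but the shape is the same as in any ML system: fit parameters on examples, then apply them to new inputs.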

In recent times, Machine Learning has evolved to provide much more precise predictions through the use of Deep Learning and Neural Networks. Applying such algorithms in real time has raised hopes in the field of Artificial Intelligence that machines might one day be able to think like humans.

Artificial Intelligence:

Artificial Intelligence (AI) is the field of computer science focused on developing software or machines that exhibit human intelligence. Artificial Intelligence is commonly divided into 3 types:

  • Narrow Intelligence ‐ this type of AI can perform only a specific set of tasks
  • General Intelligence ‐ this type of AI is the one which mimics human intelligence
  • Super Intelligence ‐ this AI might possess intelligence that outsmarts the smartest humans on earth in every possible way

How Google uses Machine Learning:

So do you guys wonder in which of its products Google is using Machine Learning? Well, everyone around the world who uses Google is using Machine Learning, directly or indirectly. Yes, you heard it right. Google uses Machine Learning in most of its products and services, and millions of users use these features without knowing that Machine Learning is involved. Here are some of the services where Google uses ML:

1. Google Search

Google Search deploys ML to help users find exactly what they are looking for. A query is processed, and an ML model ranks the relevant results. Recently, Google has been able not only to rank web pages but also to index individual passages from within web pages. Google estimates this breakthrough will improve 7% of search queries across all languages. Bidirectional Encoder Representations from Transformers (BERT) language understanding systems are helping to deliver more relevant content.
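The core idea of ranking is "score each passage against the query, then sort." Here is a toy sketch using bag-of-words cosine similarity; real systems like BERT use learned neural embeddings rather than raw word counts, and the passages below are invented.

```python
# Toy query–passage ranking: represent texts as word-count vectors,
# score each passage by cosine similarity to the query, sort by score.
import math
from collections import Counter

def vectorize(text):
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def rank(query, passages):
    q = vectorize(query)
    return sorted(passages, key=lambda p: cosine(q, vectorize(p)), reverse=True)

passages = [
    "how to change a car tire at home",
    "best home exercise equipment for small spaces",
    "exercise equipment buying guide and budget picks",
]
print(rank("home exercise equipment", passages)[0])
# → "best home exercise equipment for small spaces"
```

Swapping the word-count vectors for BERT-style embeddings is what lets a real ranker match meaning ("cheap" ≈ "budget") rather than exact words.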

2. Spelling

Google has found that one in ten search queries is misspelled. A new Deep Learning algorithm is now used to decipher misspellings. Google claims that this single change is a greater improvement to spelling than all of its improvements over the last five years. The new spelling algorithm helps them understand the context of misspelled words, so they can help you find the right results, all in under 3 milliseconds.
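For a feel of the classic approach that deep models improved upon, here is a toy spelling corrector: pick the dictionary word with the smallest edit (Levenshtein) distance to the input. Google's system is a neural model that also uses surrounding context; this sketch shows only the distance-based idea, and the vocabulary is made up.

```python
# Toy spelling correction via edit distance (insert/delete/substitute),
# computed with the standard dynamic-programming recurrence.

def edit_distance(a, b):
    """Minimum number of insertions, deletions, substitutions."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # delete from a
                           cur[j - 1] + 1,              # insert into a
                           prev[j - 1] + (ca != cb)))   # substitute
        prev = cur
    return prev[-1]

def correct(word, vocabulary):
    return min(vocabulary, key=lambda w: edit_distance(word, w))

vocab = ["dinner", "winner", "dinosaur", "dinar"]
print(correct("dinnner", vocab))  # → "dinner"
```

A context-aware model goes further: it can choose between equally close candidates ("dinner" vs. "winner") based on the rest of the query.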

3. Subtopics

Neural Networks are applied to understand subtopics around an interest, which helps deliver a greater diversity of content when you search for something broad. For example, if you search for “home exercise equipment,” Google can now understand relevant subtopics, such as budget equipment, premium picks, or small-space ideas, and show a wider range of content on the search results page.
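Once results are tagged with subtopics, diversifying the page is simple: interleave results round-robin so each subtopic appears near the top. In this toy sketch the subtopic labels are hand-written keyword lists (Google's subtopic detection uses neural networks), and the result titles are invented.

```python
# Toy result diversification: tag each result with a subtopic by
# keyword match, then interleave one result per subtopic at a time.
from itertools import zip_longest

SUBTOPICS = {
    "budget": ["cheap", "budget", "affordable"],
    "premium": ["premium", "high-end", "pro"],
    "small space": ["small", "compact", "apartment"],
}

def tag(result):
    text = result.lower()
    for topic, keywords in SUBTOPICS.items():
        if any(k in text for k in keywords):
            return topic
    return "general"

def diversify(results):
    buckets = {}
    for r in results:
        buckets.setdefault(tag(r), []).append(r)
    interleaved = []
    for group in zip_longest(*buckets.values()):
        interleaved.extend(r for r in group if r is not None)
    return interleaved

results = [
    "10 budget dumbbell sets",
    "cheap resistance bands reviewed",
    "premium smart home gym machines",
    "compact treadmills for apartments",
]
print(diversify(results))
```

Without the interleave step, the two "budget" results would occupy the top two slots; with it, the first three results each cover a different subtopic.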

4. Data Commons

Google is also bringing Data Commons, its open knowledge repository that combines data from public datasets (e.g., COVID-19 stats from the U.S. Centers for Disease Control and Prevention) using mapped common entities, to search results on the web and mobile. In the near future, users will be able to search for topics like “employment in India” on Search and see the information in context.

5. Augmented reality

On the ecommerce and shopping front, Google says it has built cloud streaming technology that enables users to see products in augmented reality (AR). With cars from Volvo, Porsche, and other auto brands, for example, smartphone users can zoom in to view the vehicle’s steering wheel and other details to scale. Separately, Google Lens on the Google app or Chrome on Android (and soon iOS) will let shoppers discover similar products by tapping on elements like vintage denim, ruffled sleeves, and more.

In another addition to Search, Google says it will deploy a feature that highlights notable points in videos: for example, a screenshot comparing different products or a key step in a recipe. Google expects 10% of searches will use this technology by the end of 2020. And Live View in Maps, a tool that taps AR to provide turn-by-turn walking directions, will enable users to quickly see information about restaurants, including how busy they tend to be and their star ratings.

6. Hum to search

Lastly, Google says it will let users search for songs by simply humming or whistling melodies, initially in English on iOS and in more than 20 languages on Android. You will be able to launch the feature by opening the latest version of the Google app or Search widget, tapping the mic icon, and saying “What’s this song?” or selecting the “Search a song” button, followed by at least 10 to 15 seconds of humming or whistling. “After you’re finished humming, our machine learning algorithm helps identify potential song matches,” Google explains.
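One way to see why humming works regardless of the singer's key: reduce the melody to its pitch contour (the up/down/repeat shape between notes) and compare contours. This is a toy sketch; Google's system runs ML models on raw audio, and the songs and note numbers below are invented.

```python
# Toy melody matching by pitch contour: the contour is key-invariant,
# so a tune hummed a few semitones higher still matches.

def contour(notes):
    """Direction of each step: 'U'p, 'D'own, or 'R'epeat."""
    return "".join(
        "U" if b > a else "D" if b < a else "R"
        for a, b in zip(notes, notes[1:])
    )

def similarity(c1, c2):
    """Fraction of matching steps over the shorter contour."""
    matches = sum(a == b for a, b in zip(c1, c2))
    return matches / max(1, min(len(c1), len(c2)))

def match(hummed, songs):
    h = contour(hummed)
    return max(songs, key=lambda name: similarity(h, contour(songs[name])))

songs = {
    "Happy Birthday": [60, 60, 62, 60, 65, 64],
    "Twinkle Twinkle": [60, 60, 67, 67, 69, 69, 67],
}
# The same opening tune hummed three semitones higher:
hummed = [63, 63, 65, 63, 68, 67]
print(match(hummed, songs))  # → "Happy Birthday"
```

Because only the step directions are compared, the transposed hum still produces an exact contour match with the right song.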

End of Blog!!!

So we have seen various services where Google uses Machine Learning, and there are many more. Machine Learning and Artificial Intelligence provide many useful services and have the potential to solve complex problems that no human mind could tackle alone.
