Google BERT

Google BERT is an algorithm that improves the search engine's understanding of human language. It is an AI language model that the company now applies to search results.

When was BERT released?

In November 2018, Google released BERT as open source on the GitHub platform. Since then, anyone has been able to use BERT's pre-trained models and code to quickly build their own system.

Google itself adopted BERT in its search system. In October 2019, Google announced its biggest update in recent years: the adoption of BERT in the search algorithm.

This matters for search because people express themselves spontaneously in both search terms and page content, and Google works to match one to the other correctly. BERT is an acronym for Bidirectional Encoder Representations from Transformers.

Google had already adopted models to understand human language, but this update was announced as one of the most significant leaps in search engine history.

How Does It Work?

Though it's a complex model, Google BERT's purpose is very simple:

BERT uses AI in the form of natural language processing (NLP), natural language understanding (NLU), and sentiment analysis to process every word in a search query in relation to all the other words in a sentence.

In the past, Google processed the words in a query one by one, in order. The difference in results between the old and new approaches can be dramatic.
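The core idea of relating every word to every other word is the self-attention mechanism inside the Transformer architecture BERT is built on. The sketch below is a minimal illustration of that idea in plain NumPy, using made-up random embeddings; it is not Google's implementation, just a toy showing how each token's new representation becomes a weighted mix of all tokens in the sentence.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(embeddings):
    """Score every token against every other token, then mix.

    embeddings: (seq_len, dim) array of token vectors.
    Returns the (seq_len, dim) context-aware vectors and the
    (seq_len, seq_len) attention-weight matrix.
    """
    dim = embeddings.shape[1]
    # Similarity of each token to every token (including itself),
    # scaled as in the Transformer's scaled dot-product attention.
    scores = embeddings @ embeddings.T / np.sqrt(dim)
    weights = softmax(scores, axis=-1)   # each row sums to 1
    # Each output vector is a weighted average of ALL token vectors,
    # so context from the whole sentence flows into every position.
    return weights @ embeddings, weights

# Toy 4-token "sentence" with random 8-dimensional embeddings.
rng = np.random.default_rng(0)
tokens = rng.normal(size=(4, 8))
contextual, weights = self_attention(tokens)

print(weights.shape)        # (4, 4): every token attends to every token
print(weights.sum(axis=1))  # each row of weights sums to 1
```

Unlike a left-to-right model, nothing here depends on word order of processing: token 0 attends to token 3 just as easily as the reverse, which is the "bidirectional" part of BERT's name.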

Google offers the example of a search like "2019 brazil traveler to usa need a visa."

In the past, Google would have interpreted this search as a US traveler looking for a visa to Brazil.

That's because Google didn't account for prepositions and the context inherent in human language. In this example, Google would not have taken the word "to" into account, and that word changes the meaning of the search.

Compare that to BERT's approach.

BERT takes the whole sentence into account, including prepositions. In this example, BERT now understands the searcher is a Brazilian looking for a US visa—not the other way around.

A lot of people use natural language to search for information. This language includes plenty of context clues that change search meaning.

Thanks to BERT's NLP model, Google will now return information that better understands this context.

Google says the BERT model will affect 10% of all US searches, so it's a big deal. And the language model that powers BERT also understands languages other than English, so expect its impact to grow over time.
