All you need to know about the BERT algorithm in Google Search

In October, Google rolled out its biggest search algorithm update in five years: the BERT update. The algorithm is designed to improve the understanding of natural language queries and the intent behind them. As a result, Google can now return more relevant search results.

BERT affects about 10% of search queries, which means it can influence organic visibility and traffic to some extent.

Below is an overview of frequently asked questions about BERT, prepared by Search Engine Land.

When was BERT launched in Google Search?

BERT began rolling out in Google Search for English-language queries on October 21, 2019.

Google plans to extend the algorithm to all languages supported by the search engine, but the company has not yet announced a precise timetable. The BERT model is also used to improve featured snippets in more than 20 countries.

What is BERT?

BERT (Bidirectional Encoder Representations from Transformers) is a natural language processing technique based on neural networks that use a newer architecture for working with sequences, known as Transformers. This technology helps Google determine the context of the words in a search query.

For example, in the phrases "nine to five" and "a quarter to five", the preposition "to" has two different meanings that are obvious to humans but less clear to search engines. BERT is designed to distinguish such nuances so that Google can return more relevant results.

In November 2018, Google open-sourced BERT, so anyone can now use the technology to train their own natural language processing system, whether for question answering or other tasks.
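To make this concrete, here is a minimal sketch of what "training your own system" on top of the released model can look like. It assumes the Hugging Face transformers library, PyTorch, and the public bert-base-uncased checkpoint; none of these are named in the article, and the code is illustrative rather than Google's own.

```python
# Illustrative sketch only (assumes the "transformers" library, PyTorch,
# and the public bert-base-uncased checkpoint): load the open-sourced BERT
# weights and attach a classification head that could then be fine-tuned
# on task-specific labels, e.g. for a question-answering or intent task.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# Encode an example query and run a forward pass; in a real project the
# randomly initialized head would then be fine-tuned on labeled examples.
inputs = tokenizer("math practice books for adults", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.shape)  # torch.Size([1, 2]): one score per label
```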

What is a neural network?

In simple terms, neural networks are algorithms designed to recognize patterns. Categorizing images, recognizing handwriting, and even predicting trends in financial markets are typical applications of neural networks. They are also used in search engines.

To recognize patterns, a neural network is trained on data sets. BERT was pre-trained on a plain-text corpus from Wikipedia, as Google explained when it open-sourced the technology.

What is natural language processing?

Natural language processing (NLP) is a branch of artificial intelligence, overlapping with linguistics, that aims to enable computers to understand how people communicate.

Examples of NLP in practice include social media monitoring tools, chatbots, and similar applications.

NLP itself is not new to search engines. BERT, however, represents an advance in NLP achieved through bidirectional training (more on this below).

How does BERT work?

BERT's breakthrough is its ability to train language models on the entire set of words in a sentence or query (bidirectional training), whereas traditional training analyzes words as an ordered sequence (left to right or right to left). BERT allows the language model to understand a word's context based on all the surrounding words, not just the word that immediately precedes or follows it.
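As a hedged illustration of the bidirectional idea, the sketch below asks BERT's masked-language-model head to fill in a hidden word; the words on both sides of the blank drive the prediction. The Hugging Face transformers library and the bert-base-uncased checkpoint are assumptions made for the example, not something the article specifies.

```python
# Sketch only: masked-word prediction as a demonstration of bidirectional context.
# Assumes the "transformers" library and the public bert-base-uncased checkpoint.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The right-hand context ("to deposit his paycheck") is what makes "bank" a
# likely answer; a strictly left-to-right model would only see "He went to the"
# when predicting the hidden word.
for prediction in fill_mask("He went to the [MASK] to deposit his paycheck."):
    print(prediction["token_str"], round(prediction["score"], 3))
```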

Google calls BERT "deeply bidirectional" because contextual representations of words start "from the very bottom of a deep neural network."

"For example, the word" bank "will have the same context-free performance in" bank account "(bank account) and" bank of river "(river). Context model instead generate a representation of each word based on other words in the sentence. For example, in the sentence «I accessed the bank account» (I had access to a bank account) unidirectional context model will represent the "bank" on the basis of "I accessed the", but not "account". In this case BERT is "bank", using the preceding and the following context: "I accessed the ... account" ».

Google shared a few examples of how BERT may affect search results. For the query [math practice books for adults], the top of the search results previously showed a textbook aimed at grades 6-8. After the BERT launch, the top of the SERP features a book titled "Math for Grownups".

The grades 6-8 textbook still appears on the first page for that query, but two books aimed at adults now rank above it, including in the featured snippet.

Changes in search results like the one above reflect the new understanding of the query that BERT provides.

Is BERT used to understand all queries?

No. BERT helps Google understand about 1 in 10 English-language queries in the United States.

"In particular, in the case of longer, more conversational requests where prepositions, such as" for "and" to "are of great importance, the search engine [now] able to understand the context of the words in your query," - said in a Google blog.

However, not all queries are conversational or include prepositions. Branded searches and shorter phrases are just two examples of queries that may not require BERT.

How does BERT affect featured snippets?

As the example above shows, BERT can also affect the results that appear in answer blocks (featured snippets).

In the example below, Google compares featured snippets for the query [parking on a hill with no curb], explaining:

"In the past, such a request brought our systems into confusion - we paid too much attention to the word" curb "and ignored the word" no ", not realizing it was more critical to the proper response to this request. So we returned results for parking on a hill with a border. "

What is the difference between BERT and RankBrain?

Some of BERT's capabilities may sound like RankBrain, Google's first AI-based method for understanding queries. But they are two separate algorithms that can both be used to improve search results.

"The first thing to understand about RankBrain, - is that it works in parallel with the conventional algorithms ranking in organic search, and used to correct the results calculated by these algorithms," - said Eric Enge (Eric Enge), Perficient Digital general manager.

RankBrain adjusts results by looking at the current query and finding similar queries from the past. It then reviews how well the search results performed for those historical queries. "Based on what it sees, RankBrain may adjust the output of the normal ranking algorithms," Enge said.

RankBrain also helps Google interpret search queries so that it can show results that do not necessarily contain the exact words in the query. In the example below, Google was able to determine that the user was looking for information about the Eiffel Tower, even though the tower's name does not appear in the query [the height of the symbol of Paris].

"BERT operates in a completely different manner," Enge continued. "Traditional algorithms try to analyze the content on a page to understand what it is about and what it may be relevant to. However, traditional NLP algorithms are typically only able to look at the content before or after a word to better understand its meaning and relevance. The bidirectional component of BERT is what sets it apart."

As mentioned above, BERT looks at the content both before and after a word to refine its understanding of that word's meaning and relevance. "This is a critical enhancement in natural language processing, because human communication is naturally layered and complex," Enge said.

Google uses both BERT and RankBrain to process queries and web page content and to better understand the meaning of the words they contain.

BERT is not a replacement for RankBrain. Google can use several methods to understand a query, which means BERT may be applied on its own, alongside other Google algorithms, in tandem with RankBrain, in any combination, or not at all, depending on the search query.

What other Google products may be affected by BERT?

Google's announcement of the BERT rollout applies only to Search, but the update will also affect the Assistant to some extent. When queries made through Google Assistant return featured snippets or organic search results, those results may be influenced by BERT.

In comments to Search Engine Land, a Google representative said that BERT is not currently used for ads, but if it is integrated into that vertical in the future, it may help fix some of the poor close-variant matching that frustrates advertisers.

Is it possible to optimize a site for BERT?

According to Google Search Liaison Danny Sullivan: "There's nothing to optimize for with BERT, nor anything for anyone to be rethinking. The fundamentals of us seeking to reward great content remain unchanged."

Google's long-standing advice for ranking well is to keep the user in mind and create content that satisfies their search intent. Since BERT is designed to interpret that intent, it makes sense that giving users what they want remains Google's core recommendation.

"Optimization" now means that you can focus more on quality and well-written content, rather than to seek a compromise between the creation of content for your audience and the construction of linear phrases for cars.

Where can I learn more about BERT?

Below is a small selection of links to materials for learning more about BERT. All of them are in English.

Source: Search Engine Land