Google BERT: In A Nutshell
Posted on November 04, 2019 by Michael Grezlikowski
Google has just rolled out a new update called BERT (Bidirectional Encoder Representations from Transformers) as of October 24th, 2019. BERT is already affecting search results in the United States and is expected to reach other countries as Google continues the roll-out. BERT is the biggest update in the five years since RankBrain, and it continues Google's push to make its algorithm better at understanding searcher intent. It is expected to affect 1 out of 10 organic search results, along with featured snippets. So what can you expect to come from it?
Google BERT will help Google understand long-tail queries better by drawing context from articles and pronouns. Previously, Google would select certain keywords or phrases from a long-tail search and show results relevant to those terms. The results were not always the most relevant, however, because Google did not fully understand the context of the longer query. Now, thanks to BERT, Google can better discern context from the pronouns and articles in long-tail searches, bringing more relevant results to those queries. We suspect this change was also put in place to accommodate people who use voice search, since voice search tends to produce more conversational queries.
Marketers shouldn’t be terrified of Google BERT. We believe it will make content marketing even MORE effective, while also allowing legacy content marketing efforts to earn better placement in organic search results. Additionally, we believe this update further underscores the importance of a great content strategy built around conversational, long-tail search terms.
For more information on Google BERT, check out Google's own announcement:
https://www.blog.google/products/search/search-language-understanding-bert/