In late October 2019, Google announced its latest update to Search, named BERT, which allows Google to better understand search intent and return more relevant results to users.
Google has called it “the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search”.
But what is BERT? How does it work? And what should marketers and SEOs do in response?
BERT (or the slightly less catchy Bidirectional Encoder Representations from Transformers) is a neural network-based technique created by Google to better understand and process natural language.
Thanks to machine learning, BERT allows Google to better understand the full context of a search. For example, it weighs how words like “from”, “to” and “for” shape the meaning of the words around them, and what a searcher is actually looking for, rather than returning results focused only on the main keywords of a query.
An example used by Google is the query “2019 brazil traveler to USA need a visa”. Before BERT, the results displayed information for US travellers heading to Brazil. After BERT, Google Search now understands the importance of the word “to” in that query, and returns information specific to a traveller going from Brazil to the US.
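Google’s production systems are not public, but the underlying BERT models are open source, so the core idea is easy to see for yourself. The sketch below is a minimal illustration, assuming the Hugging Face transformers library and the publicly released bert-base-uncased checkpoint (not the model powering Search): the same word, “to”, receives a different vector depending on the words around it.

```python
# Minimal sketch: BERT assigns context-dependent vectors to each word.
# Assumes the open-source Hugging Face `transformers` library and the
# public `bert-base-uncased` checkpoint, NOT Google's production system.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def vector_for(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's contextual vector for the first occurrence of `word`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (tokens, 768)
    word_id = tokenizer.convert_tokens_to_ids(word)
    position = (inputs["input_ids"][0] == word_id).nonzero()[0, 0]
    return hidden[position]

# The same word "to" in two queries with opposite travel directions...
a = vector_for("2019 brazil traveler to usa need a visa", "to")
b = vector_for("2019 usa traveler to brazil need a visa", "to")

# ...gets different representations, because BERT reads the whole query
# in both directions before representing each token.
print(f"cosine similarity: {torch.cosine_similarity(a, b, dim=0).item():.3f}")
```

Because every word’s vector is computed from its full left and right context, a small function word like “to” can change the meaning the model assigns to the rest of the query, which is exactly what the visa example above shows at the level of search results.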
Simply put, BERT lets Google return more accurate results for more nuanced, complex and conversational queries.
As of late October 2019, BERT affects 10% of English-language searches in the US. Google plans to roll out BERT to more countries and languages over time.
BERT impacts both standard organic listings and featured snippets at the top of search results.
Because of how BERT operates, Google can take what the model learns in one language and apply it to others. BERT is also live for featured snippets in every language where they are available, and Google specifically states that BERT is improving featured snippets in Hindi, Korean and Portuguese.
Danny Sullivan, Google’s public Search Liaison, has stated that there’s nothing to optimise for with BERT.
Fundamentally, it is not possible to optimise for machine-learning technology of this kind, or, as Google describes it, “a neural network-based technique for natural language processing (NLP) pre-training”.
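To unpack that jargon slightly: “pre-training” refers to how BERT learns language in the first place, by predicting words that have been hidden from it using the context on both sides (the masked language modelling task from the original BERT paper). A minimal sketch, again assuming the open-source Hugging Face transformers library and the public bert-base-uncased checkpoint rather than anything running inside Search:

```python
from transformers import pipeline

# Load the public BERT checkpoint with its masked-language-model head.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT's pre-training task: guess the hidden word from BOTH sides of it.
for prediction in fill_mask("I need a visa to travel to the [MASK]."):
    print(f'{prediction["token_str"]:>12}  {prediction["score"]:.3f}')
```

The point is that there is no lever here for an SEO to pull: the model learns from vast amounts of ordinary text, not from any on-page signal that can be targeted.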
Danny Sullivan has again reinforced that SEOs should continue to write content for users. This makes perfect sense, as BERT is technology that helps Google better understand searches and thereby return content that best meets the needs of the user.
As BERT makes it easier for Google to return content that matches more complex queries, pages that closely address a specific user need may improve in search visibility, having previously been overshadowed by broader and/or more generic pages.
This, however, remains to be seen. For now, SEOs and marketers should continue creating content that meets their users’ needs.