Google BERT Update Set to Re-Shape Google’s Understanding of Users

Last week, Google made headlines by announcing that its Search platform had received its most important update in the past five years.

“BERT” is named after Google’s Bidirectional Encoder Representations from Transformers project, which was released as open source in November 2018. At its core, BERT uses machine learning to better understand conversational search queries, so that search results are more accurate and relevant to a searcher’s needs.


Related Reading: In May of 2019, Jacob Devlin, Ming-Wei Chang, Kenton Lee and Kristina Toutanova – a group of researchers at Google AI Language – provided more information on how BERT works in their research paper, which is available online as a PDF.

How Significant is BERT?
According to Google’s announcements, BERT was initially rolled out for English-language searches in the United States. While other regions and languages are expected to benefit from BERT in the future, digital marketers latched on to the fact that BERT was expected to impact search results for approximately 10% of all Google searches.

Why is 10% so significant to us?

It starts with an understanding that, by common estimates, Google handles more than 5.6 billion searches each day. In other words, applying the 10% figure, BERT will soon be working to provide improved results for more than 560 million search queries every day.

The last change of similar significance that Google acknowledged took place back in October of 2015, when RankBrain was confirmed. At the time, RankBrain – another machine learning algorithm enhancement – was said to impact 15% of Google Searches.

BERT’s Impact on Search Engine Marketing
For years, digital marketing communities have been quick to react, review and interpret Google’s announcements in large part due to the impact they have on search engine optimization.

In the case of BERT, it’s important to note that the new technology is focused on an improved understanding of a user’s search query – not the end content that we as marketers normally look to optimize.

With BERT working to improve Google’s understanding of searcher intent, search engine marketers responsible for Google Ads performance and search engine optimization (SEO) need to pay close attention to changes in search impressions in the wake of BERT’s release. This is because BERT uses natural language processing and nuanced context to evaluate queries for meaning where simple pattern matching is insufficient.
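As a toy illustration (not Google’s actual algorithm), consider how a purely keyword-based view of a query throws away the word order and prepositions that carry intent. Google’s announcement made this point with a query about a Brazilian traveler to the USA: swap the two countries and the set of keywords is identical, even though the meaning reverses.

```python
# Toy illustration (not Google's algorithm): a bag-of-words view of a
# query discards word order, so two queries with opposite intents look
# identical to simple keyword pattern matching.

def bag_of_words(query: str) -> list:
    """Reduce a query to its sorted, lowercased terms."""
    return sorted(query.lower().split())

# Same words, opposite travel direction:
q1 = "2019 brazil traveler to usa need a visa"
q2 = "2019 usa traveler to brazil need a visa"

print(bag_of_words(q1) == bag_of_words(q2))  # True: indistinguishable
# A context-aware model like BERT reads the full sentence in both
# directions, so "to usa" versus "to brazil" changes the interpretation.
```

The point of the sketch is only that keyword overlap alone cannot separate these two intents; resolving them requires a model that accounts for the relationship of each word to the words around it.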

Here’s a look at one example that Google provided when announcing BERT. The query, “can you get medicine for someone pharmacy,” remains unchanged – but the accuracy of the search results improves significantly.

(Image: Google’s “before” and “after” search results for the pharmacy query.)

In the “BEFORE” example, Google is trying to provide the user with information about filling their own prescription at a pharmacy. With BERT, the “AFTER” example shows improved results that help the searcher understand how they may be able to pick up a prescription from a pharmacy on behalf of another person.

For more interesting examples, we encourage you to review Google’s announcement of BERT, “Understanding searches better than ever before.”

The Future of Google Search with BERT

While there’s no immediate need to optimize content specifically for BERT, search marketing managers should be watching impression-level data and queries much more closely now that BERT has gone live. For data sources, stick to Google Analytics, Google Ads and Google Search Console – being mindful that some platforms (like Search Console) may be delayed by a few days in providing search query data.

One final piece of advice: continue watching for BERT updates from the digital marketing industry as well as from Google itself. Machine learning technologies are dynamic in nature and, as such, are subject to change.

In the case of RankBrain, some six months after that technology was confirmed, at least one Google employee admitted on the record that Google “didn’t quite understand” all that RankBrain was doing at the time.

In announcing BERT, Pandu Nayak, Google Fellow and Vice President of Search, described the release as “one of the biggest leaps forward in the history of Search.”

Don’t worry, Google. We’ll be watching.