UNDERSTANDING GOOGLE’S BERT ALGORITHM

In October 2019, Google rolled out what it has called its most important update in half a decade: the deep-learning algorithm BERT. BERT stands for “Bidirectional Encoder Representations from Transformers,” but unless you have a strong tech background, that name probably does not help you understand its importance. Essentially, BERT is a deep-learning algorithm that analyzes the queries users enter into the Google Search bar. It is designed to give Google a better understanding of the context of searches so that users can more easily find relevant web pages.

How BERT Works

Much of human language depends on context. In English, for example, the word “sock” can mean comfy footwear or a punch in the face; “hot” can mean attractive, popular, or of high temperature; “content” can mean information or a state of satisfaction; and so forth. When humans read these words, their meanings are often determined by their placement in the sentence and the words around them. BERT analyzes search queries and assesses the context of each word, parsing them in much the same way a human does. This allows Search to return more accurate results.
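Under the hood, BERT represents each word as a vector of numbers that depends on the words around it. As a rough illustration, here is a minimal sketch using the open-source Hugging Face transformers library and the public bert-base-uncased model (our choice for illustration; Google has not published its production setup) that shows the same word receiving two different representations in two different contexts:

```python
# Minimal sketch: the same word, two contexts, two different BERT vectors.
# Assumes the open-source Hugging Face `transformers` library and the public
# `bert-base-uncased` checkpoint; this is not Google's production system.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

sentences = [
    "The soup is too hot to eat.",         # "hot" = high temperature
    "That new song is really hot now.",    # "hot" = popular
]

vectors = []
for text in sentences:
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # one vector per token
    # Locate the token "hot" and keep its context-dependent vector.
    position = inputs.input_ids[0].tolist().index(
        tokenizer.convert_tokens_to_ids("hot")
    )
    vectors.append(hidden[position])

# A context-free model would score identical words at 1.0 here; BERT's
# context-aware vectors score noticeably lower.
similarity = torch.cosine_similarity(vectors[0], vectors[1], dim=0)
print(f"Similarity of the two 'hot' vectors: {similarity.item():.2f}")
```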

BERT is not the be-all, end-all of algorithms, though it is a dramatic step forward for search technology. The true import of the algorithm is that it employs deep-learning, a technique that allows BERT to continually learn from the context real humans use and refine its behavior based on that new data. There are several ways this technology is likely to have an impact.

Applications of BERT

Of course, the primary way BERT will benefit users is by improving search results. However, its emphasis on conversational, colloquial speech may help users in other ways. For example, BERT’s ability to parse the way real humans talk and type allows users to phrase queries more naturally. In the past, users may have run into difficulties with Search if a query hinged on small words like “for” or “from” that were integral to the meaning of the question. Now, BERT can analyze those words accurately rather than simply ignoring them, leading to better results without users having to translate a natural sentence into stilted search-speak. It is also possible that BERT could be used to enhance or test language understanding in other technologies, such as Siri or Alexa, further improving voice-activated assistants.
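To see this in action, here is a small sketch, again assuming the open-source transformers library and the public bert-base-uncased model rather than Google’s internal stack. Masking out the preposition shows that BERT predicts it from the surrounding context, which it could not do if words like “from” were simply thrown away:

```python
# Sketch: BERT treats prepositions as meaningful, predicting them from context.
# Assumes Hugging Face `transformers` and `bert-base-uncased` (illustrative
# choices; not Google's production setup).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for text in [
    "She booked a flight [MASK] New York to London.",
    "He borrowed the book [MASK] the library.",
]:
    best = fill_mask(text, top_k=1)[0]
    print(f"{text} -> {best['token_str']} (score {best['score']:.2f})")
# The model typically fills in "from" for both, driven purely by context.
```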

Writing with BERT in Mind

SEO specialists reacted to the news of the BERT update in expected fashion: near panic about optimizing for the new algorithm so that their webpages show up first in queries. In reality, not much has changed on that front. BERT analyzes the queries themselves, not webpages, and is designed to better handle the natural language patterns humans already use. Google’s public Search Liaison, Danny Sullivan, tweeted about the SEO anxiety, saying that there is no reason to optimize specifically for BERT. Instead, Google’s search engine will continue to look for “great content.”

Great content is still great content. As usual, SEO writers should concentrate on writing with a clear focus, structuring their content well, and following web-writing best practices. While BERT will help with some issues, like Google’s past trouble with pronouns, it will not compensate for poorly written articles. Think of it this way: BERT allows Google’s search engine to act more like a human when reading and assessing queries, but even we humans can read a webpage and be lost as to its ultimate “point.” A cluttered, unstructured webpage is unlikely to benefit from BERT.

One misconception that has spread through the SEO world is the idea that BERT’s ability to parse natural language means it is now better to optimize your content for longer searches. The notion is that, since BERT helps Google understand searches entered in the sometimes long-winded, stream-of-consciousness style of the spoken word, longer queries will become the norm and webpages should be designed around that eventuality. This is not the case. BERT simply helps the search engine parse those long entries and connect them to specific content on relevant websites. It is designed to surface quality content that matches the search, so it essentially maps those long-winded queries onto the content that already exists on your webpage.

The Reach of BERT

As SEO writers, we should embrace BERT as a tool for refining the relevance of search results. Rather than writing for some lowest common denominator, the emphasis should always be on creating content that is specific, useful, factual, and well written. The improvements BERT brings to search are important and represent a major leap forward in natural language algorithms. For now, Google plans to apply BERT to about 10% of search queries. In other words, roughly one out of every ten queries should produce more accurate results thanks to BERT’s ability to understand words within a greater context.

Ten percent of searches is nothing to sneeze at, though we should not expect BERT to set the world on fire. The main difference we can expect to see is that users will be more satisfied with the relevance of their results, and webpages crafted for relevance and quality will rise to the top as they should. Search rankings, for now, should remain relatively unaffected, since only about 10% of searches will use BERT. Eventually, Google may implement BERT on a larger scale, though doing so currently taxes the company’s hardware quite a bit.

The World of Deep-Learning

BERT is the latest example of an algorithm that relies on deep-learning. A subset of machine learning, deep-learning is integral to many applications of artificial intelligence and mimics humans’ ability to grow and adapt. In short, it allows BERT and algorithms like it to structure and analyze messy, unstructured data. Deep-learning relies on neural networks, layers of artificial “neurons” loosely modeled on the human brain. These networks process data in a hierarchical fashion, each layer building on the output of the one before it, which lets the algorithm handle information nonlinearly. As a result, BERT can analyze data quickly while also learning from its previous experiences with other sets of data. In theory, this means BERT will continually improve its ability to parse search queries accurately.
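To make the “layers” idea concrete, here is a toy neural network in PyTorch. It is purely illustrative and unrelated to Google’s actual code, but it shows the hierarchical structure described above: each layer transforms the output of the one before it, with nonlinearities in between:

```python
# Toy illustration of a layered (hierarchical) neural network in PyTorch.
# Purely illustrative; BERT's real architecture is a Transformer built from
# attention layers, and this sketch is not Google's code.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(16, 32),   # first layer: combines raw input features
    nn.ReLU(),           # nonlinearity: lets the network model nonlinear patterns
    nn.Linear(32, 32),   # second layer: builds on the first layer's output
    nn.ReLU(),
    nn.Linear(32, 2),    # final layer: e.g., a relevant / not-relevant score
)

x = torch.randn(4, 16)   # a batch of 4 made-up feature vectors
scores = model(x)
print(scores.shape)      # torch.Size([4, 2])
```

During training, the network’s weights are nudged after each batch of examples, which is the sense in which a deep-learning system “learns from its previous experiences” with data.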

Deep-learning is the way of the future, with artificial intelligence proving useful in fields as varied as fraud detection and the sorting of social media data. As SEO writers and content creators, we should embrace this trend and welcome a search engine that can recognize quality content and deliver results tailored to the user’s needs. We are like students raising our hands in class, and BERT is like the teacher ready to call on us. Webpages scream out “pick me, pick me,” but BERT can spot the student who knows the answer and will call on her to speak before the class. It’s a crude metaphor, but the gist is simple: focus on creating high-quality content that meets users’ needs, and BERT will help those users find your webpage more easily than they could before.