Google brings in BERT to improve search results
Google today announced one of the biggest updates to its search algorithm in recent years. By using new neural network techniques to better understand the intent behind queries, Google says it can now offer more relevant results for about one in ten U.S. searches in English (with support for other languages and locales coming later). For featured snippets, the update is already live globally.
In the world of search updates, where algorithm changes are often far more subtle, an update that touches 10% of searches is quite a big deal (and will certainly keep the world's SEO experts up at night).
Google notes that this update will work best for longer, more conversational queries, and in many ways that is how Google wants you to search these days, because a full sentence is easier to interpret than a series of keywords.
The technology behind this new neural network is called “Bidirectional Encoder Representations from Transformers,” or BERT. Google first talked about BERT last year and open-sourced its implementation along with pre-trained models. Transformers are one of the more recent developments in machine learning. They work especially well for data where the sequence of elements is important, which obviously makes them a useful tool for working with natural language, and hence for search.
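Because BERT reads a sentence in both directions at once, the representation of each word is conditioned on its full context rather than only on the words before it. Here is a minimal sketch of that idea, using the Hugging Face transformers library as a convenient stand-in for Google's open-sourced release; the library, model name, and sample queries are illustrative assumptions, not details from the article:

```python
# A minimal sketch: contextual embeddings from a pre-trained BERT model.
# The library, model name, and queries below are illustrative assumptions.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

# Longer, conversational queries of the kind the update targets.
queries = [
    "can you get medicine for someone at the pharmacy",
    "how to park on a hill with no curb",
]

with torch.no_grad():
    for query in queries:
        inputs = tokenizer(query, return_tensors="pt")
        outputs = model(**inputs)
        # One context-dependent vector per token: shape (1, seq_len, 768).
        # The same word gets a different vector in a different sentence,
        # which is what "bidirectional" buys you over keyword matching.
        print(query, "->", tuple(outputs.last_hidden_state.shape))
```

Google's production systems run its own models on its own hardware, of course; the sketch only shows the kind of per-word, context-aware representation BERT produces.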
This BERT update also marks the first time Google uses its latest Tensor Processing Unit (TPU) chips to serve search results.
Ideally, this means Google Search is now better able to understand exactly what you're looking for and to provide more relevant search results and featured snippets. The update started rolling out this week, so chances are you're already seeing some of its effects in your search results.