Google’s biggest update in years will change the way you search
If you’ve ever tried to find something on Google only to feel like you and the search engine are speaking different languages, relief is on the way. Google has introduced its biggest update to Search in years — and that’s good news for the average English speaker.
Thanks to a new artificial intelligence technique, Google Search can pick up on the nuances of words, understand their context and generally make it easier to find what you’re looking for.
As many of us have learned, even the simplest word can trip up Google, but this new function frees us from using a string of keywords to find information on a topic. Now you can actually ask a natural-sounding question and get what you’re looking for.
BERT joins Google Search
In one example Google shared, a searcher wants to know whether someone traveling from Brazil to the U.S. needs a visa. The query isn’t phrased in perfect English, but any reader can tell what’s being asked. The old Google Search, however, interpreted the question backwards, as an American traveling to Brazil.
Under the updated Google Search, Brazilian travelers get information on visiting the U.S. instead. So what changed? Ask BERT.
Last year, Google introduced an open-source neural network-based technique for natural language processing pre-training called Bidirectional Encoder Representations from Transformers, also known as BERT.
What does this techno incantation mean? According to a blog post from Google Fellow and Vice President Pandu Nayak, it means the technology enables anyone to train their own state-of-the-art question-answering system.
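To make that a little more concrete, here’s roughly what using such a question-answering system looks like in code. This is a minimal sketch that leans on the open-source Hugging Face “transformers” library and a BERT model already fine-tuned on the SQuAD question-answering dataset; neither is mentioned in Google’s announcement, so treat the specific library and model names as illustrative assumptions rather than Google’s own setup.

```python
# Sketch: ask a question against a short paragraph using a BERT model
# fine-tuned for extractive question answering. The library and checkpoint
# are assumptions for illustration, not part of Google Search itself.
from transformers import pipeline

# Load a publicly available BERT checkpoint fine-tuned on SQuAD.
qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

context = (
    "Citizens of Brazil generally need a visitor visa to travel to the "
    "United States for tourism or business."
)

result = qa(
    question="Does a Brazilian traveler to the USA need a visa?",
    context=context,
)

# Prints the span of the paragraph the model picked as the answer,
# along with its confidence score.
print(result["answer"], round(result["score"], 3))
```

Running it prints the snippet of the paragraph that best answers the question, plus a confidence score — the same basic trick, at a vastly larger scale, that Search is now applying to your queries.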
BERT has learned not just to recognize individual words but also what they mean when put together, because it looks at the words that come before and after a term rather than processing them one at a time. That’s heavy-duty machine learning, especially for a language as messy as English, and it can even figure out what a query means despite misspellings.
Basically, BERT is trying to do more than understand our words — it’s trying to understand our intent. This is major stuff.
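If you want a feel for what “understanding context” means in practice, here’s a small sketch: BERT gives the same word a different numerical representation depending on the words around it. The model, the example sentences and the similarity comparison below are illustrative choices, not anything from Google’s announcement.

```python
# Sketch: the same word gets different BERT vectors in different contexts.
# Model choice and sentences are illustrative assumptions.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def embedding_for(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's context-dependent vector for `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (num_tokens, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

# "stand" appears in all three sentences, but only two share a meaning.
a = embedding_for("he ran a lemonade stand all summer", "stand")
b = embedding_for("please stand for the national anthem", "stand")
c = embedding_for("she set up a hot dog stand near the park", "stand")

cos = torch.nn.functional.cosine_similarity
print("different senses:", round(cos(a, b, dim=0).item(), 3))  # lower score
print("same sense:      ", round(cos(a, c, dim=0).item(), 3))  # higher score
```

The two “stand”-as-a-booth sentences end up with vectors that are more similar to each other than to the “stand up” sentence — which is exactly the kind of distinction that helps Search figure out what you actually meant.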
How does this help our searches?
According to Nayak, Google ran a number of tests to make sure BERT actually makes Google Search more helpful. For example:
Pre-BERT, a search for math practice books for adults turned up books aimed at students: Google Search figured practice books are for students, so “adults” must mean “young adults.” Wrong.
Post-BERT, the results reflect the actual intent of the query: you’re an adult looking for math practice books written for adults. Imagine how much scrolling and query rewriting you’ll save once BERT is active.
Nayak claims BERT will help Google Search better understand one in 10 English searches in the U.S. With billions of questions asked each day, that one in 10 can really add up.
Right now, BERT is only being deployed for English-language searches in the U.S.; however, Google hopes the lessons learned from training BERT on English will make it easier to extend the system to other languages.
Lest you think BERT is all-knowing, Nayak admits it still has plenty to learn. For example, when asked what state is south of Nebraska, BERT’s best guess was “South Nebraska.” As you can see, BERT is still a long way from the capabilities of Skynet, the fictional AI of the “Terminator” movies.