What is BERT?
BERT is a natural language processing technology that Google announced in late 2018; it became a hot topic when it was incorporated into Google's search engine. The name is an abbreviation of Bidirectional Encoder Representations from Transformers, and it is pronounced "bert".
Natural language processing (NLP) is the technology that enables computers to understand human language, and it existed long before BERT. In other words, BERT is one kind of NLP technique.
Features of BERT
The most distinctive feature of BERT is the way it processes words.
For example, if you search for "travel bags not for women", traditional NLP would place women's travel bags at the top of the results. The query can be broken down into "women's", "not" (Japanese "janai"), and "travel bag", but traditional methods cannot determine that "not" modifies "women's". As a result, the clear nouns "women's" and "travel bag" are treated as the search targets, while "not", whose role is hard to resolve, is ignored.
BERT, on the other hand, understands the context of the search keywords. In the example above, it determines that "not" negates "women's", so products registered as men's or gender-free rise to the top of the results. *Results vary depending on the search keyword.
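The contrast above can be sketched with a toy example. This is not BERT itself, only a minimal illustration of the difference between keyword-only matching, which drops function words like "not", and a negation-aware rule that ties "not" to the word it modifies. All function names and tag sets here are hypothetical.

```python
def keyword_match(query, tags):
    """Traditional keyword matching: function words such as 'not' are
    treated as stopwords and dropped, so negation is silently ignored."""
    stopwords = {"not", "for", "a", "the"}
    keywords = [t for t in query.lower().split() if t not in stopwords]
    return all(k in tags for k in keywords)

def context_aware_match(query, tags):
    """Negation-aware matching: 'not X' excludes products tagged X,
    loosely mimicking how a contextual model links 'not' to 'women'."""
    required, excluded = [], []
    negate = False
    for t in query.lower().split():
        if t == "not":
            negate = True
        elif t in {"for", "a", "the"}:
            continue  # skip function words, but keep the pending negation
        elif negate:
            excluded.append(t)
            negate = False
        else:
            required.append(t)
    return all(k in tags for k in required) and not any(k in tags for k in excluded)

# Hypothetical product tag sets
womens_bag = {"travel", "bags", "women"}
mens_bag = {"travel", "bags", "men"}
query = "travel bags not for women"

# Keyword matching ranks the women's bag highest despite the negation;
# the context-aware rule correctly prefers the men's bag.
print(keyword_match(query, womens_bag))        # True  (wrong result)
print(context_aware_match(query, womens_bag))  # False
print(context_aware_match(query, mens_bag))    # True
```

The point of the sketch is only that resolving which word "not" depends on changes the result; BERT achieves this with bidirectional attention over the whole query rather than hand-written rules.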
By adopting BERT for natural language processing, Google is said to have improved about 10% of English-language searches. BERT is an algorithm that can learn new words efficiently, judge the context of search keywords, and output search results that better match the user's intent.