Google BERT
BERT (Bidirectional Encoder Representations from Transformers) is an artificial intelligence model developed by Google, designed to interpret the context of a word within a sentence by considering the terms that appear both before and after the analyzed word.
Traditionally, search algorithms focused on single keywords or key phrases, but BERT can pick up linguistic nuances and understand complex or ambiguous queries. For example, in the query “can you pick up medication for someone who is sick?”, BERT helps Google understand that the search intent is to know whether it is allowed to pick up medication for a third person, correctly interpreting the context.
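To make the idea of bidirectional context concrete, here is a minimal sketch of masked-word prediction with a publicly released BERT model. It uses the Hugging Face transformers library and the bert-base-uncased checkpoint, which are assumptions for illustration and are not mentioned in the text above; the point is simply that the model reads the words on both sides of the gap before predicting it.

# Minimal sketch: masked-token prediction with a pretrained BERT model.
# Assumes the Hugging Face "transformers" library and the public
# "bert-base-uncased" checkpoint (illustrative choices, not from the article).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT uses the context both before and after [MASK] to rank candidate words.
for prediction in fill_mask("The doctor wrote a [MASK] for the patient."):
    print(prediction["token_str"], round(prediction["score"], 3))

Because the model attends to the full sentence in both directions, the words "doctor" and "patient" jointly steer the prediction, which is the same kind of contextual reading that lets Google interpret the medication query above.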