Google BERT, one of the biggest leaps forward in the history of Search.
According to Google: "We are dramatically improving the way we understand queries, which is the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search."
Example of a search result before and after Google BERT
Definition of Google BERT
In natural language processing, BERT, an acronym for Bidirectional Encoder Representations from Transformers, is a language model developed by Google in 2018. This method has significantly improved natural language processing algorithms. (Source: Wikipedia)
Google BERT is an algorithm that helps the search engine better understand natural language.
Explanation of Google BERT by Olivier Duffez
Video explanation of Google BERT by Olivier Andrieu
Google BERT: the algorithm capable of understanding search intent
- Navigational query (a brand or site name)
- Transactional query (buy, sell…)
- Informational query (who, what, how…?)
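To make the three intent types concrete, here is a minimal sketch of intent classification. This is not Google's method: BERT uses a deep neural network, while this illustration uses simple keyword rules, and all the keyword lists below are invented examples.

```python
# Toy rule-based classifier for the three classic search-intent types.
# Illustration only: the keyword and brand lists are made up.
TRANSACTIONAL_CUES = {"buy", "sell", "price", "cheap", "order"}
INFORMATIONAL_CUES = {"who", "what", "how", "why", "when"}
KNOWN_BRANDS = {"nike", "amazon", "wikipedia"}  # hypothetical brand list

def classify_intent(query: str) -> str:
    words = set(query.lower().split())
    if words & KNOWN_BRANDS:
        return "navigational"    # user wants a specific site or brand
    if words & TRANSACTIONAL_CUES:
        return "transactional"   # user wants to act (buy, sell, order)
    if words & INFORMATIONAL_CUES:
        return "informational"   # user wants an answer to a question
    return "informational"       # default fallback

print(classify_intent("nike official site"))        # → navigational
print(classify_intent("buy running shoes"))         # → transactional
print(classify_intent("how do transformers work"))  # → informational
```

In practice a model like BERT infers intent from the whole query in context rather than from isolated keywords, which is exactly what lets it handle queries these rules would get wrong.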
You now know:
- BERT makes the search engine more "intelligent". Example: it can handle homonymy (the same word taking different meanings depending on context).
- BERT is particularly well suited to:
  - Questions
  - Featured snippets
  - Voice search
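The homonymy point can be illustrated with a toy sketch. Real BERT learns contextual embeddings by reading the words on both sides of an ambiguous term; this sketch only mimics that idea by scoring each sense of a homonym against the surrounding words. The sense inventories and cue words are invented for the example.

```python
# Toy word-sense disambiguation by context overlap.
# Illustration only: BERT uses learned contextual embeddings,
# not hand-written cue-word sets like these.
SENSES = {
    "bank": {
        "financial institution": {"money", "loan", "account", "deposit"},
        "river edge": {"river", "water", "fishing", "shore"},
    }
}

def disambiguate(word: str, sentence: str) -> str:
    context = set(sentence.lower().split())
    senses = SENSES[word]
    # Pick the sense whose cue words overlap most with the context.
    return max(senses, key=lambda s: len(senses[s] & context))

print(disambiguate("bank", "I opened an account at the bank to deposit money"))
# → financial institution
print(disambiguate("bank", "We went fishing on the bank of the river"))
# → river edge
```

Because BERT reads the query bidirectionally (left and right of each word at once), it resolves this kind of ambiguity far more robustly than any keyword overlap could.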
I remain at your disposal.