Monday, February 11, 2019

BERT for Natural Language Understanding

Was just introduced to this again, worth a look:

BERT Technology introduced in 3-minutes, by Suleiman Khan, Ph.D., in Medium

Google BERT is a pre-training method for natural language understanding that achieved state-of-the-art results across a range of NLP tasks.

BERT works in two steps. First, it uses a large amount of unlabeled data to learn a language representation in an unsupervised fashion; this is called pre-training. Then the pre-trained model can be fine-tuned in a supervised fashion on a small amount of labeled training data to perform various tasks. Pre-training machine learning models has already seen success in domains including image processing and natural language processing (NLP). ... "
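
To make the two-step idea concrete, here is a minimal sketch of step two (fine-tuning) in Python. It uses the Hugging Face transformers library, which is my choice for illustration and is not named in the post; the model name, tiny dataset, and learning rate are likewise assumptions, not the article's.

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Step 1 (pre-training) is already done for us: loading
# "bert-base-uncased" downloads weights learned from unlabeled text.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # e.g. binary sentiment
)

# Step 2 (fine-tuning): a tiny, hypothetical labeled dataset standing in
# for "a small amount of labeled training data".
texts = ["great movie", "terrible plot"]
labels = torch.tensor([1, 0])

batch = tokenizer(texts, padding=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# One supervised training step; the model computes a built-in
# cross-entropy loss when labels are passed in.
model.train()
optimizer.zero_grad()
outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()

In a real setup this loop would run over many batches and epochs, but the structure is the same: a pre-trained encoder plus a small task-specific head trained on labels.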
