Tuesday, October 8 • 1:00pm - 1:50pm
PRO WORKSHOP (AI): Taming the BERT: Transfer Learning for NLP


Transfer learning makes it possible to take deep neural networks pretrained on large datasets and adapt them to new tasks. In computer vision, fine-tuning such pretrained models has long been far more common than training from scratch. In NLP, however, due to the lack of models pretrained on large corpora, the most common transfer learning technique had been fine-tuning pretrained word embeddings. These embeddings serve only as the first layer of a model on the new dataset, which still requires training from scratch with large amounts of labeled data to achieve good performance. Finally, in 2018, several pretrained language models (ULMFiT, OpenAI GPT and BERT) emerged. These models are trained on very large corpora and enable robust transfer learning: many NLP tasks can be fine-tuned with little labeled data. In this talk we'll learn the architecture of these pretrained language models. In particular, we'll share how different transfer learning techniques have been used with BERT to solve various downstream tasks in the NLP community.
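The fine-tuning pattern the abstract describes can be sketched in a few lines of PyTorch. This is not BERT itself; `TinyEncoder` is a hypothetical stand-in for a pretrained encoder (real BERT stacks many transformer layers pretrained on a large corpus), used only to show how a new task head is attached and the whole network is updated on a small labeled batch.

```python
import torch
import torch.nn as nn

class TinyEncoder(nn.Module):
    """Toy stand-in for a pretrained encoder such as BERT (illustrative only)."""
    def __init__(self, vocab_size=100, hidden=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.layer = nn.TransformerEncoderLayer(
            d_model=hidden, nhead=4, batch_first=True)

    def forward(self, ids):
        h = self.layer(self.embed(ids))
        return h[:, 0]  # first-token representation, analogous to BERT's [CLS]

class Classifier(nn.Module):
    """Pretrained encoder + new, randomly initialized task-specific head."""
    def __init__(self, encoder, hidden=32, num_labels=2):
        super().__init__()
        self.encoder = encoder
        self.head = nn.Linear(hidden, num_labels)

    def forward(self, ids):
        return self.head(self.encoder(ids))

encoder = TinyEncoder()        # imagine these weights came from pretraining
model = Classifier(encoder)

# Fine-tuning: gradients flow into the encoder AND the head, so a small
# labeled dataset adapts the whole network rather than just the first layer.
opt = torch.optim.AdamW(model.parameters(), lr=2e-5)
ids = torch.randint(0, 100, (4, 8))    # batch of 4 token-id sequences
labels = torch.tensor([0, 1, 0, 1])
loss = nn.functional.cross_entropy(model(ids), labels)
loss.backward()
opt.step()
```

By contrast, the older word-embedding approach would freeze or reuse only `self.embed` and train everything above it from scratch, which is why it needs far more labeled data.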

AI DevWorld 2019 Speakers

Joan Xiao

Principal Data Scientist, Linc Global
Joan Xiao is a Principal Data Scientist at Linc Global, a commerce-specialized customer care automation company. In her role, she applies novel natural language processing techniques to improve customer experience. Previously she led machine learning and data science teams at various...


Tuesday October 8, 2019 1:00pm - 1:50pm PDT
AI DevWorld -- Workshop Stage 1