June 18-21, 2019
Beijing, CN

The Unreasonable Effectiveness of Transfer Learning on NLP

This will be presented in English.

David Low
11:15–11:55 Thursday, June 20, 2019

Prerequisite Knowledge

Basic knowledge of machine learning, deep learning and natural language processing concepts

What You'll Learn

- Understand how transfer learning allows data scientists to build accurate models without much data
- Apply transfer learning to natural language processing tasks

Description

First, I will give an introduction to transfer learning and its history. Transfer learning has been a tremendous success in the computer vision field, largely as a result of the ImageNet competition. I will then explain why pre-trained models are handy for tackling machine learning problems with limited data, and demonstrate how they can be used as fixed feature extractors for downstream tasks and applications.
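The fixed-feature-extractor pattern can be sketched as follows. Here a frozen random projection stands in for a real pre-trained encoder (the talk does not pin down a specific model), and only a lightweight classifier is trained on top; the data, shapes, and the name `extract_features` are all illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Stand-in for a pre-trained encoder: frozen weights that are
# NOT updated during downstream training (hypothetical placeholder).
W_pretrained = rng.normal(size=(100, 16))

def extract_features(x):
    # Fixed feature extractor: pass inputs through the frozen weights.
    return np.tanh(x @ W_pretrained)

# Tiny synthetic downstream task with limited labeled data.
X_raw = rng.normal(size=(200, 100))
y = (X_raw[:, 0] > 0).astype(int)

# Only the classifier head is fit; the "pre-trained" encoder stays frozen.
clf = LogisticRegression(max_iter=1000).fit(extract_features(X_raw), y)
print(f"training accuracy: {clf.score(extract_features(X_raw), y):.2f}")
```

The key design point is that gradient updates never touch `W_pretrained`, so the downstream task only has to learn a small head, which is why so little labeled data suffices.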

In the past few months, the natural language processing field has witnessed several breakthroughs in transfer learning, namely ELMo, the OpenAI Transformer, ULMFiT, and BERT. Pre-trained models derived from these techniques have achieved state-of-the-art results on a wide range of NLP problems. The use of pre-trained models has come a long way since the introduction of word2vec and GloVe, both of which are considered shallow in comparison.

As an example, I will walk through the code and steps to fine-tune a transfer learning model to achieve state-of-the-art accuracy (92%) on a real-world sentiment classification problem – the Amazon Reviews dataset. Whereas a comparable FastText-based model is trained on the full dataset (3.6 million samples), it takes just 1,000 training samples to produce a model with similar performance.
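The fine-tuning step described above can be sketched in miniature. This NumPy-only toy replaces the real pre-trained language model and the Amazon Reviews data with synthetic stand-ins, but it shows the fine-tuning heuristic itself: the encoder starts from "pre-trained" weights and receives gentle updates, while the freshly initialised classifier head gets a larger learning rate (discriminative learning rates, as popularised by ULMFiT).

```python
import numpy as np

rng = np.random.default_rng(42)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# "Pre-trained" encoder weights (hypothetical stand-in for a real model).
W_enc = rng.normal(size=(50, 16)) * 0.1
# Freshly initialised classifier head for the downstream sentiment task.
w_head = np.zeros(16)

# Small labeled downstream dataset (the talk uses ~1,000 samples).
X = rng.normal(size=(1000, 50))
y = (X[:, :3].sum(axis=1) > 0).astype(float)

lr_head, lr_enc = 0.5, 0.05  # discriminative learning rates

for _ in range(200):
    H = np.tanh(X @ W_enc)             # encoder forward pass
    p = sigmoid(H @ w_head)            # head forward pass
    g = p - y                          # gradient of log loss w.r.t. logits
    grad_head = H.T @ g / len(y)
    # Backpropagate through tanh into the encoder weights.
    grad_H = np.outer(g, w_head) * (1 - H**2)
    grad_enc = X.T @ grad_H / len(y)
    w_head -= lr_head * grad_head      # head: larger updates
    W_enc -= lr_enc * grad_enc         # encoder: gentle fine-tuning

acc = ((sigmoid(np.tanh(X @ W_enc) @ w_head) > 0.5) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

In contrast to the frozen-extractor setup, here the encoder weights do move, just slowly, so the model adapts to the new task without destroying what was learned during pre-training.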


David Low

David Low is currently the Co-founder and Chief Data Scientist at, building AI-powered chatbots to disrupt and shape the booming conversational commerce space with deep natural language processing. He represented Singapore and the National University of Singapore (NUS) in the Data Science Game '16 in France and clinched the top spot among teams from Asia and the Americas. Recently, David was invited by NUS as a guest lecturer to conduct masterclasses on applied machine learning and deep learning topics. Prior to that, he was a Data Scientist with the Infocomm Development Authority (IDA) of Singapore.

Throughout his career, David has worked on data science projects across the manufacturing, telco, e-commerce, and insurance industries. Some of his work, including sales forecast modeling and influencer detection, has won awards in several competitions and was featured on the IDA website and in an NUS publication. Earlier in his career, David was involved in research collaborations with Carnegie Mellon University (CMU) and the Massachusetts Institute of Technology (MIT) on separate projects funded by the National Research Foundation and SMART. In his spare time, he competes on Kaggle, where he has achieved a top 0.2% worldwide ranking.
