Introduction to Cyclical Learning Rates for Training Neural Nets  

The learning rate is one of the most important hyperparameters to tune when training a neural network, yet choosing a suitable value is often tricky and, more often than not, time-consuming. In this session, I am going to talk about a technique called Cyclical Learning Rates (CLR), which promises a systematic approach to choosing a learning rate for training neural networks. Recently, researchers at fast.ai, an AI education firm, were able to beat many state-of-the-art models by using this technique as one of their main ingredients.
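
To give a quick flavor of the technique before the session, here is a minimal sketch of the triangular policy from Leslie Smith's paper "Cyclical Learning Rates for Training Neural Networks", in which the learning rate oscillates linearly between a lower and an upper bound. The values of base_lr, max_lr, and step_size below are illustrative placeholders, not recommendations:

    import numpy as np

    def triangular_clr(iteration, base_lr=1e-4, max_lr=1e-2, step_size=2000):
        # One cycle spans 2 * step_size iterations: the learning rate
        # climbs linearly from base_lr to max_lr over step_size
        # iterations, then descends back to base_lr, and repeats.
        cycle = np.floor(1 + iteration / (2 * step_size))
        x = np.abs(iteration / step_size - 2 * cycle + 1)
        return base_lr + (max_lr - base_lr) * max(0.0, 1.0 - x)

    # A few points in the first cycle: iteration 0 gives base_lr,
    # 2000 gives max_lr, and 4000 returns to base_lr.
    for it in (0, 1000, 2000, 3000, 4000):
        print(it, triangular_clr(it))

The session walks through why cycling between bounds like this can work better than a fixed or monotonically decaying rate.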

Overview of the Talk
  1. Why are learning rates used?

  2. Some existing approaches for choosing the right learning rate

  3. What are the shortcomings of these approaches?

  4. The need for a systematic approach to setting the learning rate – Cyclical Learning Rates (CLR) (see the sketch after this list)

  5. What is CLR?

  6. Some amazing results shown by CLR

  7. Conclusion
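
Item 4 above hints at how CLR makes the choice systematic: in Smith's paper, the two bounds are typically picked with a short learning rate "range test" before real training begins. A minimal, framework-free sketch of such a schedule, with illustrative num_iters, min_lr, and max_lr values, might look like this:

    import numpy as np

    def lr_range_test_schedule(num_iters=1000, min_lr=1e-7, max_lr=1.0):
        # Grow the learning rate exponentially from min_lr to max_lr
        # over num_iters steps. Training briefly with this schedule
        # while recording the loss lets you plot loss vs. learning
        # rate and read off CLR bounds: base_lr where the loss first
        # starts to fall, max_lr just before the loss blows up.
        steps = np.arange(num_iters)
        return min_lr * (max_lr / min_lr) ** (steps / (num_iters - 1))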

 

Speaker

Sayak Paul

Sayak is a Project Instructor at DataCamp and actively contributes to the DataCamp community by writing tutorials and posts on Data Science and Machine Learning. Previously, he worked on Data Privacy at TCS Research and Innovation. Sayak has been carrying out research in Applied Machine Learning since his college days. Currently, he is working on applying Differential Transfer Learning to enhance Cyber Security applications. Apart from this, Sayak regularly speaks about Machine Learning at developer meetups and is a contributing author at Towards Data Science, one of the most popular online blogs for Data Science.

