Have you heard about the latest Natural Language Processing (NLP) framework to be released? I don’t blame you if you’re still catching up with the superb StanfordNLP library or the PyTorch-Transformers framework!
There has been a remarkable rise in the amount of research and breakthroughs happening in NLP in the last couple of years.
I can trace this recent rise to one (seismic) paper – “Attention Is All You Need”, published by Google AI in June 2017. That breakthrough has spawned so many new and exciting NLP libraries that enable us to work with text in ways previously limited to our imagination (or Hollywood).
Here is the interest in natural language processing according to Google searches in the last 5 years in the US:
We can see a similar pattern when we expand the search to include the entire globe!
Today, we have state-of-the-art approaches for language modeling, transfer learning, and many other important and advanced NLP tasks. Most of these involve deep learning, especially the Transformer architecture introduced in the paper mentioned above.
So we decided to collate all the important developments in one place and in one neat timeline. You’ll love this infographic and should keep it handy during your own NLP journey.
I have listed a few tutorials below to help you get started with these frameworks:
Without further ado, here is the infographic in all its glory! And if you want to download the high-resolution PDF (which you really should), head over here.