A Complete List of Important Natural Language Processing Frameworks you should Know (NLP Infographic)

Sanad 14 Jun, 2020 • 2 min read

Overview

  • Here’s a list of the most important Natural Language Processing (NLP) frameworks released in the last two years that you need to know
  • From Google AI’s Transformer to Facebook Research’s XLM/mBERT, we chart the rise of NLP through the lens of these seismic breakthroughs

 

Introduction

Have you heard about the latest Natural Language Processing framework that was released recently? I don’t blame you if you’re still catching up with the superb StanfordNLP library or the PyTorch-Transformers framework!
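
To give you a taste of how approachable these frameworks have become, here is a minimal sketch (my own, not from the infographic) of pulling contextual embeddings out of a pretrained BERT model with PyTorch-Transformers. It assumes you have pytorch-transformers and PyTorch installed and can download the 'bert-base-uncased' weights:

import torch
from pytorch_transformers import BertTokenizer, BertModel

# Load the pretrained tokenizer and model (weights are downloaded on first run)
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained('bert-base-uncased')
model.eval()

# Convert a sentence to token ids and run it through BERT
input_ids = torch.tensor([tokenizer.encode("Attention is all you need")])
with torch.no_grad():
    outputs = model(input_ids)

# outputs[0] holds one contextual vector per token: (batch, sequence_length, hidden_size)
print(outputs[0].shape)

A handful of lines gets you representations that used to take weeks of training to produce.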

There has been a remarkable rise in the amount of research and breakthroughs happening in NLP in the last couple of years.

I can trace this recent rise to one (seismic) paper – “Attention is All You Need” by Google AI in June 2017. This breakthrough has spawned so many new and exciting NLP libraries that enable us to work with text in ways that were previously limited to our imagination (or Hollywood).

Here is the interest in natural language processing according to Google searches in the last 5 years in the US:

[Google Trends chart: interest in “natural language processing”, United States, past 5 years]

We can see a similar pattern when we expand the search to include the entire globe!

Today, we have State-of-the-Art approaches for Language Modeling, Transfer Learning and many other important and advanced NLP tasks. Most of these involve the application of deep learning, especially the Transformer architecture that was introduced in the above paper.
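
To make that concrete, here is a toy sketch of scaled dot-product attention, the operation at the heart of the Transformer. This is just the core formula from the paper applied to random tensors, not the full multi-head architecture:

import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)  # how much each query attends to every key
    weights = F.softmax(scores, dim=-1)                # attention weights sum to 1 for each query
    return weights @ v                                 # weighted sum of the value vectors

# Toy example: a 'sentence' of 5 tokens, each represented by a 64-dimensional vector
q = k = v = torch.randn(5, 64)   # self-attention: queries, keys and values come from the same tokens
out = scaled_dot_product_attention(q, k, v)
print(out.shape)                 # torch.Size([5, 64])

Every token’s output is a mixture of every other token’s representation, which is what lets these models capture context so effectively.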

So we decided to collate all the important developments in one place and in one neat timeline. You’ll love this infographic and should keep it handy during your own NLP journey.

I have listed a few tutorials below to help you get started with these frameworks:

Without any further ado, here is the infographic in all its glory! And if you want to download the high-resolution PDF (which you really should), head over here.

A computer science graduate, I have previously worked as a Research Assistant at the University of Southern California (USC-ICT), where I employed NLP and ML to make better virtual STEM mentors. My research interests include using AI and its allied fields of NLP and Computer Vision to tackle real-world problems.


Responses From Readers


Deep Chatterjee 07 Sep, 2019

Thank you for this comprehensive list and associated tutorials at one place.

Kailash Atal 27 Sep, 2019

This is a very good infographic that captures progress in NLP in the recent past.
