Free Resources for Beginners on Deep Learning and Neural Networks

avcontentteam 24 May, 2020 • 10 min read

Introduction

Machines have already started their march towards artificial intelligence. Deep Learning and Neural Networks are probably the hottest topics in machine learning research today. Companies like Google, Facebook and Baidu are investing heavily in this field of research.

Researchers believe that machine learning will strongly influence human life in the near future. Human tasks will be automated using robots with a negligible margin of error. I’m sure many of us would never have imagined that machine learning could hold such gigantic power.

To ignite your desire, I’ve listed the best tutorials on Deep Learning and Neural Networks available on the internet today. I’m sure these will be of help to you! Take your first step today.

Time for some motivation here. You ‘must’ watch this before scrolling further. This ~3min video was released yesterday by Google. Enjoy!

 

Time to proceed further. First, let’s understand Deep Learning and Neural Networks in simple terms.

 

What is a Neural Network?

The concept of neural networks began way back in the 1980s but has gained renewed interest in recent times. A neural network is originally a biological phenomenon: a ‘network’ of interconnected neurons which maintain a high level of coordination to receive and transmit messages to the brain and spinal cord. In machine learning, we refer to this structure as an ‘Artificial Neural Network’.

An Artificial Neural Network (ANN), as the name suggests, is a network of layers of artificially created ‘neurons’ which are then trained to perform cognitive tasks the way a human brain does. Image recognition, voice recognition, soft sensors, anomaly detection, time series prediction, etc. are all applications of ANNs.
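To make this concrete, here is a minimal sketch of a single artificial neuron in Python/NumPy: a weighted sum of inputs passed through a non-linear activation. The inputs, weights and bias below are made-up illustrative values, not taken from any of the resources listed later.

```python
import numpy as np

def sigmoid(z):
    """Squash a value into (0, 1) -- a common neuron activation."""
    return 1.0 / (1.0 + np.exp(-z))

# A single artificial neuron: output = activation(w . x + b)
x = np.array([0.5, -1.2, 3.0])   # illustrative inputs
w = np.array([0.4, 0.7, -0.2])   # illustrative weights (learned during training)
b = 0.1                          # bias term

output = sigmoid(np.dot(w, x) + b)
print(output)                    # a value between 0 and 1
```

Training an ANN amounts to adjusting weights like `w` and `b` across many such neurons until the network's outputs match the desired behaviour.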

 

What is Deep Learning?

In simple words, Deep Learning can be understood as a family of algorithms built from neural networks with multiple hidden layers. It can also learn from unlabelled (unsupervised) data and is known to provide more accurate results than traditional ML algorithms on many problems.

Input data is passed through several layers of non-linearities before the output is delivered. This lets the network go ‘deeper’ (to higher levels of abstraction), reusing the computations of earlier layers instead of duplicating that effort, unlike ‘shallow’ algorithms. As it goes deeper and deeper, each layer filters out more complex features and combines them with those of the previous layer, leading to better results.

Algorithms like Decision Trees, SVM and Naive Bayes are ‘shallow’ algorithms: they do not build such layered representations, so previous computations are hard to reuse. A rough sketch of the layered idea follows below.
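Here is a rough sketch of that layered idea in Python/NumPy: a small stack of layers where each one transforms the representation produced by the previous one. The layer sizes and random weights are purely illustrative, not taken from any particular network.

```python
import numpy as np

def relu(z):
    """A common non-linearity applied between layers."""
    return np.maximum(0.0, z)

def forward(x, layers):
    """Pass input x through a stack of (weight, bias) layers.

    Each layer reuses the representation computed by the previous one --
    the 'deeper' abstraction described above.
    """
    h = x
    for W, b in layers:
        h = relu(W @ h + b)
    return h

rng = np.random.default_rng(0)
# Three layers with illustrative random weights: 4 -> 8 -> 8 -> 2
sizes = [4, 8, 8, 2]
layers = [(rng.normal(size=(n_out, n_in)), np.zeros(n_out))
          for n_in, n_out in zip(sizes[:-1], sizes[1:])]

x = rng.normal(size=4)           # an illustrative input vector
print(forward(x, layers))        # the network's 2-dimensional output
```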

Deep Learning, through Neural Networks, takes us a step closer to Artificial Intelligence.

 

What do Experts have to say?

Earlier this year, AMAs took place on Reddit with the masters of Deep Learning and Neural Networks. Given my ever-rising eagerness to dig up the latest information about this field, I got the chance to attend their AMA sessions. Let’s see what they had to say about the existence and future of this field:

1. Geoffrey Hinton said, ‘The brain has about 10^14 synapses and we only live for about 10^9 seconds. So we have a lot more parameters than data. This motivates the idea that we must do a lot of unsupervised learning since the perceptual input (including proprioception) is the only place we can get 10^5 dimensions of constraint per second.’
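As a quick back-of-the-envelope check of the orders of magnitude in that quote (my own arithmetic, not part of the AMA):

```python
# Back-of-the-envelope check of the numbers in Hinton's quote.
synapses = 1e14          # rough number of synapses in the brain
lifetime_s = 1e9         # ~10^9 seconds is roughly 31.7 years
print(lifetime_s / (60 * 60 * 24 * 365.25))   # ~31.7 years

# Parameters available per second of experience: about 10^5, which is why
# he argues the perceptual stream (also ~10^5 dimensions per second) must
# be exploited through unsupervised learning.
print(synapses / lifetime_s)                  # 100000.0
```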

 

2. Yann LeCun, on emotions in robots, said, ‘Emotions do not necessarily lead to irrational behavior. They sometimes do, but they also often save our lives. If emotions are anticipations of outcome (like fear is the anticipation of impending disasters or elation is the anticipation of pleasure), or if emotions are drives to satisfy basic ground rules for survival (like hunger, desire to reproduce), then intelligent agent will have to have emotions’

 

3. Yoshua Bengio said, ‘Recurrent or recursive nets are really useful tools for modelling all kinds of dependency structures on variable-sized objects. We have made progress on ways to train them and it is one of the important areas of current research in the deep learning community. Examples of applications: speech recognition (especially the language part), machine translation, sentiment analysis, speech synthesis, handwriting synthesis and recognition, etc.’

 

4. Jurgen Schmidhuber says, ’20 years from now we’ll have 10,000 times faster computers for the same price, plus lots of additional medical data to train them. I assume that even the already existing neural network algorithms will greatly outperform human experts in most if not all domains of medical diagnosis, from melanoma detection to plaque detection in arteries, and innumerable other applications.’

 

P.S. I am by no means an expert on Neural Networks. In fact, I have just started my journey in this fascinating world. If you know of other good free resources which I have not shared below, please feel free to suggest them.

 

Below is the list of free resources that will help you master these concepts:

 

Courses

Machine Learning by Andrew Ng: If you are a complete beginner to machine learning and neural networks, this course is the best place to start. Enrollment for the current batch ends on Nov 7, 2015. This course provides a broad introduction to machine learning, deep learning, data mining and neural networks using some useful case studies. You’ll also learn about the best practices for these algorithms and where we are heading with them.

 

Neural Network Course on Coursera: Who could teach Neural Networks better than Hinton himself? This is a highly recommended course. Though it is archived now, you can still access the course material. It’s an 8-week-long course and would require you to dedicate at least 7-9 hours per week. This course expects prior knowledge of Python / Octave / Matlab and a good hold on mathematical concepts (vectors, calculus, algebra).

 

In addition to the courses above, I found useful slides and lecture notes from Deep Learning programs at a few of the world’s top universities:

Carnegie Mellon University – Deep Learning: This course ended on 21st October 2015 and is archived now, but you can still access the slides shared during the course. Learning from slides is a great way to understand concepts quickly. These slides cover most aspects of deep learning up to a certain point. I wouldn’t recommend this study material to beginners, but rather to intermediate and advanced learners in this domain.

 

Deep Learning for NLP – This tutorial was delivered at a 2013 conference on Human Language Technologies. The best part was the knowledge that got shared. The slides and videos are easily accessible and comprise simple explanations of complex concepts. Beginners will find these videos worth watching, as the instructor begins the session with Logistic Regression and then dives deeper into the use of machine learning algorithms.

 

Deep Learning for Computer Vision – This course commenced at the start of 2015 at Columbia University. It focuses on deep learning techniques for vision and natural language processing problems. The course embraces Theano as its programming tool and requires prior knowledge of Python and NumPy programming, NLP and Machine Learning. A tiny taste of Theano’s style is sketched below.
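Since the course is built around Theano, here is a tiny, hedged taste of Theano's symbolic style, assuming Theano is installed (`pip install theano`). It is a one-layer illustration of my own, not material from the course.

```python
import numpy as np
import theano
import theano.tensor as T

# Declare symbolic variables: Theano builds a computation graph first,
# then compiles it into a fast callable function.
x = T.dvector('x')                     # symbolic input vector
W = T.dmatrix('W')                     # symbolic weight matrix
b = T.dvector('b')                     # symbolic bias

y = T.nnet.sigmoid(T.dot(W, x) + b)    # symbolic expression for one layer
layer = theano.function([x, W, b], y)  # compile the graph into a function

# Call the compiled function with concrete (illustrative) NumPy values.
print(layer(np.ones(3), np.eye(2, 3), np.zeros(2)))
```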

 

Deep Learning: This is an archived graduate course on Deep Learning, taught by Yann LeCun in Spring 2014. The valuable slides and videos are accessible. I’d highly recommend this course to beginners; you’d be amazed by how LeCun explains things, very simply and aptly. To get the best out of this course, I’d suggest you work on the assignments too, for self-evaluation.

 

Books


 

This book is written by Christopher M. Bishop. It serves as an excellent reference for students keen to understand the use of statistical techniques in machine learning and pattern recognition. The book assumes knowledge of linear algebra and multivariate calculus, and provides a comprehensive introduction to statistical pattern recognition techniques, supported by practice exercises.

 

 

Because of the rapid development and active research in the field, there aren’t many printed and accessible books available on Deep Learning. However, I found that Yoshua Bengio, Ian Goodfellow and Aaron Courville are working on a book. You can check its recent developments here.

Neural Networks and Deep Learning: This book is written by Michael Nielsen and is available FREE online. If you are good at learning things at your own pace, I’d suggest you read this book. There are just 6 chapters, and every chapter goes into great detail on concepts related to deep learning, using really nice illustrations.

 

Blogs

Here are some of the best bets I have come across:

Beginners

Introduction to Neural Networks: This is Chapter 10 of the book ‘The Nature of Code’. You’ll find the writing style simple and easy to comprehend. The author explains neural networks from scratch. Along with the theory, you’ll also find code to practice and apply on your own (a minimal perceptron-style sketch in the spirit of the chapter is shown below). This will not only give you the confidence to learn these concepts, but will also let you experience their impact.
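For flavour, here is a quick perceptron-style sketch in Python/NumPy with made-up data, in the spirit of what the chapter builds (the chapter's own examples are more complete and use its own code style).

```python
import numpy as np

# Learn to classify points above/below the line y = x with a perceptron.
rng = np.random.default_rng(42)
points = rng.uniform(-1, 1, size=(200, 2))             # random (x, y) points
labels = np.where(points[:, 1] > points[:, 0], 1, -1)  # +1 above the line, -1 below

w = np.zeros(3)                  # weights for [x, y, bias]
lr = 0.01                        # learning rate

for epoch in range(20):
    for (x, y), target in zip(points, labels):
        inputs = np.array([x, y, 1.0])               # bias input fixed at 1
        guess = 1 if np.dot(w, inputs) >= 0 else -1
        w += lr * (target - guess) * inputs          # perceptron learning rule

preds = np.where(points @ w[:2] + w[2] >= 0, 1, -1)
print("training accuracy:", (preds == labels).mean())
```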

 

Hacker’s Guide to Neural Networks: Though the code in this blog is written in JavaScript, which you might not know, I’d still suggest referring to it for the simplicity of its theoretical explanations. This tutorial has very little math, but you’ll need plenty of logic to comprehend the later parts.

 

Intermediates

Recurrent Neural Network Part 1, Part 2, Part 3, Part 4: After you are comfortable with the basics of neural nets, it’s time to move to the next level. This is probably the best guide you’ll need to master RNNs. An RNN is a form of artificial neural network whose neurons send feedback signals to each other, so the network carries state from one step of a sequence to the next (a minimal forward step is sketched below). I’d suggest you follow the 4 parts religiously. The series starts RNNs from the basics, followed by backpropagation and its implementation.
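For orientation before diving into the series, here is a minimal sketch of the core RNN recurrence in Python/NumPy, with illustrative sizes and random weights rather than anything from the tutorials.

```python
import numpy as np

# A single forward step of a vanilla RNN:
#   h_t = tanh(W_xh @ x_t + W_hh @ h_prev + b)
rng = np.random.default_rng(1)
input_size, hidden_size = 5, 8

W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
b = np.zeros(hidden_size)

def rnn_step(x_t, h_prev):
    """Combine the current input with the previous hidden state (the 'feedback')."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b)

h = np.zeros(hidden_size)                      # initial hidden state
for x_t in rng.normal(size=(4, input_size)):   # a toy sequence of 4 inputs
    h = rnn_step(x_t, h)
print(h)                                       # hidden state after the sequence
```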

 

Unreasonable Effectiveness of RNN: Consider this an additional resource on RNNs. If you are fond of exploring options, you might like to check this blog. It starts with the basic definition of an RNN and goes all the way into building character-level models. This should give you more hands-on experience implementing neural networks in various situations.

 

Backward Propagation Neural Network: Here you’ll find a simple explanation of how to implement a backpropagation neural network (a minimal gradient step is sketched below). I’d suggest beginners follow this blog to learn more about the concept. It provides a step-by-step approach to understanding neural networks deeply.
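As a taste of the idea, here is a minimal gradient-descent update for a single sigmoid neuron in Python/NumPy; backpropagation applies the same chain-rule reasoning layer by layer through a full network. The data, target and learning rate are made up for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, -1.0, 2.0])   # illustrative input
t = 1.0                          # illustrative target
w = np.zeros(3)
b = 0.0
lr = 0.5                         # learning rate

for step in range(100):
    y = sigmoid(np.dot(w, x) + b)        # forward pass
    error = y - t                        # dLoss/dy for loss 0.5 * (y - t)^2
    grad_z = error * y * (1 - y)         # chain rule through the sigmoid
    w -= lr * grad_z * x                 # backward pass: update weights
    b -= lr * grad_z

print("final prediction:", sigmoid(np.dot(w, x) + b))
```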

 

Deep Learning Tutorial by Stanford: This is by far the best tutorial/blog on deep learning available on the internet. Recommended by many, it explains the complete science and mathematics behind every algorithm using easy-to-understand illustrations. This tutorial assumes basic knowledge of machine learning, so I’d suggest starting it after finishing the Machine Learning course by Andrew Ng.

 

Videos

Complete Tutorial on Neural Networks: This complete playlist of neural network tutorials should satisfy your learning appetite. I found numerous videos, but few offered learning as comprehensive as this one.

 

Note: In order to get started quickly, I’d recommend you participate in the Facial Keypoint Detection Kaggle competition. Though this competition ended a long time ago, you can still participate and practice. Moreover, you’ll also find a benchmark solution for this competition. Here is the solution: Practice – Neural Nets. Get going!

 

Deep Learning Lectures: Here is a complete series of lectures on Deep Learning from the University of Oxford (2015), taught by Nando de Freitas. These lectures cover a wide range of topics, from linear models, logistic regression and regularization to recurrent neural nets. Instead of rushing through these videos, I’d suggest you devote a good amount of time to developing a concrete understanding of the concepts. Start from Lecture 1.

 

Introduction to Deep Learning with Python: After learning the theoretical aspects of these algorithms, it’s now time to practice them using Python. This ~1 hour video is highly recommended for practicing deep learning in Python using Theano.

 

Deep Learning Summer School, Montreal 2015: Here are the videos from the Deep Learning Summer School, Montreal 2015. These videos cover advanced topics in Deep Learning, hence I wouldn’t recommend them to beginners. However, people with a working knowledge of machine learning should definitely watch them. These videos will take your deep learning understanding to a new level. Needless to say, they are FREE to access.

 

Also See: Top Youtube Videos on Machine Learning, Deep Learning and Neural Networks

 

Research Papers

I could list numerous papers published on Deep Learning here, but that would defeat the purpose. Hence, to highlight the best resources, I’ve listed some of the seminal papers in this field:

Deep Learning in Neural Networks

Introduction to Deep Learning

Deep Boltzmann Machines

Learning Deep Architectures for AI

Deep Learning of Representations: Looking Forward

Gradient based training for Deep Architectures

 

End Notes

By now, I’m sure you have a lot of work carved out for yourself. I found these resources intimidating initially, but the videos and blogs totally helped me regain my confidence. As said above, these are free resources and can be accessed from anywhere. If you are a beginner, I’d recommend you start with the Machine Learning course by Andrew Ng and read through the blogs too.

I’ve tried to provide the best resources available on these topics at present. As mentioned before, I am not an expert on neural networks and machine learning (yet)! So it is quite possible that I missed out on some useful resource. Did I? Maybe! Please share your views / suggestions in the comments section below.

If you like what you just read & want to continue your analytics learning, subscribe to our emails, follow us on twitter or like our facebook page.



