In this article, we will cover the basic concepts of Recurrent Neural Networks (RNNs). So fasten your seatbelt: we are going to explore the fundamentals of RNNs with PyTorch.
Three key terms for RNNs:
Input: the input sequence fed to the RNN
Hidden: the hidden states at the last time step, for all layers
Output: the hidden states of the last layer, for all time steps, so that you can feed them to the next layer
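A minimal sketch can make these three terms concrete. The sizes below (batch of 2, sequence length 3, 2 features, 4 layers, hidden size 32) are illustrative assumptions, not values fixed by the article:

```python
import torch
import torch.nn as nn

batch_size, seq_len, input_size = 2, 3, 2   # assumed sizes for illustration
num_layers, hidden_size = 4, 32

rnn = nn.RNN(input_size, hidden_size, num_layers, batch_first=True)
inp = torch.randn(batch_size, seq_len, input_size)   # "Input"

output, hidden = rnn(inp)
# "Output": hidden states of the last layer, for all time steps
print(output.shape)  # torch.Size([2, 3, 32]) = (batch, seq_len, hidden_size)
# "Hidden": hidden states at the last time step, for all layers
print(hidden.shape)  # torch.Size([4, 2, 32]) = (num_layers, batch, hidden_size)
```

Note that `output[:, -1]` (last time step of the last layer) equals `hidden[-1]` (last layer of the final hidden state) for a unidirectional RNN.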
Figure: Unidirectional RNN with PyTorch (Image by Author)
In the figure above we have N time steps (horizontally) and M layers (vertically). We feed the input at t = 0, together with the initial hidden state, to the RNN cell; the resulting hidden state is then fed back into the same RNN cell along with the next input at t = 1, and we keep passing the hidden state forward through the whole input sequence. Implementation-wise in PyTorch, if you are new to the framework, here is one very useful tip that will help you learn quickly: care more about the shapes.
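The unrolling described above can be written out by hand with `nn.RNNCell`, which processes one time step at a time. The sizes here are illustrative assumptions:

```python
import torch
import torch.nn as nn

input_size, hidden_size = 2, 32     # assumed sizes for illustration
batch_size, seq_len = 2, 3

cell = nn.RNNCell(input_size, hidden_size)

x = torch.randn(seq_len, batch_size, input_size)  # sequence-first input
h = torch.zeros(batch_size, hidden_size)          # initial hidden state

for t in range(seq_len):
    # the hidden state produced at step t is fed back in at step t + 1
    h = cell(x[t], h)

print(h.shape)  # torch.Size([2, 32]): hidden state after the last time step
```

This loop is exactly what `nn.RNN` does for you internally for each layer.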
Assume we have the following one-dimensional input array (7 rows, 1 column).
Data Type 1 (Image by author)
From this array we create sequential data and labels, as shown above. Now we need to break the data into batches; let's say we take a batch size of 2.
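The figure is not reproduced here, so as a sketch, assume a sequence length of 3: each input is 3 consecutive values and the label is the value that follows. With 7 values this yields 4 (input, label) pairs, which split into 2 batches of size 2:

```python
import torch

data = torch.arange(7, dtype=torch.float32).view(7, 1)  # 7 rows, 1 column

seq_len, batch_size = 3, 2  # seq_len is an assumption for illustration

# sliding window: inputs are 3 consecutive values, label is the next value
inputs = torch.stack([data[i:i + seq_len] for i in range(len(data) - seq_len)])
labels = torch.stack([data[i + seq_len] for i in range(len(data) - seq_len)])

print(inputs.shape)  # torch.Size([4, 3, 1])
print(labels.shape)  # torch.Size([4, 1])

batches = inputs.split(batch_size)  # 2 batches, each of shape (2, 3, 1)
```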
Input To RNN
Input data: the input to an RNN should have 3 dimensions: (batch size, sequence length, input dimension).
The input data then looks like below; the only change is that the third dimension is now 2 (the number of features).
Up to this point, we have discussed input type 2, of shape (batch size, sequence length, input dimension). If we want to change this into input type 1, we need to permute the input: just swap the batch dimension with the sequence dimension, like below.
inp.permute(1,0,2) # switch dimension 0 and dimension 1
If you look closely, the first dimension is now 3, not 2, and it is our sequence length. The second dimension is 2, which is the batch size, and the third dimension is 2, which is the input dimension (the number of features). So our input shape is (3, 2, 2), which is input type 1. If you are a little confused, spend some time with this image.
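The permute above can be verified in a couple of lines; the (2, 3, 2) input below is the batch-first example from the text:

```python
import torch

inp = torch.randn(2, 3, 2)        # input type 2: (batch, seq_len, features)
seq_first = inp.permute(1, 0, 2)  # input type 1: (seq_len, batch, features)

print(seq_first.shape)  # torch.Size([3, 2, 2])
```

`permute` only reorders the dimensions, it does not copy or reorder values: `seq_first[t, b]` is the same row as `inp[b, t]`.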
Let’s implement our small recurrent neural network class by inheriting from the base class nn.Module. HL_size is the hidden size, which we can set to 32, 64, or 128 (powers of 2 tend to work well), and the input size is the number of features in our data (the input dimension). Here the input size is 2 for data type 2 and 1 for data type 1.
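A minimal sketch of such a class might look like this. The class name `SmallRNN`, the single-output `nn.Linear` head, and the choice to predict from the last time step are illustrative assumptions, not the article's exact code:

```python
import torch
import torch.nn as nn

class SmallRNN(nn.Module):  # hypothetical name for illustration
    def __init__(self, input_size, HL_size, num_layers, output_size):
        super().__init__()
        # HL_size is the hidden size, e.g. 32, 64, or 128
        self.rnn = nn.RNN(input_size, HL_size, num_layers, batch_first=True)
        self.fc = nn.Linear(HL_size, output_size)

    def forward(self, x):
        out, _ = self.rnn(x)        # out: (batch, seq_len, HL_size)
        return self.fc(out[:, -1])  # predict from the last time step

# input_size=2 for data type 2 (use 1 for data type 1)
model = SmallRNN(input_size=2, HL_size=32, num_layers=1, output_size=1)
y = model(torch.randn(2, 3, 2))    # (batch=2, seq_len=3, features=2)
print(y.shape)  # torch.Size([2, 1])
```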
The media shown in this article are not owned by Analytics Vidhya and are used at the Author’s discretion.