Modeling a very large number of time series (upwards of 100 million) in a single shot has always been a challenging task for traditional machine learning models. An LSTM-based network can model all of the series jointly while learning embeddings of the categorical features that identify each object (time series).
This hack session will involve an end-to-end walkthrough of the neural network architecture and a live code-running session in PyTorch, covering data loader creation, efficient batching, categorical embeddings, a multilayer perceptron for static features, and an LSTM for temporal features.
The session details the creation of data loaders in PyTorch, with a step-by-step code walkthrough for building temporal (day of week, week, day, etc.) and static (item, store, etc.) embeddings along with the input dataset. It then covers batching in PyTorch and creating tensors for each type of feature, followed by the neural network code modules that learn the embeddings together with the multilayer perceptron and the LSTM. Finally, it addresses model inference and the experimental setup: inferring results and tracking the loss at each iteration of every epoch in order to get the neural network right.
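As an illustration of the data-loader step, here is a minimal sketch of a PyTorch `Dataset` that yields separate tensors for static categorical features (e.g. item, store) and temporal categorical features (e.g. day of week, week). All class names, feature counts, and shapes are assumptions for the example, not the session's actual code.

```python
import torch
from torch.utils.data import Dataset, DataLoader

class SeriesDataset(Dataset):
    """Yields (static categoricals, temporal categoricals, target) per series."""

    def __init__(self, static_cats, temporal_cats, targets):
        # static_cats:   (num_series, num_static_features)            long
        # temporal_cats: (num_series, seq_len, num_temporal_features) long
        # targets:       (num_series, seq_len)                        float
        self.static_cats = static_cats
        self.temporal_cats = temporal_cats
        self.targets = targets

    def __len__(self):
        return self.static_cats.size(0)

    def __getitem__(self, idx):
        return (self.static_cats[idx],
                self.temporal_cats[idx],
                self.targets[idx])

# Toy data: 8 series, 14 time steps, 2 static and 2 temporal categoricals.
static = torch.randint(0, 5, (8, 2))
temporal = torch.randint(0, 7, (8, 14, 2))
y = torch.randn(8, 14)

loader = DataLoader(SeriesDataset(static, temporal, y),
                    batch_size=4, shuffle=True)
s, t, yy = next(iter(loader))
print(s.shape, t.shape, yy.shape)
# torch.Size([4, 2]) torch.Size([4, 14, 2]) torch.Size([4, 14])
```

Keeping the static and temporal tensors separate in each batch is what later lets the network route them to different modules without replicating the static values across every time step on disk.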
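The modules described above (embedding tables, an MLP for static features, an LSTM for temporal features) can be sketched as a single self-contained network. The cardinalities, embedding size, and hidden sizes below are placeholder assumptions; one embedding table is created per categorical feature, so each can have a different number of classes.

```python
import torch
import torch.nn as nn

class EmbedMLPLSTM(nn.Module):
    def __init__(self, static_cards, temporal_cards, emb_dim=8,
                 mlp_hidden=32, lstm_hidden=64):
        super().__init__()
        # One embedding table per categorical feature; cardinalities vary.
        self.static_embs = nn.ModuleList(
            nn.Embedding(c, emb_dim) for c in static_cards)
        self.temporal_embs = nn.ModuleList(
            nn.Embedding(c, emb_dim) for c in temporal_cards)
        self.mlp = nn.Sequential(
            nn.Linear(emb_dim * len(static_cards), mlp_hidden),
            nn.ReLU())
        self.lstm = nn.LSTM(emb_dim * len(temporal_cards),
                            lstm_hidden, batch_first=True)
        self.head = nn.Linear(lstm_hidden + mlp_hidden, 1)

    def forward(self, static_cats, temporal_cats):
        # static_cats:   (batch, num_static)            long
        # temporal_cats: (batch, seq_len, num_temporal) long
        s = torch.cat([emb(static_cats[:, i])
                       for i, emb in enumerate(self.static_embs)], dim=-1)
        s = self.mlp(s)                    # (batch, mlp_hidden)
        t = torch.cat([emb(temporal_cats[:, :, i])
                       for i, emb in enumerate(self.temporal_embs)], dim=-1)
        out, _ = self.lstm(t)              # (batch, seq_len, lstm_hidden)
        # Broadcast the static context across time, then predict per step.
        s = s.unsqueeze(1).expand(-1, out.size(1), -1)
        return self.head(torch.cat([out, s], dim=-1)).squeeze(-1)

# E.g. static features with 10 and 4 classes; temporal with 7 and 53.
model = EmbedMLPLSTM(static_cards=[10, 4], temporal_cards=[7, 53])
static = torch.randint(0, 4, (5, 2))
temporal = torch.stack([torch.randint(0, 7, (5, 14)),
                        torch.randint(0, 53, (5, 14))], dim=-1)
pred = model(static, temporal)
print(pred.shape)  # torch.Size([5, 14])
```

Because the static embeddings pass through the MLP once per series and are only broadcast across time inside the forward pass, the input data never has to repeat static columns for every time step.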
Key Takeaways for the Audience
- Deep neural network architecture for multiple time series in PyTorch
- Learning embeddings for all the categorical features with a varying number of classes
- Code-level understanding of the seq2seq encoder-decoder LSTM model
- Infusing static and temporal features separately into a network so as to avoid any data duplication when modeling big data
- Setting up and running experiments for tuning and making network changes for improved accuracy
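The last takeaway, analyzing the loss at every iteration of every epoch, can be sketched with a minimal training loop. The tiny model, toy batches, and hyperparameters here are stand-ins, not the session's actual experiment code; the point is the `history` record that makes runs comparable after architecture or tuning changes.

```python
import torch
import torch.nn as nn

# Placeholder model and optimizer standing in for the real network.
model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.MSELoss()

# Toy regression batches standing in for the real DataLoader.
batches = [(torch.randn(32, 4), torch.randn(32, 1)) for _ in range(10)]

history = []  # (epoch, iteration, loss) triples for later analysis
for epoch in range(3):
    for it, (x, y) in enumerate(batches):
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()
        history.append((epoch, it, loss.item()))

print(len(history))  # 30 recorded losses: 3 epochs x 10 iterations
```

Plotting `history` per epoch is a quick way to see whether a network change improved convergence or merely shifted the loss curve.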
Check out the video below to learn more about the session.