Table of Contents:
- Introduction and Recommendation Framework
- Evaluating Recommendation Systems
- Content Based Recommendations
- Neighborhood Based Collaborative Filtering
- User and Item Based Collaborative Filtering
- KNN Recommendations
- Matrix Factorisation
- Deep Learning — Introduction
- Restricted Boltzmann Machines
- Amazon DSSTNE and SageMaker
- Real-World Challenges and Solutions
Deep Scalable Sparse Tensor Neural Engine (DSSTNE)
Amazon open-sourced its recommender engine called DSSTNE, which makes it easy to apply deep neural networks to massive, sparse data sets and produce great recommendations at large scale.
- Works very well with sparse data
- No code required!
- Open source
How to use?
- Convert data into a specific format
- Write config file
There is no code involved: all you have to do is write a config file and remember a few terminal commands.
In the config file,
- ‘Kind’ is set to a feed-forward autoencoder; specifically, a sparse autoencoder.
- The sparsity constraint on the hidden layer forces the network to find meaningful patterns even when there are few hidden nodes.
- ‘SparsenessPenalty’ defines the constraint mentioned above. More details can be found in Andrew Ng’s CS294A lecture notes on sparse autoencoders.
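To make the discussion concrete, here is a sketch of what such a config file can look like. It is adapted from the sample autoencoder config in the DSSTNE repository; the layer sizes, dataset names (`gl_input`, `gl_output`), and penalty values are illustrative, so check the DSSTNE documentation before reusing them.

```json
{
    "Version" : 0.7,
    "Name" : "AE",
    "Kind" : "FeedForward",
    "SparsenessPenalty" : { "p" : 0.5, "beta" : 2.0 },
    "ShuffleIndices" : false,
    "Denoising" : { "p" : 0.2 },
    "Layers" : [
        { "Name" : "Input",  "Kind" : "Input",  "N" : "auto", "DataSet" : "gl_input",  "Sparse" : true },
        { "Name" : "Hidden", "Kind" : "Hidden", "Type" : "FullyConnected", "N" : 128, "Activation" : "Sigmoid", "Sparse" : true },
        { "Name" : "Output", "Kind" : "Output", "Type" : "FullyConnected", "N" : "auto", "DataSet" : "gl_output", "Activation" : "Sigmoid", "Sparse" : true }
    ],
    "ErrorFunction" : "ScaledMarginalCrossEntropy"
}
```

Note how `SparsenessPenalty` appears as a top-level setting, and the input and output layers are both marked `Sparse` so DSSTNE can exploit the sparsity of the rating data.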
A point to remember: if a user has rated a movie, it is taken as the binary value 1, no matter what the rating is; if a user hasn’t rated a movie, it is assigned the binary value 0. This implicit-feedback approach works surprisingly well!
You may try the same binarization approach with RBMs as well.
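The binarization described above can be sketched in a few lines of Python. The rating tuples here are made up for illustration:

```python
# Sketch: converting explicit ratings to the binary "did the user rate it?"
# signal described above. The data below is illustrative.
ratings = [
    (1, 101, 4.0),  # (user_id, movie_id, rating)
    (1, 102, 1.5),
    (2, 101, 5.0),
]

# Any rating, high or low, becomes a 1; unrated movies are implicitly 0.
movie_ids = sorted({m for _, m, _ in ratings})
users = {}
for user, movie, _rating in ratings:
    users.setdefault(user, set()).add(movie)

# Dense binary vectors, one per user (0 = not rated, 1 = rated).
binary_rows = {
    u: [1 if m in seen else 0 for m in movie_ids]
    for u, seen in users.items()
}
print(binary_rows)  # {1: [1, 1], 2: [1, 0]}
```

Note that the 1.5-star rating is treated exactly like the 5-star one: only the fact that the user interacted with the movie survives.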
- Amazon’s DSSTNE is available on GitHub. Use Linux if you plan to use a GPU for training.
- Data is needed in NetCDF format. Arrange the data such that every row has a single user followed by the list of rating inputs for that user.
- The first row of the data file is taken as the first input node.
- Amazon provides an AWK script to convert data into this format.
Amazon SageMaker
- A component of Amazon Web Services
- Lets you create notebooks on AWS for large-scale models
- protobuf is the data format used
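Returning to DSSTNE’s data preparation: the per-user row layout described earlier can be generated with a short script. The tab/comma delimiters below are an assumption for illustration; verify the exact format expected by DSSTNE’s NetCDF conversion tooling in its documentation.

```python
# Sketch of the "one user per row, followed by that user's items" layout
# described in the DSSTNE section. Delimiters are an assumption; the
# user/item ids below are hypothetical.
user_items = {
    1: [101, 102],  # user_id -> list of rated movie ids
    2: [101],
}

lines = []
for user, items in sorted(user_items.items()):
    # One row per user: the user id, then that user's rated items.
    lines.append(f"{user}\t" + ",".join(str(i) for i in items))

print("\n".join(lines))
```

Running this produces one line per user, which can then be fed into the conversion step (or Amazon’s AWK script) to produce the NetCDF input DSSTNE expects.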