Transformer-based Recommendation System
Item-item collaborative filtering using a BERT-based recommendation model
Encoder-based self-attention Transformers such as BERT are very good at predicting a masked token in a sequence. Unlike decoder models, which generate the next token left to right, encoder models attend to the tokens on both sides of a position and learn how much importance to give each surrounding token/character when filling in the masked one.
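To make the attention idea concrete, here is a minimal, pure-Python sketch of scaled dot-product self-attention (the core operation inside these models). The vectors and dimensions are toy values chosen for illustration, not part of any real BERT configuration: each position's output is a weighted average of all value vectors, with the weights expressing how much that position "attends to" every other token.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(queries, keys, values):
    """Scaled dot-product self-attention over a short sequence.

    For each query position, compute similarity scores against every
    key, normalize them with softmax, and return the weighted average
    of the value vectors. The weights are the 'importance' the model
    assigns to each surrounding token.
    """
    d = len(queries[0])  # embedding dimension, used for scaling
    outputs = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        outputs.append([
            sum(w * v[j] for w, v in zip(weights, values))
            for j in range(len(values[0]))
        ])
    return outputs

# Toy 3-token sequence with 2-dimensional embeddings (illustrative only).
x = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
out = self_attention(x, x, x)
```

Because every position attends to every other position (not just the ones before it), the same mechanism lets an encoder model fill in a masked item in the middle of a user's interaction sequence, which is the idea behind BERT-style recommendation.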