Paper Reading #5: Unbounded cache model for online language modelling with open vocabulary (Jan 2, 2018)

Paper Reading #4: Controllable Invariance through Adversarial Feature Learning (Jan 2, 2018)
Paper Reading #3: Poincaré Embeddings for Learning Hierarchical Representations (Jan 1, 2018)
Why this paper: I believe this paper opens new directions of research in deep learning applied to language modelling. Language does not have…
Paper Reading #2: Plan, Attend, Generate: Planning for Sequence-to-sequence Models by Dutil et. al. (Jan 1, 2018)
Paper Reading #1: Assessing the Ability of LSTMs to Learn Syntax-Sensitive Dependencies by Linzen… (Jan 1, 2018)
Why this paper: Recurrent Neural Networks (RNNs) and their family of neural networks are known to be very good at many language modelling tasks…