Published in Computronium Blog

Language Generation with Recurrent Models

LSTM, Sampling, Smart Code Completion Tool

How Do You Generate Sequence Data?

The general approach is to train a machine learning model and then ask it to predict the next token, whether those tokens are characters, words, or n-grams. A model with this predictive capability is called a Language Model. The model is basically learning…
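To make the idea concrete, here is a minimal sketch of a character-level LSTM language model with temperature-based sampling, written in Keras. The toy corpus, layer size, sequence length, and temperature are all placeholder assumptions for illustration, not the exact setup used later in the article.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical toy corpus; in practice you would load a real text file.
text = "the quick brown fox jumps over the lazy dog " * 200

# Build a character vocabulary and index mappings.
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}

# Cut the text into overlapping sequences of `maxlen` characters,
# each paired with the character that follows it (the "next token").
maxlen, step = 40, 3
sequences, next_chars = [], []
for i in range(0, len(text) - maxlen, step):
    sequences.append(text[i:i + maxlen])
    next_chars.append(text[i + maxlen])

# One-hot encode inputs and targets.
x = np.zeros((len(sequences), maxlen, len(chars)), dtype=np.float32)
y = np.zeros((len(sequences), len(chars)), dtype=np.float32)
for i, seq in enumerate(sequences):
    for t, c in enumerate(seq):
        x[i, t, char_to_idx[c]] = 1.0
    y[i, char_to_idx[next_chars[i]]] = 1.0

# A single LSTM layer feeding a softmax over the vocabulary:
# the output is a probability distribution over the next character.
model = keras.Sequential([
    keras.Input(shape=(maxlen, len(chars))),
    layers.LSTM(128),
    layers.Dense(len(chars), activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(x, y, batch_size=128, epochs=1)

def sample(preds, temperature=1.0):
    # Reweight the predicted distribution with a temperature,
    # then draw one character index from it.
    preds = np.log(np.asarray(preds, dtype=np.float64) + 1e-8) / temperature
    probs = np.exp(preds) / np.sum(np.exp(preds))
    return np.random.choice(len(probs), p=probs)

# Generate text by repeatedly predicting and sampling the next character.
seed = text[:maxlen]
generated = seed
for _ in range(100):
    x_pred = np.zeros((1, maxlen, len(chars)), dtype=np.float32)
    for t, c in enumerate(seed):
        x_pred[0, t, char_to_idx[c]] = 1.0
    next_idx = sample(model.predict(x_pred, verbose=0)[0], temperature=0.5)
    next_char = chars[next_idx]
    generated += next_char
    seed = seed[1:] + next_char

print(generated)
```

Lower temperatures make the sampling more conservative and repetitive, while higher temperatures produce more surprising but noisier output; the value 0.5 above is just an illustrative middle ground.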

Jake Batsuuri

I write about software && math. Occasionally I design && code. Find my stuff at batsuuri.ca.
