Published in MLearning.ai

Meta AI shares the OPT model with the NLP community

The Open Pre-trained Transformer (OPT) is now available to NLP researchers. The Natural Language Processing community embraces open source.

Introduction

Meta AI has released the Open Pre-trained Transformer (OPT) with 175 billion parameters. It is the largest NLP model made available to NLP researchers.

Meta AI grants access to the largest version of the model via a special request form here.
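While the full 175-billion-parameter checkpoint is gated behind the request form, smaller OPT variants can be tried directly. Below is a minimal sketch of text generation with one such checkpoint, assuming the facebook/opt-125m model is published on the Hugging Face Hub and the transformers library is installed; it is an illustration, not Meta AI's official usage example.

```python
# Minimal sketch: sampling text from a small OPT checkpoint.
# Assumes facebook/opt-125m is available on the Hugging Face Hub;
# the 175B model itself is only obtainable via Meta AI's request form.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "facebook/opt-125m"  # smaller, openly downloadable OPT variant
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Open-sourcing large language models means"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short continuation of the prompt.
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same calls work for the larger open checkpoints (for example, the 1.3B or 6.7B variants) by swapping the model name, given enough memory.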


