Meta AI Open-Sources a 175B Parameter Language Model: GPT-3 Comparable Performance at One-Seventh the Compute Cost

Today’s state-of-the-art large language models (LLMs) can have more than 100 billion parameters, a number that continues to rise, and have achieved astounding performance on complex natural language processing (NLP) tasks such as article writing, math problem solving, question answering and…






AI Technology & Industry Review | Twitter: @Synced_Global
