OpenAI launches Codex, an API for translating natural language into code
OpenAI recently released Codex, an AI system that translates natural language into code. It understands more than a dozen programming languages and can interpret and carry out commands given in plain English, making it possible to build natural language interfaces for existing applications. Codex also powers GitHub Copilot, which suggests whole lines of code inside development environments.
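To make that concrete, here is a minimal sketch of what a Codex call might look like through the OpenAI Python client. The engine name "davinci-codex" and the prompt are illustrative assumptions, and actual access requires an API key from the Codex beta.

```python
# Minimal sketch of calling Codex via the OpenAI Python client.
# Assumptions: beta access and an API key; the engine name
# "davinci-codex" is illustrative and may differ per account.
import openai

openai.api_key = "YOUR_API_KEY"

response = openai.Completion.create(
    engine="davinci-codex",   # Codex engine (assumed name)
    prompt="# Python 3\n# Return the n-th Fibonacci number\ndef fib(n):",
    max_tokens=64,
    temperature=0,            # deterministic completion
)

print(response.choices[0].text)  # generated code continuation
```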
Now that machines can learn, can they unlearn?
Machine unlearning is a nascent field of research that deals with the question “can we remove all influence of…
DeepMind aims to marry deep learning and classic algorithms
DeepMind has recently published a paper on Neural Algorithmic Reasoning (NAR). NAR posits that if deep learning methods could better mimic algorithms, they might achieve the same sort of generalization that algorithms do.
Read the arXiv paper here
Are we in an AI overhang?
An overhang is when the ability to build transformative AI has existed for quite some time, but no one has built it because no one has realized it’s possible. …
Do neural nets really need to be so big?
For several decades, researchers have shown that “pruning” can reduce the number of parameters in a neural net by as much as 90% without impairing the model’s ability to succeed at the tasks it was trained for. The major shortcoming of this technique was that pruning could only be done after the model had been trained.
However, in 2019, MIT researchers showed that parameters pruned after training could have been pruned before or early in training without affecting the network’s capability. …
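As a concrete illustration of the post-training variant of this technique (a general magnitude-pruning sketch, not the MIT procedure itself), PyTorch's built-in pruning utilities can zero out the 90% of a trained layer's weights with the smallest magnitude:

```python
# A minimal sketch of post-training magnitude pruning with PyTorch's
# built-in utilities. Illustrative of the general technique only.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

layer = nn.Linear(512, 512)  # stand-in for a layer from a trained model

# Zero out the 90% of weights with the smallest absolute value.
prune.l1_unstructured(layer, name="weight", amount=0.9)

# Make the pruning permanent (removes the mask/reparametrization).
prune.remove(layer, "weight")

sparsity = (layer.weight == 0).float().mean().item()
print(f"fraction of zeroed weights: {sparsity:.2%}")  # ~90%
```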
AI21 Labs trains a massive language model to rival OpenAI’s GPT-3
AI21 Labs is planning to release its 178-billion-parameter language model. With 3 billion more parameters than the incumbent, OpenAI’s GPT-3, AI21 Labs hopes to challenge OpenAI’s dominance in the “natural language processing-as-a-service” field.
Building architectures that can handle the world’s data
Learn more about DeepMind’s Perceiver IO, a more general-purpose architecture that can process images, point clouds, audio, and video, as well as any combination of these. …
OpenAI releases Triton, a programming language for AI workload optimization
OpenAI recently announced the release of Triton, an open-source, Python-like language for writing highly efficient GPU code for AI. Triton aims to automate optimizations so that developers can focus on the high-level logic of their code.
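For a sense of what the language looks like, here is a small vector-add kernel in the style of the Triton tutorials. It is a sketch, and exact API details may differ between Triton releases.

```python
# A small Triton kernel in the style of the official tutorials
# (a sketch; API details may vary between Triton releases).
import torch
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    pid = tl.program_id(axis=0)                      # which block this program handles
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements                      # guard the tail of the array
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)

x = torch.rand(4096, device="cuda")
y = torch.rand(4096, device="cuda")
out = torch.empty_like(x)
grid = (triton.cdiv(x.numel(), 1024),)               # one program instance per block
add_kernel[grid](x, y, out, x.numel(), BLOCK_SIZE=1024)
```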
Report: 75% of developers say they’re responsible for data quality
Nearly 75% of developers say they are responsible for managing the quality of the data used in their applications — up from 50% in the same survey last year.
These are the startups applying AI to tackle climate change
Climate insurance, precision…
DeepMind says it will release the structure of every protein known to science
MIT Tech Review
DeepMind has announced that it will release the structures of over 350,000 proteins: nearly every protein in the human body, plus the shapes of thousands of other proteins found in the most widely studied organisms.
Hubble is back!
After a nail-biting month in safe mode, the Hubble Space Telescope has been given a new lease on life and is back in action to help us understand the mysteries of the universe!
Nvidia releases TensorRT 8 for faster AI inference
Nvidia has released the…
Data scientists are from Mars and software developers are from Venus
Data scientists and software developers operate in deceptively disparate ways, and appreciating these differences is vital to integrating best practices into an organization.
AI is harder than we think: 4 key fallacies in AI research
AI has fallen short of many predictions made over the last few decades — this is likely because we don’t yet truly understand the nature and complexity of human intelligence.
The great chip crisis threatens the promise of Moore’s Law
MIT Tech Review
Over the last year, the promise of Moore’s Law has begun…