The Next Generation Of Artificial Intelligence

SHAIK SAMEERUDDIN
Oct 15 · 9 min read

The field of artificial intelligence is moving quickly. It has been only eight years since the modern era of deep learning began at the 2012 ImageNet competition. Since then, progress in the field has been breathtaking and relentless.

If anything, this breakneck pace is only accelerating. Five years from now, the field of AI will look very different than it does today: methods that are currently considered cutting-edge will have become outdated, and methods that today are nascent or on the fringes will be mainstream.

What will the next generation of artificial intelligence look like? Which novel AI approaches will unlock currently unimaginable possibilities in technology and business? This article highlights three emerging areas within AI that are poised to redefine the field, and society, in the years ahead. Read on.

1. Unsupervised Learning

The dominant paradigm in the world of AI today is supervised learning. In supervised learning, AI models learn from datasets that humans have curated and labeled according to predefined categories. (The term "supervised learning" comes from the fact that human "supervisors" prepare the data in advance.)
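As a minimal sketch of this paradigm (using scikit-learn and a toy dataset, which are illustrative choices rather than anything referenced in this article), a supervised model simply fits inputs to the labels that humans have supplied:

```python
# A minimal supervised-learning sketch: the model only ever learns
# the categories that human "supervisors" encoded in the labels.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)          # y holds the human-curated labels
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)                 # learning is guided by the labels
print("held-out accuracy:", model.score(X_test, y_test))
```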

While supervised learning has driven remarkable progress in AI over the past decade, from autonomous vehicles to voice assistants, it has serious limitations.

The process of manually labeling thousands or millions of data points can be enormously expensive and cumbersome. The fact that humans must label data by hand before machine learning models can ingest it has become a major bottleneck in AI.

At a deeper level, supervised learning represents a narrow and circumscribed form of learning. Rather than being able to explore and absorb all the latent information, relationships, and implications in a given dataset, supervised algorithms orient only to the concepts and categories that researchers have identified in advance.

In contrast, unsupervised learning is an approach to AI in which algorithms learn from data without human-provided labels or guidance.
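The contrast is easy to see in code. In this minimal sketch (again with scikit-learn; the synthetic data and cluster count are illustrative assumptions), the algorithm receives no labels at all and must discover structure on its own:

```python
# A minimal unsupervised-learning sketch: k-means receives raw,
# unlabeled points and discovers groupings without human guidance.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Unlabeled data: two blobs, but the algorithm is never told that.
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(5, 1, (100, 2))])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print("discovered cluster sizes:", np.bincount(kmeans.labels_))
```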

Many AI leaders see unsupervised learning as the next great frontier in artificial intelligence. In the words of AI legend Yann LeCun: "The next AI revolution will not be supervised." UC Berkeley professor Jitendra Malik put it even more colorfully: "Labels are the opium of the machine learning researcher."

Unsupervised learning more closely mirrors the way humans learn about the world: through open-ended exploration and inference, without the "training wheels" of supervised learning. One of its fundamental advantages is that there will always be far more unlabeled data than labeled data in the world (and the former is much easier to come by).

In the words of LeCun, who prefers the closely related term "self-supervised learning": "In self-supervised learning, a portion of the input is used as a supervisory signal to predict the remaining portion of the input.... More knowledge about the structure of the world can be learned through self-supervised learning than from [other AI paradigms], because the data is unlimited and the amount of feedback provided by each example is huge."
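A toy sketch of that idea (an illustrative example, assuming nothing about LeCun's actual systems): mask one portion of each input and train a model to predict it from the remaining portion, so the data supplies its own supervisory signal.

```python
# Self-supervised sketch: no human labels anywhere. One column of the
# input is masked out and used as the supervisory signal for the rest.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X_full = rng.normal(size=(500, 4))
X_full[:, 3] = X_full[:, 0] + 0.5 * X_full[:, 1]   # hidden structure in the data

visible, masked = X_full[:, :3], X_full[:, 3]       # input vs. self-derived "label"
model = LinearRegression().fit(visible, masked)
print("R^2 predicting the masked portion:", model.score(visible, masked))
```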

Unsupervised learning has already had a transformative impact on natural language processing. NLP has seen incredible progress recently thanks to a new unsupervised learning architecture known as the Transformer, which originated at Google about three years ago. (See #3 below for more on Transformers.)

Efforts to extend unsupervised learning to other areas of AI remain at an earlier stage, but rapid progress is being made. To take one example, a startup named Helm.ai is seeking to use unsupervised learning to leapfrog the leaders in the autonomous vehicle industry.

Many researchers see unsupervised learning as the key to achieving human-level AI. According to LeCun, "the biggest challenge of the next few years in ML and AI is mastering unsupervised learning."

2. Federated Learning

Data privacy is one of the defining problems of our era. Because data is the lifeblood of modern artificial intelligence, data privacy issues play a significant (and often limiting) role in AI's trajectory.

Privacy-preserving artificial intelligence (methods that enable AI models to learn from datasets without compromising their privacy) is thus becoming an increasingly important pursuit. Federated learning is perhaps the most promising path to privacy-preserving AI.

The concept of federated learning was first formulated by researchers at Google in early 2017. Interest in federated learning has exploded over the past year: more than 1,000 research papers on federated learning were published in the first six months of 2020, compared to just 180 in all of 2018.

The standard approach to building machine learning models today is to gather all the training data in one place, often in the cloud, and then to train the model on that data. But this approach is not feasible for much of the world's data, which for privacy and security reasons cannot be moved to a central data repository. That data remains off-limits to traditional AI techniques.

Federated learning solves this problem by flipping the conventional approach to AI on its head.

Instead of requiring one unified dataset to train a model, federated learning leaves the data where it is, distributed across numerous devices and servers on the edge. Copies of the model are sent out, one to each device with training data, and trained locally on each subset of data. The resulting model updates, but never the raw data itself, are then sent back and merged into a single global model.
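A stripped-down sketch of that training loop, in plain NumPy with made-up client data (real federated systems add client sampling, secure aggregation, encryption, and much more):

```python
# Federated averaging sketch: each client trains on its own private
# data; only model weights, never the raw data, leave the device.
import numpy as np

rng = np.random.default_rng(0)
# Four "devices", each holding a private dataset that never moves.
clients = [(rng.normal(size=(50, 3)), rng.normal(size=50)) for _ in range(4)]

w_global = np.zeros(3)
for _ in range(20):                        # communication rounds
    local_weights = []
    for X, y in clients:                   # this loop happens on each device
        w = w_global.copy()
        for _ in range(5):                 # a few local gradient steps
            grad = 2 * X.T @ (X @ w - y) / len(y)
            w -= 0.05 * grad
        local_weights.append(w)
    w_global = np.mean(local_weights, axis=0)   # server aggregates weights only
print("global model weights:", w_global)
```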

The original use case for federated learning was training AI models on personal data distributed across billions of mobile devices. As those researchers summarized: "Modern mobile devices have access to a wealth of data suitable for machine learning models.... However, this rich data is often privacy sensitive, large in quantity, or both, which may preclude logging to the data center.... We advocate an alternative that leaves the training data distributed on the mobile devices, and learns a shared model by aggregating locally-computed updates."

More recently, healthcare has emerged as a particularly promising field for the application of federated learning.

It is not hard to see why. On one hand, there are an enormous number of valuable AI use cases in healthcare. On the other hand, healthcare data, especially patients' personally identifiable information, is extremely sensitive; a thicket of regulations like HIPAA restricts its use and movement. Federated learning could enable researchers to develop life-saving healthcare AI tools without ever moving sensitive health records from their source or exposing them to privacy breaches.

A host of startups has emerged to pursue federated learning in healthcare. The most established is Paris-based Owkin; earlier-stage players include Lynx.MD, Ferrum Health, and Secure AI Labs.

Beyond healthcare, federated learning may one day play a central role in the development of any AI application that involves sensitive data: from financial services to autonomous vehicles, from government use cases to consumer products of all kinds. Paired with other privacy-preserving techniques like differential privacy and homomorphic encryption, federated learning may provide the key to unlocking AI's vast potential while mitigating the thorny challenge of data privacy.
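As a rough illustration of how these strategies can compose (a toy example with arbitrary parameters, not a production differential-privacy mechanism), a server can clip each client's update and add calibrated noise before averaging, so that no single participant's data dominates the result:

```python
# Toy illustration of differentially-private aggregation: clip each
# client update, then add Gaussian noise to the average. The values
# here are arbitrary; real DP requires careful privacy accounting.
import numpy as np

def private_average(updates, clip_norm=1.0, noise_std=0.1, seed=0):
    rng = np.random.default_rng(seed)
    clipped = [u * min(1.0, clip_norm / (np.linalg.norm(u) + 1e-12))
               for u in updates]
    return np.mean(clipped, axis=0) + rng.normal(0, noise_std, updates[0].shape)

updates = [np.array([0.9, -0.2]), np.array([1.1, 0.1]), np.array([5.0, 5.0])]
print(private_average(updates))   # the outlier update is bounded by clipping
```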

The wave of data privacy legislation being enacted worldwide today (starting with GDPR and CCPA, with many similar laws coming soon) will only accelerate the need for these privacy-preserving techniques. Expect federated learning to become an important part of the AI technology stack in the years ahead.

3. Transformers

We have entered a golden era for natural language processing.

OpenAI's release of GPT-3, the most powerful language model ever built, captivated the technology world this summer. It has set a new standard in NLP: it can write impressive poetry, generate working code, compose thoughtful business memos, write articles about itself, and so much more.

GPT-3 is just the latest (and largest) in a line of similarly architected NLP models, including Google's BERT, OpenAI's GPT-2, Facebook's RoBERTa, and others, that are redefining what is possible in NLP.

The Transformer is the core technological innovation underlying this revolution in language AI.

Transformers were introduced in a landmark 2017 research paper. Previously, state-of-the-art NLP methods had all been based on recurrent neural networks (e.g., LSTMs). By definition, recurrent neural networks process data sequentially, that is, one word at a time, in the order in which the words appear.

Transformers' great innovation is to make language processing parallelized: all the tokens in a given body of text are analyzed at the same time rather than in sequence. To support this parallelization, Transformers rely heavily on an AI mechanism known as attention. Attention enables a model to consider the relationships between words regardless of how far apart they are and to determine which words and phrases in a passage are most important to "pay attention to."
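For the technically inclined, here is a compact NumPy sketch of the scaled dot-product attention operation introduced in that 2017 paper, applied to random toy inputs:

```python
# Scaled dot-product attention: every token attends to every other
# token simultaneously, with no sequential word-by-word processing.
import numpy as np

def attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # similarity of each pair of tokens
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the sequence
    return weights @ V                        # weighted mix of all tokens at once

rng = np.random.default_rng(0)
seq_len, d_model = 6, 8                       # six tokens, processed in parallel
Q = rng.normal(size=(seq_len, d_model))
K = rng.normal(size=(seq_len, d_model))
V = rng.normal(size=(seq_len, d_model))
print(attention(Q, K, V).shape)               # (6, 8): one output per token
```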

Why is parallelization so valuable? Because it makes Transformers vastly more computationally efficient than RNNs, meaning they can be trained on much larger datasets. GPT-3 was trained on roughly 500 billion words and consists of 175 billion parameters, dwarfing any RNN in existence.

Thanks to the success of models like GPT-3, Transformers have been associated almost exclusively with NLP to date. But just this month, a groundbreaking new paper was released that successfully applies Transformers to computer vision. Many AI researchers believe this work could presage a new era in computer vision. (As well-known ML researcher Oriol Vinyals put it simply: "My take is: goodbye convolutions.")

While leading AI companies like Google and Facebook have begun to put Transformer-based models into production, most organizations remain in the early stages of productizing and commercializing this technology. OpenAI has announced plans to make GPT-3 commercially accessible via an API, which could seed an entire ecosystem of startups building applications on top of it.
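For illustration, calling GPT-3 at the time of writing looks roughly like the following, assuming the beta `openai` Python client and an API key granted through OpenAI's access program:

```python
# Sketch of calling GPT-3 through OpenAI's beta API (circa 2020).
# Assumes the `openai` Python package and an approved API key.
import openai

openai.api_key = "YOUR_API_KEY"              # granted via OpenAI's waitlist

response = openai.Completion.create(
    engine="davinci",                        # the largest GPT-3 model
    prompt="Write a one-sentence business memo about Q3 results:",
    max_tokens=50,
)
print(response.choices[0].text.strip())
```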

Expect Transformers to serve as the foundation for a whole new generation of AI capabilities in the years ahead, starting with natural language.

My advice to you, as you pursue a career in data science, is to stay open-minded and think outside the box. It will give you a competitive edge in your data science career.

Bio: Shaik Sameeruddin. I help businesses drive growth using analytics and data science | Public speaker | Uplifting students in tech and personal growth | Pursuing a B.Tech (3rd year) in Computer Science and Engineering, specializing in Data Analytics, at Vellore Institute of Technology (VIT).
