2020 in Review With Viral B. Shah

Synced · Published in SyncedReview · 5 min read · Dec 28, 2020

In 2020, Synced covered many memorable moments in the AI community: the situation of women in AI, the birth of GPT-3, AI's fight against COVID-19, heated debates around AI bias, MT-DNN surpassing human baselines on GLUE, AlphaFold cracking a 50-year-old biology challenge, and more. To close the chapter on 2020 and look forward to 2021, we are introducing a year-end special issue following Synced's tradition of looking back at current AI achievements and exploring possible future trends with leading AI experts. Here, we invite Dr. Viral B. Shah to share his insights on the current development and future trends of artificial intelligence.

Meet Dr. Viral B. Shah

Dr. Viral B. Shah is one of the creators of the Julia language and co-founder and CEO of Julia Computing. Julia combines the ease of use of Python with the speed of C. It has been downloaded over 10 million times and is now taught at MIT, Stanford, and many other universities worldwide. The Julia co-creators were recently awarded the prestigious James H. Wilkinson Prize for Numerical Software. Viral holds a PhD in Computer Science from the University of California, Santa Barbara.

Julia Computing was founded in 2015 by all of the creators of Julia. Its flagship product, JuliaHub, helps developers effortlessly use Julia at scale in the enterprise. Julia Computing also provides support, consulting, and training to enterprises worldwide. In partnership with Pumas-AI, Julia Computing has brought Pumas.jl to pharmaceutical companies, and it is collaborating with industry and university partners to bring innovative new AI products to market on the JuliaHub platform.

The Best AI Technology Developed in the Past 3 to 5 Years: “Scientific Machine Learning”

The progress made in the last few years is truly amazing. Tasks such as image recognition and speech understanding that would easily have foiled a computer in a Turing test as recently as five years ago are now routinely accomplished. While self-driving cars using computer vision have yet to become commonplace, everyone is using the sentence completion and grammar correction features in Google Docs. Equally impressive has been the pace of innovation in hardware to support such workloads.

Personally, what I have found most interesting is the application of AI in scientific fields, what we often refer to as scientific machine learning. Physics-Informed Neural Networks provide a nice framework for training neural networks on scientific models. Neural network architectures have been devised for candidate screening in drug development, for designing new materials, and for a variety of other interesting scientific problems. While applications in vision, language, and speech have become commonplace, we are barely scratching the surface in scientific machine learning.
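
To make the physics-informed idea concrete, here is a minimal Julia sketch using Flux and a toy problem assumed purely for illustration (the interview does not specify one): a small network is trained so that its output u(x) satisfies the ODE u′(x) = −u(x) with u(0) = 1, whose exact solution is exp(−x).

```julia
using Flux

# Toy physics-informed training sketch (assumed example, not from the article):
# fit u(x) so that u'(x) = -u(x) with u(0) = 1; the exact solution is exp(-x).
model = Chain(Dense(1 => 16, tanh), Dense(16 => 1))
xs = collect(range(0.0f0, 1.0f0; length=20))

# Physics loss: ODE residual at collocation points plus the initial condition.
# u'(x) is approximated by central finite differences to keep the sketch short;
# a full PINN would instead differentiate the network with respect to x.
function physics_loss(m)
    u(x) = m([x])[1]
    du(x) = (u(x + 1.0f-3) - u(x - 1.0f-3)) / 2.0f-3
    sum(x -> abs2(du(x) + u(x)), xs) + abs2(u(0.0f0) - 1.0f0)
end

opt = Flux.setup(Adam(0.01), model)
for _ in 1:2000
    grads = Flux.gradient(physics_loss, model)
    Flux.update!(opt, model, grads[1])
end
```

Note that the network is trained against the governing equation rather than labeled data, which is what lets such models be useful when observations are sparse.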

The Most Promising AI Technology in the Next 1 to 3 Years: “Differentiable Programming”

Differentiable programming is likely to be the most promising AI technology of the next few years. The basic idea is to take the core components of deep learning algorithms (automatic differentiation, optimization, and GPU execution) and make them available to a broader class of models: models whose layers may be dense or sparse, models with control flow and recursion, models where a layer can be a differential equation, and so on. After all, models are just programs.
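
As a small illustration of "models are just programs," the sketch below uses Zygote, Julia's reverse-mode automatic differentiation package, to differentiate an ordinary function containing a loop and a branch; the function itself is a made-up toy.

```julia
using Zygote  # reverse-mode automatic differentiation for ordinary Julia code

# An ordinary Julia function with control flow -- not a special "layer" type.
# (Toy function, chosen only to show that loops and branches are fine.)
function damped_sum(x, n)
    total = zero(x)
    for i in 1:n
        total += x > 0 ? x^2 / i : -x / i
    end
    return total
end

# Zygote differentiates the program itself, loop, branch, and all.
dfdx = Zygote.gradient(x -> damped_sum(x, 5), 2.0)[1]
```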

Many programming language designers are adding capabilities for differentiable programming to their languages. While Julia was one of the first languages to embark on this path, other languages such as Swift, Kotlin, and F# are also adding such capabilities at the language level.

Especially in the context of Julia, differentiable programming capabilities are enabling new science. Approaches such as Physics-Informed Neural Networks, Universal Differential Equations, Echo State Networks, and much more are available through the Julia SciML organization and work effortlessly within the Julia ecosystem. These approaches make it possible to automatically infer scientific knowledge from sparse datasets, significantly improve simulation speed, bring new hardware into scientific computing workflows, and push the boundaries of computational discovery.
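
For a feel of the building blocks, the snippet below solves a simple ODE with DifferentialEquations.jl, the foundation of the SciML stack; the dynamics are a toy assumed for illustration. In a Universal Differential Equation, part of the right-hand side f would be a neural network trained against data, whereas here it is fully known.

```julia
using DifferentialEquations

# Toy linear ODE: du/dt = 1.01u on t in [0, 1] with u(0) = 0.5.
# In a Universal Differential Equation, f would mix known physics with a
# neural network term learned from data.
f(u, p, t) = 1.01 * u
prob = ODEProblem(f, 0.5, (0.0, 1.0))
sol = solve(prob, Tsit5())   # Tsit5 is a standard non-stiff Runge-Kutta solver
```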

Differentiable programming is enabling scientific applications as diverse as climate science, battery modeling, personalized medicine, HVAC system design, geothermal energy production, materials design and much more.

The Biggest Challenge in the Field of AI: “Hardware”

Hardware is a challenge as model complexity keeps increasing. Models such as GPT-3 consume huge amounts of power even on the best hardware available today. New hardware that delivers an order-of-magnitude improvement in performance per watt and performance per dollar is essential to democratize the field.

At the same time, software developers need to pay attention to building composable abstractions for AI algorithms. Differentiable programming is one clear idea that language developers are incorporating. Programming hardware such as GPUs and TPUs continues to be difficult for the average programmer. A majority of the usage comes through a few libraries, leaving much of the potential of new hardware untapped due to inadequacies in programming environments.
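
As one illustration of such a composable abstraction, the sketch below (assuming an NVIDIA GPU and the CUDA.jl package) uses CUDA.jl's high-level arrays: ordinary broadcast syntax compiles to fused GPU kernels, with no hand-written kernel code.

```julia
using CUDA  # assumes an NVIDIA GPU with the CUDA.jl package installed

# Ordinary broadcast syntax compiles to fused GPU kernels, so the "average
# programmer" never has to write CUDA C by hand.
x = CUDA.rand(Float32, 10_000)   # array allocated on the GPU
y = 2f0 .* x .+ 1f0              # one fused kernel for the whole expression
total = sum(y)                   # reduction also executes on the GPU
```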

The Latest Noteworthy Development: “NeuralSim”

Advancing science through computing is a topic the Julia community is passionate about. Specifically, we at Julia Computing are deeply engaged in the field of scientific machine learning and have made some significant advances recently.

Julia’s differentiable programming capabilities are getting quite an overhaul, which will be available to users in 2021. In addition, we have developed new methods that can successfully train surrogates over the systems of nonlinear stiff differential equations that are common in scientific simulations. We will be packaging these advances in a software tool, NeuralSim, and bringing it to market in 2021. NeuralSim will make it possible for scientists and engineers to dramatically simplify their design workflows by leveraging computational techniques (enabled through cloud computing), accelerating the pace of innovation and time to market.

Synced Report | A Survey of China’s Artificial Intelligence Solutions in Response to the COVID-19 Pandemic — 87 Case Studies from 700+ AI Vendors

This report offers a look at how China has leveraged artificial intelligence technologies in the battle against COVID-19. It is also available on Amazon Kindle. Along with this report, we also introduced a database covering an additional 1,428 artificial intelligence solutions across 12 pandemic scenarios.

Click here to find more reports from us.

We know you don’t want to miss any news or research breakthroughs. Subscribe to our popular newsletter Synced Global AI Weekly to get weekly AI updates.


AI Technology & Industry Review — syncedreview.com | Newsletter: http://bit.ly/2IYL6Y2 | Share My Research http://bit.ly/2TrUPMI | Twitter: @Synced_Global