The Future of AI: How Can Model Developers Keep Up?

Smarter.ai
Dec 21, 2021 · 11 min read

Dynamic, exciting, and constantly evolving, the artificial intelligence sector continues to be one of the most fascinating in the world of tech.

With 83% of businesses noting that AI is something they want to prioritize in the coming years, it’s unsurprising that demand for data scientists is currently through the roof.[1] Indeed, Google alone requires 30,000 people to work on its machine learning models, showcasing just how much the job market is shifting in favor of programmers.[2]

Given all this, creators may be wondering what the future of machine learning holds for them. The field can feel like a minefield when technological advancements outpace what most people can keep track of.

We’d say that preparedness usually yields the best results for algorithm creators, as it allows them to better assess datasets and make necessary adjustments to models over time.

So, to make your life as an AI developer easier and keep you on the right path, we’ve compiled a list of the most important trends to consider when it comes to the future of machine learning.

Covering everything from reusable solutions to the explosion of computing power, we’ll keep you ahead of the pack and help you reach your full potential.

After all, knowledge is power, right?

The Future of AI

Reusable models/marketplace sharing

There’s nothing wrong with new AI models, but the catchy phrase “reduce, reuse, recycle” needn’t only apply to bottles and cardboard packaging — it can be relevant when it comes to our technology too.

Because of AI’s undeniable carbon footprint, developers, businesses, and governments need to recognize the importance of keeping AI sustainable and ethical. Although we can reduce AI’s carbon footprint in other ways, one of the most efficient fixes is found in reusable models and transfer learning.

If you’re unsure of what transfer learning entails, it involves taking an AI model that was trained for one task and repurposing it for another. This model ‘upcycling’ is particularly popular in the image recognition and natural language processing (NLP) worlds, as pre-trained vision and language models tend to transfer well to related tasks.[3]

Not only is transfer learning highly innovative, but reusing AI solutions to solve common business problems should save businesses and governments precious time and money. And who can say no to that?

Gone are the days of the “isolated learning paradigm”, where deep learning models were built in isolation for a single, specified purpose.[4] Reusing models may once have seemed like a pipe dream, but technology has now reached a point where pre-trained models can solve new problems with little data, little training, and fewer cash injections.[5]

Of course, the problem you’re trying to solve with transfer learning must be related to the original issue a model was trained to handle. However, with a bit more research and legwork, who knows what the future of artificial intelligence could hold?
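
To make the ‘upcycling’ idea concrete, here’s a minimal sketch (assuming PyTorch and torchvision, though any framework with pre-trained models works the same way) that reuses an ImageNet-trained ResNet-18 as a frozen feature extractor and trains only a small new classification head. The number of classes and the dummy batch are placeholders for a real task.

```python
# Minimal transfer-learning sketch (assumes PyTorch + torchvision are installed).
# An ImageNet-pretrained ResNet-18 is reused as a frozen feature extractor,
# and only a small new classification head is trained for the new task.
import torch
import torch.nn as nn
from torchvision import models

NUM_NEW_CLASSES = 5  # hypothetical number of classes in the new task

model = models.resnet18(pretrained=True)  # newer torchvision: weights="IMAGENET1K_V1"

# Freeze the pretrained backbone so its weights are not updated.
for param in model.parameters():
    param.requires_grad = False

# Replace the final layer with a fresh head for the new problem.
model.fc = nn.Linear(model.fc.in_features, NUM_NEW_CLASSES)

# Only the new head's parameters are handed to the optimizer.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch (stand-in for real images).
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_NEW_CLASSES, (8,))
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```

Because only the small head is trained, far less data and compute are needed than training the whole network from scratch, which is the saving the paragraph above is pointing at.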

Training models without data

Although typical machine learning models need to process an almost offensive number of examples to recognize a pattern, humans don’t face this issue as they are naturally adept at visual recognition.

When it comes to data processing, machines may be more technologically advanced than humans, but they’re still unable to call a spade a spade until they’ve seen almost every example under the sun!

There’s nothing wrong with today’s machine learning models per se, but there’s ample room for efficiency gains if machines can be fed less data over time while still achieving the same outcomes.

As a result, the emerging field of zero-shot and one-shot AI is coming into focus and quickly gaining traction as one of the major trends to watch for the future of artificial intelligence.

In basic terms, one-shot learning involves giving a model only a handful of examples of each new class (or, ideally, just one) and relying on a learned “distance function” between images to classify any future categories it may encounter.[6]

If you’re looking for a real-world example of this technique, you may have seen it being used at passport control or on your iPhone’s Face ID system.

Over time, the deployment of one-shot learning should reduce the costs of data collection and the computational costs associated with traditional machine learning by removing a considerable percentage of the total data required for training.[7]
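
As a rough sketch of that “distance function” idea (an illustration, not the method behind Face ID or passport gates), the snippet below embeds a query image and one labeled “support” image per class with a small network, then assigns the query to its nearest neighbor. In practice the embedding network would first be trained on many other classes, for example with a contrastive or triplet loss; here it is untrained and the images are random tensors purely as placeholders.

```python
# One-shot classification sketch: compare a query embedding to one labeled
# example ("support" image) per class and pick the nearest one.
# The embedding network is untrained and the images are random tensors;
# both stand in for a real, contrastively trained model and real data.
import torch
import torch.nn as nn
import torch.nn.functional as F

embed = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 64),
)

support_images = torch.randn(3, 3, 64, 64)   # one example for each of 3 classes
query_image = torch.randn(1, 3, 64, 64)      # the image we want to classify

with torch.no_grad():
    support_emb = F.normalize(embed(support_images), dim=1)  # (3, 64)
    query_emb = F.normalize(embed(query_image), dim=1)       # (1, 64)

# The "distance function": cosine similarity between query and each support example.
similarities = query_emb @ support_emb.T                      # (1, 3)
predicted_class = similarities.argmax(dim=1).item()
print(f"Query assigned to class {predicted_class}")
```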

Less-than-one-shot learning

We’ve already discussed the potential of one-shot AI, but what if developers could do one better by implementing less-than-one-shot learning?

A relatively new concept, less-than-one-shot AI was introduced in a 2020 study from the University of Waterloo in Ontario.

After extensive research, the Waterloo team showed that a model could eventually recognize more classes than the number of examples it was trained on, requiring less than one sample per class. This feat would be accomplished using “soft” labels that distill the data through shared features.[8]

According to researcher Ilia Sucholutsky, quoted in TechTalks, less-than-one-shot learning “[creates] small synthetic datasets that train models to the same accuracy as training them on a full set”.[9] If these datasets can be shrunk even further over time, there should be massive efficiency gains up for grabs for both developers and businesses.

Huzzah!

As this is a fresh concept involving k-NN (the “k-nearest neighbors” algorithm), modelers and creators can get ahead of the game by investing their time in this up-and-coming modeling trend.[10] Although there’s little published research on less-than-one-shot learning at present, as work intensifies, we feel that AI creators should be ready to strike while the iron’s hot.
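
To give a flavor of how soft labels squeeze extra classes out of very few examples (a toy sketch loosely inspired by the soft-label k-NN variant described in the research, with made-up numbers), the example below stores just two prototype points whose labels are probability distributions over three classes, and a distance-weighted nearest-neighbor rule still separates all three.

```python
# Less-than-one-shot sketch: 2 prototype points carry "soft" labels over 3 classes,
# so a distance-weighted nearest-neighbor rule can distinguish more classes than
# there are prototypes. Prototype positions and labels are invented for illustration.
import numpy as np

prototypes = np.array([[0.0], [1.0]])          # 2 prototypes on a 1-D feature axis
soft_labels = np.array([
    [0.6, 0.4, 0.0],                           # mostly class 0, some class 1
    [0.0, 0.4, 0.6],                           # mostly class 2, some class 1
])

def predict(x, eps=1e-9):
    """Distance-weighted soft-label k-NN over all prototypes."""
    dists = np.abs(prototypes[:, 0] - x)
    weights = 1.0 / (dists + eps)              # closer prototypes count more
    scores = weights @ soft_labels             # blend the soft labels
    return int(np.argmax(scores))

for x in [0.0, 0.5, 1.0]:
    print(f"x={x:.1f} -> predicted class {predict(x)}")
# Points near 0 come out as class 0, points near 1 as class 2, and points
# midway between them as class 1 -- three classes from only two stored examples.
```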

There are limits to both one-shot and less-than-one-shot learning models, such as sensitivity to variations in the input that are difficult to resolve. However, the potential pros of these concepts more than outweigh their cons.[11]

Keep up to date with programming languages

If you’re hoping to have a dazzling future in the world of AI, then you’ll need to keep up to date with programming languages.

It looks as though Python is set to remain king of the programming languages, with over 57% of developers stating that they would pick it as their language of choice.[12] This is largely due to its logical structure, readability, and quick iteration rates.

However, if you have a passion for survival models that involve biostatistics, being well-versed in R is bound to prove beneficial.[13] R is quite tricky to learn though, so bear this in mind before attempting to master it in a single night while nursing a double espresso.

If you’re wondering how the relatively old-school C++ plays into the future of AI, it’s still regarded as an excellent choice for deploying models and is one of the finest general-purpose and high-performance languages in the programming sphere.[14] Extremely speedy with low-level memory control, this language remains a leading player in the gaming and desktop application worlds.[15]

Due to the constant ‘scaling-up’ of AI, fully understanding machine learning and being able to apply programming languages effectively should stand you in good stead when it comes to dealing with new algorithms and building upon existing ones without floundering.

Realistically, the more comprehensive your knowledge is, the better your chance of breaking into the world of big tech where models are highly complex and require the finest talent to maintain and deploy.

Google? Microsoft?

They’ll soon be knocking at your door if you keep your skills up.

Improvements to computing power

The explosion of computing power over the years has been impressive, and until 2012, the growth in compute used to train AI models roughly followed Moore’s Law.

For context, this law states that the “number of transistors on a microchip doubles about every two years, though the cost of computers is halved”.[16] However, with transistor scaling now approaching its physical limits, Moore’s Law is unlikely to hold past 2025.[17]

Indeed, AI systems are growing so quickly that the computing power used in the largest training runs has “been increasing exponentially with a 3.4-month doubling time” instead of the law’s predicted two-year timeframe.[18]
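
To put those two growth rates side by side, here’s a quick back-of-the-envelope calculation using only the doubling times quoted above.

```python
# Back-of-the-envelope comparison of the two doubling times quoted above.
months = 24  # a two-year window

moores_law_factor = 2 ** (months / 24)    # doubling every ~24 months
ai_compute_factor = 2 ** (months / 3.4)   # doubling every ~3.4 months

print(f"Moore's Law growth over 2 years: ~{moores_law_factor:.0f}x")
print(f"AI training compute growth over 2 years: ~{ai_compute_factor:.0f}x")
# Roughly 2x versus roughly 130x over the same period.
```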

If next-level chip enhancements weren’t enough to excite you as a budding developer, perhaps the rise of quantum computing will tickle your fancy.

With one of Google’s quantum computers solving a problem in 200 seconds that the company claims would take the fastest traditional supercomputer 10,000 years to figure out, the potential gains in efficiency and productivity for the AI world are staggering.[19]

If you’re wondering how this is possible, it’s because a classical computer processes one data state at a time, whereas a quantum computer’s qubits can sit in a superposition and so represent and work on many states at once.[20] Not only is this impressive from a technological standpoint, but it promises much faster processing speeds for certain workloads.
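
For a toy illustration of what “many states at once” means (just a few lines of linear algebra simulating a two-qubit register, nothing resembling real quantum hardware), the snippet below applies a Hadamard gate to each qubit and prints the amplitudes that a single state vector carries for all four basis states simultaneously.

```python
# Toy statevector illustration of superposition: two Hadamard gates put a
# 2-qubit register into an equal superposition, so one state vector holds
# amplitudes for all four basis states |00>, |01>, |10>, |11> at the same time.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
state = np.array([1, 0, 0, 0], dtype=complex)  # register starts in |00>

state = np.kron(H, H) @ state                  # apply H to each qubit

for basis, amplitude in zip(["00", "01", "10", "11"], state):
    print(f"|{basis}>: amplitude {amplitude.real:+.3f}, probability {abs(amplitude)**2:.2f}")
```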

Appealing? Absolutely.

Quantum computing will likely lead to a productivity race between tech’s biggest names (there are whispers that IBM is already clamoring for a slice of the pie with its promise of a 1000-qubit quantum computer by 2023!),[21] propelling the industry to heights thought impossible a mere decade ago.

Automated Machine Learning

To round things off, let’s discuss one of the most important AI developments in recent years: Automated Machine Learning (AutoML).

Typical machine learning requires a lot of time, resources, and manpower to operate, making it incredibly expensive. In a nutshell, this is what automated machine learning seeks to counter.

By aiming to automate the end-to-end process of building machine learning models, AutoML hopes to remove the significant barriers to entry that exist when data processing and algorithm selection are handled entirely by humans. Not only will this free data scientists to work on more complex tasks, but it should offer more organizations a route into the AI world.
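
To give a feel for the kind of work being automated (a deliberately simple sketch built on scikit-learn’s off-the-shelf search utilities and its bundled iris dataset, nowhere near a full AutoML system like Cloud AutoML), the snippet below searches over candidate models and hyperparameters automatically instead of hand-tuning them.

```python
# Simplified "AutoML-flavored" sketch: automatically search over candidate
# models and hyperparameters instead of hand-tuning them. Plain scikit-learn
# utilities are used purely to illustrate the kind of work being automated.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Candidate model families and hyperparameter grids to explore automatically.
search_spaces = [
    (LogisticRegression(max_iter=1000), {"C": [0.1, 1.0, 10.0]}),
    (RandomForestClassifier(random_state=0), {"n_estimators": [50, 200], "max_depth": [3, None]}),
]

best_score, best_model = -1.0, None
for estimator, grid in search_spaces:
    search = GridSearchCV(estimator, grid, cv=5)   # cross-validated grid search
    search.fit(X_train, y_train)
    if search.best_score_ > best_score:
        best_score, best_model = search.best_score_, search.best_estimator_

print(f"Selected model: {best_model}")
print(f"Held-out accuracy: {best_model.score(X_test, y_test):.3f}")
```

Real AutoML systems extend this idea to data preprocessing, feature engineering, and neural architecture search, but the principle of replacing manual trial-and-error with an automated search is the same.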

If you’re looking for an example of this exciting concept in action, just turn to Google’s Cloud AutoML. Using Neural Architecture Search and just a pinch of transfer learning, this suite of tools set out to advance the AI world by “mak[ing] AI experts even more productive, advanc[ing] new fields in AI, and help[ing] less-skilled engineers build powerful AI systems they previously only dreamed of”.[22]

It succeeded in doing so, creating “fast, scalable, and easy-to-use AI offerings” centered around video and image analysis, speech recognition, and multi-language processing that several businesses use today.[23]

If AutoML is taken one step further and paired with low-code and no-code AI geared towards non-experts, it should allow businesses to solve their issues more cheaply than with custom solutions. Given enough time, this should put accessibility in the tech world within everyone’s reach.

However, with only 37% of companies using AI in the workplace as of 2019 despite the increased availability of low-code and no-code options, there’s still much work to be done when it comes to convincing businesses of its worth.[24]

There will always be teething problems when using any form of AI for the first time, but AutoML allows all businesses to harness excellent models without hiring expensive data scientists to manage them. With so few data scientists available for hire (QuantHub has noted that there was a shortage of 250,000 data professionals in 2020 alone), Automated Machine Learning should help struggling businesses become more efficient without employing specialists.

A few words to leave you with…

As a creator, the best thing you can do for your career is to future-proof yourself.

It may sound rather vague, but you should always aim to do everything you can to stay on top of the future AI trends that may give you a leg up in the industry.

By excelling in several programming languages, understanding the importance of low-code AI, and keeping track of computing developments, you can make your skillset invaluable to modern businesses.

With the future of AI set to be full of automation and democratization, knowing how to repurpose machine learning solutions to save companies time, effort, and manpower is sure to make you one of the most in-demand professionals in the field.

If nothing else, we can take a cue from Dorothy Gale when it comes to the future of machine learning.

With progression being the name of the game, we certainly won’t be in Kansas anymore.

[1] Falon Fatemi, “3 ways artificial intelligence is transforming business operations”, last modified May 29, 2019, Forbes, https://www.forbes.com/sites/falonfatemi/2019/05/29/3-ways-artificial-intelligence-is-transforming-business-operations/?sh=7cdf7b976036

[2] Bruno Jacobson, “Will programmers have a job in the future?”, last modified May 15, 2019, Futures Platform, https://www.futuresplatform.com/blog/will-programmers-have-job-future

[3] Cem Dilmegani, “Transfer Learning in 2021: what it is & how it works”, AI Multiple, last modified July 5, 2020, https://research.aimultiple.com/transfer-learning/

[4] Harley Davidson Regua, “Introducing Transfer Learning as Your Next Engine to Drive Future Innovations”, Data Driven Investor, February 27, 2020, https://medium.datadriveninvestor.com/introducing-transfer-learning-as-your-next-engine-to-drive-future-innovations-5e81a15bb567

[5] Sebastian Ruder, “Transfer Learning — Machine Learning’s Next Frontier”, last modified 21 March 2017, https://ruder.io/transfer-learning/index.html#whytransferlearningnow

[6] Connor Shorten, “One-Shot Learning”, last modified Jan 24, 2019, https://connorshorten300.medium.com/one-shot-learning-70bd78da4120

[7] Cam Dilmegani, “What is Few-Shot Learning (FSL)? Methods & Applications”, last modified November 26, 2021, https://research.aimultiple.com/few-shot-learning/

[8] Karen Hao, “A radical new technique lets AI learn with practically no data”, last modified October 16, 2020, https://www.technologyreview.com/2020/10/16/1010566/ai-machine-learning-with-tiny-data/

[9] Ben Dickson, “Machine learning with one less example”, last modified October 1, 2020, TechTalks, https://bdtechtalks.com/2020/10/01/less-than-one-shot-machine-learning/

[10] Alex Woodie, “Researchers demonstrate less-than-one shot machine learning”, Datanami, last modified October 19, 2020, https://www.datanami.com/2020/10/19/researchers-demonstrate-less-than-one-shot-machine-learning/

[11] Ben Dickson, “What is one-shot learning”, TechTalks, last modified August 12, 2020, https://bdtechtalks.com/2020/08/12/what-is-one-shot-learning/

[12] Claire D. Costa, “Top Programming Languages for AI Engineers in 2021”, last modified March 17, 2020, Towards Data Science, https://towardsdatascience.com/top-programming-languages-for-ai-engineers-in-2020-33a9f16a80b0

[13] Michael Slupski, “What do experts say about the future of machine learning (and Python)?”, last modified unknown, STX Next, https://www.stxnext.com/blog/future-of-machine-learning-and-python-expert-opinions

[14] Kira Belova, “Top AI Programming Languages in 2021”, last modified 18 May, 2021, PixelPlex, https://pixelplex.io/blog/top-ai-programming-languages/

[15] Monomita Chakraborty, “What are the best programming languages for artificial intelligence”, last modified February 14, 2021, Analytics Insight, https://www.analyticsinsight.net/what-are-the-best-programming-languages-for-artificial-intelligence/

[16] Carla Tardi, “Moore’s Law”, Investopedia, last modified February 23, 2021, https://www.investopedia.com/terms/m/mooreslaw.asp

[17] Stephen McBride, “These 3 Computing Technologies Will Beat Moore’s Law”, Forbes, last modified April 23 2019, https://www.forbes.com/sites/stephenmcbride1/2019/04/23/these-3-computing-technologies-will-beat-moores-law/?sh=7daad52137b0

[18] Cem Dilmegani, “Ultimate Guide to the State of AI Technology in 2021”, last modified October 26, 2019, https://research.aimultiple.com/ai-technology/

[19] Jon Porter, “Google wants to build a useful quantum computer by 2029”, last modified May 19, 2021, The Verge, https://www.theverge.com/2021/5/19/22443453/google-quantum-computer-2029-decade-commercial-useful-qubits-quantum-transistor

[20] Charles Riley, “Google claims its quantum computer can do the impossible in 200 seconds”, CNN Business, last modified October 23, 2019, https://edition.cnn.com/2019/10/23/tech/google-quantum-supremacy-scn/index.html

[21] Adrian Cho, “IBM promises 1000-qubit quantum computer — a milestone — by 2023”, Science.org, last modified 15 September 2020, https://www.science.org/content/article/ibm-promises-1000-qubit-quantum-computer-milestone-2023

[22] Jia Lee, “Cloud AutoML: making AI accessible to every business”, last modified January 17, 2018, Google Cloud, https://cloud.google.com/blog/topics/inside-google-cloud/cloud-automl-making-ai-accessible-every-business

[23] Google Cloud, AI and machine learning products, https://cloud.google.com/products/ai

[24] Dmitry Dolgorukov, “How no-code AI is changing business”, last modified July 30, 2021, Forbes, https://www.forbes.com/sites/forbesfinancecouncil/2021/07/30/how-no-code-ai-is-changing-business/?sh=62796af732a8
