So WTF is Artificial Intelligence Anyway?

Jorge Garcia
Published in D of Things
9 min read · Jan 12, 2021

According to Encyclopedia Britannica, artificial intelligence (AI) can be defined as:

“The ability of a digital computer or computer-controlled robot to perform tasks commonly associated with intelligent beings. The term is frequently applied to the project of developing systems endowed with the intellectual processes characteristic of humans, like the ability to reason, discover meaning, generalize, or learn from previous experiences.”

By now, we have all heard about how AI can make it possible for computers, machines and other electronic devices to perform increasingly complex and human-like tasks.

All this sounds almost like magic, with machines performing increasingly complex tasks, from new gaming computers to self-driving cars. In reality, though, most AI technologies rely on a blend of software methods and technologies for collecting, processing, and recognizing patterns within large amounts of data.

So, How Does AI Work?

The development of AI began as an effort to create systems with capabilities similar to human intelligence and, according to tutorialspoint.com, it had two main goals:

  • The Creation of Expert Systems − Systems that exhibit intelligent behavior and can learn, demonstrate, explain, and advise their users.
  • The Implementation of artificial “Human Intelligence” in Machines − Creating systems that understand, think, learn, and behave like humans.

By giving these devices key abilities, such as learning from experience and adjusting to the type of input received, software providers enable them to detect variations in the data and to change and adapt in order to produce insights.

Through its evolution, AI has been continuously incorporating technological contributions from many sciences and disciplines, ranging from mathematics to biology and computer sciences and has evolved in parallel with many other sub-field disciplines or subareas of AI (Figure 1).

Some of these subareas include:

  • Machine learning (ML). Basically, ML uses methods from statistics, neural networks, operations research, and other fields to automate the analytical model-building process, making it possible to find hidden patterns and insights in large data sets. You can check WTF is ML here.
  • Neural networks. A neural network is a specific type of machine learning method built from interconnected units (a network) that iteratively process data by responding to external inputs and relaying data between units. The process requires multiple runs over the data set to find connections and derive meaning from undefined data.
  • Deep learning. A special case of ML that applies neural networks composed of many layers of processing units. It has taken advantage of continuous advances in computing power, as well as new training techniques, to “learn” complex patterns within large data sets. Image and speech recognition are among its most prominent applications. Check WTF is a Neural Network here.
  • Natural language processing (NLP), the technology that gives computers the ability to “understand” and generate human language, written or spoken. Today, NLP includes human-computer interaction, in which devices and humans communicate using normal, everyday language.
  • Computer vision. Relying on some of the technologies mentioned previously, especially pattern recognition and deep learning, computer vision aims to recognize what’s in an image or video. By analyzing and understanding images, computers and devices can capture images or video in real time and interpret with accuracy what they contain.
  • Cognitive computing. A more recent addition to the AI field, cognitive computing also aims to provide information and insights that improve decision-making, while enabling natural interaction between computers, devices, and users. The main objective is to enable machines to mimic human processes and provide insights in a naturally human fashion (language, images, etc.).
Figure 1. Some of many AI sub-fields
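To make the neural-network idea above concrete, here is a minimal sketch in pure Python: a single processing unit whose connection weights are adjusted over multiple runs through a tiny data set (the classic perceptron rule, applied to the logical AND function). The data, learning rate, and epoch count are invented for illustration; real networks have many such units organized in layers.

```python
# A single-unit "network" trained to learn the logical AND function.
# Weights are adjusted iteratively over repeated passes on the data set.

def step(x):
    """Activation: fire (1) if the weighted input reaches the threshold."""
    return 1 if x >= 0 else 0

# Training data: inputs and expected outputs for logical AND
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # connection weights
b = 0.0          # bias
lr = 0.1         # learning rate

# Multiple runs over the data set let the unit find the right connections
for epoch in range(20):
    for (x1, x2), target in data:
        pred = step(w[0] * x1 + w[1] * x2 + b)
        err = target - pred
        w[0] += lr * err * x1       # nudge each weight toward the target
        w[1] += lr * err * x2
        b += lr * err

print([step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in data])  # [0, 0, 0, 1]
```

After a handful of passes the unit classifies all four inputs correctly; the “knowledge” lives entirely in the learned weights, not in hand-written rules.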

In essence, AI consists of developing algorithms and models that can ingest large amounts of data and, through an iterative process, progressively learn and adapt to improve the resulting information.

With each iteration AI learns and acquires a new “skill” which enables it to improve the way it performs a classification or a prediction.

Today AI and many of its sub-fields are especially suited to problems that require working with data that:

  • Comes in large amounts
  • Is not structured, well organized, or well formatted
  • Changes constantly

As AI finds structure and regularities in the data, the algorithm keeps improving and acquiring skill; it keeps executing until it can accurately classify or predict.

A key aspect of AI models is they adapt when given new data, which allows the model to adjust through training.
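The iterative learning and adaptation described above can be sketched in a few lines of pure Python. This toy example (invented numbers, not a real workload) fits a one-weight model to data following y = 2x by repeatedly nudging the weight to reduce prediction error, then adapts the same model when new data with a different trend arrives.

```python
# Toy illustration of iterative learning: fit a linear trend by gradient
# descent, then adapt the trained model to newly arriving data.

def train(w, points, lr=0.05, steps=200):
    """Each iteration nudges the weight to reduce prediction error."""
    for _ in range(steps):
        for x, y in points:
            pred = w * x
            w += lr * (y - pred) * x   # adjust toward the observed data
    return w

w = 0.0
w = train(w, [(1, 2), (2, 4), (3, 6)])   # learn the pattern y = 2x
print(round(w, 2))                        # close to 2.0

# New data with a different trend arrives; the model adjusts through training
w = train(w, [(1, 3), (2, 6), (3, 9)])   # pattern shifts to y = 3x
print(round(w, 2))                        # close to 3.0
```

The same training loop handles both the initial learning and the later adaptation; only the data changes, which is exactly the property the paragraph above describes.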

Traditional and AI-based programs differ in important ways: while traditional programs are coded with a set of precise instructions and rules to answer specific questions, AI programs are flexible enough to answer generic questions.

According to Dr. Rajiv Desai, there are important differences between traditional and AI-based software solutions, including processing, the nature of the data input, and structure, among others (Figure 2):

Figure 2. Conventional programming vs AI programming (Credit: Dr. Rajiv Desai, An Educational Blog)

In conventional coding, the code guides the process; in AI, the key value is the data rather than the algorithm. Conventional software programs are given data and told how to solve a problem, while AI programs exploit inference capabilities to gain knowledge about a specific domain.

The following table (Figure 3), also provided by Dr. Rajiv Desai, illustrates the main differences between programming with and without AI.

Figure 3. Programming with and without AI (Credit: Dr. Rajiv Desai, An Educational Blog)
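A hypothetical side-by-side sketch makes the contrast concrete. In the conventional version, the rule is written by hand; in the "AI-style" version, the rule is derived from labeled example data. All messages and labels below are invented for illustration; a real system would use a proper ML model rather than this keyword heuristic.

```python
# Conventional programming: the rule is hard-coded by the programmer.
def is_spam_rules(message):
    return "free money" in message.lower()

# "AI-style" programming: the rule is learned from labeled example data.
def learn_keywords(examples):
    """Collect words seen only in spam examples (a crude learned rule)."""
    spam_words = set()
    for text, is_spam in examples:
        if is_spam:
            spam_words.update(text.lower().split())
    for text, is_spam in examples:
        if not is_spam:
            spam_words -= set(text.lower().split())
    return spam_words

examples = [
    ("claim your prize now", True),
    ("meeting moved to noon", False),
    ("prize waiting for you", True),
    ("see you at the meeting", False),
]
spam_words = learn_keywords(examples)

def is_spam_learned(message, words=spam_words):
    return any(w in words for w in message.lower().split())

print(is_spam_learned("you won a prize"))   # True: "prize" was learned
print(is_spam_learned("see you at noon"))   # False
```

Changing the behavior of the first function requires editing code; changing the behavior of the second requires only new example data, which is the distinction Figure 3 captures.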

Due to its modular nature, AI is, in many cases, incorporated within existing applications rather than sold as an individual solution, although a new generation of programming platforms exist that enable users and organizations to develop AI-based applications.

Good, But Still: What’s With All the Recent Hype Around AI?

While we can date AI’s initial development back to the 1940s, roughly in parallel with the evolution of computer systems themselves, it’s only in recent years that AI has become almost omnipresent in nearly every type of software system available. Why?

Today, traditional computer programs can perform simple and increasingly complex tasks and data analysis, especially thanks to advances in processing speed, memory, and storage. Yet new business models keep increasing the demand for systems that can provide better insights and even act or decide on them, as is the case with technologies such as mobility, cloud computing, and the internet of things.

All of this is driving the need for systems capable of analyzing, predicting, and autonomously improving: features that traditional systems lack.

So, aside from the new AI-based applications that keep emerging, AI’s modular nature means its methods and technologies can embed “intelligence” into existing software applications. A myriad of computers and devices already on the market are being improved with new AI capabilities, which is why more and more applications are being infused with this pervasive technology.

Today, a myriad of services, ranging from conversational platforms to bots and smart machines, are being applied to ever more software and products to improve their services at home, in the workplace, and even on the streets.

From Siri, added as a feature to all Apple products, to the new autonomous database services offered by Oracle’s DWH Automation service, many products are now poised to be infused with advanced AI capabilities.

How About the Potential Applications of AI?

As mentioned, software applications in all industries and business areas keep incorporating small and large pieces of AI functionality within their domains. A good sample of the many current uses of AI includes:

  • Cybersecurity. A growing number of organizations incorporate AI and ML algorithms to, for example, detect malware. ML algorithms and models can predict with increasing accuracy which files carry malware by looking for patterns within the file, or in how the data was accessed, that can signal its presence.
  • Fraud Detection. As AI and ML algorithms improve and become more efficient, so do the solutions that detect potential fraud. New systems for this purpose incorporate AI to spot and predict potential cases of fraud across diverse fields, including banking and online purchasing sites. Organizations use AI to continuously improve their fraud-spotting mechanisms, comparing millions of transactions and distinguishing between legitimate and fraudulent ones.
  • Health Care. New AI applications can now provide personalized medicine and X-ray readings by analyzing images, while AI-based personal health care assistants can remind you to take your pills, exercise, or eat healthier, based on analysis of your personal health data.
  • Manufacturing. As data is streamed from connected equipment, AI-based software can analyze manufacturing equipment’s data and forecast expected load and demand, or predict its maintenance cycle, using specific types of deep learning networks that work with sequence data.
  • Retail. AI can now provide retailers with virtual shopping capabilities and offer personalized services and recommendations for users, while also enabling efficient stock management and site layout via the improved analysis and insight AI provides.
  • Sports. New AI-based solutions in sports can capture images of game plays and provide coaches with reports that help them improve game tactics and strategy.
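As a toy illustration of the fraud-detection idea above, a simple statistical outlier check can stand in for the ML models a real system would use: flag a transaction whose amount falls far outside the pattern of past transactions. The amounts and the threshold below are invented for illustration.

```python
# Toy fraud-detection sketch: flag transactions whose amount deviates far
# from the historical pattern (a simple stand-in for a trained ML model).
from statistics import mean, stdev

history = [20, 35, 18, 50, 42, 27, 33, 45, 22, 38]  # past transaction amounts

def looks_fraudulent(amount, past=history, z_threshold=3.0):
    """Flag amounts more than z_threshold standard deviations from the mean."""
    mu, sigma = mean(past), stdev(past)
    return abs(amount - mu) / sigma > z_threshold

print(looks_fraudulent(30))    # False: close to typical amounts
print(looks_fraudulent(900))   # True: far outside the historical pattern
```

A production system would learn far richer patterns (merchant, location, timing) from millions of labeled transactions, but the principle is the same: the "rule" comes from the data, not from hand-written thresholds per customer.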

As we can see from the samples above, there are several cases where AI can be effectively applied for process improvement, efficient analysis, and better decision making.

How About the Software Available and its Adoption in an Organization?

Although AI may sound like a complicated and, worst of all, expensive technology to adopt, it has now become accessible to almost any type of organization.

AI is now embedded in so many software solutions that organizations of all sizes can adopt it in some form, and for a great many business uses; it wouldn’t even be surprising if you are already using an AI-enabled solution without being aware of it.

So, where should we start using AI within our organization? This will depend on the organization’s budget, the complexity of its use case(s), and its existing AI expertise, which together define what type of AI, and consequently what type of provider and vendor, we should pick.

A good starting point is to look at the categories of companies offering AI solutions, to understand in general the varied types of AI companies and how they could help us adopt some form of AI within our organization.
In her blog post The 3 major categories of AI companies, Catherine Lu makes an interesting classification of AI companies, dividing them into three main categories:

“Data science consulting firms are defined by their low level of productization. Their main advantage is that it’s easier for them to deliver great results, as AI models require customization and are highly dependent on customer data. Their disadvantage is that they cannot scale quickly. For companies that are expected to be high growth, they will need to figure out how to move out of this category.”

“AI platform companies offer to be the underlying infrastructure on top of which specific AI solutions live. They can allow end users to import data, perform data wrangling and transformations, train models, and perform model validation.”

This includes platforms like H2O.ai and Databricks.

“Vertical AI companies solve a particular business problem or set of problems with a productized solution. They enable their enterprise customers to achieve additional lift from AI without needing to build or maintain models in-house. Examples on this end are more numerous.”

This includes companies like DigitalGenius (customer support), Entelo (recruiting), Cylance (cybersecurity), or DataVisor (fraud detection).

On a brief note: while Ms. Lu emphasizes her belief that vertical AI companies will be the ones to succeed, thanks to their ability to provide productized solutions that scale, the recent evolution of low-code technologies makes me think a bit differently. These technologies enable a larger number of organizations to, instead of adopting vertical solutions, acquire AI development platforms with lower learning curves, and consequently to produce custom solutions with less effort and more tailored capabilities.

Examples? Some include IBM (Watson), Amazon and Microsoft.

So… What’s in it for Me and my Organization?

Well, in short, AI offers effective ways to achieve improvement on several fronts, including business operations, analytics efficiency, and decision making.

In a wider view, the benefits of AI adoption come in different forms. Properly deployed, AI solutions can allow organizations to streamline and improve operations via automation and adaptation, while also improving analysis processes to increase accuracy and the chances of successful decisions.

Whether your organization decides to go easy and adopt a proven vertical AI solution, or to jump directly into developing AI solutions in-house, as more and more software providers keep infusing AI into their offerings it is only natural to expect that AI will keep evolving and, as it does, improving the way many software solutions work.

So, while science fiction novels and movies portray AI as machines and robots that can and will eventually rule the world, in reality, AI so far is more about enhancing than replacing what humans can do. Or is it?

Originally published at http://www.dofthings.com.
