Elixir — The other side of (Symbolic) AI

Lorenzo Sinisi
Published in elixir-bytes
Jul 15, 2023

In today’s tech landscape, we find ourselves inundated with machine learning buzzwords and concepts, including the intriguing concept of the LLM (Large Language Model).

However, amidst this sea of hype, it’s worth noting that projects like Axon and Bumblebee have put Elixir at the forefront of what is currently achievable in machine learning, showing how effectively the language can harness the power of ML.

But are LLMs and ML all we can do?

Enter Symbolic AI (well, we did enter in the 80s). Machine learning (ML) and Symbolic AI represent two distinct approaches in the field of artificial intelligence. ML focuses on inferring rules or patterns from data using statistical techniques, allowing the system to learn and make predictions or classifications based on the learned rules. It relies on probabilistic reasoning and generalizes from observed examples to handle new data.

Symbolic AI, on the other hand, operates on predefined rules or symbolic representations of knowledge, where explicit rules are provided as inputs to the system. It can then apply deductive reasoning to infer deterministic outputs based on the given rules. While ML excels at handling complex and uncertain data, Symbolic AI offers explicit control over logical reasoning and can provide deterministic results based on predefined rules. Both approaches have their strengths and are used in different AI applications depending on the requirements and nature of the problem at hand.
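To make the Symbolic AI side concrete, here is a minimal, self-contained sketch in plain Elixir (the module, rule shape, and names here are purely illustrative, not part of any library): rules are explicit condition/conclusion pairs that are applied deductively to a set of facts until nothing new can be inferred.

```elixir
defmodule TinyRules do
  # A rule is a pair of functions: {condition over facts, conclusion over facts}.
  # We keep applying all rules until the fact map stops changing (a fixpoint),
  # which is the essence of forward chaining.
  def infer(facts, rules) do
    new_facts =
      Enum.reduce(rules, facts, fn {cond?, conclude}, acc ->
        if cond?.(acc), do: Map.merge(acc, conclude.(acc)), else: acc
      end)

    if new_facts == facts, do: facts, else: infer(new_facts, rules)
  end
end

demo_rules = [
  # "If we know the yearly salary, derive the monthly salary"
  {fn f -> Map.has_key?(f, :salary) end,
   fn f -> %{monthly_salary: div(f.salary, 12)} end},
  # "If we know the monthly salary, derive the net payout"
  {fn f -> Map.has_key?(f, :monthly_salary) end,
   fn f -> %{net_amount: f.monthly_salary * 0.64} end}
]

TinyRules.infer(%{salary: 60_000}, demo_rules)
# %{salary: 60000, monthly_salary: 5000, net_amount: 3200.0}
```

The same input always produces the same output, and every conclusion can be traced back to the rule that produced it — the deterministic, inspectable behavior described above.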

For example: generating a “creative” text or a new image is obviously much better done by a probabilistic algorithm. But would you trust a medical diagnosis from a probabilistic machine that “kind of” gets it right “most of the time”, with no guarantee it will always answer in the same way? Certainly not, and this is where Symbolic AI can help us all.

The rules in Symbolic AI are added by humans using what is usually called a DSL (Domain-Specific Language), and so they are perfectly introspectable and testable.
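As a sketch of what “testable” can mean here, a ruleset can be exercised like any other Elixir code. This hypothetical ExUnit test reuses the NeuralBridge calls shown in the examples below; the expected value, 5000, follows deterministically from the rule itself (60000 divided by 12):

```elixir
defmodule SalaryRulesTest do
  use ExUnit.Case

  test "a yearly salary infers a monthly salary" do
    rule =
      NeuralBridge.Rule.new(
        id: 1,
        given: """
        Person's salary is equal $salary
        """,
        then: """
        let $monthly_salary = div($salary, 12)
        Person's monthly_salary is $monthly_salary
        """
      )

    inferred =
      NeuralBridge.Session.new("test")
      |> NeuralBridge.Session.add_rules([rule])
      |> NeuralBridge.Session.add_facts("Person's salary is 60000")
      |> Map.fetch!(:inferred_facts)

    # Inferred facts are plain data, so asserting on them is straightforward
    assert Enum.any?(inferred, fn fact ->
             fact.attribute == "monthly_salary" and fact.value == 5000
           end)
  end
end
```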

I have ventured into building a Symbolic AI engine in Elixir, and I would like to show you what we can do with it. I created a DSL (Sanskrit), a rule engine (Retex), and a project that glues all of that together, called Neural Bridge (https://github.com/lorenzosinisi/neural_bridge).

So in Elixir we can now easily build expert systems with the help of domain experts. They simply have to learn a DSL which is very English-like, and provide devs with the ruleset to solve a specific problem. Let’s look at some examples of how we can do that:

Example 1: calculate the net salary of an employee in the UK

rules = [
  NeuralBridge.Rule.new(
    id: 1,
    given: """
    Person's salary is equal $salary
    """,
    then: """
    let $monthly_salary = div($salary, 12)
    Person's monthly_salary is $monthly_salary
    """
  ),
  NeuralBridge.Rule.new(
    id: 2,
    given: """
    Person's monthly_salary is equal $monthly_salary
    """,
    then: """
    let $payout = mult($monthly_salary, 0.64)
    Salary's net_amount is $payout
    """
  ),
  NeuralBridge.Rule.new(
    id: 3,
    given: """
    Salary's net_amount is equal $amount
    """,
    then: """
    Salary's net_amount is $amount
    """
  )
]

facts = """
Person's salary is 60000
Person's employment_type is "Full-time"
Person's location is "UK"
"""

NeuralBridge.Session.new("uk")
|> NeuralBridge.Session.add_rules(rules)
|> NeuralBridge.Session.add_facts(facts)
|> Map.fetch!(:inferred_facts)

Example 2: dynamic pricing

In this example, the rules calculate the discount to be applied to a customer based on the number of items they have bought in a month. The discount percentages are determined as follows:

  • If the customer has bought fewer than 2 items, the discount percentage is set to 0%.
  • If the customer has bought exactly 3 items, the discount percentage is set to 10%.
  • If the customer has bought 5 items, the discount percentage is set to 20%.

rules = [
  NeuralBridge.Rule.new(
    id: 1,
    given: """
    Customer's number_of_items_bought is equal 5
    """,
    then: """
    Customer's discount_percentage is 0.2
    """
  ),
  NeuralBridge.Rule.new(
    id: 2,
    given: """
    Customer's number_of_items_bought is lesser 2
    """,
    then: """
    Customer's discount_percentage is 0.0
    """
  ),
  NeuralBridge.Rule.new(
    id: 3,
    given: """
    Customer's number_of_items_bought is equal 3
    """,
    then: """
    Customer's discount_percentage is 0.1
    """
  )
]

facts = """
Customer's number_of_items_bought is 5
"""

[
  %Retex.Wme{
    identifier: "Customer",
    attribute: "discount_percentage",
    value: 0.2
  }
] =
  NeuralBridge.Session.new("uk")
  |> NeuralBridge.Session.add_rules(rules)
  |> NeuralBridge.Session.add_facts(facts)
  |> Map.fetch!(:inferred_facts)

And there’s more: you can write rules of any complexity. To understand a bit better how it works, check out the README file of the project: https://github.com/lorenzosinisi/neural_bridge/blob/master/README.md.

The goal of this article was to inspire you to look beyond LLM and Transformer technology, and to open the door to tech which is very old but could bring huge benefits to your organization without requiring billions of dollars in server infrastructure. In fact, the size of an “AI” like this is just a few MB, and it can handle from a few hundred to hundreds of thousands of rules. 👽
