AI and The Art of Human-Like Tasting

By Gabriela Padilla

Insights of Nature · Dec 27, 2023


Thinking about a future robot chef is pretty cool, right? We’ve got the tech to mimic human sight and hearing quite well.

But what about our sense of taste?

Identifying taste is a difficult process. When it comes to cooking, it is one of the most essential senses!

Why should we care about AI models trained to taste?

For industries like food and beverage, cosmetics, and even pharmaceuticals, understanding human taste preferences is crucial. AI models can be trained on vast datasets of consumer feedback, reviews, and sensory data to help companies develop products that cater to specific tastes and preferences. Harnessing AI to anticipate consumer tastes, behaviors, and responses to novel food items increases overall customer satisfaction and engagement.

AI models can be used to detect changes in product quality by analyzing taste profiles. This can be particularly useful in ensuring consistency in large-scale production processes.

AI can be used to reshape the landscape of crop breeding. With the introduction of this AI-driven methodology, there emerges a significant opportunity to emphasize and enhance both the taste and nutritional content of crops.

Let’s dive into how humans sense taste

The ability to taste comes from tiny molecules released when we chew, drink, or digest food; these molecules stimulate special sensory cells in the mouth and throat. These taste cells are clustered within the taste buds of the tongue and roof of the mouth, and along the lining of the throat.

When the taste cells are stimulated, they send messages through three specialized taste nerves to the brain, where specific tastes are identified. Taste cells have receptors that respond to one of at least five basic taste qualities: sweet, sour, bitter, salty, and umami.

Another chemosensory mechanism

The common chemical sense involves thousands of nerve endings, especially on the moist surfaces of the eyes, nose, mouth, and throat. These nerve endings trigger sensations such as the coolness of mint and the burning of chili peppers, while other specialized nerves create the sensations of heat, cold, and texture. When we eat, the five basic taste qualities combine with the common chemical sense, the sensations of heat, cold, and texture, and a food’s aroma to produce the overall perception of flavor.

To make this happen, we can use different approaches

A recent study delved deep into the realm of taste prediction, leveraging a dataset encompassing 2601 molecules. This research offers a thorough examination of both well-established and recently introduced molecular feature representations tailored for taste prediction.

The core objective is to discern the optimal neural network architecture that can deliver enhanced precision and adaptability for datasets primarily oriented towards small molecules.

Visual representation of input, molecular embedding, and classifier of the utilized models.

We can divide this process into 3 steps:

1. Feature Representation of Molecules: Molecule features refer to specific characteristics or attributes of molecules that are used to describe or represent them in computational or mathematical models.

In molecular machine learning, three families of representations are common: fingerprints, convolutional neural networks (CNNs), and graph neural networks (GNNs). All three have performed well across many benchmarks.

  • Molecular Fingerprints: These are binary or integer representations capturing structural aspects of molecules. We checked six types: Morgan, PubChem, Daylight, RDKit, ESPF, and ErG.
Visual representation of how molecular fingerprints work
  • CNN-based Molecular Embedding: CNNs are deep learning models adept at analyzing grid-like data and can be adapted for molecular analysis. Think of this as turning a molecule’s textual representation into a language computers understand. We tried three types: Simple CNN, CNN-LSTM, and CNN-GRU.
Visual representation of how CNNs work
  • Graph Neural Networks (GNN): GNNs are neural networks tailored for graph-structured data, making them ideal for tasks involving molecular graphs. We compared five GNN models, implemented with the Life and DeepPurpose libraries: GCN, NeuralFP, GIN-AttrMasking, GIN-ContextPred, and AttentiveFP.
Visual representation of how GNNs work
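To make the fingerprint idea concrete, here is a deliberately simplified sketch. Real fingerprints like Morgan hash atom environments in the molecular graph (typically via a library such as RDKit); the toy function below only hashes substrings of a SMILES string into a fixed-size bit vector, to illustrate the general "hash structural fragments into bits" mechanism. The function name, bit width, and SMILES strings are illustrative choices, not from the study.

```python
import hashlib

def hashed_fingerprint(smiles: str, n_bits: int = 64, radius: int = 2) -> list[int]:
    """Toy hashed fingerprint: hash every substring of the SMILES string
    up to radius + 1 characters long into a fixed-size bit vector.
    (Real Morgan fingerprints hash atom environments in the molecular
    graph, not raw text; this only illustrates the hashing idea.)"""
    bits = [0] * n_bits
    for size in range(1, radius + 2):
        for i in range(len(smiles) - size + 1):
            fragment = smiles[i : i + size]
            h = int(hashlib.md5(fragment.encode()).hexdigest(), 16)
            bits[h % n_bits] = 1          # set the bit this fragment hashes to
    return bits

# Two molecules with different structures yield different bit patterns:
caffeine = hashed_fingerprint("CN1C=NC2=C1C(=O)N(C(=O)N2C)C")
sucrose = hashed_fingerprint("C(C1C(C(C(C(O1)OC2(C(C(C(O2)CO)O)O)CO)O)O)O)O")
print(sum(caffeine), sum(sucrose))  # number of set bits in each vector
```

The resulting binary vectors are exactly the kind of fixed-length input a downstream classifier can consume.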

2. Molecular Embedding into Vectors: Following representation, each molecule is embedded into a fixed-length numeric vector.

3. Classification Using Predictors: For the classification task, we employed a multilayer perceptron (MLP), a type of feedforward artificial neural network. It’s one of the simplest forms of neural network, yet powerful enough to model complex non-linear relationships.

Think of a Multilayer Perceptron (MLP) as a company with specialized teams, each handling different tasks. Here’s a quick breakdown:

  1. Specialized Teams as Neurons: Each team in the company represents a neuron in the MLP, focusing on specific data.
  2. Team Roles (Activation Functions): Just as teams have roles, neurons have activation functions guiding their data processing.
  3. Collaborative Processing: Teams work together, sharing insights. Similarly, MLP layers collaboratively transform data, refining it at each step.
  4. Final Decision: As the company makes a collective decision, the MLP produces a prediction based on the refined data.
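The steps above can be sketched in a few lines of NumPy. This is a minimal forward pass only, with randomly initialized weights; the layer sizes, the three taste classes, and the stand-in fingerprint input are illustrative assumptions, not the study’s trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, W1, b1, W2, b2):
    """One-hidden-layer MLP: fingerprint vector -> taste-class probabilities.
    ReLU is the hidden activation; softmax turns output scores into probabilities."""
    h = np.maximum(0.0, x @ W1 + b1)      # hidden "teams" of neurons
    logits = h @ W2 + b2                  # final output layer
    exp = np.exp(logits - logits.max())   # numerically stable softmax
    return exp / exp.sum()

n_bits, n_hidden, n_tastes = 64, 32, 3    # e.g. sweet, bitter, umami
W1 = rng.normal(scale=0.1, size=(n_bits, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, n_tastes)); b2 = np.zeros(n_tastes)

x = rng.integers(0, 2, size=n_bits).astype(float)  # stand-in binary fingerprint
probs = mlp_forward(x, W1, b1, W2, b2)
print(probs)  # three class probabilities summing to 1
```

In practice the weights would be learned by backpropagation on the labeled molecule dataset; this sketch only shows how a fingerprint flows through the layers to a prediction.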

Comparison of Model Performance

Performance of the different molecular representation models in predicting sweet, bitter, and umami tastes, as measured by their AUROC scores
Performance of the different molecular representation models in predicting sweet, bitter, and umami tastes, as measured by their AUPRC scores

The AUROC metric assesses a model’s comprehensive discriminative ability and its balanced prediction performance. On a similar note, AUPRC acts as another balanced evaluation metric, especially valuable when dealing with datasets that have significant imbalances.
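AUROC has a useful probabilistic reading: it is the chance that a randomly chosen positive example (say, a truly sweet molecule) receives a higher score than a randomly chosen negative one, with ties counting half. A pure-Python pairwise version makes this concrete; the example labels and scores below are made up for illustration.

```python
def auroc(labels, scores):
    """AUROC = probability that a random positive scores higher than a
    random negative (ties count half). Pairwise O(P*N) version, fine
    for small evaluation sets."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy "is this molecule sweet?" predictions:
labels = [1, 1, 0, 0, 1, 0]
scores = [0.9, 0.7, 0.4, 0.2, 0.3, 0.5]
print(auroc(labels, scores))  # 7 of 9 positive/negative pairs ranked correctly
```

A score of 1.0 means every positive outranks every negative; 0.5 is no better than chance, which is why AUROC is a balanced measure even when one taste class is rare.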

In the context of taste prediction, the findings indicate that models based on Graph Neural Networks (GNNs) consistently outshine other methodologies.

NotCo: Producing vegan products with AI

NotCo develops its plant-based products with an AI platform called Giuseppe, built by NotCo’s team to understand everything about the foods we love and recreate them by replacing all animal-derived ingredients with plant-based ones.

Giuseppe is tasked with matching the taste, texture, and aroma of animal-derived products. By delving into the molecular intricacies of the target item, it expertly replicates these attributes using exclusively plant-based ingredients. This sophisticated process is supported by a comprehensive library boasting 300,000 plants.

Giuseppe generates multiple recipes equivalent to the product we want to recreate. It looks for matches in flavor, texture, scent, and functionality. Each recipe is different and there are some that match better than others.

What to Expect in the Future

As we look to the future, we can expect continuous advancements in this technology. Our efforts will persistently focus on refining methods to replicate human taste and boosting the precision of the models designed for this task. Within the food industry, it’s foreseeable that an increasing number of products will be crafted using AI models.

Personal work in the future

Looking ahead, I see myself diving into the world of AI models trained to taste with a specific focus on the food industry. My goal is to use these AI systems to improve food security. I will also explore how these models can be applied in the field of robotics, furthering their potential to benefit other industries.

Now, when it comes to our robot chef

By employing sensors that replicate taste receptors and AI algorithms to mimic the taste processing mechanisms, we could develop robots with a greatly improved sense of taste.

We’re getting closer to having a smart robot chef right at home!

Thank you for reading this! If you want to see more of my work, connect with me on LinkedIn!
