Quick and Easy Anomaly Detection App for SQL Enthusiasts, Using Snowflake Cortex ML Functions

Tech Stack Used: Snowflake Cortex ML, Streamlit, Snowpark Container Services

For over a decade, solving analytical problems with classical machine learning was limited to data geeks with experience in programming languages like Python, R and Scala. Data analysts with SQL expertise were always dependent on the data scientist fraternity to help them generate predictions from data.

Snowflake Cortex ML functions broke this chain of dependency, empowering SQL users to use ML to solve their business problems.

“Snowflake Cortex is Snowflake’s intelligent, fully-managed service that enables organizations to quickly analyze data and build AI applications, all within Snowflake. These ML functions give you automated predictions and insights into your data using machine learning.”

Well, having set the context, let’s look at a very simple yet interesting scenario. A malfunctioning sensor is a very common occurrence on any factory floor; you may replace ‘factory’ with ‘hospital’ or ‘high-tech building’, for example, since the problem statement is industry agnostic. Engineers have always struggled to find the spells during which a sensor has generated anomalous readings. And identifying those spells only gets you halfway: you still need to identify the root cause of the erroneous readings.

Data scientists have been building complex ML models to get this job done. Now Snowflake Cortex ML SQL functions empower a Data Analyst to do it as well: 1. detect anomalies, and 2. identify some of the features contributing to each anomaly, all using SQL functions.

For our demonstration, let’s try to understand our hypothetical factory floor:

  1. The factory has two sites: Site-A and Site-B
  2. Site-A has two sensors & Site-B has one sensor
  3. Every sensor has a set of dimensions: Manufacturer, Input Voltage, Age and Surrounding weather condition

Here is a quick snap of our labeled test dataset:

Sample Dataset (Labeled)

What are we going to solve?

  1. We will try to identify anomalies in ‘measurement’ using SNOWFLAKE.ML.ANOMALY_DETECTION,
  2. try to identify the contributing features that have led to each anomaly using SNOWFLAKE.ML.TOP_INSIGHTS, and
  3. finally, create a quick Streamlit app and host it on Snowpark Container Services to review each anomaly and its contributing features visually.

If you have followed this far and are excited to see the entire process in action, let us leap forward.

Steps To Generate Training Dataset And ML Model:

  1. We will generate categorical data for the dimensions Manufacturer and Weather Condition.
  2. Next, we will generate labeled sensor readings covering a period of 10 days for 3 sensors located across 2 sites. The first 9 days of readings will be used for training, and the 10th day’s data will be used for testing our model.
  3. Once done, we will manually inject anomalies for a few spells during the first 9 days, so that our generated model is not biased.
  4. We will generate our ML model from the first 9 days of data using SNOWFLAKE.ML.ANOMALY_DETECTION.
use role accountadmin;

create role if not exists spcs_app_role;

grant create integration on account to role spcs_app_role;
grant create compute pool on account to role spcs_app_role;
grant create warehouse on account to role spcs_app_role;
grant create database on account to role spcs_app_role;
grant usage on integration allow_all_eai to role spcs_app_role;
grant bind service endpoint on account to role spcs_app_role;


declare
username varchar;
stmt varchar;
begin
Select current_user() into :username;
stmt := 'GRANT ROLE spcs_app_role TO USER ' || :username;
execute immediate stmt;
return 'role assigned';
end;


use role spcs_app_role;

create warehouse if not exists app_wh;
Use warehouse app_wh;

Create database if not exists detect_anomaly;
Use database detect_anomaly;


--- Create static reference data
Create or replace table sensor_manufacturers(
Id int,
Name Varchar(100)
);

Insert into sensor_manufacturers values
(1,'Panasonic Corporation'),
(2,'Qualcomm Technologies'),
(3,'STMicroelectronics'),
(4,'Sony Corporation'),
(5,'TE Connectivity'),
(6,'Texas Instruments'),
(7,'Siemens'),
(8,'Amphenol Corporation'),
(9,'Dwyer Instruments, LLC'),
(10,'Bosch Sensortec'),
(11,'Honeywell International'),
(12,'Sensirion AG');


Create or replace table weather_cond(
Id int,
Condition Varchar(100)
);

Insert into weather_cond values
(1,'Windy'),
(2,'Rainy'),
(3,'Humid'),
(4,'Snow'),
(5,'Cold'),
(6,'Cloudy'),
(7,'Storm'),
(8,'Hot');


-- Prepare Training Data (10 days of labeled training data)

Create or replace table ms_review_for_anomaly_with_add_dimensions
( ts timestamp,
site_id varchar,
sensor_id int,
row_id int,
measurement float,
anomaly_label boolean,
manufacturer varchar(100),
weather_condition varchar(100),
voltage float,
age int
);

Set days_of_data = 10; -- days
Set mean_temp = 100; -- degree
Set frequency = 1; -- min



--- Generating Data for 10 Days for Site-A, Sensor 1
Insert into ms_review_for_anomaly_with_add_dimensions
With CTE1
as (
Select
dateadd(minute,(row_number() over(order by seq) - 1), '2023-01-01 00:00:00') time,
'Site-A' as site_id,
'1' as sensor_id,
row_number() over(order by seq) row_id,
$mean_temp + normal(0,1,random(1)) measurement,
False as anomaly_label,
(uniform(1, 12, random())) manufacturer,
(uniform(1, 8, random())) weather_cond,
(uniform(1.5::float, 3::float, random())) Voltage,
(uniform(1, 5, random())) Age
from
(Select seq4() as seq from table (generator(rowcount => 1 + $days_of_data*24*60/$frequency)))
)
Select Time,site_id,sensor_id,row_id,measurement,anomaly_label,name as manufacturer,condition as weather_cond ,voltage,age from CTE1
left join sensor_manufacturers sm
on CTE1.manufacturer = sm.id
left join weather_cond wc
on CTE1.weather_cond = wc.id;


--- Generating Data for 10 Days for Site-A, Sensor 2
Insert into ms_review_for_anomaly_with_add_dimensions
With CTE1
as (
Select
dateadd(minute,(row_number() over(order by seq) - 1), '2023-01-01 00:00:00') time,
'Site-A' as site_id,
'2' as sensor_id,
row_number() over(order by seq) row_id,
$mean_temp + normal(0,1,random(1)) measurement,
False as anomaly_label,
(uniform(1, 12, random())) manufacturer,
(uniform(1, 8, random())) weather_cond,
(uniform(1.5::float, 3::float, random())) Voltage,
(uniform(1, 5, random())) Age
from
(Select seq4() as seq from table (generator(rowcount => 1 + $days_of_data*24*60/$frequency)))
)
Select Time,site_id,sensor_id,row_id,measurement,anomaly_label,name as manufacturer,condition as weather_cond ,voltage,age from CTE1
left join sensor_manufacturers sm
on CTE1.manufacturer = sm.id
left join weather_cond wc
on CTE1.weather_cond = wc.id;


--- Generating Data for 10 Days for Site-B, Sensor 1
Insert into ms_review_for_anomaly_with_add_dimensions
With CTE1
as (
Select
dateadd(minute,(row_number() over(order by seq) - 1), '2023-01-01 00:00:00') time,
'Site-B' as site_id,
'1' as sensor_id,
row_number() over(order by seq) row_id,
$mean_temp + normal(0,1,random(1)) measurement,
False as anomaly_label,
(uniform(1, 12, random())) manufacturer,
(uniform(1, 8, random())) weather_cond,
(uniform(1.5::float, 3::float, random())) Voltage,
(uniform(1, 5, random())) Age
from
(Select seq4() as seq from table (generator(rowcount => 1 + $days_of_data*24*60/$frequency)))
)
Select Time,site_id,sensor_id,row_id,measurement,anomaly_label,name as manufacturer,condition as weather_cond ,voltage,age from CTE1
left join sensor_manufacturers sm
on CTE1.manufacturer = sm.id
left join weather_cond wc
on CTE1.weather_cond = wc.id;



--- INJECT ANOMALIES FOR TRAINING

--- Update Anomaly Set 1

Update ms_review_for_anomaly_with_add_dimensions
set
measurement = cte1.measurement,
voltage = cte1.voltage,
manufacturer =cte1.manufacturer,
weather_condition = cte1.weather_cond,
age = cte1.age,
anomaly_label=TRUE
from (
With CTE1
as (
Select
dateadd(minute,(row_number() over(order by seq) - 1), '2023-01-02 06:00:00') time,
100 + normal(2,5,random(5)) measurement,
(2+ uniform(3.0::float, 5::float, random())) voltage,
(uniform(1,2, random())) manufacturer,
(uniform(1,2, random())) weather_cond,
(uniform(5,7, random())) Age
from
(Select seq4() as seq from table (generator(rowcount => 150)))
)
Select Time,measurement,name as manufacturer,condition as weather_cond ,voltage,age from CTE1
left join sensor_manufacturers sm
on CTE1.manufacturer = sm.id
left join weather_cond wc
on CTE1.weather_cond = wc.id
) CTE1
where ms_review_for_anomaly_with_add_dimensions.ts = CTE1.time
and site_id = 'Site-A'
and sensor_id = 1;


Select * from ms_review_for_anomaly_with_add_dimensions
where ts between '2023-01-02 06:00:00' and '2023-01-02 08:00:00' and site_id = 'Site-A'
and sensor_id = 1;


--- Update Anomaly Set 2

Update ms_review_for_anomaly_with_add_dimensions
set
measurement = cte1.measurement,
voltage = cte1.voltage,
weather_condition = cte1.weather_cond,
age = cte1.age,
anomaly_label=TRUE
from (
With CTE1
as (
Select
dateadd(minute,(row_number() over(order by seq) - 1), '2023-01-05 17:00:00') time,
100 + normal(2,5,random(5)) measurement,
(2+ uniform(3.0::float, 5::float, random())) voltage,
(uniform(7,10, random())) manufacturer,
(uniform(5,8, random())) weather_cond,
(uniform(5,7, random())) Age
from
(Select seq4() as seq from table (generator(rowcount => 61)))
)
Select Time,measurement,name as manufacturer,condition as weather_cond ,voltage,age from CTE1
left join sensor_manufacturers sm
on CTE1.manufacturer = sm.id
left join weather_cond wc
on CTE1.weather_cond = wc.id
) CTE1
where ms_review_for_anomaly_with_add_dimensions.ts = CTE1.time
and site_id = 'Site-A'
and sensor_id = 2;


Select * from ms_review_for_anomaly_with_add_dimensions
where ts between '2023-01-05 17:00:00' and '2023-01-05 18:00:00' and site_id = 'Site-A'
and sensor_id = 2;



--- Update Anomaly Set 3

Update ms_review_for_anomaly_with_add_dimensions
set
measurement = cte1.measurement,
voltage = cte1.voltage,
manufacturer =cte1.manufacturer,
weather_condition = cte1.weather_cond,
age = cte1.age,
anomaly_label=TRUE
from (
With CTE1
as (
Select
dateadd(minute,(row_number() over(order by seq) - 1), '2023-01-07 03:00:00') time,
100 + normal(2,5,random(5)) measurement,
(3+ uniform(3.0::float, 5::float, random())) voltage,
(uniform(7,12, random())) manufacturer,
(uniform(5,8, random())) weather_cond,
(uniform(5,7, random())) Age
from
(Select seq4() as seq from table (generator(rowcount => 61)))
)
Select Time,measurement,name as manufacturer,condition as weather_cond ,voltage,age from CTE1
left join sensor_manufacturers sm
on CTE1.manufacturer = sm.id
left join weather_cond wc
on CTE1.weather_cond = wc.id
) CTE1
where ms_review_for_anomaly_with_add_dimensions.ts = CTE1.time
and site_id = 'Site-B'
and sensor_id = 1;


Select * from ms_review_for_anomaly_with_add_dimensions
where ts between '2023-01-07 03:00:00' and '2023-01-07 04:00:00' and site_id = 'Site-B'
and sensor_id = 1;


-- Create training dataset view

Create or replace view training_data as
Select
ts,
[site_id,sensor_id] as item_id,
measurement,
anomaly_label
from ms_review_for_anomaly_with_add_dimensions
where ts <= '2023-01-09 23:59:00';
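Before training, it is worth a quick sanity check that the injected anomaly spells actually made it into the training view, since a supervised model with zero positive labels would be pointless. A simple count per series does the job (this query is just a suggested check, not part of the original script):

```sql
-- Sanity check: each series should show a small number of rows with anomaly_label = TRUE
Select
    item_id,
    anomaly_label,
    count(*) as readings
from training_data
group by item_id, anomaly_label
order by item_id, anomaly_label;
```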

--Create the model. There are two options: 1. with labels (supervised) and 2. without labels (unsupervised). Choose based on your requirements.
--We will use the supervised model here.

Create or replace SNOWFLAKE.ML.ANOMALY_DETECTION MS_DETECT_ANOMALY(
INPUT_DATA => SYSTEM$REFERENCE('VIEW','training_data'),
SERIES_COLNAME => 'ITEM_ID',
TIMESTAMP_COLNAME => 'TS',
TARGET_COLNAME => 'MEASUREMENT',
LABEL_COLNAME => 'ANOMALY_LABEL'
);


-- Create or replace SNOWFLAKE.ML.ANOMALY_DETECTION MS_DETECT_ANOMALY(
-- INPUT_DATA => SYSTEM$REFERENCE('VIEW','training_data'),
-- SERIES_COLNAME => 'ITEM_ID',
-- TIMESTAMP_COLNAME => 'TS',
-- TARGET_COLNAME => 'MEASUREMENT',
-- LABEL_COLNAME => '' --- unsupervised
-- );

-- show snowflake.ml.anomaly_detection;

Steps To Generate Test Dataset and Detect Anomaly:

  1. Inject 2 more spells of anomalies into the 10th day of sensor readings. In the first spell we will update the target metric (measurement) along with all the dimensions that can lead to its anomaly, and in the second spell we will update the target metric along with only one dimension (voltage). This will test whether the SNOWFLAKE.ML.TOP_INSIGHTS function rightly identifies the contributing features.
  2. Create a stored procedure to identify and report anomalies. This stored procedure can be scheduled at a specific interval to continuously identify anomalies and report them over email, and it is dynamic enough to be integrated into your data engineering pipeline. If you are an ETL geek, I am sure you can figure out how.
  3. For our demo scenario we will run the stored procedure on the 10th day, from 00:00 to 23:59 at an interval of 1 hour, using a Python script [included in the code repo].
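Outside the demo, the hourly invocation could also be scheduled natively inside Snowflake with a task, instead of an external script. A minimal sketch along those lines (the task name and the one-hour look-back window are illustrative, not part of the repo):

```sql
-- Illustrative: invoke the monitoring procedure every hour over the previous hour's readings
Create or replace task run_anomaly_monitoring
    warehouse = app_wh
    schedule = '60 MINUTE'
as
    call ms_monitoring_anomalies(
        to_varchar(dateadd(hour, -1, current_timestamp()), 'YYYY-MM-DD HH24:MI:SS'),
        to_varchar(current_timestamp(), 'YYYY-MM-DD HH24:MI:SS')
    );

-- Tasks are created in a suspended state; resume to start the schedule
Alter task run_anomaly_monitoring resume;
```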

use role spcs_app_role;
Use database detect_anomaly;
Use schema public;

--- Inject Anomalies For Inference

-- Prep Inference Data

Set days_of_data = 1; -- days
Set mean_temp = 100; -- degree
Set frequency = 1; -- min


--- Update Anomaly Set 1 for day=10
--- Here we are updating the target metric and all features

Update ms_review_for_anomaly_with_add_dimensions
set
measurement = cte1.measurement,
voltage = cte1.voltage,
manufacturer =cte1.manufacturer,
weather_condition = cte1.weather_cond,
age = cte1.age,
anomaly_label=FALSE
from (
With CTE1
as (
Select
dateadd(minute,(row_number() over(order by seq) - 1), '2023-01-10 03:00:00') time,
100 + normal(2,7,random(5)) measurement,
(3 + uniform(3.0::float, 5::float, random())) voltage,
(uniform(1,2, random())) manufacturer,
(uniform(1,2, random())) weather_cond,
(uniform(5,7, random())) Age
from
(Select seq4() as seq from table (generator(rowcount => 61)))
)
Select Time,measurement,name as manufacturer,condition as weather_cond ,voltage,age from CTE1
left join sensor_manufacturers sm
on CTE1.manufacturer = sm.id
left join weather_cond wc
on CTE1.weather_cond = wc.id
) CTE1
where ms_review_for_anomaly_with_add_dimensions.ts = CTE1.time
and site_id = 'Site-A'
and sensor_id = 1;


Select * from ms_review_for_anomaly_with_add_dimensions
where ts between '2023-01-10 02:30:00' and '2023-01-10 04:30:00' and site_id = 'Site-A'
and sensor_id = 1;



--- Update Anomaly Set 2 for day=10
--- Here we are updating the target metric and one feature (voltage)

Update ms_review_for_anomaly_with_add_dimensions
set
measurement = cte1.measurement,
voltage = cte1.voltage
from (
With CTE1
as (
Select
dateadd(minute,(row_number() over(order by seq) - 1), '2023-01-10 15:00:00') time,
100 + normal(2,7,random(5)) measurement,
(3 + uniform(3.0::float, 5::float, random())) voltage,
(uniform(1,2, random())) manufacturer,
(uniform(1,2, random())) weather_cond,
(uniform(5,7, random())) Age
from
(Select seq4() as seq from table (generator(rowcount => 61)))
)
Select Time,measurement,name as manufacturer,condition as weather_cond ,voltage,age from CTE1
left join sensor_manufacturers sm
on CTE1.manufacturer = sm.id
left join weather_cond wc
on CTE1.weather_cond = wc.id
) CTE1
where ms_review_for_anomaly_with_add_dimensions.ts = CTE1.time
and site_id = 'Site-A'
and sensor_id = 2;

-- Create an email integration

CREATE NOTIFICATION INTEGRATION send_email_notification_int
TYPE=EMAIL
ENABLED=TRUE;


-- Create tables for saving and logging anomalies

CREATE OR REPLACE TABLE save_anomaly_detection (
series variant,
ts TIMESTAMP_NTZ,
y FLOAT,
forecast FLOAT,
lower_bound FLOAT,
upper_bound FLOAT,
is_anomaly BOOLEAN,
percentile FLOAT,
distance FLOAT
);


CREATE OR REPLACE TABLE log_anomaly_detection (
ts timestamp,
measurement_start timestamp,
measurement_end timestamp,
no_of_anomalies int
);



-- Create a stored procedure to monitor and report anomalies. This stored procedure can be scheduled as needed.
-- However, for a quick demo we will run it N times using a Python program.

CREATE OR REPLACE PROCEDURE ms_monitoring_anomalies(start_time varchar, end_time varchar)
RETURNS integer NOT NULL
LANGUAGE SQL
AS
$$
DECLARE
CREATE_SQL_STMT VARCHAR DEFAULT '';
EMAIL_TXT VARCHAR;
CNT INT;
INS_SQL_STMT VARCHAR;
BEGIN

CREATE_SQL_STMT := 'CREATE OR REPLACE VIEW TEST_DATA AS SELECT TS,[SITE_ID,SENSOR_ID] AS ITEM_ID,MEASUREMENT,ANOMALY_LABEL FROM MS_REVIEW_FOR_ANOMALY_WITH_ADD_DIMENSIONS WHERE TS BETWEEN '||''''||start_time||''' and '''||end_time||'''' ;
EXECUTE IMMEDIATE CREATE_SQL_STMT;

CALL MS_DETECT_ANOMALY!DETECT_ANOMALIES(
INPUT_DATA => SYSTEM$REFERENCE('VIEW','TEST_DATA'),
SERIES_COLNAME => 'ITEM_ID',
TIMESTAMP_COLNAME => 'TS',
TARGET_COLNAME => 'MEASUREMENT',
CONFIG_OBJECT => {'prediction_interval':0.999}
);

CREATE OR REPLACE TEMPORARY TABLE SAVE_ANOMALY_DETECTION_TEMP (SERIES,TS,Y,FORECAST,LOWER_BOUND,UPPER_BOUND,IS_ANOMALY,PERCENTILE,DISTANCE)
AS
SELECT * FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()));

SELECT COUNT(1) INTO :CNT FROM SAVE_ANOMALY_DETECTION_TEMP WHERE IS_ANOMALY = TRUE;

IF (:CNT > 0) THEN -- log the run only when anomalies were found

INS_SQL_STMT := 'INSERT INTO LOG_ANOMALY_DETECTION (TS,MEASUREMENT_START,MEASUREMENT_END,NO_OF_ANOMALIES) VALUES (CURRENT_TIMESTAMP(), \''||TO_TIMESTAMP(start_time)||'\' , \''||TO_TIMESTAMP(end_time)||'\','||CNT||')';
EXECUTE IMMEDIATE INS_SQL_STMT;

IF (:CNT > 10) THEN

INSERT INTO SAVE_ANOMALY_DETECTION (SERIES,TS,Y,FORECAST,LOWER_BOUND,UPPER_BOUND,IS_ANOMALY,PERCENTILE,DISTANCE)
SELECT * FROM SAVE_ANOMALY_DETECTION_TEMP;

EMAIL_TXT := 'Anomalous Data Detected Between Measurement Timestamp:- ' ||start_time ||' and '||end_time||' with no of anomalies = '||cnt;
CALL SYSTEM$SEND_EMAIL(
'send_email_notification_int',
'ritabrata.saha@snowflake.com',
'Anomalous Measurement Detected',
:EMAIL_TXT
);
END IF;

END IF;

RETURN CNT;

END;
$$;

Before we start building the Streamlit application, let us review a sample result from the anomaly detection process. All identified anomalies are saved by the procedure in the table “save_anomaly_detection”. Let’s look at a Snowsight graph created from the day-10 data in this table.

Snowsight Graph Showing Detected Anomaly

Steps To Build The App And Test It Locally:

We will build a Streamlit app, dockerize it, and run it locally. The app is built in such a way that we can test it locally before pushing the image to the Snowflake Image Registry.

Quick snap of our project folder structure :

Project Folder Structure

The Streamlit code artefacts have been shared in the zip file at the end of the blog.

Let’s build a Docker image and spin up a container using the docker-compose.yml to test the Streamlit app locally.

Please note the local container is invoked with Snowflake credentials passed as environment variables from the terminal.
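The actual docker-compose.yml is in the zip file at the end of the blog; for orientation, a compose file along these lines would do the job. The service name, image tag and environment-variable names below are illustrative assumptions, so match them to whatever your Streamlit code actually reads:

```yaml
# Illustrative compose sketch: credentials come from the shell environment, not the file
services:
  streamlit:
    image: anomaly_app:latest
    ports:
      - "8501:8501"
    environment:
      - SNOWFLAKE_ACCOUNT      # passed through from the terminal session
      - SNOWFLAKE_USER
      - SNOWFLAKE_PASSWORD
      - SNOWFLAKE_WAREHOUSE=app_wh
```

Listing a variable without a value passes it through from the invoking shell, which is what keeps the credentials out of the file.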

Local Testing Of The Docker Image

This spins up a local container, and we can hit the URL locally as shown below:

Local Streamlit App In Action

Steps To Push The Image to Snowflake Image Registry and Instantiate A Container Service:

  1. Tag the image that you just built:

docker tag <app_name> <repository_url>/<app_name>:v1

  2. Log in to your Snowflake account’s Docker registry:

docker login <snowflake_registry_hostname> -u <user_name>

  3. Push the local Docker image to the Snowflake private repository:

docker push <repository_url>/<app_name>:v1

The image is now available in the Snowflake Image Registry. What’s next?

  1. Create a network rule allowing external network access.
  2. Create a compute pool.
  3. Create a service on the compute pool using the available image.

The script below does the job.


use role spcs_app_role;
use warehouse app_wh;
Use database detect_anomaly;
Use schema public;



CREATE SECURITY INTEGRATION IF NOT EXISTS snowservices_ingress_oauth
TYPE=oauth
OAUTH_CLIENT=snowservices_ingress
ENABLED=true;


CREATE OR REPLACE NETWORK RULE ALLOW_ALL_RULE
TYPE = 'HOST_PORT'
MODE = 'EGRESS'
VALUE_LIST= ('0.0.0.0:443', '0.0.0.0:80');


CREATE or REPLACE EXTERNAL ACCESS INTEGRATION ALLOW_ALL_EAI
ALLOWED_NETWORK_RULES = (ALLOW_ALL_RULE)
ENABLED = true;


create schema if not exists anomalyapp;

use schema anomalyapp;

CREATE or REPLACE IMAGE REPOSITORY app_image_repo;

SHOW IMAGE REPOSITORIES;
SELECT "repository_url" FROM table(result_scan(last_query_id()));

-- List the uploaded image
SELECT SYSTEM$REGISTRY_LIST_IMAGES('/detect_anomaly/anomalyapp/app_image_repo');


CREATE COMPUTE POOL if not exists app_compute_pool
MIN_NODES = 1
MAX_NODES = 1
INSTANCE_FAMILY = CPU_X64_M;

SHOW COMPUTE POOLS;

DESCRIBE COMPUTE POOL app_compute_pool;

show services;

Drop service if exists streamlit_spcs;


CREATE SERVICE streamlit_spcs
IN COMPUTE POOL app_compute_pool
FROM SPECIFICATION $$
spec:
  containers:
  - name: streamlit
    image: <repository_url>/<app_name>:v1
    env:
      SNOWFLAKE_WAREHOUSE: app_wh
  endpoints:
  - name: streamlit
    port: 8501
    public: true
$$;



SELECT SYSTEM$GET_SERVICE_STATUS('streamlit_spcs');

SELECT system$get_service_logs('streamlit_spcs', 0, 'streamlit', 500);

SHOW ENDPOINTS IN SERVICE streamlit_spcs;

DESCRIBE SERVICE streamlit_spcs;

Let’s get the URL of the app hosted on SPCS and open it in the browser:

The app has been successfully hosted on SPCS. Now let’s review the insights.

The first set of anomalies was detected by the model between 3AM and 4AM on the 10th of January 2023. TOP_INSIGHTS has rightly identified the contributing features as:

A. "voltage > 6.4835675"  
B. "manufacturer = Panasonic Corporation"
C. "age > 4.5" and
D. "weathercondition = Windy"

The second set of anomalies was detected by the model between 3PM and 4PM on the 10th of January 2023. TOP_INSIGHTS has rightly identified the contributing feature as:

A. "voltage > 6.0020072"
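The TOP_INSIGHTS call itself follows the same instance-then-method pattern as the anomaly detection model: create an instance, then call its GET_DRIVERS method over data where a boolean label column marks the suspect rows. The view and instance names below are illustrative, and the sketch compares the first anomalous spell against normal readings for the same sensor; adapt the window and filters to your own spells:

```sql
-- Illustrative: candidate driver dimensions plus the metric, with TRUE marking the anomalous spell
Create or replace view insight_input as
Select
    manufacturer, weather_condition, voltage, age,
    measurement,
    ts between '2023-01-10 03:00:00' and '2023-01-10 04:00:00' as label
from ms_review_for_anomaly_with_add_dimensions
where site_id = 'Site-A' and sensor_id = 1;

Create or replace SNOWFLAKE.ML.TOP_INSIGHTS measurement_insights();

Call measurement_insights!GET_DRIVERS(
    INPUT_DATA => SYSTEM$REFERENCE('VIEW', 'insight_input'),
    LABEL_COLNAME => 'label',
    METRIC_COLNAME => 'measurement'
);
```

The result set ranks segments (like the voltage and manufacturer conditions quoted above) by their contribution to the difference in the metric between the labeled and unlabeled groups.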

Here is a quick video of the final app in action :

Please note opinions expressed in this article are solely my own and do not represent the views or opinions of my employer.

~Cheers
