In this blog I’ll show how to build a natural language interface for a typical home light switch so that you can turn the lights on and off with simple commands like “Turn off all the lights, please” or “Get the lights on in the kids’ bedroom.”
We’ll concentrate on the Natural Language Interface (NLI) part, and I’ll leave speech-to-text and the actual light control outside the scope of this short blog. You can easily add speech-to-text with the Web Speech API if necessary, while Arduino or HomeKit can provide a simple API to control the lights in your house.
For our implementation we’ll use NLPCraft and the Scala language (NLPCraft also supports any other JVM-based language of your choice, such as Java, Groovy or Kotlin). NLPCraft is a free open source project that allows you to quickly build a domain-specific natural language interface to any device or software. It uses modern semantic modeling and deterministic intent-based matching of the user input instead of traditional computational-linguistics approaches such as neural networks. Because of that, it doesn’t require any pre-existing domain corpora or lengthy model training and development.
Most of the work, when using NLPCraft, is around building a semantic model for your specific domain. A model defines a set of named entities (user-defined or 3rd-party from spaCy, OpenNLP, Stanford CoreNLP or Google Natural Language) that should be detected in the user input. NLPCraft also provides advanced intent matching, which we’ll use later in this example.
Let’s think about our task at hand: supporting a free-form natural language interface for a typical light switch. Here’s a sample of the commands we’d like to support:
Turn the lights off in the entire house
Switch on the illumination in the master bedroom closet
Get the lights on
Please, put the light out in the upstairs bedroom
Set the lights on in the entire house
Turn the lights off in the guest bedroom
Could you please switch off all the lights?
Dial off illumination on the 2nd floor
Please, no lights!
Kill off all the lights now!
No lights in the bedroom, please
By looking at these examples you can easily spot three distinct entities that we need to detect in the user input and that we’ll later use in our intents:
- An action to turn the light on
- An action to turn the light off
- Location of the light
Here’s the declarative part of the NLPCraft semantic model in YAML that defines these three entities:
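A sketch of such a configuration, loosely following the official NLPCraft light switch example (the exact macros, synonyms and model metadata below are illustrative rather than exhaustive):

```yaml
id: "nlpcraft.lightswitch.ex"
name: "Light Switch Example Model"
version: "1.0"
description: "NLI-powered light switch example model."
macros:
  - name: "<ACTION>"
    macro: "{turn|switch|dial|put|set}"
  - name: "<LIGHT>"
    macro: "{it|them|light|illumination|lamp|lamplight}"
elements:
  - id: "ls:on"
    groups:
      - "act"
    description: "Light switch ON action."
    synonyms:
      - "<ACTION> {on|up} <LIGHT>"
      - "<ACTION> <LIGHT> {on|up}"
  - id: "ls:off"
    groups:
      - "act"
    description: "Light switch OFF action."
    synonyms:
      - "<ACTION> <LIGHT> {off|out}"
      - "{off|out|kill|no} <LIGHT>"
  - id: "ls:loc"
    description: "Location of the lights."
    synonyms:
      - "{entire|whole|*} {house|home|building}"
      - "{upstairs|downstairs|*} {master|kids|guest|*} {bedroom|bathroom|closet}"
      - "{1st|first|2nd|second} floor"
```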
Besides a few syntactical peculiarities, this model definition is pretty self-explanatory:
- The elements section defines our three elements (i.e. named entities): ls:loc, ls:on and ls:off. Each element is defined through a macro-expanded set of synonyms.
- The macros section provides a list of macros used later in the element definitions.
What is remarkable about this model is how productive and economical these few dozen lines of YAML are:
When loaded by NLPCraft, this model translates into more than 100,000 distinct synonyms for each element, which in a traditional approach would have to be created manually.
NLPCraft is smart about synonym processing: apart from basic normalization, tokenization, stemming, stop-word elimination, etc., it performs advanced shuffling and applies a weighted-selection algorithm. You can also use PoS tags, regular expressions or user-defined predicates when defining semantic elements for more advanced detection use cases.
Now that we have the declarative part of the model configuration, we can finish the rest of the model definition by providing the intent-matching logic. Technically, an NLPCraft model is just an implementation of the NCModel Java interface. We’ll use the convenient NCModelFileAdapter adapter to implement our model with the above YAML-based configuration:
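A sketch of such a model in Scala, loosely based on the official NLPCraft light switch example (the YAML file name, the exact intent DSL syntax and the token accessor methods below are assumptions and may differ between NLPCraft versions):

```scala
import org.apache.nlpcraft.model._

// The YAML file name below is an assumption -- point it at your own model configuration.
class LightSwitchModel extends NCModelFileAdapter("lightswitch_model.yaml") {
    // The intent matches exactly one action entity (any entity from the 'act' group)
    // and zero or more 'ls:loc' location entities.
    @NCIntent("intent=ls term(act)={groups @@ 'act'} term(loc)={id == 'ls:loc'}*")
    def onMatch(
        @NCIntentTerm("act") actTok: NCToken,
        @NCIntentTerm("loc") locToks: List[NCToken]
    ): NCResult = {
        val status = if (actTok.getId == "ls:on") "on" else "off"
        val locations =
            if (locToks.isEmpty) "entire house"
            else locToks.map(_.getOriginalText).mkString(", ")

        // Add your Arduino, HomeKit, etc. integration at this point.
        NCResult.text(s"Lights are [$status] in [${locations.toLowerCase}].")
    }
}
```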
- The model is initialized from the external YAML-based configuration that we discussed above.
- We attach the intent to its onMatch(…) callback. Note that we use the NCIntent Java annotation and the text-based intent DSL supported by NLPCraft. This intent will match any user input that contains the following entities:
- exactly one action (i.e. any entity belonging to the action group);
- zero or more light locations.
- The detected entities are mapped to the callback method parameters via NCIntentTerm annotations when the intent is matched and the callback is invoked.
- The callback implementation simply returns the status of the lights. You can add your Arduino, HomeKit, etc. integration at this point.
And we are done 🎉
Compile and deploy the model into the data probe, start the REST server (more details on this here), and you have a model ready to accept REST calls and start controlling the lights with natural language.
Although you can simply use any REST tool to send input to the model, we’ll use the built-in JUnit 5-compatible test framework to get a bit more automation.
Create a LightSwitchTest.java file with this code and run it:
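A sketch of such a test, loosely based on the NLPCraft test client from the official example (the model ID and the exact client API calls below are assumptions):

```java
import org.apache.nlpcraft.common.NCException;
import org.apache.nlpcraft.model.tools.embedded.NCEmbeddedProbe;
import org.apache.nlpcraft.model.tools.test.NCTestClient;
import org.apache.nlpcraft.model.tools.test.NCTestClientBuilder;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;

import java.io.IOException;

import static org.junit.jupiter.api.Assertions.assertTrue;

class LightSwitchTest {
    private NCTestClient cli;

    @BeforeEach
    void setUp() throws NCException, IOException {
        // Start an embedded probe with our model and open a test client for it.
        NCEmbeddedProbe.start(LightSwitchModel.class);
        cli = new NCTestClientBuilder().newBuilder().build();
        cli.open("nlpcraft.lightswitch.ex"); // Model ID from the YAML configuration (assumed).
    }

    @AfterEach
    void tearDown() throws NCException, IOException {
        if (cli != null) cli.close();
        NCEmbeddedProbe.stop();
    }

    @Test
    void test() throws NCException, IOException {
        // The same sentences we set out to support in the beginning.
        assertTrue(cli.ask("Turn the lights off in the entire house.").isOk());
        assertTrue(cli.ask("Switch on the illumination in the master bedroom closet.").isOk());
        assertTrue(cli.ask("Get the lights on.").isOk());
        assertTrue(cli.ask("Please, put the light out in the upstairs bedroom.").isOk());
        assertTrue(cli.ask("Could you please switch off all the lights?").isOk());
        assertTrue(cli.ask("No lights in the bedroom, please.").isOk());
    }
}
```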
Note that the test uses the original sentences we set out to support in the beginning. You can look at the test output and verify that everything works as expected.
Having an automated test allows you to quickly play with the model and iterate on your modifications without breaking something in the process.
Go ahead and play with this model: you can add Arduino integration or adapt the model’s ls:loc entity to your own home configuration; you can add some slang or quirky ways to operate your lights…