TinyML Implementation using Raspberry Pi Pico: Geometry Gesture Detection (Part-II)

Subir Maity
Oct 3, 2021 · 6 min read


In the first part of this tutorial, we trained our model, and it is now ready for deployment. Click the Deployment option (on the left side of the window). On the next page, select the C++ library (shown below) and download the code.

Deployment of the model. Credit: Edge Impulse Inc.

Now we have to compile and build the code to generate the .uf2 file required by the Pico. This is not a trivial task; several configurations need to be done first.

Make sure that you have already set up Visual Studio Code (on the Windows platform) to program your Raspberry Pi Pico. You need to install several dependencies, such as CMake, the GCC ARM compiler, the Pico SDK libraries, and VS Code. A complete guide can be found here. Test your installation by running a simple program, such as blinking an LED. If you have already done this before, that's great!

Create a folder with a name such as “pico_geometry_detection” and unzip the C++ library, which you downloaded from the model deployment section, into it. Create two empty folders named build and source inside it. Also, don't forget to copy the pico_sdk_import.cmake file, available inside the pico-examples folder of your Pico SDK installation directory, into the project folder. Now, the folder should look like this:
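With these pieces in place, the project folder should look roughly like the tree below (the edge-impulse-sdk, model-parameters, and tflite-model folders come from the unzipped Edge Impulse library):

```
pico_geometry_detection/
├── build/
├── source/
├── edge-impulse-sdk/
├── model-parameters/
├── tflite-model/
├── CMakeLists.txt
└── pico_sdk_import.cmake
```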

In the next step, the CMakeLists.txt file needs to be edited so that the compiler correctly includes all subfolders during compilation. In the tutorial “Machine Learning Inference on Raspberry Pico 2040” by Dmitry Maslov, the required CMakeLists.txt configuration is listed; it is available in his GitHub repo and is reproduced here without modification (except that in the first line you should state the correct CMake version — for me it was 3.19, which is compatible with VS Code 2019).

cmake_minimum_required(VERSION 3.19)

set(MODEL_FOLDER .)
set(EI_SDK_FOLDER edge-impulse-sdk)

include(pico_sdk_import.cmake)

project(hello_ml C CXX ASM)
set(CMAKE_C_STANDARD 11)
set(CMAKE_CXX_STANDARD 11)

pico_sdk_init()

add_executable(hello_ml
    source/main.cpp
    source/ei_classifier_porting.cpp
)

include(${MODEL_FOLDER}/edge-impulse-sdk/cmake/utils.cmake)

target_link_libraries(hello_ml pico_stdlib)

# enable usb output, disable uart output
pico_enable_stdio_usb(hello_ml 1)
pico_enable_stdio_uart(hello_ml 0)

target_include_directories(hello_ml PRIVATE
    ${MODEL_FOLDER}
    ${MODEL_FOLDER}/classifier
    ${MODEL_FOLDER}/tflite-model
    ${MODEL_FOLDER}/model-parameters
)

target_include_directories(hello_ml PRIVATE
    ${EI_SDK_FOLDER}
    ${EI_SDK_FOLDER}/third_party/ruy
    ${EI_SDK_FOLDER}/third_party/gemmlowp
    ${EI_SDK_FOLDER}/third_party/flatbuffers/include
    ${EI_SDK_FOLDER}/third_party
    ${EI_SDK_FOLDER}/tensorflow
    ${EI_SDK_FOLDER}/dsp
    ${EI_SDK_FOLDER}/classifier
    ${EI_SDK_FOLDER}/anomaly
    ${EI_SDK_FOLDER}/CMSIS/NN/Include
    ${EI_SDK_FOLDER}/CMSIS/DSP/PrivateInclude
    ${EI_SDK_FOLDER}/CMSIS/DSP/Include
    ${EI_SDK_FOLDER}/CMSIS/Core/Include
)

include_directories(${INCLUDES})

# find model source files
RECURSIVE_FIND_FILE(MODEL_FILES "${MODEL_FOLDER}/tflite-model" "*.cpp")
RECURSIVE_FIND_FILE(SOURCE_FILES "${EI_SDK_FOLDER}" "*.cpp")
RECURSIVE_FIND_FILE(CC_FILES "${EI_SDK_FOLDER}" "*.cc")
RECURSIVE_FIND_FILE(S_FILES "${EI_SDK_FOLDER}" "*.s")
RECURSIVE_FIND_FILE(C_FILES "${EI_SDK_FOLDER}" "*.c")
list(APPEND SOURCE_FILES ${S_FILES})
list(APPEND SOURCE_FILES ${C_FILES})
list(APPEND SOURCE_FILES ${CC_FILES})
list(APPEND SOURCE_FILES ${MODEL_FILES})

# add all sources to the project
target_sources(hello_ml PRIVATE ${SOURCE_FILES})
pico_add_extra_outputs(hello_ml)
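Note that RECURSIVE_FIND_FILE is not a built-in CMake command; it comes from the utils.cmake file included earlier (edge-impulse-sdk/cmake/utils.cmake). Internally it is essentially a thin wrapper around file(GLOB_RECURSE ...), roughly equivalent to the sketch below:

```cmake
# Approximate shape of the macro defined in edge-impulse-sdk/cmake/utils.cmake:
# it recursively collects all files under ${dir} matching ${pattern} and
# stores the list in the variable named by ${return_list}.
macro(RECURSIVE_FIND_FILE return_list dir pattern)
    file(GLOB_RECURSE new_list "${dir}/${pattern}")
    set(${return_list} ${new_list})
endmacro()
```

This is why the include(${MODEL_FOLDER}/edge-impulse-sdk/cmake/utils.cmake) line must appear before the RECURSIVE_FIND_FILE calls.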

Then, create a file named ei_classifier_porting.cpp, paste into it the following code (available in the GitHub repo mentioned by Dmitry Maslov), and place the file inside the source directory.

File: ei_classifier_porting.cpp

/* Edge Impulse inferencing library
* Copyright (c) 2021 EdgeImpulse Inc.
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in
* all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/

#include "porting/ei_classifier_porting.h"

#include "pico/stdlib.h"
#include <stdarg.h>
#include <stdlib.h>
#include <stdio.h>

#define EI_WEAK_FN __attribute__((weak))

EI_WEAK_FN EI_IMPULSE_ERROR ei_run_impulse_check_canceled() {
    return EI_IMPULSE_OK;
}

EI_WEAK_FN EI_IMPULSE_ERROR ei_sleep(int32_t time_ms) {
    sleep_ms(time_ms);
    return EI_IMPULSE_OK;
}

uint64_t ei_read_timer_ms() {
    return to_ms_since_boot(get_absolute_time());
}

uint64_t ei_read_timer_us() {
    return to_us_since_boot(get_absolute_time());
}

/**
 * Printf function: formats with vsnprintf and prints via stdio
 * (USB serial on the Pico)
 */
__attribute__((weak)) void ei_printf(const char *format, ...) {
    static char print_buf[1024] = { 0 };

    va_list args;
    va_start(args, format);
    int r = vsnprintf(print_buf, sizeof(print_buf), format, args);
    va_end(args);

    if (r > 0) {
        /* print via "%s" so a '%' in the formatted text is not re-interpreted */
        printf("%s", print_buf);
    }
}

__attribute__((weak)) void ei_printf_float(float f) {
    ei_printf("%f", f);
}

__attribute__((weak)) void *ei_malloc(size_t size) {
    return malloc(size);
}

__attribute__((weak)) void *ei_calloc(size_t nitems, size_t size) {
    return calloc(nitems, size);
}

__attribute__((weak)) void ei_free(void *ptr) {
    free(ptr);
}

#if defined(__cplusplus) && EI_C_LINKAGE == 1
extern "C"
#endif
__attribute__((weak)) void DebugLog(const char* s) {
    ei_printf("%s", s);
}

After this, the next job is to create the main.cc file. A sample main.cc for a similar type of project is available in the GitHub repo of MJRoBot (Marcelo Rovai); you can explore his tutorial TinyML — Motion Recognition Using Raspberry Pi Pico for further insight. I have edited this code as necessary to make it compatible with this project (the link is mentioned at the end of this article).

So, inside the source folder, we now have two files: main.cc and ei_classifier_porting.cpp. Next, go to the model-parameters folder and open the model_metadata.h file in a text editor. Go to line 79, where you will find a line beginning with #define EI_CLASSIFIER_SENSOR. By default, it is set to EI_CLASSIFIER_UNKNOWN_SENSOR (because you are not using an officially supported board). Change it as shown below (on the same line):

#define EI_CLASSIFIER_SENSOR                     EI_CLASSIFIER_SENSOR_ACCELEROMETER

Now, copy the whole project folder into the directory that contains the pico-examples directory (do not place it inside pico-examples). Open Visual Studio Code (if you are using the Windows platform) and build the project. After a successful build, the app.uf2 file will be generated inside the build folder.

Screenshot of Visual Studio Code with Project explorer.

Deployment in Pico and Testing the Project

While holding down the BOOTSEL button on the Pico board, connect the USB cable to the PC/laptop. A removable drive will be mounted; drag and drop the app.uf2 file onto it. The Pico will reboot and start executing the model. Open a serial terminal (TeraTerm or the Arduino Serial Monitor) and select the proper COM port. Make sure that the baud rate is set to 115200.

When the board is at rest (no movement), the reported probability of each of the three gestures is roughly equal, at around 0.33, as shown below. (Strictly speaking this is not a meaningful prediction; you may revise the code later to report a label such as “standby” or “idle” in this case.)

The output of Serial Monitor in Standby mode

The circle is detected correctly with almost 99% confidence, and the same holds for the square and triangle patterns.

Serial monitor output for circular Gesture
Serial monitor output for Square Gesture
Serial monitor output for triangular Gesture

As mentioned earlier, there is sometimes confusion between the square and triangle detections, especially when the movement is a little irregular.

The source code and other files are available in my GitHub repository. If you do not want to build the project yourself, you can transfer the app.uf2 file (available in the repository) to the Pico and verify the design.

This Edge Impulse project is available here. You can clone it and make the necessary changes to the ML model for better performance.

Important note: Hold the hardware with the accelerometer in the same orientation that was used during the data-collection phase. If you hold the board slightly diagonally, the x, y, and z values will differ significantly for each shape, and detection will not be accurate. More sophisticated coding is required to overcome this orientation dependency of the hardware.

Acknowledgment: Thanks to Dmitry Maslov and Marcelo Rovai.


Subir Maity

Area of Specialization: VLSI Design, nano-scale MOSFET, Machine Learning for Microcontrollers