How to Build a Data Connector for DADI API

EDGE Network · 10 min read · Mar 23, 2018

Overview

For the first two major versions of API, MongoDB was the only supported database engine. With the release of 3.0, we wanted to offer developers the flexibility to use the database engine that best suited their projects, whatever their size and scalability needs.

In line with the introduction of support for multiple template engines in Web 3.0, we moved all interactions with MongoDB from the app core into a plugin system we call data connectors, which allows API to work with virtually any database engine.

We’ve been hard at work creating data connectors for various vendors, like CouchDB, RethinkDB and even a flat file JSON filestore. In this tutorial, we’re going to show you how to build your own.

By the end of this tutorial we’ll have a partially-complete API Data Connector for MySQL, with a subset of the database operations implemented. Everything else can be finished for homework!

A Data Connector Template

During the development of API 3.0, the DADI engineering team put together a sample repository to make it easier to build your own connector. It even contains a test suite which should help you craft a robust connector for your chosen database engine.

The repository can be found here: https://github.com/dadi/api-connector-template. You can fork this repository and follow along.

Beginning the Data Connector

  • Fork the dadi/api-connector-template repository into your own profile or organization.
  • Rename it to something nicer — let’s call it api-mysql.
  • Clone the repository to your local machine.
  • Change into the repository directory and install the dependencies.
  • Now install any dependencies needed to interact with your chosen database engine. For our MySQL connector, we’ll need the Node.js MySQL driver.
  • To get a checklist of the things your connector must be able to do, run the test suite.
  • Create a MySQL database if one doesn’t already exist:
  • Create a MySQL user if one doesn’t already exist:
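Assuming your fork is named api-mysql, the middle steps are the usual `git clone`, `cd api-mysql && npm install`, `npm install mysql2 --save` and `npm test`. For the last two steps, a minimal SQL sketch (all names here are placeholders; substitute your own):

```sql
-- Placeholder database, user and password — change to match your setup
CREATE DATABASE my_database;
CREATE USER 'my_user'@'localhost' IDENTIFIED BY 'my_secret_password';
GRANT ALL PRIVILEGES ON my_database.* TO 'my_user'@'localhost';
FLUSH PRIVILEGES;
```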

Configuration file

Each data connector normally requires its own configuration file containing settings such as the database host, database name and connection credentials. The configuration files follow the naming convention <connector>.<environment>.json. For example, the MongoDB data connector uses a file called mongodb.development.json when running in development mode. These configuration files are placed in the config directory of your connector module and also in the config directory of your API application.

For the MySQL connector we’ll need a file called mysql.development.json. Because we're starting with the unit tests, let's create the test mode configuration first.

  • Rename config/apiConnector.test.json to config/mysql.test.json
  • Add the following configuration, changing values to match your setup:
  • Edit config.js so the line that loads the configuration file is correct:
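Assuming the MySQL connector mirrors the shape of the MongoDB connector's configuration, config/mysql.test.json might look something like this (all values are placeholders):

```json
{
  "database": {
    "host": "127.0.0.1",
    "port": 3306,
    "username": "my_user",
    "password": "my_secret_password",
    "database": "my_database"
  }
}
```

In config.js, change the line that loads the configuration file so it points at mysql.<environment>.json instead of apiConnector.<environment>.json, matching however the template currently builds that filename.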

Database operations

This tutorial will only focus on the bare minimum required for a data connector and as such will only implement three database operations: connect, insert and find.

To keep it simple, we’ll do all our work in the existing file lib/index.js.

Before adding code for the database operations, require the mysql2 dependency at the top of this file:

lib/index.js
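Assuming the package was installed as mysql2, that's a single line near the top of the file, alongside the template's existing requires:

```javascript
// lib/index.js — near the top of the file
const mysql = require('mysql2')
```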

Connect

When API calls the connector’s connect() method it expects the connector to create a connection to the database and assign it to the property this.database, before resolving the promise.

To connect to MySQL we call createConnection and supply the connection options from the configuration file:
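A sketch of what connect() might look like. It assumes mysql is the required mysql2 module, and the config.get() calls are an assumption modelled on how other DADI connectors read their configuration files:

```javascript
function DataStore () {} // stand-in for the connector's existing constructor

DataStore.prototype.connect = function () {
  return new Promise((resolve, reject) => {
    // `mysql` is the mysql2 module required at the top of lib/index.js;
    // the config keys below are assumed to mirror mysql.<environment>.json
    const connection = mysql.createConnection({
      host: config.get('database.host'),
      port: config.get('database.port'),
      user: config.get('database.username'),
      password: config.get('database.password'),
      database: config.get('database.database')
    })

    connection.connect(err => {
      if (err) return reject(err)

      // API expects the live connection on this.database before the promise resolves
      this.database = connection

      resolve()
    })
  })
}
```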

Insert

All API data connectors should support inserting an array of documents. For this simple version of the MySQL connector we’re only going to handle inserting single documents.

In the sample connector repository, the insert() method returns an empty array:
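The stub looks roughly like this (the parameter names are taken from the other DADI connectors, so treat them as an approximation of the template's actual signature):

```javascript
function DataStore () {} // stand-in for the connector's existing constructor

DataStore.prototype.insert = function ({ data, collection, options = {}, schema, settings = {} }) {
  return new Promise((resolve, reject) => {
    // the template simply resolves with an empty result set
    resolve([])
  })
}
```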

Replace the code above with the following:
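A sketch of a replacement, assuming an _id primary key column and mysql2's callback-style query() method:

```javascript
function DataStore () {} // stand-in for the connector's existing constructor

DataStore.prototype.insert = function ({ data, collection, options = {}, schema, settings = {} }) {
  return new Promise((resolve, reject) => {
    // mysql2 expands `??` to an escaped identifier and `?` to escaped values,
    // so the document object can be passed straight to query()
    this.database.query('INSERT INTO ?? SET ?', [collection, data], (err, result) => {
      if (err) return reject(err)

      // MySQL only reports metadata (affectedRows, insertId), but API expects
      // the inserted documents back, so fetch the new row with a second query
      this.database.query('SELECT * FROM ?? WHERE _id = ?', [collection, result.insertId], (err, rows) => {
        return err ? reject(err) : resolve(rows)
      })
    })
  })
}
```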

There are two things you may notice in our new insert() method. The first is that we don't have to build a full INSERT statement to pass to the database. The Node.js MySQL module we're using handles converting objects passed to the query() method into SQL statements, saving us quite a lot of manual transformation. The following snippet is from the documentation for the mysql module, on which the mysql2 module is based.
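That documentation example is along these lines (assuming a live connection object):

```javascript
const post = { id: 1, title: 'Hello MySQL' }

// The driver expands the object into SET `id` = 1, `title` = 'Hello MySQL',
// escaping identifiers and values along the way
connection.query('INSERT INTO posts SET ?', post, (error, results, fields) => {
  if (error) throw error
})
```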

The second is that we’re making a second call to the database to get the results of the insert. This is because API expects a result set back from insert() that contains the inserted documents, and MySQL only returns the number of affected rows.

Before we can run any tests for the insert() method, we have to remove the "skip" instruction in the test file. In test/index.js, change the following:
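Assuming the template marks pending suites with .skip, the change is along these lines (the exact describe/it names in the template may differ):

```javascript
// Before:
describe.skip('insert', function () { /* … */ })

// After:
describe('insert', function () { /* … */ })
```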

If you run npm test now the first thing you should notice (if the database has connected correctly) is an error saying that a table doesn't exist.

Uncaught Error: Table 'my_database.users' doesn't exist

Automatic table creation

We really need data connectors to automatically create tables that don’t exist, so that new collections can be added to API without having to perform any maintenance on the underlying database.

Let’s add a createTable() method that can be called from each of the database operations to ensure the requested table exists. This method needs to read the schema for the requested collection and generate a CREATE TABLE query that includes the names of the columns and their data types. See the DADI API documentation for Collections for detailed information.
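A minimal sketch of such a method. The type mapping below is a small assumed subset of the collection field types, not the full DADI collection spec:

```javascript
function DataStore () {} // stand-in for the connector's existing constructor

// A minimal assumed mapping from DADI field types to MySQL column types
const TYPE_MAP = {
  String: 'VARCHAR(255)',
  Number: 'FLOAT',
  Boolean: 'TINYINT(1)',
  DateTime: 'DATETIME'
}

function getTableDefinition (collection, fields) {
  const columns = Object.keys(fields).map(name => {
    return '`' + name + '` ' + (TYPE_MAP[fields[name].type] || 'TEXT')
  })

  // every table gets a primary key to hold API's document identifiers
  columns.unshift('`_id` INT NOT NULL AUTO_INCREMENT PRIMARY KEY')

  return 'CREATE TABLE IF NOT EXISTS `' + collection + '` (' + columns.join(', ') + ')'
}

DataStore.prototype.createTable = function (collection, schema) {
  return new Promise((resolve, reject) => {
    this.database.query(getTableDefinition(collection, schema.fields), err => {
      return err ? reject(err) : resolve()
    })
  })
}
```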

Now modify the insert() method to include a call to createTable():
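Putting it together, a sketch of the modified method (the no-op createTable() stub below stands in for the real method described above, to keep the example self-contained):

```javascript
function DataStore () {} // stand-in for the connector's existing constructor

// no-op stub standing in for the real createTable() described above
DataStore.prototype.createTable = function (collection, schema) {
  return Promise.resolve()
}

DataStore.prototype.insert = function ({ data, collection, options = {}, schema, settings = {} }) {
  // ensure the target table exists before attempting the INSERT
  return this.createTable(collection, schema).then(() => {
    return new Promise((resolve, reject) => {
      this.database.query('INSERT INTO ?? SET ?', [collection, data], (err, result) => {
        if (err) return reject(err)

        this.database.query('SELECT * FROM ?? WHERE _id = ?', [collection, result.insertId], (err, rows) => {
          return err ? reject(err) : resolve(rows)
        })
      })
    })
  })
}
```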

Test again!

Running npm test now should show some of our insert tests passing.

Find

Having already implemented a find within the insert() method, it should be simple to make our connector's find() method work. Add the following to lib/index.js in the existing find() method:
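A sketch, again assuming mysql2's callback-style query(). Only simple equality matches are handled here; query operators are left as homework:

```javascript
function DataStore () {} // stand-in for the connector's existing constructor

// Build a WHERE clause from an API query object. Only simple equality
// matches are handled in this sketch.
function buildWhereClause (query) {
  const fields = Object.keys(query || {})

  if (fields.length === 0) return { sql: '', values: [] }

  return {
    sql: ' WHERE ' + fields.map(field => '`' + field + '` = ?').join(' AND '),
    values: fields.map(field => query[field])
  }
}

DataStore.prototype.find = function ({ query, collection, options = {}, schema, settings }) {
  const where = buildWhereClause(query)

  return new Promise((resolve, reject) => {
    this.database.query('SELECT * FROM `' + collection + '`' + where.sql, where.values, (err, rows) => {
      return err ? reject(err) : resolve(rows)
    })
  })
}
```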

Before we can run any tests for the find() method, we have to remove the "skip" instruction in the test file. In test/index.js, change the following:

When you run npm test the first find test should fail because it's expecting the database to contain only a couple of records. We should really be clearing the database before each test run. For this we'll use the convenient dropDatabase() method in the sample connector.

Find the beforeEach() method in test/index.js and modify it as follows:
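An assumed shape for the hook, connecting and clearing the database before each test:

```javascript
// test/index.js — sketch; adjust to match the template's actual setup
beforeEach(done => {
  const db = new DataStore()

  db.connect()
    .then(() => db.dropDatabase())
    .then(() => done())
    .catch(done)
})
```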

Find the dropDatabase() method in lib/index.js and modify it as follows:
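One way to sketch it for MySQL, where "dropping the database" becomes dropping each table rather than the schema itself (an assumption about what the test suite needs):

```javascript
function DataStore () {} // stand-in for the connector's existing constructor

DataStore.prototype.dropDatabase = function (collectionName) {
  return new Promise((resolve, reject) => {
    // Drop every table rather than the schema, so the connection stays valid
    this.database.query('SHOW TABLES', (err, rows) => {
      if (err) return reject(err)

      const drops = rows.map(row => {
        // each row has a single column whose name depends on the database
        const table = row[Object.keys(row)[0]]

        return new Promise((res, rej) => {
          this.database.query('DROP TABLE ??', [table], e => (e ? rej(e) : res()))
        })
      })

      Promise.all(drops).then(() => resolve()).catch(reject)
    })
  })
}
```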

If we run the tests now, we should have one passing test (and six pending).

Tasks to complete

Before this connector can be finished and used with API, there are a number of problems to resolve. Obviously we’ve left the update() and delete() methods for you to implement, but the following information about "options" also needs to be considered.

Passing options to find()

API calls the find() method with an options object containing values that instruct the connector how to handle things such as sorting and limiting the records returned. See the API documentation and the JSON Filestore connector for implementation ideas.
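One way to start is a helper that translates the options into SQL clauses. The option names used here (sort, limit, skip) are assumptions modelled on the MongoDB connector:

```javascript
// Translate an assumed API options object ({ sort, limit, skip }) into SQL clauses
function getQueryOptions (options) {
  let sql = ''

  if (options.sort) {
    // sort values of 1 mean ascending, -1 descending, as in the MongoDB connector
    const clauses = Object.keys(options.sort).map(field => {
      return '`' + field + '` ' + (options.sort[field] === 1 ? 'ASC' : 'DESC')
    })
    sql += ' ORDER BY ' + clauses.join(', ')
  }

  if (options.limit) sql += ' LIMIT ' + Number(options.limit)
  if (options.skip) sql += ' OFFSET ' + Number(options.skip)

  return sql
}
```

The result can be appended to the SELECT statement built in find().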

We’ve left these unit tests pending in the sample connector repository. Simply remove the .skip for each of these when you're ready to test your implementation.

Returning results & metadata

While we haven’t added this to the sample connector, API actually expects a result from the data connector’s find() method that contains both the result set and a "metadata" block that contains the total count of matching records and the number of pages. You can see how the MongoDB connector does it. This uses the NPM package @dadi/metadata which you can add to your project's dependencies.
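The shape might be along these lines. The call signature of @dadi/metadata is not verified here, so treat this as a placeholder rather than the package's real API:

```javascript
// Sketch only — check the @dadi/metadata README for the exact signature
const metadata = require('@dadi/metadata')

function formatResultSet (results, count, options) {
  return {
    results,
    metadata: metadata(options, count) // e.g. page, offset, totalCount, totalPages
  }
}
```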

Publishing the connector to NPM

Edit the package.json. We need to rename the package (it’s still called @dadi/api-connector-template). Change the name property to be api-mysql:

You should also change the repository property to reflect your own GitHub repository:
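Together, the two changes might look like this fragment of package.json (your-username is a placeholder for your GitHub account):

```json
{
  "name": "api-mysql",
  "repository": {
    "type": "git",
    "url": "https://github.com/your-username/api-mysql.git"
  }
}
```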

When you’re ready to publish the connector, run npm publish from the root of the repository.

Testing with API

So far we’ve been testing the connector in isolation. Once all the tests are passing, it’s a good idea to use npm link to test the connector with your API application. This can be done before you publish to NPM, to ensure everything is working before you make it publicly available. See this excellent article regarding the use of npm link.

Usage with DADI CLI

The easiest way to install API is with DADI CLI, a command-line tool that allows developers to spin up an API instance in just a few seconds using a single command.

Now that your data connector is available on NPM, it can be pulled into CLI for new API installations. The following command will install the latest version of API in a directory called my-new-api using our fresh MySQL data connector, taking care of installing all the dependencies.
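The command might look like this (the exact flag name is an assumption; check the CLI's help output for your version):

```shell
# Assumed invocation of DADI CLI with our published connector
dadi api new my-new-api --database=api-mysql
```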

When users don’t supply a --database parameter, CLI will present them with a list of data connector modules straight from NPM.

To ensure an optimal and safe experience, we filter this list to modules that are built or trusted by DADI. If you think your data connector should be on this list, get in touch: we'll be delighted to review your work and, if necessary, help you smooth any rough edges before adding it to the list.

That’s it for today!

The code for this article, forked from the sample data connector, can be found here.

For more information about the API Data Connectors, or help building your own, ask us in the usual channels. You can connect with the engineering team on Discord, via email (email addresses in our profiles) or send a tweet to @dadi.

Written by James Lambie. Jim is the CTO at DADI with overall responsibility for the development of the DADI network and its attached Web Services.
