A Layer Above Database Connectors that Provides a Common API, like PDO and JDBC, but for Node.js

Paleo
7 min read · Oct 6, 2018


Photo credit: pixnio.

I created a common API for Node.js on top of two DBMS drivers. The API is inspired by PDO and JDBC. It is named LADC, for “a Layer Above Database Connectors”.

The project is still in alpha. However, I use it in my own projects and it works fine. I need feedback from users: does the API really cover all common needs? Did you find any bugs? The help of contributors will, of course, also be greatly appreciated.

Why a Common API Above DBMS drivers?

Above drivers? Unlike PDO and JDBC, the purpose is not to replace DBMS drivers. The project is designed to remain a light layer on top of the rock-solid drivers we already use in Node.js.

Why a common API? It makes our programs more portable across DBMSs, and it is easier to learn.

Let’s Stop Talking! Show Me Some Code!

Here is an example of code that uses a LADC connection:

async function firstExample(cn) {
  let result = await cn.exec(
    "insert into test (message) values ('Hello, World!')"
  )
  let newId = result.getInsertedIdAsString()

  let row = await cn.singleRow(
    "select message, ts from test where test_id = $1",
    [newId]
  )
  if (!row)
    throw new Error("Who deleted the row I've just inserted?")

  console.log(`Inserted row ${newId}:`, row)
}

A note about SQL parameters: LADC relies on the placeholder syntax of the underlying connectors, so the syntax differs from one DBMS to another. PostgreSQL requires numbered parameters ($1, $2, …), while SQLite accepts several syntaxes, including the same numbered form. I use that form here so the code works with both.
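
For instance, with a cn connection such as the one above, the numbered form runs on both drivers, while the ? placeholder is accepted by SQLite only (a minimal sketch):

// Works with both the pg and sqlite3 drivers:
let rows = await cn.all("select message from test where test_id = $1", [1])
// SQLite also accepts '?', but PostgreSQL would reject this form:
let sqliteRows = await cn.all("select message from test where test_id = ?", [1])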

Getting Started

LADC works with Node 8 or above. First, install packages:

npm install --save ladc @ladc/pg-adapter @ladc/sqlite3-adapter

We can then open connections to a PostgreSQL or an SQLite database:

const ladc = require("ladc").default
const pgAdapter = require("@ladc/pg-adapter").default
const sqlite3Adapter = require("@ladc/sqlite3-adapter").default

function createConnection(server) {
  if (server === "postgres") {
    return ladc({
      adapter: pgAdapter({
        pgConfig: {
          host: "localhost",
          database: "testdb",
          user: "testdb",
          password: "hophop"
        },
        inMemoryCursor: true,
        useReturningAll: true, // helps to use 'getInsertedId()' with PostgreSQL
      }),
    })
  }
  if (server === "sqlite") {
    return ladc({
      adapter: sqlite3Adapter({ fileName: `${__dirname}/testdb.sqlite` }),
      initConnection: async cn => {
        await cn.exec("PRAGMA foreign_keys = ON")
      }
    })
  }
}

Finally, here is how to run these samples:

async function createSchema(cn, dialect) {
  let autoincPk = dialect === "postgres"
    ? "serial primary key"
    : "integer not null primary key autoincrement"
  await cn.script(`
    create table if not exists test (
      test_id ${autoincPk},
      message varchar(250) not null,
      ts timestamp not null default current_timestamp
    );
  `)
}

async function runTheseSamples(server) {
  let cn = createConnection(server)
  try {
    await createSchema(cn, server)
    await firstExample(cn)
  } finally {
    await cn.close()
  }
}

runTheseSamples("postgres").catch(err => console.error(err))
runTheseSamples("sqlite").catch(err => console.error(err))

The Particular Case of Transactions in Asynchronous Programming

In asynchronous programming, it is common to open a single connection to the database and share it everywhere. But we shouldn’t run transactions on that shared connection, because queries issued from other callbacks could unintentionally be executed inside the transaction.

The same problem can occur with multithreading, but sharing a connection between several threads is rare, and it is easier to guard against.

LADC provides a pool of underlying connections for transactions. When you begin a transaction, you get an exclusive underlying connection. When the transaction is committed or rolled back, the underlying connection is released back into the pool. The mechanism is also optimized: if no operation takes place outside the transaction while it is running, the transaction simply uses the main underlying connection without opening a second one.

Here is an example of code with transactions:

async function transactionExample(cn) {
  let tx = await cn.beginTransaction()
  try {
    let result = await tx.exec(
      "insert into test (message) values ($1)",
      ["Message 1 of the transaction"]
    )
    let newId = result.getInsertedId()
    await tx.exec(
      "insert into test (message) values ($1)",
      [`Message 2 related to ${newId}`]
    )
    await tx.commit() // A commit releases the underlying connection
  } finally {
    if (tx.inTransaction)
      await tx.rollback() // A rollback releases the underlying connection
  }
}

Prepared Statements

Node.js drivers allow several prepared statements to be open on the same connection, but the way to do so differs widely from one DBMS to another. The LADC API provides a common way to use prepared statements:

async function exampleWithPreparedStatement(cn) {
  let messages = ["Hello, World!", "Hi there!", "Hi!"]
  let ps = await cn.prepare("insert into test (message) values ($1)")
  for (let message of messages)
    await ps.exec([message])
  await ps.close()
}

Cursors

A LADC cursor implements the AsyncIterable and AsyncIterator interfaces. Here is how to use a cursor with Node.js 10 and above:

async function exampleWithCursor(cn) {
  let cursor = await cn.cursor("select test_id, message from test")
  for await (let row of cursor)
    console.log(row)
}

Or, the old school way (Node 8):

async function example2WithCursor(cn) {
  let cursor = await cn.cursor("select test_id, message from test")
  while (true) {
    let iterResult = await cursor.next()
    if (iterResult.done)
      break
    console.log(iterResult.value)
  }
  // Warning: You have to explicitly call `cursor.return()` if the cursor
  // has not been traversed to the end.
}
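
For example, here is a minimal sketch of an early exit that explicitly releases the cursor, as the warning above requires:

async function exampleWithEarlyExit(cn) {
  let cursor = await cn.cursor("select test_id, message from test")
  let first = await cursor.next()
  if (!first.done)
    console.log("First row only:", first.value)
  // The cursor was not traversed to the end, so release it explicitly:
  await cursor.return()
}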

Notes:

  • I implemented a limitation of one open cursor per underlying connection;
  • The PostgreSQL driver for Node.js doesn’t provide cursors yet. As a replacement, the PostgreSQL adapter provides “in-memory cursors”; this feature can be enabled with the inMemoryCursor option.

How to Integrate a Query Builder

LADC integrates well with your query builder. I personally use SQL Bricks, and I’ve written a small plugin for it. Here is how to open a connection with the SQL Bricks integration.

Install dependencies:

npm install --save @ladc/sql-bricks-modifier sql-bricks

Create the modified connection:

const sqlBricksModifier = require("@ladc/sql-bricks-modifier").default
const ladc = require("ladc").default

let cn = ladc({
  adapter: /* … adapter to your DBMS here … */,
  modifier: sqlBricksModifier()
}) // Tip for TypeScript users: import and cast to the type 'SBMainConnection'

Then, use it:

const { select, like } = require("sql-bricks")

async function exampleWithSqlBricks(cn) {
  let q = select("test_id, message")
    .from("test")
    .where(like("message", "Hi%"))
  let rows = await cn.all(q)
  console.log(rows)
}

Log Errors

Because LADC uses a pool of underlying connections, errors can occur independently of any query. By default, these independent errors are logged with console.error(), but it is possible to log them wherever you want:

const ladc = require("ladc").default

let cn = ladc({
  // …
  logError: err => { /* Do something with the error. */ }
})

The Complete API

Members of a MainConnection

Common methods between MainConnection and TransactionConnection:

  • cn.prepare(sql, params) returns a promise of a PreparedStatement;
  • cn.exec(sql, params) executes the query and returns a promise of an ExecResult;
  • cn.all(sql, params) executes the select query and returns a promise of an array of rows;
  • cn.singleRow(sql, params) fetches like cn.all(sql, params) and returns the single row (see the sketch after these lists);
  • cn.singleValue(sql, params) fetches like cn.all(sql, params) and returns the single value of the single row;
  • cn.cursor(sql, params) opens a cursor and returns a promise of an AsyncIterableIterator.

Members that are specific to a MainConnection:

  • cn.beginTransaction() starts a transaction and returns a promise of a TransactionConnection;
  • cn.script(sql) executes a multi-line script;
  • cn.close() closes the LADC connection, which includes closing the pool of underlying connections.
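
As a quick illustration, here is a minimal sketch that combines a few of these methods, reusing the test table from the examples above:

async function apiOverview(cn) {
  // A multi-statement script (MainConnection only)
  await cn.script(`
    insert into test (message) values ('one');
    insert into test (message) values ('two');
  `)
  // A single value, then a single row
  let count = await cn.singleValue("select count(*) from test")
  let lastRow = await cn.singleRow(
    "select test_id, message from test order by test_id desc limit 1"
  )
  console.log(count, lastRow)
}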

Members of an ExecResult

  • result.affectedRows is a readonly number (see the sketch after this list);
  • result.getInsertedId() returns the inserted identifier;
  • result.getInsertedIdAsNumber() returns the inserted identifier as a number;
  • result.getInsertedIdAsString() returns the inserted identifier as a string.
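
For example, affectedRows can be used to check that an update actually touched a row; a minimal sketch, reusing the test table:

async function updateMessage(cn, id, message) {
  let result = await cn.exec(
    "update test set message = $1 where test_id = $2",
    [message, id]
  )
  if (result.affectedRows === 0)
    console.warn(`No row found with test_id = ${id}`)
}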

Members of a PreparedStatement

  • ps.bind(nbOrKey, value) binds a value to the specified parameter number or key (see the sketch after this list);
  • ps.unbind(nbOrKey) unbinds the value of the specified parameter number or key;
  • ps.bindAll(params) binds all the provided values at once;
  • ps.unbindAll() unbinds all the bound values;
  • ps.exec(params?) executes the query and returns a promise of an ExecResult;
  • ps.all(params?) executes the select query and returns a promise of an array of rows;
  • ps.singleRow(params?) fetches like ps.all(params) and returns the single row;
  • ps.singleValue(params?) fetches like ps.all(params) and returns the single value of the single row;
  • ps.cursor(params?) opens a cursor and returns a promise of an AsyncIterableIterator;
  • ps.close() closes the prepared statement.
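
Here is a minimal sketch of binding values to a prepared statement, based on the method list above (I assume here that bind can be called synchronously before exec):

async function exampleWithBind(cn) {
  let ps = await cn.prepare("insert into test (message) values ($1)")
  ps.bind(1, "A bound message") // Assumption: bind stores the value for the next exec
  await ps.exec()
  ps.unbindAll()
  await ps.exec(["Another message"]) // Values can also be passed directly
  await ps.close()
}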

Members of a TransactionConnection

Common methods between MainConnection and TransactionConnection:

  • tx.prepare(sql, params) returns a promise of a PreparedStatement;
  • tx.exec(sql, params) executes the query and returns a promise of an ExecResult;
  • tx.all(sql, params) executes the select query and returns a promise of an array of rows;
  • tx.singleRow(sql, params) fetches like tx.all(sql, params) and returns the single row;
  • tx.singleValue(sql, params) fetches like tx.all(sql, params) and returns the single value of the single row;
  • tx.cursor(sql, params) opens a cursor and returns a promise of an AsyncIterableIterator.

Members that are specific to a TransactionConnection:

  • tx.inTransaction is a readonly boolean;
  • tx.rollback() rolls back the transaction, then releases the underlying connection to the pool;
  • tx.commit() commits the transaction, then releases the underlying connection to the pool.

Bonus: How to Retrieve the Last Inserted Identifier with Postgresql

PostgreSQL has a gotcha regarding autoincremented identifiers: the insert query must end with a non-standard returning clause, and the name of the autoincremented column is required to obtain its last inserted value.

The LADC adapter for Postgresql provides two options:

  1. The useReturningAll option: When an insert query is recognized, it automatically appends returning * at the end of the query. Then, when the method getInsertedId() is called, the adapter searches for a column named id or theTableName_id and returns its value.
  2. The autoincMapping option: You can provide a mapping of autoincremented column names for each table that has one. I recommend this way. Here is an example:
const ladc = require("ladc").default
const pgAdapter = require("@ladc/pg-adapter").default

const autoincMapping = {
  "test": "test_id",
  "category": "category_id",
  "post": "post_id",
}

let cn = ladc({
  adapter: pgAdapter({
    pgConfig: {
      /* … credentials … */
    },
    autoincMapping
  })
})

// …

let result = await cn.exec(
  "insert into test (message) values ('Hello, World!')"
) // The adapter appends 'returning test_id'
let newId = result.getInsertedId() // Returns the value of 'test_id'

Or, if you prefer, it is of course still possible to write the returning clause manually and then fetch the value:

let result = await cn.exec("insert into test(message) values ('Hi there!') returning test_id") // Postgres only
let newId = result.getInsertedId("test_id")

What do you think of this project?

Feel free to give your opinion. Do you think you could use LADC? Transactions and cursors are currently neglected by DBMS drivers for Node.js. Do they have a future in our asynchronous world?
