Generating a SQL database client in Golang

A client generator for SQL databases in Golang, based on an interface description.

To work with databases, Golang offers the `database/sql` package, an abstraction over the programming interface of a relational database. On the one hand, the package provides powerful functionality: managing a pool of connections, working with prepared statements, transactions, and a database query interface. On the other hand, you end up writing a considerable amount of repetitive code in a web application to interact with the database. The `go-gad/sal` library offers a solution: generating that repetitive code from a described interface.


Today there are quite a few libraries offering solutions in the form of ORMs, helpers for building queries, and helpers generated from the database schema.

When I switched to Golang several years ago, I already had experience working with databases in different languages, both with ORMs such as ActiveRecord and without them. Having gone from love to hate, and having no problem with writing a few extra lines of code, my interaction with databases in Golang settled into something like the repository pattern: we describe an interface to the database and implement it using the standard `db.Query` and `row.Scan`. Using additional wrappers simply did not make sense; they were opaque and forced you to stay on the alert.

The SQL language itself is already an abstraction between your program and the data in the storage. It always seemed illogical to me to describe a data schema and then build complex queries against it: the structure of the response in that case differs from the data schema. It turns out that the contract should be described not at the level of the data schema, but at the level of the request and response. We use this approach in web development when describing API request and response data structures. When accessing a service via RESTful JSON or gRPC, we declare the contract at the request and response level using JSON Schema or Protobuf, not the entity schema inside the services.

That is, interaction with the database came down to something like this:

```go
type User struct {
	ID   int64
	Name string
}

type Store interface {
	FindUser(id int64) (*User, error)
}

type Postgres struct {
	DB *sql.DB
}

func (pg *Postgres) FindUser(id int64) (*User, error) {
	var resp User
	err := pg.DB.QueryRow("SELECT id, name FROM users WHERE id=$1", id).Scan(&resp.ID, &resp.Name)
	if err != nil {
		return nil, err
	}
	return &resp, nil
}

func HandlerFindUser(s Store, id int64) (*User, error) {
	// logic of the service object
	user, err := s.FindUser(id)
	// ...
}
```

This method makes your program predictable. But let’s be honest, this is not a poet’s dream. We want to reduce the amount of boilerplate needed to compose a query, fill data structures, use prepared statements, named arguments, and so on. I tried to formulate a list of requirements that the desired set of utilities should satisfy.


  • Description of interaction in the form of an interface.
  • The interface is described by methods and messages of requests and responses.
  • Support for binding variables and prepared statements.
  • Support for named arguments.
  • Linking database response with message data structure fields.
  • Support for atypical data structures (array, json).
  • Transparent work with transactions.
  • Built-in support for intermediate processors (middleware).

We want to abstract the implementation of interaction with the database using an interface. This allows us to implement something similar to the repository design pattern. In the example above, we described the `Store` interface. Now we can use it as a dependency. During testing, we can pass a stub object generated from this interface, and in production we will use our implementation based on the `Postgres` structure.
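For example, a minimal hand-written stub of such a `Store` interface could look like the sketch below (in practice the stub could also be generated, e.g. with gomock); the `StubStore` name and its map-backed storage are my own illustration, not part of the library:

```go
package main

import "fmt"

// Types from the example above.
type User struct {
	ID   int64
	Name string
}

type Store interface {
	FindUser(id int64) (*User, error)
}

// StubStore is a hand-written test double for the Store interface.
type StubStore struct {
	Users map[int64]*User
}

func (s *StubStore) FindUser(id int64) (*User, error) {
	u, ok := s.Users[id]
	if !ok {
		return nil, fmt.Errorf("user %d not found", id)
	}
	return u, nil
}

func main() {
	// A service object receives the interface, not a concrete implementation.
	var store Store = &StubStore{Users: map[int64]*User{1: {ID: 1, Name: "gopher"}}}
	u, err := store.FindUser(1)
	fmt.Println(u.Name, err) // gopher <nil>
}
```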

Each interface method describes one database request. The input and output parameters of a method should be part of the contract for the request. It must be possible to format the query string depending on the input parameters. This is especially true for queries with complex selection conditions.

When making a query, we want to use placeholders and variable binding. For example, in PostgreSQL, instead of a value you write `$1` and pass an array of arguments along with the query; the first argument will be used as the value in the converted query. Support for prepared statements should free you from organising the storage of these statements yourself. The `database/sql` library provides a powerful tool for prepared statements: it takes care of the connection pool and of closed connections. But the user still has to take an additional step to reuse a prepared statement within a transaction.

Databases such as PostgreSQL and MySQL use different placeholder syntax for variable binding. PostgreSQL uses the format `$1`, `$2`, …, while MySQL uses `?` regardless of the position of the value. The `database/sql` library offers a universal format of named arguments. Usage example:

```go
db.ExecContext(ctx, `DELETE FROM orders WHERE created_at < @end`, sql.Named("end", endTime))
```

Supporting this format is preferable to the PostgreSQL- or MySQL-specific ones.

The response from the database that the driver processes can be represented as follows:

```
dev > SELECT * FROM rubrics;
id | created_at              | title | url
 1 | 2012-03-13 11:17:23.609 | Tech  | technology
 2 | 2015-07-21 18:05:43.412 | Style | fashion
(2 rows)
```

From the user’s point of view, at the interface level, it is convenient to describe the output parameter as an array of structures of the form:

```go
type GetRubricsResp struct {
	ID        int
	CreatedAt time.Time
	Title     string
	URL       string
}
```
Next, assign the value of `id` to `resp.ID`, and so on. In general, this functionality covers most needs.

When declaring messages through internal data structures, the question arises of how to support non-standard data types, for example, an array. If you use the `lib/pq` driver when working with PostgreSQL, you can use helper functions like `pq.Array(&x)` when passing arguments to the query or scanning the response. Example from the documentation:

```go
db.Query(`SELECT * FROM t WHERE id = ANY($1)`, pq.Array([]int{235, 401}))

var x []sql.NullInt64
db.QueryRow(`SELECT ARRAY[235, 401]`).Scan(pq.Array(&x))
```

Accordingly, there should be ways to prepare data structures.
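Under the hood, helpers like `pq.Array` work by implementing `driver.Valuer` (and `sql.Scanner`) for the wrapped type. A toy `Valuer` for an int64 slice, as a simplified sketch of the PostgreSQL text format rather than the real `lib/pq` code:

```go
package main

import (
	"database/sql/driver"
	"fmt"
	"strconv"
	"strings"
)

// Int64Array renders a slice in PostgreSQL's array text format, e.g. {1,2}.
// A real implementation would also provide sql.Scanner for reading values back.
type Int64Array []int64

var _ driver.Valuer = Int64Array{}

func (a Int64Array) Value() (driver.Value, error) {
	parts := make([]string, len(a))
	for i, v := range a {
		parts[i] = strconv.FormatInt(v, 10)
	}
	return "{" + strings.Join(parts, ",") + "}", nil
}

func main() {
	v, _ := Int64Array{235, 401}.Value()
	fmt.Println(v) // {235,401}
}
```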

When executing any interface method, a database connection is used in the form of an `*sql.DB` object. If several methods need to be executed within one transaction, I would like the workflow to look the same as outside a transaction, without passing extra arguments.

When working with interface implementations, it is vital to be able to embed tooling, for example, logging of all requests. The tooling must have access to the request variables, the response error, the time spent, and the interface method name.

For the most part, the requirements were formulated as a systematisation of scenarios for working with a database.

Solution: go-gad/sal

One way to deal with boilerplate code is to generate it. Fortunately, Golang has tools and examples for this. The GoMock approach was taken as the architectural model for generation, where the interface is analysed using reflection. Based on this approach and the requirements above, the `salgen` utility and the `sal` library were written; they generate the interface implementation code and provide a set of helper functions.

To start using this solution, describe an interface for the database interaction layer, specify the `go:generate` directive with a set of arguments, and run the generation. You get a constructor and a pile of boilerplate code, ready to use.

```go
package repo

import "context"

//go:generate salgen -destination=./postgres_client.go -package=dev/taxi/repo dev/taxi/repo Postgres
type Postgres interface {
	CreateDriver(ctx context.Context, r *CreateDriverReq) error
}

type CreateDriverReq struct {
	ID   int64  `sql:"id"`
	Name string `sql:"name"`
}

func (r *CreateDriverReq) Query() string {
	return `INSERT INTO drivers(id, name) VALUES(@id, @name)`
}
```

It all starts with declaring the interface and a special command for the `go generate` utility:

```go
//go:generate salgen -destination=./client.go Store
type Store interface {
	// ...
}
```

Here it is stated that for our `Store` interface, the console utility `salgen` will be called with two options and two arguments. The first option, `-destination`, determines which file the generated code will be written to. The second option, `-package`, defines the full import path of the package for the generated implementation. Then come two arguments: the first one gives the complete package path where the interface is located, the second one names the interface itself. Note that the command for `go generate` can be located anywhere, not necessarily next to the target interface.

After executing the `go generate` command, we get a constructor, whose name is constructed by adding the `New` prefix to the interface name. The constructor accepts a required parameter corresponding to the `sal.QueryHandler` interface:

```go
type QueryHandler interface {
	QueryContext(ctx context.Context, query string, args ...interface{}) (*sql.Rows, error)
	ExecContext(ctx context.Context, query string, args ...interface{}) (sql.Result, error)
	PrepareContext(ctx context.Context, query string) (*sql.Stmt, error)
}
```

This interface corresponds to the `*sql.DB` object.

```go
connStr := "user=pqgotest dbname=pqgotest sslmode=verify-full"
db, err := sql.Open("postgres", connStr)
client := storage.NewStore(db)
```


Interface methods define a set of available database queries.

```go
type Store interface {
	CreateAuthor(ctx context.Context, req CreateAuthorReq) (CreateAuthorResp, error)
	GetAuthors(ctx context.Context, req GetAuthorsReq) ([]*GetAuthorsResp, error)
	UpdateAuthor(ctx context.Context, req *UpdateAuthorReq) error
}
```

  • The number of arguments is always strictly two.
  • The first argument is the context.
  • The second argument contains the data to bind the variables and defines the query string.
  • The first output parameter can be an object, an array of objects or missing.
  • Last output parameter is always an error.

The first argument is always a `context.Context` object. This context will be passed on to the database and toolkit calls. The second argument expects a parameter with a base type of `struct` (or a pointer to a `struct`). The parameter must satisfy the following interface:

```go
type Queryer interface {
	Query() string
}
```

The `Query()` method will be called before performing a database query. The resulting string will be converted to a database specific format. That is, for PostgreSQL, `@end` will be replaced with `$1`, and the value `&req.End` will be passed to the argument array.
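The conversion can be pictured with a small sketch. This is my own simplified illustration, not the library's actual parser; a real implementation would also have to handle `@` inside string literals and `::type` casts:

```go
package main

import (
	"fmt"
	"regexp"
)

// convertQuery rewrites @name placeholders into PostgreSQL positional
// placeholders ($1, $2, ...) and returns the placeholder names in the
// order the query arguments must be passed.
func convertQuery(query string) (string, []string) {
	re := regexp.MustCompile(`@(\w+)`)
	var names []string
	converted := re.ReplaceAllStringFunc(query, func(m string) string {
		names = append(names, m[1:]) // strip the leading '@'
		return fmt.Sprintf("$%d", len(names))
	})
	return converted, names
}

func main() {
	q, names := convertQuery(`SELECT id FROM orders WHERE user_id=@user_id AND created_at<@end`)
	fmt.Println(q)     // SELECT id FROM orders WHERE user_id=$1 AND created_at<$2
	fmt.Println(names) // [user_id end]
}
```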

The output parameters determine which method (Query/Exec) will be called:

  • If the first output parameter has the base type `struct` (or a pointer to `struct`), the `QueryContext` method will be called. If the response from the database contains no rows, the error `sql.ErrNoRows` will be returned, i.e. the behaviour is similar to `db.QueryRow`.
  • If the first output parameter has the base type `slice`, the `QueryContext` method will be called. If the response contains no rows, an empty list is returned. The base type of the list element must be `struct` (or a pointer to `struct`).
  • If there is only one output parameter, of type `error`, the `ExecContext` method will be called.
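This dispatch rule can be sketched with reflection. The sketch below is an illustration of the rule, not go-gad/sal's actual analysis code:

```go
package main

import (
	"fmt"
	"reflect"
)

// operationFor inspects a method-shaped function and applies the rule above:
// the kind of the first output parameter selects QueryRow, Query, or Exec.
func operationFor(method interface{}) string {
	t := reflect.TypeOf(method)
	if t.NumOut() == 1 { // only error -> Exec
		return "Exec"
	}
	out := t.Out(0)
	if out.Kind() == reflect.Ptr {
		out = out.Elem()
	}
	switch out.Kind() {
	case reflect.Slice:
		return "Query"
	case reflect.Struct:
		return "QueryRow"
	}
	return "unsupported"
}

type CreateAuthorResp struct{}

func main() {
	fmt.Println(operationFor(func() (CreateAuthorResp, error) { return CreateAuthorResp{}, nil }))   // QueryRow
	fmt.Println(operationFor(func() ([]*CreateAuthorResp, error) { return nil, nil }))               // Query
	fmt.Println(operationFor(func() error { return nil }))                                           // Exec
}
```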

Prepared statements

The generated code supports prepared statements. Prepared statements are cached after the first preparation. The `database/sql` library itself ensures that prepared statements are transparently bound to the right database connection, including handling of closed connections. In turn, the `go-gad/sal` library takes care of reusing a prepared statement in the context of a transaction. When a prepared statement is executed, the arguments are passed using variable binding, transparently for the developer.

To support named arguments, the `go-gad/sal` library converts the query to a form suitable for the database. Currently, conversion is supported only for PostgreSQL. The field names of the query object are used as named-argument placeholders. To use a different name instead of the field name, use the `sql` tag on the structure field. Consider an example:

```go
type DeleteOrdersRequest struct {
	UserID    int64     `sql:"user_id"`
	CreatedAt time.Time `sql:"created_at"`
}

func (r *DeleteOrdersRequest) Query() string {
	return `DELETE FROM orders WHERE user_id=@user_id AND created_at<@created_at`
}
```

The query string will be converted, and the argument list for the query execution will be built using the matching table and variable binding:

```go
// generated code:
db.Query("DELETE FROM orders WHERE user_id=$1 AND created_at<$2", &req.UserID, &req.CreatedAt)
```

Map structs to response messages

The `go-gad/sal` library takes care of linking database response rows to response structures, matching table columns to structure fields:

```go
type GetRubricsReq struct{}

func (r GetRubricsReq) Query() string {
	return `SELECT * FROM rubrics`
}

type Rubric struct {
	ID        int64     `sql:"id"`
	CreatedAt time.Time `sql:"created_at"`
	Title     string    `sql:"title"`
}

type GetRubricsResp []*Rubric

type Store interface {
	GetRubrics(ctx context.Context, req GetRubricsReq) (GetRubricsResp, error)
}
```

And if the database response is:

```
dev > SELECT * FROM rubrics;
id | created_at              | title
 1 | 2012-03-13 11:17:23.609 | Tech
 2 | 2015-07-21 18:05:43.412 | Style
(2 rows)
```

Then the `GetRubricsResp` list will be returned to us, whose elements are pointers to `Rubric` with fields filled in from the columns whose names match the tags.

If the database response contains columns with the same name, the corresponding structure fields will be selected in the order of the declaration.

```
dev > select * from rubrics, subrubrics;
id | title | id | title
 1 | Tech  |  3 | Politics
```

And structs:

```go
type Rubric struct {
	ID    int64  `sql:"id"`
	Title string `sql:"title"`
}

type Subrubric struct {
	ID    int64  `sql:"id"`
	Title string `sql:"title"`
}

type GetCategoryResp struct {
	Rubric    Rubric
	Subrubric Subrubric
}
```
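The matching rule can be sketched with reflection. This is my own illustration of the behaviour described above (columns bound to same-tagged fields in declaration order), not the library's implementation:

```go
package main

import (
	"fmt"
	"reflect"
)

// Structs as above.
type Rubric struct {
	ID    int64  `sql:"id"`
	Title string `sql:"title"`
}

type Subrubric struct {
	ID    int64  `sql:"id"`
	Title string `sql:"title"`
}

type GetCategoryResp struct {
	Rubric    Rubric
	Subrubric Subrubric
}

// destinations binds each response column to the first not-yet-used field
// whose sql tag matches, walking the fields in declaration order.
func destinations(resp interface{}, columns []string) []interface{} {
	type slot struct {
		tag  string
		ptr  interface{}
		used bool
	}
	var slots []*slot
	v := reflect.ValueOf(resp).Elem()
	for i := 0; i < v.NumField(); i++ {
		sub := v.Field(i) // nested struct such as Rubric
		for j := 0; j < sub.NumField(); j++ {
			slots = append(slots, &slot{
				tag: sub.Type().Field(j).Tag.Get("sql"),
				ptr: sub.Field(j).Addr().Interface(),
			})
		}
	}
	var dest []interface{}
	for _, col := range columns {
		for _, s := range slots {
			if !s.used && s.tag == col {
				s.used = true
				dest = append(dest, s.ptr)
				break
			}
		}
	}
	return dest
}

func main() {
	var resp GetCategoryResp
	dest := destinations(&resp, []string{"id", "title", "id", "title"})
	// Simulate rows.Scan filling the destinations.
	*dest[0].(*int64) = 1
	*dest[1].(*string) = "Tech"
	*dest[2].(*int64) = 3
	*dest[3].(*string) = "Politics"
	fmt.Println(resp.Rubric.Title, resp.Subrubric.Title) // Tech Politics
}
```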

Non-standard data types

The `database/sql` package provides support for basic data types (strings, numbers). To handle data types such as an `array` or `json` in a request or response, the type must implement the `driver.Valuer` and `sql.Scanner` interfaces. Driver implementations offer special helper functions for this. For example, `lib/pq.Array`:

```go
func Array(a interface{}) interface {
	driver.Valuer
	sql.Scanner
}
```

By default, the `go-gad/sal` library, for structure fields such as

```go
type DeleteAuthorsReq struct {
	Tags []int64 `sql:"tags"`
}
```

will use the value `&req.Tags`. If the structure satisfies the `sal.ProcessRower` interface,

```go
type ProcessRower interface {
	ProcessRow(rowMap RowMap)
}
```

then the value used can be adjusted

```go
func (r *DeleteAuthorsReq) ProcessRow(rowMap sal.RowMap) {
	rowMap.Set("tags", pq.Array(r.Tags))
}

func (r *DeleteAuthorsReq) Query() string {
	return `DELETE FROM authors WHERE tags=ANY(@tags::UUID[])`
}
```

This handler can be used for request and response arguments. In the case of a list in the response, the method must belong to the list item.


To support transactions, the interface (Store) must be extended with the following methods:

```go
type Store interface {
	BeginTx(ctx context.Context, opts *sql.TxOptions) (Store, error)
	// ...
}
```

The implementation of the methods will be generated. The `BeginTx` method uses the connection from the current `sal.QueryHandler` object and opens a transaction with `db.BeginTx(…)`; it returns a new implementation of the `Store` interface that uses the resulting `*sql.Tx` object as its handler.

```go
tx, err := client.BeginTx(ctx, nil)
_, err = tx.CreateAuthor(ctx, req1)
err = tx.UpdateAuthor(ctx, &req2)
err = tx.Tx().Commit(ctx)
```


Hooks are provided for embedding tools.

```go
type BeforeQueryFunc func(ctx context.Context, query string, req interface{}) (context.Context, FinalizerFunc)
type FinalizerFunc func(ctx context.Context, err error)
```

The `BeforeQueryFunc` hook is called before `db.PrepareContext` or `db.Query` is executed. That is, at program start, when the cache of prepared statements is empty, a call to `store.GetAuthors` will trigger the `BeforeQueryFunc` hook twice: once for the preparation and once for the query itself. A `BeforeQueryFunc` hook can return a `FinalizerFunc` hook, which will be called via `defer` before leaving the user method, in our case `store.GetAuthors`.

When the hooks are executed, the context is filled with service keys with the following values:

  • `ctx.Value(sal.ContextKeyTxOpened)`, a boolean indicating whether the method is called within a transaction.
  • `ctx.Value(sal.ContextKeyOperationType)`, the string value of the operation type: `"QueryRow"`, `"Query"`, `"Exec"`, `"Commit"`, etc.
  • `ctx.Value(sal.ContextKeyMethodName)`, the string name of the interface method, for example `"GetAuthors"`.

As arguments, the `BeforeQueryFunc` hook receives the SQL query string and the `req` argument of the user's query method. The `FinalizerFunc` hook receives the `err` variable as an argument.

```go
beforeHook := func(ctx context.Context, query string, req interface{}) (context.Context, sal.FinalizerFunc) {
	start := time.Now()
	return ctx, func(ctx context.Context, err error) {
		log.Printf(
			"%q > Operation %q: %q with req %#v took [%v] inTx[%v] Error: %+v",
			ctx.Value(sal.ContextKeyMethodName),
			ctx.Value(sal.ContextKeyOperationType),
			query, req, time.Since(start),
			ctx.Value(sal.ContextKeyTxOpened),
			err,
		)
	}
}

client := NewStore(db, sal.BeforeQuery(beforeHook))
```


"CreateAuthor" > Opeartion "Prepare": "INSERT INTO authors (Name, Desc, CreatedAt) VALUES($1, $2, now()) RETURNING ID, CreatedAt" with req <nil> took [50.819µs] inTx[false] Error: <nil>
"CreateAuthor" > Opeartion "QueryRow": "INSERT INTO authors (Name, Desc, CreatedAt) VALUES(@Name, @Desc, now()) RETURNING ID, CreatedAt" with req bookstore.CreateAuthorReq{BaseAuthor:bookstore.BaseAuthor{Name:"foo", Desc:"Bar"}} took [150.994µs] inTx[false] Error: <nil>

What’s next

  • Support for binding variables and prepared statements for MySQL.
  • Hook RowAppender to adjust the response.
  • Returning the `sql.Result` value from `Exec` methods.